Ang Lee Shows Preview of 4K 3D 120FPS Movie in Las Vegas

Ang Lee showed preview footage of his upcoming film, Billy Lynn’s Long Halftime Walk, at an event in Las Vegas on Saturday. Unlike most previews, the centerpiece wasn’t the plot or characters but the advanced photography and projection techniques used to screen the movie, which the director believes may be the future of filmmaking in the digital age.

The showing was part of the National Association of Broadcasters trade show, where Lee showed off 11 minutes of 3D footage from the film. It wasn’t presented in the typical fashion, however: the screening made use of a pair of laser projectors able to project a full 4K image at an amazing 120 frames per second per eye. This puts regular movies to shame in a number of ways, the frame rate alone being five times that of a typical film, and Lee explained that the jump to 120 FPS 4K allows for crystal-clear images without many of the negative visual artifacts that appear in traditional films.
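To put the format’s demands in perspective, here is a back-of-the-envelope comparison of raw, uncompressed pixel throughput. The frame sizes are assumptions for illustration (4K UHD and 2K DCI), not figures from the NAB demo:

```python
# Rough, uncompressed pixel-throughput comparison between the demonstrated
# format (4K, 120 fps per eye, two eyes for 3D) and a conventional 2K/24fps
# presentation. Frame sizes are assumptions, not figures from the screening.
UHD_4K = 3840 * 2160   # pixels per 4K UHD frame (assumed)
DCI_2K = 2048 * 1080   # pixels per 2K DCI frame (assumed)

demo_px_per_sec = UHD_4K * 120 * 2      # 120 fps per eye, stereo
conventional_px_per_sec = DCI_2K * 24   # flat 24 fps

ratio = demo_px_per_sec / conventional_px_per_sec
print(f"The demo pushes ~{ratio:.1f}x the raw pixels per second")
```

On these assumptions the projectors push roughly 37.5 times the raw pixel data of a conventional presentation, which goes some way to explaining why only purpose-built laser projectors could screen it.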

This isn’t the first time a director has tried to make use of higher-than-standard frame rates, with Peter Jackson shooting his 2012 film, The Hobbit, at 48 FPS. There were mixed feelings about the higher frame rate, with many feeling it stripped away the pseudo-realism that 24 FPS invokes. Lee, however, did not cite Jackson’s use of the technique as part of his reasoning for using it himself.

Whether the use of these techniques will catch on remains to be seen, but Ang Lee and his team feel it is worth experimenting with new techniques to bring the film industry forward. Lee believes that employing technologies such as higher frame rates is not a case of mindlessly applying new tools but instead requires its own creative development. “This will be a long journey. I think we’re at the beginning of finding out what digital cinema means,” he said. “We’re not quite there yet, but it’s a strobe-free dimension.”

Unfortunately, many will not get the chance to see the film as Lee showed it this weekend, as no cinema in the world is able to project it the way it was at the NAB show. The film will instead be shown in a variety of formats when it is released in November, including HFR versions.

Firaxis Working on XCOM 2 Fixes

People can’t wait for their new favourite game, with trailers and advertisements teasing and tempting you into saving every world from a threat or racing down every road. One of the former is the eagerly anticipated XCOM 2, a game which has had some issues with its initial release version.

According to lead designer Jake Solomon, the game’s technical problems are “the first thing we talk about when we come in in the morning”. With such a high-profile release, players get quite vocally upset when the game runs badly, as has been the case with XCOM 2. While the team is working on fixes for a range of things, with small hotfixes released regularly, larger fixes for issues such as the framerate problems are “coming soon”.

In fairness, the game only seems to have problems at rare and random intervals, with most players able to enjoy it with no issues (or at least only minor ones). One of the things that bugged players was the camera pausing for a second after kills or certain in-game events, something for which Solomon claims responsibility, saying “that stuff’s on me… if things can be sped up they certainly will”.

It’s nice to see developers who remain so passionate about their creation, even after the initial paychecks. With releases such as Aliens: Colonial Marines damaging companies’ reputations beyond repair for a lot of fans and gamers, having a good aftercare package is essential these days.

Early Assassin’s Creed: Syndicate GTX 970 Performance Revealed

Assassin’s Creed: Syndicate’s PC release is imminent, scheduled for the 19th of November worldwide. Compared to previous titles this is a fairly brief delay, which raises questions about the studio’s ability to create an optimized port. Back in August, Sam Kovalev, Studio Production Manager at Ubisoft Kiev, proclaimed:

“We have introduced several new improvements to our production pipeline and validation process, which allowed us to focus on polishing, stabilizing and optimizing the PC version very early on in the project,”

“This has been one of the top priorities for the production team this year.” 

“The additional four weeks are for us to really bear down and finalize all of the polish and optimization, to make sure the game and all of its systems are stable when it launches, so it runs smoothly for all players starting on day one,”

Without trying to sound too cynical, PC gamers have heard similar promises before and experienced poor scaling across a wide range of hardware. Thankfully, just before release, a video has emerged which provides an insight into the game’s performance. The video was originally found by Twitter user @RobotBrush who was kind enough to share the technical analysis:

The test system in question revolves around a GTX 970 running at a resolution of 1920×1080. This is a fairly popular configuration among hardware enthusiasts and shouldn’t encounter any major problems when aiming for 60 frames per second. To test system performance, every setting was turned up to maximum and the results were recorded with two pieces of monitoring software: MSI Afterburner and FRAPS were used to determine GPU utilization and frame rate. Having both of these running simultaneously might, however, have impacted performance slightly.

Nevertheless, during indoor sections, Assassin’s Creed: Syndicate hovers around the 40-45 frames-per-second mark which is pretty disappointing albeit playable. However, once the player moves to outdoor environments containing large crowds, the frame-rate suddenly drops to around 30 and can get as low as 23. Initially, I thought the dramatic change could be a result of low GPU usage, but the MSI Afterburner clearly shows 99% utilization.

Theoretically, the disappointing performance could be improved after a release-date patch but this is still not an ideal situation. Additionally, if a GTX 970 cannot attain over 30 frames-per-second in densely populated areas, how will 2560×1440, 3440×1440 or 4K users be able to reach a playable frame rate?
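As a rough illustration of that worry, here is a naive estimate that assumes a purely GPU-bound workload where frame rate falls in proportion to pixel count; real games rarely scale this cleanly, so treat the numbers as ballpark intuition only:

```python
# Naive resolution-scaling estimate: assumes the game is entirely GPU-bound
# and that fps falls in proportion to the number of pixels rendered.
def estimated_fps(base_fps, base_res, target_res):
    base_px = base_res[0] * base_res[1]
    target_px = target_res[0] * target_res[1]
    return base_fps * base_px / target_px

# Starting from the 23 fps crowd-scene low observed at 1920x1080:
for res in [(2560, 1440), (3440, 1440), (3840, 2160)]:
    print(res, round(estimated_fps(23, (1920, 1080), res), 1), "fps")
```

Under this crude model the crowd-scene lows would land around 13, 10 and 6 fps at those three resolutions, well short of playable.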

Fallout 4’s Engine Runs Faster at High Framerates

Fallout 4 has finally been released and received widespread acclaim among RPG fans. However, technical analysis shows the console versions can drop to 20 frames per second, and the game suffers from noticeable bugs. Any relatively modern PC is capable of reaching higher framerates and providing a smooth, fluid experience. Many PC gamers opt for a high-refresh-rate 1920×1080 panel over a 60Hz 2560×1440 or 4K monitor, as they prioritize gameplay over image quality.

Unfortunately, Fallout 4’s engine ties the simulation speed to the framerate. As a result, at high refresh rates the game dramatically speeds up, by approximately 20-30 percent. This makes it an unplayable mess and illustrates how little the engine has changed from Bethesda’s older titles. Clearly, 120-144Hz gamers are a niche, but this doesn’t excuse poor programming. This wouldn’t personally affect me as I use a 3440×1440 60Hz monitor, but I can see it becoming a major problem for people using monitors like the ASUS ROG Swift.
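A minimal sketch of this failure mode, assuming a hypothetical per-frame update with a 60 fps step baked in. This illustrates the general mechanism, not Bethesda’s actual code, and a fully frame-tied update exaggerates the effect relative to the 20-30 percent reported above:

```python
# Illustration of frame-rate-dependent game speed: a per-frame step tuned for
# 60 fps runs 2.4x too fast at 144 Hz, while a delta-time step stays correct.
def simulate(fps, seconds, speed_per_sec, tied_to_framerate):
    dt = 1.0 / fps
    pos = 0.0
    for _ in range(int(fps * seconds)):
        if tied_to_framerate:
            pos += speed_per_sec / 60.0   # fixed step, 60 fps baked in
        else:
            pos += speed_per_sec * dt     # scaled by real elapsed time
    return pos

print(simulate(60, 1, 10, True))    # ~10 units after 1 s, as intended
print(simulate(144, 1, 10, True))   # ~24 units: the game runs 2.4x too fast
print(simulate(144, 1, 10, False))  # ~10 units: delta time stays correct
```

This is also why such a problem is hard to patch late: every hand-tuned per-frame constant in the engine would need to be converted to a delta-time form.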

I’m not entirely sure this can be fixed via a patch because it’s an integral part of how the engine has been coded. Furthermore, it won’t be given priority, as most people are using 60Hz monitors at 1920×1080 or less.

Do you think there is a noticeable difference going from 60Hz to 144Hz?

AMD Fury X Gets Benchmarked on Far Cry 4

Details about AMD’s latest Fury X graphics card have so far focused on the hardware and specs. We knew the card came with AMD’s HBM technology and a lot of computing power; if you missed the spec details, you can view them again here. So how well can it handle the latest games? Well, AMD seems to have put it to the test.

The Fury X was recently benchmarked on Ubisoft’s latest Far Cry title, Far Cry 4, at a conference in Beijing. The company showed that the Fury X is able to handle 4K resolution while rendering Far Cry 4 on Ultra settings at an average of 54 FPS. The minimum was admittedly 43 FPS, but we are talking about 4K here, and I doubt anything else within an acceptable price range could manage even that.

Compared to other cards benchmarked on Far Cry 4, the Fury X also seems to sit at the top of the chart; it even looks to have taken on NVIDIA’s GTX Titan X, as shown below. The only competition it has, according to the benchmarks, is the R9 295X2. However, you can’t really compare a dual-GPU card or an SLI configuration’s numbers with a single card’s output.

What we need to know now is what price tag comes with this beast. We don’t want to speculate on a price right now, but since it’s AMD, we shouldn’t be looking at something greater than what NVIDIA is asking for its own cards. Or will we?

Images courtesy of LinusTechTips

AMD Says Nvidia Sabotaged Witcher 3 Performance

There was an internet war earlier this week as gaming fans decided that, once again, Nvidia’s GameWorks technology was messing with the performance of games on AMD hardware.

At first it was the racing game Project Cars that attracted the attention of the vast Reddit community, with people stating that the game is built on a version of PhysX that doesn’t work on AMD hardware.

In response, AMD’s corporate VP of alliances Roy Taylor tweeted, “Thank for supporting/ wanting an open and fair PC gaming industry.” This was followed by a Reddit reply from Nvidia’s GameWorks director Rev Lebaredian, saying that “PhysX within Project Cars does not offload any computation to the GPU on any platform, including NVIDIA. I’m not sure how the OP came to the conclusion that it does, but this has never been claimed by the developer or us; nor is there any technical proof offered in this thread that shows this is the case.”

With complaints flowing in, Project Cars developer Slightly Mad Studios joined the ongoing battle, and proceeded to place the blame for the game’s issues squarely on AMD. “We’ve provided AMD with 20 keys for game testing as they work on the driver side,” said Ian Bell. “But you only have to look at the lesser hardware in the consoles to see how optimised we are on AMD based chips.”

Whilst AMD seems to have made up with Slightly Mad Studios, the company is facing yet another supposedly GameWorks-related struggle with CD Projekt Red’s fresh release, The Witcher 3. The game makes use of several GameWorks technologies, one of which adds tens of thousands of tessellated hair strands to characters and dramatically decreases frame-rate performance on AMD graphics cards, sometimes by as much as 50 percent.

A developer from the company stated:

“Many of you have asked us if AMD Radeon GPUs would be able to run NVIDIA’s HairWorks technology—the answer is yes! However, unsatisfactory performance may be experienced, as the code of this feature cannot be optimized for AMD products. Radeon users are encouraged to disable NVIDIA HairWorks if the performance is below expectations.”

AMD makes it sound as though Nvidia isn’t willing to share the source code for its proprietary graphics libraries such as HairWorks and HBAO+. Without that source code, AMD can’t optimize its drivers for Nvidia’s tech. Nvidia responded to the claims and refused to accept the argument that access to the source code would solve AMD’s problems.

You can disable the HairWorks feature to restore your frame rate, but obviously you will lose some of the phenomenal detail.

Thank you to Ars Technica for providing us with this information

Image courtesy of GameSpot

Confusing Assassin’s Creed: Unity GPU Recommendations and Optimising Issues

As we’re inundated with comments about Assassin’s Creed: Unity barely performing above 50 FPS on low settings across higher-end graphics card offerings from NVIDIA and AMD, we’ve taken some time out to look at NVIDIA’s recommendations as to what exactly you should purchase to run Ubisoft’s latest game.

Take a look at the table and explanation below: it mentions that Unity has some pretty demanding system requirements (we wrote about them pre-release) and suggests exactly what you should expect to run to reach smooth frame rates at full HD – 1920 x 1080 resolution.

Now we must admit, aiming for 40 FPS is pretty poor – we suggest you should be looking at around 120 FPS minimum on any game to have a fully smooth experience. But according to this table, you should be looking at a rather nice experience running your budget-grade GTX 780 Ti, which will set you back a pocket-change amount of $570 US in today’s market. Obviously we’re being sarcastic, and it leaves us wondering whether Ubisoft spent enough time optimizing their game.

Looking at the plethora of posts on Reddit’s PC Master Race shows that even given these minimum system requirements, the game still isn’t properly functional. Below you can see Reddit user ElliotCarter94 using his R9 290 coupled with an i5 3570K running at 3.8GHz, only managing to achieve 39 FPS at a 640 x 480 resolution on the lowest graphics quality.

This is coupled with other users claiming 47-52 FPS when utilizing their GTX 760s on low settings, among other issues such as console wording being left unchanged when Ubisoft went through the process of porting the game to PC.

Furthermore, if you take a quick look at NVIDIA’s apparently amazing comparisons of high and low settings within Unity, they provide two comparisons set in different environments, meant to display the major difference between running high or low settings. You can see example one here, and example two here. If you look closely, you’ll find that there’s basically no difference at all – what’s up with that? On top of this, NVIDIA are offering free Assassin’s Creed: Unity copies with some of their graphics cards, which begs the question – why are they offering a game which the card won’t run properly?

Please don’t let it end like this, Ubisoft. They’ve made some great games in the past, but in recent times they’ve really proven that they are out of touch with the community, and they’re seemingly dragging NVIDIA down with them.

We are currently working on securing a PR contact for Ubisoft – we will continue to report as soon as this happens or as the story develops.

Cover image courtesy of eTeknix

Nvidia Researcher Discovers a Way to Quadruple Future VR Headset Resolution

A researcher from NVIDIA has apparently discovered a new manufacturing technique which could quadruple the perceived resolution of virtual reality gear in the future. The technique in question is called ‘display cascading’ and uses cascade displays (of course).

Nvidia is said to have already produced a prototype headset using the aforementioned technique. A report from MIT states that the new technique improves the perceived resolution of virtual reality displays. David Luebke, senior director of research in visual computing at NVIDIA, is said to be the man behind the new technology. He is reported to have used a cascaded display system made up of two modified ‘off-the-shelf’ liquid crystal display panels.

A layer of tiny shutters (one per pixel) which can block or allow light through, called the spatial light modulation panel, is said to have been removed from one LCD and placed over a second panel, offset from that panel’s own layer. This method is said to split each pixel into four individually addressable areas, thus quadrupling the effective resolution at the cost of a decrease in brightness.
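The resolution arithmetic can be sketched as follows; the base panel size here is an arbitrary assumption for illustration, not a figure from NVIDIA’s prototype:

```python
# Offsetting the second modulation layer by half a pixel in x and y divides
# each original pixel into a 2x2 grid of individually addressable sub-regions,
# quadrupling the effective pixel count (at the cost of brightness).
def cascaded_effective_pixels(width, height):
    return (width * 2) * (height * 2)  # 2x addressable sub-pixels per axis

base = (1280, 800)  # assumed panel resolution, for illustration only
print(cascaded_effective_pixels(*base), "effective pixels vs", 1280 * 800)
```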

Luebke states that, along with some driver optimizations, a cascaded display should provide both improved resolution and double the perceived framerate, achievable by having both panels run in perfect synchronization. The NVIDIA researcher also stated that he will unveil the manufacturing technique at a conference in August. For those interested, the research is currently available over at NVIDIA’s website.

Thank you Bit-Tech for providing us with this information

Performance Drops Seen In Latest Xbox One Patch For Call of Duty: Ghosts

Usually when you hear about an upcoming patch, you think about fixes, improvements, better performance or perhaps additional content. Well, this is not the case for the latest Call of Duty: Ghosts patch for Xbox One, since users are reporting performance drops after updating.

The Onslaught DLC, along with its stability patch, has degraded performance on Microsoft’s latest next-gen console, leading to slower framerates at certain points on the map. And as we all know, framerate drops are the last kind of issue anyone wants to face.

David Bierton from Digital Foundry has stated that in the Stonehaven map, “it’s immediately obvious that things are not quite right. While the game does indeed hit 60fps in this stage, frame-rates are dramatically impacted when the whole level is in view, with metrics varying between 46-60fps as we run across the landscape. As frame-rates fall below the desired 60fps target, we also see the appearance of some screen-tear, adding some judder, making drops in smoothness more noticeable.”

It is also said that looking through the scope in certain points of the map, players could experience framerate drops as low as 30 FPS. Activision is expected to release a patch soon to fix the issue, though nobody knows how the buggy patch was released in the first place.

Thank you Ubergizmo for providing us with this information

Sniper Elite 3 Rumored To Run Slower On Xbox One While PS4 Said To Boast 60 FPS

Sniper Elite 3, the tactical shooter and successor to Sniper Elite V2, will be launched on both next-gen consoles, PS4 and Xbox One, as well as PC. New screenshots have been released for the upcoming title, along with some good news for players.

Tim Jones, Rebellion’s Head of Design, states that Sniper Elite 3 has been “built from the ground up to plug all the holes in gameplay that didn’t manage to go as far as we wanted to with Sniper Elite V2.” He also mentioned some tweaks made to the game, including more stealth options, non-linear gameplay and new advanced weapons with weapon customization, something that was not available in any previous Sniper Elite title. The new modifications are said to be wrapped in an interesting story based on historical WW2 events.

“Our fans, they really have a deep passion and understanding, not just for the weapons, but for the history as well,” Jones said. “We can’t wait ’til they get their hands on it.”

WCCF also reports that Sony PS4 players will have a slight advantage over Xbox One fans. They have learned that Sniper Elite 3 will perform better on the PS4, with the game running at 60 FPS, while the Xbox One will have a slightly lower FPS count. That said, the framerate and resolution on both consoles, along with their performance, have not yet been officially announced or determined; still, it would be nice to see the game running smoothly on both PS4 and Xbox One. PC specs and performance have not yet been released either.

Sniper Elite 3 will be released this year on both next-gen consoles as well as PC, however an official launch date has yet to be revealed.

Thank you WCCF for providing us with this information
Images courtesy of WCCF

AMD 14.1 Beta Driver Gets Released To The Press, First Impressions Revealed

Reports are that AMD has finally released the Catalyst 14.1 beta to the press. The driver is said to bring the first release of Mantle, AMD’s ambitious 3D graphics API that rivals Direct3D and OpenGL. The Catalyst 14.1 beta is said to enable the 3D renderer option in Battlefield 4, which lets you choose between DirectX 11.1 and Mantle.

TechPowerUp apparently gave it a try and reported that first impressions of Mantle were nothing close to amazing. The difference between Mantle and non-Mantle is currently not noticeable on a Radeon R9 290 graphics card with Battlefield 4 set to 1920 x 1080 resolution, Ultra details and 4x MSAA; the game is reported to run smoothly at well over 60 FPS with Mantle either on or off. However, TechPowerUp points out that owners of a Radeon R9 270X will see a significant increase with the same settings applied, whereas without Mantle the game’s FPS would drop below 60. Also, the driver is reported not to be optimized for any Graphics Core Next (GCN) based GPUs other than the Radeon R9 290 series, R9 260X, and A-Series “Kaveri” APUs.

TechPowerUp’s test system was as follows: 8 GB of dual-channel DDR3-1333 memory, an AMD 990FX motherboard (ASUS M5A99FX-PRO R2.0, 2201 BIOS, UEFI mode), Windows 8.1 64-bit, and of course a Radeon R9 290 (BIOS: 015.042.000.000.003747). The game settings were 1920 x 1080 resolution, disabled V-Sync, the “Ultra” preset, HBAO, and 4x MSAA. They used the “PerfOverlay.FrameFileLogEnable 1” console command to spit out CSV files with frame-times in ms.

A 167-second playthrough of the single-player campaign chapter “Fishing in Baku” was performed with the above specs, first as a non-Mantle test, which produced an average frame-time of 16.26 ms, working out to 61.5 fps. For Mantle, the same 167-second run was performed, and the average frame-time was 14.45 ms, which works out to 69.2 fps. Overall, a 12.5 percent performance uplift was recorded. It is not much, but given that the driver is still a beta and not yet optimized for anything beyond the R9 290 series, R9 260X and Kaveri APUs (if the optimization is even final for those), we may yet see the 319% increase stated by AMD.
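The frame-time arithmetic quoted above can be sanity-checked with the standard conversion, fps = 1000 / average frame-time in milliseconds; the inputs below are the article’s figures, not new measurements:

```python
# Convert TechPowerUp's average frame-times (ms) to frames per second and
# compute the relative uplift Mantle delivered over Direct3D 11.1.
def ms_to_fps(avg_frame_time_ms):
    return 1000.0 / avg_frame_time_ms

d3d_fps = ms_to_fps(16.26)     # ~61.5 fps (Direct3D run)
mantle_fps = ms_to_fps(14.45)  # ~69.2 fps (Mantle run)
uplift_pct = (16.26 / 14.45 - 1) * 100
print(f"{d3d_fps:.1f} fps -> {mantle_fps:.1f} fps (+{uplift_pct:.1f}%)")
```

Note that averaging frame-times and then converting, as done here, weights slow frames correctly; averaging instantaneous fps values would overstate the result.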

Thank you TechPowerUp for providing us with this information