DX12’s Bonuses Only Achievable by Dropping DX11, Says Hitman Dev

When new technology comes out, it tends to take time for systems and developers to get to grips with it, and its advertised bonuses usually come at some price. But is that price too steep? Hitman’s developer says that if you want to experience DX12’s bonuses, they are only achievable by dropping DX11 support completely.

Hitman was released earlier this year to favourable reviews, with an entire YouTube series putting people in command of Agent 47, including the likes of the Chuckle Brothers. The lead developer behind the game, Jonas Meyer of IO Interactive, has now stated that if you want the 20% CPU and 50% GPU bonuses that Microsoft promises with DX12, you will have to drop DirectX 11 support entirely. Hitman, by contrast, was more of a port from the older API to DX12.

With games built around DX12 being released more and more often, such as the remake of the classic Gears of War, and suffering from less than amazing performance, the new graphics API doesn’t yet look to deliver as much as was advertised.

When asked about another low-level API for graphics, Vulkan, Meyer responded by describing the API as a “graphics programmer’s wet dream” but stated that they don’t have any plans to add Vulkan support to Hitman.

AMD Says DX12 And Vulkan “Serve a Need And Add Value”

The advent of low-level APIs such as DirectX 12 and Vulkan has the potential to revolutionize the way modern games scale across various hardware setups. Clearly, the gains compared to DirectX 11 remain unknown until a game’s engine offers a direct comparison between the two APIs on identical hardware. In theory, it could be the most significant change to PC gaming in years and allow for enhanced optimization. There’s a huge debate regarding Microsoft’s DirectX 12 and the open-source Vulkan API. In a recent interview with Tom’s Hardware, AMD’s VR director, Daryl Sartain, described the current state of modern APIs and how Mantle contributed to the development of DirectX 12:

“I view Mantle as something – because we did a lot of contribution to the features into DX12 –  that has been spun into DX12 in so many ways. But to your question on Vulkan versus DX12, without getting into all the religious aspect, what I said yesterday [on the VR Fest panel] is that I think that both serve a need and add value. Can you make an argument that one is better than the other? You can make an argument about anything. Just bring a lawyer into the room.”

“But I do believe that, and what I most am concerned about is our ISVs, the ISV community, where they gain the greatest benefit. You know, there are some people developing on Linux, all different flavors of life – so it’s a difficult question as to which [API] should we be focused on, which one is better”.

“My opinion is that Windows as a platform, as an OS, is far better and far more evolved today than some of the previous generations, and that’s to be expected. DX12 and its integration into Windows is a great experience, is a great development environment, and has great compatibility. Does that mean that Vulkan doesn’t have a place? No. I think that answer really has to come from the development community, not from us.”

This is a fairly non-committal response, and it’s too early to see a clear advantage from either API. At least there’s a clear alternative to DirectX 12 if you want to go down the open-source route. Given the success of Windows as a gaming operating system, I cannot see DirectX 12 being overtaken unless there are some very clear performance or feature benefits.

AMD Launches GPUOpen Software Development Package

AMD’s open-source philosophy deserves a great deal of credit, especially when you consider the competition utilizes proprietary features, as demonstrated by NVIDIA GameWorks. During CES 2016, I had the pleasure of playing Star Wars Battlefront on AMD’s upcoming Polaris architecture. This open ideology is very impressive and showcases that AMD’s future chips are fully functional ahead of their launch. In contrast, details about Pascal are almost non-existent barring a few marketing photographs. Today, AMD released a huge blog post about their latest initiative, entitled GPUOpen. Here is a detailed run-down of the project in AMD’s words:

“GPUOpen is composed of two areas: Games & CGI for game graphics and content creation (which is the area I am involved with), and Professional Compute for high-performance GPU computing in professional applications.

GPUOpen is based on three principles:

The first is to provide code and documentation allowing PC developers to exert more control on the GPU. Current and upcoming GCN architectures (such as Polaris) include many features not exposed today in PC graphics APIs, and GPUOpen aims to empower developers with ways to leverage some of those features. In addition to generating quality or performance advantages such access will also enable easier porting from current-generation consoles (XBox One and PlayStation 4) to the PC platform.

The second is a commitment to open source software. The game and graphics development community is an active hub of enthusiastic individuals who believe in the value of sharing knowledge. Full and flexible access to the source of tools, libraries and effects is a key pillar of the GPUOpen philosophy. Only through open source access are developers able to modify, optimize, fix, port and learn from software. The goal? Encouraging innovation and the development of amazing graphics techniques and optimizations in PC games.

The third is a collaborative engagement with the developer community. GPUOpen software is hosted on public source code repositories such as GitHub as a way to enable sharing and collaboration. Engineers from different functions will also regularly write blog posts about various GPU-related topics, game technologies or industry news.”

This is fantastic news for developers and assists the optimization process through open-source tools. In theory, the GPUOpen model gives developers free rein to properly understand the hardware and code in a much more efficient manner. Time will tell how popular this endeavour is, but it looks like a really good way to build a strong relationship between developers and AMD users.

Gears of War Ultimate Gets Glorious Graphics Features on PC!

It’s no secret that PC gaming is often more graphically impressive than its console counterparts; the hardware, your own budget permitting, can be a lot more powerful, so that’s hardly a surprise. So what can PC gamers expect to enjoy when the game hits our screens?

The game is one of the first AAA DX12 titles to hit the market, so it’ll take advantage of the new API to bring better frame rates, rendering technologies and effects, as well as Async Compute, which already makes me want to play it more than anything. On top of that, you can enjoy the game’s unlocked frame rate, so those of you running 144Hz monitors will be able to reap the full rewards of your display’s technology. The developer even said they’re putting a lot of effort into this version of the game, ensuring it’s a showcase for 4K, so those of you with high-resolution displays will, again, be able to fully enjoy that aspect.
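For the curious, Async Compute boils down to DX12 exposing more than one hardware queue. Here is a minimal C++ sketch of that setup, assuming a D3D12 device has already been created; it is purely an illustration, not code from the game’s developer, and error handling is omitted.

```cpp
// Minimal D3D12 sketch: a graphics queue plus a separate compute queue,
// which is the mechanism behind "Async Compute". Assumes `device` is an
// already-created ID3D12Device; error handling is omitted for brevity.
// Link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    // The usual direct (graphics) queue.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    // A second queue of type COMPUTE: work submitted here can overlap with
    // graphics work, which is what keeps the GPU busy between draw batches.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
}
```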

So we’ve got improved frame rates, a new API bringing us better CPU and GPU performance as well as better rendering, upgraded 4K assets to make it look awesome, and support for MSAA and FXAA.

Here’s a quick run down of what you can expect. Excited?

  • Remastered for Dolby 7.1 Surround
  • New Xbox Live achievements (1,250 Gamerscore)
  • Concept art gallery and unlockable comics
  • Modernized Multiplayer featuring:
    • Dedicated servers
    • Skill-based matchmaking
    • New game types – Team Deathmatch, King of the Hill (Gears of War 3 style), and new 2v2 Gnasher Execution
    • Total of 19 maps, including all DLC maps
    • 17 unlockable Gears of War 3 characters for Multiplayer progression
    • More match customization including Actives, Respawn Time, Self-revive and Weapon Respawn
  • Modernized gameplay with smoother movement and updated controls:
    • Alternate Controls and all-new Tournament Controls
    • Adding the Gears of War 3 features you love: Enemy Spotting, Multiplayer Tac-Com, improved sensitivity customization

Thank you WCCFTech for providing us with this information.

Square Enix Pulls Plug on OS X Final Fantasy 14

Pulling game sales after a failed launch seems to be becoming a more common occurrence. Square Enix has announced that it will be pulling the Mac version of Final Fantasy 14: A Realm Reborn after a launch riddled with poor performance. In addition to stopping sales, Square Enix is offering refunds to interested customers. In the meantime, Square Enix plans to keep working on the Mac version of the game until performance is up to par. The company has admitted the game was accidentally released too early, before all the bugs were fixed.

Producer and Director Naoki Yoshida noted that Square Enix had made a number of serious mistakes. First, the Mac system requirements were inadequately communicated, and the requirements that were published were incorrect, meaning some people bought the game thinking it would run fine on their system when it couldn’t. Yoshida admitted that if accurate system requirements had been communicated, many might not have purchased the game.

One major source of the low performance was OpenGL. Square Enix turned to TransGaming to provide Wine-based middleware that lets the DirectX-native game run on OS X. A native port was not possible due to development cost concerns and relatively low demand. According to Yoshida, coding a version against OpenGL would be sub-optimal due to a 30% performance deficit compared to DirectX for FFXIV specifically. While work continues on the OpenGL version, the hope is that Apple’s new Metal API will help improve performance in the future.

When creating cross-platform games, building a native version for each platform can be a pain, and getting a good OpenGL port out can be problematic as most devs are focused on DirectX. This is where cross-platform game engines like Unity, Unreal Engine 4 and CryEngine come in, moving the responsibility for managing multiple APIs to the engine developers. Big devs can and do create their own engines to run their games. In those cases, enough attention has to be given to the secondary platforms, something that Square Enix did not seem to do.

DirectX 11 Unofficially Heading to Linux

Prior to the launch of Windows 8, gaming on Linux had very little coverage and relied solely on the Wine compatibility layer to run Windows software. While Linux is still a niche platform, the introduction of SteamOS and a number of native blockbuster games such as Metro 2033 has expanded its appeal. However, too many concessions have to be made if you want the flexibility to play your entire library. In theory, you can use Wine to force unsupported applications to run, but this can cause various stability problems.

Even though Wine itself adopts the open-source principle, there is a premium offering from CodeWeavers which pays developers to add functionality that eventually becomes an integral component of Wine. This piece of software, entitled CrossOver, is almost an investment vehicle to improve Wine’s UI, compatibility and ease of use. Unfortunately, Wine has been limited to DirectX 9 for a considerable time, and this makes it incompatible with many modern games which opt for the latest DirectX API. In a blog post during E3 2015, CodeWeavers announced they had finally cracked the DirectX 11 problem and plan on bringing it to CrossOver by the end of 2015. Their project lead clarified:

“In the coming months, CodeWeavers will have support for DirectX 11; better controller support; and further improvements to overall GPU performance. While these incremental improvements for game support may seem small (at first), the cumulative improvements for game support will allow for many of these games to ‘just run’ when released.” 

This is an intriguing development, and although DirectX 12 will be a monumental improvement, it’s always welcome to see games being supported on every platform. I still have doubts, though, about whether proponents of open source will be comfortable with this or will simply prefer the OpenGL implementation.

If Linux offered a truly low overhead and comprehensive game support, would it be enough to make you switch?

Thank you PC World for providing us with this information.

First DX12 CryEngine Tech Demo Revealed

Want to see the difference between DirectX 11 and DirectX 12, again? Of course you do, so I’m very happy to reveal that the hard-working team at Snail Games has just released their brand new DirectX 12 tech video.

The new video demonstrates the power of the new DirectX 12 API, which, combined with the graphical powerhouse that is CryEngine and the already gorgeous-looking online game King of Wushu, is certainly a real treat for your polygon-loving eyes.

The game already takes advantage of features such as NVIDIA GameWorks for HairWorks and PhysX Clothing, but with DirectX 12 the game can push more polygons, increased draw calls and more on-screen characters while keeping within the same level of demand on the GPU hardware, which is nothing short of incredible.

Of course, you won’t actually be able to just update DirectX and enjoy the game in its new form yet, as DirectX 12 will be launching on Windows 10 later this year; something that simply cannot come soon enough!

https://youtu.be/sV9aa_12Ap0

Thank you DSO for providing us with this information.

Microsoft Talks About Using AMD and Nvidia GPUs Simultaneously in DirectX 12

It’s no secret that DirectX 12 is going to make a huge change in the way we game, it’s going to bring improved performance, optimisation and a whole lot more when it launches alongside Windows 10 later this year. What is interesting, however, is how the API will be able to utilize all available hardware, even if it’s from different hardware manufacturers, something that hasn’t really been hit upon with current generation hardware and software.

DirectX 12 is seeing widespread adoption, at a rate that hasn’t been seen since the switch to DirectX 9. "It’s been the fastest adopted API in more than a decade," said Microsoft’s Principal DirectX Development Lead Max McMullen.

McMullen also went on to explain the DX12 Multiadapter functionality, which will support both linked and unlinked GPUs. Linked is what we’re familiar with, Crossfire and SLI, but unlinked is quite different, as it’ll accept GPU hardware from Intel, AMD and NVIDIA simultaneously and even allow for the use of each card’s unique feature set. "This feature exposes direct control of two linked GPUs as well. So you might have known as Crossfire or SLI from AMD and NVIDIA, respectively, but you’ll get direct control of both GPUs in DX12 if you so choose," said McMullen.

The discretion lies with the developer: if they code their game or software to handle both types of hardware at the same time, it can be done. What this means for gamers remains to be seen, but the prospect of upgrading to a new GPU and not having to get rid of your old one is no bad thing, allowing you to use all the rendering power at your disposal. Even more importantly, it’ll allow independent memory allocation across linked and unlinked GPUs, giving you the VRAM of all cards, something that is unavailable in current multi-GPU configurations.
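To make the unlinked multiadapter idea a little more concrete, here is a rough C++ sketch, not Microsoft’s sample code, of how a DX12 application can see every GPU in the machine by enumerating DXGI adapters and creating a device on each; how the rendering work is then divided between those devices is entirely down to the developer, exactly as described above.

```cpp
// Sketch of DX12 "unlinked" multiadapter: enumerate every hardware adapter,
// regardless of vendor, and create a D3D12 device on each one. Splitting the
// frame between the devices (and copying results) is the application's job.
// Link against d3d12.lib and dxgi.lib.
#include <windows.h>
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateDevicesOnAllAdapters()
{
    std::vector<ComPtr<ID3D12Device>> devices;

    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND;
         ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the software rasterizer

        // Intel, AMD and NVIDIA adapters all end up in the same list; the
        // application decides how to divide the work between them.
        ComPtr<ID3D12Device> device;
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
```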

Thank you DSO for providing us with this information.

DirectX 12 Can Push 6 to 12 Times the Polygons of DirectX 11

Microsoft is eager to show the advancements made with DirectX 12 and so far, things have been exceedingly positive. We’ve seen that DirectX 12 can handle a lot more draw calls, more polygons, improved frame rates, more advanced graphical features, a list of benefits that just goes on and on. Now, at Microsoft’s Build 2015, they’re eager to show it off again.

The demo system was no slouch: a quad-SLI Digital Storm gaming PC running the popular 2012 Final Fantasy Agni’s Philosophy technical demo. When this demo was run on DirectX 11 on the same hardware, it was impressive, no doubt about it, but DirectX 12 ran circles around it. The DirectX 12 version was able to run with 63 million polygons, which is between 6 and 12 times as many as were shown on the DirectX 11 version of the demo, dependent on which scene was being rendered at the time.

“As a part of a research project that studies a variety of next generation technologies, Square Enix conducted extensive research on real-time CG technology utilizing DirectX 12 in collaboration with Microsoft and NVIDIA Corporation.  Revealed today as a technology demonstration titled, WITCH CHAPTER 0 [cry], the results of this research will be incorporated into Square Enix’s LUMINOUS STUDIO engine, and is intended for use in future game development.”

Not only that, but Microsoft were able to move freely through the scene in real-time, adjust the lighting, as well as show off the new 8K x 8K textures, which Microsoft said was “Significantly more than we were able to do [with DX11].” They were also keen to show off that “Every piece of hair that you’re seeing is being rendered as a polygon” while using more than 50 shaders for that effect alone; incredible!

WITCH CHAPTER 0 [cry] has achieved some of the world’s highest-level quality of real-time CG. It portrays the human emotion of crying – one of the most difficult representations for existing real-time CG technology. The emotion is displayed at a level of quality which has never been seen before with a real-time CG animated character teeming with life.

WITCH CHAPTER 0 [cry] was created mainly by the developers of AGNI’S PHILOSOPHY – FINAL FANTASY REALTIME TECH DEMO, a tech demo for LUMINOUS STUDIO revealed in June 2012. Square Enix research enabled the real-time CG to feature even more refined graphics and improved processing capabilities in WITCH CHAPTER 0 [cry].

There’s just one thing left to add to this, HURRY UP AND RELEASE DIRECTX 12 ALREADY!

ExecuteIndirect Command in DirectX 12 Brings Improved Performance and Low CPU Usage

Microsoft has revealed a new indirect draw and dispatch solution that can be used by all DirectX 12 compatible hardware, completely replacing the DrawIndirect and DispatchIndirect commands. The company says that this solution will bring ‘major performance improvements to the already incredible performance that DX12 can achieve’.

ExecuteIndirect is said to perform multiple draws with a single API call, and gives both the CPU and the GPU the ability to control draw calls, as well as change bindings between draw calls. Principal Development Lead for Direct3D and DXGI at Microsoft, Max McMullen, demoed the new feature at GDC with the help of Intel’s Asteroids benchmark.
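To give a rough idea of the pattern, here is a minimal C++ sketch of ExecuteIndirect usage; the buffer parameters and surrounding setup are assumptions for illustration only, not the code McMullen demoed.

```cpp
// Sketch of the ExecuteIndirect pattern: describe the per-draw argument
// layout once (the "command signature"), then issue every draw with a single
// API call whose arguments live in a GPU buffer. Assumes the device, command
// list and buffers already exist; error handling omitted. Link: d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

void DrawManyWithOneCall(ID3D12Device* device,
                         ID3D12GraphicsCommandList* cmdList,
                         ID3D12Resource* argumentBuffer, // array of D3D12_DRAW_ARGUMENTS
                         ID3D12Resource* countBuffer,    // actual draw count, GPU-writable
                         UINT maxDraws)
{
    // Each indirect command is a plain (non-indexed) draw.
    D3D12_INDIRECT_ARGUMENT_DESC arg = {};
    arg.Type = D3D12_INDIRECT_ARGUMENT_TYPE_DRAW;

    D3D12_COMMAND_SIGNATURE_DESC sigDesc = {};
    sigDesc.ByteStride       = sizeof(D3D12_DRAW_ARGUMENTS);
    sigDesc.NumArgumentDescs = 1;
    sigDesc.pArgumentDescs   = &arg;

    ComPtr<ID3D12CommandSignature> signature;
    device->CreateCommandSignature(&sigDesc, nullptr, IID_PPV_ARGS(&signature));

    // One call replaces up to maxDraws individual Draw calls; the GPU reads
    // the real count from countBuffer, so draws can even be culled on-GPU.
    cmdList->ExecuteIndirect(signature.Get(), maxDraws,
                             argumentBuffer, 0, countBuffer, 0);
}
```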

McMullen first demoed the DX11 version, where the benchmark came in at only 29 FPS. Switching to DX12, however, an outstanding 75 FPS result was achieved, with a further 4 to 6 FPS increase when bindless mode was added to the equation. For those unaware, bindless mode is a DX12 feature which can pre-bake all of the texture bindings used in the application. Microsoft states that the feature is "a major efficiency improvement in how the GPU is running".
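As a rough illustration of what “bindless” amounts to in D3D12: all of an application’s texture descriptors can be pre-baked into one large shader-visible heap that is bound once per frame, instead of rebinding textures between draws. The heap size and function names below are our own and purely illustrative.

```cpp
// Rough sketch of the "bindless" idea: one big shader-visible descriptor heap
// holding every texture descriptor up front, bound once instead of per draw.
// Link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

ComPtr<ID3D12DescriptorHeap> CreateTextureHeap(ID3D12Device* device,
                                               UINT textureCount)
{
    D3D12_DESCRIPTOR_HEAP_DESC desc = {};
    desc.Type           = D3D12_DESCRIPTOR_HEAP_TYPE_CBV_SRV_UAV;
    desc.NumDescriptors = textureCount; // every texture descriptor, up front
    desc.Flags          = D3D12_DESCRIPTOR_HEAP_FLAG_SHADER_VISIBLE;

    ComPtr<ID3D12DescriptorHeap> heap;
    device->CreateDescriptorHeap(&desc, IID_PPV_ARGS(&heap));
    return heap;
}

void BindOncePerFrame(ID3D12GraphicsCommandList* cmdList,
                      ID3D12DescriptorHeap* textureHeap)
{
    // One heap bound for the whole frame; shaders index into it freely, so
    // individual draws no longer pay a per-texture binding cost.
    ID3D12DescriptorHeap* heaps[] = { textureHeap };
    cmdList->SetDescriptorHeaps(1, heaps);
}
```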

Lastly, when switching to ExecuteIndirect, an epic 90 FPS result was achieved in the benchmark. This is where we see a significant reduction of CPU usage as well compared to the previous two DX12 benchmarks, making the feature one of the best solutions for delivering high-quality graphics at the lowest possible hardware usage. Now it remains to be seen if developers can take full advantage of Microsoft’s new feature.

Thank you DSOGaming for providing us with this information.

New Tool Coming To Benchmark DirectX 12, 11 and Mantle

We saw some impressive figures from Microsoft this week, making huge promises about the performance of DirectX 12. However, you’ll soon be able to test out the new API for yourself on your own hardware.

Futuremark will be adding a new update to their 3DMark suite called “API Overhead Feature Test” that will let you test DirectX 12 against the current DirectX 11, as well as the AMD Mantle API.

“Games make thousands of draw calls per frame, but each one creates performance-limiting overhead for the CPU. APIs with less overhead can handle more draw calls and produce richer visuals. The 3DMark API Overhead feature test is the world’s first independent test for comparing the performance of DirectX 12, Mantle, and DirectX 11. See how many draw calls your PC can handle with each API before the frame rate drops below 30 fps.”
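In other words, the test is essentially a draw-call throughput loop. A toy C++ sketch of the idea, which has nothing to do with Futuremark’s actual implementation and uses a purely hypothetical SubmitDrawCall() stand-in, might look like this:

```cpp
// Toy illustration of an API overhead test: keep adding draw calls per frame
// until submitting them no longer fits inside the 33.3 ms budget of 30 fps.
// SubmitDrawCall() is a hypothetical stand-in for a real DX11/DX12/Mantle draw.
#include <chrono>
#include <cstdio>

volatile long long g_submitted = 0;                  // stops the loop being optimised away
void SubmitDrawCall() { g_submitted = g_submitted + 1; }

int main()
{
    using clock = std::chrono::steady_clock;
    const double frameBudgetMs = 1000.0 / 30.0;      // the 30 fps floor from the quote
    long long drawsPerFrame = 1000;

    while (true)
    {
        auto start = clock::now();
        for (long long i = 0; i < drawsPerFrame; ++i)
            SubmitDrawCall();
        double frameMs =
            std::chrono::duration<double, std::milli>(clock::now() - start).count();

        if (frameMs > frameBudgetMs)
            break;              // submission overhead is now the bottleneck
        drawsPerFrame *= 2;     // otherwise push harder next frame
    }

    std::printf("Hit the limit at roughly %lld simulated draw calls per frame\n",
                drawsPerFrame);
    return 0;
}
```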

No exact release date just yet, but we do know it will be “coming soon.”

DirectX 11 vs DirectX 12 Performance Slides Revealed

DirectX 12 is on its way, and it promises to bring massive performance improvements for PC gaming. The new API will achieve improved performance and efficiency by using multiple CPU cores more effectively than the current DirectX 11 API.

As the slides above show, far more load is placed on the first core under the D3D11 API, while D3D12 spreads that work across the other cores. What does this mean for the end user? Well, for one, the frame took 9ms to render in DirectX 11, but only 4ms in DirectX 12. By sharing the workload more evenly across multiple cores, the render time is greatly reduced and the strain on the hardware is also reduced.
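To illustrate why the load spreads out, here is a bare-bones C++ sketch of DX12-style parallel command list recording; the structure mirrors what the slides describe, but the function and variable names are our own and the actual draw recording is left out.

```cpp
// Sketch of why DX12 spreads CPU load across cores: each worker thread records
// its own command list with its own allocator, and the main thread submits
// them all at once. Illustrative only; error handling and the real draw
// recording are omitted. Link against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

void RecordFrameInParallel(ID3D12Device* device,
                           ID3D12CommandQueue* queue,
                           unsigned workerCount)
{
    std::vector<ComPtr<ID3D12CommandAllocator>>    allocators(workerCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workerCount);
    std::vector<std::thread> workers;

    for (unsigned i = 0; i < workerCount; ++i)
    {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocators[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocators[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));

        // Each thread fills its own list: no driver lock, no single hot core.
        workers.emplace_back([&lists, i] {
            // ... record this thread's share of the frame's draw calls ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers)
        w.join();

    // One cheap submission of everything the workers recorded.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists)
        raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
```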

I really can’t wait for the official release of the DirectX 12 API, especially since Epic, Crytek and more are already pushing to utilize the new technology.

Thank you Littletinyfrogs for providing us with this information.

Nvidia Speak Their Mind About Mantle

In a recent podcast from MaximumPC, Nvidia finally spoke out about AMD’s Mantle and, as best we can tell, they couldn’t care less about it. Nvidia engineer Tom Petersen and Senior Director of Engineering Rev Lebaredian were on hand to discuss the topic, and according to them there are no big benefits to using Mantle.

What is interesting is that we here at eTeknix know for a fact there are tangible benefits to using Mantle, but Nvidia may just be downplaying the gains to save their own asses, especially given that they’ll be pegging their performance gains on DirectX 12, which is likely capable of rendering Mantle obsolete.

“We don’t know much about Mantle, we are not part of Mantle. And clearly if they see value there they should go for it. And if they can convince game developers to go for it, go for it. It’s not an Nvidia thing. The key thing is to develop great technologies that deliver benefits to gamers. Now in the case of Mantle it’s not so clear to me that there is a lot of obvious benefits there.”

“It’s possible to pull performance out of DirectX, we’re approving that, and so you can argue that maybe it’s not a good idea to put extra effort into yet another API that does the same thing essentially. Feature wise there is nothing more.”

“DX12 is coming and a lot of the features, the benefits of having a lower level API (the extra calls and stuff), it’s going to be in DX12.”

From what we’ve seen, Mantle can really pull a lot of extra performance out of slower hardware, although if you’ve got a GTX 780 Ti or R9 290X powered system, gaining a few FPS is hardly going to be noticeable anyway. Ball is in your court, Nvidia; time to stop talking and get your API in line, then we’ll see if Mantle really is a waste of time.

[youtube width="800" height="450"]http://youtu.be/aG2kIUerD4c[/youtube]

Thank you MaximumPC for providing us with this information.

AMD Hints at Steam OS / Linux Release of Their Mantle API

AMD have fired some shots over at the Nvidia team with a recent statement on their blog. They imply that AMD will have Mantle working on Linux long before Nvidia sees any benefit from the upcoming DirectX 12 API, giving AMD an upper hand as Linux gains strength, especially in light of the Linux-based SteamOS.

SteamOS and Mantle are a perfect pairing, as the lightweight OS and the to-the-metal nature of the API could offer tangible performance gains for gamers. The same is true of DirectX 12, but with Mantle having been put into practice much sooner than Microsoft’s new API, it certainly gives AMD a head start; not to mention it could be as much as 18 months before we see DirectX 12 implemented in AAA titles, while Mantle is already here.

“On March 20 Microsoft announced DirectX® 12, the next major evolution of its own game API. This is terrific news because it really draws attention to the value of low-level programming and Mantle’s leading contribution. With DirectX 12 games still over 18 months away and no alternatives in sight for Linux gamers, Mantle’s future looks bright” said AMD on their blog.

The important part to take away from that is "no alternatives in sight for Linux gamers, Mantle’s future looks bright." AMD are practically spelling it out here: they’re working on Linux, and it could be what it really takes to push Linux towards being a more fully fledged gaming platform. Either way, it doesn’t sound like we’ll have to wait too long to find out more.

Thank you WCCFTech for providing us with this information.

Thief Mantle API Support Delayed Until First Patch In March, Brings TrueAudio as Well

There have been rumors about Thief getting Mantle support, and it appears that Eidos Montreal has finally made it official. However, players should not expect AMD’s Mantle API at launch; Eidos Montreal has confirmed that Thief will launch with DirectX support only.

However, a patch scheduled for release in March will bring Mantle support, as well as TrueAudio. This will make Thief the first game to use the dedicated DSP found on GCN 1.1-based graphics cards like the Radeon R9 290, R9 290X, R7 260 and R7 260X. The TrueAudio technology is designed to shift advanced audio processing from the CPU to the GPU’s DSP, giving both better performance and more immersive audio experiences.

“We’re confident this patch will ensure the best and fastest Thief experience for AMD Radeon customers,” Eidos Montreal stated. “We’re sorry we couldn’t bring this to you sooner – although we will use this time to bring you the very best experience possible and will let you know when the patch is ready.”

Eidos was known to have partnered with AMD back in November to make Thief a Mantle-supported title. Other games announcing support for the Mantle API include Star Citizen and Sniper Elite 3, while EA’s Battlefield 4 already received the update a while ago.

Thank you TechSpot for providing us with this information.