Ashes of the Singularity is a futuristic real-time strategy game offering frenetic contests on a large scale. The huge number of units scattered across varied environments creates an enthralling experience built around complex strategic decisions. Throughout the game, you will explore unique planets and engage in epic air battles. This bitter war between the human race and a masterful artificial intelligence revolves around an invaluable resource known as Turinium. If you’re into the RTS genre, Ashes of the Singularity should provide hours of entertainment. While the game itself is worthy of widespread media attention, the engine’s support for DirectX 12 and asynchronous compute has become a hot topic among hardware enthusiasts.
DirectX 12 is a low-level API with reduced CPU overhead and has the potential to revolutionise the way games are optimised for numerous hardware configurations. By contrast, DirectX 11 is far less efficient, and many mainstream titles suffered from poor scaling that failed to properly utilise the potential of current graphics technology. On another note, DirectX 12 allows users to pair GPUs from competing vendors and utilise multi-GPU solutions without relying on driver profiles. It’s theoretically possible to achieve widespread optimisation and leverage extra performance using the latest version of the API.
Of course, Vulkan is another alternative, which works across operating systems and adopts an open-source philosophy. However, the focus will likely remain on DirectX 12 for the foreseeable future unless there’s a sudden reluctance among users to upgrade to Windows 10. Even though the adoption rate is impressive, a large number of PC gamers are still using Windows 7, 8 and 8.1. Therefore, it seems prudent for developers to continue with DirectX 11 and offer a DirectX 12 renderer as an optional extra. Arguably, the real gains from DirectX 12 will occur when its predecessor is disregarded completely. This will probably take a considerable amount of time, which suggests the first DirectX 12 games might show smaller performance benefits than later titles.
Asynchronous compute allows graphics cards to calculate multiple workloads simultaneously and extract extra performance. AMD’s GCN architecture has extensive support for this technology. In contrast, there’s a heated debate questioning whether NVIDIA products can utilise asynchronous compute in an effective manner at all. Technically, AMD GCN graphics cards contain 2-8 asynchronous compute engines with 8 queues each, varying by model, to provide single-cycle latencies. Maxwell revolves around two pipelines: one designed for high-priority workloads and another with 31 queues. Most importantly, NVIDIA cards can only “switch contexts at draw call boundaries”. This means the switching process is slower, which gives AMD a major advantage. NVIDIA dismissed the early performance numbers from Ashes of the Singularity because the game was still in development. Now that the release has exited the beta stage, we can examine the performance numbers after optimisations were completed.
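The performance case for those extra queues is overlap: independent compute tasks can run alongside graphics work instead of being serialized behind it. A back-of-the-envelope Python sketch (all timings are hypothetical, invented purely for illustration) shows the idea:

```python
# Illustrative model with made-up numbers: frame time when compute work is
# serialized behind graphics versus overlapped on separate queues.

graphics_ms = 12.0                   # time the graphics queue needs per frame
compute_tasks_ms = [1.5, 2.0, 0.5]   # independent compute workloads

# Serialized: every compute task waits for the graphics work (and vice versa),
# as on hardware that can only switch contexts at draw call boundaries.
serialized = graphics_ms + sum(compute_tasks_ms)

# Overlapped: compute runs on dedicated queues alongside graphics, so the
# frame is bound by whichever side takes longer (ideal case, no contention).
overlapped = max(graphics_ms, sum(compute_tasks_ms))

print(f"serialized: {serialized:.1f} ms, overlapped: {overlapped:.1f} ms")
# serialized: 16.0 ms, overlapped: 12.0 ms
```

In the ideal case the compute work hides entirely inside the graphics time; real gains depend on how the two workloads contend for shared execution resources, which is exactly what the vendors are arguing about.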
One of the biggest changes DX12 brings to the table is the increased reliance on developers to properly optimize their code for GPUs. Unlike DX11, there will be fewer levers to tweak in the GPU driver, with more of the work needing to happen in the game engine and the game itself. To address this, AMD has announced a partnership with multiple game engine and game developers to implement DX12.
To kick-start the effort, AMD is headlining five games and engines it is partnering with to ensure DX12 works smoothly on Radeon GPUs. These are Ashes of the Singularity by Stardock and Oxide Games, Total War: WARHAMMER by Creative Assembly, Battlezone VR by Rebellion, Deus Ex: Mankind Divided by Eidos-Montréal, and the Nitrous Engine by Oxide Games. These titles span a wide range from RTS to RPG and FPS, which gives the sense that AMD is trying to cast as wide a net as possible.
In addition, AMD will also be working with EA and DICE to enable DX12 in the Frostbite 3 engine. This engine is of particular importance due to the many AAA titles from EA and others that use it. AMD is also hoping to push asynchronous compute to make sure games squeeze the most out of GCN under DX12.
Stardock has revealed that it is developing a unique software solution that will allow GPUs from different vendors to be used in unison. While DirectX 12 already boasts such support – though the only game that supports it as yet is Stardock’s own Ashes of the Singularity – Stardock CEO Brad Wardell says that his solution will open this option up to everyone.
“One of the biggest problems with games is that a new video card comes out from AMD and Nvidia, and they’re like [expensive], and you have to make a call,” Wardell told Venturebeat. “I like my video card. I can play most games on it, and I don’t want to spend $800 on some new video card. But imagine, instead, hey, they’re having a sale [using my GTX 760 as an example]. Hey, they’re having a sale on an AMD 290 for $75. Wouldn’t it be cool to put this into your computer and double your performance. You keep this in there [the 760]. You put this in there [the 290], and your games are twice as fast without doing anything else.”
Wardell says that his company has been working with NVIDIA and AMD on the solution for the past year and that, while the two video card giants aren’t necessarily happy at the idea of their hardware being combined with that of their competitors, they certainly approve of anything that means more people will buy their products.
“They don’t love that part [mixing competing cards in one PC], but [what they do love] is the idea that people will buy more cards,” Wardell added. “It’s a major friction where someone says, ‘I have a card that works. I’m not going to spend $800.’ They don’t get the sale. But you’re going to get the same effect by adding [an] $80 video card [to your existing setup].”
More news on Stardock’s new multi-GPU software is expected to be revealed by Microsoft at GDC 2016 this week.
NVIDIA has released the 364.47 WHQL Driver for its GeForce graphics cards, bringing with it Game Ready optimisation for Tom Clancy’s The Division, Hitman, Need for Speed, Ashes of the Singularity, and Rise of the Tomb Raider, plus fixes for many known issues with Windows 10.
The first public beta of Stardock’s real-time sci-fi strategy game, Ashes of the Singularity, has been released. Ashes of the Singularity is the first game to support DirectX 12 natively, powered by the Nitrous engine, which is said to be able to handle busy screens with interactive and visual complexity.
“In Ashes of the Singularity, gamers aren’t fighting a battle, they’re fighting a war,” Brad Wardell, CEO of Stardock, said. “Players command thousands of units across a vast battlefield while building up their economic and technological might.”
“Over the past few months we’ve worked closely with AMD and NVIDIA to fully leverage their hardware,” he added. “Our alpha testers have reported substantial performance gains, which is allowing us to begin lowering the hardware requirements.”
According to the announcement of the public beta on the official website, the game boasts:
The first native DirectX 12 game allowing each CPU core to command the player’s GPU simultaneously, which allows for an order of magnitude more rendered units to be on screen at the same time than in previous RTS games.
A multi-core real-time strategy AI that allows for excellent single player RTS gaming.
A new native 64-bit 3D engine called Nitrous that makes full use of the features of DirectX 11 and DirectX 12, allowing for thousands of light sources on screen simultaneously.
A new type of unit group organization known as a “meta” unit that makes it easy for players to manage potentially tens of thousands of units across a world.
An advanced Nitrous 3D engine that allows players to zoom out on the map without it transforming into a simplified view of the battlefield.
“Our goal with Ashes is to help introduce a new generation of gamers to real-time strategy games,” Wardell said. “We want to make a game where players can invite their friends in and be up and playing relatively quickly without a lengthy explanation about how to play.”
The Ashes of the Singularity public beta is available now from Steam and GOG.
While much of the talk around DX12 recently has been around the reduced CPU/driver overhead and async compute, another feature is getting its first real-world test. Dubbed Explicit Multi-Adapter in Microsoft’s material, the feature allows multiple GPUs that support DX12, irrespective of vendor, to work together on the same task. Developer Oxide has created a Multi-Adapter demo from their now famous Ashes of the Singularity title, using the in-house Nitrous engine.
While DX12 continues to let GPU drivers support multi-GPU setups like SLI and Crossfire, Microsoft has built a more powerful feature directly into DX12. This means that if the developer takes the time and effort to implement it, any DX12 title can let any two DX12 cards work together and boost performance. This is exactly what Anandtech tested when Oxide provided a custom build of Ashes of the Singularity with early support.
Using the built-in DX12 Multi-Adapter, top-end cards like the Fury X, 980 Ti and Titan X were able to show gains of between 46% and 72%. While lower than the roughly 80% gains Crossfire can offer, this is remarkable considering it pairs two cards with vastly different architectures, at times from two different vendors. Interestingly, combinations with the Fury X as the primary card outdid those with the NVIDIA card as the main one, even when the Titan X was used. A similar pattern held with older cards like the 7970 and the 680, although the 680+7970 pairing actually did worse than either the 680 or the 7970 alone. This may be due to AMD’s GCN architecture being inherently better suited to the task, but it’s still early days.
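For readers wanting to check the arithmetic behind those percentages, a tiny Python helper (using hypothetical frame rates, not Anandtech's raw data) shows how a multi-GPU "gain" figure relates to scaling efficiency:

```python
# Hypothetical arithmetic: a multi-GPU "gain" is the extra frame rate
# relative to running on the faster single card alone.

def mgpu_gain(single_fps: float, paired_fps: float) -> float:
    """Percentage gain of a two-card pairing over the single card."""
    return (paired_fps / single_fps - 1.0) * 100.0

single = 50.0   # hypothetical single-card fps
paired = 86.0   # hypothetical fps for the pairing (top of the 46-72% range)
print(f"gain: {mgpu_gain(single, paired):.0f}%")   # gain: 72%

# Perfect scaling would be +100%, so Crossfire's ~80% and Explicit
# Multi-Adapter's 46-72% correspond to getting roughly 0.8 versus
# 0.46-0.72 of a second card's worth of performance.
```

This also makes clear why mixed-vendor pairings are impressive at all: even 46% means nearly half a second GPU's performance recovered from hardware the driver was never profiled for.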
If developers choose to make use of this feature, it could deliver big performance boosts in the future. Instead of having to buy two of the same card, gamers can pair one higher-performance card with a lower-end one. When it comes time to upgrade, the weaker card can be tossed out while a new top-tier card takes over from the old master card. This even extends to pairing up mostly unused iGPUs to get that extra bit of eye candy and fps. With control in the hands of developers rather than hardware vendors, it will be interesting to see whether this feature takes off.
AMD has released the Catalyst 15.8 driver which includes a number of hotfixes and enhanced performance in Batman: Arkham Knight and Ashes of the Singularity. Here is the full list of optimizations:
Highlights of AMD Catalyst 15.8 Beta Windows Driver
This driver provides support for the Oculus 0.7 SDK on Windows 7, Windows 8.1 and Windows 10. More information on the Oculus 0.7 SDK can be found at the following link on the Oculus Developer site:
Batman: Arkham Knight – Performance and quality/stability updates
Ashes of the Singularity – Performance optimizations for DirectX® 12
 Adobe® Lightroom may crash if GPU rendering is enabled
 Mouse cursor coordinates may be swapped on some 3×1 Eyefinity configurations
 The Witcher® 3: Wild Hunt – Corruption may be observed when AA is enabled in AMD CrossFire mode
 The Firefox browser may crash while opening multiple tabs (2 or more)
 Anti Aliasing settings are not retained after Applying in the AMD Catalyst Control Center
 System hangs when launching Call of Duty® – Modern Warfare 3 or Diablo III
 Metal Gear Solid® : Ground Zero may crash when launched
 Call Of Duty®: Black Ops III – texture corruption observed when launched in DirectX® 11 mode
 Sword Coast Legends – FRTC settings are not activated in the game
 Project CARS may experience corruption when AA is set to D2SM
 A green screen may be observed on some “Llano”/”Ontario” APU’s when playing video under Windows 10
 Watch Dogs may experience flickering / corruption after changing game resolution
 AMD HDMI® Audio is disabled after driver installation
 F1 2015 may experience flickering during gameplay or in game benchmarking
 Call of Duty® – Advanced Warfare may freeze randomly when run in Quad CrossFire mode
 Text corruption observed when using the Windows 10 Maps application
 System may hang when installing the driver on a Windows 10 system with 2 or more GPUs
 “Device being used by another application” error is displayed when attempting audio playback on Windows 7 systems
 Unable to create an Eyefinity SLS if one of the displays is a MST display device
 Unable to apply Fill mode in Eyefinity if 2560×1600 and 2560×1440 resolutions are used together
 DiRT Rally crashes during gameplay and benchmarking when launched in DirectX 11® mode on some BENQ 144HZ Freesync monitors
 Mad Max – Color corruption is observed when Alt+Ctrl+Del is pressed followed by the Escape key
 Battlefield Hardline crashes on pressing Ctrl+Alt+Del while running in AMD Mantle mode
 Corruption may occur in DiRT Rally with CMAA enabled with Portrait SLS and AMD CrossFire mode enabled
 Windows 10 driver installation may halt on some systems with an AMD 990FX chipset and AMD CrossFire enabled
As a temporary workaround, please uninstall the existing driver before installing the AMD Catalyst 15.8 Beta driver
 Some BENQ 144hz Freesync monitors may lose the signal while uninstalling the driver
 Assassin’s Creed® Unity may experience minor frame stutter when AMD CrossFire mode is enabled
Please note, the driver is still in a Beta stage, meaning there could be instability issues or undocumented bugs. Nevertheless, it’s promising to see updates for Batman: Arkham Knight, which is not only a terrible PC port but also prone to crashing on AMD graphics cards. On another note, it will be interesting to see the effect these optimizations have on the Ashes of the Singularity DirectX 12 patch. Will the trend of significant gains on AMD cards continue?
Do you download Beta drivers or patiently wait for the final release?
The Ashes of the Singularity DirectX 12 benchmark results have caused a great deal of animosity between AMD and NVIDIA. Previously, one of Oxide’s developers commented on the recent furore and suggested NVIDIA GPUs would struggle to utilize async compute, which should hand AMD the greater gains under DirectX 12. Unsurprisingly, this viewpoint has been categorically shared by AMD’s Technical Marketing Lead, Robert Hallock:
“Oxide effectively summarized my thoughts on the matter. NVIDIA claims “full support” for DX12, but conveniently ignores that Maxwell is utterly incapable of performing asynchronous compute without heavy reliance on slow context switching. GCN has supported async shading since its inception, and it did so because we hoped and expected that gaming would lean into these workloads heavily. Mantle, Vulkan and DX12 all do. The consoles do (with gusto). PC games are chock full of compute-driven effects. If memory serves, GCN has higher FLOPS/mm2 than any other architecture, and GCN is once again showing its prowess when utilized with common-sense workloads that are appropriate for the design of the architecture.”
In basic terms, it appears that async compute on Maxwell relies on context switching, meaning thread state is stored to allow for later execution. A good analogy is hyper-threading, which works in a similar way. This process carries a large computational cost, and it’s unclear whether Maxwell’s core architecture will receive any benefit from async compute at all. Supposedly, using this method can actually result in poorer raw performance.
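A simple cost model, again with invented numbers, illustrates why switching only at draw call boundaries could hurt: every graphics-to-compute transition pays a fixed stall, whereas a fine-grained hardware scheduler pays essentially none.

```python
# Hypothetical cost model: coarse context switching adds a fixed stall per
# graphics<->compute transition; fine-grained scheduling pays (almost) none.
# All three inputs below are assumptions, not measured figures.

switch_cost_ms = 0.1    # assumed cost of one coarse context switch
swaps_per_frame = 40    # assumed graphics<->compute transitions per frame
useful_work_ms = 14.0   # assumed frame workload with zero switch overhead

coarse = useful_work_ms + swaps_per_frame * switch_cost_ms
fine = useful_work_ms   # dedicated hardware queues: no boundary stalls

print(f"coarse: {coarse:.1f} ms/frame, fine-grained: {fine:.1f} ms/frame")
```

Under these made-up numbers the coarse scheme loses 4 ms per frame; the real penalty depends on hardware details that neither vendor has published, which is precisely why the debate remains unresolved.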
WCCFTech received a fairly meagre response from NVIDIA on the matter, with PR manager Brian Burke saying:
“We’re glad to see DirectX 12 titles showing up. There are many titles with DirectX 12 coming before the end of the year and we are excited to see them.”
With DirectX 12 representing such a radical shift in API design, a war of words based upon contrasting technological approaches was bound to occur. It will be fascinating to see whether the trend towards AMD gains holds across many DirectX 12 titles. Arguably, NVIDIA feels this will prove an isolated incident.
Thank you WCCFTech for providing us with this information.
AMD and NVIDIA have engaged in a fairly bitter dispute after the Ashes of the Singularity DirectX 12 benchmarks indicated a sharp increase in performance in AMD’s favour. One could argue this is due to the similarity between DirectX 12 and Mantle, but it seems async compute is at the heart of the discrepancy. NVIDIA was quick to dismiss the figures and suggested that early benchmarks will not reflect typical DirectX 12 performance gains. Only time will tell if this is true, but one of Oxide’s leading developers commented on the furore and tried to give an explanation:
“Wow, there are lots of posts here, so I’ll only respond to the last one. The interest in this subject is higher than we thought. The primary evolution of the benchmark is for our own internal testing, so it’s pretty important that it be representative of the gameplay. To keep things clean, I’m not going to make very many comments on the concept of bias and fairness, as it can completely go down a rat hole.
Certainly I could see how one might see that we are working closer with one hardware vendor than the other, but the numbers don’t really bear that out. Since we’ve started, I think we’ve had about 3 site visits from NVidia, 3 from AMD, and 2 from Intel (and 0 from Microsoft, but they never come visit anyone ;(). Nvidia was actually a far more active collaborator over the summer than AMD was. If you judged from email traffic and code check-ins, you’d draw the conclusion we were working closer with Nvidia rather than AMD. As you’ve pointed out, there does exist a marketing agreement between Stardock (our publisher) and AMD for Ashes. But this is typical of almost every major PC game I’ve ever worked on (Civ 5 had a marketing agreement with NVidia, for example). Without getting into the specifics, I believe the primary goal of AMD is to promote D3D12 titles, as they have also lined up a few other D3D12 games.
If you use this metric, however, given Nvidia’s promotions with Unreal (and integration with Gameworks) you’d have to say that every Unreal game is biased, not to mention virtually every game that’s commonly used as a benchmark, since most of them have a promotion agreement with someone. Certainly, one might argue that Unreal being an engine with many titles should give it particular weight, and I wouldn’t disagree. However, Ashes is not the only game being developed with Nitrous. It is also being used in several additional titles right now, the only announced one being the Star Control reboot. (Which I am super excited about! But that’s a completely different topic.)
Personally, I think one could just as easily make the claim that we were biased toward Nvidia, as the only ‘vendor’ specific code is for Nvidia, where we had to shut down async compute. By vendor specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature as functional, but attempting to use it was an unmitigated disaster in terms of performance and conformance, so we shut it down on their hardware. As far as I know, Maxwell doesn’t really have async compute, so I don’t know why their driver was trying to expose it. The only other difference between them is that Nvidia falls into Tier 2 class binding hardware instead of Tier 3 like AMD, which requires a little more CPU overhead in D3D12, but I don’t think it ended up being very significant. This isn’t a vendor specific path, as it’s responding to capabilities the driver reports.
From our perspective, one of the surprising things about the results is just how good Nvidia’s DX11 perf is. But that’s a very recent development, with huge CPU perf improvements over the last month. Still, DX12 CPU overhead is far, far better than DX11 on Nvidia, and we haven’t even tuned it as much as DX11. The other surprise is the minimum frame times, with the 290X beating out the 980 Ti (as reported on Ars Technica). Unlike DX11, minimum frame times are mostly an application-controlled feature, so I was expecting them to be close to identical. This would appear to be GPU-side variance, rather than software variance. We’ll have to dig into this one.
I suspect that one thing helping AMD on GPU performance is that D3D12 exposes async compute, which D3D11 did not. Ashes uses a modest amount of it, which gave us a noticeable perf improvement. It was mostly opportunistic, where we just took a few compute tasks we were already doing and made them asynchronous; Ashes really isn’t a poster child for advanced GCN features.”
“Our use of async compute, however, pales in comparison to some of the things the console guys are starting to do. Most of those haven’t made their way to the PC yet, but I’ve heard of developers getting 30% extra GPU performance by using async compute. Too early to tell, of course, but it could end up being pretty disruptive in a year or so as these GCN-built and optimized engines start coming to the PC. I don’t think Unreal titles will show this very much though, so likely we’ll have to wait to see. Has anyone profiled Ark yet?
In the end, I think everyone has to give AMD a lot of credit for not objecting to our collaborative effort with Nvidia even though the game had a marketing deal with them. They never once complained about it, and it certainly would have been within their rights to do so. (Complain, anyway; we would have still done it.)
P.S. There is no war of words between us and Nvidia. Nvidia made some incorrect statements, and at this point they will not dispute our position if you ask their PR. That is, they are not disputing anything in our blog. I believe the initial confusion was because Nvidia PR was putting pressure on us to disable certain settings in the benchmark; when we refused, I think they took it a little too personally.
AFAIK, Maxwell doesn’t support async compute, at least not natively. We disabled it at the request of Nvidia, as it was much slower to try to use it than not to.
Whether or not async compute is better is subjective, but it definitely does buy some performance on AMD’s hardware. Whether it is the right architectural decision for Maxwell, or is even relevant to its scheduler, is hard to say.”
Theoretically, AMD cards should reap greater rewards from DirectX 12 as upcoming games begin to take full advantage of async compute, a feature which is supposedly missing from NVIDIA’s line-up. As a result, it’s sensible to believe that AMD’s core architecture has the potential for greater gains. However, DirectX 12 is still an unknown quantity, and async compute could be only a small factor in the divergent performance numbers. Additionally, the Oxide spokesperson noted that Unreal Engine titles don’t lean heavily on async compute, so the difference there will be negligible.
NVIDIA’s DirectX 11 implementation is very impressive thanks to highly optimized drivers, giving its hardware a long-standing history of outperforming AMD at the software level. Now that AMD has a head start with DirectX 12, it’s possible the gap could close again as NVIDIA gets to grips with the new API. Whatever the case, the true impact of async compute is unknown, and it’s up to future games to show whether this is a real-world advantage or something restricted to synthetic benchmarks.
Stardock’s Brad Wardell, probably the biggest advocate for DirectX 12 outside of Microsoft, has released the first in-game footage from his forthcoming real-time strategy game Ashes of the Singularity, comparing performance between DX11 and DX12.
Ashes of the Singularity is powered by the Nitrous Engine, designed to handle busy screens with interactive and visual complexity, and by the looks of it, the game pushes that engine to its extreme.
The video below, posted to Wardell’s YouTube account, is a tad shaky…
…but, thankfully, Wardell produced a more professional-looking follow-up with AMD: