First DirectX 12 In-Game Benchmarks: A Level Playing Field? Drama And Framerates Collide

Sometimes I wonder if we should cook up a TMZ-style column explicitly for reporting on all the drama that transpires between AMD and NVIDIA. Today we’ve witnessed one of the first shots fired from one of those camps, on what’s poised to be a very heated battleground over the next several years: DirectX 12.

Over the weekend, the press was handed access to one of the world’s first DirectX 12 benchmarks, courtesy of Oxide’s Ashes of the Singularity. This isn’t a synthetic test like 3DMark’s API Overhead feature test, but a true real-world benchmark using a pre-Beta version of the upcoming game. And we’ve all been salivating over a way to measure real-world performance improvements from DirectX 11 to DirectX 12 since first learning about Microsoft’s updated API.

Aside from being our first glimpse into how an actual engine for an actual game performs under DirectX 12, it also reveals something crucially important to NVIDIA and AMD: how their drivers and hardware stack up under the new API. Yes, we’re talking about understandably immature drivers and an unreleased game, but it’s nevertheless the world’s first impression.


As some early results show, DX11-to-DX12 scaling on NVIDIA’s GTX 980 is almost nonexistent. Paired with Intel’s monster Core i7-5960X, the performance improvement on Low quality settings is 9%, and on High settings, curiously enough, there’s a decrease of 12%.

Meanwhile, those same tests on AMD’s Radeon R9 390X – also paired with the i7-5960X – illustrate a whopping 80% gain between DX11 and DX12. That’s not remotely the whole story, but it seems to be enough to put NVIDIA on the defensive. As you well know, data can be manipulated to satisfy many different arguments.

Shots Fired

Even though several other Ashes of the Singularity results cast NVIDIA in a very favorable light, the company’s PR department fired a statement into our inbox declaring any tests that use Oxide’s benchmark presets flat-out invalid: “Ashes of Singularity has an application-side bug for MSAA running the DX12 executable on GeForce GPUs. Note that MSAA is enabled by default when you select the ‘low’, ‘medium’, ‘high’, and ‘crazy’ presets on the DirectX 12 version. As a result, benchmark runs using these presets are invalid.” [Emphasis ours]

Many, if not most, tech outlets used these presets in their testing.

NVIDIA went on to say: “We believe there will be better examples of true DirectX 12 performance. We believe the GeForce architecture and drivers DX12 performance is second to none. When accurate DX12 metrics arrive, the story will be the same as it was for DX11.”

Strong words. And Oxide fired equally strong ones back, saying that NVIDIA’s statement was inaccurate: “Our code has been reviewed by NVIDIA, Microsoft, AMD and Intel. It has passed the very thorough D3D12 validation system provided by Microsoft specifically designed to validate against incorrect usages. All IHVs [Independent Hardware Vendors] have had access to our source code for over a year, and we can confirm that both NVIDIA and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months. Fundamentally, the MSAA path is essentially unchanged in DX11 and DX12. Any statement which says there is a bug in the application should be disregarded as inaccurate information.” [Emphasis ours]

Furthermore, a developer post on Oxide's forum claims that "NVIDIA mistakenly stated that there is a bug in the Ashes code regarding MSAA. By Sunday, we had verified that the issue is in their DirectX 12 driver. Unfortunately, this was not before they had told the media that Ashes has a buggy MSAA mode." It's also worth mentioning that, bug or not, Oxide says the effect on NVIDIA's numbers is "fairly inconsequential."

When we gave NVIDIA the opportunity to respond in kind, its tone seemed to have mellowed, and it offered up the following: “The DirectX 12 version of the Ashes of Singularity pre-Beta currently exhibits lower than expected MSAA performance on GeForce GPUs. NVIDIA is working closely with Oxide Games to ensure this gets resolved as soon as possible.”

[Screenshot: Ashes of the Singularity DX12 benchmark results]
Ashes of the Singularity features a useful range of data points to analyze. Here's a DX12 test running on an Intel Core i7-5960X and a Radeon Fury.

[Screenshot: the same Ashes of the Singularity benchmark running on DX11]
Here's the same benchmark as above, but running on DX11. We see an incredible 79% performance boost between DX11 and DX12.

We hit an internal roadblock with our own Ashes of the Singularity testing (the executable simply wouldn’t launch on our AMD APU test bench), but we intend to share our insights as soon as possible. Our reason for focusing mainly on APU-based machines lines up with what other journalists have discovered: while Intel’s single-core performance has always reigned supreme, DirectX 12 loves extra cores, and those cores come cheaper on AMD processors. That means AMD could see a much-needed sales boost in its FX and APU lineups, and we're starting to understand why the company is preaching the gospel of DirectX 12.
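Why does DirectX 12 love extra cores? A big part of the answer is multithreaded command-list recording. Here's a minimal C++ sketch of that pattern, with the caveats labeled: the CommandList struct and recordCommands function below are hypothetical stand-ins for illustration, not Direct3D API calls (the real analogs would be ID3D12GraphicsCommandList and ExecuteCommandLists). The point is only to model how a DX12-style engine can build GPU work on every core before a single submission.

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <thread>
#include <vector>

// Hypothetical stand-in for a GPU command list; real code would record
// into an ID3D12GraphicsCommandList via the Direct3D 12 API.
struct CommandList {
    std::vector<std::string> commands;
};

// Each worker records draw commands for its slice of the scene.
// DX12-style APIs allow this recording to happen on many threads at once;
// under DX11 it is largely serialized inside the driver.
void recordCommands(CommandList& list, int firstObject, int lastObject) {
    for (int i = firstObject; i <= lastObject; ++i) {
        list.commands.push_back("draw object " + std::to_string(i));
    }
}

int main() {
    const int numThreads = 4;     // more CPU cores -> more parallel recording
    const int numObjects = 10000; // e.g. thousands of units in an RTS like Ashes

    std::vector<CommandList> lists(numThreads);
    std::vector<std::thread> workers;

    // Record all command lists in parallel, one per core.
    const int chunk = numObjects / numThreads;
    for (int t = 0; t < numThreads; ++t) {
        int first = t * chunk;
        int last = (t == numThreads - 1) ? numObjects - 1 : first + chunk - 1;
        workers.emplace_back(recordCommands, std::ref(lists[t]), first, last);
    }
    for (auto& w : workers) w.join();

    // Single submission point, analogous to ExecuteCommandLists in D3D12.
    size_t total = 0;
    for (const auto& l : lists) total += l.commands.size();
    std::printf("submitted %zu commands from %d threads\n", total, numThreads);
    return 0;
}
```

Under DX11, most of that recording funnels through one driver thread, so a fast single core wins; under DX12, the recording phase scales with core count, which is why cheaper many-core AMD chips stand to benefit.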

Now we're champing at the bit to run this benchmark on a wide array of hardware environments, ranging from cheap to ludicrously expensive.

As for the drama, it could be a simple case of miscommunication, or an early warning sign of an intense battle to come. There’s no denying that AMD and NVIDIA have had a rather bitter rivalry as of late, and it doesn’t show any signs of ending.

The good news for gamers? Whether you’re rocking GeForce or Radeon video cards, Intel or AMD CPUs, these very early DirectX 12 results indicate that you’re going to see significant performance boosts simply by installing Windows 10 and playing games built on DirectX 12. Granted, this is only one game and one engine, but it’s a promising sign.