No DX12 Software is Suitable for Benchmarking *spawn*

It could also be a source of marketing revenue, with IHVs paying game companies to use specific internal benchmark maps that favor their product.
A good conspiracy theory needs a situation where few people would need to know the truth and this fits, but I suspect this doesn't happen. It's much easier to get specific algorithms into a game that are optimized for your hardware than to test levels in a pre-production game, find what works best on your hardware and hope the competitive situation doesn't change before the game ships. Game companies don't like giving a complete game to IHVs early so this testing would need to happen on site at the game developer. There seem to be too many obstacles to this being a practical situation.
 
Civilization VI DX12 support press release:

http://www.marketwired.com/press-re...eiers-civilizationr-vi-nasdaq-amd-2142057.htm

Gamers Can Depend on Radeon™ GPUs for a World Class DirectX® 12 Experience With Full Support for Asynchronous Compute and Explicit Multi-Adapter



SUNNYVALE, CA--(Marketwired - Jul 13, 2016) - Today AMD (NASDAQ: AMD), 2K and Firaxis Games announced a technical partnership to implement a truly exceptional DirectX® 12 renderer for Radeon™ GPUs into the graphics engine powering Sid Meier's Civilization® VI.

Complete with support for advanced DirectX® 12 features like asynchronous compute and explicit multi-adapter, PC gamers the world over will be treated to a high-performance and highly-parallelized game engine perfectly suited to sprawling civilizations designed to win hearts, minds, and the stars.

"Radeon™ graphics cards have rapidly become the definitive platform for next-generation DirectX® 12 content," said Roy Taylor, corporate vice president of alliances, Radeon Technologies Group, AMD. "We're thrilled to bring our leading DirectX® 12 hardware and expertise to bear in the next installment of the Civilization franchise, which has long been adored by gamers for its intoxicating mix of beautiful graphics and hopelessly addictive gameplay."

"For 25 years the Civilization franchise has set the standard for beautiful and masterfully crafted turn based strategy," said Steve Meyer, Director of Software Development, Firaxis Games. "AMD has been a premiere contributor to that reputation in past Civilization titles, and we're excited to once again join forces to deliver a landmark experience in Sid Meier's Civilization® VI."

DirectX® 12 Asynchronous Compute
Asynchronous compute is a DirectX® 12 feature exclusively supported by the Graphics Core Next or Polaris architectures found in many AMD Radeon™ graphics cards. This powerful feature allows for parallel execution of compute and graphics tasks, substantially reducing the time other architectures need to execute the same workloads in a longer step-by-step manner. Asynchronous compute on many Radeon™ GPUs will perfectly complement the unit-rich late game of Civilization VI.

DirectX® 12 Explicit Multi-Adapter
Explicit multi-adapter represents the first time the DirectX® graphics API has officially supported multi-GPU configurations for gamers. Though past versions of the DirectX® API did not prevent multi-GPU support, there were no extensions that specifically aided its addition. DirectX® 12 explicit multi-adapter not only adds official Microsoft support, but augments that support with a range of powerful features and flexibility to unleash the imagination of a game developer. The benefit of multi-GPU can be legion: higher framerates, lower input latency, capacity for higher image quality and more. Explicit multi-adapter support will be an excellent feature addition for Radeon™ graphics customers who demand the very most from their Civilization VI experience.
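For readers wondering what those two features mean at the API level, below is a minimal sketch of my own (not taken from AMD, Firaxis, or the Civilization VI engine; build it with the Windows 10 SDK and link d3d12.lib and dxgi.lib). It enumerates the adapters DXGI exposes, which is where explicit multi-adapter starts since the application rather than the driver decides how work is split across GPUs, then creates a compute queue next to the direct (graphics) queue on each device, which is all "asynchronous compute" means at the API level. Whether the two queues actually overlap in time is up to the driver and the hardware scheduler.

#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Enumerate every adapter DXGI exposes; explicit multi-adapter starts here,
    // with the application deciding how work is split across the GPUs it finds.
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc = {};
        adapter->GetDesc1(&desc);
        std::printf("Adapter %u: %ls\n", i, desc.Description);

        ComPtr<ID3D12Device> device;
        if (FAILED(D3D12CreateDevice(adapter.Get(), D3D_FEATURE_LEVEL_11_0,
                                     IID_PPV_ARGS(&device))))
            continue;   // adapter has no DX12 driver, skip it

        // "Asynchronous compute" at the API level: a COMPUTE queue created alongside
        // the DIRECT (graphics) queue, so the driver/hardware may overlap the work.
        D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
        gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
        D3D12_COMMAND_QUEUE_DESC computeDesc = {};
        computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;

        ComPtr<ID3D12CommandQueue> gfxQueue, computeQueue;
        device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));
        device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));
        std::printf("  created a direct queue and a compute queue\n");
    }
    return 0;
}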

Errr since when did Async Compute become a GCN exclusive feature? I guess that because it never really "worked" on Maxwell, AMD's in full PR mode here. But then again NVidia has yet to show it running on Pascal GPU with meaningful results/perf improvements...:-\

No sign of Shader Intrinsic Functions support on any DX12 game yet though... which is disappointing.
 
A good conspiracy theory needs a situation where few people would need to know the truth and this fits, but I suspect this doesn't happen. It's much easier to get specific algorithms into a game that are optimized for your hardware than to test levels in a pre-production game, find what works best on your hardware and hope the competitive situation doesn't change before the game ships. Game companies don't like giving a complete game to IHVs early so this testing would need to happen on site at the game developer. There seem to be too many obstacles to this being a practical situation.
A good example at the extremes, supporting either AMD or Nvidia: Fallout 4 for Nvidia, Quantum Break for AMD.
It is fair to say each is designed closer to a specific IHV. I'm not saying this is the norm, but it shows there is influence and collaboration even at the development level that can change priority/focus.
Cheers
 
Errr since when did Async Compute become a GCN exclusive feature? I guess that because it never really "worked" on Maxwell, AMD's in full PR mode here. But then again NVidia has yet to show it running on Pascal GPU with meaningful results/perf improvements...:-\
As well they should promote the feature. It's not an IHV-specific feature, but it is definitely a big strength of the GCN architecture, one that is bringing their performance up to Nvidia's and beyond at comparable perf/$. Hopefully it will create some good competition again.
 
Civilization VI DX12 support press release:

http://www.marketwired.com/press-re...eiers-civilizationr-vi-nasdaq-amd-2142057.htm
Errr since when did Async Compute become a GCN exclusive feature? I guess that because it never really "worked" on Maxwell, AMD's in full PR mode here. But then again NVidia has yet to show it running on Pascal GPU with meaningful results/perf improvements...:-\

No sign of Shader Intrinsic Functions support on any DX12 game yet though... which is disappointing.
Legal would probably have this interpreted as:
"Asynchronous compute is a DirectX® 12 feature exclusively supported by the Graphics Core Next or Polaris architectures found in many AMD Radeon™ graphics cards." -> "exclusively" as opposed to the earlier VLIW architectures, which, as I have learned here, is only true because Cayman is not a DX12 card.
 
Firaxis has always been on very good terms with AMD. Several of their employees are former AMD engineers and researchers. Dan Baker of AotS "fame" is a big AMD supporter (he often participates in AMD events) and used to work there. This partnership really makes sense given how far back the two companies go.
 
But then again NVidia has yet to show it running on Pascal GPU with meaningful results/perf improvements...:-\

Meaning there's still no hard proof that Pascal has that ability. Or that it gains anything with it.
 
A good example at the extremes, supporting either AMD or Nvidia: Fallout 4 for Nvidia, Quantum Break for AMD.
It is fair to say each is designed closer to a specific IHV. I'm not saying this is the norm, but it shows there is influence and collaboration even at the development level that can change priority/focus.
Cheers

Using Occam's Razor it's more likely that Quantum Break just defaulted to whatever code was used for the XBO which happens to have AMD hardware. I get the feeling that the decision to make a simultaneous PC release (as opposed to a delayed PC release like Alan Wake) came late in development and the rough nature of the PC port shows that.

And Fallout 4 likely defaulted to whatever path was used for PC development. It's obviously a title that used the PC as the primary development target and platform with Nvidia hardware used during development and then ported to consoles.

IE - unlikely to have been specifically IHV influenced. Just the nature of their development paths.

Regards,
SB
 
Using Occam's Razor it's more likely that Quantum Break just defaulted to whatever code was used for the XBO which happens to have AMD hardware. I get the feeling that the decision to make a simultaneous PC release (as opposed to a delayed PC release like Alan Wake) came late in development and the rough nature of the PC port shows that.

And Fallout 4 likely defaulted to whatever path was used for PC development. It's obviously a title that used the PC as the primary development target and platform with Nvidia hardware used during development and then ported to consoles.

IE - unlikely to have been specifically IHV influenced. Just the nature of their development paths.

Regards,
SB
The AO, god rays, volumetric lighting, and shadows were all made in collaboration with Nvidia, going back to earlier generations of Fallout and continuing with Fallout 4.
As for Quantum Break, it has been out for a long time and no patch to date has focused on resolving the post-processing effects that cause a heavy performance impact on Nvidia cards, nor on resolving the stability issues.

Notice both games lean on similar development areas tuned for a specific IHV; likewise there were no attempts in Fallout 3/4 to resolve the effects causing a heavy performance impact on AMD cards.
That is one reason I chose these two games: neither was ever patched to work well on the other manufacturer's cards, and both take a performance hit from volumetric lighting, shadows, and so on.
But my original point was more about how an IHV can influence and possibly change the priority/focus even at the development stage, with these two games being the most extreme examples of that process (not saying it's the norm).
Cheers
 
The AO, god rays, volumetric lighting, and shadows were all made in collaboration with Nvidia, going back to earlier generations of Fallout and continuing with Fallout 4.
As for Quantum Break, it has been out for a long time and no patch to date has focused on resolving the post-processing effects that cause a heavy performance impact on Nvidia cards, nor on resolving the stability issues.

Notice both games lean on similar development areas tuned for a specific IHV; likewise there were no attempts in Fallout 3/4 to resolve the effects causing a heavy performance impact on AMD cards.
That is one reason I chose these two games: neither was ever patched to work well on the other manufacturer's cards, and both take a performance hit from volumetric lighting, shadows, and so on.
Cheers

You can add, at least on the Nvidia side, a ton of DX11 games that ran completely abnormally on AMD GPUs (including most of the Ubisoft ones). This type of discussion can only go in circles, as this is nothing new, whether it is DX11 or DX12. The fact is that most DX11 games in the end (after the first wave of titles) were heavily promoted and made in collaboration with Nvidia; that includes most Unreal Engine titles, Activision's COD (which we never see in any review anyway, even though they sell like crazy), the Assassin's Creed series, the Batman series, etc. AMD was first with DX11 (Battlefield and some others), then Nvidia attacked really hard, with a lot of AAA titles under their wing (way before GameWorks). With later releases things seem a bit inverted, but just a little bit.

Suddenly I get the feeling that some want to say that every DX11 game where Nvidia performed 60 to 120% faster than its counterpart did so because of the "overhead" of bad AMD drivers. I'm not saying that wasn't the case, in some instances.

So suddenly, AMD is the devil in the low-level API.
 
Just to dispel the myth that Remedy didn't bother doing anything to get the game up and running correctly on NVidia hardware... this is from the game's end credits (and no, there are no AMD mentions in there, just the NVidia engineers):

[Screenshot of the end credits listing the NVIDIA engineers]


Two helped with the engine design and two others are part of NVidia's developer relations (TWIMTBP). As stated earlier, the game was so far into development as an Xbox One exclusive (the PC port was decided just a few months before release) that there probably was no way to do better than what we got.
 
You can add, at least on the Nvidia side, a ton of DX11 games that ran completely abnormally on AMD GPUs (including most of the Ubisoft ones). This type of discussion can only go in circles, as this is nothing new, whether it is DX11 or DX12. The fact is that most DX11 games in the end (after the first wave of titles) were heavily promoted and made in collaboration with Nvidia; that includes most Unreal Engine titles, Activision's COD (which we never see in any review anyway, even though they sell like crazy), the Assassin's Creed series, the Batman series, etc. AMD was first with DX11 (Battlefield and some others), then Nvidia attacked really hard, with a lot of AAA titles under their wing (way before GameWorks). With later releases things seem a bit inverted, but just a little bit.

Suddenly I get the feeling that some want to say that every DX11 game where Nvidia performed 60 to 120% faster than its counterpart did so because of the "overhead" of bad AMD drivers. I'm not saying that wasn't the case, in some instances.

So suddenly, AMD is the devil in the low-level API.
Just to clarify, my original post and its context go back to this:
A good conspiracy theory needs a situation where few people would need to know the truth and this fits, but I suspect this doesn't happen. It's much easier to get specific algorithms into a game that are optimized for your hardware than to test levels in a pre-production game, find what works best on your hardware and hope the competitive situation doesn't change before the game ships. Game companies don't like giving a complete game to IHVs early so this testing would need to happen on site at the game developer. There seem to be too many obstacles to this being a practical situation.
It is a good point, and I was showing the two games that perhaps most highlight the influence of an IHV on game development and its priority/focus.
Cheers
 
Just to dispel the myth that Remedy didn't bother doing anything to get the game up and running correctly on NVidia hardware... this is from the game's end credits (and no, there are no AMD mentions in there, just the NVidia engineers):



One helped with the engine design and two others are part of NVidia's developer relations (TWIMTBP). As stated earlier, the game was so far into development as an Xbox One exclusive (the PC port was decided just a few months before release) that there probably was no way to do better than what we got.
Ah that must explain why the performance is absolutely dire on all Nvidia cards and also the game crashes :)
See the Quantum Break thread even on this site, or the reviews with decent benchmarking.
So you think AMD would not be involved with it from the console side of things, or even on PC (Remedy has mentioned AMD being involved along with Nvidia in trying to resolve performance issues)?
And AMD would still have had engineers involved with the Fallout games; that does not mean they are getting the engine/post-processing effects optimised for their cards though (which they are not, in either game's case, for the respective IHV).
Cheers
 
Ah that must explain why the performance is absolutely dire on all Nvidia cards and also the game crashes :)
See the Quantum Break thread even on this site, or the reviews with decent benchmarking.
So you think AMD would not be involved with it from the console side of things, or even on PC?
And AMD would still have had engineers involved with the Fallout games; that does not mean they are getting the engine/post-processing effects optimised for their cards though (which they are not, in either game's case, for the respective IHV).
Cheers
Wut?
 
So you're saying Nvidia had engineers involved but AMD did not.
But somehow on PC the game crashes on all Nvidia cards while also delivering dire performance (more so for the 970, 980, and older cards).
This is probably the worst of the recent console ports in terms of general performance for Nvidia cards (970, 980, and earlier) relative to AMD; Hitman was initially bad, but episode 2 did provide performance increases for Nvidia.
Also, as I mentioned, AMD has been involved with the Fallout games as well, but that never meant those post-processing effects were changed.
Cheers
Edit:
Here is the thread regarding Quantum Break on PC: https://forum.beyond3d.com/threads/quantum-break-uwp.57771/page-8#post-1916559
 
It's late and I'm tired here, so I'm not that enthusiastic about digging up the old articles where AMD raised their issues with earlier Fallout games; it came down to volumetric lighting, shadows, etc.

However, Bethesda does briefly mention Nvidia's further involvement in Fallout 4:
We want objects and characters in the world to feel tactile and grounded, and a big part of that is ensuring that these materials are distinct – that metal reflects light in a distinct manner from wood, for example.
As always, our world features fully dynamic time of day and weather. To create that volumetric light spilling across the scene (sometimes called “god rays”) we worked with our friends at NVIDIA, who we've worked with dating back to Morrowind's cutting-edge water. The technique used here runs on the GPU and leverages hardware tessellation. It's beautiful in motion, and it adds atmospheric depth to the irradiated air of the Wasteland. Like all the other features here, we've made it work great regardless of your platform.

When a rain storm rolls in, our new material system allows the surfaces of the world to get wet, and a new cloth simulation system makes cloth, hair, and vegetation blow in the wind.
https://bethesda.net/?utm_source=Tw...raphics-technology-of-fallout-4/2015/11/04/45
Also shadow detail still hammers AMD cards along with the volumetric lighting, and other integral effects.
And the other side of the coin is Quantum Break with AMD.
Cheers
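Since god rays keep coming up, here is a tiny CPU-side sketch of the underlying single-scattering idea: march along the view ray and add a little in-scattered light at every step the light can reach, attenuated by how much medium the eye is looking through. This is only my own illustration of the math; the shipping Fallout 4 technique runs on the GPU and leverages hardware tessellation, and none of the names below come from either IHV's implementation.

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b)    { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Vec3 scale(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }

// Stand-in for a shadow-map lookup: a slab of "geometry" blocks the light
// between x = 2 and x = 3; everything else is lit.
static bool litByLight(Vec3 p) { return !(p.x > 2.0f && p.x < 3.0f); }

// Accumulate in-scattered light along a view ray of the given length.
static float godRayIntensity(Vec3 origin, Vec3 dir, float length, int steps)
{
    const float density    = 0.05f;            // participating-medium density
    const float stepSize   = length / steps;
    float       visibility = 1.0f;             // transmittance back toward the eye
    float       result     = 0.0f;

    for (int i = 0; i < steps; ++i)
    {
        Vec3 p = add(origin, scale(dir, (i + 0.5f) * stepSize));
        if (litByLight(p))
            result += visibility * density * stepSize;   // in-scattering where lit
        visibility *= std::exp(-density * stepSize);     // extinction along the ray
    }
    return result;
}

int main()
{
    Vec3 eye = { 0.0f, 1.7f, 0.0f };
    Vec3 dir = { 1.0f, 0.0f, 0.0f };            // looking down +X
    std::printf("accumulated scattering: %f\n", godRayIntensity(eye, dir, 10.0f, 64));
    return 0;
}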
 