NVIDIA GameWorks, good or bad?

It may harm some end users by not being 100% optimized for AMD. It benefits all end users by increasing the chance that games will have kick-ass visual effects.
 
I'm not suggesting NV should make its IP available to AMD for free. I'm stating a belief that this kind of work distorts the gaming market and harms end users by making it vastly more difficult for AMD to reach parity in GW titles.

You are distorting the whole meaning and purpose of GameWorks. GameWorks provides a vast array of NVIDIA IP (including physics, ray tracing, compute, visual effects, and graphics tools, not to mention precious engineers and engineering resources too) to game developers that are interested in working with NVIDIA to develop cutting-edge games.

As for the whole nonsense about "parity", NVIDIA and AMD graphics cards are not equal, and have different strengths and weaknesses. They will almost never achieve "parity", especially in IHV-sponsored games. And FWIW, NVIDIA likely has had a far more difficult time achieving "parity" in recent AMD-sponsored games such as Dirt: Showdown and Tomb Raider (due to GCN-friendly shader replacements and GCN-friendly rendering techniques that were added to the game at the last minute) than AMD ever has had with any GameWorks title.
 
As hardware reviewers, we absolutely *will* compare Mantle's performance to DX. That's the entire question of how good Mantle is.
Right, but I think the point silent_guy is making is completely valid in this context, i.e. you as a hardware reviewer are not able to separate "Mantle" testing from "one game's implementation on Mantle" testing. So for the initial batch of games, I imagine most people will compare a few screenshots and conclude that they look similar, and therefore that comparisons of the two paths are effectively comparing DirectX vs Mantle.

Unfortunately, each implementation could be using arbitrarily different algorithms (this is akin to comparing a console version to a PC version, for instance), and since the Mantle implementation goes through a proprietary interface, it's far harder to detect shenanigans than with something like GW that goes through DX.

Obviously there's a degree of trust with the involved game developers here, but it's definitely true that Mantle is waaaaay more ripe for "abuse" and other underhanded stuff than GW or anything else that goes through DX. Hopefully AMD et al. will be happy with the performance they get from the Mantle path, but I do worry in the back of my mind that since they know it is likely going to be compared on high-end hardware, there is pressure to produce perceptible gains on enthusiast-class stuff. Unfortunately, that's the sort of hardware and settings that are going to end up being mostly GPU-bound, and while there are a few legitimate things they can do in Mantle there (overlapping compute/gfx/etc.), it's definitely not the case that you're going to see 10x improvements or anything.

Anyways, bit of a tangent but if you don't "trust" game developers/IHVs you fundamentally can't trust Mantle paths as they are almost completely opaque, while DX - even compiled - is still pretty open and easy to understand. Download GPA or GPU Perf Studio or Nsight to get an idea of the wealth of information you can get from even a released game...
 
To further elaborate on Andy's point: in the past, we've seen outcries at different levels when IHVs optimized their drivers for games by replacing render targets with similar ones of different precision, or calculations with versions of different precision, to gain speed advantages. Sometimes those optimizations are glaringly obvious, sometimes not. Usually they're branded a cheat.

What makes them a cheat in the eyes of many is that the developer is asking for a target (or a calculation) of one precision, but it's overruled by the driver and replaced by something of lower precision.
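To make that concrete, here's a minimal hypothetical D3D11 sketch (my own illustration, not taken from any actual driver or game) of the kind of precision swap being described: the app requests a 16-bit-per-channel float HDR render target, and the "optimization" silently backs it with a cheaper format.

#include <d3d11.h>

// Hypothetical illustration only. 'device' is an already-created ID3D11Device*.
void CreateHdrTarget(ID3D11Device* device, ID3D11Texture2D** outTex)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width            = 1920;
    desc.Height           = 1080;
    desc.MipLevels        = 1;
    desc.ArraySize        = 1;
    desc.SampleDesc.Count = 1;
    desc.Usage            = D3D11_USAGE_DEFAULT;
    desc.BindFlags        = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;

    // What the developer asked for: 64 bits/pixel, full HDR precision.
    desc.Format = DXGI_FORMAT_R16G16B16A16_FLOAT;

    // What a "cheating" driver could substitute behind the app's back:
    // DXGI_FORMAT_R11G11B10_FLOAT -- 32 bits/pixel, half the bandwidth,
    // no alpha, and subtly less precision per channel. Often invisible
    // in screenshots, always visible in benchmark numbers.
    device->CreateTexture2D(&desc, nullptr, outTex);
}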

Enter Mantle: what if a developer, at the subtle suggestion of AMD, does exactly that in its own code? And what if the visual difference is there, but subtle enough not to be picked up right away?

It's not a cheat in any way. But is it fair to compare that Mantle implementation against a DX11 implementation?

I have no answer to this. I suspect that, for many, the answer will depend on the mental allegiances they have at the moment. ;)

Edit: and what if Nvidia then changes its driver and patches the shader to make a similar change in the DX11 path? Ah, I see a golden future of forum flame wars ahead of us!
 
It's not a cheat in any way. But is it fair to compare that Mantle implementation against a DX11 implementation?
To be clear, I absolutely do think it's fair to compare, but only from the point of view of the end user game experience, i.e. "path X looks the best and is the fastest" or, to be more specific, "the Mantle path on the 7970 gives this quality and performance, which is superior to DX on the 680" or something. These sorts of end-user-experience conclusions are valid to make simply from judging the final product.

Where it gets finicky is when tech reviewers start trying to broaden the conclusions to apply to Mantle itself (and generalize to other future games), hardware architectures, etc., as they almost certainly will. There's no way they will be able to resist doing that, since those are the questions tech enthusiasts want answered.

Unfortunately there are no straight answers to those questions, since it fundamentally is not an apples-to-apples comparison when you're talking about two distinct paths. For all reviewers know, the DX path could be intentionally sabotaged/inefficient, or the Mantle path could be skipping work or using simpler algorithms. More likely, one path will just receive more optimization effort than the other.

But it's inevitable... I'll try to just shake my head quietly or avoid the conclusion sections of such articles in any case :)

Don't get me wrong, this is not a problem with Mantle or a knock against any of the improvements that it brings. I'm just noting that it is true that if you have a problem with GW because you don't trust the devs to optimize for non-NVIDIA hardware, you absolutely should have a problem with Mantle too for the same reasons. The implicit assumption that devs will happily ignore AMD for GW but spend lots of time optimizing the DX path vs. a Mantle path is conjecture.

All these decisions are ultimately up to the developers themselves. Both Mantle and GW are just tools that they can choose to use or not.

Edit: and what if Nvidia then changes its driver and patches the shader to make a similar change in the DX11 path? Ah, I see a golden future of forum flame wars ahead of us!
Dear lord, and I thought the console forums were bad enough ;) Maybe this time B3D could do an article (yes, B3D has a front page!) ahead of time to actually get people on the same page in terms of how the various tech works. Maybe I'm naive to think that would help, though...
 
Silent_guy, Andy:

You raise good points about the various ways in which code and drivers can be massaged to yield favorable results that don't reflect a neutral playing field. Here's my own personal opinion:

Changes that have no impact on visual quality or a competitor's performance are generally ok. Changes that have even a modest impact on a game's visual quality must be called out to be considered fair. So it's fine with me if AMD and Nvidia want to offer an "Optimized" texture filtering option, or a texture detail slider, but it's not ok for a company to suddenly decide that their "High Quality" option now looks like the "Balanced" option because they're losing in a benchmark.

Regarding Mantle:

The problem with critiquing Mantle in this fashion is that Nvidia doesn't have any public plans to implement it. It's similar to PhysX. AMD didn't implement PhysX, so there was no way to look at a PhysX implementation and say "Well, Nvidia is cheating."

When David Kanter published his PhysX investigation back a few years ago, I wrote up the situation and basically said "It's Nvidia's code." If Nvidia had aggressively optimized GPU PhysX for CPU execution it might have enticed more developers to adopt hardware PhysX for games -- but it would've made Nvidia's hardware less necessary for doing so. Nvidia chose to keep hardware PhysX fairly close to their chest. That's fine with me.

Mantle will be evaluated for visual quality. Hopefully existing screenshot tools will be able to grab Mantle screenshots. If they can't, I'll set up identical systems on identical monitors and run them side-by-side with DX11. If AMD pushes image quality southwards to make Mantle look faster than DX11, you *will* hear about it from journalists.

I don't expect NV to adopt Mantle, but if the day comes when Nvidia is looking to do so, I'll be the first person to say that they need full access to the API implementation in order to build their own hardware solution.
 
Changes that have no impact on visual quality or a competitor's performance are generally ok.
Yup, that's effectively what I was trying to get at with the point that it's still completely valid to compare these different implementations in terms of quality and performance and draw results for users. It's just in the attempt to generalize to the platforms and technologies themselves that things get grey.

And for the record, I doubt you're going to get a laundry list of detailed differences between Frostbite's Mantle and DX paths for instance, but here's hoping :)

The problem with critiquing Mantle in this fashion is that Nvidia doesn't have any public plans to implement it.
There is no public Mantle spec that I've seen, so the ball is still in AMD's court here. I don't think they have any intention of allowing anyone else to support Mantle 1.0 as it will ship in the initial games. The talk has all been in terms of taking Mantle-like ideas/specs and standardizing in the future.
 
May I call you out on that when the first Mantle games arrive on the scene? Because, boy, are you going to be proven wrong on that one! Especially if the first batch of Mantle-supported games don't have any additional features, which I kinda expect to happen.

My post didn't say that no review would ever have Mantle performance numbers; it was a line of thought demonstrating that GameWorks could unbalance reviews. While Mantle numbers are probably going to be relegated to a second-to-last page, like overclocking results or a side review, GW could affect data that is expected to be neutral.
 
There is no public Mantle spec that I've seen, so the ball is still in AMD's court here. I don't think they have any intention of allowing anyone else to support Mantle 1.0 as it will ship in the initial games. The talk has all been in terms of taking Mantle-like ideas/specs and standardizing in the future.
Initial titles supporting Mantle are not even "1.0"; they are using a beta version.
 
Obviously there's a degree of trust with the involved game developers here, but it's definitely true that Mantle is waaaaay more ripe for "abuse" and other underhanded stuff than GW or anything else that goes through DX. Hopefully AMD et al. will be happy with the performance they get from the Mantle path, but I do worry in the back of my mind that since they know it is likely going to be compared on high-end hardware, there is pressure to produce perceptible gains on enthusiast-class stuff. Unfortunately, that's the sort of hardware and settings that are going to end up being mostly GPU-bound, and while there are a few legitimate things they can do in Mantle there (overlapping compute/gfx/etc.), it's definitely not the case that you're going to see 10x improvements or anything.

How do you expect Mantle to be abused? I can't follow. (Assuming we have the same or better IQ, of course.)
 
There is no public Mantle spec that I've seen, so the ball is still in AMD's court here. I don't think they have any intention of allowing anyone else to support Mantle 1.0 as it will ship in the initial games.

Ultimately I agree with you. NV and AMD both have a tendency to talk about how their standards are agnostic and available to the other party. Nvidia never said "Oh, we'd be happy to give CUDA or PhysX to AMD" but they would give the impression that AMD could implement analogous systems if, you know, it wanted to give developers the same advantage they currently enjoyed with NV hardware.

Even if AMD threw the doors open on Mantle and said "Here's how we do it!" I expect that the final details would map poorly onto GK104. If Mantle catches on, what should likely happen, at a minimum, is that Microsoft, AMD, and Nvidia get together to hash out the future of the API as a successor to, or alternate mode of, DirectX.

It might be better for the two companies to jointly form up with the other mobile graphics developers, but I don't know if the OGL working group exactly wants the competition. Bottom line? Big hassle.

it was a line of thought demonstrating that GameWorks could unbalance reviews. While Mantle numbers are probably going to be relegated to a second-to-last page, like overclocking results or a side review, GW could affect data that is expected to be neutral.

No and yes. ;)

Mantle data will not be a sideshow in 2014. Mantle will be talked about and closely analyzed in every title it supports. You're going to have people eyeing it closely, looking for both flaws and strengths. If Mantle doesn't give AMD any kind of performance advantage, people will talk about it. If Mantle gives AMD ass-kicking performance over Intel and NV, people are going to talk about *that.*

But yes, GW could impact results that are thought to be neutral. And because the majority of games will be DX11-based, not Mantle-based, maintaining good results in both will be important.
 
As for the whole nonsense about "parity", NVIDIA and AMD graphics cards are not equal, and have different strengths and weaknesses. They will almost never achieve "parity", especially in IHV-sponsored games. And FWIW, NVIDIA likely has had a far more difficult time achieving "parity" in recent AMD-sponsored games such as Dirt: Showdown and Tomb Raider (due to GCN-friendly shader replacements and GCN-friendly rendering techniques that were added to the game at the last minute) than AMD ever has had with any GameWorks title.

You ignore the countless TWIMTBP games AMD had similar problems with, and you ignore the vendor locking of features both can do; there's no "parity" there no matter how you twist it.
Also, there's only, what, one game that can be confirmed to be a GW title at the moment, possibly two more. Is it any wonder that there haven't been many cases to have problems with yet?
 
Initial titles supporting Mantle are not even "1.0"; they are using a beta version.
Sure, I just meant the first public release to end users. Whether you call that "1.0" or "beta" or even "pre-release alpha still-in-test build" doesn't really affect my point :) Once there's code out there that uses it on end user machines - which is going to predate any spec or SDK release from you guys I assume - that path/version is not going to be supportable by other IHVs.

How do you expect Mantle to be abused? I can't follow. (Assuming we have the same or better IQ, of course.)
From an end user point of view, IQ and perf is all that matters. As long as reviews are just talking about that, it's all good. I'm just noting that since the Mantle and DX paths can be doing arbitrarily different things, you simply can't make generalizations to other games, IHVs or hardware architectures from the performance of Mantle vs DX on a specific game.

Note that I quoted "abuse"... making a game faster with Mantle is great and not abuse by the developers - the abuse I'm talking about is reviewers drawing conclusions that are not supportable by the test results and methodologies that they have. We'll see, but I honestly don't expect them to understand some of these subtleties and the demand from end users to get "simple" answers will likely be strong.

I mean hell, how many bytes in the B3D database have been wasted on console vs console, console vs PC and other nonsense? This is ultimately the same sort of thing: fertile ground of partial information that fuels the fanboys and can be applied equally to any conclusion :)

Mantle data will not be a sideshow in 2014. Mantle will be talked about and closely analyzed in every title it supports. You're going to have people eyeing it closely, looking for both flaws and strengths. If Mantle doesn't give AMD any kind of performance advantage, people will talk about it. If Mantle gives AMD ass-kicking performance over Intel and NV, people are going to talk about *that.*
<rant mode>
You've hit the nail on the head, but therein lies the entire problem with all this: consumers want to just lump Mantle into a "good" or "bad" category in their heads regardless of the actual situation, and AMD knows that, and thus is highly motivated to make it look "good" upon release.

Fundamentally, the big gains here are in CPU-bound cases, and given that Battlefield 4 is not particularly CPU-bound on high-end configurations (especially in Single Player, where most reviews test), a straightforward Mantle port might come out looking unimpressive to the enthusiast press, who tend to test on very high-end configurations.
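To sketch where that CPU cost actually sits (a hypothetical D3D11-style submission loop of my own, not anything from BF4's code): every draw pays per-call validation and state-setup work in the runtime and driver, which is exactly the overhead a thin API like Mantle is designed to attack.

#include <d3d11.h>
#include <vector>

struct Object {
    ID3D11ShaderResourceView* texture;    // per-object material texture
    ID3D11Buffer*             constants;  // per-object transform etc.
    UINT                      indexCount;
};

// With thousands of objects, the CPU-side cost of these calls -- not the
// GPU work they generate -- can become the frame-time limiter.
void DrawScene(ID3D11DeviceContext* context, const std::vector<Object>& scene)
{
    for (const Object& obj : scene)
    {
        context->PSSetShaderResources(0, 1, &obj.texture);
        context->VSSetConstantBuffers(0, 1, &obj.constants);
        context->DrawIndexed(obj.indexCount, 0, 0);
    }
}

Cut the per-call cost and allow command buffers to be built on multiple threads, and the gain shows up where the CPU is the limiter -- which a GPU-bound, high-end review config often isn't.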

Of course, the fact that lower CPU overhead is critically important on SoCs and power-constrained platforms, and additionally enables game developers to do things that weren't really feasible before, is going to get lost in that noise if they were to just do that. Thus there's this stupid, arbitrary pressure to produce impressive gains even on high-end platforms so that reviews look nice and the consumers can happily lump it into the "good" side of their brains along with the marketing platitudes that generalize the result to every possible avenue.

So what happens if people implement a Mantle path and it's only a few % faster on high-end configs when used to port existing workloads that were designed with the current DX driver situation in mind? Well... you gotta come up with other ways to make it arbitrarily look better. This is where consumer pressure is the source of the real harm, because their trivial understanding of the technology pushes the IHVs to bend the results in different ways. You really think engineers enjoy pulling shenanigans? I don't know a single one who does... all that crap is driven by marketing (no offense Dave) and ultimately end users' desires to validate their purchases and brand loyalty.
</rant mode>

Now in the case of Mantle I think they do have a few axes from which to produce legitimate gains without changing algorithms or otherwise manipulating the comparison away from a "port/backend", which could buy enough to be considered acceptable by the press even on high-end platforms, but the very existence of this silly motivation casts a shadow on the legitimacy of anything that comes out of Mantle in the first little while. Thank god it's repi who's involved with the first game release, because at least I have confidence that he's not going to do anything super-shady.

This is getting a bit off-topic, but hopefully I've reasonably made the case for why, ultimately, you either trust the game developer to optimize for various hardware or it doesn't matter whether it's Mantle or GW or just pure DX; at least in the latter two cases you still have an opportunity to mess with it in the driver.

And again, I'm not supporting this GW policy. All the samples I've been involved with are licensed for use, modification, redistribution, etc. As Nick said, it's always best if you have the opportunity to make tweaks for various pieces of hardware ahead of time. But fundamentally it's up to the game developers and they absolutely have the option of not using GW on non-NVIDIA hardware.
 
The potential gain for the industry from GW pales in comparison to Mantle. IMHO, GW is beneficial mostly in a business sense and as such is more prone to manipulation.

From an end user point of view, IQ and perf is all that matters. As long as reviews are just talking about that, it's all good. I'm just noting that since the Mantle and DX paths can be doing arbitrarily different things, you simply can't make generalizations to other games, IHVs or hardware architectures from the performance of Mantle vs DX on a specific game.

Note that I quoted "abuse"... making a game faster with Mantle is great and not abuse by the developers - the abuse I'm talking about is reviewers drawing conclusions that are not supportable by the test results and methodologies that they have. We'll see, but I honestly don't expect them to understand some of these subtleties and the demand from end users to get "simple" answers will likely be strong.

I mean hell, how many bytes in the B3D database have been wasted on console vs console, console vs PC and other nonsense? This is ultimately the same sort of thing: fertile ground of partial information that fuels the fanboys and can be applied equally to any conclusion :)

Thanks for clarifying that statement. I agree.
 
You ignore the countless TWIMTBP games AMD had similar problems with, and you ignore the vendor locking of features both can do; there's no "parity" there no matter how you twist it.
Oh come on Kaotik. You very well know that almost every game has some sort of vendor preference inherited from its background. Take the countless console games, take PhysX, take GW, take GE - take whatever you like. Probably except Microsoft's Gorillas.
 
Oh come on Kaotik. You very well know that almost every game has some sort of vendor preference inherited from its background. Take the countless console games, take PhysX, take GW, take GE - take whatever you like. Probably except Microsoft's Gorillas.

Obviously they do, but NVIDIA has a dirtier and longer track record on it ;)
 
(especially in Single Player, where most reviews test),

Ugh, really? I've never tested BF3 or BF4 in single-player. I looked at the campaigns, which were never very good, concluded no one played them for that, and ran the multiplayer version. ;)

Of course, the fact that lower CPU overhead is critically important on SoCs and power-constrained platforms, and additionally enables game developers to do things that weren't really feasible before, is going to get lost in that noise if they were to just do that.

I don't think Mantle on the 290X and Ivy Bridge-E is nearly as interesting as Mantle on Kaveri, or when paired with an FX CPU or Kabini APU. If AMD picks up an advantage against Nvidia, yes, that's important. But ultimately, AMD makes the overwhelming majority of its revenue from APU sales in both consoles and PCs. Its GPU profits are, and always have been, fractional. AMD's net profit percentage from 2008 - 2013 on GPUs is something like 5%. I haven't updated this graph in a while, but nothing has happened to change it: http://hothardware.com/newsimages/Item21429/AMD-GPU-Revenue2.png

The figures still look that way.

Let's say the R9 290X + IVB-E picks up 20% in a benchmark where the base test was already 150 FPS. That's great for competitive purposes against NV, but it doesn't change the actual experience the user enjoys, provided that frame latency remains consistent at both frame rates.

If the 7850K picks up 20% and, in so doing, moves from 30 FPS to 36 FPS, that's a much bigger improvement. That's something the end user immediately sees. Furthermore, AMD needs Mantle to go big on the APU because there's no way it can afford to do what Intel is doing with adding huge amounts of L4 cache directly on-chip. At 14nm, those huge L4s are going to spread through more of the product stack -- and AMD simply can't compete with it.
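Spelling out the frame-time arithmetic behind those two 20% examples (just the math on the numbers above):

150 FPS = 1000/150 ≈ 6.7 ms/frame; +20% → 180 FPS ≈ 5.6 ms/frame (about 1.1 ms saved)
30 FPS = 1000/30 ≈ 33.3 ms/frame; +20% → 36 FPS ≈ 27.8 ms/frame (about 5.6 ms saved)

Same percentage, but the slower machine gains roughly five times as much frame time, which is the part of the experience a player can actually feel.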

If GCN picks up 10-20% from Mantle at the low end, AMD can use those advantages to help fend off the impact of Crystalwell. Right now, Intel reserves the L4 for chips that don't compete with APUs, but that'll change as the cost adder of using the cache gets smaller.

And any change Mantle offers to FX gaming performance is going to be a much bigger deal. AMD needs a graphics solution that puts less work on the CPU because latency on these parts is ugly.
 
Ugh, really? I've never tested BF3 or BF4 in single-player. I looked at the campaigns, which were never very good, concluded no one played them for that, and ran the multiplayer version. ;)

That's a very well known fact. How did it escape you for years?
 
Because I don't waste time paying attention to what I consider bad performance testing?

The BF3 single-player campaign was mediocre at best. The people playing the game clearly cared about its performance in multiplayer. So when I benchmarked it, I benchmarked multiplayer.
 
What AMD and Intel can do, IMHO, is to provide similar libraries
If they can't provide the same API, they won't be used. Now, Oracle v. Google established that APIs can't be copyrighted (in the US) for the moment... but unlike PhysX, a lot of this code doesn't seem to have an openly available API.

If the GameWorks license is particularly obnoxious, it might even become impossible for them to meaningfully cooperate with developers using it at all.
 