NVIDIA GameWorks, good or bad?

A well-researched article that tries to show the arguments of both sides without getting sensational.

Too bad that, due to the nature of the Internet, it will get far fewer clicks than what came before...
 
Effectively a good article; as for the speculation at the end, let's say it was meant as speculation about how things may or may not turn out.
 
The article shows the points some of us were making again and again:

According to Valve programmer Rich Geldreich, the principal benefit of GameWorks is that it gives Nvidia an opportunity to optimize segments of code that it wouldn’t normally control directly.

Developers say: Source code access is often vital.

“[T]here are fundamental limits to how much perf you can squeeze out of the PC graphics stack when limited to only driver-level optimizations,” Geldreich told ExtremeTech. “The PC driver devs are stuck near the very end of the graphics pipeline, and by the time the GL or D3D call stream gets to them there’s not a whole lot they can safely, sanely, and sustainably do to manipulate the callstream for better perf. Comparatively, the gains you can get by optimizing at the top or middle of the graphics pipeline (vs. the very end, inside the driver) are much larger.”

Geldreich spoke directly to this question, saying: “I don’t think NV is purposely tilting the table here. But if you look at the big picture, I wouldn’t preclude a sort of emergent behavior that could effectively tilt the competitive advantage to whatever vendor manages to embed the best developers into the most key game teams.”
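To make the pipeline point a bit more concrete, here's a rough, made-up sketch (all names are invented; this is not from the article or anyone's actual engine code) of why restructuring draws at the application level can do things a driver, sitting at the very end of the call stream, can't safely do on its own:

```cpp
#include <cstdio>
#include <vector>

// Toy stand-ins for engine/driver concepts, purely for illustration.
struct Mesh { int materialId; };
struct DrawCall { int materialId; int instanceCount; };

// Roughly what the driver ends up seeing: a flattened call stream.
// At this stage it can only tweak things locally; it can't know that
// many of these draws could have been merged higher up the pipeline.
void submitToDriver(const std::vector<DrawCall>& calls) {
    for (const auto& c : calls)
        std::printf("draw: material %d x%d\n", c.materialId, c.instanceCount);
}

// "End of the pipeline" view: one draw call per object.
void renderNaive(const std::vector<Mesh>& scene) {
    std::vector<DrawCall> calls;
    for (const auto& m : scene)
        calls.push_back({m.materialId, 1});
    submitToDriver(calls); // N calls; little a driver can safely collapse
}

// "Top of the pipeline" view: with source access, identical objects can be
// instanced before any API call is ever made.
void renderBatched(const std::vector<Mesh>& scene) {
    std::vector<DrawCall> calls;
    for (const auto& m : scene) {
        if (!calls.empty() && calls.back().materialId == m.materialId)
            ++calls.back().instanceCount;   // merge into the previous draw
        else
            calls.push_back({m.materialId, 1});
    }
    submitToDriver(calls); // far fewer calls reach the driver
}

int main() {
    std::vector<Mesh> scene = {{1}, {1}, {1}, {2}, {2}};
    renderNaive(scene);    // 5 draw calls
    renderBatched(scene);  // 2 draw calls
}
```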

I'm bummed that the article didn't expand further on the differences between Mantle and GW, which are brought up every single time. IMHO it's all about potential: GW is middleware through and through, it won't ever grow beyond that, while Mantle had the potential to become an industry standard. That and the fact that AMD stands behind standards way more than Nvidia.

I did become concerned about Mantle's further development, though.
 
Nobody here ever claimed that it doesn't give Nvidia more control or that it doesn't give them the opportunity for cheating. Of course it does.

But some seem to see GameWorks as nothing but a mean and evil plot, whereas Mantle is all for the good of humanity. Even the developers agree that the former is not the case.
 
I'm bummed that the article didn't expand further on the differences between Mantle and GW, which are brought up every single time. IMHO it's all about potential: GW is middleware through and through, it won't ever grow beyond that, while Mantle had the potential to become an industry standard. That and the fact that AMD stands behind standards way more than Nvidia.

Industry standards are not immune from this either; they have been abused and manipulated for the benefit of certain parties over the competition.

AMD has been consistently saying this about Intel for years:

As long as Intel remains in this dominant position, we may be materially adversely affected by Intel’s:

  • business practices, including rebating and allocation strategies and pricing actions, designed to limit our market share and margins;
  • product mix and introduction schedules;
  • product bundling, marketing and merchandising strategies;
  • exclusivity payments to its current and potential customers and channel partners;
  • control over industry standards, PC manufacturers and other PC industry participants, including motherboard, memory, chipset and basic input/output system, or BIOS, suppliers and software companies as well as the graphics interface for Intel platforms; and
  • marketing and advertising expenditures in support of positioning the Intel brand over the brand of its OEM customers.
Intel’s dominant position in the microprocessor market, its introduction of competitive new products, its existing relationships with top-tier OEMs and its aggressive marketing and pricing strategies could result in lower unit sales and a lower average selling price for our products, which could have a material adverse effect on us.

Heck, right now AMD is actively misusing the VESA name to spread FUD and confusion about a certain competitor's product/technology.
 
IMHO it's all about potential: GW is middleware through and through, it won't ever grow beyond that, while Mantle had the potential to become an industry standard.
I don't really find that a compelling differentiation (seems pretty subjective). Mantle as it is today will never be an industry "standard". If Mantle inspires other APIs (DX12, Metal at least to some extent) that's the relevant success criteria, right? And so if that is the case, wouldn't a company that supports standards declare victory, drop their proprietary version and encourage people to use the standardized one?

i.e. Do you disagree that if AMD continues to push Mantle on Windows once DX12 is available they deserve the same ire that GW is getting?
 
Heck, right now AMD is actively misusing the VESA name to spread FUD and confusion about a certain competitor's product/technology.
They're doing what now? :???:
Last time I checked, nothing Adaptive-Sync-related has been FUD
 
Ubisoft and NVIDIA Partner Up for Assassin's Creed Unity, Far Cry 4 And More

Ubisoft and NVIDIA today announced the next chapter in their strategic partnership bringing amazing PC gaming experiences to life in Ubisoft's highly anticipated upcoming titles including Assassin's Creed Unity, Far Cry 4, The Crew and Tom Clancy's The Division.
NVIDIA's GameWorks Team is working closely with Ubisoft's development studios to incorporate cutting edge graphics technology and gaming innovations to create game worlds that deliver unprecedented realism and immersion. NVIDIA's GameWorks technology includes TXAA antialiasing, which provides Hollywood-levels of smooth animation, soft shadows, HBAO+ (horizon-based ambient occlusion), advanced DX11 tessellation, and NVIDIA PhysX technology.

This announcement builds on the successful collaboration between Ubisoft and NVIDIA that added visually stunning effects to Tom Clancy's Splinter Cell Blacklist, Assassin's Creed IV Black Flag and Watch Dogs.

http://www.guru3d.com/news_story/ub...assassins_creed_unity_far_cry_4_and_more.html
 
Would I personally use GameWorks if I was making a game? Nope. But consumers would have every right to blame *me*, not NVIDIA if I decided to and it ran poorly on their hardware (be it NVIDIA, Intel, AMD or otherwise) just like any other middleware. Ultimately I'm responsible to my users for their experience with my game on whatever hardware I claim to support.
But is it like any other middleware? NVIDIA says that when you get a source code license it only applies to you, which is technically true... but I suspect disingenuous. Do embedded developers from the IHVs ever have trouble working with the source code of games with other source-level middleware? I suspect getting a waiver/NDA would be trivial.
 
Nobody here ever claimed that it doesn't give Nvidia more control or that it doesn't give them the opportunity for cheating. Of course it does.

But some seem to see GameWorks as nothing but a mean and evil plot, whereas Mantle is all for the good of humanity. Even the developers agree that the former is not the case.

Not really; people just presume Nvidia will behave anti-competitively.

Industry standards are not immune from this either; they have been abused and manipulated for the benefit of certain parties over the competition.

AMD has been consistently saying this about Intel for years:

Heck, right now AMD is actively misusing the VESA name to spread FUD and confusion about a certain competitor's product/technology.

You are correct; abuse and manipulation come easily when somebody has too much power over something. We can see that in your example about Intel.

We could also say that about VESA right now, but IMHO that's only because the only other "relevant" GPU IHV (nvidia, sorry Intel) is MIA. It's obvious that VESA could "take the side" of the IHV that actively tries to empower VESA!

What if nvidia tried to really support VESA/OpenCL/etc for a change?

I don't really find that a compelling differentiation (seems pretty subjective). Mantle as it is today will never be an industry "standard". If Mantle inspires other APIs (DX12, Metal at least to some extent) that's the relevant success criteria, right? And so if that is the case, wouldn't a company that supports standards declare victory, drop their proprietary version and encourage people to use the standardized one?

i.e. Do you disagree that if AMD continues to push Mantle on Windows once DX12 is available they deserve the same ire that GW is getting?

As I already mentioned in that same post, yes, I'm concerned. But that's a loaded question you bring up to paint Mantle in the way that most benefits your views. Why? Because you can't forget that DX12 will only work on Windows, and also because not supporting Windows would doom any development of Mantle on any other OS.

What changed is that, while I had no reason to doubt AMD before, now I may be concerned. That isn't because Mantle has changed yet; I'm just cautious that way.
 
People are just angered by GameWorks because their assumption is that Nvidia will abuse their position with the middleware. That stems from Nvidia's actions over many years, so you can't really blame consumers for assuming the worst from them.
 
You are correct; abuse and manipulation come easily when somebody has too much power over something. We can see that in your example about Intel.

Make no mistake, having your tech powering all the consoles would put you in a very powerful, strategic position. There is no reason to exempt AMD from the same scrutiny that is put on Intel or Nvidia.

We could also say that about VESA right now, but IMHO that's only because the only other "relevant" GPU IHV (nvidia, sorry Intel) is MIA. It's obvious that VESA could "take the side" of the IHV that actively tries to empower VESA!

What if nvidia tried to really support VESA/OpenCL/etc for a change?

OK, you have a fair point here, but I think it's a bit much to call what AMD has done in this context "empowering" VESA. All they did was get the "MSA TIMING PARAMETER IGNORE" option accepted into DisplayPort instead of it being limited to just eDP. That's it, a simple convention, inspired by G-Sync. The bulk of the work still needs to be done by participating display manufacturers, whose decision to implement it -- whether it is worth the hassle or not -- will very likely depend on the success of G-Sync.
 
Make no mistake, having your tech powering all the consoles would put you in a very powerful, strategic position. There is no reason to exempt AMD from the same scrutiny that is put on Intel or Nvidia.

There seems to be one, and it's a big one:

Mantle is part of the engine's back end; it's open, and NVIDIA can assist game developers if they wish to improve their game's performance. Adding Mantle support is not a waiver against NVIDIA. Also, at worst, your game engine can use any other back-end render codepath. It's not as if games are 'Mantle only' (which would amount to the above).

Using GameWorks, as AMD has reported, generates a waiver against AMD, so that such products cannot be even partially optimized for AMD cards, making the job of offering decent performance (not maximized, just decent!) very hard.

I see a huge difference here.
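To illustrate the "just another back-end codepath" argument, here's a minimal hypothetical sketch (every name is invented; this is not DICE's or anyone's real engine code): the game renders through an abstract backend interface, and Mantle is simply one implementation behind it, with the D3D11 path always there as a fallback for everyone else.

```cpp
#include <cstdio>
#include <memory>

// Hypothetical engine-side abstraction, purely to illustrate the point:
// Mantle sits behind the engine's backend interface, alongside D3D11,
// so a title is never "Mantle only".
struct RenderBackend {
    virtual ~RenderBackend() = default;
    virtual void drawFrame() = 0;
};

struct D3D11Backend : RenderBackend {
    void drawFrame() override { std::printf("D3D11 frame\n"); }
};

struct MantleBackend : RenderBackend {
    void drawFrame() override { std::printf("Mantle frame\n"); }
};

// Pick whichever path the hardware/driver actually supports.
std::unique_ptr<RenderBackend> createBackend(bool mantleSupported) {
    if (mantleSupported)
        return std::make_unique<MantleBackend>();
    return std::make_unique<D3D11Backend>();   // fallback path always exists
}

int main() {
    auto backend = createBackend(/*mantleSupported=*/false);
    backend->drawFrame();   // non-Mantle hardware simply gets the D3D11 path
}
```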
 
There seems to be one, and it's a big one:

Mantle is part of the engine's back end; it's open, and NVIDIA can assist game developers if they wish to improve their game's performance. Adding Mantle support is not a waiver against NVIDIA. Also, at worst, your game engine can use any other back-end render codepath. It's not as if games are 'Mantle only' (which would amount to the above).

Using GameWorks, as AMD has reported, generates a waiver against AMD, so that such products cannot be even partially optimized for AMD cards, making the job of offering decent performance (not maximized, just decent!) very hard.

I see a huge difference here.

Not to mention that we already have examples today with BF4 and Thief: do those games run badly on the DX11 path? Did Mantle create a situation where Nvidia was unable to optimize their drivers or to work with DICE developers to optimize the game for their GPUs? And did it prevent them from releasing a "special" DX11 driver that increases DX performance for those titles? No.
 
But GPUs have scalers (or can scale); can the GPU scaler be used instead of the panel scaler?


Yes, but only if the GPU controls all input to the monitor. Usually monitors have multiple inputs and an OSD to control brightness, etc. If you get rid of the scaler, you lose such functionality.
 
Make no mistake, having your tech powering all the consoles would put you in a very powerful, strategic position. There is no reason to exempt AMD from the same scrutiny that is put on Intel or Nvidia.

I'm not asking for an exception to be made for AMD; it's the opposite. We can't with a straight face scrutinize one IHV if we don't do the same to every other.

OK, you have a fair point here, but I think it's a bit much to call what AMD has done in this context "empowering" VESA. All they did was get the "MSA TIMING PARAMETER IGNORE" option accepted into DisplayPort instead of it being limited to just eDP. That's it, a simple convention, inspired by G-Sync. The bulk of the work still needs to be done by participating display manufacturers, whose decision to implement it -- whether it is worth the hassle or not -- will very likely depend on the success of G-Sync.

G-Sync was probably inspired by eDP, which is a VESA standard. Nvidia took that and concocted a proprietary solution. It may make business sense, but it obviously takes power away from VESA. AMD apparently shoehorned a response out of VESA with Adaptive-Sync.

AMD did the same with DockPort, a "response" to Thunderbolt.
 
OK, you have a fair point here, but I think it's a bit much to call what AMD has done in this context "empowering" VESA. All they did was get the "MSA TIMING PARAMETER IGNORE" option accepted into DisplayPort instead of it being limited to just eDP. That's it, a simple convention, inspired by G-Sync. The bulk of the work still needs to be done by participating display manufacturers, whose decision to implement it -- whether it is worth the hassle or not -- will very likely depend on the success of G-Sync.

Eh? A change that is probably going to cost a few pennies, or a few USD at most, versus something that is potentially 100 USD or so for monitor vendors to implement? And both achieve the same results, although reports are that Gsync doesn't do well when the framerate drops below 35 fps (http://anandtech.com/show/7582/nvidia-gsync-review/3). It doesn't even do anything if the framerate drops below 30. OTOH, Adaptive Sync supports framerates as low as 9 (http://techreport.com/news/26451/adaptive-sync-added-to-displayport-spec). Granted, gaming at 9 fps isn't going to be pleasant, but at least it'll be tear-free. :p

I do applaud Nvidia for pushing to make something like this happen. I just wish they'd done it in a way similar to AMD, by adapting an existing method from the notebook display side of things to the desktop side.

But regardless, it's only a matter of time before Nvidia drops the Gsync hardware requirement for monitors and just uses the VESA standard Adaptive Sync. Although I would expect them to still call it Gsync when they list the feature on their graphics cards.

And in the end, consumers win. Nvidia pushes the tech into the limelight. AMD comes along and makes the proprietary Nvidia hardware redundant. And consumers get to experience it at minimal cost.

What I wonder about the most is how it will handle windowed gaming, or if it'll only be available when gaming full screen. I'm expecting it'll be full screen only, unfortunately.

Regards,
SB
 