NVIDIA GameWorks, good or bad?

Mantle is as open as CUDA. The only difference is CUDA is actually well documented and independent third parties have contributed significantly to CUDA libraries and optimizations.

Mantle isn't open just because AMD says it's a future possibility. You can say that for anything.
 
Haha isn't that still in the future?

Either way I'm not sure what the long game is here. The whole CPU overhead angle will be dead with DX12 on the horizon. There will be no incentive for developers to code a Mantle path unless it offers something above and beyond DirectX.
 
Mantle is as open as CUDA. The only difference is CUDA is actually well documented and independent third parties have contributed significantly to CUDA libraries and optimizations.

Mantle isn't open just because AMD says it's a future possibility. You can say that for anything.

CUDA has also been a released product for many years with several versions and many implementations in the field. Mantle is still in beta.
 
Haha isn't that still in the future?

Either way I'm not sure what the long game is here. The whole CPU overhead angle will be dead with DX12 on the horizon. There will be no incentive for developers to code a Mantle path unless it offers something above and beyond DirectX.

It's in the future but it's not just a possibility, it's a commitment.
 
CUDA has also been a released product for many years with several versions and many implementations in the field. Mantle is still in beta.


Yes, it's in beta, but that doesn't give it a free pass to claim openness. What exactly is open about it besides (so far) empty promises? Also, what incentive is there for any other IHV to develop Mantle support? Intel is already showing off DX12 demos.
 
Yes, it's in beta, but that doesn't give it a free pass to claim openness. What exactly is open about it besides (so far) empty promises? Also, what incentive is there for any other IHV to develop Mantle support? Intel is already showing off DX12 demos.

How are they "empty promises"? By your definition, every single promise ever made would be an "empty promise" to begin with.
 
Also what incentive is there for any other IHV to develop Mantle support? Intel is already showing off DX12 demos.

Mantle is available now, I guess? DX12 will be available who knows when. Even if Mantle were fully open and an SDK or whatever were available, I'm not sure it would make a difference. I can't see Nvidia using it.

The fact that it's not "released" and isn't available for other IHVs to implement is somewhat hurting AMD as well. I guess it's a trade-off between wanting to sell more AMD chips with Mantle, and making Mantle more widespread in the hope of selling more chips.
 
How are they "empty promises"? By your definition, every single promise ever made would be an "empty promise" to begin with.


Not necessarily. I'm just saying that PR talk isn't enough to claim openness. Several people are claiming Mantle is open. It's not, in any sense of the word.

@Malo AMD did us all a great service by lighting the fire under Microsoft's and Khronos' collective asses. DirectX 12 isn't that far behind though so there's a very narrow window for AMD to capitalize.
 
If Mantle is already appearing in *some* games, then officially it should be out of beta, and if so, why the need for NDAs and secrecy?

Trini is right ... definitely not open by any "stretch of the imagination".
 
If Mantle is already appearing in *some* games, then officially it should be out of beta, and if so, why the need for NDAs and secrecy?

Trini is right ... definitely not open by any "stretch of the imagination".

It's not out of beta; they made that clear ages ago, despite the fact that some games already support it.
 
Answering the upset mod in the other (closed) thread: it's not that game developers are incompetent, lacking in passion, or unwilling to make better games, and nobody claimed that either. It's that a visualization library requires a particular expertise that is probably not all that common among game developers. When SIGGRAPH has articles about how to render water or hair just a little better year after year, I can only conclude that it's not something that poorly paid game programmers can implement in a few weeks.

And when faced with the choice of working on a game-specific feature (story line, ...) or something generic (hair, clothes, ...), it totally makes sense to take an off-the-shelf solution. Nvidia wouldn't do this stuff if there wasn't a demand for it.
 
I'm okay with Nvidia providing a solution, even a closed one, as long as it isn't made with the intention of ruining their competition's performance, which is what a lot of people believe, myself included, about the tessellation in HairWorks.
Devs are free to choose whether to use Nvidia's or another solution, and the solution can even be biased toward a certain vendor, but the bias should come from necessity, not from sacrificing performance because the other vendor will be hit even harder.
Anyway, since you can actually pay to get the source code (as long as you don't disclose it to others), what's stopping AMD from paying for it? Or does Nvidia specifically say that other hardware vendors can't see their GameWorks/HairWorks code? If Nvidia is actively blocking AMD even when AMD is willing to pay, then it's not okay.
 
I think you're allowed to share the source with other licensees, so if AMD had a license they could. But who knows whether either AMD or Nvidia would be interested in that arrangement.
 
Of course Nvidia can block AMD from buying the source code. Nvidia's ambition (largely fulfilled) is to be the one stop shop for everything graphics. They do not want to be just another chip company. They want you to think Nvidia when you think graphics, not just an interchangeable component maker.

GameWorks is just a small part of that. There's G-Sync for monitors. There's Iray for high-quality rendering. There's 3D Vision for stereo. There are CUDA libraries for neural networks. They have the biggest (and only? ;) ) conference dedicated to GPU computing. They have the fastest Android gaming devices.

They invest tons of money in it, and it seems to work. I rarely watch anything game related on YouTube, but I'll replay those cool videos with water and physics destruction over and over again. And at the beginning and the end, it shows the Nvidia logo. Year by year, they build up the image that they are the ones pushing graphics forward. (See also the moon landing demo with Maxwell.) When was the last time AMD showed something like that?

It must cost millions and then people simply expect them to just open it up and give it all away.

As for tessellation: how long has AMD been worse at it? Six years? It's starting to get a bit old, isn't it? If Nvidia sees tessellation as a way to implement their effects and it doesn't hurt their performance, then more power to them. I'm still waiting for AMD to show the first proof that Nvidia is actively sabotaging AMD GPUs.
 
It must cost millions and then people simply expect them to just open it up and give it all away.
No, I expect them to (A) not compete unfairly and not attempt to corner the entire market,* and (B) not torpedo game performance for people with hardware other than their own while attempting to fulfil (A)...

*Yes, I'm an unrealistic, brainless socialist, you don't need to remind me. I know already.
 
That is where you're wrong: it does hurt Nvidia's performance, it just hurts AMD more. Again, I'm okay with Nvidia leveraging their hardware, but from what I see, Nvidia could use saner settings that would still benefit their hardware, or better yet, they could give devs access so they can implement a user setting to change the tessellation level. And the code in question isn't driver code but game/FX code. If AMD were to make a game, wanted to use the code, and was willing to pay to see the source, should AMD be blocked? And you forget about AMD's hair library (TressFX), where I believe you can see the code.
If every graphics vendor did their own closed library, it would be a mess for future PC gaming.
 
No, I expect them to (A) not compete unfairly
What does that mean? Is it unfair to use features that happen to have the best performance on your own products? Is it unfair to spend millions and then not give it away for free?

and (B) not torpedo game performance for people with hardware other than their own while attempting to fulfil (A)...
"Hey, this algorithm is easiest to implement with static tessellation settings, and it's guaranteed to look best when those settings are high. Or should we spend extra effort to make sure it runs well on that braindead geometry architecture of our competitors?"

What would you do? They're not a charity.
 