NVIDIA Game Works, good or bad?

When 'open' (or access) is the core tenet of AMD's PR campaign, it's hard to see how this can't be on the table.

Whatever value AMD and Intel place on openness, big or small, is very hard to see in Nvidia, GameWorks included.

Has Nvidia been given access to TressFX?

I don't work for Nvidia or AMD, so I can't say.

That's what this entire discussion is about... NVIDIA will happily claim their stuff is "open" too. Everyone says it - it's a PR fluff word. If you want to criticize NVIDIA's definition I can absolutely criticize AMD's.

I believe Nvidia never stated that GameWorks was open.

People who think any for-profit company wouldn't do exactly the same thing given the opportunity are the ones being naive.

I know companies will do those things, and you should know that I know, since I have repeatedly expressed my opinion that AMD has a lot to prove with Mantle.

I just have no problem with saying that something is bad when it's bad.

As you yourself said, Nvidia abused tessellation repeatedly, but according to your view, "They should change that... but it's not their fault. They are a company anyway, therefore evil. There's no reason to hold them accountable for anything."

This minimization of the issue goes completely against your expressed position that ISVs should be educated about GameWorks.

Feel free to rephrase that as you think it should be, if I'm somehow misunderstanding your "we can only complain and be jealous" motto.
 
As you yourself said, Nvidia abused tessellation repeatedly, but according to your view, "They should change that... but it's not their fault. They are a company anyway, therefore evil. There's no reason to hold them accountable for anything."
That's not what I said. I said in the case of the tessellation stuff they should be called out for it given that there was no real visual advantage (in the Crysis case that was pretty definitively shown - TBD on the fur accusations). That does not automatically mean everything they do is the same situation. Clearly they didn't need to ship binary DLLs to do it in the past which is my point - GameWorks hardly changes anything. They are separate issues.
 
Since it's sort of related:
GRID Autosport has just been released, the second game (if I've got that right) to feature AVX2-exclusive features.
 
That's not what I said. I said in the case of the tessellation stuff they should be called out for it given that there was no real visual advantage (in the Crysis case that was pretty definitively shown - TBD on the fur accusations). That does not automatically mean everything they do is the same situation. Clearly they didn't need to ship binary DLLs to do it in the past which is my point - GameWorks hardly changes anything. They are separate issues.

It does make it easier and more systematic.

Since it's sort of related:
GRID Autosport has just been released, the second game (if I've got that right) to feature AVX2-exclusive features.

Since AVX2 is both useful and accessible to AMD (though they haven't implemented it yet), I'd say that's a good thing.
 
Since it's sort of related:
GRID Autosport has just been released, the second game (if I've got that right) to feature AVX2-exclusive features.
I think I corrected you on this last time (assuming they are what I think)... these are not Intel CPU-specific features (i.e. AVX2), they are Intel *GPU* features. i.e. they use pixel synchronization, an Intel graphics API extension. If you run an Intel CPU + discrete GPU you can't use them.
 
Since it's sort of related:
GRID Autosport has just been released, the second game (if I've got that right) to feature AVX2-exclusive features.
Only Advanced Blending (OIT) is exclusive to Intel GPUs in AutoSport as part of the PixelSync tech; smoke shadows are no longer exclusive and are available to all GPUs (in contrast to GRID 2, where both were exclusive).
 
Only Advanced Blending (OIT) is exclusive to Intel GPUs in AutoSport as part of the PixelSync tech; smoke shadows are no longer exclusive and are available to all GPUs (in contrast to GRID 2, where both were exclusive).
Cool, yeah I think the smoke stuff was important enough for their desired look that they wanted to implement a general path - even if slower - for other GPUs, but I don't know the details.

Anyways just wanted to point out it's GPU features, not CPU/AVX :)
 
Wow, this thread could look good or bad depending on how much time you have all spent being creative, objective, or subjective at the same time in order to push your opinions.

At the end of the day, this will not change anything:

- Nvidia has a plan, a strategy (one that has already worked in the past), and whatever is or isn't said about GameWorks will not change that strategy. (It started eight years ago, and they have never deviated from it by one degree.)

- AMD has a plan and a strategy too, and whatever is said here will certainly not change anything for them either.

- Intel also has its own strategy, and whatever is written here in this thread will, again, not change anything for it.

That said, one problem remains: over the next five years it will be really important to inform consumers about what they see, what they read, and what they buy. Should I build a 4x GPU setup next? 2x Nvidia GPUs + 2x AMD GPUs?
It will be really important for independent sources to inform reviewers about what they are testing and what their results mean.

For the next few years, anyway, I think a lot of reviewers will do what some already do today: try to remove games using GameWorks from their tests and benchmarks (of course not every site does it today, but for their credibility it will be necessary). And I don't think AMD will fail to react: reviewers will surely need to do the same for the AMD Gaming Evolved titles.

Andrew, today you seem to defend GameWorks; I hope I won't see you attack AMD tomorrow, or even support them, when you see them do the same... because they will be forced to do it, and they surely will (in reality it has already started).

In reality, when I read this whole thread, I think:
what a bad, bad day for the gaming industry, and a real black day for gamers, and we are not even talking about consumers yet.

When today or tomorrow you find that Crossfire does not work in the latest Batman game because the developers can't integrate it (thank you, GameWorks), and you get reviews telling you: if you have SLI it's perfect, if you use CFX it isn't... The same for AC4: bugs can't be fixed on AMD (flashing textures and performance problems) because of GameWorks. The developers don't even try; they need their Nvidia consultant for it, who has suddenly vanished into thin air and is now working on Watch Dogs (this was six months ago). Funny...

It's funny, because when I look at GameWorks vs Gaming Evolved today, I see Ubisoft vs Electronic Arts... So what's next? Ubisoft games working perfectly on Nvidia and EA games perfectly on AMD GPUs?
 
Lanek ... you seem to defend only AMD. Unfortunately AMD has nothing comparable to GameWorks, because they either lack sufficient know-how or the willingness to spend the resources ... that really is not Nvidia's fault or problem; it is just what the company's corporate strategy is.

But you are correct about Nvidia, AMD, and Intel each having their own strategy. And each strategy is meant to benefit not their competitors but the people who are using their products. If you think any company is doing anything out of the goodness of their heart, then you may be a little naïve.

For example, just like pharmaceutical companies that spend huge amounts on R&D, trade secrets are not shared freely with competitors; do not expect this to happen in the graphics industry.
 
That's not what I said. I said in the case of the tessellation stuff they should be called out for it given that there was no real visual advantage (in the Crysis case that was pretty definitively shown - TBD on the fur accusations). That does not automatically mean everything they do is the same situation. Clearly they didn't need to ship binary DLLs to do it in the past which is my point - GameWorks hardly changes anything. They are separate issues.

They are as separate as turning a blind eye to a man who threatens you one day and buys a gun in the next.

I think we have discussed this enough.
 
I dunno, I think they should compete on performance and features. I'd much rather both companies try to improve performance on all hardware than cripple it.

If you want to set yourself apart, then do something like Nvidia's 3D support or the new recording feature from AMD.
 
Cool, yeah I think the smoke stuff was important enough for their desired look that they wanted to implement a general path - even if slower - for other GPUs, but I don't know the details.
Is there any chance that by the release of DX12 hardware, these exclusive options would be available for AMD/NVIDIA GPUs to run?
If you want to set yourself apart, then do something like Nvidia's 3D support or the new recording feature from AMD.
Actually NVIDIA has had the recording feature for more than 6 months now (it's called ShadowPlay); AMD has yet to catch up to it on both compatibility and functionality.
 
Lanek ... you seem to defend only AMD. Unfortunately AMD has nothing comparable to GameWorks, because they either lack sufficient know-how or the willingness to spend the resources ... that really is not Nvidia's fault or problem; it is just what the company's corporate strategy is.

But you are correct about Nvidia, AMD, and Intel each having their own strategy. And each strategy is meant to benefit not their competitors but the people who are using their products. If you think any company is doing anything out of the goodness of their heart, then you may be a little naïve.

For example, just like pharmaceutical companies that spend huge amounts on R&D, trade secrets are not shared freely with competitors; do not expect this to happen in the graphics industry.

TressFX is not only hair or fur creation and animation; it works for fire animation, for natural environment animation (weeds, plants, and forests), and for particles (smoke, though I have a doubt about that, or perhaps it's a second feature). It was called TressFX because it was used for Lara Croft's hair animation. I think the PowerPoint you can find on AMD's site about how to use it is worth reading (everything is described, including how to use the code and how it works). TressFX is really just one example of what this code can do; you can adapt it and use it for whatever you want.

In general, AMD has offered all of its techniques to Microsoft (DirectX) or directly to developers, whereas Nvidia creates a library under its own label (which, in this case, is a genuinely good library, well finished and ready to use, and that reduces implementation cost for developers).
 
Actually NVIDIA has had the recording feature for more than 6 months now (it's called ShadowPlay); AMD has yet to catch up to it on both compatibility and functionality.

Oh my god... right, as if VCE recording in the Sony/MS consoles gave AMD no example of its usage, and as if AMD was NOT even able to test it over hundreds of different settings, right? Come on...
 
Is there any chance that by the release of DX12 hardware, these exclusive options would be available for AMD/NVIDIA GPUs to run?
If a similar feature is available in DirectX feature level 12, game developers will have to modify their code to use it (i.e. it won't just automatically start working). But yes in theory eventually other vendors should be able to support pixel synchronization.
 
If a similar feature is available in DirectX feature level 12, game developers will have to modify their code to use it (i.e. it won't just automatically start working). But yes in theory eventually other vendors should be able to support pixel synchronization.

What's the difference between the Intel OIT and the one AMD demoed already years ago?
 
What's the difference between the Intel OIT and the one AMD demoed already years ago?


The AMD version has unbounded memory requirements, which means it can unpredictably fail on complicated scenes. This makes developers somewhat wary of using AMD's OIT algorithm.

The Intel OIT demo uses special hardware support to implement a bounded memory OIT that approximates the transparency.

Besides the stability improvement, I think the Intel version is likely to have better performance characteristics, as well.
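To make the unbounded-memory problem concrete, here is a toy CPU-side sketch (not real shader code; the function name, node budget, and scene numbers are all illustrative assumptions): in the DX11 linked-list approach, every transparent fragment consumes a node from one global buffer, so a sufficiently complex scene overflows any fixed allocation and fragments get silently dropped.

```python
# Toy illustration of why DX11 per-pixel linked-list OIT has unbounded
# memory needs: every transparent fragment appends a node to one shared
# buffer, so a complex scene can exhaust any fixed allocation.

def render_linked_list_oit(fragments_per_pixel, node_budget):
    """fragments_per_pixel: list of transparent-fragment counts, one per pixel.
    Returns (nodes_used, pixels_with_dropped_fragments)."""
    used, dropped_pixels = 0, 0
    for count in fragments_per_pixel:
        stored = min(count, node_budget - used)  # take what's left in the buffer
        used += stored
        if stored < count:                       # buffer exhausted: fragments lost
            dropped_pixels += 1
    return used, dropped_pixels

# A simple scene fits comfortably; a smoke/foliage-heavy scene overflows,
# and which pixels lose fragments depends only on submission order.
print(render_linked_list_oit([2, 3, 1], node_budget=100))      # (6, 0)
print(render_linked_list_oit([50, 80, 200], node_budget=100))  # (100, 2)
```

The point of the sketch is that the failure is data-dependent: the same budget that is wasteful for one scene is insufficient for another, which is exactly why developers are wary of shipping it.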
 
Can't you code a boundary, like you can code to prevent buffer overruns?
Yes, but you don't really have any good options once you run out of memory. Basically in the case of "too much transparency" you just have to drop fragments. This obviously produces artefacts and unfortunately you can't really control where and how serious the artefacts are. For instance, once you run out of space you may end up with 200 nodes for one pixel and none for some other pixel... the situation can become arbitrarily bad.

The Intel pixel synchronization version continually compresses/approximates the transparency function in a way that minimizes the visible errors (and, crucially, allocates a fixed amount of memory per pixel rather than across the entire scene). This "streaming compression" requires the ability to handle incoming fragments at a given pixel sequentially, which is what the Intel extension provides.

The extension is useful in a lot of other cases too, but OIT is definitely a big impact one that game developers have been wanting for a long time.

As noted, the Intel version (adaptive transparency) is also a lot faster than the general DX11 linked list one for a variety of reasons.
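As a rough illustration of that streaming-compression idea, here is a simplified CPU-side sketch (not Intel's actual algorithm; `K`, the merge-error heuristic, and all names are assumptions for illustration): each pixel keeps a fixed budget of fragments, and when a new one would exceed it, an adjacent pair is merged so that the combined transmittance is preserved while the depth error stays small.

```python
# Simplified sketch of streaming transparency compression: each pixel
# stores at most K (depth, alpha) fragments; overflow triggers a merge of
# the adjacent pair whose collapse perturbs the visibility curve least.
import bisect

K = 4  # fixed per-pixel fragment budget

def insert_fragment(frags, depth, alpha):
    """frags: depth-sorted list of (depth, alpha) tuples for one pixel."""
    bisect.insort(frags, (depth, alpha))
    if len(frags) > K:
        best_i, best_err = 0, float("inf")
        for i in range(len(frags) - 1):
            (d1, a1), (d2, a2) = frags[i], frags[i + 1]
            # heuristic cost: occlusion moved to the wrong depth by merging
            err = min(a1, a2) * (d2 - d1)
            if err < best_err:
                best_i, best_err = i, err
        d1, a1 = frags[best_i]
        d2, a2 = frags[best_i + 1]
        # merged alpha preserves combined transmittance: 1-(a1+a2-a1*a2) = (1-a1)(1-a2)
        frags[best_i:best_i + 2] = [(d1, a1 + a2 - a1 * a2)]
    return frags

def transmittance(frags):
    t = 1.0
    for _, a in frags:
        t *= 1.0 - a
    return t

# Feed 10 fragments through the 4-node budget: storage stays bounded per
# pixel, and total transmittance is preserved (~0.8**10).
pixel = []
for i in range(10):
    insert_fragment(pixel, depth=i * 0.1, alpha=0.2)
print(len(pixel))                      # never exceeds K
print(round(transmittance(pixel), 6))  # ~0.107374, i.e. 0.8**10
```

The real hardware-assisted version does this per pixel in the shader, which is why it needs the pixel synchronization extension: fragments at a pixel must be processed one at a time for the running compression to be well defined.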
 