NVIDIA GameWorks, good or bad?

Matrox supported this too; it was just ATi's name for N-patches
I am not sure, do you have a source for that?

These need to be separated, as TressFX is open for anyone to view and optimize, while GameWorks is hidden from AMD/Intel and from devs who don't pay for a licence
This classification only deals with technologies locked to specific hardware. Neither TressFX nor GameWorks is locked in this way.

TrueAudio is supported by at least 2 games, Thief and Lichdom: Battlemage
I stand corrected, 2 games it is.
 
I am not sure, do you have a source for that?


http://ixbtlabs.com/articles/matroxparhelia512/

They at least mention it; IIRC there was more about it when it was still relevant.
(And as for TruForm being N-patches, it was even called that in some games; I think one was Neverwinter Nights.)
 
SIGGRAPH 2015

In other Nvidia news the firm has been trailing its appearance at SIGGRAPH 2015. The green team reckons that attendees will be asking "Is it real or is it rendered?" at Nvidia's booth 500 exhibit area. Nvidia claims that its rendering and light simulation software, and latest suite of design tools, "make it virtually impossible to tell the real from the rendered."



http://hexus.net/tech/news/graphics/85388-ucla-demos-geforce-powered-augmented-reality-sandbox/
 
AMD has been pretty vocal about VR recently. We can only assume they have some advantage there. How dare they push that!
Non sequitur. A: AMD has no VR to push. Yeah, they've been talking about it, but it's just talk, and you, I and everybody else know it. At this stage it's doubtful the company will still even be around by the time VR catches on in the mainstream (assuming it ever does, that is). B: there's been no talk at all of any AMD-proprietary VR.

So not the same thing, at all.
 
Non sequitur. A: AMD has no VR to push. Yeah, they've been talking about it, but it's just talk, and you, I and everybody else know it. At this stage it's doubtful the company will still even be around by the time VR catches on in the mainstream (assuming it ever does, that is). B: there's been no talk at all of any AMD-proprietary VR.

So not the same thing, at all.
Contrary to what you claim, the LiquidVR SDK is available; AMD has just as much VR to talk about as NVIDIA does with their VR Direct. At least Oculus has also used AMD hardware on several occasions, not just NVIDIA's
I can't find the link right now, but IIRC around the time LiquidVR was released, some Oculus employee said he was much more impressed with the stuff AMD had made available to them, compared to what NVIDIA was offering
 
So where is AMD's proprietary VR then? Where can I buy it? Where are the games that support it?

Oh. Nowhere, you say? Well, okay then! ;)
 
Well, I didn't say anything about NV VR, so not sure what you're laughing about? :p
Not sure myself either, got a few drinks in me though. I suppose I was assuming it was the usual comparison to NVIDIA, which has also been vocal about their VR support
 
I suppose I was assuming it was the usual comparison to NVIDIA, which has also been vocal about their VR support
Nah, I'm against all kinds of artificial divisions in the marketplace, including platform exclusives and so on. Like, if Oculus is going to release games that only work with their 3D glasses, I won't be buying their product. Proprietary software puts the normal mechanisms of market competition out of order; you no longer compete with the product you have, but rather with the size of your pocketbook, by buying additional mercenary support for your product.

I think it's crap, for that reason and also, what if my Oculus goggles break, and next time I desire a different brand of goggles because they're better? I'm just going to give up part of my software library as well then?

Proprietariness almost always ends up holding the market back. 3dfx did that for several years, but the company went down the shitter because of their own incompetence, so that's OK. MS has successfully held the market back numerous times, for many years each time; the browser wars are just one such example. But we're going a bit off-topic here, so I'll wind this post down now... ;)
 
Originally open (v1.0 and 2.0), then locked to Creative hardware. Now opened to run on all CPUs via the OpenAL initiative

I thought the only extensions that were made public with OpenAL were the ones up to EAX 4.0, and EAX 5.0 died a bastard's death.
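
For reference, the "opened to run on all CPUs" part lives on as OpenAL's EFX extension: the old proprietary EAX calls became a vendor-neutral effects API that any implementation can run in software. Here's a minimal sketch of setting up an EAX-style reverb through EFX; this assumes OpenAL Soft (which ships <AL/efx.h>), and the source/buffer plumbing is trimmed to the essentials:

```cpp
#include <AL/al.h>
#include <AL/alc.h>
#include <AL/efx.h>
#include <cstdio>

int main() {
    ALCdevice* dev = alcOpenDevice(nullptr);
    if (!dev || !alcIsExtensionPresent(dev, "ALC_EXT_EFX")) {
        std::fprintf(stderr, "no device or no EFX support\n");
        return 1;
    }
    ALCcontext* ctx = alcCreateContext(dev, nullptr);
    alcMakeContextCurrent(ctx);

    // EFX entry points are fetched at runtime, like GL extensions.
    auto pGenEffects = (LPALGENEFFECTS)alGetProcAddress("alGenEffects");
    auto pEffecti    = (LPALEFFECTI)alGetProcAddress("alEffecti");
    auto pGenSlots   = (LPALGENAUXILIARYEFFECTSLOTS)alGetProcAddress("alGenAuxiliaryEffectSlots");
    auto pSloti      = (LPALAUXILIARYEFFECTSLOTI)alGetProcAddress("alAuxiliaryEffectSloti");

    // Create a reverb effect -- the CPU-side stand-in for EAX reverb --
    // and attach it to an auxiliary effect slot that sources can feed.
    ALuint effect = 0, slot = 0;
    pGenEffects(1, &effect);
    pEffecti(effect, AL_EFFECT_TYPE, AL_EFFECT_REVERB);
    pGenSlots(1, &slot);
    pSloti(slot, AL_EFFECTSLOT_EFFECT, (ALint)effect);
    std::puts("EFX reverb slot ready (route sources via AL_AUXILIARY_SEND_FILTER).");

    alcMakeContextCurrent(nullptr);
    alcDestroyContext(ctx);
    alcCloseDevice(dev);
    return 0;
}
```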
 
I always say that I don't mind Nvidia providing these libraries. What I do mind is Nvidia making something that is harmful even on their own hardware, and they still do it because the impact is more severe for their competitor. If a dev made a game using tessellation for flat objects or objects hidden from view, that dev would be called incompetent. With Nvidia providing it, Nvidia supporters call it leveraging their hardware.
Actually, I would prefer Nvidia to make their enhancements proprietary and not accessible without Nvidia hardware, so devs would need to think more before using GameWorks or other Nvidia stuff. It would probably be more optimized too, since they wouldn't have to think about the performance of their code on other hardware. At least that way we could prevent GameWorks from becoming an essential part of a game (like using HairWorks in a game where grass is an essential gameplay element for hiding). It would also prevent the possibility of Nvidia sabotaging AMD performance, since it couldn't run on AMD hardware in the first place.

Also, if an IHV has an exclusive hardware feature that they want to use, I'm really okay with them making exclusive stuff with it. If AMD can't do tessellation and NV can and they want to show it off, that's okay.
If AMD wants to utilize their TrueAudio stuff, I would really be happy.
Instead of this hardware sitting idle, I would prefer for it to be utilized.

Having said that, though, if the proprietary stuff can actually run on other hardware with similar or even better performance than on NV hardware, I would be really annoyed. For example, on Android there are Tegra versions of games with better graphics than the non-Tegra versions, which you can hack to run on other hardware, where they run better than on Tegra. Or another game where, if it detects a Tegra SoC, it will display better effects (which again, if you hack it, other SoCs can run better).
You can see it as Nvidia actively trying to help make better games, or you can see it as Nvidia actively blocking other IHVs from reaching their full potential.

How can someone read this next post and not see it as harmful behavior by Nvidia?

Ok, well, let's take Crysis 2 for example, since AMD's Huddy complained about that recently. Was the dev really incompetent, or was it nV's attempt to exclude AMD hardware, when Huddy stated that they used too much tessellation?

Ever since CryEngine 1, that water plane has always been visible under the terrain. The only way to remove the water plane was to get the camera far enough away; same goes for CryEngine 2. Complaining about this is a moot point; the engine wasn't set up for anything else. Now let's take the objects: were they too tessellated? I agree some of those details will show up from the normal and AO maps, but the problem is that the tessellation factor is for ALL objects in an entire scene, not individual objects in a scene.

So now this is similar with Batman or any other game out there. Can't we see the difference in Witcher 3's HairWorks, with AMD's slider?

Crysis 2 tessellation was on USELESS objects, according to reports http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2
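
To put rough numbers on the "too much of a good thing" claim: with uniform tessellation, the triangle count grows roughly with the square of the factor, whether or not the surface gains any visible detail. A back-of-the-envelope sketch (quad-domain approximation; the patch count for the slab is made up for illustration, not measured from Crysis 2):

```cpp
#include <cstdio>
#include <initializer_list>

int main() {
    // Hypothetical: a flat slab built from this many quad patches.
    const long long patches = 50;
    for (int f : {1, 4, 8, 16, 32, 64}) {
        // An integer factor f splits a quad patch into f*f cells,
        // i.e. roughly 2*f*f triangles -- even if the surface is flat.
        long long tris = 2LL * f * f * patches;
        std::printf("tess factor %2d -> ~%lld triangles for the same flat slab\n", f, tris);
    }
    return 0;
}
```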
 
How can someone read this next post and not see it as harmful behavior by Nvidia?



Crysis 2 tessellation was on USELESS objects, according to reports http://techreport.com/review/21404/crysis-2-tessellation-too-much-of-a-good-thing/2


I don't care which article says what; unless you have tried to develop an application or game with CryEngine 1, 2 or 3, you don't really know what the limitations of said engine are. I can go into a slew of limitations which will force a developer to make design decisions that actually limit the way the artwork is created. Respectfully, regarding the article you have linked to: they are very good peeps and know graphics cards well, but as above, without really understanding the engine, how that differs from what is being rendered, and how it's being rendered, it falls short.
 
Yes, because obviously CryTek needed to make a concrete slab like this:

[image: wireframe of a flat, heavily tessellated concrete slab from Crysis 2]


And whoever says otherwise doesn't really know what the limitations of said engine are.
 
I don't care which article says what; unless you have tried to develop an application or game with CryEngine 1, 2 or 3, you don't really know what the limitations of said engine are. I can go into a slew of limitations which will force a developer to make design decisions that actually limit the way the artwork is created. Respectfully, regarding the article you have linked to: they are very good peeps and know graphics cards well, but as above, without really understanding the engine, how that differs from what is being rendered, and how it's being rendered, it falls short.

I'm pretty sure that no version of CryEngine forces you to use thousands of triangles for utterly flat surfaces.
 
I'm pretty sure that no version of CryEngine forces you to use thousands of triangles for utterly flat surfaces.


Crysis 2 was never part of GameWorks; GameWorks wasn't even around then. Crytek made the tessellation code themselves, they didn't use any premade code from nV ;)

And so what is this about again? If you only look at one or two objects, that doesn't tell you anything. Unless you're saying that one object is enough to crush AMD performance!

The engine doesn't force anything, but with default settings, if a dev doesn't change the tessellation factors in the ini or cfg files, they will stay the same. Blaming this on nV is stupid, but Huddy has nothing else he can do but point at nV, because if he pointed at Crytek, do you think Crytek would work well with AMD again?


http://docs.cryengine.com/display/SDKDOC2/Tessellation+and+Displacement

Tessellation even in CryEngine 3 has limited factors, but at each factor it is clearly visible that the higher setting looks better. You can also have different types of tessellation for even more detail, and you can tessellate shadows as well for increased realism; I don't know how beneficial that would be in gaming, but the feature is there.

Now, the way the hull shader works, it only gets data at certain times, so whether one object or many objects are being tessellated won't really matter; what matters is how much procedural geometry is created. In AMD's case it will stall the pipeline, and it will also put more pressure on the shader array if complex shaders are being used at the same time.
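
And on AMD's slider mentioned earlier: conceptually (this is a sketch, not actual driver code), all the Catalyst tessellation control does is clamp whatever factor the app's hull shader requests before the fixed-function tessellator amplifies the geometry, which is why a capped scene keeps its shape but sheds most of the triangle load:

```cpp
#include <algorithm>
#include <cstdio>

// Conceptual sketch of a driver-side tessellation cap (not real driver code).
// D3D11 already limits factors to 64; the user cap just lowers that ceiling.
float effectiveTessFactor(float appRequested, float userCap) {
    return std::min({appRequested, userCap, 64.0f});
}

int main() {
    // Hypothetical values: the app asks for 64, the user caps at 16.
    std::printf("app 64, cap 16 -> effective %.0f\n", effectiveTessFactor(64.0f, 16.0f));
    std::printf("app  8, cap 16 -> effective %.0f\n", effectiveTessFactor(8.0f, 16.0f));
    return 0;
}
```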
 