R500 and NV50 to sport PS/VS 4.0 ??

XBit Labs said:
It is not clear whether Microsoft supports Pixel Shaders 3.0 and Vertex Shaders 3.0 widely or not, but in case it enables their functionality at all, NVIDIA is very likely to promote the new capabilities among game and professional applications developers. Given that both NV50 and R500 processors are most likely to support Pixel Shaders 4.0 and Vertex Shaders 4.0 as well as all previous versions of DirectX shaders, support for innovative capabilities may really prolong the lifespan of NVIDIA’s next-generation NV40 graphics processing unit. However, there are no games announced that utilise Pixel Shaders 3.0 or Vertex Shaders 3.0 this year.

OK, so X-Bit states that the R420 will not have PS/VS 3.0 (which isn't news, since it's been discussed here). But now they state that the next-gen cards will have PS/VS 4.0? When will the PS/VS standard level out, given that there aren't even games that support PS/VS 2.x/3.0 out at the moment?

US
 
DX-Next mentions unified grids; in other words, Shaders 4.0.

The DX-Next preview article here on B3D is an interesting read, if anyone hasn't read it yet.

On to the xbit-labs paragraph:

It is not clear whether Microsoft supports Pixel Shaders 3.0 and Vertex Shaders 3.0 widely or not, but in case it enables their functionality at all, NVIDIA is very likely to promote the new capabilities among game and professional applications developers.

It's been in dx9.0 since the beginning. I'm not sure what exactly they mean by "support".

http://msdn.microsoft.com/library/d...reference/shaders/vertexshaderdifferences.asp
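For what it's worth, the distinction that matters to an application is the caps check, not the API version: the vs_3_0/ps_3_0 profiles are in the dx9 runtime, but whether a given card reports them is a separate question. A minimal sketch (my own illustration, not from the article; SupportsShaderModel3 is just a made-up helper name, the calls and macros are standard d3d9.h):

```cpp
#include <d3d9.h>

// Ask the D3D9 runtime whether the installed hardware exposes the 3.0 shader profiles.
bool SupportsShaderModel3(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;

    // Both version fields are encoded with the D3DVS_VERSION / D3DPS_VERSION macros.
    return caps.VertexShaderVersion >= D3DVS_VERSION(3, 0)
        && caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);
}
```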

Given that both NV50 and R500 processors are most likely to support Pixel Shaders 4.0 and Vertex Shaders 4.0 as well as all previous versions of DirectX shaders, support for innovative capabilities may really prolong the lifespan of NVIDIA’s next-generation NV40 graphics processing unit. However, there are no games announced that utilise Pixel Shaders 3.0 or Vertex Shaders 3.0 this year.

Very oversimplified:

Shaders 4.0 = PS/VS 3.0 in a unified grid + unlimited resources.
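To make "unified grid" a bit more concrete, here's a purely conceptual toy of my own (not any real driver or hardware interface): instead of fixed vertex units and fixed pixel units, one pool of identical units drains a single work queue, so vertex-heavy and pixel-heavy frames balance themselves.

```cpp
#include <queue>

enum JobKind { VertexJob, PixelJob };

struct ShaderJob { JobKind kind; /* program + data would go here */ };

struct UnifiedShaderPool {
    std::queue<ShaderJob> work;   // vertex and pixel jobs share one queue

    void run() {
        while (!work.empty()) {
            ShaderJob job = work.front();
            work.pop();
            // Every unit speaks the same instruction set, so it simply
            // executes whichever job it pulled, vertex or pixel.
            execute(job);
        }
    }

    void execute(const ShaderJob&) { /* run the shader program on a free unit */ }
};
```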
 
It is not clear whether Microsoft supports Pixel Shaders 3.0 and Vertex Shaders 3.0 widely or not
Not clear to some.

However, there are no games announced that utilise Pixel Shaders 3.0 or Vertex Shaders 3.0 this year.
Oh, at least one SM 3.0 title will (well, should) be announced at E3 this year (I've been offered the chance of a behind-closed-doors demo). But the statement is correct nonetheless -- none announced thus far.
 
991060 said:
hoping to see advanced HOS being supported by next major DX API update.

API support will most likely be there; the real question is whether we'll see any of it implemented in hardware, and to what degree.

Where, for example, is the displacement mapping and/or adaptive tessellation found in dx9.0?
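(For reference, and very roughly in my own words: displacement mapping just pushes each surface point along its normal by a height read from a map,

\[ p' = p + h(u,v)\,\hat{n} \]

and adaptive tessellation decides how finely to subdivide the surface before applying that displacement. Both are specified in dx9; neither is what shipping hardware actually gives you.)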
 
Ailuros said:
API support will most likely be there; the real question is whether we'll see any of it implemented in hardware, and to what degree.

Where, for example, is the displacement mapping and/or adaptive tessellation found in dx9.0?

Maybe we'll see the "render to vertex buffer" ability get implemented first. I personally think it's quite a useful technique, and we've already had implementations of it on the XBOX (not XBOX2 :D ) and through some OGL extensions.
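A minimal sketch of one way to do it on PC OpenGL (my own illustration; I'm assuming the pixel-buffer-object extension path here, which may or may not be the extension meant above, and renderToVertexBuffer is just a made-up helper name): render positions into a float target, read them into a buffer object, then rebind that same buffer as vertex data so the result never round-trips through the CPU.

```cpp
#include <GL/gl.h>
#include <GL/glext.h>   // assumes the buffer-object entry points are loaded

void renderToVertexBuffer(GLuint buf, int w, int h)
{
    // 1. Draw a pass that writes one position per pixel into a float RGBA target.
    //    (Render-target setup omitted.)

    // 2. Pack the framebuffer into the buffer object instead of client memory.
    glBindBuffer(GL_PIXEL_PACK_BUFFER, buf);
    glBufferData(GL_PIXEL_PACK_BUFFER, w * h * 4 * sizeof(GLfloat), 0, GL_STREAM_COPY);
    glReadPixels(0, 0, w, h, GL_RGBA, GL_FLOAT, 0);   // last arg is an offset into the buffer
    glBindBuffer(GL_PIXEL_PACK_BUFFER, 0);

    // 3. Reinterpret the same buffer as a vertex array and draw from it.
    glBindBuffer(GL_ARRAY_BUFFER, buf);
    glEnableClientState(GL_VERTEX_ARRAY);
    glVertexPointer(4, GL_FLOAT, 0, 0);
    glDrawArrays(GL_POINTS, 0, w * h);
}
```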
 
DX10, aka DirectX Next, has Vertex/Pixel Shader version 4.0.

NV50 and R500 (and the Xbox 2 graphics processor) are meant to be DX10 / DirectX10 parts, so, yeah, I would think they'll both support Shaders 4.0 8)
 
I truly hope that MS moves off of standardized assembly for the next iteration of DX. That is, I don't want to see PS/VS 4.0.
 
Well, if the next one is Turing complete, there should be no need for further "enhancements".

But first, let's get 3.0 into hw.
 
You need more than Turing completeness. You need the operations that you are interested in to run fast. Maybe 4.0 will allow user-programmable tessellation, but the fact that something can be written is not enough. CPUs can also render video, since they are Turing complete, but they're too slow. Thus, although 4.0 may deliver true Turing completeness, it will still have to deliver the right accelerated abstractions to make algorithms of interest run acceptably.
 
How will it look better? Well, we have a long way to go in the lighting department before lighting in computer games starts to really look good. For example, in the images you posted, that creature is just artificially shiny. Moving forward, more complex shaders will lead to more organic-looking creatures and environments.

And why move forward? Well, there are just some algorithms that would be vastly too slow for PS 2.0 or PS 3.0. The more programmability is available, the more algorithms become available, as fewer algorithms totally destroy performance.
 
Chalnoth said:
How will it look better? Well, we have a long way to go in the lighting department before lighting in computer games starts to really look good. For example, in the images you posted, that creature is just artificially shiny. Moving forward, more complex shaders will lead to more organic-looking creatures and environments.

And why move forward? Well, there are just some algorithms that would be vastly too slow for PS 2.0 or PS 3.0. The more programmability is available, the more algorithms become available, as fewer algorithms totally destroy performance.

Thanks for your response. :D Will we ever see graphics cards with 4xAA/8xAA with no penalty in frames per second?
 
Chalnoth said:
How will it look better? Well, we have a long way to go in the lighting department before lighting in computer games starts to really look good. For example, in the images you posted, that creature is just artificially shiny.

It's supposed to be that way - like the chitin on a Scarab beetle.
 
It's not that shiny in real life. But specular is not the problem; it's diffuse and ambient that are hard. But even after you crack global illumination, you've still got lots of other problems. Even given the massive uber offline rendering power of today's farms, there are few CG images that completely fool us, especially in motion.

This doesn't even begin to address the problem of natural motion and physics in games, especially WRT biological physics. For the foreseeable future, motion capture or puppetry will still be used to get realistic movement, but stitching that movement together without appearing repetitive is a challenge. Once you see the same animations over and over, and jerky transitions that no human would do, you lose the realistic feel and enter the realm of "something doesn't look right".
 
kenneth9265_3 said:
Thanks for your response. :D Will we ever see graphics cards with 4xAA/8xAA with no penalty in frames per second?
Nope. Such things simply take more processing power. But the performance hit will continue to drop, as graphics cards will be designed with the expectation that users will run with antialiasing and anisotropic filtering enabled, and so will no longer optimize for situations where they are not enabled.

Which reminds me: one major shortfall of DirectX 9 is that it is impossible to enable multisampling AA when rendering to a texture. This makes it impossible to perform antialiasing in some games that render this way.
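To show the API shape behind that complaint (my own sketch; CreateAATarget is a made-up name, the calls are standard d3d9.h, and Release calls are omitted): CreateTexture takes no multisample argument at all, only a plain render-target surface does, so an antialiased result can only reach a texture via an extra copy/resolve rather than by multisampling the texture directly.

```cpp
#include <d3d9.h>

void CreateAATarget(IDirect3DDevice9* dev, UINT w, UINT h)
{
    // A texture you can later sample from -- note there is no multisample parameter here.
    IDirect3DTexture9* tex = 0;
    dev->CreateTexture(w, h, 1, D3DUSAGE_RENDERTARGET,
                       D3DFMT_A8R8G8B8, D3DPOOL_DEFAULT, &tex, 0);

    // A multisampled render target -- but it is a plain surface and cannot be sampled.
    IDirect3DSurface9* msaaRT = 0;
    dev->CreateRenderTarget(w, h, D3DFMT_A8R8G8B8,
                            D3DMULTISAMPLE_4_SAMPLES, 0, FALSE, &msaaRT, 0);

    // The only way the antialiased pixels reach the texture is a copy/resolve
    // after rendering into msaaRT.
    IDirect3DSurface9* texSurf = 0;
    tex->GetSurfaceLevel(0, &texSurf);
    dev->StretchRect(msaaRT, 0, texSurf, 0, D3DTEXF_NONE);
}
```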
 
Bouncing Zabaglione Bros. said:
Chalnoth said:
How will it look better? Well, we have a long way to go in the lighting department before lighting in computer games starts to really look good. For example, in the images you posted, that creature is just artificially shiny.
It's supposed to be that way - like the chitin on a Scarab beetle.
Sure, they made it that way, but there's nothing (short of the polished finish of a car) that is that shiny in real life. It's really hard to get shininess just right when there's still supposed to be some scattering, such as on a beetle (i.e. it's supposed to be reflective, but the reflections are smudged out).
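To put a rough equation on "shininess" (my own framing, not from the post): the usual Blinn-Phong specular term is

\[ I_{\text{spec}} = k_s \left( \max(\hat{n}\cdot\hat{h},\,0) \right)^{\alpha} \]

A strong \(k_s\) with a large exponent \(\alpha\) gives the tight, mirror-like highlight that reads as plastic; a beetle shell wants something in between -- clearly reflective, but with the highlight broadened and the reflections smudged -- and that middle ground is exactly what cheap specular models struggle with.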
 
kenneth9265_3 said:
Hello, I am pretty much new to the forum. With DX9 cards from Nvidia and ATI already being pretty much amazing with the use of Shaders 2.0 (like from the screenshots from Half-Life 2), how much more can they improve upon this?

http://image.com.com/gamespot/images/2003/screen0/914642_20030911_screen001.jpg
http://image.com.com/gamespot/images/2003/screen0/914642_20030911_screen002.jpg

Most of the "real" look you perceived in the HL2 screenshots is done through pre-computed lightmaps. This is also why Valve can't give you a moving sun, which is of course very important for simulating the real world. :LOL: Ironically, we have to thank them for "fooling" us in order to keep the FPS at a playable level.
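Roughly what the lightmap trick amounts to (my own summary, not Valve's description): the diffuse lighting is integrated once at map-compile time and stored per surface texel, so at runtime the shader only has to do

\[ C(x) \approx \text{albedo}(x)\cdot L_{\text{baked}}(x) \]

Since \(L_{\text{baked}}\) was computed for one fixed sun position, moving the sun would mean recomputing the lightmaps, hence the static lighting.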
 