Bob3D said:
can anyone here write some relevant PS 3.0 shaders to run with NV40?

Hellbinder said:
> digitalwanderer said: What about when dx9.1 comes out? Won't nVidia wipe the floor with ATi then? :|
In a Word... NO.

Chalnoth said:
> digitalwanderer said: What about when dx9.1 comes out? Won't nVidia wipe the floor with ATi then? :|
In a Word... YES.

Bob3D said:
> Hellbinder said: In a Word... NO.
> Chalnoth said: In a Word... YES.
In a future...not far away...
Stryyder said:
Any guesses as to how long before this is featured as an Inquirer article without giving source credit? Anyone?

> Stryyder said: Any guesses as to how long before this is featured as an Inquirer article without giving source credit? Anyone?
That depends, what were we talking about again? I kind of lost track...
tEd said:
why are there such huge differences in results between sm2.0 and sm2.1?

Ante P said:
> tEd said: why are there such huge differences in results between sm2.0 and sm2.1?
wth is sm2.1?

tEd said:
> Ante P said: wth is sm2.1?
"Shadermark 2.1 (beta) is one of them and here we test the performance with Pixel Shader 2.0"
from the nordichardware review
digitalwanderer said:
> Bob3D said: We are talking about a superior piece of hardware losing to an inferior piece of hardware, with a grain of hope about DX9.0c and shader performance
Uhm... I think you're using the words "superior" and "inferior" incorrectly in your post, but I think I get your gist.
Ante P said:
They are wrong, I've updated them in an upcoming article:
By the way, the second 6800 Ultra (the one on the right) is the Extreme version.
Ante P said:
By the way, any idea why performance drops when going to partial precision in the "PS 2.0 - Simple" shader?
I remember seeing it mentioned here at B3D before but I can't find the thread.
Looking at the 6800 Ultra results a little more closely, though, we see some curious numbers: in some cases the PS1.x and partial precision tests are slower than the PS2.0 tests! With the PS1.x and partial precision tests the internal calculations are likely still carried out at FP32 precision, but a type conversion must occur at some point. Each of the shaders in this test is actually relatively short, with the first few only using about 4 instructions - these are probably executed in two cycles - and it may be the case that the type conversion is not free, hence there is an extra cycle penalty for the partial precision and PS1.x integer shaders, meaning that the full FP32 shaders are actually faster in these short-shader cases.
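For anyone who wants to poke at this outside of ShaderMark, here is a minimal sketch of how you could set up the comparison yourself. It is not from ShaderMark or the article; the shader source and the CompileSimplePS helper are made up for illustration. It compiles the same short ps_2_0 shader once at full precision and once with D3DX's D3DXSHADER_PARTIALPRECISION flag, so the two variants can be timed against each other.

[code]
// Sketch only: compile one short ps_2_0 shader at full precision and again with
// partial precision forced, so the two code paths can be timed against each other.
// The shader source and helper below are illustrative, not taken from ShaderMark.
#include <d3d9.h>
#include <d3dx9.h>

// Roughly a "PS 2.0 - Simple"-sized shader: a handful of arithmetic instructions.
static const char g_SimplePS[] =
    "float4 main(float2 uv : TEXCOORD0) : COLOR   \n"
    "{                                            \n"
    "    float4 c = float4(uv, 0.5f, 1.0f);       \n"
    "    c = c * c + 0.25f;                       \n"
    "    return c * 0.5f;                         \n"
    "}                                            \n";

// Compile the shader with the given D3DX flags and create a pixel shader from it.
IDirect3DPixelShader9* CompileSimplePS(IDirect3DDevice9* device, DWORD flags)
{
    ID3DXBuffer* code   = NULL;
    ID3DXBuffer* errors = NULL;
    IDirect3DPixelShader9* shader = NULL;

    HRESULT hr = D3DXCompileShader(g_SimplePS, sizeof(g_SimplePS) - 1,
                                   NULL, NULL, "main", "ps_2_0",
                                   flags, &code, &errors, NULL);
    if (SUCCEEDED(hr))
        device->CreatePixelShader((const DWORD*)code->GetBufferPointer(), &shader);

    if (code)   code->Release();
    if (errors) errors->Release();
    return shader;
}

// Usage: render many fullscreen passes with each shader and compare the timings.
//   IDirect3DPixelShader9* fp32 = CompileSimplePS(device, 0);
//   IDirect3DPixelShader9* pp   = CompileSimplePS(device, D3DXSHADER_PARTIALPRECISION);
[/code]

If the conversion-penalty theory is right, the partial precision variant of such a short shader should come out no faster, or even slightly slower, than the full FP32 one.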
Tridam said:
> trinibwoy said: Ouch. How is it possible for NV40 to be so completely humiliated in those synthetic tests yet have a much better showing in actual games? And why can't the NV40 run many of the tests given its full SM2.0 compliance?
I presume that this problem comes from a lack of support for some render target formats. For example, NVIDIA doesn't support D3DFMT_R16F. Some tech demos use this RT format since ATI has supported it for a while. However, NVIDIA does support D3DFMT_R32F and D3DFMT_G16R16F. If the problem is the lack of D3DFMT_R16F support, it's easy to use D3DFMT_R32F or D3DFMT_G16R16F instead. If that is indeed the problem, then I presume the next revision of ShaderMark will use one of these RT formats for NVIDIA GPUs.
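The fallback Tridam describes is straightforward to sketch with IDirect3D9::CheckDeviceFormat. The snippet below is a rough illustration, not ShaderMark's actual code; the function name PickFloatRenderTargetFormat and the preference order are my own assumptions.

[code]
// Sketch only: pick a floating-point render-target format, preferring
// D3DFMT_R16F but falling back to D3DFMT_G16R16F or D3DFMT_R32F when the
// driver (e.g. NV40 at the time) doesn't expose R16F as a render target.
#include <d3d9.h>

D3DFORMAT PickFloatRenderTargetFormat(IDirect3D9* d3d, D3DFORMAT adapterFormat)
{
    // Candidates in order of preference: smallest footprint first.
    const D3DFORMAT candidates[] = { D3DFMT_R16F, D3DFMT_G16R16F, D3DFMT_R32F };

    for (int i = 0; i < 3; ++i)
    {
        HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                            adapterFormat,
                                            D3DUSAGE_RENDERTARGET,
                                            D3DRTYPE_TEXTURE,
                                            candidates[i]);
        if (SUCCEEDED(hr))
            return candidates[i];
    }
    return D3DFMT_UNKNOWN; // no floating-point render target available at all
}

// Usage:
//   D3DFORMAT rtFormat = PickFloatRenderTargetFormat(d3d, D3DFMT_X8R8G8B8);
//   if (rtFormat != D3DFMT_UNKNOWN)
//       device->CreateTexture(width, height, 1, D3DUSAGE_RENDERTARGET,
//                             rtFormat, D3DPOOL_DEFAULT, &texture, NULL);
[/code]

A shader that only writes one channel simply ignores the extra channels of G16R16F or pays the bandwidth cost of R32F, which is why falling back is easy even if it is not free.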
Evildeus said:
When do you think we will see this new version? This month, next month, later?