Is truform on the radeon 9700 working to full potential?

Demalion,
FYI I tried out an 8500LE 64Mb card (which I returned due to its awfully inadequate 2D quality) in the same machine that normally carries my trusted old Radeon32DDR, and the scores in the asbestos botmatch and flyby were ~49 and ~150 respectively.
I couldn't see any difference in shadows or water rendering; it was just faster.
 
OK, finally got around to it.

OK, first off, when I ran the benchmark it went so fast it almost gave me a headache; it basically looked like a big flashy mess. But I have since wiped my hard drive, and these are my scores.

dm-antalus
42.695679 / 118.153679 / 470.661163 fps
Score = 118.213791

dm-asbestos
41.245491 / 148.731049 / 399.817474 fps
Score = 148.865143

ctf-citadel
33.263191 / 106.109741 / 288.800079 fps
Score = 106.333084

br-anubis
18.126766 / 55.494637 / 115.028160 fps
Score = 55.498112

dm-antalus
16.813156 / 37.458611 / 69.029190 fps
Score = 37.476734

dm-asbestos
21.016539 / 43.642838 / 116.166374 fps
Score = 43.664833

Those are better scores now, but I still don't get that 300+ fps one.
Now, one little question: if I wanted to overclock my system, particularly my CPU, would I have to unlock the CPU, as in all the info I have read about it all over the net?

Thanks,
Raystream
 
rubank said:
Demalion,
FYI I tried out an 8500LE 64Mb card (which I returned due to its awfully inadequate 2D quality) in the same machine that normally carries my trusted old Radeon32DDR, and the scores in the asbestos botmatch and flyby were ~49 and ~150 respectively.
I couldn't see any difference in shadows or water rendering; it was just faster.

I wish you'd post screenshots for clarity, heh... It sounds like you have speed-optimized settings for your drivers, and mine are always optimized for image quality.
So it looked like my screenshots? I have a better one of the water deformation I can post, but if you go to where I took that shot, you can just tell me whether you see the ripples against the side of the small pool when you are standing in it, to answer that part. This could likely be done easily on the CPU, or with a vertex shader implementation on the CPU, in any case, but that would still impact the speed or polygon detail. I wish there were some way to turn off the dynamic adjustment of details for more meaningful comparisons, or some clearer specification of what occurs.

On another note, that wasn't one of those disgusting-sounding Crucial 8500 LE cards, was it? Whether it was or not, you got burned by ATi's very unfortunate LE policy, which was actually worsened by them allowing other manufacturers to make their cards. Hopefully they have FINALLY killed that dead with the 9000-and-above cards and the PRO designation.
 
OK Demalion,
here you are

1.jpg

2.jpg

3.jpg

4.jpg
 
Screenshots! Thanks, they are the most efficient way to answer these questions. ;)


All the things I mentioned are indeed there, and it seems the demo (and likely the full game, as the demo looks good enough) doesn't use "shaders" at all for these effects.

This leaves, at least as far as the comparison between our cards is concerned, only the question of D Vogel's comments in this thread and some other places about the automatic scaling based on video card, and what exactly it is turning on and off, as mentioned in this post, which seems to imply it is enabling features for each card specifically.

I wonder if the -UPT option fixes the random number seeding or possibly the level of detail adjustments for the engine, to make benchmarks more comparable? It would make sense for benchmark usage, but I hadn't seen mention of it in that thread.

Back to the truform-on-9700 issues: maybe I should go PM Hellbindier on Rage3D, tell him this site is back up, and point him to this thread.
 
I heard someone had tested the throughput of N-patches with a little app, and the 8500 was pulling about 20M tris while the 9700 managed only 5M, so it seems there is something a little underpowered there right now. Driver issue or hardware?
 
In the game, with a tessellation level of 1, everything turned up to ultra, anisotropic quality 16x filtering, and 4x FSAA, I am chugging at an average of 20 fps, while without it I am running at an average of 30 fps. This performance hit is only for a tess level of 1, and seems a bit steep to me, given that Return to Castle Wolfenstein on my 8500 only lost 2-5 fps (not from the average but from peak). Something seems to be wrong either with the truform driver implementation or with the fixed-function T&L emulation. I tend to believe it is the truform, given that the 9700 does not seem crippled on other polygon benchmarks.
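For what it's worth, converting those average frame rates to frame times shows why the hit feels steep. The 30/20 fps figures are from the post above; the ~60 fps RtCW peak is my own assumption for illustration:

```python
def frame_cost_ms(fps_without, fps_with):
    """Extra milliseconds spent per frame when a feature is enabled,
    derived from the average fps with and without it."""
    return 1000.0 / fps_with - 1000.0 / fps_without

# Figures from the post above: 30 fps without truform, 20 fps with it.
print(round(frame_cost_ms(30, 20), 1))  # 16.7 ms of extra work per frame

# RtCW on the 8500 losing ~5 fps off a peak of ~60 fps (my assumption):
print(round(frame_cost_ms(60, 55), 1))  # ~1.5 ms of extra work per frame
```

So per frame, the 9700 is paying roughly ten times the cost the 8500 paid in RtCW, which is what makes the tess-level-1 number look suspicious.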
 
I seem to remember somebody (from ATI?) saying that the 9700 does less of the n-patching in hardware than the 8500. I think that on the 8500 they generate the control points in hardware, whereas on the 9700 they don't. Having to send those over AGP would explain why it's a lot slower.

Of course, I could be misremembering.
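If the 9700 driver really does compute the control points on the CPU and send them over AGP, a back-of-envelope bandwidth estimate lands surprisingly close to the ~5M figure quoted earlier. This sketch assumes the standard PN-triangle layout (10 position and 6 normal control points per triangle, per Vlachos et al. 2001) and an AGP 4x peak of roughly 1 GB/s; both are my assumptions, not facts from the thread:

```python
BYTES_PER_FLOAT = 4

def control_point_bytes(pos_points=10, normal_points=6):
    """Per-patch upload size: PN triangles use 10 cubic position control
    points and 6 quadratic normal control points, each a 3-float vector."""
    return (pos_points + normal_points) * 3 * BYTES_PER_FLOAT

def max_patches_per_second(bus_bytes_per_sec):
    """Bus-limited ceiling if every patch's control points cross the bus."""
    return bus_bytes_per_sec // control_point_bytes()

AGP_4X = 1_066_000_000  # ~1 GB/s theoretical peak for AGP 4x (assumed)
print(control_point_bytes())           # 192 bytes per patch
print(max_patches_per_second(AGP_4X))  # ~5.5M patches/s
```

That ceiling sits right around the ~5M tris/s reported for the 9700, whereas an on-chip implementation like the 8500's never touches the bus, which would fit the ~20M figure.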
 
What I do remember is reviews stating that ATi has less displacement mapping hardware support than Matrox due to patent issues. I don't know if that bit of information is useful, but it may clear up some things.
 
Maverick said:
I seem to remember somebody (from ATI?) saying that the 9700 does less of the n-patching in hardware than the 8500. I think that on the 8500 they generate the control points in hardware, whereas on the 9700 they don't. Having to send those over AGP would explain why it's a lot slower.

Of course, I could be misremembering.

Wasn't that the 9000?
 
Yes, the 9000 basically has the TRUFORM engine removed, AFAIK, and they are reverting back to the software path they have for the Radeon/Radeon 7500.

As for the displacement mapping, I can't see why there would be legal issues: if Matrox have licensed it to MS, then ATI should be able to fully support it under the terms of their DX hardware license. OpenGL may turn out to be a little sticky, but there's enough prior art floating about for the feature that any legal issues would end up in a right mess.
 
My theory is a driver bug or oversight causing the 9000 codepath to be used for the 9700... I just don't buy the 9700 being hardware limited, no matter what the "fixed T&L emulation" concerns; there just seems to be too much geometry power there.
I think another possibility is that they had some issue with implementing N-Patches on DX 8 for the 9700. I just don't know why this would be the case, however, since the DX 8 functionality is just a subset (?), and my theory here again just points to it being an oversight or a lack of focus on DX 8 versus DX 9 on their part.

OpenGL guy, can you comment?
 
I think the water "reflection" on the walls is just a procedural texture applied over the base texture. It doesn't appear to have any relation to the water itself, nor does it change in response to additional light sources that are introduced into the scene. All of the water effects in your screenshot look the same as they do on my GeForce 2 from what I can tell.
 
Hi there,
demalion said:
All the things I mentioned are indeed there, and it seems the demo (and likely the full game, as the demo looks good enough) doesn't use "shaders" at all for these effects.
*nod*

UT2003 uses pixel shaders to speed up texture operations, that's all. The game doesn't use any special pixel shader effects per se, nor does it use vertex shaders. The .ini entry is there because the engine could use vertex shaders; UT2003 itself doesn't, though.

In short: UT2003 looks identical on non-shader hardware and DX8-class hardware, as long as cube mapping is supported.

ta,
-Sascha.rb
 