NVIDIA releases new all-singing, all-dancing Dets.

MotoGP seems to be borked with this release... flashing polys and artifacts all over the screen, and no bump mapping or anything.
 
Doomtrooper said:
Looking at your anisotropic shots, the filtering is affected. I'm not sure, but it also looks like textures are missing or darker around the door compared to the old driver...

8x aniso in D3D isn't working if you set it in the driver panel (probably a wrong value gets assigned); you need to set it in RivaTuner for it to work correctly. OpenGL aniso is working correctly, as always.
 
OK... but that would also affect benchmark scores if the wrong value is being applied... so his benchmark numbers are not accurate, then?
 
I'm seeing DirectX blitting as screwed up right now, as well as some Z-buffer problems with these Det 40s. I get weird flashing textures/blits in the wrong place in the character-selection screen in DAoC, and flashing mountains in the distance in-game. Furthermore, I get out-of-whack screen blits in Heroes of Might and Magic, which I find really odd... it doesn't use any 3D.
 
Well, the driver set is listed as beta... it appears it was mostly released to allow NV30 emulation.
 
What I really want to know from people here...

Is how the hell a driver update increases Nature by 50 FPS and affects virtually nothing else in a *positive* way? I mean, Nature has had virtually the same score on NVIDIA boards for months and months now.

What could they possibly have done in the drivers to go from 50 FPS to nearly 100 in Nature? Are you seriously telling me that ATI's D3D drivers are that *bad* in Nature on the 9700? Or that the GF4 has been coded that poorly for so long?

I just can't believe either one. There simply MUST be something fishy going on. There is no other reasonable explanation.
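A quick frame-time sanity check makes the jump above concrete. The 50 and 100 FPS figures come from the post itself; the script below is just unit conversion, not a claim about what the driver actually does:

```python
# Sketch: what a 50 -> 100 FPS jump in Nature means in frame time.
# The FPS figures are from the thread; everything else is arithmetic.

def frame_time_ms(fps):
    """Average milliseconds spent rendering one frame at a given FPS."""
    return 1000.0 / fps

before = frame_time_ms(50)   # 20.0 ms per frame with the old drivers
after = frame_time_ms(100)   # 10.0 ms per frame with the new drivers
saved = before - after       # 10.0 ms of work removed from every frame

print(f"{before:.1f} ms -> {after:.1f} ms ({saved:.1f} ms saved per frame)")
```

Seen this way, the skepticism is easy to follow: halving the frame time means the driver somehow eliminated a full 10 ms of per-frame work in a single release.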
 
RussSchultz said:
Wow, they even included a panel for overclocking in their drivers.
I wish IHVs wouldn't do that. After all, it's game developers who usually get blamed if their games crash because they are more demanding than game XYZ, which runs fine with the overclocked settings.

"But game XYZ runs fine... it can't possibly hang because I overclocked my card/memory/CPU"

-- Daniel
 
Hellbinder[CE] said:
What I really want to know from people here...

What could they possibly have done in the drivers to go from 50 FPS to nearly 100 in Nature?

Perhaps they only render every other frame? ;)

--|BRiT|
 
Doomtrooper said:
I have a question here: why is Quake3 still a target for speed bumps (ummm, it's three years old and there are maybe 1,200 people online playing it)? Wouldn't it make more sense to target MOHAA, Wolfenstein, etc.?
I doubt NVIDIA specifically targeted Quake3 as one of the "criteria" for performance improvements in new drivers. But since Quake3 continues to be used by many, many websites (even three years after its debut, like you said), I can understand why NVIDIA chose to mention Quake3 improvements to me. Every new video card (be it by Matrox, NVIDIA, or ATI) that gets reviewed usually has Quake3 listed as a benchmark. If you have a problem with why NVIDIA chose to specifically mention Quake3 to me, perhaps the "blame" lies with the majority of websites out there, not NVIDIA.
 
Yeah, why would you want to continue to optimize for an engine used in JK2, RTCW, MOHAA, Elite Force, etc...


That's ridiculous.
 
walkndude said:
Yeah, why would you want to continue to optimize for an engine used in JK2, RTCW, MOHAA, Elite Force, etc...

That's ridiculous.

That would be fine, but if you examine the benchmark scores between NVIDIA's drivers on Q3A and JK2/RTCW, you will see the improvements do not carry over.

I'm basing this on comparisons between the NVIDIA Ti 4200 and ATI 8500 cards. The 4200 is ahead of the 8500 by a higher percentage in Q3A. When you move to the other engines, the 8500 is ahead of the 4200 and nearly on par with a Ti 4600.

This seems to indicate that the drivers are optimized only for Q3A and not the newer games using the same engine.

--|BRiT|
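The standings described above can be sketched as a percentage comparison. The scores below are hypothetical placeholders chosen only to illustrate the pattern the post describes (4200 leading in Q3A, 8500 leading in the other Q3-engine games); they are not real benchmark results:

```python
# Sketch with made-up scores: checking whether a Q3A-specific
# optimization carries over to other games on the same engine.

def lead_pct(a, b):
    """Percentage by which score a leads score b."""
    return (a - b) / b * 100.0

# Hypothetical Q3A scores: the Ti 4200 well ahead of the 8500.
q3a_lead = lead_pct(200.0, 160.0)   # 25.0% lead for the 4200

# Hypothetical JK2 scores on the same engine: the 8500 ahead instead.
jk2_lead = lead_pct(120.0, 110.0)   # ~9.1% lead for the 8500

print(f"Q3A: 4200 leads by {q3a_lead:.1f}%")
print(f"JK2: 8500 leads by {jk2_lead:.1f}%")
```

If the relative standings flip between games sharing one engine, that is consistent with the optimization keying on the specific application rather than the engine's rendering workload.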
 
Doomtrooper said:
No, I just don't see the reason to optimize a three-year-old engine that is already getting 277 fps... call me crazy... :LOL:

Agreed, exactly. I never understood this... emphasizing the difference between 296 and 318 fps is absolutely meaningless, sure.

PS: Thank God they've discovered JKII.
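Converting those FPS figures to frame times shows just how small the gap being emphasized really is. The 296 and 318 fps numbers come from the post above; the conversion itself is standard:

```python
# Frame-time view of the 296 vs 318 fps comparison: at these rates,
# the per-frame difference is a fraction of a millisecond.

def frame_time_ms(fps):
    """Average milliseconds per frame at a given FPS."""
    return 1000.0 / fps

diff = frame_time_ms(296) - frame_time_ms(318)
print(f"Per-frame difference: {diff:.2f} ms")  # roughly 0.23 ms
```

A ~0.23 ms gap per frame is far below anything a player could perceive, which is the point being made about these headline numbers.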
 
8x aniso in D3D isn't working if you set it in the driver panel (probably a wrong value gets assigned); you need to set it in RivaTuner for it to work correctly. OpenGL aniso is working correctly, as always.

I think you're correct. I took some screenshots from UT2003 comparing 8x aniso set at the driver level versus from RivaTuner. Something everyone should take a look at before benchmarking; maybe that's where the extra 20% comes from? Who knows, but even going from 4x to 8x isn't much of a performance hit.

aniso1.jpg


aniso2.jpg
 