nVidia releases new all-singing, all-dancing Dets.

Hellbinder - you've either not read or just ignored the fact that I'd tested 3DMark against nearest-point and bilinear filtering, and got the same results. I'd even done back-to-back testing of 30.30 drivers with just the high detail tests, across the available filtering options; whatever is going on with the Nature test, it's significantly higher (for a GF4 that is) regardless of the level of filtering.
 
Aquamark would be useful to see, if only it was publicly (and legally ;) ) available. I guess the nearest alternative would be a recorded lap in MotoGP, checked using FRAPS.
 
martrox said:
It's beginning to look like the "up to 25% performance improvement" may be just another half truth in the continuing nVidia FUD campaign. I hope I'm wrong here, but this is the kind of thing that nVidia has done in the past, and constitutes many people's greatest complaint about nVidia.

Please can we stop perpetuating the myth that Nvidia is any worse than other vendors in this regard? Please refer to this ATI press release that came out exactly 2 months ago:
http://mirror2.ati.com/drivers/Catalyst-02.2-Web-Posting-Release-Notes.pdf

This PDF says that the new Catalyst drivers give up to 50% boost in performance in OpenGL games. The fact is, no one on this forum was able to substantiate any significant performance improvements.

I'm not saying I approve of these overly-optimistic marketing figures. I'm just saying that everybody does it, so let's stop pretending that it's an Nvidia problem.
 
This PDF says that the new Catalyst drivers give up to 50% boost in performance in OpenGL games.

Actually, it doesn't say that. It only gives the performance increase numbers for very clearly defined situations (particular games, resolutions, and quality settings).

Up to 25% improvement in Quake III (1600x1200, Max Quality)
Up to 35% improvement in Return to Castle Wolfenstein (1600x1200, High Quality)
Up to 50% improvement in Serious Sam: The Second Encounter (1600x1200, Extreme Quality)

That is certainly not the same type of claim that nVidia is making, which one can't even really begin to validate.

Still, I agree that it would be interesting to see if ATI's claims are backed up.
 
Let's be fair now... there were definite improvements in the DX8 graphics. Here are some results I managed to obtain for your viewing pleasure (how I'm working out "Improvement" is sketched in the little snippet below the game results):

Giants
28.xx----Det 4
110fps - 75fps

Improvement = -31.8%

GP4
28.xx---Det 4
30fps - 35fps

Improvement = 16.7%

SS:SE
28.xx---Det 4
50fps - 50fps

Improvement = 0%
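
Just so there's no argument about the sign convention, here's a trivial sketch (mine, not output from any of the tools above; the function name is only for illustration) of how "Improvement" is worked out, relative to the old 28.xx numbers:

#include <stdio.h>

/* Percentage change relative to the old (28.xx) result. */
static double improvement_pct(double old_fps, double new_fps)
{
    return (new_fps - old_fps) / old_fps * 100.0;
}

int main(void)
{
    printf("Giants: %+.1f%%\n", improvement_pct(110.0, 75.0)); /* -31.8% */
    printf("GP4:    %+.1f%%\n", improvement_pct(30.0, 35.0));  /* +16.7% */
    printf("SS:SE:  %+.1f%%\n", improvement_pct(50.0, 50.0));  /* +0.0%  */
    return 0;
}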

The obligatory 3DMark2001 SE (build 330) results:

Detonator 4

Platform NVIDIA GeForce4 Ti 4200
CPU Optimization D3D Pure Hardware T&L
Width 1024
Height 768
Depth 32 bit
Z-Buffering 24 bit
Texture Format Compressed
Buffering Double
Refresh Rate 60 Hz
FSAA Mode None

OPTIONS
Show Title Screens Yes
Continuous Benchmark No
Benchmark Run Count 1
Demo Sounds Enabled Yes
Continuous Demo No
Game Sound Effects Enabled Yes
Game Music Enabled Yes
Game Detail Level Low

RESULTS
3DMark Score 10667
Game 1 - Car Chase - Low Detail 156.2 fps
Game 1 - Car Chase - High Detail 55.8 fps
Game 2 - Dragothic - Low Detail 191.3 fps
Game 2 - Dragothic - High Detail 104.5 fps
Game 3 - Lobby - Low Detail 145.0 fps
Game 3 - Lobby - High Detail 65.6 fps
Game 4 - Nature 61.2 fps
Fill Rate (Single-Texturing) N/A
Fill Rate (Multi-Texturing) N/A
High Polygon Count (1 Light) 49.5 MTriangles/s
High Polygon Count (8 Lights) 10.5 MTriangles/s
Environment Bump Mapping N/A
DOT3 Bump Mapping N/A
Vertex Shader 82.8 fps
Pixel Shader 103.6 fps
Advanced Pixel Shader 79.9 fps
Point Sprites 27.4 MSprites/s


28.xx Drivers

Platform NVIDIA GeForce4 Ti 4200 (Omega KX 1.1.82)
CPU Optimization D3D Pure Hardware T&L
Width 1024
Height 768
Depth 32 bit
Z-Buffering 24 bit
Texture Format Compressed
Buffering Double
Refresh Rate 60 Hz
FSAA Mode None

OPTIONS
Show Title Screens Yes
Continuous Benchmark No
Benchmark Run Count 1
Demo Sounds Enabled Yes
Continuous Demo No
Game Sound Effects Enabled Yes
Game Music Enabled Yes
Game Detail Level Low

RESULTS
3DMark Score 9837
Game 1 - Car Chase - Low Detail 153.2 fps
Game 1 - Car Chase - High Detail 55.6 fps
Game 2 - Dragothic - Low Detail 162.1 fps
Game 2 - Dragothic - High Detail 96.5 fps
Game 3 - Lobby - Low Detail 145.5 fps
Game 3 - Lobby - High Detail 67.1 fps
Game 4 - Nature 42.3 fps
Fill Rate (Single-Texturing) N/A
Fill Rate (Multi-Texturing) N/A
High Polygon Count (1 Light) 41.9 MTriangles/s
High Polygon Count (8 Lights) 10.4 MTriangles/s
Environment Bump Mapping N/A
DOT3 Bump Mapping N/A
Vertex Shader 86.0 fps
Pixel Shader 102.9 fps
Advanced Pixel Shader 79.4 fps
Point Sprites 27.3 MSprites/s



System Specs:

P4 2.66AGHz 133MHz*20
256MB DDR 166MHz*2
SB Live! Value
60GB Maxtor ATA100 7200 RPM
GF4 Ti4200
 
Joe DeFuria said:
This PDF says that the new Catalyst drivers give up to 50% boost in performance in OpenGL games.

Actually, it doesn't say that. It only gives the performance increase numbers for very clearly defined situations (particular games, resolutions, and quality settings).

Up to 25% improvement in Quake III (1600x1200, Max Quality)
Up to 35% improvement in Return to Castle Wolfenstein (1600x1200, High Quality)
Up to 50% improvement in Serious Sam: The Second Encounter (1600x1200, Extreme Quality)

That is certainly not the same type of claim that nVidia is making, which one can't even really begin to validate.

Actually, the pdf first makes the following blanket statement:
ATI said:
OpenGL Performance Boosts:
CATALYST 02.2 provides significant performance improvements in OpenGL games across the entire RADEON product line, especially at high resolutions.

Then it goes on to give the examples you listed.

Joe DeFuria said:
Still, I agree that it would be interesting to see if ATI's claims are backed up.

No, their claims were not backed up. Here is the B3D thread:
http://www.beyond3d.com/forum/viewtopic.php?t=1922
 
Actually, the pdf first makes the following blanket statement:

As I said, the "blanket" statement doesn't attach a "percentage" to the increase. The only time percentages are mentioned is with particular apps at particular settings.

No, their claims were not backed up. Here is the B3D thread:

And, uh, in that same thread is the same discussion about 'misleading' performance statements, only that time coming from ATI. So, as for your statement:

"Please can we stop perpetuating the myth that Nvidia is any worse than other vendors in this regard?"

It seems pretty clear to me that when ATI makes performance claims, they are explored here as well. I'm not sure what your issue is. (Or is it just with one person, not this forum?)
 
Galilee said:
Doomtrooper said:

Thanks for the link, I'll test it when it comes down, sometime tomorrow :) 6.5k/sec :(

Huh what link...
 
I installed the Det4 drivers on a 16MB DDR GeForce2 Go laptop without any problems. Nice control panels, and screen rotation.
The first thing that jumped out at me was the ugly unfiltered mipmap textures. This was definitely with the default settings. As others reported, setting the aniso slider to 1 fixed that without any speed hit. No idea what NVidia was thinking with these settings.

The next thing: the 3DMark default bug is back.
Up until the 29.x drivers, 3DMark failed to run at default settings with the splash screens enabled, resulting in an out-of-video-memory error. Changing the Z-buffer to 16-bit fixed this.
The newer drivers had this fixed, but the Det4 driver has the problem again, and disabling the splash screens does not help.
Really a strange driver.

BTW, on the Catalyst discussion: compared to the latest official drivers at release time (not the leaked ones), the Catalyst driver was even more than 50% faster for me at high texture settings in some levels of Jedi Knight.
 
Gery said:
I installed the Det4 drivers on a 16MB DDR GeForce2 Go laptop without any problems. Nice control panels, and screen rotation.
The first thing that jumped out at me was the ugly unfiltered mipmap textures. This was definitely with the default settings. As others reported, setting the aniso slider to 1 fixed that without any speed hit. No idea what NVidia was thinking with these settings.

The next thing: the 3DMark default bug is back.
Up until the 29.x drivers, 3DMark failed to run at default settings with the splash screens enabled, resulting in an out-of-video-memory error. Changing the Z-buffer to 16-bit fixed this.
The newer drivers had this fixed, but the Det4 driver has the problem again, and disabling the splash screens does not help.
Really a strange driver.

BTW, on the Catalyst discussion: compared to the latest official drivers at release time (not the leaked ones), the Catalyst driver was even more than 50% faster for me at high texture settings in some levels of Jedi Knight.

Hmm, no problem running the default benchmark here. Strange.
 
Joe DeFuria said:
So, as for your statement:

"Please can we stop perpetuating the myth that Nvidia is any worse than other vendors in this regard?"

It seems pretty clear to me that when ATI makes performance claims, they are explored here as well. I'm not sure what your issue is. (Or is it just with one person, not this forum?)

For the context of my comments, simply refer to what I quoted in my original post:
It's beginning to look like the "up to 25% performance improvement" may be just another half truth in the continuing nVidia FUD campaign. I hope I'm wrong here, but this is the kind of thing that nVidia has done in the past, and constitutes many people's greatest complaint about nVidia.

I'm reacting to a pattern I've observed: accusations of this sort are seldom levelled against ATI, but are often levelled against Nvidia (as evidenced by the wording of the original poster, who writes as if these are widely accepted "truths" about Nvidia: "just another half truth"; "continuing nVidia FUD campaign"; "kind of thing that nVidia has done in the past"; "constitutes many people's greatest complaint about nVidia"). However, in reality I haven't observed Nvidia behaving any differently from, or worse than, ATI, as I tried to point out by referencing ATI's press release. Hope that makes it clear.
 
I'm reacting to a pattern I've observed: accusations of this sort are seldom levelled against ATI, but are often levelled against Nvidia (as evidenced by the wording of the original poster....

OK, then you have an issue with the original poster. When you used the word "we" in your original complaint ("Please can we stop perpetuating the myth..."), it sounded like you were referencing this board in general...
 
Enable conformant OpenGL texture clamp behavior

Funny that no one has mentioned this option finally appearing in the OpenGL settings. Until now, NVidia has treated the GL_CLAMP texture wrap parameter as a synonym for GL_CLAMP_TO_EDGE, which has shown up as "bugs" on other cards that implement it correctly; the last time I noticed this was with Tenebrae.

The difference is that GL_CLAMP takes the border color into account, for example when doing bilinear filtering near the texture border, usually causing unwanted artifacts at polygon edges (darker/lighter/wrong color, depending on the border color). GL_CLAMP_TO_EDGE instead clamps the samples to stay within the texture, so NVidia's substitution has hidden the compliant behavior from people using NVidia cards (there's a small sketch of the two wrap modes below). Of course this has usually been a good thing for them, but bad for people using compliant drivers. Too bad the default is still non-compliant, but perhaps it saves NVidia some support grief.
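
For anyone who wants to see what the difference actually looks like in code, here's a minimal sketch (mine, not taken from Tenebrae or the driver; it assumes an OpenGL context with a 2D texture already bound, and the function names are made up):

#include <GL/gl.h>

#ifndef GL_CLAMP_TO_EDGE
#define GL_CLAMP_TO_EDGE 0x812F /* core since OpenGL 1.2, GL_SGIS_texture_edge_clamp before that */
#endif

/* Compliant GL_CLAMP: filtering near the edge blends towards the constant
   border color, which is what shows up as the "seams" on cards that
   implement it correctly. */
void use_gl_clamp(void)
{
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
}

/* GL_CLAMP_TO_EDGE: samples are clamped to the outermost row/column of
   texels, so the border color never leaks in. This is what NVidia's drivers
   have effectively been giving you for GL_CLAMP up to now. */
void use_gl_clamp_to_edge(void)
{
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
}

With the old non-conformant default, both of these end up looking identical on NVidia hardware; with the new option enabled, the first one should finally show the same border-color behavior at polygon edges that compliant cards have always shown.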
 