GF4 has inflated 3DMark scores, so says the INQ...

Doomtrooper,

Concerning the wall screenshot on the GF2, I'd rather say it was a quirk of the early point releases, where lightmaps were mistakenly compressed too. When I had the KYRO2 and used an early point release, I had to add "CompressLightmap=0" to the registry; with recent point releases it hasn't been necessary.
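For anyone wanting to try the same workaround, here is a minimal sketch in C of setting that value with the Win32 registry API. The key path and value type are assumptions on my part (the post doesn't say where the driver keeps it), so treat it purely as an illustration:

Code:
/* Hypothetical sketch of the "CompressLightmap=0" registry tweak.
 * The key path and value type below are guesses -- the real location
 * depends on the KYRO2 driver install. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HKEY key;
    DWORD zero = 0;

    /* Placeholder key path -- adjust to wherever your driver keeps its settings. */
    if (RegOpenKeyExA(HKEY_LOCAL_MACHINE, "SOFTWARE\\PowerVR\\Example",
                      0, KEY_SET_VALUE, &key) != ERROR_SUCCESS) {
        fprintf(stderr, "driver key not found\n");
        return 1;
    }

    /* Assuming a DWORD value; the driver may instead expect a string "0". */
    RegSetValueExA(key, "CompressLightmap", 0, REG_DWORD,
                   (const BYTE *)&zero, sizeof(zero));
    RegCloseKey(key);
    return 0;
}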

Skyboxes are an entirely different story though.
 
TAP TAP TAP... You know something, I am still waiting for every site on the net to jump on nVidia's back about this the way they did ATI. I don't understand the double standard at all... nVidia should be hung out to dry the way ATi was for their "driver bug". :-?

Sabastian

I'm still waiting...........
 
Crusher said:
Apparently I was looking in the wrong spot, because I did manage to find that archway just now, and it does appear to have those blocks. I guess I just never noticed that particular area before. It's certainly not indicative of the entire game, since the majority (nearly all) of the textures do not have that kind of problem (which is what it seemed like Doomtrooper was trying to suggest). Even the identical archway on the other side of that courtyard doesn't have the same artifacts, and I didn't notice it in the other maps at all (which are what I usually play the few times I actually play the demo).
Crusher,

As I stated before, you should expect to see lightmap troubles when compression is enabled. This is true for all drivers that compress the lightmaps. It's not a big deal.
Crusher said:
So NVIDIA does 16-bit interpolation on DXT1 textures, and this causes some graphic anomalies in isolated areas, and presumably gives them a performance boost over doing 24-bit interpolation due to the smaller texture size, correct? That would explain how other cards could have a better looking sky comparatively (although you can't deny that it still loses quality with S3TC on any hardware). I'll agree with that.
But the 16-bit interpolation is the cause of the awful-looking sky textures in Quake 3. NVIDIA should be improving this sort of thing (DXT1 decoding) on newer hardware to raise visual quality, but they haven't, so you get terrible results on some textures. This leads people to conclude that texture compression is a bad idea, which is hardly the case.
Crusher said:
I won't agree with a blanket statement like "the GeForce's texture compression looks horrible", because that implies all texture compression, and it's apparently limited to DXT1. And for the most part, DXT1 looks fine too, the sky and that one wall area being the only places I've seen otherwise. I'm guessing EverQuest uses DXT3, because it looks just as good with TC on as it does without it (you can notice a slight degradation in the screenshots, but no anomalies are present, and you can't tell the difference unless you're directly comparing screenshots).
The cases where 16-bit interpolation looks poor are when you are doing a smooth gradient (like a sky with clouds). It gets especially bad in Quake 3 because they blend a couple of these textures together (when the multilayer sky feature is enabled) and the errors accumulate.

Yes, there is a loss of data when compressing with DXTC, but the hardware shouldn't make the result look worse than it needs to!
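To make the 16-bit interpolation point a bit more concrete, here is a rough C sketch that decodes one color channel of a DXT1 block two ways: expanding the RGB565 endpoints to 8 bits before blending, versus blending at 5-bit precision and expanding afterwards. The endpoint values are made up and real hardware doesn't work exactly like this; it only shows where the extra quantization creeps in:

Code:
/* Sketch: why blending DXT1 endpoints at 16-bit (RGB565) precision can band.
 * Illustration only -- not actual hardware or driver behaviour. */
#include <stdio.h>
#include <stdint.h>

/* Expand a 5-bit channel to 8 bits by bit replication. */
static uint8_t expand5(uint8_t v) { return (uint8_t)((v << 3) | (v >> 2)); }

int main(void)
{
    /* Two made-up RGB565 endpoints of one DXT1 block from a smooth gradient. */
    uint16_t c0 = 0x8C51;
    uint16_t c1 = 0x7BCF;

    /* Red channel, decoded two ways for the 2/3*c0 + 1/3*c1 palette entry. */
    uint8_t r0_5 = (uint8_t)((c0 >> 11) & 0x1F);
    uint8_t r1_5 = (uint8_t)((c1 >> 11) & 0x1F);

    /* Path A: expand endpoints to 8 bits first, then blend (higher precision). */
    uint8_t a = (uint8_t)((2 * expand5(r0_5) + expand5(r1_5)) / 3);

    /* Path B: blend the 5-bit values directly, then expand
     * (roughly the "16-bit interpolation" being discussed). */
    uint8_t b = expand5((uint8_t)((2 * r0_5 + r1_5) / 3));

    printf("8-bit-first blend: %u, 5-bit blend: %u\n", a, b);  /* 134 vs 132 */
    return 0;
}

Each block quantizes a little differently, so over a smooth gradient like a sky those small per-block differences show up as banding, and when Quake 3 blends two sky layers the errors accumulate.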
 
Sabastian said:
TAP TAP TAP... You know something, I am still waiting for every site on the net to jump on nVidia's back about this the way they did ATI. I don't understand the double standard at all... nVidia should be hung out to dry the way ATi was for their "driver bug". :-?

Sabastian

I'm still waiting...........

Oh come on, this is old news, and dozens of sites have run stories and articles about it in the past, including some of the big and more influential ones. Nvidia has had their fair share of bad press about this issue, and about others too.

The thing that really annoys me is that Nvidia hasn't been able to entirely fix this problem yet ...
 
No Golum, you are wrong. The internet nearly tore ATi to pieces over it. nVidia gets a slap on the hand that's noteworthy for one day???? That's it?

Sabastian
 
noko said:
The GF3 still has bad texture quality when using DXTC1 and I presume the GF4 isn't much better.

The GF4 still uses 16-bit interpolation, but it dithers, which looks better in some situations and worse in others.

In later Q3 point releases (since 1.27-ish, I think) texture compression has been off by default rather than on.
 

Please do correct me if I'm miles off the mark here but doesn't the general order of things go:

application > API (and drivers) > HAL > hardware

How is 3DMark supposed to get around this? Surely it's the fact that the drivers can be tweaked to hell and back that raises doubts of the results from ANY 3D benchmark, and not just MadOnion's.

You're correct, that's the general order of things.

BUT, what I wanted to say is this: to produce the most reliable benchmarks, a programmer writing a benchmark would be better off writing directly to the hardware layer (read: writing their own assembler code that talks directly to the hardware, in other words a sort of limited "driver"), in order to skip dependencies on third parties like the driver manufacturer or an API (DirectX, OpenGL, ...).

The question in this case is: how can driver developers cheat a benchmark? Answer: by checking which application is using the driver's services. They already do this sort of check to fix compatibility with certain programs or games (even specific versions), so what makes you think they can't do the same with 3DMark for tweaking?
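As a rough illustration of how simple that per-application detection is (this is just a sketch of the concept on Windows, not anyone's actual driver code), a user-mode component only has to ask which executable loaded it:

Code:
/* Illustration only: how a user-mode DLL (e.g. part of a driver) could, in
 * principle, find out which application loaded it. Not anyone's actual code. */
#include <windows.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    char path[MAX_PATH];

    /* Passing NULL returns the full path of the process executable. */
    if (GetModuleFileNameA(NULL, path, sizeof(path)) == 0)
        return 1;

    /* Special-casing could then key off the executable's name. */
    if (strstr(path, "3DMark") != NULL)
        printf("benchmark detected: %s\n", path);
    else
        printf("running under: %s\n", path);

    return 0;
}

Once it knows it is running under a particular .exe, nothing stops it from switching to a special code path, whether that's a legitimate compatibility fix or a benchmark-specific tweak.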

Personally I'm more concerned about the ethics involved in the cheating than about the number the benchmark produces. There's no better benchmark than actually using a product...

I'd probably be a more likely candidate for an upgrade to Parhelia than to NV30... (knowing and experiencing Nvidia's image "quality" first hand, compared to, for instance, Matrox's).
 