3DMark05 and certain websites

Mariner said:
Oddly enough, it seems to me the members of Neeyik's Group 2 tend to be somewhat NVidia-centric. Perhaps this is because they tend to agree with NV's previous denigration of 3DMark after all the trouble regarding the 'optimisations'?

Yes, back when nVidia's 3DMark2000 optimisations were not discovered/called into question, it was the other way round, right? nVidia fans bashed 3dfx fans over the head with their 3DMarks.
 
trinibwoy said:
Well they may be receiving some bashing of their own this time around if the Inq is right.

It's not like Nvidia will be able to claim ATI are cheating, given their own record and current position on optimising for this particular benchmark.
 
Bouncing Zabaglione Bros. said:
trinibwoy said:
Well they may be receiving some bashing of their own this time around if the Inq is right.

It's not like Nvidia will be able to claim ATI are cheating, given their own record and current position on optimising for this particular benchmark.

That's precisely why they may be receiving a bashing this round with no recourse but to shut up and take it. Unless they have their own 3dmark05 super-driver in the works :LOL:
 
trinibwoy said:
That's precisely why they may be receiving a bashing this round with no recourse but to shut up and take it. Unless they have their own 3dmark05 super-driver in the works :LOL:

Yeah, we are not going to see Nvidia drop out of the 3DMark programme and claim the benchmark is biased against them.

Are we? :LOL: :oops:
 
I can't wait to find out where ATI got that massive boost from. Either there was some horrible deficiency before, or they did some 3DM05-specific tuning à la Nvidia.
 
I think a lot of people in Group 2 worry about people focusing TOO much on 3DMark. If you base your decisions solely, or to a large part, on one benchmark, then that is a problem. It's the same issue if you bought a card just for Doom 3 (unless you only want to play Doom 3). I agree with Groups 2 and 3. I know 3DMark is useful if benched correctly, but I also worry about average Joes using it and taking it too seriously. We need to remember what 3DMark is: A BENCHMARK. In life generally, not just in graphics performance, benchmarks are great for predicting things, but they are not always right. People just need to keep that in mind.
 
trinibwoy said:
I can't wait to find out where ATI got that massive boost from. Either there was some horrible deficiency before, or they did some 3DM05-specific tuning à la Nvidia.

That new driver is supposed to fix a memory allocation bug for 256MB cards, plus ATI have been talking about lots of headroom left in the memory/job scheduler for a long time. Is it possible ATI have been sandbagging in order to ambush Nvidia on the 3DMark05 benchies?

ATI got so beaten up over D3 when they weren't *that* much slower, so maybe they are fighting back with 3DMark and then HL2? Maybe ATI realised this time around that they need to get their updated code out *before* the target application ships, not a few weeks later?
 
CMAN said:
I think a lot of people in Group 2 worry about people focusing TOO much on 3DMark. If you base your decisions solely, or to a large part, on one benchmark, then that is a problem. It's the same issue if you bought a card just for Doom 3 (unless you only want to play Doom 3). I agree with Groups 2 and 3. I know 3DMark is useful if benched correctly, but I also worry about average Joes using it and taking it too seriously. We need to remember what 3DMark is: A BENCHMARK. In life generally, not just in graphics performance, benchmarks are great for predicting things, but they are not always right. People just need to keep that in mind.

I think this is part of it. Currently 3DMark is regarded more highly than any individual game benchmark by lots of folks. So maybe it's not that 3DMark isn't useful, but that it's a bit overrated. Imagine if one day Futuremark decided to design its benchmarks to be friendlier to a particular IHV's hardware, à la Doom or HL? That would be chaos.
 
Bouncing Zabaglione Bros. said:
That new driver is supposed to fix a memory allocation bug, plus ATI have been talking about lots of headroom left in the memory/job scheduler for a long time. Is it possible ATI have been sandbagging in order to ambush Nvidia on the 3DMark05 benchies?

ATI got so beaten up over D3 when they weren't *that* much slower, so maybe they are fighting back with 3DMark and then HL2? Maybe ATI realised this time around that they need to get their updated code out *before* the target application ships, not a few weeks later?

Yep, that would explain it. But then it should mean 30% improvements in almost every game :oops:
 
trinibwoy said:
Yep, that would explain it. But then it should mean 30% improvements in almost every game :oops:

Also, ATI were talking about changing scheduling depending on what the application needs. If the new AI stuff is enabled in these drivers, they might be doing on-the-fly scheduling optimisations, adapting to the different requirements of each 3DMark test. :oops: That could be quite a clever trick, especially if it is just scheduling and is still mathematically correct.
 
Mariner said:
I'm expecting NV40 and R420 to be pretty much on a par in the overall 3DMark2005 score (as they are in games, Doom3 being the exception). If one outperforms the other greatly, I hope we won't be encountering the same FUD/driver cheats which we've seen previously.

I'm not. 3DMark05 uses shadowmapping, and that could be the key factor in performance, since NVIDIA has hardware features to support shadowmaps while ATi does not. I see 3 possible scenarios:

1) 3DMark05 does not use NVIDIA's features. The performance comes from the arithmetic shading power of the GPU. R420 is probably the fastest, by a small margin.

2) 3DMark05 uses NVIDIA's features. ATi has to settle for less image quality because it lacks a PCF filter in hardware. R420 may be about as fast as NV40, but with significantly lower shadow quality, so ATi still loses.

3) 3DMark05 uses NVIDIA's features. To achieve the same image quality on ATi cards, a more complex shader is used, to emulate PCF. NV40 will be significantly faster.

If I interpreted the scarce info on 3DMark05 correctly, it will be using NVIDIA's hardware features. In which case I think option 3) is the most likely, based on the history of 3DMark.
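
[Editor's note: for readers unfamiliar with PCF, the gap between scenarios 2 and 3 is roughly what the CPU-side C sketch below illustrates. Nothing in it comes from 3DMark05 or either vendor's driver; the tiny 4x4 depth map and the function names are invented for illustration. Hardware PCF does the depth comparisons and the bilinear weighting inside a single filtered texture lookup, whereas emulating it in a shader means spelling out the four taps, four comparisons, and the blend by hand, which is the extra per-pixel cost scenario 3 refers to.]

/* pcf_sketch.c -- rough illustration of a single shadow-map comparison
 * versus emulated 2x2 percentage-closer filtering (PCF). Everything here
 * is made up for illustration; real implementations do this per pixel in
 * a shader or, on hardware with PCF support, inside the texture unit.
 * Build: cc pcf_sketch.c -lm -o pcf_sketch
 */
#include <stdio.h>
#include <math.h>

#define MAP_SIZE 4

/* Hypothetical shadow map: depth of the closest occluder as seen from the
 * light, in [0,1]. Left half has a nearby occluder, right half does not. */
static const float depth_map[MAP_SIZE][MAP_SIZE] = {
    {0.20f, 0.20f, 0.90f, 0.90f},
    {0.20f, 0.20f, 0.90f, 0.90f},
    {0.20f, 0.20f, 0.90f, 0.90f},
    {0.20f, 0.20f, 0.90f, 0.90f},
};

static float fetch(int x, int y)
{
    if (x < 0) x = 0;
    if (y < 0) y = 0;
    if (x >= MAP_SIZE) x = MAP_SIZE - 1;
    if (y >= MAP_SIZE) y = MAP_SIZE - 1;
    return depth_map[y][x];
}

/* One tap, one comparison: the pixel is either fully lit (1.0) or fully in
 * shadow (0.0), so shadow edges alias. */
static float shadow_single_tap(float u, float v, float receiver_depth)
{
    return receiver_depth <= fetch((int)(u * MAP_SIZE), (int)(v * MAP_SIZE))
               ? 1.0f : 0.0f;
}

/* Emulated 2x2 PCF: compare the receiver depth against four neighbouring
 * texels first, then bilinearly blend the binary results. This is the extra
 * shader work scenario 3 talks about; hardware PCF folds the comparisons
 * and the blend into one filtered lookup. */
static float shadow_pcf_2x2(float u, float v, float receiver_depth)
{
    float tx = u * MAP_SIZE - 0.5f;
    float ty = v * MAP_SIZE - 0.5f;
    int   x0 = (int)floorf(tx);
    int   y0 = (int)floorf(ty);
    float fx = tx - (float)x0;
    float fy = ty - (float)y0;

    float s00 = receiver_depth <= fetch(x0,     y0    ) ? 1.0f : 0.0f;
    float s10 = receiver_depth <= fetch(x0 + 1, y0    ) ? 1.0f : 0.0f;
    float s01 = receiver_depth <= fetch(x0,     y0 + 1) ? 1.0f : 0.0f;
    float s11 = receiver_depth <= fetch(x0 + 1, y0 + 1) ? 1.0f : 0.0f;

    float top    = s00 + (s10 - s00) * fx;   /* blend along x, upper row   */
    float bottom = s01 + (s11 - s01) * fx;   /* blend along x, lower row   */
    return top + (bottom - top) * fy;        /* blend the two rows along y */
}

int main(void)
{
    /* Walk across the shadow boundary: the single tap snaps from shadowed
     * to lit, while the PCF result fades smoothly. */
    for (int i = 3; i <= 7; ++i) {
        float u = (float)i * 0.1f;
        printf("u=%.1f  single=%.2f  pcf=%.2f\n",
               u, shadow_single_tap(u, 0.5f, 0.5f),
               shadow_pcf_2x2(u, 0.5f, 0.5f));
    }
    return 0;
}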
 
Scali said:
If I interpreted the scarce info on 3DMark05 correctly, it will be using NVIDIA's hardware features. In which case I think option 3) is the most likely, based on the history of 3DMark.

So what about Dave's claim that Nvidia is pressuring the press to use some new emergency driver? If they held some significant advantage they wouldn't need to.
 
trinibwoy said:
So what about Dave's claim that Nvidia is pressuring the press to use some new emergency driver? If they held some significant advantage they wouldn't need to.

I must have missed that claim. Is this driver being pushed for all NVIDIA products, or only for a certain range, say the FX series? There's no doubt in my mind that the FX series still performs like rubbish even if PCF is used.
There is of course also the possibility that even though PCF is used, the 6800 is still slower than the X800. Humus' demos have demonstrated that the 6800's released drivers are still very poor with geometry instancing or user clip planes. Perhaps these features are used, and the new drivers fix performance for this.
In short, I have no idea; this was just my guess. We'll know tomorrow, I suppose (another guess of mine, the release date :)).
 
Scali said:
trinibwoy said:
So what about Dave's claim that Nvidia is pressuring the press to use some new emergency driver? If they held some significant advantage they wouldn't need to.

I must have missed that claim. Is this driver being pushed for all NVIDIA products, or only for a certain range, say the FX series? There's no doubt in my mind that the FX series still performs like rubbish even if PCF is used.
There is of course also the possibility that even though PCF is used, the 6800 is still slower than the X800. Humus' demos have demonstrated that the 6800's released drivers are still very poor with geometry instancing or user clip planes. Perhaps these features are used, and the new drivers fix performance for this.
In short, I have no idea; this was just my guess. We'll know tomorrow, I suppose (another guess of mine, the release date :)).

It's being pushed for all NV4x hardware at least. Whether or not it's a 3DMark05 driver remains to be seen, but it breaks so much other stuff, I'm not going to bother testing it.

Rys
 
Well, my personal reasons for not trusting 3DMark go like this.

Firstly, my biases.
Professionally, I'm Nvidia-biased. I use lots of 3DLabs and Nvidia cards at work, they serve me very well, and to be honest, ATI cards have given me quite a lot of grief at work. Personally, I'm ATI-biased. At home, I still use a 9600, having RMA'ed my 6800 for freezing issues. How on earth do you ship a product that freezes in-game? Nobody caught this in testing? Utter rubbish.

As to why I don't trust 3DMark? Simple. If I had simply looked at the 3DMark2003 scores, I would have assumed the 6800Ultra is the fastest graphics card bar none: a large clockspeed differential against the X800XTPE, yet almost equivalent 3DMarks. Given its SM3.0 support, it's almost a no-brainer. But as I said earlier, the individual game tests of 3DMark do give an indication of each video card's limitations and capabilities. The whole benchmark, looked upon as a black-box result generator, doesn't.

So how does 3DMark help? The benchmark as a whole fails to give a good indication of a video card's performance in the game you want. It's a very general indicator of how FM thinks the card should perform, based on their scoring system. So if a manufacturer tailor-builds a video card for 3DMark (like Nvidia did for Doom 3), would we agree that said video card is the best? As much as the individual game tests give us an idea of the limitations of each card, do we know which real-world game each game test represents?

Which is why I still prefer to benchmark by engine. Whenever a review comes out, I skip straight to the marquee games and look at how each graphics card performs on each game engine, and so I'm better informed about how games based on a certain engine will perform on each graphics card. If engine A works well on graphics card A, future games based on engine A should also work well on graphics card A.

That was how I realized the fastest card is the X800XTPE, not the 6800Ultra.
 
Smurfie said:
That was how I realized the fastest card is the X800XTPE, not the 6800Ultra.

Given the monologue you gave just before this statement, I think it should read something like: "That was how I realized the fastest card in my favourite game/application is the X800XTPE, not the 6800Ultra."
 
Scali said:
Smurfie said:
That was how I realized the fastest card is the X800XTPE, not the 6800Ultra.

Given the monologue you gave just before this statement, I think it should read something like: "That was how I realized the fastest card in my favourite game/application is the X800XTPE, not the 6800Ultra."

My favorite game currently is City of Heroes, which the 6800s seem to do a lot better in. :)
 