HardOCP and Doom 3 benchmarks

indio

Newcomer
Very sad that HardOCP would do this. They benchmark a game that isn't out with cards from two manufacturers. Manufacturer A sets up the tests. Manufacturer B PROBABLY isn't even aware that a test for the game (which isn't available yet) is going to happen. One driver set is clearly broken, the other cripples the hardware. This clearly indicates ATI wasn't prepared for this (is that any surprise?). Nvidia was.
Where are HardOCP's journalistic standards?
 
I honestly think it is a fair reflection of how these cards will perform in Doom 3 (assuming comparable IQ - hard to conclude from the article, I know). Perhaps ATi will gain a tiny bit of ground with the Cat 3.4s, but it's not as if it hasn't been widely speculated that the NV3x architecture would ultimately excel at stencil-based rendering.
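
For anyone wondering why stencil throughput matters so much here: Doom 3's unified lighting leans on stencil shadow volumes, so the stencil buffer gets hammered every frame, per light. A minimal sketch of the classic z-pass technique in OpenGL (Doom 3 itself reportedly uses a z-fail variant; drawSceneAmbient(), drawShadowVolume() and drawSceneLit() are hypothetical helpers, not id's code):

```c
/* Z-pass stencil shadow volumes - a sketch, not id's actual code. */

/* 1. Depth pass: lay down the scene with ambient light only. */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT | GL_STENCIL_BUFFER_BIT);
drawSceneAmbient();

/* 2. Stencil pass: rasterize shadow volumes, no color or depth writes. */
glEnable(GL_STENCIL_TEST);
glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
glDepthMask(GL_FALSE);
glStencilFunc(GL_ALWAYS, 0, ~0u);

glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);                   /* front faces increment... */
glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);
drawShadowVolume();

glCullFace(GL_FRONT);                  /* ...back faces decrement. */
glStencilOp(GL_KEEP, GL_KEEP, GL_DECR);
drawShadowVolume();

/* 3. Light pass: add the light only where the stencil count is zero. */
glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
glCullFace(GL_BACK);
glStencilFunc(GL_EQUAL, 0, ~0u);
glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
glDepthFunc(GL_EQUAL);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);           /* additive lighting */
drawSceneLit();
```

Steps 2 and 3 repeat for every light in view, which is why raw stencil fill rate ends up dominating.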

MuFu.
 
indio said:
Very sad that HardOCP would do this. They benchmark a game that isn't out with cards from two manufacturers. Manufacturer A sets up the tests. Manufacturer B PROBABLY isn't even aware that a test for the game (which isn't available yet) is going to happen. One driver set is clearly broken, the other cripples the hardware. This clearly indicates ATI wasn't prepared for this (is that any surprise?). Nvidia was.
Where are HardOCP's journalistic standards?

Er, question.

How can ATI not be aware when their card was used to display the game last year, on a 9700 Pro no less? /boggle

But I will grant you that running benchmarks on unreleased games is rather silly.
 
I think using Doom3 tests with id's blessing makes it overall OK, especially since id took the time to reject nVidia's own demo and make their own.

However, I do agree that ATI basically was not aware this was coming; had they known, the Cat 3.4s would be working properly. That is unfortunate, and not totally fair. I'm also a bit turned off by the fact that this seems to be a "one-shot deal." That is, with new drivers and new hardware coming, it's not clear that the tests will be re-run. Will we get new benchmarks when the next R3xx card is released? Or only when the next NV3x card is released? I would be much more comfortable if id provided the sites with the build, so that they could re-run benchmarks at any time.

The way this is being handled, it might be id's way of "paying back" ATI for leaking the Doom alpha (assuming an ATI employee did leak it), by letting nVidia more or less dictate when and how Doom3 benchmarks are released.

So it does more or less paint an unclear picture of Doom3 performance, because nVidia obviously had the goal of tweaking the hell out of their drivers for this test and ATI didn't have that luxury. For example, the R350 is supposed to have stencil and HyperZ working much more optimally than the R300, but it's not clear whether that had been implemented in the drivers yet.
 
I'm assuming the point of the benchmark is to show relative performance on a level playing field. Do you think it's fair that one side knew there was a game and the other side didn't find out until the game was over? Especially when people WILL make buying decisions based on what they read.
 
indio said:
Very sad that HardOCP would do this. They benchmark a game that isn't out with cards from two manufacturers. Manufacturer A sets up the tests. Manufacturer B PROBABLY isn't even aware that a test for the game (which isn't available yet) is going to happen. One driver set is clearly broken, the other cripples the hardware. This clearly indicates ATI wasn't prepared for this (is that any surprise?). Nvidia was.
Where are HardOCP's journalistic standards?
You gotta be kidding me. Who in their right mind would refuse such a chance?

I thought [H] and Anand gave the proper disclaimers regarding their tests, and that's enough for me to understand what this means (the usual caveats re: ATi, the unreleased game, etc.).
 
John Carmack has said a couple of times that the GeForce FX was still ahead in performance.

Is it so hard to believe that this has finally been confirmed by benchmarks?
 
indio said:
I'm assuming the point of the benchmark is to show relative performance on a level playing field. Do you think it's fair that one side knew there was a game and the other side didn't find out until the game was over?

No, I don't think it's completely fair (as I said).

However, you do have to keep in mind that id is comfortable with having the scores released as a general indicator of current performance, so that has to count for something in the name of "fairness."

If ATI ever asks id to give review sites another crack at running the tests (with a new driver build or new hardware) and id refuses, then I would say that's VERY unfair.

Again, I'm most disappointed that this appears to be a one-shot deal at this time. It doesn't allow reviewers to go back and look at details (rendering path used, etc.), and it leaves a lot of unanswered questions that could be examined if they had the build and could re-run the tests...
 
Chalnoth said:
John Carmack has said a couple of times that the GeForce FX was still ahead in performance.

Is it so hard to believe that this has finally been confirmed by benchmarks?

Only if the GFFX-specific path was used, not the ARB2 path. Using the standard ARB2 rendering path, the GFFX was at most half as fast as the 9700 Pro, not to mention the 9800.
 
indio said:
I'm assuming the point of the benchmark is to show relative performance on a level playing field. Do you think it's fair that one side knew there was a game and the other side didn't find out until the game was over? Especially when people WILL make buying decisions based on what they read.

As I said, I believe ATI knew, since they demoed Doom 3 on their hardware before.

My opinion is that it is rather silly to benchmark a game that has yet to be released, but that's just my opinion. As with most benchmarks, take these with a grain of salt...

It's sort of like benchmarking a piece of hardware that is still in the final phase of development: neither the hardware nor the drivers are mature, yet we sit here nitpicking the results.

Truth be told, they are all just preliminary results.
 
Yes, and it is most likely that performance will increase more for the NV3x architecture than for the R3xx architecture before release.

Remember that the NV3x architecture is newer, and there is certainly much more headroom for improvement in future driver releases.
 
Chalnoth said:
Does it matter, Natoma?

I think it does. It shows a lack of standards adherence from Nvidia. And from the tenor of Carmack's .plan updates, it seems that he's quite miffed that he has to code a completely separate path for the Nvidia cards to get them to work correctly.

So I think it does matter. I haven't had a chance to read the (p)reviews yet, but I wonder if the DOOM3 benchies were run on the ATI and Nvidia cards using ARB2 and the Nvidia-specific path, as well as the ATI path.
 
Natoma said:
Chalnoth said:
John Carmack has said a couple of times that the GeForce FX was still ahead in performance.

Is it so hard to believe that this has finally been confirmed by benchmarks?

Only if the GFFX-specific path was used, not the ARB2 path. Using the standard ARB2 rendering path, the GFFX was at most half as fast as the 9700 Pro, not to mention the 9800.
To a gamer for whom DOOM3 is his life, I fail to see how this is important. He will buy the hardware that runs DOOM3 best. In fact, this should essentially be the case for every game benchmarked in a review: the card that runs a certain game best simply says "This is the card that runs GameX the best." The reviewer would have to word his "Overall Score" conclusion carefully, of course, if a bunch of games are benchmarked in a review/shootout.
 
Natoma said:
Chalnoth said:
Does it matter, Natoma?

I think it does. It shows a lack of standards adherence from Nvidia. And from the tenor of Carmack's .plan updates, it seems that he's quite miffed that he has to code a completely separate path for the Nvidia cards to get them to work correctly.

So I think it does matter. I haven't had a chance to read the (p)reviews yet, but I wonder if the DOOM3 benchies were run on the ATI and Nvidia cards using ARB2 and the Nvidia-specific path, as well as the ATI path.

I'm not seeing much difference between this rendering-path issue and the traditional anisotropic filtering hubbub between ATI and nVidia (i.e. nVidia's application mode being slightly better/more comprehensive than ATI's best mode, but coming with too large a performance penalty to really be competitive). If the difference in visual quality between the two is essentially nonexistent, then I have no issues with it.
 
Ok people, I'm sure we will hear from JC on this in the very near future. But please take a step back and THINK. This runs totally counter to what he has been saying. Do any of you honestly think that JC would let this happen? Do you think he can afford for this to happen? For him to purposely appear to favor one IHV to this extent would be a terrible thing for id, ATI and the industry as a whole. In fact, the only one that would benefit from this IS nVidia...

And, given nVidia's past PR efforts, shouldn't we wait this out a bit and see before making any pronouncements?
 
All hail to the closet fanbois.

The DOOM3 benchmarks are worthless IMO; anyone with half a brain can see that. I won't buy a card for a single game based on benchmarks using a build that doesn't reflect the shipping product, with drivers that will be outdated once the game ships. I will buy whatever gives me the best overall performance with good image quality at a reasonable price.
 
Natoma said:
Chalnoth said:
Does it matter, Natoma?

I think it does. It shows a lack of standards adherence from Nvidia.
What?! What is the "standard"? We're talking OpenGL!

OpenGL is "open" -- until we see the demise of IHV-specific extensions, we have to live with it. Carmack included.

And from the tenor of Carmack's .plan updates, it seems that he's quite miffed that he has to code a completely separate path for the Nvidia cards to get them to work correctly.
Do you care about Carmack, or about gamers, or about yourself?

Developers in the PC industry have had to live with this extra work for ages. Carmack obviously puts in a lot of work to make sure his games run best on all sorts of hardware. If you are complaining for political reasons, that's fine. But if you're complaining as if this doesn't result in better games all round, please...
 
martrox said:
Do any of you honestly think that JC would let this happen?

Absolutely - if it is to the benefit of the game. I doubt he'd have any qualms about choosing a default path for nV cards that trades a little IQ for a lot of performance.

You can't have a perfectly level playing field because it's all about balancing IQ and performance - the "balancing point" for each product is different.

MuFu.
 
Natoma said:
It shows a lack of standards adherence from Nvidia. And from the tenor of Carmack's .plan updates, it seems that he's quite miffed that he has to code a completely separate path for the Nvidia cards to get them to work correctly.

:rolleyes:

Please get your facts straight.
Carmack has done such things for YEARS.
He has paths for a LOT of architectures in Doom 3: R200, NV20, NV10, ARB2, ... - it's not as if adding one more was monumental in comparison.
Heck, remember the original Quake? He even did a Verite version of it! He has always done different paths. It's one of his habits, and it's (partly) why his engines are so good and fast.
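
In practice, a path-per-architecture engine boils down to a best-first probe at startup, reusing a hasExtension() check like the one sketched a few posts up. The enum, the ordering, and the probe are hypothetical illustration, not id's actual code; at the time of these benchmarks the FX reportedly defaulted to its own lower-precision path rather than ARB2:

```c
extern int hasExtension(const char *name);  /* probe as sketched earlier */

typedef enum {
    PATH_NV30, PATH_ARB2, PATH_R200, PATH_NV20, PATH_NV10
} renderPath_t;

renderPath_t selectRenderPath(void)
{
    if (hasExtension("GL_NV_fragment_program"))   return PATH_NV30; /* FX-specific, faster */
    if (hasExtension("GL_ARB_fragment_program"))  return PATH_ARB2; /* generic, high precision */
    if (hasExtension("GL_ATI_fragment_shader"))   return PATH_R200;
    if (hasExtension("GL_NV_register_combiners")) return PATH_NV20;
    return PATH_NV10;  /* lowest common denominator */
}
```

Writing and maintaining one more entry in that chain is routine work for him, not some heroic concession.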

Carmack just likes speed and IQ. Saying he "had to make another path to make it work right" (rough quote) is kinda, well, false.


Uttar
 