HardOCP and Doom 3 benchmarks

Doomtrooper said:
Interesting...thx for the info Joe..

Just FYI, that info came from this thread:

http://www.beyond3d.com/forum/viewtopic.php?t=5776

Joe DeFuria said:
Also, get on the horn with Carmack, and get him to send you a copy of the Doom3 benchmark. ;)

DaveBaumann said:
The good Reverend already did, id's reply: "Talk to NVIDIA".
 
stevem said:
Joe DeFuria said:
...free and unfettered access...
Now isn't that a phrase that's crept into the vernacular...? I hope this doesn't mean we're dealing with WMDs...

Thomas Pabst obviously hasn't been successful inspecting Nvidia. There are still a lot of reports about Doom III on Nvidia coming out through reviewers. Unless ATI gets free and unfettered access soon, I think they have no choice but to invade. Hopefully they can liberate the gaming populace in the process and keep IHV casualties to a minimum. The King of Spades, Jen-Hsun Huang, must be brought before the independent FutureMark Beta coalition to pay for his marketing crimes!
 
BenSkywalker said:
Instead, we have people speculating that there will be noticeable differences between the rendering paths in Doom3, and it is being made into a large issue, while we know there are major differences in SplinterCell, there isn't much commentary from the reviewers, and that is apparently OK. Why aren't people flaming {H}, Anand and Tom for ignoring the superior IQ of nV boards running SC? We know that's real; it requires no speculation.

Hmmm, has this issue ever been fixed:

http://www.hardocp.com/article.html?art=NDQ0LDM=

I'd say that SC looks much worse on NV hardware in those shots. That may have been fixed, though; I don't have an Nvidia board, so I don't know.

The sites running the Splinter Cell benchmark are also running the same codepath on both vendors' hardware when benchmarking. Will they do the same with Doom3?
 
boobs said:
Does anyone care to take a guess at what ATI will be able to do in their next driver to improve performance over the 3.2 drivers?

I am guessing their next drivers will kill performance, dropping it into the 10 fps range. Why? Because we saw what the next available drivers (Cat 3.4) did. Those are the next drivers from ATI, released with the R9800-256, and from what I can tell they will in fact be the next driver release. But they were "broken" for all the review sites. I still want to know why none of the sites could get a meaningful number out of D3 with these "public" drivers.

If the Cat 3.4s really are "fully standard compliant," why did they tank when the last set worked fine? Or are there things broken in the drivers that none of the "standard" benchmark programs have revealed yet? Or is it that the version of D3 released didn't properly recognize the drivers? That seems like a possibility, but it raises a lot more questions.
 
DadUm said:
I still want to know why none of the sites could get a meaningful number out of D3 with these "public" drivers.

Does it really matter? D3 isn't a "public" game yet.
 
Joe DeFuria said:
Doomtrooper said:
Interesting...thx for the info Joe..

Just FYI, that info came from this thread:

http://www.beyond3d.com/forum/viewtopic.php?t=5776

Joe DeFuria said:
Also, get on the horn with Carmack, and get him to send you a copy of the Doom3 benchmark. ;)

DaveBaumann said:
The good Reverend already did, id's reply: "Talk to NVIDIA".
Please refer to my reply/comment in the same thread quoted above. I don't know what exactly is going on and until I do, I hope you guys won't speculate too wildly.
 
Joe DeFuria said:
Joe DeFuria said:
I will feel MUCH better about the situation if in a few weeks or so, id re-releases the same benchmark after ATI having a chance to optimize for it.

Apparently (according to what iD told Dave), NVIDIA, and not iD, is actually controlling the release of this benchmark. I guess the chances of the above happening are about nil. :rolleyes:

At this point, I take back what I said at the very beginning of this thread about this being somewhat legitimate...
id corresponded with me, not Dave, regarding this, and Dave's "Talk to NVIDIA" comment (linked just above this post) should not be misconstrued, as is evident from Joe's thoughts/comments. I don't know what exactly is going on, and if I don't, neither Joe, Dave nor anyone else can claim to know (based on the info in the link provided above).

Don't speculate about this.
 
Reverend said:
Don't speculate about this.

You need to get nvidia's permission to run / acquire a Doom3 benchmark. Full stop. Correct? No speculation there.

Asking us to not speculate WHY nVidia has a say in this is out of line, IMO. There would be no speculation involved, if id was "allowed" to release their own benchmark without nVidia's "blessing". :rolleyes:
 
Joe DeFuria said:
Asking us to not speculate WHY nVidia has a say in this is out of line, IMO. There would be no speculation involved, if id was "allowed" to release their own benchmark without nVidia's "blessing".

Perhaps nVidia will be the ones held accountable if there is a leak.

Edit-

Meant to post this earlier and forgot. From Carmack:

"We have been planning to put together a proper pre-release of Doom for benchmarking purposes, but we have just been too busy with actual game completion. The executable and data that is being shown was effectively lifted at a random point in the development process, and shows some obvious issues with playback, but we believe it to be a fair and unbiased data point. We would prefer to show something that carefully highlights the best visual aspects of our work, but we recognize the importance of providing a benchmark for comparison purposes at this time, so we are allowing it to be used for this particular set of tests. We were not happy with the demo that Nvidia prepared, so we recorded a new one while they were here today. This is an important point -- while I'm sure Nvidia did extensive testing, and knew that their card was going to come out comfortably ahead with the demo they prepared, right now, they don't actually know if the demo that we recorded for them puts them in the best light. Rather nervy of them, actually.

The Nvidia card will be fastest with "r_renderer nv30", while the ATI will be a tiny bit faster in the "r_renderer R200" mode instead of the "r_renderer ARB2" mode that it defaults to (which gives some minor quality improvements). The "gfxinfo" command will dump relevant information about the functioning renderer modes and optimizations. At some point, after we have documented all of the options and provided multiple datasets, Doom is going to be an excellent benchmarking tool, but for now you can still make some rough assessments with it."
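For reference, the settings Carmack mentions are ordinary console cvars, so a benchmarking session would look roughly like this at the in-game console. This is a sketch based only on the commands he names above; the exact syntax accepted by this pre-release build is an assumption:

    r_renderer arb2    // default path; minor quality improvements
    r_renderer nv30    // NVIDIA-specific path, fastest on the NV30
    r_renderer r200    // ATI path, a tiny bit faster than ARB2 on current ATI cards
    gfxinfo            // dumps the active renderer modes and optimizations

(You would presumably also need to restart the renderer after switching paths; whether this build exposes a vid_restart-style command for that is likewise an assumption.)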
 
B3D: "Is there any way we could work together on a similar basis as per what was accorded to HardOCP and AnandTech?"
"

ID: "Anthony,

NVIDIA chose those outlets - we did not. It would be best to place this request with them."


I see nothing about the Demo; I see asking for a meeting.
 
andypski said:
When it comes to rendering in DirectX it is a general rule that the reference rasterizer is the specification. In Microsoft's own words the reference rasterizer "Supports every Direct3D feature".
Except Z bias! :? (Until DX9 that is.)
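Since the reference rasterizer came up, here is a minimal C++ sketch (my own illustration, assuming the DX9 SDK, not code from any driver or game) of selecting the refrast instead of the hardware HAL, and setting the depth-bias render state that only became available to it in DX9:

    #include <d3d9.h>

    // Create a device on the reference rasterizer (D3DDEVTYPE_REF) rather
    // than the hardware HAL. hwnd is assumed to be an existing window.
    bool CreateRefDevice(HWND hwnd, IDirect3DDevice9** outDev)
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return false;

        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed   = TRUE;
        pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

        HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_REF,
                                       hwnd, D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                       &pp, outDev);
        if (FAILED(hr)) { d3d->Release(); return false; }

        // New as of DX9: depth bias is a render state the refrast honours.
        // The value here is arbitrary, purely for illustration.
        float bias = -0.0005f;
        (*outDev)->SetRenderState(D3DRS_DEPTHBIAS, *(DWORD*)&bias);
        return true;
    }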
 
Joe DeFuria said:
Reverend said:
Don't speculate about this.

You need to get nvidia's permission to run / acquire a Doom3 benchmark. Full stop. Correct? No speculation there.

Asking us to not speculate WHY nVidia has a say in this is out of line, IMO. There would be no speculation involved, if id was "allowed" to release their own benchmark without nVidia's "blessing". :rolleyes:
Incorrect. From the way I read id's reply to me, it meant id preferred I liaise with NVIDIA to arrange a benchmarking session. Big difference.

Again, you're either speculating or jumping the gun.

PS. 2AM, going to bed, so don't expect any further quick responses from me.
 
Doomtrooper said:
B3D: "Is there any way we could work together on a similar basis as per what was accorded to HardOCP and AnandTech?"
"

ID: "Anthony,

NVIDIA chose those outlets - we did not. It would be best to place this request with them."


I see nothing about the Demo; I see asking for a meeting.
Holy crap! I'm in agreement with Doom! I think iD is thinking of it in terms of Beyond3d wanting to benchmark with equipment provided by NVIDIA. iD has no control over who gets the equipment, which is why he said: go ask NVIDIA if you want to meet here and use their equipment and run my 'demo'.

(I think you chicken little types are a bit off base on this one)
 
Carmack said:
...The executable and data that is being shown was effectively lifted at a random point in the development process,

Translation: "nVidia picked a build from some point in the development process, and as far as we're concerned, that's random enough."

...but we recognize the importance of providing a benchmark for comparison purposes at this time, so we are allowing it to be used for this particular set of tests.

Translation: "We realize that for the past 10 months ATI has had the clearly superior hardware, so providing a .plan update that indicated that, or worse, providing a benchmark "during that time" to show it, wasn't important for comparison purposes. On the day that nVidia introduces a competitive part, the defintion of "important at this time" has been fulfilled."

We were not happy with the demo that Nvidia prepared, so we recorded a new one while they were here today.

Translation: "We're not COMPLETELY off our rocker and realize that we could not spin enough to justify actually using an nVidia created demo, and have the community accept that."

This is an important point -- while I'm sure Nvidia did extensive testing, and knew that their card was going to come out comfortably ahead with the demo they prepared, right now, they don't actually know if the demo that we recorded for them puts them in the best light. Rather nervy of them, actually.

Translation: "nVidia was not nervy enough to give ATI the heads up and some time to compete with tuned drivers."

At some point, after we have documented all of the options and provided multiple datasets, Doom is going to be an excellent benchmarking tool, but for now you can still make some rough assessments with it."

Translation: "At some point, there will be a benchmark version that is actually useful. But for now, accept this as the marketing gimmick that it is."
 
Reverend said:
Incorrect. From the way I read id's reply to me, it meant id preferred I liaise with NVIDIA to arrange a benchmarking session. Big difference.

Not so big.

Why should anyone have to liaise with any IHV for a benchmarking session using id's software?

It would not be an issue if you could go to ATI and ask THEM for a "benchmarking session" as well. Go ahead and try it. (Serious request... if id allows ATI to also orchestrate a benchmarking session, then my concerns are in fact unfounded.)

Again, you're either speculating or jumping the gun.

Indeed... provided that ATI is allowed to arrange a similar benchmarking session with id's blessing.

PS. 2AM, going to bed, so don't expect any further quick responses from me.

I'm more interested in a response from ATI and nVidia at this point.
 
RussSchultz said:
Humus, I think Natoma is trying to say that the "ARB2 path" is utilizing ARB2 extensions, which are presumably blessed by the ARB/OpenGL guiding committee as the shader input mechanism for OpenGL.

There is no such thing as "ARB2 extensions" either. It's either ARB or non-ARB.

RussSchultz said:
Heh. I take that back. Searching opengl.org for "ARB2" yields nothing.

What is meant by ARB2? Using GL2_xxxxxx functions? Or...? Could somebody fill us in and vanquish the perpetuated misunderstanding?

edit:sssss my precioussss

ARB2 is simply the name Carmack gave the second path that uses ARB extensions. The first ARB path uses other ARB extensions, such as GL_ARB_texture_env_combine, i.e. fixed-function (DX7-level) capabilities. The second path uses extensions such as GL_ARB_fragment_program and GL_ARB_vertex_program.
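To make the distinction concrete, here is a minimal C++ sketch (my illustration, not id's code) of how an engine can probe the extension string to decide which of those two paths the hardware supports, assuming a GL context is already current:

    #include <GL/gl.h>
    #include <cstring>
    #include <cstdio>

    // Naive substring check against the extension string; a robust version
    // would match whole space-delimited tokens.
    static bool hasExtension(const char* name)
    {
        const char* exts = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
        return exts && std::strstr(exts, name) != nullptr;
    }

    void pickRenderPath()
    {
        if (hasExtension("GL_ARB_vertex_program") &&
            hasExtension("GL_ARB_fragment_program"))
            std::puts("ARB2-style path: programmable vertex/fragment shaders");
        else if (hasExtension("GL_ARB_texture_env_combine"))
            std::puts("ARB path: fixed-function (DX7-level) combiners");
        else
            std::puts("neither path available");
    }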
 
Joe DeFuria said:
Why should anyone have to liaise with any IHV for a benchmarking session using id's software?

To avoid leaks. nV has the bench with them when they arrive at the site. They take it with them when they leave.

Joe DeFuria said:
It would not be an issue if you could go to ATI and ask THEM for a "benchmarking session" as well.

Perhaps if you substituted S3 or Matrox that would be a good point. Throwing the demo up on a website likely wouldn't be the best way to avoid leaks if that is the reason they are doing it this way ;)
 