R300 the fastest for DoomIII, John Carmack Speaks Again

Status: Not open for further replies.
DaveB - sorry, didn't realise.

Doom - Chalnoth had already said he didn't want to continue, but you wanted to carry on, so I thought it best if it was carried on in another, more specific thread, that's all. I'm not saying I'm not interested in the comparison, as an 8500 owner.
 
DaveBaumann said:
So, were you one of the ones up all night at E3? :)

No... my handle here is a holdover from a previous job in the graphics industry (*cough* S3 *cough*). I'm strictly D3D now... but I do miss OpenGL.
 
ben6,

No, I never had any dealings with the FireGL guys.

Typedef Enum said:
Well, you're going to have to change your name to "Direct3D Guy," or similar :)

I'm keeping my current handle as it will (hopefully) help prevent me from being brainwashed... we all know OpenGL is the superior API, right? :)
 
OpenGL guy, ah cool. He used to work over at Diamond, then S3, and now ATI (went with the FireGL team, I guess...)
 
I can't speak for John Carmack. What I can tell you is that John is a long-time supporter of NVIDIA, and Doom III is being designed with NVIDIA graphics in mind. The first two public showings of DOOM III were done on NVIDIA GF3 and GF4 hardware (spreading the love?). John is on the record in his .plan file as saying that id's primary development platform is NVIDIA hardware and that NVIDIA's OpenGL drivers are his "gold standard". When Doom 3 is ready to ship, our next-generation architecture, known as NV30, will provide the ultimate platform for DOOM III and deliver the game the way it's meant to be played. No question about it. Besides, we have the four fastest products on the market right now (GF4 Ti 4600, 4400, 4200 128MB, and 4200 64MB) and have little to gain by pushing our fall part. That said, we feel very good about our product offering for the fall.

Hmm, all I can say is... what a bunch of hooey. If it's being developed with NVIDIA in mind, then why has JC stated multiple times that the ATi approach is better? The only issue he has had is with ATi's drivers, which are constantly improving.

He also specifically stated that ATi's next-gen part has the *perfect* feature set for Doom III, and that Nvidia is a half step behind. I totally HATE Nvidia's constant twisting of the truth, lies, double talk, slander, etc., etc... this is why they are the most loathsome company I am aware of.

When Doom 3 is ready to ship, our next generation architecture, known as NV30, will provide the ultimate platform for DOOM III and deliver the game the way it's meant to be played

That's great, Brian... too bad for YOU that ATi has ALREADY shown a product that is the PERFECT FIT FOR DOOM and ALREADY DELIVERS the ultimate Doom experience. Or didn't you see the pics...?

Not to mention I SERIOUSLY doubt that the NV30 is going to outdo what we have already seen.
 
Though I'm sure Brian didn't mean his words to be interpreted this way, consider his phrase "When Doom 3 is ready to ship. . .NV30. . .will provide the ultimate platform. . . ." With Doom 3 not shipping until next year, you'd think he would've said NV35 and not NV30. :eek:
 
Hellbinder[CE] said:
I can't speak for John Carmack. What I can tell you is that John is a long-time supporter of NVIDIA, and Doom III is being designed with NVIDIA graphics in mind. The first two public showings of DOOM III were done on NVIDIA GF3 and GF4 hardware (spreading the love?). John is on the record in his .plan file as saying that id's primary development platform is NVIDIA hardware and that NVIDIA's OpenGL drivers are his "gold standard". When Doom 3 is ready to ship, our next-generation architecture, known as NV30, will provide the ultimate platform for DOOM III and deliver the game the way it's meant to be played. No question about it. Besides, we have the four fastest products on the market right now (GF4 Ti 4600, 4400, 4200 128MB, and 4200 64MB) and have little to gain by pushing our fall part. That said, we feel very good about our product offering for the fall.

Hmm, all I can say is... what a bunch of hooey. If it's being developed with NVIDIA in mind, then why has JC stated multiple times that the ATi approach is better? The only issue he has had is with ATi's drivers, which are constantly improving.

Those statements are consistent with what JC has stated... except for "NV30 will provide the ultimate platform for DOOM III." What he has said is that the comparison between the NV30 and R300 will be very interesting.

Of course, I do believe him when he says that the NV30 will provide the ultimate platform for DOOM III...after all, nVidia's had the highest-performing parts on the market since the release of the TNT2 Ultra.
 
Hehe, John, not only are we expecting Nvidia to talk about the NV30 in more than general terms in the middle of a product cycle, but we expect them to comment on the refresh of the NV30, or the part after that? Just a silly thought.

Hellbinder, we'll see. I can't wait to see which will be the best card for Doom III among Matrox's eventual next card, the NV35, and the R400 (or whatever), as all three will be out before Doom 3 is likely to ship. Perhaps even a P11 (or whatever 3DLabs calls their next chip next year).

I don't foresee Doom 3 shipping before Q2 2003, and that's a long time.
 
Chalnoth,
Nothing lasts forever, especially in the technology field. I've already commented on your bold statements on Nvnews, which show your 'objectivity'. The Ti 500 was not the fastest card on the market until Nvidia released the GeForce 4; the Radeon 8500 was...
The Radeon 8500 held top spots in 3DMark, which EVERYONE seems to think is the 3D bible.
The Radeon 8500 was beating the Ti 500 in MOHAA, RTCW, Serious Sam, and Aquanox.
How you can boldly make statements like that is beyond me, and I hope ATI delivers so you can EAT your words ;)
 
OpenGL guy said:
If you are talking about the Quake/Quack issue, I am pretty sure I've commented on this before. Compare the benchmark results before the bug was fixed to after the bug was fixed. I think you'll see they are pretty much the same. In fact, I didn't even buy my 8500 until I knew the problem was resolved.

I don't have rose-colored glasses on, but it sounds like some people remember every bit of dirt no matter what.
Sigh. I'm glad you cleared up your misunderstanding with Kristof, so I don't want to reopen any closed aggravations, but I must reply when you overlook facts. The fact is, with the original driver, the 8500 showed a marked difference in benchmark scores between a quake and a quack executable. So there was an "issue" with the drivers, and, cheat or mistake, it did affect scores.

I'm not sure what to say--I fault you (though that was mostly directed toward Hellbinder) for overlooking truths, and you fault me for actually remembering them. I guess I can't win. I admire ATi's driver improvements, but I won't overlook problems, especially those that look like deliberate attempts to mislead consumers...

...Unless ATi's later drivers showed the same speed difference between quake/quack, but with the exact same image quality for both (specifically, no messed up MIP-map levels). If that happened, and I just missed that part of [H]'s follow-up, then I apologize for my ignorance.

Hellbinder[CE] said:
Hmm, all I can say is... what a bunch of hooey. If it's being developed with NVIDIA in mind, then why has JC stated multiple times that the ATi approach is better? The only issue he has had is with ATi's drivers, which are constantly improving.

He also specifically stated that ATi's next-gen part has the *perfect* feature set for Doom III, and that Nvidia is a half step behind. I totally HATE Nvidia's constant twisting of the truth, lies, double talk, slander, etc., etc... this is why they are the most loathsome company I am aware of.
[...]
That's great, Brian... too bad for YOU that ATi has ALREADY shown a product that is the PERFECT FIT FOR DOOM and ALREADY DELIVERS the ultimate Doom experience. Or didn't you see the pics...?

Not to mention I SERIOUSLY doubt that the NV30 is going to outdo what we have already seen.
Firstly, JC has not stated that the problem lies with the drivers. It may just be a hardware issue: the NV2x may be faster at low-level operations than ATi's part, which would explain why the GF4 still outperforms the 8500 despite its required extra passes.

Secondly, do you have an inside track on Doom 3, R300, and NV30 development? I doubt it, and that's why I'm annoyed with (UNNECESSARILY BOLDED) statements like "PERFECT FIT" and "SERIOUSLY doubt." You don't know, period--your diatribe above was solely speculation, certainly in fanboy territory from my perspective, and I'd appreciate it if you saved your venom for something provable. If the NV30 turns out to be faster than the R300, then it will assume the mantle of "perfect card for Doom 3," because, in the end, all we're concerned with is performance (at the same image quality).

And what "we have already seen" is a blurry demo video of an alpha game on alpha hardware with alpha drivers. That's not reason enough to base RIGHTEOUS INDIGNATION on.
 
Doomtrooper said:
The Ti 500 was not the fastest card on the market till Nvidia released the Geforce 4, the Radeon 8500 was...

1. 3DMark2k1 is not a game.
2. I only care about real game benchmarks
3. Try giving a full list of games where the Radeon won/lost...here's a far more complete list:

(From Tomshardware)
Aquanox: Ti500
Max Payne: Ti500
Quake 3: Radeon
Jedi 2: Radeon
3DMark2k1: Radeon
Giants: Ti500

(From Anandtech)
Black & White: Ti500
Quake 3: Ti500
Wolfenstein: Tie
Serious Sam: Ti500
Max Payne: Ti500
Unreal Tournament: Ti500

(From x-bit labs)
Villagemark: Radeon
Quake3: Ti500
Serious Sam: Ti500
3DMark2k1: Radeon

From these, it does seem like the GeForce3 Ti 500 won most of them (Especially if you throw out the synthetic benches...*cough* 3DMark *cough*)....and I'm sure I'd find the same result if I went to even more websites.
 
What if the Quake/Quack detection had improved image quality, would you call it a cheat? Screenshots of the original Radeon 64 DDR are all over the NET that also had the Quake reference in them (application detection). You will not find one review that says the original Radeon had poor IQ; in fact, every review states the IQ is superior.
Then the Radeon 8500 was released, the texture slider didn't work, and Kyle Bennett got tipped, IMO by Nvidia (yes, Nvidia, why else would someone just start looking through a driver :rolleyes: ), about application detection... they think they discovered the world, but if they had looked at Rage3D, this was discovered way before they did.
The only issue was that the IQ was screwed up on 5 textures, 5 specific textures to be exact.
So people can believe all they want; I still believe it was a driver flaw, maybe due to the 8500's different architecture, but it's certainly not something ATI debuted with the 8500.
 
Application detection for benchmarks is cheating. Benchmarks are meant to gauge relative performance across more programs than those benchmarked. Application detection skews the results, meaning that the Radeon 8500 will almost certainly perform worse, relative to other cards, in games at large than it does in the major benchmark programs.
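To make the mechanism behind this argument concrete, here is a minimal, purely hypothetical sketch of executable-name-based application detection, the behavior the Quake/Quack test probed: the names in `DETECTED_APPS` and the function `select_profile` are illustrative inventions, not any vendor's actual driver code. The point is that any "optimization" keyed on the binary's name silently disappears when the binary is renamed, which is exactly how renaming quake3.exe to quack3.exe exposed the difference.

```python
import os

# Hypothetical list of executables a name-keyed driver might special-case
# (illustrative only; not taken from any real driver).
DETECTED_APPS = {"quake3.exe", "3dmark2001.exe"}

def select_profile(exe_path: str) -> str:
    """Return the settings profile a name-keyed driver would pick."""
    name = os.path.basename(exe_path).lower()
    return "app-specific (tweaked)" if name in DETECTED_APPS else "default"

# Renaming the very same binary flips the result, so any benchmark gain
# tied to the name does not generalize to other programs.
print(select_profile("C:/Games/Quake3/quake3.exe"))  # app-specific (tweaked)
print(select_profile("C:/Games/Quake3/quack3.exe"))  # default
```

This is why a large quake-vs-quack score gap is evidence of detection: the workload is identical in both runs, and only the name the driver sees has changed.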
 
Chalnoth said:
Doomtrooper said:
The Ti 500 was not the fastest card on the market till Nvidia released the Geforce 4, the Radeon 8500 was...

1. 3DMark2k1 is not a game.
2. I only care about real game benchmarks
3. Try giving a full list of games where the Radeon won/lost...here's a far more complete list:

(From Tomshardware)
Aquanox: Ti500
Max Payne: Ti500
Quake 3: Radeon
Jedi 2: Radeon
3DMark2k1: Radeon
Giants: Ti500

(From Anandtech)
Black & White: Ti500
Quake 3: Ti500
Wolfenstein: Tie
Serious Sam: Ti500
Max Payne: Ti500
Unreal Tournament: Ti500

(From x-bit labs)
Villagemark: Radeon
Quake3: Ti500
Serious Sam: Ti500
3DMark2k1: Radeon

From these, it does seem like the GeForce3 Ti 500 won most of them....and I'm sure I'd find the same result if I went to even more websites.

Classic,

Since 85% of your benchmarks are from the 8500's debut, then again it shows your 'objectivity'... can a Ti 500 keep up with an 8500 in modern games like Jedi Knight or Serious Sam?

[attached benchmark graphs: page6.12.gif, page4.6.gif]


http://www.tomshardware.com/graphic/02q2/020522/ti4400_4600-13.html

There is no way you can say a Ti 500 is faster than a Radeon 8500 today... sure, in some highly Nvidia-optimized games like Dronez, but not now. The 8500 drivers have improved performance a LOT (see Wavy's graph showing the improvement). Let's not even include anisotropic performance...

Since Unreal Tournament is my game, I was one of the Rage3D members that identified the Vsync bug in Direct3D that was capping the frames. I just benched UT with 16X anisotropic @ 1024x768 with the Thunder demo... kinda blows away Tom's Radeon score of 83, doesn't it?
I also did Max Payne... kinda beats that one too, at Anand's...

[attached screenshots: shot0020.jpg, maxpane.jpg]
 
Chalnoth said:
Doomtrooper said:
The Ti 500 was not the fastest card on the market till Nvidia released the Geforce 4, the Radeon 8500 was...

1. 3DMark2k1 is not a game.
2. I only care about real game benchmarks
3. Try giving a full list of games where the Radeon won/lost...here's a far more complete list:

(From Tomshardware)
Aquanox: Ti500
Max Payne: Ti500
Quake 3: Radeon
Jedi 2: Radeon
3DMark2k1: Radeon
Giants: Ti500

(From Anandtech)
Black & White: Ti500
Quake 3: Ti500
Wolfenstein: Tie
Serious Sam: Ti500
Max Payne: Ti500
Unreal Tournament: Ti500

(From x-bit labs)
Villagemark: Radeon
Quake3: Ti500
Serious Sam: Ti500
3DMark2k1: Radeon

From these, it does seem like the GeForce3 Ti 500 won most of them (Especially if you throw out the synthetic benches...*cough* 3DMark *cough*)....and I'm sure I'd find the same result if I went to even more websites.

Okay, fine. How about some where the 8500 beats a Ti 4200 AND beats a Ti 4400 in some tests? Jedi Knight 2, anyone???

http://www6.tomshardware.com/graphic/02q2/020522/ti4400_4600-13.html


Serious Sam SE
http://www.xbitlabs.com/video/suma-ti4400/
(although this test helps the 8500 due to anisotropic filtering)

edit.... Dang it Doom, beat me to it.
 
Chalnoth said:
Application detection for benchmarks is cheating. Benchmarks are meant to gauge relative performance across more programs than those benchmarked. Application detection skews the results, meaning that the Radeon 8500 will almost certainly perform worse, relative to other cards, in games at large than it does in the major benchmark programs.

Are you trying to tell me that Nvidia doesn't do application detection... :rolleyes:
I owned a GeForce 3... between driver sets, either 3DMark was stellar and game performance was poor, or Quake 3 was faster with one set and 3DMark was poor.
Like I said before, dude, if the Quake 3 reference was raising IQ, would it be considered a cheat? No, it wouldn't, as how is improving something cheating?
ATI got caught, Nvidia didn't... yet... that's the only difference.
 