R300 the fastest for Doom III, John Carmack Speaks Again

DaveBaumann said:
I don't think that's a certainty from what's being talked about here.

And talk of violating NDA's goes both ways (albeit one for the better and one for the worse).

Don't forget that ATI already formally announced its use of next-generation technology in the DOOM3 demo.
 
http://www.gamespy.com/e32002/pc/carmack/index2.shtml



Here's the quote from earlier in E3:

GameSpy: The world of video cards seems to change on a daily basis. What do you think of the current crop of cards on the market, and where do you see things heading? Are there any new cards that interest you? Where would you like to see things go?

Carmack: There are interesting things to be said about the upcoming cards, but NDAs will force me to just discuss the available cards.

In order from best to worst for Doom:

I still think that overall, the GeForce 4 Ti is the best card you can buy. It has high speed and excellent driver quality.

Based on the feature set, the Radeon 8500 should be a faster card for Doom than the GF4, because it can do the seven texture accesses that I need in a single pass, while it takes two or three passes (depending on details) on the GF4. However, in practice, the GF4 consistently runs faster due to a highly efficient implementation. For programmers, the 8500 has a much nicer fragment path than the GF4, with more general features and increased precision, but the driver quality is still quite a ways from Nvidia's, so I would be a little hesitant to use it as a primary research platform.

The GF4-MX is a very fast card for existing games, but it is less well suited to Doom, due to the lower texture unit count and the lack of vertex shaders.

On a slow CPU with all features enabled, the GF3 will be faster than the GF4-MX, because it offloads some work. On systems with CPU power to burn, the GF4-MX may still be faster.

The 128 bit DDR GF2 systems will be faster than the Radeon-7500 systems, again due to low level implementation details overshadowing the extra texture unit.

The slowest cards will be the 64 bit and SDR ram GF and Radeon cards, which will really not be fast enough to play the game properly unless you run at 320x240 or so.
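
As a rough aside, the pass counts and the DDR/SDR ranking in that quote come down to simple arithmetic. Below is a minimal C sketch of both calculations; it's purely illustrative (not id's code), the 166 MHz memory clock is just an era-typical assumption on my part, and the ceiling division only captures the lower bound of Carmack's "two or three passes":

#include <stdio.h>

/* Passes needed when a light requires `accesses` texture reads and the
 * hardware can sample at most `units_per_pass` textures per pass. */
static int passes_needed(int accesses, int units_per_pass)
{
    return (accesses + units_per_pass - 1) / units_per_pass; /* ceiling */
}

/* Peak memory bandwidth in MB/s: bytes per clock times clock rate,
 * doubled for DDR's two transfers per clock. */
static double bandwidth_mb_s(int bus_bits, double mem_mhz, int is_ddr)
{
    return (bus_bits / 8.0) * mem_mhz * (is_ddr ? 2.0 : 1.0);
}

int main(void)
{
    const int doom_accesses = 7; /* "the seven texture accesses that I need" */

    printf("8500-class, all seven in one pass: %d pass\n",
           passes_needed(doom_accesses, 7));
    printf("GF4-class, four units per pass:    %d passes\n",
           passes_needed(doom_accesses, 4));

    printf("128-bit DDR @ 166 MHz: %4.0f MB/s\n", bandwidth_mb_s(128, 166.0, 1));
    printf("128-bit SDR @ 166 MHz: %4.0f MB/s\n", bandwidth_mb_s(128, 166.0, 0));
    printf(" 64-bit SDR @ 166 MHz: %4.0f MB/s\n", bandwidth_mb_s(64, 166.0, 0));
    return 0;
}

Each extra pass means re-transforming and re-rasterizing the same geometry for every light, which is presumably why single-pass capability matters so much to a per-light renderer like Doom's.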
 
jb said:
OMG, I thought he made it very clear why he chose the R300, as the NV30 is not ready yet.

When he says it's not ready, does he mean 'it's physically not here' or 'it's still in a very alpha state along with its drivers, and wasn't ready in comparison to ATi's part'?

This also makes me wonder:

"The new ATI card was clearly superior. I don't want to ding NVidia for anything because NVidia has done everything they possibly could; but in every test we ran, ATI was faster."

If it was the NV25, what would 'everything they possibly could' amount to? Supplying a board and perhaps some recent drivers? It seems to me they were putting in more effort than that.
 
Some possibilities

1. JC violated his NDA with Nvidia
2. He was talking about NV25 versus R300
3. Or, a more interesting theory: Nvidia gave him permission to talk about their next generation in this instance; otherwise he would have mentioned it earlier with GameSpy
4. He doesn't have an NV30 to test.
 
Give me a break, guys...

he is OBVIOUSLY talking about the NV30. The R300 is not due to be released for several months. It doesn't even make LOGICAL SENSE to assume he is comparing the next generation to this generation. The next generation's superiority is an OBVIOUS GIVEN.

No he did not say it specifically. But the intent is clear, and the reason he said it is clear. He HAD to put to rest all the BS rumors that Nvidiots started spreading.
 
It's a small PR win for ATI... it could be that ATI asked him to talk about their next-gen part, or JC just loves talking about the latest gfx hardware tech, just like us geeks.

It does sound weird that JC would compare the R300 to a GF4 Ti 4600, but I think that is precisely what he is doing... IMHO the next NVIDIA part is going to be later (by a couple of months or more) than the R300, so even though developers may have it, drivers etc. may not be ready for it.

ATI have been saying for quite some time now that they will have the first DX9 card out....
 
Wasn't it nVidia that said they would announce their next-gen part in late August? I have not seen any date from ATI. AFAIK no developers have NV30 boards yet, but R300 boards have been out for quite some time now (and please correct me if I am wrong).
 
Hmm, maybe I am wrong, but I seem to remember reading a rebuttal from ATI which addressed the fact that most people thought the NV30 would be released first.

Whichever way it turns out, I still find it unlikely that JC had an NV30 unit to demonstrate DOOM III on in time for E3, no matter how 'hard' NVIDIA tried to make it a possibility... and even if they did, it could be that the NV30 was still slower than the R300.

One encouraging aspect of the updates from JC is that a few days ago he specified in order of performance:
1) Gf4 Ti4600
2) Radeon 8500
3) GF3
4) GF2 etc

A few weeks ago JC was disappointed by a bug (possibly a hardware bug) which made the Radeon 8500 slower than a GF3. It looks like that issue has been addressed now.

Whatever JC is comparing it to, the fact is that it WAS a next-gen ATI card and not a card from NVIDIA. Sounds very promising for ATI :)
 
- The fastest card in every way
- The ideal feature set
- The best quality
- Working now

R300 sounds excellent to me 8)
 
Like Wavey said, I've tried on several occasions to get JC to talk to me, but he has only ever emailed me once. I emailed him about a week ago and haven't heard anything back.

Regarding the topic - IMO JC didn't break any NDA with his statements. It's likely he's talking about the NV30.

Don't forget, however, the timeframes in which the NV30 and R300 will go retail, DOOM3's release date, and how NVIDIA always has faster refreshes. Call me ignorant, but I don't think ATI has such "faster refreshes".
 
Call me ignorant, but I don't think ATI has such "faster refreshes".

Traditionally they haven't. However, with the restructuring of their business over the past few years, it wouldn't surprise me to see them operating in a fashion much closer to NVIDIA's and doing this.
 
Reverend said:
Regarding the topic - IMO JC didn't break any NDA with his statements. It's likely he's talking about the NV30.
Yes, I'll go with Occam's razor on this one too.

But we're left with the question of whether the NV30 is performance-deficient or merely delayed relative to the R300. If it's the latter, I still don't think it's meaningful. Both companies have validated at TSMC, and even if ATI wanted to switch to UMC's process, I suspect the extra overhead would mean they wouldn't save any time. So, as a result of fab issues, the R300's design lead would likely disappear during the production ramp.

If, on the other hand, the R300 is just a flat-out better design, things will be much more interesting. It's about time someone seriously challenged nVIDIA. It would take more than one superior design for a true "changing of the guard" at the high end, but it would be an excellent start.

NVIDIA always has faster refreshes.
Indeed, this is their greatest strength, and why they will be difficult to displace.
 
I'm a bit astounded by some people's resistance to the possibility of the R300 being clearly better than the NV30. A lot of posts seem to be saying "it can't be the NV30, because it just couldn't be that the R300 beats it", never mind what the quoted text states. The alternative (which is, of course, possible) is that JC is being sloppy and misleading in his comments... though possible, I thought automatically assuming that, instead of taking a simple interpretation of what was clearly stated, was reserved for "Swedish swear word" sites. ;)

It still seems pretty clear that it isn't the Ti 4600... since, why wouldn't he mention it? nVidia certainly wouldn't mind him doing so instead of leaving the impression it was the NV30. Also, all that talk about "half a step behind" CAN be interpreted to mean the NV30 was a no-show, but would he be so irresponsible as to give such a clear performance endorsement if the issue was simply that the NV30 sample was broken, rather than unable to perform up to the R300?

In any case, nVidia may be able to up clock speeds or perform driver magic to boost performance, so it isn't a closed book; but I'm just astounded by the unfounded resistance to the idea that the R300 could simply be faster than a working NV30 at this juncture.
 
demalion said:
I'm a bit astounded by some people's resistance to the possibility of the R300 being clearly better than the NV30.

You shouldn't be. nVidia has been consistently putting out superior products to ATI's.
 
You shouldn't be. nVidia has been consistently putting out superior products to ATI's.

I think that some may see that as a matter of conjecture and not the blanket fact you seem to state. It's clear that NVIDIA have consistently been putting out faster products than ATi (with a minor blip between the Radeon 8500 and GeForce 4), but ATi's products have been sufficiently different in other areas (quality, features, video, etc.) for that not to be true for everyone.

However, I could see a significant shift in ATi when Dave Orton took the helm, and IMO that was starting to pay dividends from the Radeon 8500 onwards; 3dfx had also tried a similar shift with a management change, but they did it much later in their business decline and were far too damaged to do anything about it - ATi may have saved their bacon in time.
 
DaveBaumann said:
I think that some may see that as a matter of conjecture and not the blanket fact you seem to state.

Sorry, I was actually speaking in reference to what JC was talking about: performance. Yes, there are other reasons to consider an 8500 over a GeForce3, but none of those reasons ever held for me.
 
Chalnoth said:
Sorry, I was actually speaking in reference to what JC was talking about: performance. Yes, there are other reasons to consider an 8500 over a GeForce3, but none of those reasons ever held for me.

If performance is all you care about, then why don't you compare ATI's superior mobile products to nvidia's?

I mean, you can't just say "nvidia has been putting out superior products to ATI's" and not qualify what you are talking about. It's quite clear that it isn't true, even by your standards.
 