R300 the fastest for DoomIII, John Carmack Speaks Again

Joe DeFuria said:
I'll take your word for it. :) They were also launches with little or no fanfare, because those parts were not designed to be "faster" than the previous incarnation...just more cost efficient.

You don't have to. GF2MX was launched in the summer. TNT2 PRO and M64 were announced in June, but released in the fall.

But that would seem to conflict with what nVidia was trying to "win" with the Doom3 demo with, unless nVidia was purposely trying to supply a card that they know will never exist, just for demo purposes.

What would be wrong with that?
 
pascal said:
I think Kristof was trying to say that this next-generation card probably has early drivers needing some special attention (from both sides) to run with Doom3.

I think one quote from JC (which in a way proves OpenGL guy right) is important in that aspect:

JC said:
"The new ATI card was clearly superior. I don't want to ding NVidia for anything because NVidia has done everything they possibly could; but in every test we ran, ATI was faster."

So, he did more with the card than just showing off Doom3 at E3; it must have been functional enough for him to test it thoroughly against the high-speed NV25. The question is, what did he test? Just the features needed for Doom3? FSAA?

On a related note, did anyone who was at E3 and saw the Doom3 demonstration first-hand see any form of AA?

Thanks.
 
Take a look at the screenshots. 3 out of the 4 have some sort of AA. It's too dark to tell how many colour gradations there are (the jpg doesn't help either), but AA is definitely on.
 
Joe DeFuria said:
But that would seem to conflict with what nVidia was trying to "win" with the Doom3 demo with, unless nVidia was purposely trying to supply a card that they know will never exist, just for demo purposes.

What conflict?

A 0.13µm GF4 should be highly overclockable, so a GF4 on steroids should be easy to improvise.
 
On steroids or not, GF4 could not compete against an early revision of R300. nVidia needs NV30 bad if they want to compete in the high end.
 
OK, so it looks like the NV30 may be a bit late.

Could nVidia decide to delay the NV30 a bit, fill the gap with the NV28, and then release the NV30 when Microsoft releases DX9.1? A bit like what ATI did with the 8500: support a newer version of DX so they can say that their card has better features.

This would also give them additional time to get better yields and driver performance. Remember that the NV30 is a completely new architecture, and the drivers may well take time to reach their full speed.

Didn't the NV20 take something like 3-4 months before driver speed was up to scratch?
 
What would be wrong with that?

Nothing would really be wrong with it, unless nVidia also made some PR about it, stating or implying that the board running Doom3 would be available to the public.

What conflict?

A 0.13µm GF4 should be highly overclockable, so a GF4 on steroids should be easy to improvise.

There is no "technical" conflict, but there would be a product line-up conflict. A higher-performing, similarly priced (similar to the current Ti-4600 price) GF4 Ultra based on a 0.13 process is certainly technically feasible. The conflict would arise in product positioning if nVidia is going to ship the NV30 a relatively short time afterward.

In short: I do not see a scenario where nVidia launches a GeForce4 "ultra" in June, AND the NV30 in August. One or the other.
 
I'm wondering if more is being made of this than is warranted. nVidia will have a DX9 part to release at least very close to, if not at, the DX9 launch. They all but said as much at E3. (Sorry, wait for my nVidia E3 report for specifics, as I'll be quoting them verbatim.)
 
Nexus said:
So, he did more with the card than just showing off Doom3 at E3; it must have been functional enough for him to test it thoroughly against the high-speed NV25. The question is, what did he test? Just the features needed for Doom3? FSAA?

Since his needs were to show the demo off at E3, why would he do more than test functionality that would be shown in that demo?

Besides, since he cannot release the results of these tests anyway, it's pointless to conjecture on how thorough they may have been.

Given the fact that this is very early hardware, and that there *were* problems (supposedly the E3 demo was only running at medium quality because of this...), we have every reason to believe that the card in its current incarnation simply would not work very well for anybody actually trying to play the game (not to mention the game itself is most definitely very buggy right now...).
 
Bambers said:
Pete said:
BTW, I'd love to see someone post aniso and FSAA scores to compare with Chalnoth's, particularly with a level (bilinear) aniso playing field. Someone step up to the plate! :)

I would (I'd have to run the 1600x1200 tests blind, as my monitor only goes to 1280x1024), but I don't know which version of Q3 he is using or which demo. Post those and I'll have a go.

Preliminary values for actual play, though, are 80-140fps with 2xFSAA and 16x aniso, and 30-60fps with 4xFSAA and 16x aniso (both quality modes) at 1024x768x32.

-bump- :)

I assume on Doom3 the tests would most likely be quick benchmarks of different parts of the game (presumably with AA on, as there is AA in the screenshots).
 
McElvis said:
OK, so it looks like the NV30 may be a bit late.

We really do not have any conclusive evidence on this yet. nVidia still has a good 2-3 months to get the chip in working order. That's quite a ways off. And, by the way, we WON'T have any conclusive evidence on it, either. Not even nVidia truly knows whether it will be ready in time, though I'm sure they feel they can make the deadline. Since they've not come terribly late on any of their recent designs, chances are they'll be fine this time, too.

And don't forget that there were apparently some problems with the R300 that they did have, as it was stated that only medium details were used. That is, the R300 is most certainly not ready to ship yet.
 
jb said:
GeForce4 Ti 4200 64MB (no o/c)
Athlon 933MHz on nForce 415-D

1600x1200x32
No aniso: 107.3 fps
2x: 89.2 fps
4x: 76.2 fps
8x: 68.0 fps

My system

KT7A-Raid with KT764 BIOS
2 - 256 MB Kingston CAS2.5 SDRAM
AMD 1.333 GHz Socket A CPU
2 - IBM 7200 30 GB HDD in RAID 0 Array
Sound Blaster Live
Win2k w/SP2
ATI 8500 w/3286 Drivers

1600x1200x32
No Aniso
88.6
"High Aniso"
81.8
"Highest Ansio"
75.4
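As an aside, the relative cost of each aniso level can be worked out from figures like these. A quick sketch (the fps numbers are just the GeForce4 Ti 4200 results quoted above; the percentages are simple arithmetic, nothing measured):

```python
# Rough percentage hit of each anisotropic filtering level,
# relative to the no-aniso baseline at 1600x1200x32.
baseline = 107.3  # fps with no aniso (GeForce4 Ti 4200 figures above)

results = {"2x": 89.2, "4x": 76.2, "8x": 68.0}  # fps per aniso level

for level, fps in results.items():
    # Performance drop as a percentage of the baseline frame rate
    drop = (baseline - fps) / baseline * 100
    print(f"{level} aniso: {fps:.1f} fps ({drop:.0f}% below baseline)")
```

That works out to roughly a 17%, 29%, and 37% hit for 2x/4x/8x, which makes the 8500's sub-10% "High Aniso" drop look cheap by comparison (though the two cards' aniso quality modes aren't directly equivalent).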

Did you make sure that the texture and geometry quality were set to high?
 
I don't think anyone ever implied that the R300 is ready to ship. And it's also not obvious from the statements about the "problems" whether they were video-card related, code related, or a combination of both.

Not even nVidia truly knows whether it will be ready in time...

I agree with that. (Which is why I don't put large amounts of "faith" in the comments from the CEO.) I am certain that nvidia is AIMING to get NV30 shipped this fall. Whether or not they do, and if they do, whether or not it meets their internal specification targets, is anyone's guess.

Since they've not come terribly late on any of their recent designs, chances are they'll be fine this time, too.

I don't know how you can make that assumption of "not being late". For example, the nForce 615/620 was completely canned. Many think that the now-defunct GeForce3 Ti was a "stop gap" because the GeForce4 was late.

Add to that, the fact that everyone, including nVidia, is hyping the NV30 to be a "big departure" from their previous designs...that alone makes using recent history as an indication of execution of this chip a bit risky.
 
Joe DeFuria said:
I don't know how you can make that assumption of "not being late". For example, the nForce 615/620 was completely canned. Many think that the now-defunct GeForce3 Ti was a "stop gap" because the GeForce4 was late.

There are also strong indications that the problems in both of these situations were more related to the market atmosphere than anything else. Many people apparently believe that the 4Ti's were delayed solely because there was no real competition from ATI at the time. The nForce 615/620's may have been canned for similar reasons; we really have no reason to believe that these parts were canned due to engineering concerns.

But, my point was more that none of their recent "brand-new" designs have been significantly delayed. That means, by the way, the RIVA 128, the TNT, the GeForce, and the GeForce3. The TNT was even put out at much lower clock speeds than originally planned in order to release it in the fall (of '98, if I remember my timeline correctly).

It should just be obvious that there would be no reason for any of their refreshes to be delayed for manufacturing reasons, so I really don't believe that either the GF4 or nForce 615/620's were delayed/nixed (respectively) due to manufacturing concerns.
 
Chalnoth said:
And don't forget that there were apparently some problems with the R300 that they did have, as it was stated that only medium details were used. That is, the R300 is most certainly not ready to ship yet.

Where is the information that the decision to use Medium Details was caused by an ATI problem though?

From Shacknews: <JC> "We actually screwed up at E3 -- we should have been running it at high quality settings (uncompressed textures, anisotropic filtering), but we were chasing some problems the first day, and it got set back to medium quality. The problems had gone away, so we left it that way, rather than risk changing it back."
 
And it's also not obvious from the statements about the "problems" whether they were video-card related, code related, or a combination of both.

Seeing as Carmack states that ATi's part was faster in all respects, you would assume that this is a code issue, unless relatively tried-and-tested architectures such as NVIDIA's also have these board issues.

It should just be obvious that there would be no reason for any of their refreshes to be delayed for manufacturing reasons, so I really don't believe that either the GF4 or nForce 615/620's were delayed/nixed (respectively) due to manufacturing concerns.

NVIDIA have long relied on engineering processes advancing in line with their product development, and this has certainly been the case a number of times (TNT = .35µm, GF256 = .22µm, GF2 = .18µm, GF3 = .15µm); it was reasonable to expect .13µm to come on stream in a similar time period. I think NVIDIA had expected .13µm to be available for the initial GF4 release timeframe, but when that didn't happen they had to redesign it for the .15µm process. I think this slightly skewed their release, and they had to decide between releasing the GF3 Ti lineup or waiting another few months for the GF4 to be ready. I remember talking to NVIDIA PR in September, and even they didn't really seem to know what was going to happen (that was the impression I got). I think that because the Radeon 8500 was too much of a threat, they held the GF4 back a while longer, thus leaving themselves breathing room before the NV30.

A couple of other things to go with that: the only other time their refreshes have gone out of step was with the GF2 Ultra, and that was again due to delays with the NV20. Also, I still believe the NV2A was specced with .13µm in mind, and when they realised the process wasn't going to be ready and they'd need to use .15µm, that's when we saw the climbdown in expected clock speed (300MHz -> 250MHz -> 233MHz).
 
That means, by the way, the RIVA 128, the TNT, the GeForce, and the GeForce3. The TNT was even put out at much lower clock speeds than originally planned...

Well, we'll never know if the Riva 128 was "late". It was nVidia's first shipped product in many years. ;)

Yes, the TNT shipped at 90 MHz vs. a targeted 125 MHz, because nVidia released the TNT on a larger fab process than hoped, because the more advanced fab was not ready. Just a few months before product launch, nVidia's best guess was that the advanced process WOULD be ready, but by shipping time it wasn't. So the TNT-2 (next product cycle) was what the TNT-1 "was supposed to be".

I've heard the same of the GeForce. Most people were surprised at how low the clock speed of the original GeForce was, and how hot that chip was at that speed. Unlike the TNT situation, nVidia never disclosed their "internal target" clock-speed for GeForce. (They learned their lesson. ;)) So we'll never know for sure.

Then there's the NV2a (x-box) which also shipped at a lower clock speed than expected.

I would say that there is a history of nVidia not meeting their "internal targets" for these new chips, debuting at a lower clock speed than intended.

So YES, it is certainly possible that nVidia would push out an NV30 of "some form", even if it's not running at the spec they were planning to release it at. But then, any rumors we hear today about "expected performance" based on "insider knowledge of the specs" of the chip would have to be taken with a huge grain of salt.

It should just be obvious that there would be no reason for any of their refreshes to be delayed for manufacturing reasons, so I really don't believe that either the GF4 or nForce 615/620's were delayed/nixed (respectively) due to manufacturing concerns.

It's not obvious to me at all.

But if you assume that those products were shelved for marketing reasons and not "physical" reasons, then we can assume the same could happen to the NV30. A la: the GeForce4 Ti Ultra can be "competitive" with the R300, especially in today's "benchmarks", while the NV30 is soooo advanced that it would blow the R300 out of the water. So why not wait until the next product cycle to release the NV30? Why should nVidia launch it now?
 