G71

Ailuros said:
But only because NV40@400MHz (130nm) wasn't possible as a single slot design back then.
Just as a G70@5xxMHz might not be possible as a single slot design.
 
Well, considering how high you can clock the 7800GTX with its trim cooler, the question is how much more you could get out of an equivalent card with a big beefy cooler like that?
 
Why did the 6800U need a huge cooler when so many 6800GTs clocked to 6800U speeds (and beyond) on their single slot cooler?

Jawed
 
Jawed said:
Why did the 6800U need a huge cooler when so many 6800GTs clocked to 6800U speeds (and beyond) on their single slot cooler?

Jawed

They had to cover the worst-case situations as well. You know, like badly cooled cases and cards on the outer limits of the bell-curve...
 
Ailuros said:
Well, they did both single and dual slot versions of NV40, didn't they?

But only because NV40@400MHz (130nm) wasn't possible as a single slot design back then.

I'd be very, very surprised if there wasn't a dual slot version of G70 (even if its given another name) that launches to compete with R520.

G70 has a theoretical fill-rate of 10.32 GTexels/s. To reach that fill-rate, a 16-pipe R520 would have to be clocked at 645MHz. Even if we assume ATI managed to clock higher than that, it will never reach the 30% fill-rate advantage the X800 XT PE had over the NV40.
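Roughly, the arithmetic behind that looks like this (a quick sketch; the 16-pipe R520 configuration is still just the rumoured spec):

```python
# Back-of-envelope texel fill-rate arithmetic. The 16-pipe R520 configuration
# is a rumour at this point, not a confirmed spec.

def fillrate_gtexels(pipes, core_mhz):
    # One bilinear texel per pipe per clock (theoretical peak).
    return pipes * core_mhz / 1000.0

g70_gtx = fillrate_gtexels(24, 430)        # 10.32 GTexels/s
r520_clock_needed = g70_gtx * 1000.0 / 16  # 645 MHz for a 16-pipe part to match
print(g70_gtx, r520_clock_needed)
```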

G70 was launched to compete against what, exactly? I'm not excluding the possibility; I just don't see a good reason for a higher clocked dual slot version so far. What's more, if the ultra high end R520 turns up with a dual slot cooling system, NVIDIA will have a marketing advantage against it.

How much more effective are they, or is there as much "show" as "go" in using a dual slot cooler?

If they were there for purely decorative reasons, I think the IHVs would spare themselves the extra expense and the "bulkiness".

Bottom line is this - they didn't use the Ultra name. That was very, very likely to leave space for an Ultra as and when necessary. The current 7800 is single slot because it doesn't need to be dual slot; it's that simple in my view. If ATI had had a faster card than the X850 when the 7800 launched, then the 7800 would have been dual slot. As it was, NV could launch a single slot 7800, show off about how power-miserly it is, how cool it runs and how clever they are for doing a high-end single slot card. Hats off to them, they played the 7800 very well indeed. But I think it's pretty obvious that if NV wanted, they could produce a dual slot G70 with a significantly higher core clock. Of course, the memory bandwidth problem remains. Indeed, the memory bandwidth problem will be very interesting when R520 launches.
Many people are making calculations regarding how high R520 would need to be clocked, if it has 16 pipes, to match or beat G70 in fill rate etc. That's really irrelevant, since G70 is basically bandwidth limited except when it's shader or HDR limited. So given that ATI will have access to pretty much the same memory as NV, it's awfully likely that R520 will have fairly similar performance to G70 in games like Doom 3, Far Cry etc. where G70 is bandwidth limited (without HDR in Far Cry, of course - oh, and I am assuming that the rumoured new memory interface/controller on R520 only provides a marginal advantage, otherwise all bets are off).
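To put rough numbers on the bandwidth argument, here's a back-of-envelope sketch. It assumes the GTX's 256-bit bus with 1.2GHz effective GDDR3, and a hand-wavy ~8 bytes of frame-buffer traffic per pixel - that last figure is a pure assumption and ignores compression, blending and texture fetches entirely:

```python
# Why raw core clock stops mattering once you are bandwidth limited.
# Bus width and memory clock are the 7800 GTX's; the bytes-per-pixel figure
# is an assumption for illustration only.

bus_bytes = 256 // 8                       # 256-bit bus
effective_mem_mhz = 1200                   # 600 MHz GDDR3, double data rate
bandwidth_gb_s = bus_bytes * effective_mem_mhz / 1000.0   # 38.4 GB/s

bytes_per_pixel = 8                        # assumed colour write + Z, no compression
sustainable_gpixels = bandwidth_gb_s / bytes_per_pixel    # ~4.8 GPixels/s
print(bandwidth_gb_s, sustainable_gpixels) # far below the 10.32 GTexels/s theoretical peak
```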
Where things will get interesting is in games that are shader limited on G70, which Battlefield 2, for instance, appears to be.
Finally, there's the HDR issue, both in terms of pure performance and AA compatibility. With all that in mind, I think it's pretty easy to imagine an R520 with only 16 pipes and clocked at well under 700MHz that delivers very similar performance to G70. I fancy it will beat G70 (in GTX trim) on balance, but that an Ultra G70 will give back the lead to NV pretty much immediately. Perhaps R580 vs whatever NV have going at that stage will be the closest fight of all.

Of course, I don't really know what I am talking about, but it was ever thus.
 
Many people are making calculations regarding how high R520 would need to be clocked, if it has 16 pipes, to match or beat G70 in fill rate etc. That's really irrelevant, since G70 is basically bandwidth limited except when it's shader or HDR limited. So given that ATI will have access to pretty much the same memory as NV, it's awfully likely that R520 will have fairly similar performance to G70 in games like Doom 3, Far Cry etc. where G70 is bandwidth limited (without HDR in Far Cry, of course - oh, and I am assuming that the rumoured new memory interface/controller on R520 only provides a marginal advantage, otherwise all bets are off).


Makes sense.

Where things will get interesting is in games that are shader limited on G70, which Battlefield 2, for instance, appears to be.
Finally, there's the HDR issue, both in terms of pure performance and AA compatibility. With all that in mind, I think it's pretty easy to imagine an R520 with only 16 pipes and clocked at well under 700MHz that delivers very similar performance to G70. I fancy it will beat G70 (in GTX trim) on balance, but that an Ultra G70 will give back the lead to NV pretty much immediately. Perhaps R580 vs whatever NV have going at that stage will be the closest fight of all.

If each R520 quad produces a similar result (in terms of arithmetic) to a G70 quad, then the clockspeed advantage of the former will balance things out. In any other case the latter has more units, except ROPs (which I hope ATI has optimized further this time).
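A quick sketch of that balancing act below; the 650MHz R520 clock and the one-ALU-per-pipe figure for R520 are assumptions, purely to show how the trade-off could go either way:

```python
# Hypothetical quad/clock trade-off. The 650 MHz R520 core clock and the
# one-ALU-per-pipe figure for R520 are assumptions for illustration only.

g70_quads, g70_mhz = 6, 430        # 24 pipes = 6 quads
r520_quads, r520_mhz = 4, 650      # rumoured 16 pipes = 4 quads

# Equal per-quad, per-clock arithmetic output: clock roughly offsets quad count.
print(g70_quads * g70_mhz, r520_quads * r520_mhz)                  # 2580 vs 2600

# If G70 issues 2 ALU ops per pipe per clock and R520 only 1 (assumption),
# the extra units win despite the clock deficit.
print(g70_quads * 4 * 2 * g70_mhz, r520_quads * 4 * 1 * r520_mhz)  # 20640 vs 10400
```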

Float HDR is important to me at least; I sure hope that R520 will be able to combine 64bpp HDR with MSAA. If only FP10 can be combined with MSAA, I'm not sure I'll be that interested.
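For a sense of why 64bpp (FP16 per channel) matters here, a rough dynamic-range comparison; the 3-bit-exponent/7-bit-mantissa layout I'm assuming for "FP10" is pure guesswork, since nothing official has been said about the format:

```python
# Rough dynamic-range comparison of FP16 (s10e5, i.e. 64bpp HDR with FP16 per
# channel) versus a hypothetical 10-bit-per-channel float. The 3-exponent /
# 7-mantissa, bias-3, no-sign layout assumed for "FP10" is a guess.

def max_normal(exp_bits, man_bits, bias, reserve_inf=True):
    # Largest representable normal value for a simple minifloat layout.
    max_exp_field = 2**exp_bits - (2 if reserve_inf else 1)
    return (2 - 2**-man_bits) * 2**(max_exp_field - bias)

fp16_max = max_normal(exp_bits=5, man_bits=10, bias=15)                    # ~65504
fp10_max = max_normal(exp_bits=3, man_bits=7, bias=3, reserve_inf=False)   # ~31.9
print(fp16_max, fp10_max)
```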

I read over and over again from others questions about why ATI hasn't released any preliminary performance numbers for it, etc. That's not usual practice, and it's no indication of anything in my book. What could be an indication, though, are developer reactions, since R520 prototypes have been in major ISVs' hands since March this year. They obviously won't comment on it yet, but that would be the only source from which something more reliable could leak out before the official announcement.

As for the hypothetical "lead", it's all about winning impressions, isn't it? Some of those were already won by releasing months earlier with immediate availability. To make things clearer: I don't know what NV really intends to do, but for the time being I don't see an absolute necessity, especially if they can have a 90nm Gxx variant on shelves before this year's holidays.
 
Hmmm, I can't get my head around the idea that there's going to be a 90nm G70 this year (for sale). That feels too soon for me. I would expect the first 90nm G70 chip to be more of a mainstream performance part in the mould of NV42 (which was the first large NV4x chip based on 110nm) followed next spring by a high end 90nm chip. But I also feel that there will be surprises from both players.
 
caboosemoose said:
Hmmm, I can't get my head around the idea that there's going to be a 90nm G70 this year (for sale). That feels too soon for me. I would expect the first 90nm G70 chip to be more of a mainstream performance part in the mould of NV42 (which was the first large NV4x chip based on 110nm) followed next spring by a high end 90nm chip. But I also feel that there will be surprises from both players.

Reported by B3D on March 4th 2005:

Michael Hara/NVIDIA:

Well, from an architecture standpoint we’re just still at the beginning of shader model 3.0. And we need to give the programmers out there some time to continue to really learn about that architecture. So in the spring refresh what you’ll see is a little bit faster versions...

Spring turned out to be summer 2005....

I think you’ll see the industry move up a little bit in performance. But I don’t think you’ll see any radical changes in architecture. I doubt you’ll see any radical changes in architecture even in the fall. When we came out with GeForce 6, we tend to create a revolutionary architecture about every actually two years. And then we derive from it for the following time. So even the devices that we announced this fall, that will be I think a lot more powerful than the ones we actually had a year ago. Architecturally we’re still in the shader model three type era.

Fall may turn out to be winter 2005....

If you look at when we go to 90, my guess will be is we’ll have one or two products this year going from 90 in the second half.

http://www.beyond3d.com/forum/viewtopic.php?t=20937
 
Ailuros said:
Float HDR is important to me at least; I sure hope that R520 will be able to combine 64bpp HDR with MSAA. If it's FP10 only that's combinable with MSAA I'm not so sure I'll be that much interested.

I'm not much into blind faith, but I do believe in past as prologue. When thinking about FP10 MSAA, so far as how good the results will look in scenarios available today and the near future (say the next two years or so), do you take into consideration what ATI did with FP24 when the conventional wisdom was all about FP32? I look back at the famous debate with Sireric and Reverend, and I see an ATI that clearly spent a great deal of skull sweat working thru the theory in advance as to how much "eyeball impact" the reduced precision was going to have for the lifetime of the card at the top end.

So I apply that to the FP10 MSAA discussion and am comforted. Am I kidding myself?
 
geo said:
I'm not much into blind faith, but I do believe in past as prologue. When thinking about FP10 MSAA, so far as how good the results will look in scenarios available today and the near future (say the next two years or so), do you take into consideration what ATI did with FP24 when the conventional wisdom was all about FP32? I look back at the famous debate with Sireric and Reverend, and I see an ATI that clearly spent a great deal of skull sweat working thru the theory in advance as to how much "eyeball impact" the reduced precision was going to have for the lifetime of the card at the top end.

So I apply that to the FP10 MSAA discussion and am comforted. Am I kidding myself?

I don't think throwing anything with an "FP-preset" into the same pot and drawing conclusions from there will work.

The conventional wisdom you're talking about applied merely to one spot in the pixel shader for ATI, and it aimed to replace the split FP16/FP32 precision of NV's approach.

Besides, it'll come down to what developers think of a specific implementation; even if some consider it a viable alternative, there are way too many games already under development that include 64bpp HDR.
 
caboosemoose said:
Ailuros said:
Well, they did both single and dual slot versions of NV40, didn't they?

But only because NV40@400MHz (130nm) wasn't possible as a single slot design back then.

I'd be very, very surprised if there wasn't a dual slot version of G70 (even if its given another name) that launches to compete with R520.

G70 has a theoretical fill-rate of 10.32 GTexels/s. To reach that fill-rate, a 16-pipe R520 would have to be clocked at 645MHz. Even if we assume ATI managed to clock higher than that, it will never reach the 30% fill-rate advantage the X800 XT PE had over the NV40.

G70 was launched to compete against what, exactly? I'm not excluding the possibility; I just don't see a good reason for a higher clocked dual slot version so far. What's more, if the ultra high end R520 turns up with a dual slot cooling system, NVIDIA will have a marketing advantage against it.

How much more effective are they, or is there as much "show" as "go" in using a dual slot cooler?

If they were there for purely decorative reasons, I think the IHVs would spare themselves the extra expense and the "bulkiness".

Bottom line is this - they didn't use the Ultra name.

And they also haven't released a G70 Quadro variant yet, and the card in this picture has a genlock connector, a feature which has absolutely zero use for home users.

http://www.theinquirer.net/?article=24837

I guess we'll find out at SIGGRAPH.
 
I'm not arguing that that card is a 7800 Ultra; I am simply arguing that a dual slot 7800 Ultra is very likely to appear at around the same time as R520. And given that NV has invested in making this heatsink and fan solution, I wouldn't be surprised to see it turning up on the 7800 Ultra.
 
The cooling solution for the Quadro FX4500 is from Leadtek, AFAIK. As for the missing "Ultra" in the name, that's still not enough of an indication of anything. They'd better get rid of that name altogether, since I never really liked it ;) :LOL:
 
If you look at the history, it seems to me that they only pull out an "Ultra" when they are in a tough position competitively. For instance, if I'm recalling correctly, GF3 & GF4 had no "Ultras".
 
geo said:
If you look at the history, it seems to me that they only pull out an "Ultra" when they are in a tough position competitively. For instance, if I'm recalling correctly, GF3 & GF4 had no "Ultras".

GF3 had the Ti500 I believe to counter the Radeon 8500. Same thing really.
 