G71

That looks a little excessive for a 525MHz G70 (the rumored "Ultra"), doesn't it? Unless G70 doesn't OC as well as we've been thinking. . .or they went higher than 525MHz.
 
geo said:
That looks a little excessive for a 525MHz G70 (the rumored "Ultra"), doesn't it? Unless G70 doesn't OC as well as we've been thinking. . .or they went higher than 525MHz.

Let's not forget that there are (well, maybe that should be 'were') video cards that had no need for a fan on the heatsink, but one was implemented anyway because it gives the impression of power. Yes, it's true. You'd think the chip that didn't need a fan would be superior by design, but customers associate cooling with power, and the look of the card is important in selling the product even though you never see it while you use it.
 
Hmm. . .I was under the impression that GeForces running between 70 and 90 degrees Celsius might actually get an extended lifespan/better performance because of the cooling that was applied. . .

When it comes to conducting current, cool is good, insanely hot is bad, right?
 
Maybe it's not just the GPU they are overclocking. After all, we've had reports that some AIBs are going to put 1200 mem on the GT. . .the GTX has got to be wildly bandwidth-limited, let alone an Ultra. . .
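
A back-of-the-envelope check on that bandwidth angle (just a sketch: the 256-bit bus is the known 7800 configuration, but the memory clocks below are the stock/rumored figures being bandied about, so treat them as assumptions):

```python
# Rough GDDR3 bandwidth: (bus bits / 8) bytes per transfer * effective (DDR) clock.
# 256-bit bus assumed, as on the 7800 series; clocks are stock/rumored, not confirmed.
def bandwidth_gb_s(effective_mhz: float, bus_bits: int = 256) -> float:
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(1000))  # GT at a stock ~1000MHz effective -> 32.0 GB/s
print(bandwidth_gb_s(1200))  # the rumored "1200 mem" GT        -> 38.4 GB/s
print(bandwidth_gb_s(1400))  # 700MHz GDDR3, mentioned below    -> 44.8 GB/s
```

Even 38.4 GB/s isn't much next to G70's ~10 GTexels/s of fill-rate, which is roughly the sense in which the GTX reads as bandwidth-limited.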
 
wireframe said:
I thought "dual GPU" when I saw that cooling design as well, but if you look at the pictures, especially the one showing the back of the card, you can see that there is only one GPU on there. The dual heat sinks are just there to dissipate the heat from the central area where the GPU is located, using heat pipes. So, I think it's just an example of extreme cooling for a single GPU solution. Might be exciting.
I meant dual-die on a single package.
 
IbaneZ said:
[three photos of the card and its cooler]

http://www.nvnews.net/vbulletin/showthread.php?p=657331#post657331

The guy who posted the pics thinks it's the 7800 Ultra. But whether it has anything to do with G71, only God knows. :)

The cooler should be pretty effective. Nice to see a big (92mm?) fan too, should be quiet enough.

http://www.lovehinaplus.com/blog/index.php?op=ViewArticle&articleId=433&blogId=2

Quadro FX4500 @ 430/525MHz, 512MB
 
caboosemoose said:
So we think this is definitely a Quadro-only design, and not indicative of what the revved 7800 will look like?


No idea. Such a cooling system honestly doesn't make sense for 430/525, that's true. But I don't see any extravagant frequencies claimed for that one right now.

***edit: by the way, many of us expect a 90nm version of GXX later this year. Judging from RSX, it could be anywhere in the >=550MHz ballpark, and I expect to see at least 700MHz GDDR for it. That one obviously has good chances of becoming a single-slot design.

What would break the G70 Ultra@110nm theory in my mind is the fact that NVIDIA went through all that hassle to design a single-slot G70 (think also dynamic clock frequencies), and an intermediate solution within 6 months doesn't really make all that much sense either.
 
Yeah, it certainly seems like overkill for a 430MHz board. I wouldn't be surprised to hear that Quadro boards require wider operating margins in terms of thermals than a GeForce card, but you'd think an NV40-style dual-slot cooler would be good enough - that thing looks like a lot of effort to go to just to ensure greater long-term stability on the Quadro board.
 
Ailuros said:
caboosemoose said:
So we think this is definitely a Quadro-only design, and not indicative of what the revved 7800 will look like?


No idea. Such a cooling system honestly doesn't make sense for 430/525, that's true. But I don't see any extravagant frequencies claimed for that one right now.

***edit: by the way, many of us expect a 90nm version of GXX later this year. Judging from RSX, it could be anywhere in the >=550MHz ballpark, and I expect to see at least 700MHz GDDR for it. That one obviously has good chances of becoming a single-slot design.

What would break the G70 Ultra@110nm theory in my mind is the fact that NVIDIA went through all that hassle to design a single-slot G70 (think also dynamic clock frequencies), and an intermediate solution within 6 months doesn't really make all that much sense either.


PS3 Reference Tools
(Cell 3.2 GHz / RSX / XDR 512 MB / BD/DVD/CD drive / HDD)

Maybe RSX gets XDR RAM.
 
The GXX@90nm part will most likely get the highest-clocked RAM available at affordable prices by the end of this year. I said at least 700MHz GDDR (albeit I forgot the "3" there :p ).
 
DaveBaumann said:
Anyone pinpoint what the vertex shaders are clocked at?

It doesn't say, at least not between the letters I can't read at the link above. Is that a hint?
 
Ailuros said:
What would break the G70 Ultra@110nm theory in my mind is the fact that NVIDIA went through all that hassle to design a single-slot G70 (think also dynamic clock frequencies), and an intermediate solution within 6 months doesn't really make all that much sense either.

Yes, this is the tension: historical analysis strongly suggests that either the Ultra @110nm appearing Sep-ish has been "a mind job" all along, or 90nm G70 is further out than folks have been assuming. . .and the latter would suggest unlikely things about RSX (assuming RSX is much more similar to 90nm G70 than, say, C1 is to R520).
 
geo said:
Ailuros said:
What would break the G70 Ultra@110nm theory in my mind is the fact that NVIDIA went through all that hassle to design a single-slot G70 (think also dynamic clock frequencies), and an intermediate solution within 6 months doesn't really make all that much sense either.

Yes, this is the tension: historical analysis strongly suggests that either the Ultra @110nm appearing Sep-ish has been "a mind job" all along, or 90nm G70 is further out than folks have been assuming. . .and the latter would suggest unlikely things about RSX (assuming RSX is much more similar to 90nm G70 than, say, C1 is to R520).

I'm not aware of NVIDIA's plans in that regard, but to me it makes way more sense to release two single-slot designs within, let's say, 6 months, instead of 3 high-end GPUs where the second could end up being a dual-slot design and would additionally steal some of the wind out of the GXX@90nm's performance sails.
 
Ailuros said:
What would break the G70 Ultra@110nm theory in my mind is the fact that NVIDIA went through all that hassle to design a single-slot G70 (think also dynamic clock frequencies), and an intermediate solution within 6 months doesn't really make all that much sense either.

Well, they did both single- and dual-slot versions of NV40, didn't they? I'd be very, very surprised if there wasn't a dual-slot version of G70 (even if it's given another name) that launches to compete with R520. The only thing I wonder about is the difference those dual-slot coolers make. How much more effective are they, or is there as much "show" as "go" in using a dual-slot cooler?
 
Well, they did both single- and dual-slot versions of NV40, didn't they?

But only because NV40@400MHz (130nm) wasn't possible as a single-slot design back then.

I'd be very, very surprised if there wasn't a dual-slot version of G70 (even if it's given another name) that launches to compete with R520.

G70 has 10.32 GTexels/s of theoretical fill-rate. To reach that fill-rate, R520 would have to be clocked at 645MHz. And even assuming ATI managed to clock higher than that, it would never reach the 30% fill-rate advantage the X800 XT PE had over NV40.
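
As a sanity check on those numbers, here is the arithmetic spelled out (a sketch: the 24 TMUs for G70 and 16 for NV40/X800 XT PE are the known configurations, while 16 for R520 is an assumption based on the rumors of the time):

```python
# Theoretical fill-rate = texture units * core clock.
# Unit counts: G70 = 24 TMUs, NV40 / X800 XT PE = 16, R520 = 16 (assumed/rumored).
def fillrate_gtexels(tmus: int, core_mhz: float) -> float:
    return tmus * core_mhz / 1000.0

print(fillrate_gtexels(24, 430))   # G70 @ 430MHz               -> 10.32 GTexels/s
print(fillrate_gtexels(16, 645))   # clock R520 needs to match  -> 10.32 GTexels/s
xt_pe = fillrate_gtexels(16, 520)  # X800 XT PE @ 520MHz        ->  8.32 GTexels/s
nv40 = fillrate_gtexels(16, 400)   # 6800 Ultra @ 400MHz        ->  6.40 GTexels/s
print(f"{(xt_pe / nv40 - 1) * 100:.0f}%")  # -> 30%, the advantage cited above
```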

G70 was launched to compete against what, exactly? I'm not excluding the possibility, I just can't find a good reason for a higher-clocked dual-slot version so far. Even more so if the ultra-high-end R520 turns up with a dual-slot cooling system: NVIDIA would have a marketing advantage against it.

How much more effective are they, or is there as much "show" as "go" in using a dual-slot cooler?

If they were just there for decorative reasons, I think the IHVs would spare themselves the extra expense and "bulkiness".
 
It has a freaking genlock connector on it; it's a Quadro. Don't you think it's possible that the Inq got this picture before anyone else and then, thinking it was some G71, built the article around the pictures?
 