geo said:That looks a little excessive for a 525MHz G70 (the rumored "Ultra"), doesn't it? Unless G70 doesn't OC as well as we've been thinking... or they went higher than 525MHz.
wireframe said:I thought "dual GPU" when I saw that cooling design as well, but if you look at the pictures, especially the one showing the back of the card, you can see that there is only one GPU on there. The dual heat sinks are just there to dissipate the heat from the central area where the GPU is located, using heat pipes. So I think it's just an example of extreme cooling for a single-GPU solution. Might be exciting.
I meant dual-die on a single package.
IbaneZ said:
http://www.nvnews.net/vbulletin/showthread.php?p=657331#post657331
The guy who posted the pics thinks it's the 7800 Ultra. But whether it has anything to do with G71, only God knows.
The cooler should be pretty effective. Nice to see a big (92mm?) fan too, should be quiet enough.
Ailuros said:http://www.lovehinaplus.com/blog/index.php?op=ViewArticle&articleId=433&blogId=2
Quadro FX4500 @ 430/525MHz, 512MB
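(For context, a quick back-of-envelope bandwidth figure for those clocks. The 256-bit GDDR3 bus is an assumption here, matching the 7800 GTX, so treat the result as a rough estimate rather than a spec.)

# Back-of-envelope memory bandwidth for the quoted Quadro FX4500 clocks.
# Assumption: 256-bit GDDR3 bus (same width as the 7800 GTX); 525 MHz GDDR3
# transfers on both clock edges, i.e. 1050 MT/s effective.
bus_width_bits = 256
mem_clock_mhz = 525
effective_mts = mem_clock_mhz * 2                            # DDR: two transfers per clock
bandwidth_gbs = effective_mts * 1e6 * (bus_width_bits / 8) / 1e9
print(f"~{bandwidth_gbs:.1f} GB/s")                          # ~33.6 GB/s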
caboosemoose said:So we think this is definitely a Quadro-only design, and not indicative of what the revved 7800 will look like?
Ailuros said:caboosemoose said:So we think this is definitely a Quadro-only design, and not indicative of what the revved 7800 will look like?
No idea. Such a cooling system honestly doesn't make sense for 430/525, that's true. But I don't see any extravagant frequencies claimed for that one right now.
***edit: By the way, many of us expect a 90nm version of GXX later this year. Judging from RSX, it could be anywhere in the =/>550MHz ballpark, and I expect to see at least 700MHz GDDR for it (rough bandwidth numbers follow this post). That one obviously has good chances of becoming a single-slot design.
What would break the G70 Ultra@110nm theory in my mind is the fact that NVIDIA went through all that hassle to design a single-slot G70 (think also dynamic clock frequencies); an intermediate solution within 6 months doesn't really make all that much sense either.
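(The bandwidth numbers referenced above: what 700MHz GDDR would mean next to the shipping 7800 GTX's 600MHz memory, again on an assumed 256-bit bus. A rough sketch, not a spec.)

# Hypothetical 90nm part with 700 MHz GDDR on an assumed 256-bit bus,
# compared with the 7800 GTX's 600 MHz memory for reference.
def mem_bandwidth_gbs(mem_clock_mhz, bus_width_bits=256):
    # DDR signalling: two transfers per clock; divide bits by 8 for bytes
    return mem_clock_mhz * 2 * 1e6 * (bus_width_bits / 8) / 1e9

print(f"600 MHz GDDR3 (7800 GTX today): ~{mem_bandwidth_gbs(600):.1f} GB/s")   # ~38.4 GB/s
print(f"700 MHz GDDR (speculated 90nm): ~{mem_bandwidth_gbs(700):.1f} GB/s")   # ~44.8 GB/s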
DaveBaumann said:Anyone pinpoint what the vertex shaders are clocked at?
Ailuros said:What would break the G70 Ultra@110nm theory in my mind is the fact that NVIDIA went through all that hassle to design a single-slot G70 (think also dynamic clock frequencies); an intermediate solution within 6 months doesn't really make all that much sense either.
sklaar said:maybe RSX gets XDR RAM
I think the CELL processor has 512 MB of XDR RAM on the PS3 devkit.
geo said:Ailuros said:What would break the G70 Ultra@110nm theory in my mind is the fact that NVIDIA went through all that hassle to design a single-slot G70 (think also dynamic clock frequencies); an intermediate solution within 6 months doesn't really make all that much sense either.
Yes, this is the tension: historical analysis strongly suggests that either an Ultra @110nm appearing Sep-ish has been "a mind job" all along, or 90nm G70 is further out than folks have been assuming... and the latter would suggest unlikely things about RSX (assuming RSX is much more similar to a 90nm G70 than, say, C1 is to R520).
Ailuros said:What would break the G70 Ultra@110nm theory in my mind is the fact that NVIDIA went through all that hassle to design a single-slot G70 (think also dynamic clock frequencies); an intermediate solution within 6 months doesn't really make all that much sense either.
Well, they did both single and dual slot versions of NV40, didn't they?
I'd be very, very surprised if there wasn't a dual-slot version of G70 (even if it's given another name) that launches to compete with R520.
How much more effective are they, or is there as much "show" as "go" in using a dual slot cooler?
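(A rough way to frame the "show vs. go" question: die temperature scales with power times the cooler's junction-to-air thermal resistance, so a dual-slot heat-pipe design only buys you as much as it lowers that resistance. All numbers below are illustrative assumptions, not measurements of any actual G70/Quadro cooler.)

# Illustrative only: assumed power and thermal resistances, not measured values.
ambient_c = 40.0            # assumed air temperature inside the case (C)
gpu_power_w = 100.0         # assumed GPU board power (W)
coolers = {
    "single-slot": 0.45,                 # assumed C/W, junction to air
    "dual-slot w/ heat pipes": 0.30,     # assumed C/W
}
for name, r_th_c_per_w in coolers.items():
    die_temp_c = ambient_c + gpu_power_w * r_th_c_per_w
    print(f"{name}: ~{die_temp_c:.0f} C at {gpu_power_w:.0f} W")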