Next NV High-end

Geo

Since we are all avid readers of B3D, on its face and between the lines, it goes without saying that by release day of R520 our response will largely be "Yawn. No surprises." :LOL: (That's only about 50% joke, btw.)

So, since R520 is nearly here, obviously it is time to turn to obsessing over next NV.

The first thing I want to know, since I hate awkward formulations, is what the code name for the 90nm high-end is. G75? G80? What?

You may also discuss amongst yourselves whether NV is likely to go with a "G70 Ultra first" strategy or a balls-to-the-wall "let's kill ATI with a 90nm 650MHz G7x immediately" approach.
 
No longer sure about the code name, but it still strikes me that I have yet to see a reliable source affirm that any fab has taped out a high-end 90nm NVDA GPU for the PC market.
 
I thought that the consensus here is that G80 is the first DX10 part, so we'll only be seeing G7x variations for the next 6 months.

Jawed
 
Jawed said:
I thought that the consensus here is that G80 is the first DX10 part, so we'll only be seeing G7x variations for the next 6 months.

Jawed

I think that's a fair statement. It's worth checking the CW's pulse from time to time too, don't you think? And given "fun with code names" from our friends at NV over the last 6 months, I'm just not confident about extrapolating forward until we've had some more experience with what they are doing on that score.
 
I don't think we should expect a 90nm version of the G70 before next year. Then I'd say there's a fair chance (oh, 60%) that we'll see one by April. Might be 32 pipelines and similar clocks, or 24 pipelines and significantly higher clocks.
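Just to put rough numbers on why those two configurations come out roughly even, here's a back-of-the-envelope sketch. It assumes the G70's 430MHz reference core clock and treats raw throughput as simply pipelines × clock, ignoring bandwidth and per-pipe efficiency, so it's illustrative only:

```python
# Rough fillrate parity check: treat throughput as pipelines x core clock
# (ignores memory bandwidth and per-pipe efficiency -- illustrative only).
g70_pipes, g70_clock = 24, 430        # assumed 7800 GTX reference: 24 pipes @ 430 MHz

throughput_32 = 32 * g70_clock        # 32 pipes at similar clocks
clock_for_24 = throughput_32 / g70_pipes   # clock a 24-pipe part needs to match it

print(f"32 pipes @ {g70_clock} MHz ~= 24 pipes @ {clock_for_24:.0f} MHz")
# -> a 24-pipe part would need roughly 570 MHz to match 32 pipes at G70 clocks.
```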

But we may still see a 20% or so improvement over the GTX in an upcoming 7800 Ultra much sooner. We'll see.
 
John Reynolds said:
20% seems rather too optimistic to me.
I don't know. Consider, if you will, that the GeForce 7800 GTX is both a single-slot design and has a very low power-consumption profile for a current high-end card. So it seems to me that nVidia has a lot of clocking room with the 7800 GTX if they go for a dual-slot design. I could be wrong, of course. I haven't seen any overclocking data on the 7800 GTX.
 
John Reynolds said:
20% seems rather too optimistic to me.

20% from reference sounds about right, tho in the high end of the range. Not 20% from the BFGs and such of the world.
 
geo said:
20% from reference sounds about right, tho in the high end of the range. Not 20% from the BFGs and such of the world.
Right, that's what I meant: from reference.
 
I think Nvidia already stated that by the time the PS3 was on the market there would be a more powerful chip for the PC (April, May?). I suppose RSX could be a good indication. They are now adapting the G70 to 90nm to make the RSX. I suppose the next high-end PC chip could be the same but with 32 pipes.
 
John Reynolds said:
20% seems rather too optimistic to me.

20% from reference might be a bit high. Going by 3DMark05, a GTX at 490/1350 is 10% faster than the reference version. Would another 10% require 550 on the core, assuming 3DMark05 isn't bottlenecked much by available memory bandwidth at default settings?
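A quick sanity check on that arithmetic, as a sketch only: assuming the GTX's 430MHz reference core clock and that performance keeps scaling with core clock at the same efficiency seen in the 490/1350 result, the numbers do land near 550MHz:

```python
# Back-of-the-envelope clock scaling (assumptions: 430 MHz reference core clock,
# performance gain stays proportional to core-clock gain at a fixed efficiency).
ref_core = 430.0        # assumed 7800 GTX reference core clock (MHz)
oc_core = 490.0         # overclocked core from the 3DMark05 example above (MHz)
oc_gain = 0.10          # observed ~10% performance gain at 490/1350

# Fraction of the clock increase that showed up as performance (~0.72 here).
efficiency = oc_gain / (oc_core / ref_core - 1.0)

# Core clock needed for a 20% gain over reference at that same efficiency.
target_gain = 0.20
needed_core = ref_core * (1.0 + target_gain / efficiency)
print(f"efficiency ~{efficiency:.2f}; core for +20% over reference: ~{needed_core:.0f} MHz")
# -> roughly 550 MHz, which lines up with the figure above.
```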
 
I see the 7800 Ultra being a possible 512MB, 525 core and 1400 memory card. I don't think it'll be that much more powerful than the 7800 GTX, and I'm still curious whether they'd switch over to 90nm on the high end and whether this would give them any issues at all.
 
John Reynolds said:
20% seems rather too optimistic to me.

If we're talking about 90nm and 550/700MHz, chances are high that it'll deliver at least that kind of difference. That is, of course, if they "just" increase frequencies and don't change anything else.

Overclocking the G70 by X% (both core and ram) gives me (with exceptions like Doom3) a performance increase of ~X%.

What I'm wondering, on the other hand, is whether they're bound in any way to SONY by their PS3 agreement. I did hear some rumours in the past that they promised SONY their fastest possible GPU. No idea if it's true or what exactly it translates into, so take it with a grain of salt.
 
Ailuros said:
What I'm wondering, on the other hand, is whether they're bound in any way to SONY by their PS3 agreement. I did hear some rumours in the past that they promised SONY their fastest possible GPU. No idea if it's true or what exactly it translates into, so take it with a grain of salt.
That would seem like a really bad thing for nVidia to agree to, and really stupid for Sony to ask for it.

Consider: no matter the power of nVidia's highest-end offering at the release of the PS3, PS3 games and demos will look better because the PS3 has more peak CPU power than current CPUs and developers can program straight "to the metal." So under no account should the PS3 appear to be weaker than the PC.

On the other hand, if nVidia holds back a high-end product, and ATI takes a clear performance lead before the PS3 is released, then Sony will lose out on the marketing muscle they might gain from sharing a graphics chip with the current performance/tech leader in the PC space.

This just seems like a lose-lose agreement to me.
 
Consider: no matter the power of nVidia's highest-end offering at the release of the PS3, PS3 games and demos will look better because the PS3 has more peak CPU power than current CPUs and developers can program straight "to the metal." So under no account should the PS3 appear to be weaker than the PC.

Assuming they started developing multi-threaded games for Cell now, when will they be done with those?

As for the rest, as I said, take it with a grain of salt; nonetheless, this is SONY we're talking about, and in all those years they haven't been interested in licensing graphics technology from anyone. In short I wouldn't be surprised, albeit I admit that it sounds like a strange clause for such a deal.
 
Ailuros said:
Assuming they started developing multi-threaded games for Cell now, when will they be done with those?
You don't think that some developers have been making games for the PS3 for a long time now?

And regardless of the multithreading capabilities, the PS3 still has vastly more bandwidth to the GPU, and no PC game is going to be made for a 7800 GTX as a baseline. So even a hypothetical single-threaded game written for the PS3 should look dramatically better than a PC game at the launch of the PS3.
 
Chalnoth said:
You don't think that some developers have been making games for the PS3 for a long time now?

I don't think they have had working hardware in their hands long enough to be even close to completion with multi-threaded games; unless of course you're trying to tell me that games are being developed in around 12 months or even less. I've read in several spots that we should expect dual-threading on CPUs in about 2 years from today, and that timeframe actually makes sense.

And regardless of the multithreading capabilities, the PS3 still has vastly more bandwidth to the GPU, and no PC game is going to be made for a 7800 GTX as a baseline. So even a hypothetical single-threaded game written for the PS3 should look dramatically better than a PC game at the launch of the PS3.

Cell running single-threaded code is, according to developers themselves, a major culprit. What great bandwidth, anyway? For 720p (1280*720) it's plenty, no doubt :rolleyes:
 
Ailuros said:
I don't think they have had working hardware in their hands long enough to be even close to completion with multi-threaded games; unless of course you're trying to tell me that games are being developed in around 12 months or even less. I've read in several spots that we should expect dual-threading on CPUs in about 2 years from today, and that timeframe actually makes sense.
That doesn't seem right to me. Consider that we've known that Sony would use Cell for how many years now? About two years or somesuch? It certainly wouldn't be challenging for Sony to put together devkits with virtual machines for the Cell processor (and maybe IBM would help them with that, anyway). It should be pretty easy for, say, a quad-processor machine running a VM to come close to approximating a Cell processor.

So I really don't see why Sony wouldn't have had some devkits out to select developers for years.

Cell running single-threaded code is, according to developers themselves, a major culprit. What great bandwidth, anyway? For 720p (1280*720) it's plenty, no doubt :rolleyes:
I mean bandwidth from the CPU to the GPU. This is reportedly the thing nVidia had to change their GPU the most to accommodate.
 
I don't expect to see any Ultra version of the GTX. They have a good-performing card selling like hot cakes; there's simply no reason for it. I see the prices falling dramatically if R5xx should turn out to be a truly great performer, but nothing more than that. And nV surely _can_ lower the prices as they wish, since the G70 has great yields on an older process and is thus cheap to produce.

I rather expect a new core mid next year, whatever it might look like.
 