Next NV High-end

Maybe rather than more pipelines they will just use the 90nm process to make the core cheaper and (hopefully) faster in the process. A 24-pipe G70 at 600MHz would be fine and dandy, especially if it costs them less per core to make. Then again, 90nm is a more expensive process, so I'm not sure how much cheaper it would actually work out per core.
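Just to put rough numbers on the "cheaper per core" idea (a back-of-envelope sketch of my own; the 333mm^2 starting die size is only an assumed figure, not an official one): an ideal optical shrink scales die area with roughly the square of the node ratio, so going from 110nm to 90nm would take about a third off the die, wafer-cost differences aside.

```c
/* Back-of-envelope die-shrink math -- illustrative only, assumed numbers. */
#include <stdio.h>

int main(void) {
    double old_node = 110.0;   /* nm */
    double new_node = 90.0;    /* nm */
    double old_die  = 333.0;   /* mm^2 -- assumed figure for a 110nm G70 */

    /* Ideal optical shrink: area scales with the square of the node ratio. */
    double scale   = (new_node / old_node) * (new_node / old_node);
    double new_die = old_die * scale;

    printf("area scale factor: %.2f\n", scale);              /* ~0.67 */
    printf("hypothetical 90nm die: ~%.0f mm^2\n", new_die);  /* ~223 mm^2 */
    return 0;
}
```

Of course that assumes a perfect shrink and ignores that 90nm wafers cost more per unit area, which is exactly the "not sure how much cheaper per core" caveat.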
 
But that transition would cost quite a bit; I don't think it would be a good investment. I'd rather wait a couple of months and go with the next gen (as I suppose they will do).
 
Chalnoth said:
and no PC game is going to be made for a 7800 GTX as a baseline.

In two years it could be the baseline.

As the GTX has shown, it performs a lot better with a faster CPU, i.e. an X2. Couldn't it be possible the 7800 GPU in the PS3 will perform a lot better because the Cell is a lot better than current CPUs (including the X2)?

Its performance in the PS3, I think, will be quite a lot faster than it is in its current PC state.
 
Chalnoth said:
That doesn't seem right to me. Consider that we've known that Sony would use Cell for how many years now? About two years or somesuch? It certainly wouldn't be challenging for Sony to put together devkits with virtual machines for the Cell processor (and maybe IBM would help them with that, anyway). It should be pretty easy for, say, a quad-processor machine running a VM to come close to approximating a Cell processor.

So I really don't see why Sony wouldn't have had some devkits out to select developers for years.

Like 3 years before? LOL

We do have PS3 dev kits also, and we’ve brought up some basic stuff on all the platforms.

Hardware-wise, there’s again a lot of marketing hype about the consoles, and a lot of it needs to be taken with grains of salt about exactly how powerful it is. I mean everyone can remember back to the PS2 announcements and all the hoopla about the Emotion Engine, and how it was going to radically change everything, and you know it didn’t, its processing power was actually kind of annoying to get at on that platform.

But if you look at the current platforms, in many ways, it’s not quite as powerful as it sounds if you add up all the numbers and flops and things like that. If you just take code designed for an x86 that’s running on a Pentium or Athlon or something, and you run it on either of the PowerPCs from these new consoles, it’ll run at about half the speed of a modern state of the art system, and that’s because they’re in-order processors, they’re not out-of-order execution or speculative, any of the things that go on in modern high-end PC processors. And while the gigahertz looks really good on there, you have to take it with this kind of “divide by two” effect going on there.

Now to compensate for that, what they’ve both chosen is a multi-processing approach. This is also clearly happening in the PC space where multi-core CPUs are the coming thing.

So the returns on multi-core are going to be initially disappointing, for developers or for what people get out of it. There are decisions that the hardware makers can choose on here that make it easier or harder. And this is a useful comparison between the Xbox 360 and what we’ll have in the PC space and what we’ve got on the PS3.

And there is some truth to that; there will be the developers that go ahead and have a miserable time, and do get good performance out of some of these multi-core approaches, and CELL is worse than others in some respects here.

But I do somewhat question whether we might have been better off this generation having an out-of-order main processor, rather than splitting it all up into these multi-processor systems.

It’s probably a good thing for us to be getting with the program now, the first generation of titles coming out for both platforms will not be anywhere close to taking full advantage of all this extra capability, but maybe by the time the next generation of consoles roll around, the developers will be a little bit more comfortable with all of this and be able to get more benefit out of it.

http://www.beyond3d.com/forum/showthread.php?p=543232#post543232
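To make the in-order vs. out-of-order point a bit more concrete, here's a toy C sketch (my own illustration, not anything from the talk; the function names are made up): a single dependency chain is the kind of naive code that stalls an in-order core, and breaking it into independent accumulators by hand is the sort of extra scheduling work these console CPUs push back onto the developer.

```c
#include <stddef.h>

/* One long dependency chain: every add waits on the previous result. An
 * out-of-order PC core can still run ahead and issue the independent loads
 * early; an in-order console core mostly just stalls, which is where the
 * "divide by two" feel comes from on naive code. */
double sum_dependent(const double *a, size_t n) {
    double s = 0.0;
    for (size_t i = 0; i < n; ++i)
        s += a[i];                      /* next iteration depends on s */
    return s;
}

/* Hand-scheduled version: four independent accumulators expose the
 * parallelism explicitly, so even an in-order core can keep issuing. */
double sum_unrolled(const double *a, size_t n) {
    double s0 = 0.0, s1 = 0.0, s2 = 0.0, s3 = 0.0;
    size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        s0 += a[i];
        s1 += a[i + 1];
        s2 += a[i + 2];
        s3 += a[i + 3];
    }
    for (; i < n; ++i)                  /* leftover elements */
        s0 += a[i];
    return (s0 + s1) + (s2 + s3);
}
```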


I mean bandwidth from the CPU to the GPU. This is being cited as the thing nVidia had to change most about their GPU.

And again, that bandwidth has been fine-tuned mostly for 720p.
 
Well, I am expecting an Ultra with memory clocked at 1.6GHz (effective). As for core speed, that's trickier to predict. My feeling is that there isn't quite as much headroom as some suggest. The 7800 GTX's 430MHz clock is probably set just below where the power consumption and heat dissipation curve gets really steep. Given the margins that reference boards always have, my guess would be 475-500MHz. Still, combined with the much faster memory (and perhaps a slab of extra performance from the 81.26 driver) that will probably be enough to beat R520 XT (well, except in some very particular rendering scenarios...).
 
caboosemoose said:
As for core speed, that's trickier to predict. My feeling is that there isn't quite as much headroom as some suggest. The 7800 GTX's 430MHz clock is probably set just below where the power consumption and heat dissipation curve gets really steep. Given the margins that reference boards always have, my guess would be 475-500MHz.

Not according to Anand. XFX GTX @ 500 (+16%) on the core, and ~ 5% increases in heat and power consumption. And this is still on single-slot cooling.

http://www.anandtech.com/video/showdoc.aspx?i=2500&p=4
 
Well, I hope we see NV 90nm high-end (I'm about that close to just calling it "G75" for convenience until someone tells me different) no later than when we see R580, whenever that is. I'm very curious to see what conclusion we'll come to about "who made the right choices" on how to spend transistors and die size on the same process.
 
They have a solid card in the G70, so has anyone thought of late Dec or early '06 for a 90nm G70 with 8 quads/10 VS, higher RAM speeds and 512MB, a "7900GTX/ULTRA", until a late '06 launch with NV50?
 
I HIGHLY doubt that there will be any 32 pipeline cards this generation, highly. Maybe a refresh, but even then. That just seems way above any transistor budget right now for either company.
 
Oh, man, that one's worse than 'yes'. :LOL:

But if you're changing on R520 I'm gonna go get that rotten fruit out of the garbage, and ask to borrow that frying pan. . . :p

Edit: Ah, of course, the 48-pipe R580. :LOL:
 
Jawed said:

Well, it's really the wrong thread for it, but just for grins and giggles, oh technacious one, I'll give you one "fact" (not so much, but play along, 'kay?), and you give me another:

"Guess" 1: R520 is 295m transistors.

"Guess" 2: R580 is ???m transistors.

Edit: Edited for the quote mark impaired.
 
Skrying said:
I HIGHLY doubt that there will be any 32 pipeline cards this generation, highly. Maybe a refresh, but even then. That just seems way above any transistor budget right now for either company.

Care to explain? A 90nm G70 with 8 quads/10 VS/16 ROPs should be a fair amount smaller than the current incarnation on 110nm.
 
Skrying said:
I HIGHLY doubt that there will be any 32 pipeline cards this generation, highly. Maybe a refresh, but even then. That just seems way above any transistor budget right now for either company.
It all depends on how you count your pipes :)
 
overclocked said:
Care to explain? A 90nm G70 with 8 quads/10 VS/16 ROPs should be a fair amount smaller than the current incarnation on 110nm.

I didn't say size, I said transistor budget. Two different things, really. Complexity would go up another level with 8 quads, and therefore make it harder on Nvidia. I don't see a need for it for a good while, if ever, though.
 
geo said:
Oh, man, that one's worse than 'yes'. :LOL:

But if you're changing on R520 I'm gonna go get that rotten fruit out of the garbage, and ask to borrow that frying pan. . . :p

Edit: Ah, of course, the 48-pipe R580. :LOL:

Yeah, I hate those "blips" Dave does. Just thinking here, maybe Dave is still on "the missing quad/quads" in G70, hmmm... I thought Unwinder had nailed it long ago that there aren't any mystery things. Or maybe Dave is just teasing us with R520/580 or something else.
 