NVIDIA GF100 & Friends speculation

Any small change in the architecture is enough to pull it off, but binned chips from old batches that have been sold as GF100, that's just unnecessary.

I'm just saying that it would be stupid and give limited gain. I understand if some people think that's not reason enough to believe it won't happen.

Why not a new number? It's obviously different from the GF100 product if it's got more enabled cores and different clock speeds.

If you ask a marketing man if it's a new product, he'll probably say "it has a new number, doesn't it?"
 
Surely you jest. You're not familiar with normalization?
As you can see from the many flaming posts here most of the posters have no clue what normalization even is.

I mean the 5870 was pegged at "100%" and the 480/580 shows the relative gain/loss in percent vs the 5870.

Very common chart and one that nVidia uses when comparing their products to the competition.
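For anyone still unsure what that normalization means in practice, here's a minimal sketch; the FPS numbers are made-up placeholders for illustration, not real benchmark results:

```python
# Normalize benchmark results against a baseline card, the way
# vendor comparison charts do: the baseline is pegged at 100%,
# and every other card is shown relative to it.
# All FPS numbers here are illustrative placeholders.

raw_fps = {
    "HD 5870": 60.0,   # baseline card
    "GTX 480": 66.0,
    "GTX 580": 78.0,
}

baseline = raw_fps["HD 5870"]
normalized = {card: 100.0 * fps / baseline for card, fps in raw_fps.items()}

for card, pct in normalized.items():
    print(f"{card}: {pct:.0f}%")  # HD 5870: 100%, GTX 480: 110%, GTX 580: 130%
```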
 
Why? If the reduced power consumption is true (quite likely in my opinion), it only gets more likely, doesn't it?

That's a very good point, but I had assumed Nvidia would use that improved power envelope to lower heat and increase clock speed. Nvidia has to consider it if the 580 is going to get trumped by Cayman XT, and AMD can still do more with Antilles. Even worse if AMD can still do it with smaller chips.
 
As you can see from the many flaming posts here most of the posters have no clue what normalization even is.

I mean the 5870 was pegged at "100%" and the 480/580 shows the relative gain/loss in percent vs the 5870.

Very common chart and one that nVidia uses when comparing their products to the competition.

The thing is, both AMD and NV normally start the bar at 80%, so 160% looks more massive than when the bar starts at 0 and they keep the same graph size.
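To put numbers on that trick, here's a quick sketch (values are illustrative) of how much a truncated axis inflates the apparent difference between two bars:

```python
# How much taller does a 160% bar look next to a 100% bar,
# depending on where the axis starts? (Illustrative only.)

def apparent_ratio(a, b, axis_start):
    """Ratio of drawn bar heights for values a and b on an axis
    that begins at axis_start instead of 0."""
    return (a - axis_start) / (b - axis_start)

print(apparent_ratio(160, 100, 0))   # 1.6 -> honest axis starting at 0
print(apparent_ratio(160, 100, 80))  # 4.0 -> axis truncated at 80%
```

Same data, but the truncated chart makes a 1.6x lead look like a 4x lead.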
 
As you can see from the many flaming posts here most of the posters have no clue what normalization even is.

I mean the 5870 was pegged at "100%" and the 480/580 shows the relative gain/loss in percent vs the 5870.

Very common chart and one that nVidia uses when comparing their products to the competition.

Neither of you seem to understand what they were laughing at.
 
And there's always Antilles. I don't think we've seen a dual GF100 card, so a dual GF110 seems even less likely.

Right... it won't fit into a 300W power envelope, even downclocked and undervolted like the GTX 295 had to be.

Dual GF104 does make sense though and I'd rather have such a dual board than a GTX580. Graphics performance would be just slightly better than GTX580, but for compute apps (my interest) you get a massive boost in aggregate memory bandwidth. Zotac's been showing their dual GF104 prototype, but it'd be easy for NVidia to make an official one.
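The downclock/undervolt arithmetic behind that 300W argument can be sketched with the usual first-order model, dynamic power scaling roughly with f * V^2. All the numbers below are illustrative assumptions, not specs or measurements:

```python
# First-order estimate of whether two downclocked, undervolted GPUs
# fit a 300W board limit, assuming power is dominated by dynamic
# switching power (~ f * V^2) and ignoring static leakage.
# Every number here is an illustrative assumption, not a spec.

def scaled_power(p_base, f_ratio, v_ratio):
    """Scale a single-GPU board power by clock and voltage ratios."""
    return p_base * f_ratio * v_ratio ** 2

single_gpu_w = 244.0   # assumed single-GPU board power
f_ratio = 0.85         # assumed ~15% downclock
v_ratio = 0.92         # assumed ~8% undervolt

per_gpu = scaled_power(single_gpu_w, f_ratio, v_ratio)
dual = 2 * per_gpu
print(f"per GPU: {per_gpu:.0f} W, dual board: {dual:.0f} W")
```

Under these assumed ratios the dual board still lands above 300W, which is the gist of the objection: a big chip needs a drastic downclock/undervolt before two of them fit the envelope.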
 
Yes. To what extent remains to be seen.

Edit:
I guess not all noticed it, but even Charlie said this:

I honestly don't think so. Cayman is expected to be quite fast, and I believe it would make more sense for NVIDIA to push the GTX 580 as high as possible while remaining within the 300W envelope. So my bet is on >290W under Furmark.

If I'm wrong, you can rub my nose in it next week! :p
 
Somebody here mentioned a few days ago, that Cayman won't be drastically faster than Cypress. I can't find the post, unfortunately...


I don't believe that for a second. Given that 6870 is most of the way to 5870 on a smaller, cheaper chip, there's no way Cayman is going to be just a little bit faster than Cypress. I think it's going to be a significant jump, and it's going to put another year of pressure onto Nvidia. I really don't see Nvidia recovering the high ground for this cycle.
 
Why not a new number? It's obviously different from the GF100 product if it's got more enabled cores and different clock speeds.

If you ask a marketing man if it's a new product, he'll probably say "it has a new number, doesn't it?"

What's new compared to GF100? Clock speeds? Surely 512 cores isn't new, going by NVIDIA's own GF100 slides.

I think the line is quite distinct. Let's see how it turns out.
 