NVIDIA Fermi: Architecture discussion

At least according to that supposed leak the 104 is faster.

Nvidia's AIBs said the 104 was "high end", which is a bit ambiguous. Typically "high end" sits below "enthusiast", so who knows. Either way, my guess is that the 14-pin beast is the faster config and likely dual-GPU. That would imply a single GPU that would be 5-10% faster than the 5870?

Of course we really know nothing. :)

The 'supposed leak' was a bad fake at best.
 
I agree.
Based on the CES footage, I'd guess GF100 is the dual-GPU enthusiast product and GF104 is likely the single GPU. It only makes sense to launch and demo the better part first to slow AMD sales while people wait for the mainstream high-end part.
 
I agree.
Based on the CES footage, I'd guess GF100 is the dual-GPU enthusiast product and GF104 is likely the single GPU. It only makes sense to launch and demo the better part first to slow AMD sales while people wait for the mainstream high-end part.

I disagree. First of all, Nvidia has been very fond of its "sandwich" approach to dual-GPU cards. And this looks to be a single-PCB card. I'm not saying they won't ever change their design philosophy, but they haven't up until now.

Secondly, if you look at the CES video, at around 27 seconds you can see the backside of the GF100. There appears to be one cluster of SMD components in the middle of the card, right where you'd expect to find them on a single-GPU card. And arrayed around that cluster you can see groups of more SMDs that appear to be for the memory. A typical layout for a single-GPU card.

And Nick Stam, the Nvidia narrator in that video, definitely called it a GF100.
 
Creig said:
I disagree. First of all, Nvidia has been very fond of its "sandwich" approach to dual-GPU cards. And this looks to be a single-PCB card. I'm not saying they won't ever change their design philosophy, but they haven't up until now.

They did, actually. There was a single-PCB dual-GPU version of the GTX 295.

Creig said:
Secondly, if you look at the CES video, at around 27 seconds you can see the backside of the GF100. There appears to be one cluster of SMD components in the middle of the card, right where you'd expect to find them on a single-GPU card. And arrayed around that cluster you can see groups of more SMDs that appear to be for the memory. A typical layout for a single-GPU card.

And Nick Stam, the Nvidia narrator in that video, definitely called it a GF100.

Yeah, GF100 seems to be the chip inside the single-GPU high-end card, i.e. the GeForce 380.

The dual-GPU card seems to be referred to as GF104.
 
Creig said:
I disagree. First of all, Nvidia has been very fond of its "sandwich" approach to dual-GPU cards. And this looks to be a single-PCB card. I'm not saying they won't ever change their design philosophy, but they haven't up until now.
Secondly, if you look at the CES video, at around 27 seconds you can see the backside of the GF100. There appears to be one cluster of SMD components in the middle of the card, right where you'd expect to find them on a single-GPU card. And arrayed around that cluster you can see groups of more SMDs that appear to be for the memory. A typical layout for a single-GPU card.

And Nick Stam, the Nvidia narrator in that video, definitely called it a GF100.

Take a look at the GTX 295... 2 GPUs on one PCB ;)
 
Does it really matter? As long as the performance is there, who cares whether it's 448, 512, 384, 1024 or 42? Just get the damn thing out and force ATI to lower their damn prices.
I don't think Nvidia will be able to beat ATI with only 448 cores; it needs the full 512 cores and it needs to run them at high clocks. That could ensure a significant performance advantage, like 30% or 40% over the HD 5870.
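
As a rough illustration of how the core-count and clock argument combines, here is a minimal back-of-envelope sketch; the 1200 MHz and 1400 MHz hot clocks and the linear scaling model are placeholder assumptions, not announced specifications.

```python
# Naive linear model: throughput ~ cores * shader ("hot") clock.
# The clock values below are placeholder assumptions, not announced specs.

def relative_throughput(cores, hot_clock_mhz, base_cores=448, base_clock_mhz=1200):
    """Throughput relative to a hypothetical 448-core part at 1200 MHz."""
    return (cores * hot_clock_mhz) / (base_cores * base_clock_mhz)

# A full 512-core part at a higher (assumed) hot clock vs the 448-core config:
print(f"{relative_throughput(512, 1400):.2f}x")  # -> 1.33x under this naive model
```

In this toy model the extra 64 cores alone are worth only about 14%, so most of a 30-40% gap would have to come from clocks or architectural efficiency.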
 
3D multi-monitor + shutter glasses + peripheral vision = wtf?

So Nvidia wants to one-up AMD's Eyefinity, but the whole point of a three-monitor setup is to provide information to your peripheral vision for better immersion and a gaming advantage, and wearing 3D shutter glasses negates that advantage. Not only do the glasses physically block out the side screens, but one's eyes need to continually change their focal length to access information at different 'depths' on the screen. Focusing the eyes to different depths takes time and slows down the amount of information one can access and process - just on the main screen - and therefore reduces the capacity, the attention units, available for peripheral vision data. Add in the necessity of turning one's head to even workably see the peripheral data and then sorting through the depth field for information ... it's a built-in and, being biomechanical, substantial lag penalty.

This may have a temporary wow factor, but imagine you're playing on this in an FPS head-to-head against someone with an Eyefinity setup. Against someone of comparable skill, you'd be slaughtered. It might be workable in slow-moving RPGs and the like, but I don't see it being at all useful in anything fast-moving.

At some future point a really workable wrap around 3D gaming experience might become feasible but Nvidia's present technology is a very long way from that.
 
This may have a temporary wow factor, but imagine you're playing on this in an FPS head-to-head against someone with an Eyefinity setup. Against someone of comparable skill, you'd be slaughtered. It might be workable in slow-moving RPGs and the like, but I don't see it being at all useful in anything fast-moving.
Well, if you don't like it, you can throw the glasses away and run the setup in 2D. Problem solved!
 
I was expecting this from NV, but the question remains: will they be able to support 3 monitors with only one card, or will you have to SLI them?
 
Heh, one side used to complain about the bezels, which will somehow no longer be an issue once NV does surround gaming. "I don't know how Nvidia does it, it's like they make the bezels magically disappear." Now the other side can complain about the 3D-glasses rims blocking peripheral vision. There's always ammo.
 
Heh, one side used to complain about the bezels, which will somehow no longer be an issue once NV does surround gaming. "I don't know how Nvidia does it, it's like they make the bezels magically disappear." Now the other side can complain about the 3D-glasses rims blocking peripheral vision. There's always ammo.

Hm, you can buy three projectors. :LOL:

I was expecting this from NV, but the question remains: will they be able to support 3 monitors with only one card, or will you have to SLI them?

I think they will use something like Matrox's TripleHead.
 
Only in that single "leak". GF104 should be a midrange part (or the next one down from GF100, anyway).

What I meant was that the GF104 codename was used to reference the Fermi-based GeForce X2 card.

I speculated in the past that if GF100 (the single-chip high-end part) was quite a bit faster than the HD 5870 (let's assume LegitReviews' number, 36%), NVIDIA wouldn't really need a GF100 X2 to beat the HD 5970. A mid-range chip X2 should be more than enough.
Most considered that flawed speculation, because there was no news of a tape-out for such a chip.
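
To make the arithmetic behind that speculation explicit, here is a small sketch. Apart from the 36% figure quoted above, every factor in it (the HD 5970's effective uplift, the mid-range chip's relative throughput, and the dual-GPU scaling) is an assumption plugged in purely for illustration.

```python
# Back-of-envelope sketch of the speculation above.
# All factors except the quoted +36% are illustrative assumptions.

hd5870      = 1.00                 # baseline: HD 5870 performance
gf100       = hd5870 * 1.36        # the +36% figure attributed to LegitReviews
hd5970      = hd5870 * 1.50        # assumed effective uplift (lower clocks, imperfect CrossFire scaling)
midrange    = gf100 * 0.70         # assume a mid-range chip at ~70% of GF100
midrange_x2 = midrange * 2 * 0.85  # two chips with an assumed ~85% multi-GPU scaling

print(f"GF100        : {gf100:.2f}x HD 5870")
print(f"HD 5970      : {hd5970:.2f}x HD 5870")
print(f"mid-range X2 : {midrange_x2:.2f}x HD 5870")  # ~1.62x under these assumptions
```

Under these particular assumptions the mid-range X2 lands ahead of the HD 5970, but the conclusion swings entirely on the scaling factors you plug in.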

Right now, GF104 either references the X2 card, or there is a mid-range chip that taped out without much fuss or news about it.
 