NVIDIA Fermi: Architecture discussion

Discussion in 'Architecture and Products' started by Rys, Sep 30, 2009.

  1. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    The 'supposed leak' was a bad fake at best
     
  2. Mize

    Mize 3dfx Fan
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,079
    Likes Received:
    1,149
    Location:
    Cincinnati, Ohio USA
    I agree.
    Based on the CES footage I'd guess GF100 is the dual gpu enthusiast product and GF104 is likely the single gpu. It only makes sense to launch and demo the better part first to slow AMD sales while people wait for the mainstream high-end part.
     
  3. Creig

    Newcomer

    Joined:
    Nov 20, 2006
    Messages:
    57
    Likes Received:
    1
    I disagree. First of all, Nvidia has been very fond of its "sandwich" approach to dual-GPU cards, and this looks to be a single-PCB card. I'm not saying they won't ever change their design philosophy, but they haven't up until now.

    Secondly, if you look at the CES video, at around 27 seconds you can see the backside of the GF100. There appears to be one cluster of SMD components in the middle of the card, right where you'd expect to find them on a single-GPU card. And arrayed around that cluster you can see groups of more SMDs that would appear to be for the memory. A typical layout for a single-GPU card.

    And Nick Stam, the Nvidia narrator in that video, definitely called it a GF100.
     
  4. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    To get an idea of how big the card will be, i.e. whether it will fit into my case.
     
  5. Silus

    Banned

    Joined:
    Nov 17, 2009
    Messages:
    375
    Likes Received:
    0
    Location:
    Portugal
    They did actually. There was a single PCB dual GPU version of the GTX 295.

    Yeah, GF100 seems to be the chip inside the single GPU high-end i.e. GeForce 380.

    Dual GPU card seems to be referenced as GF104.
     
  6. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    What I really want to know right now is whether this card has 448 cores or 512 cores?
     
  7. FrameBuffer

    Banned

    Joined:
    Aug 7, 2005
    Messages:
    499
    Likes Received:
    3
    Does it really matter? As long as the performance is there, who cares if it is 448, 512... 384... 1024 or 42? Just get the damn thing out and force ATI to lower their damn prices.
     
  8. Spyhawk

    Newcomer

    Joined:
    Oct 31, 2007
    Messages:
    76
    Likes Received:
    1
    Take a look at the GTX 295... 2 GPUs on one PCB :wink:
     
  9. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    I don't think Nvidia will be able to beat ATI with only 448 cores. It needs the full 512 cores, and it needs to run them at high clocks; that could ensure a significant performance advantage, like 30% or 40%, over the HD 5870.
     
  10. Creig

    Newcomer

    Joined:
    Nov 20, 2006
    Messages:
    57
    Likes Received:
    1
    I forgot about the reworked GTX 295. I was thinking of the original GTX 295, which was a typical Nvidia dual-PCB design, the same as the 9800 GX2, 7950 GX2 and 7900 GX2.
     
  11. FrameBuffer

    Banned

    Joined:
    Aug 7, 2005
    Messages:
    499
    Likes Received:
    3
    I wouldn't be too sure about that
     
  12. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    Only in that single "leak". GF104 should be the midrange part (or the next step down from GF100, anyway).
     
  13. spigzone

    Banned

    Joined:
    Dec 26, 2009
    Messages:
    45
    Likes Received:
    0
    Location:
    North Dakota
    3D multi-monitor + shutter glasses + peripheral vision = wtf?

    So Nvidia wants to one-up AMD's Eyefinity, but the whole point of a three-monitor setup is to feed information to your peripheral vision for better immersion and a gaming advantage. Wearing 3D shutter glasses negates that advantage, and not only by physically blocking out the side screens: your eyes need to continually change their focal length to access information at different 'depths' on the screen. Refocusing to different depths takes time and slows down the amount of information you can access and process, just on the main screen, which in turn reduces the capacity (the attention units) left over for peripheral-vision data. Add in the need to turn your head just to workably see the peripheral data and then sort through the depth field for information, and it's a built-in and, being biomechanical, substantial lag penalty.

    This may have a temporary wow factor, but imagine you're playing on this in an FPS head-to-head against someone with an Eyefinity setup. Against someone of comparable skill, you'd be slaughtered. Maybe it's workable in slow-moving RPGs and the like, but I don't see it being at all useful in anything fast-moving.

    At some future point a really workable wrap around 3D gaming experience might become feasible but Nvidia's present technology is a very long way from that.
     
    #3033 spigzone, Jan 7, 2010
    Last edited by a moderator: Jan 7, 2010
  14. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Well, if you don't like it, you can throw the glasses away and run the setup in 2D. Problem solved!
     
  15. Spyhawk

    Newcomer

    Joined:
    Oct 31, 2007
    Messages:
    76
    Likes Received:
    1
    I was expecting this from NV, but the question remains: will they be able to support 3 monitors with only one card, or will you have to SLI them?
     
  16. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    One card. They are planning to show a Fermi-based card with 3 monitor outputs tomorrow.
     
  17. thatdude90210

    Regular

    Joined:
    Aug 9, 2003
    Messages:
    937
    Likes Received:
    6
    Heh, one side used to complain about the bezels, which will somehow no longer be an issue once NV does surround gaming. "I don't know how Nvidia does it, it's like they make the bezels magically disappear." Now the other side can complain about the 3D-glasses rims blocking peripheral vision. There's always ammo.
     
  18. Sontin

    Banned

    Joined:
    Dec 9, 2009
    Messages:
    399
    Likes Received:
    0
    Hm, you can buy three projectors. :lol:

    I think they will use something like Matrox's TripleHead2Go.
     
  19. Recall

    Newcomer

    Joined:
    Jul 6, 2004
    Messages:
    89
    Likes Received:
    0
  20. Silus

    Banned

    Joined:
    Nov 17, 2009
    Messages:
    375
    Likes Received:
    0
    Location:
    Portugal
    What I meant was that the GF104 codename was used to reference the Fermi based GeForce X2 card.

    I speculated in the past that if GF100 (the single-chip high-end) was quite a bit faster than the HD 5870 (let's assume LegitReviews' number, 36%), NVIDIA doesn't really need a GF100 X2 to beat the HD 5970. A mid-range chip X2 should be more than enough.
    Most considered it flawed speculation, because there was no news of a tape-out for such a chip.

    Right now, GF104 either references the X2 card, or there is a mid-range chip that taped out without much fuss or news about it.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.