Nvidia GT200b rumours and speculation thread

Discussion in 'Architecture and Products' started by nicolasb, Jul 11, 2008.

  1. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,879
    Likes Received:
    5,330
    Ahh, thanks.
    Do you know by how much?
    Yes, I know I'm a lazy bugger ;)
     
  2. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
  3. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,879
    Likes Received:
    5,330
    Thanks again...
     
  4. Randell

    Randell Senior Daddy
    Veteran

    Joined:
    Feb 14, 2002
    Messages:
    1,869
    Likes Received:
    3
    Location:
    London
    €370? No thanks - a classic case of buy the refresh, really repent six months later. If that translates at €1 = £1, then the last 280s around at £275 now look a bargain (and I still don't want to pay over £200). Guess I'll have to wait for a faster sub-£200 card. It's struck me that at that price a 4850X2 is still ~15% faster and ~18% cheaper.
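    A quick back-of-the-envelope sketch of that bang-for-buck claim, taking the ~15%/~18% figures at face value (the absolute prices are only illustrative):

    ```python
    # Rough price/performance comparison using the figures above: a 4850X2
    # assumed ~15% faster and ~18% cheaper than a GTX 285 at ~370 EUR.
    gtx285_price, gtx285_perf = 370.0, 1.00   # euros, normalised performance
    x2_price = gtx285_price * (1 - 0.18)      # ~18% cheaper -> ~303 EUR
    x2_perf  = gtx285_perf  * (1 + 0.15)      # ~15% faster

    ratio = (x2_perf / x2_price) / (gtx285_perf / gtx285_price)
    print(f"4850X2: ~{x2_price:.0f} EUR, perf-per-euro advantage ~{(ratio - 1) * 100:.0f}%")
    # -> roughly 40% more performance per euro under these assumptions
    ```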
     
    #544 Randell, Jan 13, 2009
    Last edited by a moderator: Jan 13, 2009
  5. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,879
    Likes Received:
    5,330
    Your best bang for buck looks to be getting another HD4850.
    PS: why did you go from a 3870X2 to an HD4850? Seems a strange choice.
     
  6. Randell

    Randell Senior Daddy
    Veteran

    Joined:
    Feb 14, 2002
    Messages:
    1,869
    Likes Received:
    3
    Location:
    London
    Because I got fed up with dual-GPU issues, i.e. bugs/performance problems in new games like Stalker: CS and Crysis: Warhead while waiting for Crossfire profiles, and bugs in old games like Rome: Total War, where the minimap flickers when CF is enabled. I sold my 3870X2 for about £90 and the 4850 was £115, so the 'sidegrade' didn't hurt too much; plus there are plenty of times when the 4850 performs better.

    That's one reason I'm not buying another 4850 and am instead waiting for the next well-priced-for-performance single-GPU option. As I'm on a P45 mobo, it kind of makes sense for it to be an ATI card so I can Crossfire if I want to.
     
  7. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,708
    Likes Received:
    2,132
    Location:
    London
    Hmm, I missed this:

    http://www.theinquirer.net/inquirer/news/801/1049801/nvidia-55nm-parts-update

    "Wrong DFM" does sound unlikely, I admit. But with the B3 revision of GT200b in GTX285, there's little arguing with the fact that NVidia's really struggled.

    I wonder if the "wrong DFM" is really about the high shader clocks. Do any of TSMC's other customers run any chips or parts of chips at anything like 1.3-1.7GHz?

    I dare say in theory 65/55nm-specific problems shouldn't necessarily impact 40nm.

    Ever since the shock and awe of discovering that G92 was road-mapped into 2009Q1, way back when, I don't think anyone's particularly surprised.

    Jawed
     
  8. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    Charlie has no idea what he's talking about in that article, period. In fact he has no idea whatsoever what he's talking about wrt shrinks; he still believes B3 is the 4th 55nm version even though everybody knows it's B1->B2->B3. The guy is just hopeless and should start redirecting more of his TheInq salary towards psychiatric help.

    Yes, NV's entire 65/55nm line-up is one hell of a fiasco, but Charlie's FUD has little to do with the real problems.
     
  9. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Probably because AMD, Intel and... heck, everyone else DOES use B0 revisions for their processors; just Nvidia doesn't. But then again, assumption is...?
     
  10. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    That is simply not true. Some companies do use A0/B0, but many don't. ATI, even now that it's part of AMD, certainly doesn't... (Remember RV670? A11?) - and while I don't have the time to check the list of all possible companies that don't use A0/B0, Icera, for example, which prides itself on never needing a respin, is always A1 or e1... I'm not aware of anyone not using A0 but using B0, and I'm not sure that'd make much sense.
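    For illustration, a minimal sketch of the counting convention being debated here, assuming the usual reading that the letter bumps on a full (base-layer) respin and the digits on a metal-only respin, and that vendors differ only in where they start counting (A1 vs. ATI's A11). None of this is official vendor documentation:

    ```python
    import re

    def respins(revision: str, first_metal: int) -> tuple[int, int]:
        """Return (base respins, metal respins) implied by a revision string,
        given the metal number the vendor uses for first silicon."""
        m = re.fullmatch(r"([A-Za-z])(\d+)", revision)
        if not m:
            raise ValueError(f"unrecognised revision: {revision!r}")
        letter, number = m.group(1).upper(), int(m.group(2))
        return ord(letter) - ord("A"), number - first_metal

    print(respins("B3", first_metal=1))    # NVIDIA-style count: 1 base, 2 metal respins
    print(respins("A11", first_metal=11))  # ATI-style first silicon: (0, 0)
    print(respins("A13", first_metal=11))  # e.g. R600: two metal respins
    ```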

    However, I see your point, and you're right that it does give him a good excuse... :) Although it's not an excuse to make the same mistake all the time, or to keep distorting things in the same direction. I doubt I'm the only one who's slightly annoyed about how he uses his sources' tidbits; of course, I'm sure at least a small part of those sources are very happy about it. Heh, whatever - it's what he's paid for, and when it comes to sensationalism he delivers one hell of a scandal.
     
  11. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    No, not everyone does. Some companies don't even use numbers...
     
  12. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    I know not everyone does, but I was overstating to support Charlie's f'ed-up assumption that NV has B0 revisions.
     
  13. ímpar

    Newcomer

    Joined:
    Aug 29, 2008
    Messages:
    7
    Likes Received:
    0
  14. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    The same could be seen with the 9800GTX+: the process got cheaper, but somehow power consumption wasn't influenced in a good way because of the extra "performance".

    http://en.hardspell.com/doc/showcont.asp?news_id=3628


    Xbit Labs only measured the PCIe power connectors, while HC measured total system consumption. I don't think XB measured the power draw from the PCIe slot.
    XB is also missing "2D peak" for some cards.
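    A hedged sketch of why the two methodologies aren't directly comparable; the function names and every wattage below are made up for illustration:

    ```python
    def card_power_from_connectors(connector_watts: float, slot_watts: float = 0.0) -> float:
        """XB-style: sum the external PCIe connector rails. Understates the card's
        draw unless the slot (up to 75 W per the PCIe spec) is measured too."""
        return connector_watts + slot_watts

    def card_power_from_wall(system_watts: float, baseline_watts: float,
                             psu_efficiency: float = 0.8) -> float:
        """HC-style: total system draw at the wall minus a no-card baseline,
        scaled by an assumed PSU efficiency. Sensitive to both assumptions."""
        return (system_watts - baseline_watts) * psu_efficiency

    print(card_power_from_connectors(110.0))                  # misses the slot entirely
    print(card_power_from_connectors(110.0, slot_watts=40.0))
    print(card_power_from_wall(330.0, 150.0))                 # ~144 W, assumption-laden
    ```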
     
    #554 neliz, Jan 15, 2009
    Last edited by a moderator: Jan 15, 2009
  15. ímpar

    Newcomer

    Joined:
    Aug 29, 2008
    Messages:
    7
    Likes Received:
    0
  16. ímpar

    Newcomer

    Joined:
    Aug 29, 2008
    Messages:
    7
    Likes Received:
    0
    Can't edit my previous post?!

    Don't think so.
    Check the above link. I believe the PCI-E slot power is labeled "+12V" and "+3.3V"; the "+12V Ex.1" and "+12V Ex.2" are the power connectors.

    :?:
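    A small sketch of totalling the chart's rails under that labelling (slot rails vs. external connectors); the wattages are invented:

    ```python
    readings = {            # watts per labelled rail, hypothetical values
        "+12V":      35.0,  # PCIe slot, 12 V rail
        "+3.3V":      5.0,  # PCIe slot, 3.3 V rail
        "+12V Ex.1": 40.0,  # first external PCIe connector
        "+12V Ex.2": 38.0,  # second external PCIe connector
    }
    slot = sum(w for label, w in readings.items() if "Ex." not in label)
    connectors = sum(w for label, w in readings.items() if "Ex." in label)
    print(f"slot: {slot} W, connectors: {connectors} W, board total: {slot + connectors} W")
    ```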
     
  17. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    What exactly do you mean - xbit's numbers don't add up?
     
  18. ímpar

    Newcomer

    Joined:
    Aug 29, 2008
    Messages:
    7
    Likes Received:
    0
    Greetings!
    Check the numbers in the bar graph.
     
  19. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,502
    Likes Received:
    24,397
    The numbers in the bar graph don't need to add up, as they're cumulative. There are three numbers indicated: Idle, Peak 2D, and Peak 3D.

    E.g.: EVGA GTX 260 (715/1541) = 45.1 W Idle, 47.6 W Peak 2D, 111.8 W Peak 3D.
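    A tiny sketch of how those three figures relate: each is an independent reading for one load state, not a component of a sum. The sample data is invented, loosely echoing the numbers above:

    ```python
    samples = {              # watts, hypothetical per-state measurements
        "idle":    [44.8, 45.1, 44.9],
        "peak_2d": [46.0, 47.6, 47.1],
        "peak_3d": [108.2, 111.8, 110.5],
    }
    figures = {state: max(vals) for state, vals in samples.items()}
    print(figures)  # {'idle': 45.1, 'peak_2d': 47.6, 'peak_3d': 111.8}
    # Adding 45.1 + 47.6 + 111.8 would be meaningless; the card is only ever
    # in one of these states at a time.
    ```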
     
  20. Pressure

    Veteran

    Joined:
    Mar 30, 2004
    Messages:
    1,655
    Likes Received:
    593
    If I remember correctly, ATI have historically used A11 silicon. Wasn't the R600 A13?
     