eight shader units for R520

Discussion in 'Architecture and Products' started by kyetech, Dec 10, 2004.

  1. MuFu

    MuFu Chief Spastic Baboon
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,258
    Likes Received:
    51
    Location:
    Location, Location with Kirstie Allsopp
    Aww, party pooper. :p
     
  2. anaqer

    Veteran

    Joined:
    Jan 25, 2004
    Messages:
    1,287
    Likes Received:
    1
    I don't think anybody is happy with it, per se - there is some bitching re: Prescott alright.
    Also you should consider that the role of a CPU is somewhat more important than that of a 3D card, even for gamers. :wink:
     
  3. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    You nuts? I got a 35w mobility Athlon just to save a bit-o-juice.

    90+W?!?! That's nuts!
     
  4. hstewarth

    Newcomer

    Joined:
    Apr 13, 2004
    Messages:
    99
    Likes Received:
    0
    This is my personal experience, but I think this power stuff is blown way out of proportion.

    When I was thinking about getting an NV40, I went out and changed my power supply from a 400W to a 520W based on some things I was reading about power. I had a 5700 Ultra in the machine with its 3.2GHz P4.

    When I got my 6800GT, I think my power usage actually went down, and there was less noise too. As for noise, my CPU fan is way worse than the GPU fan.

    I can understand the power concerns for overclockers, but that is not what GPU manufacturers intended. Also, people with Shuttles with small power supplies may be asking too much of such a box.

    If dual-core CPUs and SLI GPUs are in our futures, I think larger power supplies will be the norm.
     
  5. Hellbinder

    Banned

    Joined:
    Feb 8, 2002
    Messages:
    1,444
    Likes Received:
    12
    You know,

    I would have to disagree with that. It's more like ATI *claimed* they were going with brute force but actually ended up with just barely enough clock speed room to hang on.

    To me, brute force should have had much better per-clock results. If ATI's X800 architecture were truly delivering "brute force", then the X800 PE should be wiping the floor with the 6800U across the board.
     
  6. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,062
    Likes Received:
    3,119
    Location:
    New York
    I got a 45 watter so I could overclock the shit out of it. Guess we have different priorities :lol:
     
  7. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    35 watters OC better, that's why I got it instead of the 45 watter. ;)

    Just got me OCZ DDR Booster-thingy from DH too today, I'm looking to hit some new highs. 8)
     
  8. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,062
    Likes Received:
    3,119
    Location:
    New York
    Yeah Prescott sounds like a nightmare but I don't have any first hand experience with the beast.

    I'll have to disagree with you on the importance of the CPU, though. Many of us here only use the full potential of our CPUs when playing games anyway, and when playing modern games the GPU is fast becoming the major bottleneck. Sure, the CPU is more important for general applications, but for most of them a 1GHz 25W A64 would be just fine.
     
  9. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,062
    Likes Received:
    3,119
    Location:
    New York
    Your watts are smaller than my watts :p Those weren't out yet when I picked mine up. It's treating me pretty well though. It's the main reason I haven't jumped on the A64 bandwagon yet. An XP at 2.5GHz is still a mighty fast beast.
     
  10. kemosabe

    Veteran

    Joined:
    Jun 19, 2003
    Messages:
    1,001
    Likes Received:
    16
    Location:
    Montreal, Canada
    Now why can't you guys stick to the sound principle of chronology and concentrate on fully leaking R520 before you set your sights on R580? :lol:
     
  11. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Meh, R520's boring already. Next... :!: :twisted:
     
  12. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    The difference is timing. nVidia is moving on to the NV4x architecture. ATI is still milking the R3xx architecture, despite having released it a bit sooner than nVidia released the NV3x.
     
  13. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Huh? Well, ATI did pretty much double the per clock results. I don't think I've ever heard improving pipeline efficiency called a brute force approach.
     
  14. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    They are both guilty of the same thing: NVIDIA "milked" NV2x for several years whilst ATI released a more advanced shader technology. It suited NVIDIA's business at the time though, so why not.
     
  15. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Several? Um, it was about 1.5 years. And ATI's more advanced shader technology? It was released as competition for the NV2x. This is a completely different scenario, as nVidia has released two architectures to ATI's one, with a spacing of 18 months. It's looking like ATI will wait at least 30 months before replacing the R3xx architecture.
     
  16. FUDie

    Regular

    Joined:
    Sep 25, 2002
    Messages:
    581
    Likes Received:
    34
    Completely different? Doesn't look that way. NV3x was not a completely different architecture than NV2x. ATI had a very good platform with R300 and they are milking it for all it's worth. The R300-based products have been very successful for ATI, only geeks would complain about ATI's good business sense.

    -FUDie
     
  17. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    You can always draw parallels, but to say that the NV3x could not be considered a new architecture is to claim you are blind. It is very easy to see where the design of the NV3x came from, but it is still vastly different from the NV2x.

    And since ATI appears to be losing ground to nVidia once again in the high-end space, I don't think it was good business sense, and thank God for that. The last thing I want to see is a company who decides to focus on performance at the expense of features succeed.
     
  18. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    NV20 was reviewed around March 2001; NV25 was launched in February 2002 and was further refreshed later that year with NV28. NV30 was announced in November 2002 but wasn't available until ~April 2003, and realistically they didn't have a high-end product until NV35 later that year.

    Depending on how you slice the timings, so had ATI. They released R200 after NV20 (a little before NV25, iirc) and then managed to get R300 out well before the launch of NV30, and a long time before any were sold. So they shipped two significantly different architectures in the same timespan NVIDIA was still using NV2x.

    The situations are very similar; they are just at different points for both companies.
     
  19. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,493
    Likes Received:
    474
    Yes, but the quote I responded to specifically mentioned transistor count, not features. Just want to make it clear what I was talking about.
     
  20. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    Didn't the nV30 paper launch before the R300?
     