AMD: Beyond R600

Discussion in 'Architecture and Products' started by Unknown Soldier, Mar 20, 2007.

  1. wolf2

    Newcomer

    Joined:
    Jan 23, 2007
    Messages:
    29
    Likes Received:
    1

Laying out a graphics part using semi-custom design rules is a whole different ball game than a CPU in full custom.
     
  2. Corwin_B

    Regular

    Joined:
    Aug 7, 2005
    Messages:
    733
    Likes Received:
    27
I've loved ATI products since the R300, and I'd really like to see them succeed, but I have to agree with wolf2 here: if the current string of atrocious execution (rivaling 3Dfx's) continues, I can see AMD pulling the plug on the high-end discrete market, even if a change of CEO is needed for that.
     
  3. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    I was thinking that's what the 512b bus and top-shelf GDDR4 were for, but FP64 seems like a hyooge and premature leap above FP32. That said, I don't know how they think R600 will skew b/w pro and consumer SKUs, or how it'll contribute to Fusion or whatever "accelerators" they have planned for a few years down the line.
     
  4. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

I still think it's premature to say something like this, mainly because they will try to fix the problem first before pulling the plug; nV was able to, AMD should be too. Failure to execute matters just as much as planning, management, and engineering.
     
  5. Speccy

    Newcomer

    Joined:
    Oct 11, 2002
    Messages:
    86
    Likes Received:
    6
That would be entirely counterproductive to any kind of strategy.

For starters, the primary cost is the engineering resource, much of which is still required to design and implement architecture for chipsets and CPUs. By removing the desktop graphics business, all you are doing is removing an incremental cost while cutting off a considerable revenue generator. The technologies are leveraged through other areas of the business as well, so if you want any consumer presence, removing desktop graphics also jeopardises that.

Besides Fusion, the point of purchasing a graphics company is to leverage the benefits to further the core business. Intel, for instance, can only offer an "Intel only" ecosystem that's good for business and entry-level PCs, as their graphics performance is low; AMD, however, can offer complete AMD systems in a much wider variety of configurations while offering the stability of a single-hardware/single-software solution stack. This is all additive. Removing the desktop graphics business will not just undercut this, but also send a horrible message to the very OEMs they are trying to attract to their core business.
     
  6. Corwin_B

    Regular

    Joined:
    Aug 7, 2005
    Messages:
    733
    Likes Received:
    27
NV managed to get out of the NV30 debacle for a number of reasons:
1) driver cheats and FUD (relayed by nice, compliant websites) that managed to stall R300 momentum a little
    2) marketing (I loathe it, but TWIMTBP is a great marketing tool, and used quite effectively)
    3) extremely fast and aggressive move toward competitive parts, wisely keeping away from the "release top part on unproven new process" curse
    4) Reintroduction of SLI, another great marketing opportunity

Those are not necessarily things to be proud of, but JSH doesn't play for fun.

    At the moment, I don't see ATI/AMD with any of those advantages (I suppose they could start putting some "some would call these overly aggressive optimisations" stuff in the drivers, but reviewers are better equipped to catch those now). They also lack the website lackeys to relay their PR and pass it as an informed, unbiased opinion.

They can get back on track if R600 performs really well (enough to beat the G80 Ultra), if its successors are on schedule, and if victory at the high end brings successes for mid-range parts. But I don't see them letting NV enjoy month after month alone at the high end indefinitely before pulling the plug.
     
  7. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

Well, the NV30 was practice for perfecting manufacturing in later generations; ATI so far hasn't learned from the X800 how to make sure they won't have manufacturing problems for their top-end SKUs. I don't know why they haven't taken precautions. AMD, I'm sure, will look into this before pulling the plug: different management, different approach. AMD does have strong marketing; even with "Intel Inside" running on TV every day on some channel or other, they were able to penetrate the market against a much more dangerous competitor, Intel.


Well, I'm not saying they won't pull the plug; yes, you are right, if this continues to happen they might, but I don't think it's something that will happen soon, if it ever comes to this point. AMD knows they will lose a lot if they do; as another has mentioned earlier, top-end technology migrates down into great mid- and low-end products.
     
  8. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
For anyone talking about "AMD being 1 year behind" or whatever: while R600 is indeed late, just take a look at previous GFX chip releases from ATI (AMD) and nVidia. IIRC there are a lot more releases with more than a month or two between them than "at the same time" launches (i.e. within a month or two of each other), so it's not necessarily anything to worry about.
Sometimes nVidia talked about how they were on a 6-month cycle, while at the same time ATI talked about their 9-month cycle.
     
  9. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
Unless demand for AMD chips suddenly falls off a cliff, there will be no excess fab capacity for graphics chips made in AMD fabs for the foreseeable future. And currently I just don't see that happening within the next 2 years. Their margins may get whittled down to virtually nothing, however I don't see AMD giving up any market share gains they've made without a fight (IE - price war).

As such I'd expect TSMC or someone else to continue to make all chips for the ATI-branded graphics and chipsets. Which means that for the near and mid term, Nvidia and AMD/ATI will continue to share process technology.

I'd really like to see ATI go back to testing a new process node with low-end products for at least one product cycle before moving the high end down to it...

    There is one fundamental difference I've seen between Nvidia and ATI however.

Nvidia has generally led in implementing OEM checkmark items and getting the ball rolling on their adoption; however, those features don't generally end up being playable in most situations on the first generation of hardware that implements them. 32-bit color on the TNT1 was too slow for any reasonable use. DX9 on NV30, had it launched on time, would have been first, and again not all that usable. SM 3.0 on NV40 generally didn't perform all that well in actual SM 3.0 games. Will DX10 end up being the same?

    Hell, Nvidia's NV1 was a quite revolutionary product that never caught on.

ATI, on the other hand, generally seems to lag behind by half a generation or more on some features; however, when they do implement something, it generally tends to be usable and fast. 32-bit color was quite fast (and in fact the only rendering choice, I believe?) on the Rage Fury (?), my memory is a bit spotty here. However, drivers back then were absolutely atrocious for ATI hardware. /shudder. DX9 on R300 would still have launched after NV30 had Nvidia been on time. SM 3.0 on R5xx was a whole generation behind, yet was actually playable. Will DX10 follow a similar pattern?

I'm completely, totally and utterly expecting DX9 titles to perform relatively similarly between G80/81 and R600, except possibly in high-bandwidth situations. What I'm more interested in is how they will handle DX10 functionality. After all, NV30 played quite well in DX8 games and was win-some/lose-some compared to R300 in DX8, if I remember correctly. Yet that was absolutely no indication of performance in upcoming DX9 games.

    Will R600 be like R300? DX9 was quite useable on R300 for 3-5 years depending on the resolution you played at. A friend of mine only just recently upgraded from his Radeon 9700 Pro to a Geforce 8800 GTX.

Will DX10 have a similar fate with R600? Will DX10 have a similar fate on G80 as DX9 did on NV30? OK, it's extremely doubtful that Nvidia would screw the pooch that badly. Will DX10 on both cards be more like R300? Or more like NV30 wrt DX9? Or more like NV40 wrt SM 3.0?

Will one product end up being less robust but faster? What will be the strong points of each architecture, and how does that play into what matters to ME?

From early reports, R700 will be a radical departure from your traditional graphics chip. Then again, those might have all been rumors, or they could scrap it as unworkable. As such, will it end up being more R300-like? Or NV30-like?

    More importantly to me, will there ever be a 24" display that does a minimum 3840x2400 with fast response time? And will there be a graphics card in the next 2 years that would even do that at a playable framerate?

Hell, I'm hoping Intel or someone is successful in producing a competitor. Who knows whether 2 years from now I'll be running an Intel, Nvidia, or ATI video card.

    Regards,
    SB
     
  10. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    ROFL, that's 4x the resolution of 1920x1200. You might be able to play Quake3 at that resolution in a couple years :)
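The "4x" claim above is straightforward arithmetic: going from 1920x1200 to 3840x2400 doubles both dimensions, so the pixel count quadruples. A quick check:

```python
# Doubling both width and height quadruples the total pixel count.
pixels_hi = 3840 * 2400  # 9,216,000 pixels
pixels_lo = 1920 * 1200  # 2,304,000 pixels
print(pixels_hi / pixels_lo)  # 4.0
```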
     
  11. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Yes. No. :smile:
     
  12. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    I haven't read the whole thread yet, so perhaps this really is relevant, tho it doesn't appear to be on its face. But the real answer is: GFFX5200.
     
  13. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    1). This is a bit unkind and trollish.

    2). Does anyone have any real reason to think that R700 is going to be anywhere near the level of architectural change as R520 and R600?
     
  14. icecold1983

    Banned

    Joined:
    Aug 4, 2006
    Messages:
    649
    Likes Received:
    4
But at that res we might not need AA in some games. Keyword: might.
     
  15. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
You must not rule out daisy-chaining a bunch of Quadro Plexes in the rackmounted version... ;)
     
  16. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Oh, I'm not. But I wouldn't call that "a graphics card".
     
  17. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
It does have DVI outputs, so... ;)
     
  18. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,382
    It's easy to come up with examples where AA is still very much needed, even at ultra-high resolutions.
    This was recently discussed here.
     
  19. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
    I think it was actually rather the 5900, because it was dirt cheap. I bought two of those back then for example.
     
  20. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    1. Hmmm I disagree. Certainly not unkind. A little mischievous maybe.

2. Of course not. R520 probably wasn't as significant a change as R600 will be, either (even if R600 is reusing the MC and some of the threading logic). The entire flow through the pipeline is turned on its head. And if, god forbid, it's anything like Jawed envisions, then it really would be a radical change.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.