AMD: Beyond R600

AMD will have a fab tech advantage in graphics, which should allow them to compete even with a design disadvantage.


Laying out a graphics part using semi-custom design rules is a whole different ball-game than laying out a CPU in full-custom.
 
I've loved ATI products since the R300, and I'd really like to see them succeed, but I have to agree with Wolf2 here: if the current string of atrocious execution (rivaling 3Dfx's) continues, I can see AMD pulling the plug on the high-end discrete market, even if a change of CEO is needed for that.
 
Also, I assume that you'll see FP64 on ATI cards before the end of this year. Don't count on it being fast; I think it's exclusively for GPGPU.
I was thinking that's what the 512b bus and top-shelf GDDR4 were for, but FP64 seems like a hyooge and premature leap above FP32. That said, I don't know how they think R600 will skew b/w pro and consumer SKUs, or how it'll contribute to Fusion or whatever "accelerators" they have planned for a few years down the line.
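For what it's worth, the back-of-the-envelope math on a 512-bit bus looks like this (the effective memory clock below is my own illustrative assumption, not a confirmed R600 spec):

```python
# Peak theoretical bandwidth = bus width in bytes * effective data rate.
bus_width_bits = 512
effective_rate_hz = 2.0e9  # assumed 2.0 GHz effective GDDR4 rate (illustration only)

bandwidth_gb_s = (bus_width_bits / 8) * effective_rate_hz / 1e9
print(bandwidth_gb_s)  # -> 128.0 GB/s
```

Even with a conservative clock, that's a big jump over a 256-bit bus at the same data rate, which is why the bandwidth looked aimed at something beyond plain FP32 rendering.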
 
I've loved ATI products since the R300, and I'd really like to see them succeed, but I have to agree with Wolf2 here: if the current string of atrocious execution (rivaling 3Dfx's) continues, I can see AMD pulling the plug on the high-end discrete market, even if a change of CEO is needed for that.


I still think it's premature to say something like this, mainly because they will try to fix the problem first before pulling the plug; nV was able to, and AMD should be too. Failure to execute is as much about planning and management as it is about engineering.
 
I've loved ATI products since the R300, and I'd really like to see them succeed, but I have to agree with Wolf2 here: if the current string of atrocious execution (rivaling 3Dfx's) continues, I can see AMD pulling the plug on the high-end discrete market, even if a change of CEO is needed for that.
That would be entirely counterproductive to any kind of strategy.

For starters, the primary cost is the engineering resource, much of which is still required to design and implement architectures for chipsets and CPUs; by removing desktop graphics, all you are doing is removing an incremental cost while cutting off a considerable revenue generator. The technologies are leveraged through other areas of the business as well, so if you want any consumer presence, for instance, removing desktop graphics also jeopardises that.

Besides Fusion, the point of purchasing a graphics company is to leverage the benefits to further the core business. Intel, for instance, can only offer an "Intel only" ecosystem that's good for business and entry-level PCs, as their graphics performance is low; AMD, however, can offer complete AMD systems in a much wider variety of configurations while offering the stability of a single-hardware / single-software solution stack. This is all additive. Removing the desktop graphics business would not just undercut this, it would also send a horrible message to the very OEMs they are trying to attract to their core business.
 
I still think it's premature to say something like this, mainly because they will try to fix the problem first before pulling the plug; nV was able to, and AMD should be too. Failure to execute is as much about planning and management as it is about engineering.

NV managed to get out of the NV30 debacle for a number of reasons:
1) driver cheats and FUD (relayed by nice, compliant websites) that managed to stall R300's momentum a little
2) marketing (I loathe it, but TWIMTBP is a great marketing tool, and used quite effectively)
3) an extremely fast and aggressive move toward competitive parts, wisely keeping away from the "release top part on unproven new process" curse
4) the reintroduction of SLI, another great marketing opportunity

Those are not necessarily things to be proud of, but JSH doesn't play for fun.

At the moment, I don't see ATI/AMD with any of those advantages (I suppose they could start putting some "some would call these overly aggressive optimisations" stuff in the drivers, but reviewers are better equipped to catch those now). They also lack the website lackeys to relay their PR and pass it as an informed, unbiased opinion.

They can get back on track if R600 performs really well (enough to beat the G80 Ultra), if its successors are on schedule, and if victory at the high end brings success for the mid-range parts. But I don't see them letting NV enjoy month after month alone at the high end for much longer before pulling the plug.
 
NV managed to get out of the NV30 debacle for a number of reasons:
1) driver cheats and FUD (relayed by nice, compliant websites) that managed to stall R300's momentum a little
2) marketing (I loathe it, but TWIMTBP is a great marketing tool, and used quite effectively)
3) an extremely fast and aggressive move toward competitive parts, wisely keeping away from the "release top part on unproven new process" curse
4) the reintroduction of SLI, another great marketing opportunity

Those are not necessarily things to be proud of, but JSH doesn't play for fun.

At the moment, I don't see ATI/AMD with any of those advantages (I suppose they could start putting some "some would call these overly aggressive optimisations" stuff in the drivers, but reviewers are better equipped to catch those now). They also lack the website lackeys to relay their PR and pass it as an informed, unbiased opinion.

They can get back on track if R600 performs really well (enough to beat the G80 Ultra), if its successors are on schedule, and if victory at the high end brings success for the mid-range parts. But I don't see them letting NV enjoy month after month alone at the high end for much longer before pulling the plug.


Well, NV30 was a lesson in manufacturing for later generations. ATI, so far, hasn't learned from the X800 how to make sure they won't have manufacturing problems with their top-end SKUs, and I don't know why they haven't taken precautions. AMD, I'm sure, will look into this before pulling the plug: different management, different approach. AMD does have strong marketing; even with "Intel Inside" on TV every day on some channel or other, they were able to penetrate the market against a much more dangerous competitor, Intel.


Well, I'm not saying they won't pull the plug. You're right that if this keeps happening they might, but I don't think it's something that will happen soon, if it ever comes to that point. AMD knows they would lose a lot if they did; as someone mentioned earlier, top-end technology migrates down into great mid- and low-end products.
 
For anyone talking about "AMD being 1 year behind" or whatever: while R600 is indeed late, just take a look at previous GPU releases from ATI (AMD) and nVidia. IIRC there are a lot more releases with more than a month or two between them than "same time" launches (i.e. within a month or two of each other), so it's not necessarily anything to worry about.
At times nVidia talked about being on a 6-month cycle while ATI, at the same time, talked about their 9-month cycle.
 
Unless demand for AMD chips suddenly falls off a cliff, there will be no excess capacity for graphics chips in AMD's fabs for the foreseeable future, and I just don't see that happening within the next 2 years. Their margins may get whittled down to virtually nothing, but I don't see AMD giving up any of the market-share gains they've made without a fight (i.e. a price war).

As such, I'd expect TSMC or someone else to continue making all the chips for ATI-branded graphics and chipsets, which means that for the near and mid term, Nvidia and AMD/ATI will continue to share process technology.

I'd really like to see ATI go back to testing a new process node with low-end products for at least one product cycle before moving the high end down to it...

There is one fundamental difference I've seen between Nvidia and ATI, however.

Nvidia has generally led in implementing OEM checkmark features and getting the ball rolling on their adoption; however, those features generally don't end up being playable in most situations on the first generation of hardware that implements them. 32-bit color on the TNT1 was too slow for any reasonable use. DX9 on NV30, had it launched on time, would have been first, and again not all that usable. SM 3.0 on NV40 generally didn't perform all that well in actual SM 3.0 games. Will DX10 end up being the same?

Hell, Nvidia's NV1 was a quite revolutionary product that never caught on.

ATI, on the other hand, generally seems to lag behind by half a generation or more on some features; however, when they do implement one, it generally tends to be usable and fast. 32-bit color was quite fast (and in fact the only rendering choice, I believe?) on the Rage Fury (?), though my memory is a bit spotty here. However, drivers back then were absolutely atrocious for ATI hardware. /shudder. DX9 on R300 would have launched after NV30 had Nvidia been on time. SM 3.0 on R5xx was a whole generation behind, yet was actually playable. Will DX10 follow a similar pattern?

I'm completely, totally, and utterly expecting DX9 titles to perform relatively similarly on G80/81 and R600, except possibly in high-bandwidth situations. What I'm more interested in is how they will handle DX10 functionality. After all, NV30 played quite well in DX8 games, winning some and losing some against R300, if I remember correctly. Yet that was absolutely no indication of performance in upcoming DX9 games.

Will R600 be like R300? DX9 was quite usable on R300 for 3-5 years depending on the resolution you played at. A friend of mine only just recently upgraded from his Radeon 9700 Pro to a GeForce 8800 GTX.

Will DX10 have a similar fate with R600? Will DX10 have a similar fate on G80 as DX9 did on NV30? OK, it's extremely doubtful that Nvidia would screw the pooch that badly. Will DX10 on both cards be more like R300, more like NV30 wrt DX9, or more like NV40 wrt SM 3.0?

Will one product end up being less robust but faster? What will be the strong points of each architecture, and how does that play into what matters to ME?

From early reports, R700 will be a radical departure from the traditional graphics chip. Then again, those might have all been rumors, or they could scrap it as unworkable. So will it end up being more R300-like, or NV30-like?

More importantly to me, will there ever be a 24" display that does a minimum 3840x2400 with fast response time? And will there be a graphics card in the next 2 years that would even do that at a playable framerate?

Hell, I'm hoping Intel or someone is successful in producing a competitor. Who knows whether 2 years from now I'll be running an Intel, Nvidia, or ATI video card.

Regards,
SB
 
More importantly to me, will there ever be a 24" display that does a minimum 3840x2400 with fast response time? And will there be a graphics card in the next 2 years that would even do that at a playable framerate?

ROFL, that's 4x the resolution of 1920x1200. You might be able to play Quake3 at that resolution in a couple years :)
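The 4x figure is easy to sanity-check by comparing total pixel counts (just a quick sketch):

```python
# Total pixels at each resolution.
pixels_hi = 3840 * 2400  # 9,216,000
pixels_lo = 1920 * 1200  # 2,304,000

# Doubling both dimensions quadruples the pixel count.
print(pixels_hi / pixels_lo)  # -> 4.0
```

So at the same fill-rate budget, you'd expect roughly a quarter of the framerate.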
 
More importantly to me, will there ever be a 24" display that does a minimum 3840x2400 with fast response time? And will there be a graphics card in the next 2 years that would even do that at a playable framerate?

Yes. No. :smile:
 
NV managed to get out of the NV30 debacle for a number of reasons : [...] before pulling the plug.

I haven't read the whole thread yet, so perhaps this really is relevant, tho it doesn't appear to be on its face. But the real answer is: GFFX5200.
 
And then when R700 is delayed, you're gonna wait another year to build the fastest computer available with the top-of-the-line parts available in xmas of 2k8..... then you're gonna wait to see if it's worth waiting for R800 in Q1 of 2009!! :devilish:

1). This is a bit unkind and trollish.

2). Does anyone have any real reason to think that R700 is going to be anywhere near the level of architectural change that R520 and R600 were?
 
I haven't read the whole thread yet, so perhaps this really is relevant, tho it doesn't appear to be on its face. But the real answer is: GFFX5200.

I think it was actually the 5900, because it was dirt cheap. I bought two of those back then, for example.
 
1). This is a bit unkind and trollish.

2). Does anyone have any real reason to think that R700 is going to be anywhere near the level of architectural change that R520 and R600 were?

1. Hmmm, I disagree. Certainly not unkind. A little mischievous, maybe.

2. Of course not. R520 probably isn't as significant a change as R600, either (even if R600 is reusing the MC and some of the threading logic). The entire flow through the pipeline is turned on its head. And god forbid it's anything like Jawed envisions; then it really would be a radical change.
 