The RV870 Story: AMD Showing up to the Fight

Anandtech has just posted an extremely interesting article on the development of the Evergreen architecture. An enjoyable read, especially if you liked the one about RV770.

http://www.anandtech.com/video/showdoc.aspx?i=3740

" You have to deliver at these bulges. ATI’s Eric Demers (now the CTO of AMD's graphics group) put it best: if you don’t show up to the fight, by default, you lose. ATI was going to stop not showing up to the fight.

ATI’s switch to being more schedule driven meant that feature lists had to be kept under control."

I've never understood why ATi puts out these PR mini dramas regarding their business decisions; I guess it's an attempt to make their customers feel like they're "part of the family" or "part of the process".

First there was that "The great day we decided to stop making the fastest GPU" piece, and now this.

To me, they always read like Wilford Brimley narrating a business meeting.

I don't think anyone can dispute ATi has put out two very nice generations of hardware in a row, especially if you look at it from a straight "bang for buck" perspective. Do we really need to know some guy thought about quitting for a couple months?

Guess I don't really care what sort of mood everyone was in, just want to know about the hardware.
 
" You have to deliver at these bulges. ATI’s Eric Demers (now the CTO of AMD's graphics group) put it best: if you don’t show up to the fight, by default, you lose. ATI was going to stop not showing up to the fight.

ATI’s switch to being more schedule driven meant that feature lists had to be kept under control."

I've never understood why ATi puts out these PR mini dramas regarding their business decisions, I guess it's an attempt to make their customers feel like they're "part of the family" or "part of the process".

First there was that "The great day we decided to stop making the fastest GPU" piece, and now this.

To me, they always read like Wilford Brimley narrating a business meeting.

I don't think anyone can dispute ATi has put out two very nice generations of hardware in a row, especially if you look at it from a straight "bang for buck" perspective. Do we really need to know some guy thought about quitting for a couple months?

Guess I don't really care what sort of mood everyone was in, just want to know about the hardware.
Maybe it is part of an elaborate smoke-and-mirrors trick....
 
"
I've never understood why ATi puts out these PR mini dramas regarding their business decisions, I guess it's an attempt to make their customers feel like they're "part of the family" or "part of the process".

First there was that "The great day we decided to stop making the fastest GPU" piece, and now this.

To me, they always read like Wilford Brimley narrating a business meeting.

ATI didn't put anything out. This was written as an interest piece by Anand. There are a lot of projects (whether they be games, music, hardware, software, or movies) that generate this kind of article, describing the making-of from a perspective after it's all done and dusted.

"
I don't think anyone can dispute ATi has put out two very nice generations of hardware in a row, especially if you look at it from a straight "bang for buck" perspective. Do we really need to know some guy thought about quitting for a couple months?

Guess I don't really care what sort of mood everyone was in, just want to know about the hardware.

You don't have to read it, but it's pretty interesting for some of us. There were plenty of hardware reviews if you prefer not to see the story behind the directions they took or the reasons they did what they did.
 
Maybe it is part of an elaborate smoke-and-mirrors trick....
In what way? Their cards are pretty much on the table at the moment (not for their next gen, but then this isn't hyping that). Now if it was in the Wall Street Journal I could imagine they were trying to talk up the share price or something, but this is Anandtech ... interesting to read or not, as far as buying decisions go they have nothing to tell consumers here that their product doesn't already tell them.
 
In what way? Their cards are pretty much on the table at the moment ...

I was thinking in terms of affecting NV's thought process for their next-gen part.

The RV770 article appeared in Dec 2008. The championing of the small-die strategy in that piece _might_ have affected/distracted NV's design process, only for it to be surprised by the relatively bigger Cypress.
 
" ATI didn't put anything out. This was written as an interest piece by Anand ..."

" You don't have to read it, but it's pretty interesting for some of us ..."

I suppose I just thought the flavor of the article was sort of hokey, like the last one I mentioned. Reads too much like an old sports magazine article. (no offense intended, Anand)

NVIDIA tells people their reasoning behind design decisions as well, just in a more straightforward manner, with none of the "Joe fought long and hard for the increased double precision focus, while Pete took up drinking over it. The fact that it ever came to being is a testament to the courage and foresight of the wise engineers at NVIDIA" type stuff.

I know some people think of this as some sort of sports rivalry; I suppose this sort of style plays well to them.

Doesn't surprise me they wanted to keep Eyefinity a secret; it's a competitive advantage, and NVIDIA already had some experience spanning monitors with the Quadro line. My guess is they wanted as many months as possible alone in that market, and ended up getting them.
 
" You have to deliver at these bulges. ATI’s Eric Demers (now the CTO of AMD's graphics group) put it best: if you don’t show up to the fight, by default, you lose. ATI was going to stop not showing up to the fight.

ATI’s switch to being more schedule driven meant that feature lists had to be kept under control."

I've never understood why ATi puts out these PR mini dramas regarding their business decisions, I guess it's an attempt to make their customers feel like they're "part of the family" or "part of the process".

First there was that "The great day we decided to stop making the fastest GPU" piece, and now this.

To me, they always read like Wilford Brimley narrating a business meeting.

I don't think anyone can dispute ATi has put out two very nice generations of hardware in a row, especially if you look at it from a straight "bang for buck" perspective. Do we really need to know some guy thought about quitting for a couple months?

Guess I don't really care what sort of mood everyone was in, just want to know about the hardware.

I suppose I just thought the flavor of the article was sort of hokey, and the last one I mentioned. Reads too much like an old sports magazine article. (no offense intended Anand)

NVIDIA tells people their reasoning behind design decisions as well, just in a more straight forward manner, with none of the "Joe fought long and hard for the increased double precision focus, while Pete took up drinking over it. The fact that it ever came to being is a testament to the courage and foresight of the wise engineers at NVIDIA" type stuff.

I know some people think of this as some sort of sports rivalry, I suppose this sort of style plays well to them.

Doesn't surprise me they wanted to keep Eye Finity a secret- it a competitive advantage and NVIDIA already had some experience spanning monitors with the Quadro line. My guess is they wanted as many months as possible alone in that market, and ended up getting them.

I guess it kind of falls in line with NV's re-launch of HybridPower 2.0 (Optimus) piece by Anand, where the loyalists will gobble up the spoon-fed marketing materials and declare that it's revolutionary when in fact it's a step forward (evolution) and not the giant leap that PR makes it out to be. The die-hard fans will delve deep into the details, clinging onto the slightest tidbits that back their respective position and ignoring anything that contradicts it, while the general population will sit back and see these as interesting reads with obvious corporate spin. Sure, it's always interesting to see how things work "behind the curtain", but when all is said and done, availability, affordability and usability are what end users really care about.

With regard to Eyefinity, some see it as the next coming of graphics nirvana, others see it as a niche where the numbers range in the single-digit percentages. It's obvious from NV's reactionary response of adding their Surround View (I think they should have gone with nFinity /wink) that they see it as a competitive threat.

IIRC, NV has had the hardware for some time in the Quadro line, though it is of very limited use. The NVS (business) line is the most capable display-wise (using 2 GPUs, iirc, to lift the display output limit) but is aimed solely at productivity, relying almost solely upon its multi-display capability as a key selling point (much like ATI's FirePro MV line); hardly a part for a content creation workstation. The Quadro CX and its nView Multi-Display is limited to Windows XP. The bread and butter of the Quadro line (FX), iirc (feel free to correct me if I am wrong), is likewise limited to 2 outputs, and the only ones capable of >2 use 2 GPUs too (I think the sole exception being the SDI option, which requires an additional PCI 1x "daughter board"). The hardware limit extends from the Quadro line to the entire GeForce line, as exemplified in the SLI requirement for NV's Surround gaming. Some question whether the SLI requirement is also because the GeForce NEEDS SLI to power NV's software (think SoftTH) surround, instead of just the additional outputs.
 
I would think it's standard practice to obfuscate significant new features as much as possible. He seemed to go all out keeping even internal people out of the loop, but there has been other stuff that was kept hush-hush until very late - G80's unification, for example.

My post from the other thread: Very nice article, but I'm not sure what can be inferred from it. If Cypress was in fact cut down and stuff like Sideport was removed, was it just a simple cut - like the number of SIMDs - or did they completely change what the chip was going to be?

In any case, if Northern Islands is another full lineup on 40nm I don't see how it can be anything other than an architectural overhaul; otherwise, what would be the point? Say back when it was decided that Cypress was going to be a bit smaller than first planned, they also decided to refresh it with a bigger chip on the same architecture. That would make some sense, but it wouldn't make any sense at all for the downmarket derivatives. Yeah, so I'm gonna bet on significant overhauls somewhere, maybe in the geometry pipeline. :)
 
I enjoyed reading the behind the scenes look at AMD's engineering processes. Regardless of any perceived slant to the story, I think that these kinds of stories are an opportunity for industry websites to do something fresh with the subject matter, and add some value over being just another rumour mill weblog. Good stuff.
 
I didn't know nv's 3 monitor solution is going to require sli

Aye, there are no consumer-level Nvidia cards able to drive more than 2 monitors on the horizon. So you'll be required to have at least 2 cards.

I still find the most interesting aspect of that article to be how it dealt with the RV740. It goes to quite some lengths to explain why the RV740 was a virtual no-show and, more importantly, how that impacted RV8xx (yes yes, I know, Evergreen; I still like the numbers).

I hope Anand can manage to get info out of Nvidia about how things went with their design process, etc., as well as the internal decision making leading up to Fermi. Would be very interesting. Or maybe not, as Nvidia (other than jumping to new processes faster) doesn't seem to have deviated from what they've been doing for years.

Regards,
SB
 
...

...

Guess I don't really care what sort of mood everyone was in, just want to know about the hardware.
I think you missed the relevant part:

"First, it massively increased the confidence level of the engineering team. There’s this whole human nature aspect to everything in life, it comes with being human. Lose confidence and execution sucks, but if you are working towards a realistic set of goals then morale and confidence are both high. The side effect is that a passionate engineer will also work to try and beat those goals. Sly little bastards."

The part relevant to NVDA and you.
 
Aye, there are no consumer-level Nvidia cards able to drive more than 2 monitors on the horizon. So you'll be required to have at least 2 cards.

SB

Slap me upside the head if I'm wrong, but don't the 58xx boards require one display to be driven via DisplayPort? I don't exactly believe that such displays have much market uptake yet, so how often will three displays be used via either IHV's solutions this year?
 
Slap me upside the head if I'm wrong, but don't the 58xx boards require one display to be driven via DisplayPort? I don't exactly believe that such displays have much market uptake yet, so how often will three displays be used via either IHV's solutions this year?

That's true, but single-link DisplayPort-to-DVI adapters are reasonably priced. Dual-link is another story though.
 
Slap me upside the head if I'm wrong, but don't the 58xx boards require one display to be driven via DisplayPort? I don't exactly believe that such displays have much market uptake yet, so how often will three displays be used via either IHV's solutions this year?

I think (not 100% sure) that you can use DisplayPort OR HDMI, as they share the same "output" (used loosely), so you can go DVI+DVI+DP or DVI+DVI+HDMI; given the ability to adapt DP to DVI or VGA, it shouldn't be too hard to find a combination that works. Of course, the downside is that if you want >1900 rez you need to either have a DP-enabled monitor or cough up an additional $100 on top of the $300-450 for just one card. (I doubt anyone will be running 1900x1600 with full AA/AF on a 5770 anytime soon.) If you run <1900 (1680 and down), iirc you can use a DP-to-VGA adapter that runs about $25-30. It's a shame that ATI didn't include any adapters (none of the packages I've seen include one), and the requirement to add $100 on top of the initial expense for an active adapter has put many off from high-rez Eyefinity, unless you feel the need to go out and buy a new DP-enabled display.

Edit: I'm looking forward to DP 1.2, the ability to daisy-chain up to four 1920x1200 displays, and the increased bandwidth; iirc this would allow higher resolutions with greater refresh rates (further helping the push for 3D). Though it seems all of ATI's parts are 1.1a, and I'm not sure if they can make use of any of 1.2's advancements.
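For what it's worth, the daisy-chain claim roughly checks out on a napkin. Here's a quick sketch; the usable-rate figures and the ~20% blanking overhead are my own assumptions, not official vendor numbers:

```python
# Back-of-napkin check of the DP 1.2 daisy-chain claim. The usable rates
# and blanking overhead are my assumptions, not vendor numbers:
#   DP 1.1a: 4 lanes x 2.7 Gbps, 8b/10b coding -> ~8.64 Gbps usable
#   DP 1.2:  4 lanes x 5.4 Gbps (HBR2)         -> ~17.28 Gbps usable

def stream_gbps(h, v, hz, bpp=24, blanking=1.2):
    """Approximate bandwidth of one display stream, padded ~20% for blanking."""
    return h * v * hz * bpp * blanking / 1e9

need = 4 * stream_gbps(1920, 1200, 60)  # four daisy-chained 1920x1200@60 panels
for rev, usable in {"DP 1.1a": 8.64, "DP 1.2": 17.28}.items():
    verdict = "fits" if need <= usable else "doesn't fit"
    print(f"{rev}: need {need:.1f} Gbps of {usable:.2f} usable -> {verdict}")
```

Four 1920x1200@60 streams come to roughly 16 Gbps, which blows well past 1.1a but squeezes inside 1.2's budget.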
 
Slap me upside the head if I'm wrong, but don't the 58xx boards require one display to be driven via DisplayPort? I don't exactly believe that such displays have much market uptake yet, so how often will three displays be used via either IHV's solutions this year?

You can always get a DisplayPort adapter to use the third display without a DisplayPort-enabled display. Indeed, people will have to buy one to use three displays on 5xxx cards. That's still probably a lot cheaper than needing two cards to do three displays, as with Nvidia cards or older ATI cards.
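To make the rule concrete, here's a toy sketch of which three-display combos should light up on a single 58xx card. The two-legacy-clock framing and all the names are just my reading of this thread, not anything out of AMD's documentation:

```python
# Toy model of the 58xx three-display rule as discussed in this thread
# (the two-clock assumption is mine): DVI/HDMI/VGA and passive DP adapters
# all consume one of two legacy clock sources, so the third head must be
# native DisplayPort or an active DP adapter.

LEGACY_OUTPUTS = {"DVI", "HDMI", "VGA", "passive DP adapter"}

def eyefinity_ok(outputs):
    """True if a 3-display combo should work on a single 58xx card."""
    legacy = sum(1 for o in outputs if o in LEGACY_OUTPUTS)
    return len(outputs) == 3 and legacy <= 2

print(eyefinity_ok(["DVI", "DVI", "DP"]))                  # True
print(eyefinity_ok(["DVI", "HDMI", "active DP adapter"]))  # True
print(eyefinity_ok(["DVI", "DVI", "passive DP adapter"]))  # False - still burns a legacy clock
```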
 
That's true, but single-link DisplayPort-to-DVI adapters are reasonably priced. Dual-link is another story though.
Those don't work right for Eyefinity though, I believe, and the dual-link ones cost almost as much as another DisplayPort monitor would. :???:
 