AMD's culture clash

Yeah, like 3dilettante said, I thought the Barcelona benchmarks pretty much set the table and the issue was a dead one.
Well I'm thinking significantly before the Barcelona benchmarks here - obviously it's even more amazing some people gave the benefit of the doubt after those...

I think I've been saying "certainly lower IPC, probably lower clocks" for nearly 6 months before the Barcelona benchmarks, heh. And I don't think I'm the only one either. The data has been there for quite some time if you knew how to interpret things.
 
Well I'm thinking significantly before the Barcelona benchmarks here - obviously it's even more amazing some people gave the benefit of the doubt after those...

I think I've been saying "certainly lower IPC, probably lower clocks" for nearly 6 months before the Barcelona benchmarks, heh. And I don't think I'm the only one either. The data has been there for quite some time if you knew how to interpret things.

Your attempt to apply rational thought to something that is inherently emotionally tainted has little chance of succeeding. :D Check out any number of forums where, after the initial post-release-review depression, the optimistic "AMD will actually smack Intel around, just you wait!" line runs free. See people ignoring the fact that these parts aren't even half-competitively priced, that they run quite hot, that they have poorer IPC, and so on. See desktop users pull out corner-case server benchmarks where the native multi-core approach provides benefits over the FSB-based one, even though that will do zilch for them in their desktop environment while playing Crysis or encoding DivX. There's no place for rationality here.

FWIW, I think that if they had had something high-endish (at least the 2.6 GHz 9900, maybe a 2.8-3 GHz Phenom FX in mega-humongous limited quantities) at launch, a lot of flak would've been avoided. Sadly, they couldn't pull it off...
 
Oh, I was referring to the last set of graphics products, not this one. I'm always slow on the uptake. Everyone's excited, so I am too, but I don't even think much about it until the prices get pretty stable. I'm not really shopping for a video card yet, so I really can't comment on this crop (thus, I'd be a terrible pundit). Looking forward to buying a midrange part this January, though!

Everyone was expecting AMD to hit 3 GHz pretty quickly, which would have put them at least into the same ballpark as Conroe for the mainstream (even if only to be quickly trounced by Penryn).

Instead, what we got was the same as R600: lots of big promises and lots of delays, followed by production/execution failures that have given us a below-par part with lacklustre performance, overpriced compared to the competition. Faced with that, it pretty much looks like AMD has been left with nothing but bullshitting their customers.

AMD is doubly in pain because without a decent CPU, the chipset/motherboard business they now have to support will suffer too.
 
Your attempt to apply rational thought to something that is inherently emotionally tainted has little chance of succeeding. :D
Well, what do you want me to do then? Ban everyone who might potentially be emotionally tainted?! I don't think that one is going to work out too well... :)
 
Well, what do you want me to do then? Ban everyone who might potentially be emotionally tainted?! I don't think that one is going to work out too well... :)

Well, considering that finding someone who ISN'T emotionally tainted is somewhat impossible, given that we're only human, it would be quite fun. :p :D
 
...
It's getting to the point that whatever AMD throws out graphics-wise, Nvidia can just squash it like a cockroach if they wish. Looking back at history, the Nvidia FX was a one-miss wonder and the ATi R300 was a one-hit wonder. The NV30 debacle did Nvidia more of a favour than they ever knew at the time.

CPU-wise, I don't even want to think about Shanghai vs. Nehalem; it's too depressing.

You know, I don't think the problem is companies and the products they bring to market, because after all this is just the way things work in competitive industries. Some companies get a leg up on the competition for a while, and if the competition is actually competitive, it'll turn the tables for a while. Back and forth. That's business. PR actually plays such a small role in the scheme of things, because the bottom line is and always has been that companies sink or swim based on the products they produce. (This isn't the forum for examining the issue of much larger, richer companies kicking around much smaller, poorer ones by essentially buying their market positions during the periods when their product lines are less than competitive with the better products brought to market by those smaller companies, so I won't touch on that aspect of things here.)

I'll tell you what is beginning to "emotionally" impact me a bit, however, and that is the attitude I am seeing more and more from various people on technical web sites. Listening to some of them sanctimoniously comment on what are essentially trivial or entirely superficial things, it's as though these people consider themselves so high up in the food chain that they actually imagine they are qualified to run and manage the companies they so viciously and eagerly criticize.

First off, most of the criticisms I've read to date seem to stem from nothing more important than either bruised feelings or bruised egos--emphasis on the latter--as though these hypercritical pundits of tech companies were thoroughly incensed to discover that the 3d and cpu industries do not exist for their personal convenience or personal financial welfare. It's a pity that most of these people still do not understand the degree to which they themselves are manipulated, even as they heap scorn on the purely promotional efforts of various companies. Hello--if you aren't sufficiently experienced to know how to *discount* PR, regardless of its source, and go straight to a thorough analysis of the products you review, then you prove only that you are not qualified to do the reviews you are attempting to do. OK, so what's a "thorough" analysis?

What it isn't is illustrated brilliantly by all of the HD 3870 reviews I've read on the Internet to date--I can't think of a single exception. If there are exceptions to what I'm about to say, please, someone point me in that direction... I couldn't find a single review of these products that did anything apart from reach all of its major conclusions based on the handful of *benchmark frame-rate bar charts* the reviewer reportedly ran.

It wasn't that long ago that the real benchmark by which all other 3d-card reviews could be judged was established right here on B3D, where lots and lots of comparative information was compiled along with benchmark frame-rate bar charts. The amount of information compiled and presented was so plentiful and varied that frame-rate bar charts assumed their proper position in the scheme of things--a minor position.

What about image quality as a consideration of benchmark frame-rate bar chart results? Somebody point me to a recent review where that subject is approached in anything resembling a thorough manner. I cannot recall when I last read such a 3d-card review, except to say that it was most likely right here on B3D. Long ago.

So what happened? Why are we back in the stone age of cookie-cutter gpu hardware reviews, which essentially all follow the same basic format:

1) The reviewer begins with an "overview" that consists entirely of his opinion on where various companies and products stand today in relation to each other. Most reviews these days begin with this sort of mini-editorial rant that adds nothing to the review itself.

2) Pages and pages of the obligatory PR regurgitation that the companies who make the products being reviewed supply to the reviewer--slides, diagrams, and whatnot. At this point the reviewer is repeating exactly what the PR arm of the company wants him to repeat and publicize.

3) Selected benchmark frame-rate bar charts. Maybe six to ten games are selected. Sometimes, only three or four.

4) Conclusion--based primarily on the handful of benchmark frame-rate bar charts the reviewer elected to run and publish.

And voilà! Another cookie-cutter 3d-card review is published.

Really, 1 & 2 above don't bother me as much as 3 & 4. The problem I have with this template is that image quality is my greatest concern, yet the reviews I've read either do not mention IQ at all, or mention it only to say something silly like, "We couldn't see any difference in the image quality we observed between the products"--which says to me, "We didn't care to look for any such differences in doing our review."

Excuse me, but reviewers can easily detect differences in performance and power consumption, for instance. That's probably because they looked for them, if you know what I mean... ;) Yeah, that's got to be it.

Is it really logical to think that, when doing a comparative review of products designed and engineered by different companies--products using software drivers written and compiled by different programmers for the respective hardware they belong to--the image quality produced by all of the products tested should always be the same? I suppose if you also think it reasonable that there should be "no observable difference" between the power consumption and performance of these products, then you would at least be consistent in saying that comparative IQ among them is all the same, too. But nobody does that, do they? Nope, it's only on the subject of IQ that people consistently err and get away with representing a zero difference between all of these products.

I think it is no more rational to expect a zero difference in the image quality these contrasted and compared products produce than it is to expect them to achieve the same frame rates in the benchmarks used, or the same levels of power consumption. Why? Because they are different from each other, that's why. Sometimes they are much different from each other, even when they are gpus manufactured by the same company. Is there, for instance, zero IQ difference between GF6, GF7, and GF8 products? According to the reviews I've read recently--nope, no difference deserving of any observation or comment whatever. But let's not let that stand in the way of publishing our frame-rate bar charts anyway.

Last, I really think that only a novice would not know that there is an inverse relationship between IQ and frame-rate performance. There is, always has been, and always will be. Here's a clue: if a tested gpu doesn't perform as well as a competitor in frame-rate benchmark results when you might objectively think the two should be performing closely--or that the loser in terms of frame rates ought actually to be the winner--then it is at that point that an exhaustive and thorough look at IQ needs to be conducted. A reviewer who neglects a serious look at IQ during the normal course of any 3d-card review might as well announce that he's decided to wear a blindfold and use a braille keyboard to conduct and write his review, as far as I am concerned. Such reviews are just about worthless to me as a 3d-card consumer.

I could go on and on about it--things like the odd resolution choices reviewers make when there's nothing about the reviewed products that would prevent them from using *many* other resolutions and publishing those results, too. How many bar charts are published with frame-rate results from tests using no AA and no AF? (Answer: quite a few.) I mean, even on merely such superficial levels as resolution choices and filtering settings, it is obvious that many reviewers today simply couldn't care less about image quality and do not consider it a relevant observation to make when reviewing 3d cards. Considering that 3d is all about what we see on our screens, that's nothing short of remarkable. The fact is that one driver may be producing noticeably superior IQ but accordingly *inferior* frame-rate performance as a direct result. That's 3d 101. If a given reviewer doesn't look at that as thoroughly as he looks at frame-rate performance and power consumption, then you can take his review, crumple it up, and toss it in the trash, for all the good it will do you.
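
To give just one concrete idea of what a minimal, objective first pass at IQ comparison could look like--and I stress this is only a sketch, not something any of these sites actually does--you could capture the same frame on each card and compute even a crude numeric difference like PSNR before declaring "no visible difference." Something along these lines, assuming the Pillow and NumPy libraries and hypothetical screenshot filenames:

[code]
# Minimal sketch: crude objective IQ comparison between two cards.
# Assumes Pillow and NumPy; the filenames are hypothetical same-frame
# captures taken on two different cards/drivers at identical settings.
import numpy as np
from PIL import Image

def psnr(path_a, path_b):
    """Peak signal-to-noise ratio between two same-sized screenshots.
    Identical images give infinity; lower values mean bigger differences."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    mse = np.mean((a - b) ** 2)
    return float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)

print(psnr("card_a_frame100.png", "card_b_frame100.png"))
[/code]

PSNR is a blunt instrument--it won't tell you *which* card is rendering more honestly, only that the outputs diverge and roughly by how much--but even that single number is more than the "we couldn't see any difference" reviews bother to produce.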

So, to all the pundits who consider themselves of such an elevated position in the scheme of the 3d industry as to be able to heap scorn on the companies responsible for creating the very 3d markets their sites cater to, I have but one thought: it's best not to throw stones if you live in a glass house... ;)
 
Wow Walt, that was a heck of an editorial yourself!

When I wrote my editorial, it was not to bash AMD marketing, but rather to explain (from my obviously outside perspective) what I see going on there. CPUs and GPUs are so entirely different in how the products are perceived, as well as in their overall function, that marketing for the two is quite diverse.

Now, you have a good point in that, to educated users, marketing plays a very small role. But you forget that the vast majority of users out there are not all that well educated. B3D is probably one of the most tech-savvy groups of forum users out there, so this type of analysis is not terribly interesting to them, as the users here really want the meat and potatoes of tech.

Anyway, a few AMD guys have come to me saying that that isn't the way it is going down internally, but looking from the outside, the perception is there. Hopefully they can change that perception.
 
uh, Walt, you know there's absolutely no difference in image quality between R600 and RV670, right?
 
That complaining about the lack of IQ analysis, when the part is identical to a previously released product and as such has already been covered in depth in the preceding reviews, is somewhat pointless?
Shhhh, he's trolling WaltC. :p
 
My post is going to reorder the sections I've read from your post with respect to my write sequence (something K10 can't do with unresolved write addresses, OH BURNINATED!!!!).

I think I've been saying "certainly lower IPC, probably lower clocks" for nearly 6 months before the Barcelona benchmarks, heh. And I don't think I'm the only one either. The data has been there for quite some time if you knew how to interpret things.

There were a number of posters who believed as much, though would pointing that out count as an "I-told-you-so-neener-neener" kind of post?

I think the IPC hints were pretty strong once AMD disclosed the cache design and capacity, the basically unchanged integer pipeline, and the various optimizations that were almost, but not quite, as strong as Core 2's.

The clock speeds angle was pretty certain, though I think there was some surprise at just how far down the clocks fell, relative to 65nm K8.

Well I'm thinking significantly before the Barcelona benchmarks here - obviously it's even more amazing some people gave the benefit of the doubt after those...

I am somewhat surprised by the lack of dual-core offerings by this point.
Early this year, I had initially thought the primary clocking barrier would be thermal, so I placed my bet on Kuma clocking up past 2.8 GHz, which would have made AMD at least somewhat competitive with the upper mid-range of Intel's lineup of dual cores.
AMD's process has shown problems in scaling frequency, and it does lag notably in certain metrics compared to Intel's, but I had expected K10 to have taken some measures to improve on the K8 steppings.

I thought the initial Barcelona numbers that showed the TDP climbing quickly with the higher speed grades backed that up, though chatter since then and the number of speedpath rumors and speedpath errata have made me shift position somewhat.

I now believe that TDP limits are present, but that K10's limited redesign of a pipeline that was set down long before signal integrity issues had become so significant created a whole host of problems that are suppressing scaling even before heat comes into play. The speedpath rumors and the L3 errata are some of the data points that have made me re-sort my list of AMD's CPU gremlins.

This unfortunately is bad news for Kuma, which might be why AMD has delayed its dual cores.
 
I am somewhat surprised by the lack of dual-core offerings by this point.
I'm not. Consider that K10 doesn't offer that big a per-clock performance advantage over K8, and a dual-core K10 would be much bigger than a dual-core K8. With dual cores they don't have nearly as much benefit from being native as with quad cores, though even there it doesn't seem to help too much.

Why would AMD produce a much bigger die with very little performance advantage instead of their old and proven dual cores?
 
I'm not. Consider that K10 doesn't offer that big a per-clock performance advantage over K8, and a dual-core K10 would be much bigger than a dual-core K8. With dual cores they don't have nearly as much benefit from being native as with quad cores, though even there it doesn't seem to help too much.

Why would AMD produce a much bigger die with very little performance advantage instead of their old and proven dual cores?

Because if they had been able to upclock Kuma to 2.8 GHz or more, they would have been able to take the pricing brackets currently occupied by 90nm A64s. If it hit 3.0 or more, it would have been possible to take another pricing bracket above that.

Those 90nm cores are 200 mm2 per die, and Barcelona with 4 cores is 280 mm2.
A dual-core variant would be in the same size range or smaller than AMD's current top of the line dual cores.
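
To put rough numbers on that--strictly a back-of-the-envelope sketch, assuming 300mm wafers for everything, perfect yield, and a made-up ~190 mm2 figure for a hypothetical standalone dual-core K10 die--the standard gross-dies-per-wafer approximation already makes the point:

[code]
# Back-of-the-envelope gross dies per wafer, using the standard
# approximation: dies = pi*r^2/S - pi*d/sqrt(2*S), where S is die area.
# Assumes 300mm wafers and perfect yield for all parts; the dual-core
# K10 area is a made-up illustrative figure, not a disclosed number.
import math

def gross_dies(die_area_mm2, wafer_diameter_mm=300.0):
    r = wafer_diameter_mm / 2
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

for name, area in [("90nm K8 dual-core", 200),
                   ("65nm Barcelona quad", 280),
                   ("hypothetical 65nm dual-core K10", 190)]:
    print(f"{name}: {area} mm2 -> ~{gross_dies(area)} dies per wafer")
[/code]

Anything at or below the 200 mm2 of the 90nm dual cores comes off a wafer in equal or greater numbers, which is the whole point.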
 
Because if they had been able to upclock Kuma to 2.8 GHz or more, they would have been able to take the pricing brackets currently occupied by 90nm A64s. If it hit 3.0 or more, it would have been possible to take another pricing bracket above that.
Performance-wise, I would expect a 2.8 GHz K10 to be roughly equal to a 3.2 GHz K8 in all but SSE workloads, so it couldn't do more than replace the old products; it wouldn't make for a new price bracket.
Those 90nm cores are 200 mm2 per die, and Barcelona with 4 cores is 280 mm2.
A dual-core variant would be in the same size range or smaller than AMD's current top of the line dual cores.
True, but they still have to do something with their 90nm toolset. They cannot simply replace it in a couple of weeks.

Also, 90nm likely yields much better than 65nm, so I doubt it would be that much cheaper to replace them with anything based on K10. If they could replace 90nm immediately, then I think it would be smarter to start selling 65nm K8 X2s with higher than 65W TDP; they should be able to reach 3 GHz+ with 125W.
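
Whether 3 GHz+ inside 125W is plausible can be sanity-checked with the usual first-order dynamic-power scaling, P ∝ f·V². A quick sketch--all numbers here are hypothetical, and leakage (significant at 65nm) is ignored:

[code]
# First-order dynamic power scaling: P ~ f * V^2.
# All numbers are hypothetical illustrations, not AMD specifications,
# and static/leakage power (significant at 65nm) is ignored.
base_power_w = 65.0     # hypothetical 65nm X2 at its stock point
base_freq_ghz = 2.6
base_volt = 1.25

target_freq_ghz = 3.0
target_volt = 1.35      # assume a modest voltage bump is needed

scaled = (base_power_w
          * (target_freq_ghz / base_freq_ghz)
          * (target_volt / base_volt) ** 2)
print(f"~{scaled:.0f} W dynamic power at {target_freq_ghz} GHz")  # ~87 W
[/code]

Even with generous margin added back for leakage and binning, that lands comfortably inside a 125W envelope.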
 
Performance-wise, I would expect a 2.8 GHz K10 to be roughly equal to a 3.2 GHz K8 in all but SSE workloads, so it couldn't do more than replace the old products; it wouldn't make for a new price bracket.
Your estimation does not contradict what I said: that Kuma at 2.8 GHz could replace the current FXs, and that going above 3 GHz would make it equivalent to a speed grade above the current top 90nm chips.

True, but they still have to do something with their 90nm toolset. They cannot simply replace it in a couple of weeks.
The 90nm equipment is already shut down and possibly already sold or it is soon to be sold.
It's not clear how recently AMD scaled back 90nm production, but it likely went down fast once AMD announced it was cutting production.

Also, 90nm likely yields much better than 65nm, so I doubt it would be that much cheaper to replace them with anything based on K10. If they could replace 90nm immediately, then I think it would be smarter to start selling 65nm K8 X2s with higher than 65W TDP; they should be able to reach 3 GHz+ with 125W.

They would be replacing about two speed grades of 90nm chips with a product that, had it clocked to 3 GHz+, could have been four speed grades above the maximum reachable by the 65nm Brisbanes.

This debate is moot, however. Like I said, I was surprised in the past tense.
(edit: nevermind, I see that was not the tense I used. I originally thought the Brisbane A64s were at their max planned clocks, and that Kuma was designed to take over at that upper range.)
It seems clear that AMD is having difficulty scaling anything to high clocks on its 65nm process.
I thought at first this was a thermal problem, but it seems the process or design itself is not allowing good timings at acceptable voltages.

As I learned more recently, AMD's 65nm process is not an improvement over 90nm in some areas. In particular, its gate oxide thickness went up, a decision that likely plays a significant role along with possible signal integrity issues in why 65nm still can't reach 90nm clocks and why it took so long to bring the voltages down for 65nm chips.
 
The 90nm equipment is already shut down and possibly already sold or it is soon to be sold.
Are all currently sold 90nm parts just some old stock, then? It could be, considering the following news:

AMD Resurrects K8 Architecture for 2008 Roadmap
The company will introduce eleven 65nm K8 processors over the next two quarters


Essentially, AMD will move any remaining Athlon 64 processors from the 90nm node to the 65nm node, with a few new frequency and TDP variations.
As I was saying, a dual-core K10 doesn't make much financial sense if they can produce similarly clocking K8 dual cores.
 
They're moving more Athlons to 65nm? For the love of God, would someone go explain to them already what a kick-ass value proposition a 939 6000+ would be?
 
They're moving more Athlons to 65nm? For the love of God, would someone go explain to them already what a kick-ass value proposition a 939 6000+ would be?

Once I learn to speak incompetent fluently, I will get right on it.
 