AMD: Volcanic Islands R1100/1200 (8***/9*** series) Speculation/Rumour Thread

Well, in reality the W9100 replaces the W9000, and the W8100 replaces the W8000; it's not really like they are in the same series. The older-generation GPUs were mostly in the table for comparison: you don't want to buy a $3,999 W9000 if, for the same price, you can buy a W9100.
I think they have kept the W9xxx and W8xxx around just because the S10000, with its two Tahiti cores, already exists. I do wonder whether there will be an S-series GPU with two W8100-W9000 cores (imagine the monster, in all metrics).

Anyway, for a "simple" CAD workstation, 2x or 4x W8100 should be the sweet spot in price/performance compared to a W9100-based system (but what a monster), and a bit more affordable: $5,000-6,000 less.

With these 1/2-rate DP units, they set the bar really high once you start lining up GPUs in a system...
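To put rough numbers on that, here's a minimal back-of-the-envelope sketch in Python, assuming AMD's published shader counts and clocks for the W8100/W9100 and the advertised 1/2 DP rate; real workloads won't hit these peaks.

```python
# Theoretical peak throughput at a 1/2 DP rate.
# Peak SP FLOPS = shaders * clock * 2 (one FMA counts as 2 ops/cycle).
def peak_tflops(shaders, clock_ghz, dp_rate=0.5):
    sp = shaders * clock_ghz * 2 / 1000.0  # single-precision TFLOPS
    return sp, sp * dp_rate                # (SP, DP) TFLOPS

w9100 = peak_tflops(2816, 0.930)  # FirePro W9100: 2816 SPs @ 930 MHz
w8100 = peak_tflops(2560, 0.824)  # FirePro W8100: 2560 SPs @ 824 MHz

print("W9100:    %.2f SP / %.2f DP TFLOPS" % w9100)
print("W8100:    %.2f SP / %.2f DP TFLOPS" % w8100)
print("4x W8100: %.2f DP TFLOPS aggregate" % (4 * w8100[1]))
```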
 
Kaveri's die shot wasn't very clear, that's true. But there were die shots for Llano, Trinity, and I assume that Richland does look identical to Trinity.
The downward slope in die shot quality is illustrated by the change in behavior for the last APUs versus the earlier ones.

There are also die shots for Bobcat, Kabini and Beema, as well as Zambezi and Vishera.
I do grant that the low-power ones do get more photographic attention, but Zambezi and Vishera are from years ago and represent the endpoint for when things used to be better (although there was rampant photoshop BS with Zambezi pre-launch).
 
What do you mean by "change in behavior"?

And "when things used to be better"? It seems to me that AMD's current position is better than when the pretty catastrophic Zambezi was introduced.
 
This is specific to die shots. AMD's CPU side has historically been quite active in getting out die shots, and even though the Zambezi pre-launch shots were heavily shopped, a die shot of decent quality was eventually provided.

Its GPU products are less consistently published, and APUs are spotty. The micrograph I'd seen for Bobcat was the false-color one through metal layers, although I did find one with more detail more recently. Kaveri reverts to that behavior, and Richland is uncertain because it's not clear if the silicon is different. Kabini versus Beema is interesting in the sense that it's closer to a Trinity-Richland transition, but AMD did provide an image, which may reinforce how little Richland changed things.
 
Ah, understood. I admit I find this puzzling. I don't understand why some companies release die shots while others don't, or why some only publish obfuscated ones. I mean, I'm pretty sure Kaveri's die was analyzed in detail by AMD's competitors as soon as it became available in retail anyway, so why the secrecy?

NVIDIA's policy is easier to understand, and I guess Intel doesn't have much to worry about anyway.
 
One reason is that it is simply a PITA! To get good actual die shots you need to have wafers in process that you can pull from the line at the right point in time; there are costs, time, management and all kinds of faff involved, especially when dealing with a third-party fab on the other side of the world.
 
The stuff that Chipworks does seems pretty decent. Is it a cost issue?
 
The Wii U GPU die shot they released quite a while ago now completely lacked the punch of traditional die shots; it had residue of metal layers gunking it up, and so on. It would seem getting a beauty shot off a die is fairly non-trivial, if even a company specializing in analyzing such shots can't make really good ones from production chips...
 
Well, it's something rather than nothing... and then we wouldn't have to second-guess photoshopped areas of the die (which somewhat defeats the purpose of a die shot). Aren't folks just looking for the general layout and block structures? The die shots they did for the Xbox One, PlayStation 4, and even Tahiti and Apple ICs seemed adequate for folks to chew on. I'm not sure the fancy rainbow colours are important.
 
Thanks for the explanation; that's been bugging me for years!
 
I remember the days of decapping chips so you could FIB them (use a Focused Ion Beam to make a fix for testing). I don't recall how much things cost back then but I imagine it's tougher now.
 
You don't do that anymore: if you need a FIB, you take a fresh die, FIB it, then mount it on the package.

But note the 'if': you have to be really desperate to actually go through with it. It's extremely costly, and the yield is very low.
 
They are talking about R280 levels of performance, not that interesting.

Does anybody here have experience with the Gigabyte R290 Windforce (OC)? After changing my mind many times, I found this card for 45,000 yen, while the R280X I planned on buying goes for 36,000~40,000 yen, so the price difference is small enough to justify spending a little extra for an R290.
 
Personally, I find this performance bracket way more interesting than Hawaii/GK110, particularly if it improves efficiency compared to Pitcairn, not Tahiti (most importantly perf/W, of course, though other areas like bandwidth efficiency or perf/area shouldn't be neglected). At least I'd hope it is an improvement...
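For what it's worth, those efficiency comparisons boil down to simple ratios. A quick sketch below; the Pitcairn row uses approximate public HD 7870 figures, while the second row is a purely made-up placeholder for the rumoured part, not a leaked spec.

```python
# GPU efficiency ratios: perf/W, peak arithmetic intensity (perf per
# byte of bandwidth) and perf/area.
# Inputs: peak GFLOPS, board power (W), bandwidth (GB/s), die size (mm^2).
def report(name, gflops, watts, gbps, mm2):
    print("%-10s %5.1f GFLOPS/W  %5.2f FLOPS/byte  %5.1f GFLOPS/mm^2"
          % (name, gflops / watts, gflops / gbps, gflops / mm2))

report("Pitcairn", 2560.0, 175.0, 153.6, 212.0)  # ~HD 7870 figures
report("New part", 3500.0, 190.0, 176.0, 360.0)  # placeholder numbers
```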
 
Soooo I did it. Just bought a Gigabyte R290 (non-X) Windforce OC.

Due to the price I was initially thinking about an R280X, but the price difference on this card was small enough to make it worth the extra cash (I paid 45,000 yen for this card; good R280X cards are 35,000~40,000 yen). For some reason all other R290 cards cost way more than this model.

Anyway, so far I'm happy with it. I haven't played any games yet except for the Tomb Raider benchmark, but the card is pretty quiet: it doesn't seem any noisier at idle than my 560 Ti DCII, and at ~42 degrees it's running pretty cool as well, considering I'm sitting in my chair wearing only pants with the aircon on and still sweating.

During the couple of minutes I benchmarked the card, it didn't seem much noisier than my 560 Ti, though the sound is a bit higher-pitched, so it's slightly more annoying than the lower noise of the 560.
 
R9 290 consumes about 100W more power than a 560 Ti if you load it down with a complex graphics fest. ;) I think it's beyond everyone's favorite ridicule target, GTX 480. But that Windforce cooler is very nice and leaves the right impression.
 
The GTX 480 could sometimes exceed 300W, so no, it's not quite that bad.
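As a sanity check on the back-and-forth above, a tiny sketch using ballpark gaming board-power figures of the kind reviewers typically report (approximate, not official TDPs):

```python
# Ballpark gaming board-power draws in watts; approximate reviewer-style
# figures, not official TDPs.
board_power = {"GTX 560 Ti": 170, "GTX 480": 300, "R9 290": 275}

delta = board_power["R9 290"] - board_power["GTX 560 Ti"]
print("R9 290 vs 560 Ti: about %d W more" % delta)  # roughly 100 W
print("Beyond a GTX 480? %s"
      % (board_power["R9 290"] > board_power["GTX 480"]))  # False
```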
 