AMD: Volcanic Islands R1100/1200 (8***/9*** series) Speculation/ Rumour Thread

As for DP, it's for HPC, not for gamers.

And people besides gamers are buying these cards and commenting in these forums.

For me, 1:8 is a massive fail for the 290X. Two 280Xs are much more attractive given the prices, but that means I'd have to compromise and miss out on the new compute features and TrueAudio. I do play games as well, but even my HD 7770 meets almost all of my gaming performance needs.
 
Yeah, the really interesting products will come in 2015 with that tech combination.
 
And people besides gamers are buying these cards and commenting in these forums.

For me, 1:8 is a massive fail for the 290X. Two 280Xs are much more attractive given the prices, but that means I'd have to compromise and miss out on the new compute features and TrueAudio. I do play games as well, but even my HD 7770 meets almost all of my gaming performance needs.


Were you complaining the same way about the 780 or the 680?

For me too this is sad. Something I liked about the 7970 was that powerful 1 TFLOP of DP, so don't misunderstand me. But I can understand why they did it, whether to segment the pro market or just to save power. With 5.6 TFLOPS SP at 1:4 or 1:3 (Tahiti was initially designed for 1:2 but ended up at 1:4), this card would be a monster for DP FLOPS.

Anyway, let's wait for 20nm. I think AMD has some surprises for us on the DP side with Pirate Islands.
 
It's not a marginal improvement like 4K is over 1080p with good AA. The difference between headphones with stereo sound and full positional audio, reverb, etc. should be night and day. I'd say I can't wait, except I'm running NV so I won't get to experience this at all :cry:

I am totally not convinced that TrueAudio is going to go anywhere. Partly because middleware audio engines are more sophisticated than you think, partly because the masses don't seem to care enough about fancy audio to spend extra on it for gaming, and partly because I still use my 10-year-old Audigy 2 for an amazing 5.1-to-headphone downmix (i.e. it's not new or unavailable by any means).
 
It's not a marginal improvement like 4K is over 1080p with good AA.

Have you seen a large 4K screen running native material up close and in person, or are you just running with a preconceived notion, like most people pretty much always do when it comes to something new?

edit: was the jump from iPad 2 to iPad 3 also marginal in your opinion?
 
Have you seen a large 4K screen running native material up close and in person, or are you just running with a preconceived notion, like most people pretty much always do when it comes to something new?

edit: was the jump from iPad 2 to iPad 3 also marginal in your opinion?

That's the problem. ~50'' is about the biggest TV you can fit in most homes, and people will easily sit 2–3 meters from the screen. At that point it becomes difficult to see any difference between 4K and 1080p when watching a movie or playing a game.

When there is a lot of text involved it's different, because the extra sharpness helps a lot, which is why phones/tablets benefit greatly from higher resolutions, all the more because people tend to use them quite close to their eyes.
 
That's the problem. ~50'' is about the biggest TV you can fit in most homes, and people will easily sit 2–3 meters from the screen. At that point it becomes difficult to see any difference between 4K and 1080p when watching a movie or playing a game.

When there is a lot of text involved it's different, because the extra sharpness helps a lot, which is why phones/tablets benefit greatly from higher resolutions, all the more because people tend to use them quite close to their eyes.

Moving closer to a screen doesn't require more space in the house, but less. Even a regular desktop can easily have, say, a 40" screen on it. If one sits too far from the display, it doesn't mean the benefit of the display tech is marginal, but that the rest of your setup compromises it too much, and without valid reason. Big screens don't have to be viewed (literally and figuratively) the way TVs in the living room have been viewed in the past, and monitors don't have to be small if there is more resolution. You just have to combine the best parts of a few different worlds and end up with a new, better result.
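To put some rough numbers behind the viewing-distance argument: a 20/20 eye resolves about 1 arcminute, so you can estimate the distance beyond which individual pixels stop being distinguishable. This is just a sketch (the function name and the 16:9 assumption are mine, and real-world acuity varies):

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute: rough resolution limit of 20/20 vision

def max_useful_distance(diagonal_in, h_pixels, aspect=16 / 9):
    """Distance (m) beyond which a 20/20 eye can no longer resolve single pixels."""
    width_m = diagonal_in * 0.0254 * aspect / math.hypot(aspect, 1)
    pixel_pitch = width_m / h_pixels  # horizontal pixel size in meters
    return pixel_pitch / ARCMIN       # small-angle approximation

print(round(max_useful_distance(50, 3840), 2))  # ~1 m for a 50" 4K set
print(round(max_useful_distance(50, 1920), 2))  # ~2 m for a 50" 1080p set
```

Which lines up with the post above: at the typical 2–3 m couch distance, a 50" 1080p panel is already at or past the acuity limit, so 4K mainly pays off when you sit much closer.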
 
Intel has them in volume production in 2013. Come on AMD, can't you get there before Volta?

This might more properly belong in a Pirate Islands thread, but I think AMD will have interposers in 2014. It was supposed to happen in late 2012, but evidently it didn't.

AMD_roadmap_tiran.jpg


I don't know why Tiran was scrapped. Maybe the technology just wasn't mature enough to guarantee decent yields, maybe it was too expensive, maybe AMD's progress with memory PHYs made it a bit less attractive.

But if you look at Hawaii, it's pretty clear to me that it needs to happen for the next big GPU. I mean, suppose its 20nm replacement has 64 CUs (4096 shaders), where is it going to get its bandwidth? AMD is already running a 512-bit bus with 5 GT/s RAM. Presumably they could push that to 6, maybe even 6.5 GT/s, but that would only be 30% more bandwidth at a very significant power cost. The alternative would be a wider bus, but 768 bits? That would be pretty damn costly, not to mention very large on the die.

So it seems to me that interposers need to happen concurrently with the jump to 20nm.
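The bandwidth arithmetic above is easy to sanity-check. A quick sketch (the function name is mine; the 6.5 GT/s and 768-bit configurations are the hypotheticals from the post, not announced products):

```python
def bandwidth_gbps(bus_width_bits, rate_gtps):
    """Peak memory bandwidth in GB/s: (bits per transfer / 8) * transfers per second."""
    return bus_width_bits / 8 * rate_gtps

hawaii = bandwidth_gbps(512, 5.0)  # shipping R9 290X config: 320 GB/s
faster = bandwidth_gbps(512, 6.5)  # hypothetical faster GDDR5: 416 GB/s
wider  = bandwidth_gbps(768, 5.0)  # hypothetical wider bus: 480 GB/s

print(faster / hawaii - 1)  # ~0.30 -- the "only 30% more" figure above
```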
 
Have you seen a large 4K screen running native material up close and in person, or are you just running with a preconceived notion, like most people pretty much always do when it comes to something new?

I don't know if games will have sufficiently high resolution assets to take full advantage of 4k. Native 4k video running on a 4k screen is mind blowing though. It almost feels like you can reach out and touch the scene.
 
I am totally not convinced that TrueAudio is going to go anywhere. Partly because middleware audio engines are more sophisticated than you think, partly because the masses don't seem to care enough about fancy audio to spend extra on it for gaming, and partly because I still use my 10-year-old Audigy 2 for an amazing 5.1-to-headphone downmix (i.e. it's not new or unavailable by any means).

TrueAudio shouldn't need more investment from the masses; it should work with any sound system, including cheap headphones.
 
Moving closer to a screen doesn't require more space in the house, but less. Even a regular desktop can easily have, say, a 40" screen on it. If one sits too far from the display, it doesn't mean the benefit of the display tech is marginal, but that the rest of your setup compromises it too much, and without valid reason. Big screens don't have to be viewed (literally and figuratively) the way TVs in the living room have been viewed in the past, and monitors don't have to be small if there is more resolution. You just have to combine the best parts of a few different worlds and end up with a new, better result.

I don't know where you live, but in the average European or Asian house you cannot fit a 40'' monitor on a desk. Even if you could, it would be too big to be practical for most people, as you just cannot see the whole screen at once.

Same goes for the TV: sure, you can put the couch in the middle of the room to get closer, but now try getting your wife/girlfriend to allow that (as it ruins your living room). If you have kids it's impossible, because you can't just have one sofa in front of the TV.
 
I don't know if games will have sufficiently high resolution assets to take full advantage of 4k. Native 4k video running on a 4k screen is mind blowing though. It almost feels like you can reach out and touch the scene.

Yeah, this is somewhat of an issue, but it should get better, and even the best games today look very impressive on a 4K screen. The time isn't quite right for it to properly shine yet; a couple of things still need to fall into place, but things are moving quickly and only accelerating. 12 months from now there should be cheap panels with HDMI 2.0 out there, and then we'll be a lot closer to 4K being much more relevant. edit: Next-gen consoles will also help bring the base quality of assets way up, and PC can only improve from there.

I don't know where you live, but in the average European or Asian house you cannot fit a 40'' monitor on a desk. Even if you could, it would be too big to be practical for most people, as you just cannot see the whole screen at once.

Same goes for the TV: sure, you can put the couch in the middle of the room to get closer, but now try getting your wife/girlfriend to allow that (as it ruins your living room). If you have kids it's impossible, because you can't just have one sofa in front of the TV.

None of these are imo real issues; you just have to plan and think things through a bit. Some rooms/situations might need a 4K projector and a screen that can be hidden, and imo if you can't fit a 40" screen somewhere in the house, on a stand or wall-mounted, and sit about a meter away from it, then you simply aren't trying. You don't need much more than 2 m² for a working setup with a wall mount. Just one example.

The real issues for 4K tech for gaming are the maturity of the display tech and costs: the cost of the display and the cost to run it. Everything else can imo easily be worked around.

If one doesn't like or understand something he'll be quick to come up with obstacles, but if one does he quickly removes them. :)
 
Yeah, this is somewhat of an issue, but it should get better, and even the best games today look very impressive on a 4K screen. The time isn't quite right for it to properly shine yet; a couple of things still need to fall into place, but things are moving quickly and only accelerating. 12 months from now there should be cheap panels with HDMI 2.0 out there, and then we'll be a lot closer to 4K being much more relevant. edit: Next-gen consoles will also help bring the base quality of assets way up, and PC can only improve from there.



None of these are imo real issues; you just have to plan and think things through a bit. Some rooms/situations might need a 4K projector and a screen that can be hidden, and imo if you can't fit a 40" screen somewhere in the house, on a stand or wall-mounted, and sit about a meter away from it, then you simply aren't trying. You don't need much more than 2 m² for a working setup with a wall mount. Just one example.

The real issues for 4K tech for gaming are the maturity of the display tech and costs: the cost of the display and the cost to run it. Everything else can imo easily be worked around.

If one doesn't like or understand something he'll be quick to come up with obstacles, but if one does he quickly removes them. :)

In my experience, it isn't a lack of understanding. It just doesn't matter enough to many people for them to go out of their way to accommodate an optimal media setup. I have had trouble convincing people to set up their surround sound speakers in proper positions. I fully explained it to them and they understood, they just didn't care. Not enough to rearrange their furniture anyway.
 
This might more properly belong in a Pirate Islands thread, but I think AMD will have interposers in 2014. It was supposed to happen in late 2012, but evidently it didn't.

AMD_roadmap_tiran.jpg


I don't know why Tiran was scrapped. Maybe the technology just wasn't mature enough to guarantee decent yields, maybe it was too expensive, maybe AMD's progress with memory PHYs made it a bit less attractive.

But if you look at Hawaii, it's pretty clear to me that it needs to happen for the next big GPU. I mean, suppose its 20nm replacement has 64 CUs (4096 shaders), where is it going to get its bandwidth? AMD is already running a 512-bit bus with 5 GT/s RAM. Presumably they could push that to 6, maybe even 6.5 GT/s, but that would only be 30% more bandwidth at a very significant power cost. The alternative would be a wider bus, but 768 bits? That would be pretty damn costly, not to mention very large on the die.

So it seems to me that interposers need to happen concurrently with the jump to 20nm.

That schedule is from 2009. Hardly a predictor for current products considering all the mess AMD has been in.

Yeah, the current limits for bus width and clock rates have been exhausted. GDDR6?
 
That GDDR6 is what I was thinking about too.

But even if they stay with a 512-bit memory interface that can run at faster GDDR5 frequencies and achieve that 30% higher memory bandwidth, we can't be sure how large that "significant power increase" would be, or whether AMD can improve efficiency enough that 30% more would be sufficient.

However, I doubt they will dare to make such a big chip at 20 nm, so a 512-bit bus could be a no-go for that reason. In that case, the thoughts above can be disregarded.
 
TrueAudio shouldn't need more investment from the masses; it should work with any sound system, including cheap headphones.

The investment is an AMD GPU. You can't even get it separately, which would help with adoption. TrueAudio might help AMD sell GPUs as a neat bullet-point feature though. I bet the hardware is minuscule on 28nm, so the cost-benefit may turn out to be glorious for them.
 
That GDDR6 is what I was thinking about too.

But even if they stay with a 512-bit memory interface that can run at faster GDDR5 frequencies and achieve that 30% higher memory bandwidth, we can't be sure how large that "significant power increase" would be, or whether AMD can improve efficiency enough that 30% more would be sufficient.

However, I doubt they will dare to make such a big chip at 20 nm, so a 512-bit bus could be a no-go for that reason. In that case, the thoughts above can be disregarded.

Well, looking at the Maxwell speculation [http://pastebin.com/jm93g3YG], which sounds pretty reasonable if you ask me, maybe we'll get more caches instead? :D
 
Anyone see a review where all the voltage bins are listed?

Unless someone finds a way to fully dissect the voltage table in the Hawaii BIOS, you won't be seeing that information any time soon. AMD switches between too many states far too quickly, so we can't use logging to find the actual states. What gets reported to logging tools is the average clockspeed over the tick. And on the voltage side there aren't currently any tools (that I'm aware of) that can see the VIDs.

28akg1c.jpg


255 voltage steps between 0 and 1.55V through 6.25 mV voltage granularity

http://anandtech.com/show/7457/the-radeon-r9-290x-review/5

So, if you take these values from the table:

t643dk.jpg


http://www.techpowerup.com/reviews/AMD/R9_290X/32.html

you can calculate it on your own and get an approximate idea of how things work.
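As a rough illustration of what 6.25 mV granularity means, here's a sketch that maps an 8-bit VID code to a voltage, assuming a simple linear scale starting at 0 V. The actual encoding AMD's voltage regulator uses isn't publicly documented, so treat the mapping direction and offset as assumptions:

```python
STEP_MV = 6.25  # voltage granularity quoted in the AnandTech review

def vid_to_voltage(vid):
    """Map an 8-bit VID code (0-255) to volts, assuming a linear scale
    from 0 V upward -- the real VRM encoding may differ."""
    if not 0 <= vid <= 255:
        raise ValueError("VID must fit in 8 bits")
    return vid * STEP_MV / 1000

print(vid_to_voltage(200))  # 1.25 V under this assumed mapping
print(vid_to_voltage(248))  # 1.55 V, the quoted upper end of the range
```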
 