AMD: Volcanic Islands R1100/1200 (8***/9*** series) Speculation/Rumour Thread

Also from maximumpc:

According to Nekechuck, even though the R9 290X uses a 438 square mm die, which is significantly smaller than the Titan’s 550 sq. mm GK110 offering, it “will definitely compete with the GTX 780 and Titan.”

Key word is "compete" when we're not in Mantle mode. Good enough, but without Mantle it may not be a GTX 780 killer.
 
I think even without Mantle it should be able to take it. BF4 seems to employ global illumination, presumably because it uses DirectX 11.1, and GI has always been a performance killer on Nvidia cards; enough to slow a GTX 780 down to the point where a 7970 GHz catches or even beats it.

As far as this Mantle thing goes, if it somehow catches on and the performance is there, Nvidia would be dumb not to do their own version of it and use their much larger cash reserves to spread it more widely than AMD can. They need a hook more than ever, since games are going to be coded for GCN.
Only when GI is done via compute shaders. DICE never did that in BF3, and they haven't announced it here either. In fact, only one studio has done it as far as I remember (Codemasters).

One of Epic's developers said a while back that they had seen the most amazing thing from NVIDIA, and that it wasn't related to GPUs. I think we can now assume that they MIGHT be preparing an answer, considering the very close ties between NVIDIA and the Unreal Engine guys.
 

That is what AMD, simply put, will put to good use. Some see the 290X as merely similar to the 780, but add Mantle and you move into above-Titan territory for half the price, while still having compute on the card.
So here we have a situation where the 290X, and maybe even the 7970/280, could be doing way better in a few months thanks to the new API.
Smart move.

As a gamer (soon 50, I admit), I get back the sound immersion that has been lacking ever since Microsoft removed hardware audio support, plus more FPS from my card, making it last longer if I play BF4, which I am highly likely to do. That makes me, as a gamer, happy.
The way AMD is going about this opens up a whole new direction for gaming, where the developer can fine-tune the sound and the performance, allowing a deeper and richer set of visuals and sounds that make a PC game a PC game for real.

If I had planned that tech day, the presentation would have been very different.
Those guys are good at what they do, but their presenting could use an overhaul.

Now I understand why AMD has hired so many people away from its competitors.
It's a gamechanger.
 
Mantle could be a gamechanger. In the hands of Nvidia or Intel it would be, as throwing money at devs is probably the key to its success. We'll see just how serious AMD is about it soon enough.

Some more info is required though: can it work alongside DX11, or is it an all-or-nothing gig? If it can be used simultaneously (see the sketch below), it's surely a massive win for AMD. If not, that's something they have to be working on.
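To make the "alongside DX11" scenario concrete, here's a minimal hypothetical sketch of an engine shipping both render paths and picking one at runtime. None of these function names are real Mantle or D3D11 bindings; they're stand-ins for illustration only:

```python
# Hypothetical sketch: an engine with both a Mantle and a D3D11 path,
# choosing the low-level path where available and falling back otherwise.
# The init functions are stand-ins; no real Python bindings are implied.

def init_mantle():
    """Stand-in for creating a Mantle device; only works on GCN GPUs."""
    raise RuntimeError("no GCN GPU found")  # simulate a non-AMD system

def init_d3d11():
    """Stand-in for creating a D3D11 device; the portable fallback."""
    return "d3d11-device"

def create_renderer(prefer_mantle=True):
    # Try the vendor-specific fast path first, then the portable one.
    backends = [("mantle", init_mantle)] if prefer_mantle else []
    backends.append(("d3d11", init_d3d11))
    for name, init in backends:
        try:
            return name, init()
        except RuntimeError:
            continue  # backend unavailable; try the next one
    raise RuntimeError("no usable graphics backend")

print(create_renderer())  # ('d3d11', 'd3d11-device') on non-GCN hardware
```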

There is so much potential here for AMD to score a huge victory. Steambox and just plain old Linux seem destined to adopt it ASAP. Even Microsoft could be forced into adopting it or integrating it into DirectX somehow. How it will work with the consoles is another question: just how close to a console API is it? Close enough that ports become trivial? If so, that's another huge win.
 
I figured out one way for those leaked benches to be true.

http://translate.google.com/translate?sl=auto&tl=en&js=n&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fwww.expreview.com%2F28411-all.html

In this review the 290X is clocked at 1020 MHz.

I just calculated the clock frequency implied by 5 TFLOPS to be about 888 MHz: 5×10^12 FLOPS / (2 FLOPs per clock × 2816 shaders) / 10^6 ≈ 888 MHz.

So if we divide 888 by 1020 we get about 87%. If we multiply all the scores in that leak by that amount, we get something that mostly trades blows with a GTX 780 but pretty much consistently loses to Titan.

Considering Firestrike doesn't scale all that well with frequency, that would explain the sub-8000 Firestrike score.

So at stock clocks this thing matches up with a GTX 780.
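For anyone who wants to check the arithmetic, here's a minimal sketch of that back-calculation, assuming the usual FMA accounting of 2 FLOPs per shader per clock. The leaked score values below are placeholders, not real numbers:

```python
# Back-calculate the clock implied by a quoted FLOPS figure, assuming
# 2 FLOPs per shader per clock (one fused multiply-add per cycle).
TFLOPS = 5.0
SHADERS = 2816
FLOPS_PER_CLOCK = 2

implied_clock_mhz = TFLOPS * 1e12 / (FLOPS_PER_CLOCK * SHADERS) / 1e6
print(f"implied clock: {implied_clock_mhz:.0f} MHz")  # ~888 MHz

# Scale the leaked scores (taken at 1020 MHz) down to the implied clock.
REVIEW_CLOCK_MHZ = 1020
scale = implied_clock_mhz / REVIEW_CLOCK_MHZ
print(f"scale factor: {scale:.2%}")  # ~87%

leaked_scores = {"Game A (fps)": 60.0, "Game B (fps)": 75.0}  # placeholders
for name, score in leaked_scores.items():
    print(f"{name}: {score * scale:.1f}")
```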
 
Well here could be the answer to one of my questions...

http://www.anandtech.com/show/7371/understanding-amds-mantle-a-lowlevel-graphics-api-for-gcn

...what becomes increasingly hinted at as we read through AMD’s material is not just that Mantle is a low level API, but rather Mantle is the low level API. As in it’s either a direct copy or a very close derivative of the Xbox One’s low level graphics API.

Holy mother of God can you imagine what is going to happen if this turns out to be the case?
 
... because they said "will definitely compete with", not "will definitely be faster than". :?:

You're putting words in his mouth there... If he had said "it will definitely be faster than the 780", some would have concluded "so, not faster than Titan". Let's wait and see.

I'm not too worried about the Firestrike score... I don't remember a card ever showing its full potential in a 3DMark benchmark with early drivers or the release press driver.
 
... because they said "will definitely compete with", not "will definitely be faster than". :?:
Compete with the 780/Titan = slower than 780 in your eyes?

It seems to me that it will compete with the Titan in performance and compete with the 780 in price (complete speculation).
The sentence you quoted does not contain enough information to jump to the conclusion you did.
 
It seems to me that it will compete with the Titan in performance and compete with the 780 in price (complete speculation).
The sentence you quoted does not contain enough information to jump to the conclusion you did.

Fair enough, Wynix. Anyway, we shall soon see. I hope it will be plenty fast.
 
Holy mother of God can you imagine what is going to happen if this turns out to be the case?
I wonder how involved MS is with XBO's low level graphics API. If it's entirely AMD's baby, then they could very well have a similar API for PS4 as well.

If that's the case, then even multiplatform titles will have reason to use Mantle, and you could well be right about the huge implications.

Remember when NVidia said they basically let AMD take the console business?
http://uk.gamespot.com/news/ps4-not-worth-the-cost-says-nvidia-6405300
That could really bite them in the ass now. I wonder if they burned any bridges with Sony over the PS3 getting a rather outdated architecture while NVidia tried to milk profits. Didn't they have a dispute with MS over the first Xbox?
 
I figured out one way for those leaked benches to be true.
In this review the 290X is clocked at 1020 MHz.
I just calculated the clock frequency implied by 5 TFLOPS to be about 888 MHz: 5×10^12 FLOPS / (2 FLOPs per clock × 2816 shaders) / 10^6 ≈ 888 MHz.

No. It's >1 GHz at stock; that's more or less a given from the >4G tris/sec slide (or come up with a good explanation for them doing 5 or 6 triangles per clock, instead of the doubling from 2 to 4 ;) ).
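A quick sketch of the arithmetic behind that point, assuming the 4-triangles-per-clock front end (doubled from Tahiti's 2) that the slide implies:

```python
# If the slide claims >4 gigatriangles/sec and the front end handles
# 4 triangles per clock, the core clock must exceed 1 GHz.
TRIS_PER_SEC = 4e9
TRIS_PER_CLOCK = 4

min_clock_ghz = TRIS_PER_SEC / TRIS_PER_CLOCK / 1e9
print(f"minimum clock: {min_clock_ghz:.2f} GHz")  # 1.00 GHz

# Conversely, at the ~888 MHz implied by the 5 TFLOPS figure, hitting
# 4 Gtris/s would need more than 4 triangles per clock:
tris_per_clock_needed = TRIS_PER_SEC / 888e6
print(f"triangles/clock needed at 888 MHz: {tris_per_clock_needed:.1f}")  # ~4.5
```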
 
Remember when NVidia said they basically let AMD take the console business?
http://uk.gamespot.com/news/ps4-not-worth-the-cost-says-nvidia-6405300
That could really bite them in the ass now. I wonder if they burned any bridges with Sony over the PS3 getting a rather outdated architecture while NVidia tried to milk profits. Didn't they have a dispute with MS over the first Xbox?

I have seen quotes from Nvidia saying how they "let AMD win the consoles" and how it wasn't worth it to make a chip, but who else could have supplied a combined CPU/GPU with the levels of performance required?

Both MS and Sony realised that it was a pain in the ass to create the tools for custom hardware so that developers could create code and content. It was actually a barrier to games development. By going to x86/x64 and a PC-style GPU on the same chip, most of the heavy lifting is already done. The console devs simply transition to the tools and ecosystem that already surrounds the PC space.

Even Intel, had they been willing to make such a chip, would have wanted too much cash to do that instead of their profitable CPU lines, and Intel would not have been able to provide anything more than a mid-level GPU on the same chip.

Nvidia, on the other hand, could provide a GPU, but not an x64 CPU.

There really was no other choice than AMD to bring the lost console sheep back into the fold of the glorious master gaming race of the PC.
:D
 
nVidia has long been rumored to have an x86 chip in the works. At least I think I remember them having access to an x86 license somehow, leading to lots of speculation along those lines. But they still could have created a high performance ARM-based solution if they really wanted to compete for the console contracts.
 
Remember when NVidia said they basically let AMD take the console business?
That was a "need to save face" statement. Not long before that, prior to the confirmation that AMD was in 'em all, Jensen was stating "we'll be in one of them, right, because it's impossible for one company to do three".
 
Even Intel, had they been willing to make such a chip, would have wanted too much cash to do that instead of their profitable CPU lines, and Intel would not have been able to provide anything more than a mid-level GPU on the same chip.
But that's just a rumor you've just created ;)
AMD's desktop APUs are not faster than Haswell, yet they created an APU for consoles that is way more powerful. Why would Intel not be able to stuff some more shading resources into their iGPU?
 
But that's just a rumor you've just created ;)
AMD's desktop APUs are not faster than Haswell, yet they created an APU for consoles that is way more powerful. Why would Intel not be able to stuff some more shading resources into their iGPU?

Because until now they didn't want to, or didn't need to. But we could even see a Super Nintendo 2 with an Iris Pro 2. Intel is getting hammered all around.
 