AMD: Southern Islands (7*** series) Speculation/Rumour Thread

I *really* hope you're right, cos I'm itching to replace this card now, but I'm seeing specs that are falling fast (e.g. ROPs 64 -> 32, memory 8 GHz -> 5.5 GHz effective, etc.). I know this is all rumour and nothing is confirmed, but with the best of the early specs it was looking like as much as a 100% improvement over a 6970, and that has been whittled down in pretty much every way by now.
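
To put rough numbers on the bandwidth part of that, here's a back-of-envelope sketch treating those "GHz" figures as effective per-pin data rates. The 384-bit bus for the new card is only a rumour; the 6970's 256-bit bus is the known spec.

```python
# Back-of-envelope memory bandwidth: effective rate per pin (Gbps) * bus width (bits) / 8.
# The 384-bit bus for the new card is an assumption based on the rumours, not a confirmed spec.

def bandwidth_gbs(rate_gbps: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s."""
    return rate_gbps * bus_bits / 8

hd6970       = bandwidth_gbs(5.5, 256)   # current card: 5.5 Gbps on a 256-bit bus
rumour_early = bandwidth_gbs(8.0, 384)   # early "8 GHz" rumour, assumed 384-bit bus
rumour_late  = bandwidth_gbs(5.5, 384)   # later "5.5 GHz" rumour, assumed 384-bit bus

print(f"HD 6970:          {hd6970:.0f} GB/s")
print(f"8 Gbps rumour:    {rumour_early:.0f} GB/s ({rumour_early / hd6970 - 1:+.0%} vs 6970)")
print(f"5.5 Gbps rumour:  {rumour_late:.0f} GB/s ({rumour_late / hd6970 - 1:+.0%} vs 6970)")
```

On those assumptions the drop is from roughly double the 6970's bandwidth to about +50%, which is why the falling specs sting.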

A £500 card had better perform close to a 6990, or it's gonna be a laughing stock.

Don't listen to bad rumors then. You should have never taken the XDR2 rumor seriously...

If it's consistently 20% faster than the GTX 580, what does that leave to question?
Dual cards.
 
They are gonna shoot themselves in the foot, the leg, the arse and the head if that's the actual price. Only the super wealthy/fanbois will spend £500 to upgrade 20% from a 6970 or a 580, especially when the new NV card is expected only two or so months away. I've held out for 3.5 years for a card that's gonna give me beneficial increases over my 4870X2, at a decent price point. The 7970 is, so far, NOT that card.

Who is going to shoot themselves in the foot? The gouging e-tailers? They get some hits and maybe even some pre-sales from those you describe. I doubt they give a crap about your opinion of them.

Ultimately it's going to be priced where its performance puts it (if it's the same performance as a GTX 580, they aren't going to sell it for significantly less just because they can). If you're expecting them to sell a higher-performance part at a significant discount over currently selling parts, you're going to be disappointed a lot. Limited availability and gouging at launch are normal and expected, and websites throwing up ridiculous prices for products they don't have in stock is nothing new. When the major/respectable e-tailers have stock, the price will move to the MSRP.

<Edit> and if you were looking for a 100% improvement over the 6970, you were destined to be disappointed.
 
I think full-speed FP16 texture filtering isn't really helpful these days - with the exception of 3DMark, of course.

It helps with auto-mipmapped FP16 rendertargets which you access filtered. I only see this use case becoming more common, not less. We use filtered access for bokeh, godrays, even AO (single-channel FP16 intermediate buffer).
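
For anyone unsure what "filtered access" to an FP16 buffer means in practice, here's a toy software analogue in Python/NumPy of the bilinear fetch the texture units do in hardware. The buffer and coordinates are purely illustrative, not from any real engine.

```python
import numpy as np

# Toy analogue of a bilinearly filtered fetch from a single-channel FP16
# render target (the kind of access described above for bokeh/godrays/AO).
rng = np.random.default_rng(0)
fp16_rt = rng.random((256, 256), dtype=np.float32).astype(np.float16)

def sample_bilinear(rt: np.ndarray, u: float, v: float) -> float:
    """Bilinearly filtered fetch at normalised texture coordinates (u, v)."""
    h, w = rt.shape
    x, y = u * (w - 1), v * (h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top    = (1 - fx) * float(rt[y0, x0]) + fx * float(rt[y0, x1])
    bottom = (1 - fx) * float(rt[y1, x0]) + fx * float(rt[y1, x1])
    return (1 - fy) * top + fy * bottom

print(sample_bilinear(fp16_rt, 0.37, 0.81))
```

Doing this per-tap in the shader instead of in the texture unit is exactly the cost that full-rate (vs. half-rate) FP16 filtering hardware is meant to hide.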
 
I think if the memory has been designed to a specification that allows for a modest number of traces across long distances, a compliant device can't just sprout hundreds of extra outputs just because an interposer can support much higher pitch density.
 
Interposers will do very very little for performance without a new wide-IO memory standard, they can allow lower power or slightly higher clock speeds with existing memory standards ... but it's not going to be a big leap.

Their first use will probably be in high-margin mobile phones ... not graphics cards, unless AMD or NVIDIA decides to push their own custom RAM.
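
As a very rough illustration of why the memory standard matters more than the interposer itself, here's a hedged back-of-envelope comparison; every figure for the hypothetical wide-IO stack (pin count, per-pin rate) and the interposer clock bump is an assumption picked for illustration, not a real spec.

```python
# Same peak-bandwidth arithmetic, comparing a conventional GDDR5 setup with a
# hypothetical wide-but-slow interface that an interposer could host.
# All wide-IO and interposer-clock figures below are illustrative assumptions.

def bandwidth_gbs(rate_gbps_per_pin: float, data_pins: int) -> float:
    return rate_gbps_per_pin * data_pins / 8

gddr5_on_pcb        = bandwidth_gbs(5.5, 384)    # 384-bit GDDR5 @ 5.5 Gbps (rumoured config)
gddr5_on_interposer = bandwidth_gbs(6.5, 384)    # same standard, assumed slightly higher clock from shorter traces
wide_io_hypothetical = bandwidth_gbs(1.0, 4096)  # made-up wide-IO stack: 4096 slow pins @ 1 Gbps

print(f"GDDR5 on PCB:          {gddr5_on_pcb:.0f} GB/s")
print(f"GDDR5 on interposer:   {gddr5_on_interposer:.0f} GB/s (modest gain)")
print(f"Hypothetical wide IO:  {wide_io_hypothetical:.0f} GB/s (needs a new memory standard)")
```

Keeping the existing standard only nudges the numbers; the big jump comes from a wide interface that the standard itself would have to define.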

I am assuming that the new memory standard is being worked on in the background, to go with this interposer. Intel's going big on interposers with Haswell, so there must be more to it than meets the eye.

Or it might just all be wishful thinking.
 
Dual cards.

Exactly. ;)
HD 6990 can be bought for € 665, all the way down to € 556 incl. VAT.
So, if that card is at € 696, or even € 625, it's logical to assume that it leaves the HD 6990 in the dust.
I can't understand such ridiculous pricing, or who would go for it. Whoever is responsible should be sentenced. :mrgreen: I thought it was logical to expect the new manufacturing node to bring the same performance to lower market segments.

What happened to the liquid chamber coolers? Are they present, or is that another false rumour? :???:
 
It helps with auto-mipmapped FP16 rendertargets which you access filtered. I only see this use case becoming more common, not less. We use filtered access for bokeh, godrays, even AO (single-channel FP16 intermediate buffer).
I know it's useful, but the typical amount of Int8 filtering is still several times higher.
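
To make that concrete, a tiny hedged estimate: if Int8 fetches outnumber FP16 fetches by, say, 4:1 (a made-up ratio, purely for illustration), full-rate FP16 filtering only buys back a modest slice of the total filtering time.

```python
# Toy estimate: if most texture fetches are still Int8, full-rate FP16 filtering
# speeds up the overall filtering workload only modestly. The 80/20 mix is an
# illustrative assumption, not measured data.

def filtering_cost(int8_share: float, fp16_share: float, fp16_rate: float) -> float:
    """Relative filtering cost; Int8 runs at full rate, FP16 at fp16_rate of full."""
    return int8_share / 1.0 + fp16_share / fp16_rate

half_rate_fp16 = filtering_cost(0.8, 0.2, 0.5)  # FP16 filtered at half rate
full_rate_fp16 = filtering_cost(0.8, 0.2, 1.0)  # FP16 filtered at full rate

print(f"Speedup from full-rate FP16 filtering: {half_rate_fp16 / full_rate_fp16:.2f}x")
```

On that assumed mix the gain is around 1.2x on filtering alone, and less still on total frame time.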
 
[Attached slides: amdradeonhd7000seriesgbom9.jpg, amdradeonhd7900_anaozjfr3c.jpg]

http://translate.google.com/transla...0-serisi-icin-alti-cizilen-ana-ozellikler.htm
 
I hope ZeroCore works even when Eyefinity is enabled. 3W idle power sounds really good, especially in the summer.
 
ZeroCore is active when the display turns off, or on additional CrossFire cards with no attached displays.
ZeroCore could probably also be used with Virtu, if you attach your displays to the IGP.

With multiple displays (at different timings) attached and active, we'll probably see up to 100 W idle consumption, especially with 3 GiB of GDDR5 @ 5.5 Gbps and 4 billion transistors.
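
For a sense of scale, a quick hedged calculation of what the 3 W vs. ~100 W idle difference means over a year; the idle hours per day and the electricity price are assumptions picked just for illustration.

```python
# Rough yearly cost/heat difference between 3 W and 100 W idle.
# Idle hours per day and the electricity price are illustrative assumptions.

IDLE_HOURS_PER_DAY = 8        # assumption
PRICE_PER_KWH_EUR = 0.25      # assumption

def yearly_kwh(watts: float) -> float:
    return watts * IDLE_HOURS_PER_DAY * 365 / 1000

for watts in (3, 100):
    kwh = yearly_kwh(watts)
    print(f"{watts:>3} W idle: {kwh:6.1f} kWh/year, ~€{kwh * PRICE_PER_KWH_EUR:.0f}/year, "
          f"all dumped into the room as heat")
```

The euros are small either way; the heat in a small room in summer is the part you actually notice.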
 
Exactly. ;)
HD 6990 can be bought for € 665, all the way down to € 556 incl. VAT.
So, if that card is at € 696, or even € 625, it's logical to assume that it leaves the HD 6990 in the dust.
I can't understand such ridiculous pricing, or who would go for it. Whoever is responsible should be sentenced. :mrgreen: I thought it was logical to expect the new manufacturing node to bring the same performance to lower market segments.


You always pay a premium for single-card performance, since you don't have to rely on CrossFire profiles and the other annoyances of dual-GPU cards.

If it's faster than the GTX 580, it will cost the same or more than the GTX 580. We don't have to like it, but that's the way it will be.
 
If it's faster than the GTX 580, it will cost the same or more than the GTX 580. We don't have to like it, but that's the way it will be.

Only if they are targeting similar or lower sales volumes as well. Which, depending on production capability/yields, may well be the case.
 