Next generations of low-end graphics cards

I half-expect AMD will just leave Cape Verde in production through 2013 to serve as a low-end discrete card and Hybrid CrossFire solution for Kaveri. It is only 123mm².

I expect that more than half. :D

Seriously though, that's what they did last gen: Juniper was in production for nearly the entire useful life of the 40nm process. I expect them to do the same this time. It's good enough for its segment that it's very questionable whether any update they could make to it without a shrink would be worth the cost of new masks. The price point might come down a little as the generation progresses, but that's about it.
 
I think it's crazy that a high-end card from 2009 (the 4890) is already in "legacy support" at AMD. Nvidia, it seems, will from now on keep supporting the 8800, which is more acceptable given how badly the 7 series holds up nowadays, while the 8800 and higher are still adequate.

I couldn't agree more with you there, but it seems that DAMN never considered its $239 part (at release, iirc) to be top-notch, just upper mainstream. And the HD 4890's great performance compared to HD 5000/HD 6000 parts certainly didn't warm AMD's heart, given their empty pockets :LOL:

Nvidia always puts high prices on its parts, and they rebranded their 8800 GT/GTS (G92a/b) into a zillion new brands spanning from the end of 2007 to mid-2009, while the products were in fact retailing long into 2010 since no mainstream Ferminator parts were available. So Nvidia's highly infamous rebranding and lack of architectural changes every two years, accompanied by traditionally high prices, might come in handy when driver support is the issue :rofl:

Long-term support (longer than 3 years) from AMD/ATi is something we could only expect for GCN-based products, considering how high they were positioned at introduction :rofl: They will also reuse that architecture heavily over the next few years, so driver support could easily be maintained, but I'd better trim my expectations considering how bad ATi driver support has traditionally been.


I half-expect AMD will just leave Cape Verde in production through 2013 to serve as a low-end discrete card and Hybrid CrossFire solution for Kaveri. It is only 123mm².

I expect that even those chips will be upgraded, considering the low-power optimizations in GCN gen 2. But that might be a bit too much to expect, as AMD will only release its first HD 8800-based cards in Jan/Feb and the high-end HD 8900 god knows when (April?). So expecting some new chip to serve as a sub-$100 GPU might be a bit much, but I'm hoping AMD really finds the time to redesign it sooner.

Even if they just made a smaller chip with a full 512:32:16 setup (different from Cape Verde's 640:40:16) that is capable of running at much higher clocks, 1200-1300MHz with room for OC of course. It could play better with Kaveri (hopefully GCN gen 2), and at 100-110mm² on a far more mature 28nm node it could certainly allow sub-$80 cards at introduction.

But then they might look with high regard at Haswell's GT3 graphics, and the DAMN CEO might think of this chip redesign as redundant (as they usually do). But Haswell with fully enabled GT3 graphics should be priced around $400, so there's more than a wasteland of marketing space for an $80 GPU that could easily beat the measly GT2 in Haswell, Ivy Bridge etc. both performance-wise and pretty-picture-wise, and be on par with GT3 and its performance. But someone who has already paid $400 for a CPU probably won't go with an $80 GPU, and the GT3 inside Haswell will suit him more than well.
 
I believe that a Cape Verde successor, if it could be brought down to 100mm² (which would be a 20% size reduction), should be seriously considered, as 28nm HKMG allows huge speed margins and a proper redesign would allow 1250-1330MHz out of the box.

One chip without the need for salvage counterparts would be more beneficial than today, where Cape Verde is binned into 512SP/640SP parts across mobile and desktop: one 512SP part serving both mobile and desktop, plus one lower-clocked low-power part for the mobile market. But that seriously depends on how small this part could get just by removing 2 CUs (128 SPs + 8 TMUs) while leaving the 128-bit memory bus intact. But hey, I would be satisfied with an odd 3-RBE setup if that helped bring it down to 100mm².

I think you overrate the price of masks, provided this job could be done before Kaveri launches, up until mid-February 2013. They could cut expenses on time-consuming binning, and a 20% smaller chip means roughly 20% more dies per wafer; alternatively, the 20% of dies damaged during production could simply be scrapped without sacrificing market share, unlike today.
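Just to put numbers on that, here's a quick back-of-envelope in Python; it's an area-only sketch (300mm wafer, no edge loss or defect-yield modelling), not a real yield calculation:

[code]
import math

wafer_area = math.pi * (300 / 2) ** 2    # 300mm wafer: ~70,686 mm^2

for die_area in (123, 100):              # Cape Verde vs. a 100mm^2 successor
    print(f"{die_area} mm^2 -> ~{int(wafer_area // die_area)} gross dies per wafer")

# 123 mm^2 -> ~574 dies; 100 mm^2 -> ~706 dies, i.e. about 23% more.
[/code]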

So a 512:32:16 (4 RBE) or 512:32:12 (3 RBE) part with 128-bit GDDR5, running at 1250-1300MHz (reasonably affordable for the 28nm node), would be a much better part than Cape Verde's 640:40:16 modestly clocked at 1100MHz, which was designed for an immature 28nm process when they didn't know how well chip clocks would scale. The only question is how many more transistors a revamped GCN gen 2 on more mature 28nm could cram into a smaller space. I'd bet a lot. Pure removal of 2 CUs would mean only a ~16mm² smaller die, so 1 RBE should be removed along with them. At those high chip clocks, 3 RBEs should be more than sufficient to drive up to four 1080p displays (like Juniper/HD 5770). And even at those high clocks it could be rated under the 80W TDP that Cape Verde XT has.
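To spell out why cutting 2 CUs alone doesn't get to 100mm², here's the same arithmetic made explicit; all figures are the guesses above, not floorplan data:

[code]
cape_verde = 123        # mm^2, shipping Cape Verde die
two_cus    = 16         # mm^2, the claimed cost of 2 CUs (128 SPs + 8 TMUs)
target     = 100        # mm^2, the hoped-for successor

after_cu_cut = cape_verde - two_cus       # 107 mm^2: CUs alone fall short
from_rbe_etc = after_cu_cut - target      # ~7 mm^2 has to come from 1 RBE etc.
print(after_cu_cut, from_rbe_etc)         # 107 7
[/code]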

But then, AMD has lacked execution for decades, and there's a 0.02% chance they would even think this over, busy as they are with CEO bonuses, large layoffs and other crappy stuff a real firm shouldn't put so much effort into.
 
Execution of "AMD" GPU programs is measurably better than "ATI". Metrics prove that.
This.

From a technical execution perspective they have been exceptional. It is very hard to see how the GPU department of AMD could have done much better than they have.
 
Makes you wonder why things are so bad on the CPU side.
On the GPU, they fight with roughly the same weapons: same process, somewhat similar amount of engineers, etc.

On the CPU, they take a knife to a gun fight. Overwhelmed in every possible way.
 
On the GPU, they fight with roughly the same weapons: same process, somewhat similar amount of engineers, etc.

On the CPU, they take a knife to a gun fight. Overwhelmed in every possible way.

I'm sure that's a factor, but they do seem to frequently screw up in ways that aren't obviously related to their relatively smaller resources.
 
On the GPU, they fight with roughly the same weapons: same process, somewhat similar amount of engineers, etc.

On the CPU, they take a knife to a gun fight. Overwhelmed in every possible way.

This completely reminds me of the interview with E. Demers about GCN, when he was asked what he thinks about Bulldozer.

Eric: Its architecture design was created by a completely separate team [from Bulldozer], but there is a lot of overlap between the tech managers, and what they're using for boosting their performance is the same thing we're using in PowerTune. There is some crossbreeding there. We talk a lot with them about the shader design core and leveraging technology from Bulldozer; Bulldozer is an x86 architecture, it is completely different from a GPU architecture. We're wide and, relatively speaking, slow compared to theirs - what, 4.5GHz rates, we're running 925 - and a completely different process technology. They're running at GLOBALFOUNDRIES, we're at TSMC - it's very hard to compare those two things; different business units, different PR people, different engineers, different everything.


Eric: I actually like the Bulldozer design, I think particularly the revisions that are upcoming are going to be pretty good for it. It's not a bad CPU, it’s just that the competition is very good there. We [Graphics division] have the advantage with the competition being somewhat on-par with our current designs, in performance/$$ we’re probably still ahead of them. This part is another salvo in that continuous war. We don't have alien process technology like Intel does [laughing], thankfully we're not competing directly with them. We're actually competing with guys that have exactly the same process technology as us, so we feel really comfortable about going for it. In fact, right now, I wish we had more time (of course) with it before we introduced it but this is looking to be a rock solid product. Everybody has met their expectations and everybody is happy with their performance.
But there's also a big difference between ATI at its beginning and the GPU market at that time, versus what it has become now. In the 9700 Pro period (one of my first ATI cards), there was less marketing, and game development was surely not the same as it has become now.
 
So a 512:32:16 (4 RBE) or 512:32:12 (3 RBE) part with 128-bit GDDR5, running at 1250-1300MHz (reasonably affordable for the 28nm node), would be a much better part than Cape Verde's 640:40:16 modestly clocked at 1100MHz, which was designed for an immature 28nm process when they didn't know how well chip clocks would scale. The only question is how many more transistors a revamped GCN gen 2 on more mature 28nm could cram into a smaller space. I'd bet a lot. Pure removal of 2 CUs would mean only a ~16mm² smaller die, so 1 RBE should be removed along with them. At those high chip clocks, 3 RBEs should be more than sufficient to drive up to four 1080p displays (like Juniper/HD 5770). And even at those high clocks it could be rated under the 80W TDP that Cape Verde XT has.

I think they should also work to improve the memory clock; the 7770 is limited to 4.5GHz, which is lower than the 5750... Nvidia cards at the same price point have memory at 5-5.4GHz.
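For reference, the bandwidth gap those memory clocks imply on a 128-bit bus; simple arithmetic, nothing measured:

[code]
# GB/s = (bus width in bits / 8) * effective Gbps per pin
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(128, 4.5))   # HD 7770 as shipped: 72.0 GB/s
print(bandwidth_gbs(128, 5.4))   # cards at 5.4 Gbps: 86.4 GB/s
[/code]

That's a 20% bandwidth deficit at the same bus width.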

I would also like to see a GCN card with maybe a 256:16:8 setup, maybe even a 64-bit memory bus (as long as it uses GDDR5 at a high clock), to decently replace the DDR3-based 6450-6670 with lower power usage.

But their focus on "APUs" will probably limit what they do for low-end cards!?
 
I think they should also work to improve the memory clock; the 7770 is limited to 4.5GHz, which is lower than the 5750... Nvidia cards at the same price point have memory at 5-5.4GHz.

I would also like to see a GCN card with maybe a 256:16:8 setup, maybe even a 64-bit memory bus (as long as it uses GDDR5 at a high clock), to decently replace the DDR3-based 6450-6670 with lower power usage.

But their focus on "APUs" will probably limit what they do for low-end cards!?

Cape Verde is a chip made in a rush to have a budget offering in the overpriced HD 7000 series lineup, so it's good considering how small it is and how little time they probably spent developing it. They probably just cut Pitcairn in half to make it work :LOL:

Exactly, it's easier to make you upgrade the whole platform that way. But hey, Intel has been doing it longer with its Nucleus than AMD with its Fusion/Vision slideware is supposed to. Anyway, it's bad not to have real budget cards any more just a few months after new tech comes to market. They wisely estimated that it would spoil their profits, I guess.

And as I said, most people don't care about the inferior graphics they get bundled as Intel GT2/GT1 inside Sandy Bridge or Ivy Bridge, so they wouldn't buy those cards anyway. The only ones who care are some people with five-year-obsolete rigs, and for them AMD offers APUs. Although Intel will take the bigger market share with their CPU+GT while offering the inferior graphics experience.
 
I must have lost a post somewhere, but I'm now on the 7600GT; I've only run silly things on it, and they were butter smooth (Google Earth, Warcraft III, Quake 3 engine). I intend to keep running Linux exclusively (I would never dual boot, so I made a hard choice some time ago), but I'm really waiting for the free Team Fortress 2 we were promised.

It's a card I had given away years ago but got back, not used any more ...blah-blah... /edit: I'm left out of Team Fortress 2 on Linux, it complains about a missing OpenGL feature. So I'd be fine on Windows with DirectX 9 gaming, but it seems that for running ported old games on Linux a modern GPU will be needed.
 
AMD will be launching a small GCN v2.0 GPU with 384 units :) and a 128-bit bus.

http://www.hardware.fr/news/12818/amd-devoile-mars-radeon-hd-8000m.html



The rationale for it is laptops. There are laptop APUs, but apparently staying in the market for laptops with an Intel CPU + separate GPU is worth it. The 128-bit bus is said in the news to be useful for impressing buyers with numbers, such as "2GB dedicated video memory".
Maybe GK208 will be pitted against it; I speculate it has 192 units and a 128-bit bus as well, and would likewise be the first GPU of the refresh generation to come out.
Nvidia may be less in a hurry though; they already have GF117 (GeForce GT 620M), that laptop-only 28nm GPU (it doesn't do display output).
 
AMD will be launching a small GCN v2.0 GPU with 384 units :) and a 128-bit bus.
This is still really GCN 1.0.

http://www.hardware.fr/news/12818/amd-devoile-mars-radeon-hd-8000m.html
The rationale for it is laptops. There are laptop APUs, but apparently staying in the market for laptops with an Intel CPU + separate GPU is worth it. The 128-bit bus is said in the news to be useful for impressing buyers with numbers, such as "2GB dedicated video memory".
For 2GB of dedicated video memory a 64-bit bus would be just enough (as you can easily get 4Gbit 16-bit DDR3 chips). A 128-bit bus is, however, an absolute necessity if you want to get significantly more performance out of it than an Ivy Bridge IGP with DDR3 (and 64-bit GDDR5 isn't all that impressive, as you're pretty much limited to 512MB). hardware.fr is saying, though, that the 8500M and 8600M are limited to 64-bit memory; I'm not sure I believe that yet (but if so, you should stay far, far away from them). It could be true, though, in which case the seemingly completely overlapping (performance-wise) 8500M, 8600M and 8700M series wouldn't actually overlap...
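A quick sanity check on that 2GB claim, taking the 4Gbit x16 DDR3 chips above as a given:

[code]
bus_bits, chip_io_bits, chip_gbit = 64, 16, 4   # 64-bit bus, 4Gbit x16 DDR3

chips  = bus_bits // chip_io_bits    # 4 chips fill the 64-bit bus
gbytes = chips * chip_gbit / 8       # 16 Gbit total = 2.0 GB
print(chips, gbytes)                 # 4 2.0
[/code]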
 