AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 lineup of GPUs?

  • Within 1 or 2 weeks

    Votes: 1 0.6%
  • Within a month

    Votes: 5 3.2%
  • Within couple months

    Votes: 28 18.1%
  • Very late this year

    Votes: 52 33.5%
  • Not until next year

    Votes: 69 44.5%

  • Total voters: 155
  • Poll closed.
Ah yes, duly noted. I neglected to mention that my brother is using a slimline tower from Dell, and finding half-height 5670's hasn't been easy so far. Or else, I'm just not looking in the right places.
AMD LP cards (not the low-end which come at that size pretty much always) seem to be even more rare than nvidia LP cards. I think you can get GT240 (without power connector), 9600GT, 9800GT (at least for the 9600GT even versions without power connector exist), and even the odd GTS250 as low profile cards. Though I don't know how feasible (except the GT240) those are, since obviously there's a difference in cooling you'll need if the card uses 40W or 75W (or even more), which could be a problem in such cases.
I thought there were rumors about a lp HD4670 but I've never seen one, though I guess you don't want that any longer anyway with the HD5570 now (almost) available. I guess that could indeed be an excellent choice with its low power draw for such small cases. Probably similar in performance to the ddr3 gt240, cheaper and a bit lower power draw.
 
For non-graphics I can understand a new chipset, but why a new socket? We know Bulldozer is using the same socket as Magny-Cours (G34), so why change the socket?
 
The 5570 seems full of the same fail as the 5450.

It seems to get its butt kicked by similarly priced Nvidia cards.

The GT 240 is not only faster, but costs the same, uses less power, and has been out for a while. The only downside to the card is that it's DX10 and not DX11. But it seems the 5570 is barely able to play older games, let alone newer games.

I really don't understand what ATI is doing with these two cards. The Radeon 5570 and the 5450 should not exist.

http://www.anandtech.com/video/showdoc.aspx?i=3738&p=1

Conclusion

With AMD's positioning of the Radeon HD 5570 in the marketplace, you can get a few very different outcomes depending on what you’re looking for. As a video-only HTPC card, it’s no better than the 5450 in features, while it’s worse in terms of power consumption and noise. Based on our research the 5570 isn’t the HTPC über card we were expecting it to be, so if you can bear the limitations of the 5450, that’s going to be the better card. Otherwise the 5670 is the most capable choice out there. The 5570 does nothing better than either of those two cards when it comes to HTPC use.

Meanwhile if we take a look at overall performance, the 5570 doesn't fare much better. The move from GDDR5 to DDR3 has a significant impact on the performance of the Redwood GPU in most cases, bringing the 5570 well below the 5670 and similar cards. The lower end of the 5000 series has been consistently overpriced when it comes to overall performance, and the 5570 is no different. The GDDR3 9600GT can be found for around the same price point, and is anywhere from just as fast as the 5570 to completely clobbering it. The 5570 can't compete amidst that much of a memory bandwidth gap. If you can fit a full-sized card, you can do much better than the 5570 when it comes solely to performance; the 9600GT and the GT 240 are both much more capable cards for the $80-$85 price tag.
 
Regarding power, it seems Anandtech got a very power-hungry sample (and one can argue how representative FurMark/OCCT really is for load usage / efficiency).
http://www.computerbase.de/artikel/...adeon_hd_5570/15/#abschnitt_leistungsaufnahme
http://www.tomshardware.com/reviews/radeon-hd-5570,2552-12.html
http://www.hardwarecanucks.com/foru...5570-1gb-ddr3-single-crossfire-review-19.html

Performance is very much like the 4670, as expected from those specs - but the DX11 advantage isn't all that big there, and neither is the (absolute) power difference. It has to come much closer in price (but it's probably slightly more expensive to produce too).
But it's faster than the common GT240 DDR3 - the GDDR5 version (benchmark edition? ;)) is more expensive than the 5670 here.
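To put the memory-bandwidth gap in numbers, here's a quick back-of-the-envelope calculation. The clock figures below are the commonly quoted retail specs for these cards and should be treated as assumptions, not measured values:

```python
# Peak memory bandwidth: bus width (in bytes) x effective data rate.
# Clock figures are assumed retail specs, not measurements.
def bandwidth_gbps(bus_bits, effective_mts):
    """Peak bandwidth in GB/s for a given bus width and MT/s rate."""
    return bus_bits / 8 * effective_mts * 1e6 / 1e9

cards = {
    "HD 5570 (128-bit DDR3, ~1800 MT/s)": bandwidth_gbps(128, 1800),
    "GT 240 (128-bit GDDR5, ~3400 MT/s)": bandwidth_gbps(128, 3400),
    "9600GT (256-bit GDDR3, ~1800 MT/s)": bandwidth_gbps(256, 1800),
}
for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")   # ~28.8, ~54.4, ~57.6 GB/s
```

Roughly a 2x gap between the DDR3 5570 and the 9600GT / GDDR5 GT240, which is consistent with the 5570 only matching the DDR3 GT240.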
 
For non-graphics I can understand a new chipset, but why a new socket? We know Bulldozer is using the same socket as Magny-Cours (G34), so why change the socket?

Because G34 is designed for the future which makes sense since it goes on sale in the future. The current sockets are not optimized for either future servers or future integrated designs. For instance, the current sockets don't have the video outputs that would ideally be available for an integrated design. Likewise, it is reported that G34 will increase the number of memory channels to support increasing core counts and bandwidth requirements.
 
It may be a bit expensive now, but when DDR3 really becomes the commodity RAM and older GPUs get phased out, it will be a good cheap card: a replacement for the 4650, with twice the bandwidth.

It looks perfect for upgrading all those Pentium 4 3GHz machines with a Radeon X300SE, or for cases where you don't need huge GPU power but something similar to a current-gen console would be better than an IGP.

Funny that you mention the GT240; it's an equally good card but too expensive as well (GDDR3 models priced the same as a 9600GT and GDDR5 models priced like a GTS 250 - bad value!).
 
The 5570 seems full of the same fail as the 5450.

It seems to get its butt kicked by similarly priced Nvidia cards.

The GT 240 is not only faster, but costs the same, uses less power, and has been out for a while. The only downside to the card is that it's DX10 and not DX11. But it seems the 5570 is barely able to play older games, let alone newer games.

I really don't understand what ATI is doing with these two cards. The Radeon 5570 and the 5450 should not exist.

http://www.anandtech.com/video/showdoc.aspx?i=3738&p=1

I think you're missing the point. The 5570 is faster than the 4650, about as fast as the 4670 and close to the GT 240, while costing much less to produce: Redwood is about 100mm² and it only uses cheap standard DDR3.

At the current $85 price tag, it's pretty much pointless unless you have a low-profile case, but it's supposed to (and most likely will) drop much much lower.

Add DX11, Eyefinity and audio bitstreaming to that, and you get something quite appealing for HTPC users who want some casual gaming, OEMs, etc.
 
Add DX11, Eyefinity and audio bitstreaming to that, and you get something quite appealing for HTPC users who want some casual gaming, OEMs, etc.
But ...
- DX11: the card is unable to play current DX10 titles. Will it be able to run ANY game at DX11 quality (i.e. all maxed)?
- Eyefinity: it won't be able to run games in such a mode. It's useless for watching films, so that leaves what? Professionals using 3 monitors? Why not the 5450? It should be good enough for 2D.
And I read that currently one needs a $100 dongle or a new, expensive monitor...
- HTPC? Reading Anandtech I see that the 5450 and 5570 have problems doing any video stream post-processing. I really don't get what the problem is there...
 
But ...
- DX11: the card is unable to play current DX10 titles. Will it be able to run ANY game at DX11 quality (i.e. all maxed)?
- Eyefinity: it won't be able to run games in such a mode. It's useless for watching films, so that leaves what? Professionals using 3 monitors? Why not the 5450? It should be good enough for 2D.
And I read that currently one needs a $100 dongle or a new, expensive monitor...
- HTPC? Reading Anandtech I see that the 5450 and 5570 have problems doing any video stream post-processing. I really don't get what the problem is there...

— DX11 can bring performance improvements through multi-threading, so in this regard it may help. For the most part, it will just be a checkbox feature, though.
— I'm guessing old games may run well with Eyefinity, and office work with 3 monitors is always nice. I suspect the price difference between the 5450 and the 5570 will eventually be so small that the 5570 will be worth it even if you just run a game once a month. But again, Eyefinity is mostly a checkbox feature here.
— Yes, the 5570 can't do Vector Adaptive Deinterlacing, but neither can other low-end cards. It's not that big a deal, though I hope AMD makes this option available in the drivers. If the 5670 can do it, is it really that big of a stretch for the 5570?


So yes, DX11 and Eyefinity aren't really going to make a big difference, but this card's main advantage is that it costs very little to make.

If I'm not mistaken RV730 is about 130mm². With 100mm², Redwood has the advantage. And they use the same kind of memory: standard DDR3. Right now, the cheapest 4650 on Newegg sells for $50, or $56 with shipping. There's even one selling for $45 with free shipping, plus a $15 mail-in rebate—that's $30 after MIR.

So that's where we can expect the 5570 to eventually drop, if not lower.
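The die-size argument above can be sketched with the classic first-order die-per-wafer estimate. This is a rough illustration of my own (it ignores yield, scribe lines, and actual wafer pricing), using the ~100mm² and ~130mm² figures from the thread:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order die-per-wafer estimate: wafer area / die area,
    minus an edge-loss correction term. Ignores yield and scribe lines."""
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

# Die sizes as stated in the thread: Redwood ~100 mm2, RV730 ~130 mm2.
redwood = dies_per_wafer(100)   # -> 640 candidate dies per 300mm wafer
rv730 = dies_per_wafer(130)     # -> 485 candidate dies per 300mm wafer
print(redwood, rv730)
```

Roughly 30% more candidate dies per wafer for Redwood, so at equal wafer cost the 5570 has substantially more room to fall in price than a 4650/4670 did.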
 
— Yes, the 5570 can't do Vector Adaptive Deinterlacing, but neither can other low-end cards. It's not that big a deal, though I hope AMD makes this option available in the drivers. If the 5670 can do it, is it really that big of a stretch for the 5570?
Exactly; maybe this is just a driver bug. It doesn't make sense otherwise.
 
But ...
- DX11: the card is unable to play current DX10 titles. Will it be able to run ANY game at DX11 quality (i.e. all maxed)?

DX11 doesn't mean "max quality".

It all depends on what devs do with the CS and tessellator, but some effects could be rendered with considerably less overhead than with PS, as was previously done.

Tessellation is more efficient than POM while looking better, and CS enables faster post-processing than PS - to the point that the small 5570 could get close to the 4850... thanks to DX11.

Heaven doesn't show the one and only way to use these, and if devs switch to DX11 with a DX10/DX9 fallback for the old GPUs, it could be the NV45 story again, this time for the whole HD5k series.
 
But ...
- DX11: the card is unable to play current DX10 titles. Will it be able to run ANY game at DX11 quality (i.e. all maxed)?

Can ANY low-end budget card play games with settings "all maxed"? :p Unless the game was designed for budget cards in the first place, or is really old. I guess that also makes DX10 and DX9 useless checkbox features.

I can't think of any that can play Crysis at max settings. Hell I'm not sure there's any that can play Far Cry at max settings. :)

Regards,
SB
 
It seems to get its butt kicked by similarly priced Nvidia cards.
I think the Anandtech review is flawed (due to errors) and a bit unfair, though.

The GT 240 is not only faster, but costs the same, uses less power, and has been out for a while. The only downside to the card is that it's DX10 and not DX11. But it seems the 5570 is barely able to play older games, let alone newer games.
The GT240 DDR3 seems to be spot on performance-wise (as is the 4670). Now, AMD wants this to be compared against the GT220, which is clearly cheaper, so the GT240 indeed should be the right card to compare. But if you look at current prices, GT240 GDDR5 cards usually cost quite a bit more than $85; they are rare at $85. I'm pretty sure the 5570 will be priced slightly cheaper on average (hence at the same level as the GT240 DDR3).
And I don't think Anand got a leaky sample; I'd say either something was wrong with it or it was a measurement error (most likely some settings changed in the test system - another reason why those at-the-wall measurements are unreliable for measuring card power draw). Because the difference in actual TDP, as well as other sites' measurements, clearly shows that the HD5570 has a quite big advantage there.
Some numbers I trust much more (real measurements):
http://ht4u.net/reviews/2010/amd_radeon_hd_5570/index13.php
The HD5570 has about the same idle power draw, but load power draw is actually at the GT220 level: one third lower than the GT240 DDR3 (not to mention the GDDR5 version, which is even higher), and definitely lower than the HD4670. Anand's numbers are bogus, for whatever reason.
But even other sites doing at-the-wall measurements show an advantage for the HD5570 (like Tom's).
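To illustrate why at-the-wall deltas are a shaky proxy for card power: the PSU's conversion efficiency scales every difference, and any system-side change lands in the same number. The efficiency figures below are illustrative assumptions, not measured values:

```python
# At-the-wall measurements fold PSU efficiency (and any system-side
# change) into the card number. Efficiency values here are illustrative.
def card_delta_from_wall(wall_delta_w, psu_efficiency):
    """DC-side power difference implied by a wall-side difference."""
    return wall_delta_w * psu_efficiency

def wall_delta_from_card(card_delta_w, psu_efficiency):
    """Wall-side difference produced by a given card-side difference."""
    return card_delta_w / psu_efficiency

# A 50 W swing at the wall is only 40 W at the card on an 80% PSU:
print(card_delta_from_wall(50, 0.80))            # 40.0
# The same 40 W card delta reads ~53.3 W at the wall on a 75% PSU,
# without the card changing at all:
print(round(wall_delta_from_card(40, 0.75), 1))  # 53.3
```

So two sites with different PSUs (or different test-system settings) can legitimately report quite different "card" deltas from the same hardware, which is why direct card-level measurements like ht4u's are more trustworthy.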
Also, Anand's Resident Evil benchmark looks wrong too. The explanation there for the low score doesn't make sense (since the HD4670 doesn't really have much more bandwidth either). Tom's numbers for RE5 make a lot more sense, with the usual pattern of the HD5570 close to the HD4670 (actually faster in this title, though not enough to catch the GDDR5 GT240).
(Oh and btw the GT240 is DX10.1, not DX10.)

So I think you're being a bit harsh on this card. It performs right where the HD4670 and GT240 DDR3 do, at a similar price to the latter and slightly more expensive than the former (for now - I'd say there's a LOT of room to go down in price, more so than for the GT240). And it has a quite obvious advantage in power draw over both of them. Sure, the 9600GT is still faster at the same price, and if all you care about is performance that's still a better buy, but really, that thing was pushed down in price like there's no tomorrow.
 
But ...
- DX11: the card is unable to play current DX10 titles. Will it be able to run ANY game at DX11 quality (i.e. all maxed)?

We're far from seeing a mandatory DX10 or DX11 game, but the feature set doesn't hurt.
There are people who sidegraded from a Radeon 9550 to a GeForce 6200 - I've seen it - just so they could play an SM 3.0-only game.

Unless you're running a GeForce FX, it's better to have both the newest shader model and low/medium details.
 
But ...
- DX11: the card is unable to play current DX10 titles. Will it be able to run ANY game at DX11 quality (i.e. all maxed)?
- Eyefinity: it won't be able to run games in such a mode. It's useless for watching films, so that leaves what? Professionals using 3 monitors? Why not the 5450? It should be good enough for 2D.
And I read that currently one needs a $100 dongle or a new, expensive monitor...
- HTPC? Reading Anandtech I see that the 5450 and 5570 have problems doing any video stream post-processing. I really don't get what the problem is there...

Yep, pretty much spot on, and it sums up ATI's entire <5600 line. The ONLY feature that might be of actual use would be the audio bitstreaming for HTPC, but the broken video post-processing completely and utterly disqualifies it. So the choice would then be to use a dedicated sound card for audio with a full feature set, and to use an older/cheaper competitor product built on 3-year-old tech. And that's the sad part: a newly released product (though smaller, which one would think translates to cheaper - it doesn't) has a hard time competing against an older (more mature) product. It's rather ironic that Eyefinity, for example, gets pimped as a "feature" when only one product so far (the XFX 5570) has opted to enable it through the inclusion of DisplayPort - at a cost of $90 without any adapter - and, being full height, it excludes itself again from the choice of many HTPC builders, where LP cards are highly sought for their minimal size. For the end user the 5500/5400 series offer little over older products and not much in the realm of being actually usable. For OEMs, though, I expect the <5600 series to be a big boon to ATI's bottom line, where the higher ASPs associated with checkbox features (DX11, Eyefinity, DirectCompute... blah blah) will score big.
 
Isn't the video processing problem a driver issue? A stupid one at that, because the 5570 is the same GPU as on the 5670.

I always fail to understand the HTPC argument anyway. The market that understands what the Dolby true DTS HH QHD extreme X encodings mean is probably vanishingly small, even next to the people who will actually game on these GPUs.

Is it that they never fail to invent a new encoding? And still over an audio connection originally meant for digital stereo? Boring.

sorry, I don't want to be mean to the people who actually care.
 
Isn't the video processing problem a driver issue? A stupid one at that, because the 5570 is the same GPU as on the 5670.


I always fail to understand the HTPC argument anyway. The market that understands what the Dolby true DTS HH QHD extreme X encodings mean is probably vanishingly small, even next to the people who will actually game on these GPUs.

Is it that they never fail to invent a new encoding? And still over an audio connection originally meant for digital stereo? Boring.

sorry, I don't want to be mean to the people who actually care.

It may be a driver bug in the 5500, but who's to say? The only thing we know without a doubt is that right now it is broken and doesn't work. In the 5400 series, IIRC, it seems post-processing is disabled/limited due to hardware limitations. In fact, comparing the 5600 to the 5500, etc., it seems that ATI's reliance upon shader count is becoming a strain, where lower-shader-count parts have higher utilization than bigger (more costly) parts. It's a trade-off of sorts.
 