AMD: R9xx Speculation

I think the better names would have been Barts pro = HD6830 and Barts XT = HD6850. A cut down Cayman could take the HD6870 name.
The '30's do not go over well in developed markets.

But my issue mainly comes more from the recent statements from Richard Huddy that there is a minimum triangle size beyond which there is no point in tessellating. If it's not worth tessellating past that point now, why will it suddenly become worth doing it next month?
Why do you think that statement will change? The comments are not actually related to the geometry end of the pipeline, but the rendering end, and that applies for everyone. The issue is that we are all still quad based renderers and when triangles get below certain pixel coverage sizes you start wasting a tonne of processing by going over fragments multiple times. This inefficiency applies on all pipelines out there.
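As a toy illustration (my own sketch, not anyone's actual rasterizer) of the quad problem described above: GPUs dispatch fragments in 2x2 quads, so a triangle covering only a handful of pixels still pays for whole quads. The sketch rasterizes a triangle on a grid and compares pixels actually covered against fragments shaded when whole quads are dispatched; the triangles are made-up examples.

```python
# Toy sketch of quad-based shading inefficiency. Hardware rasterizers are far
# more sophisticated; this only demonstrates the covered-vs-shaded gap.

def edge(ax, ay, bx, by, px, py):
    # Signed area test: sign tells which side of edge a->b the point is on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def quad_shading_stats(tri, size=64):
    """Rasterize `tri` on a size x size pixel grid; return (pixels covered,
    fragments shaded when whole 2x2 quads are dispatched)."""
    (ax, ay), (bx, by), (cx, cy) = tri
    covered = set()
    for y in range(size):
        for x in range(size):
            px, py = x + 0.5, y + 0.5  # sample at pixel center
            w0 = edge(bx, by, cx, cy, px, py)
            w1 = edge(cx, cy, ax, ay, px, py)
            w2 = edge(ax, ay, bx, by, px, py)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.add((x, y))
    quads = {(x // 2, y // 2) for (x, y) in covered}
    return len(covered), 4 * len(quads)  # whole quads get shaded

# A big triangle wastes little; a thin sliver wastes a lot per covered pixel.
big = quad_shading_stats([(1, 1), (40, 1), (1, 40)])
sliver = quad_shading_stats([(1, 1), (60, 2), (1, 2)])
```

The sliver touches many quads while covering only one or two pixels in each, so its shaded-to-covered ratio is far worse than the big triangle's; this is the inefficiency that hits every quad-based pipeline once tessellation pushes triangles below a few pixels.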

Am I missing something? I just installed the Catalyst 10.10 drivers that are currently uploaded to the AMD servers, and I don't have multiple quality options for Catalyst AI on my 5870, just Standard and Advanced as normal.
Apparently it is not in 10.10 for older products at the moment; it will be in the future.

Also, has there been a review where the impact of having two "Ultra-Threaded Dispatch Processors" is discussed/explained?
Really that was just an evolution of the diagram more than anything else. Cypress has the same (there is one per shader engine); the early Cypress diagrams just didn't represent it.
 
Apparently it is not in 10.10 for older products at the moment; it will be in the future.

So there are no multiple quality options for Catalyst AI, no Surface Format Optimizations and no MLAA in Cat 10.10 for 5800 series users? What is Cat 10.10a (the Cat update released at/after 10.10 WHQL) for, specifically? Is Cat 10.10a just for the 6870/6850?
 
So there are no multiple quality options for Catalyst AI, no Surface Format Optimizations and no MLAA in Cat 10.10 for 5800 series users? What is Cat 10.10a (the Cat update released at/after 10.10 WHQL) for, specifically? Is Cat 10.10a just for the 6870/6850?
The 10.xA releases are for 4870X2s and 4850X2s, and some CF issues. My guess is that there won't be a 10.11a; they will finally get it in the next "driver branch", and then the hotfix will be for the game-du-jour issue.
 
Also what have you guys done with the colours? I have a horrible pink tint on what was a perfectly calibrated display using a hardware colorimeter.
Choose a different color temperature for your screen; the 10.10 betas default to 6500K or so.
 
Apparently it is not in 10.10 for older products at the moment; it will be in the future.

The 10.xA releases are for 4870X2s and 4850X2s, and some CF issues. My guess is that there won't be a 10.11a; they will finally get it in the next "driver branch", and then the hotfix will be for the game-du-jour issue.

Interesting... but I have to admit that "in the future" is a coy answer for when 5000 series users will see those features; I have no idea what it means. There are two more Cat drivers left before the end of the year, and that statement doesn't tell me whether we will see it by then or not. However, reading it in conjunction with this earlier post, I began to wonder if that user is correct?
 
[Images: morphological AA on/off comparison screenshots]


This is a very clear win for Barts, and I suspect there might be a few games where this is the case.

edit, assuming you prefer the smoothing effect to jagged edges of no AA, obviously.
 
I think the better names would have been Barts pro = HD6830 and Barts XT = HD6850. A cut down Cayman could take the HD6870 name.
Personally, with so few changes to the 3D core, these actually look more like Evergreen family members to me, so HD5840 and HD5860. Especially since at least Cayman from the NI family apparently has far more changes there, so based on the 3D core alone Barts does not warrant an HD6xxx designation. I understand, though, that the products aren't sold on internal architecture details.

I think some people will be disappointed with overclocking results (personally I'm not; I never expected anything better, as this is still the same arch with the same clock design targets). Especially the HD6870 is a complete no-go for overclocking: results I've seen vary from 20-50MHz on the core, and from 0MHz to not much more on the memory. Naturally the HD6850 fares better, though I guess this also largely depends on voltage (the reference card seems to have lower voltage than some retail cards, which seem to use the same voltage as the HD6870).
Some people will also be disappointed about the lack of DP (double precision). Looks to me like it would be about time to support that on more cards.

Also, I think power management is a bit of a letdown. Not to say it's bad (after all, idle power still seems to be a tiny bit better than HD58xx), but it could be better. The cards still don't undervolt the memory at idle (which is a likely reason the GTX460 is competitive in idle power draw). Plus, the cards actually took a step back from HD58xx with Blu-ray playback: they no longer underclock the memory, for some reason. Moreover, they are still using 1.6V instead of 1.5V for the memory, so power consumption could be better still. Though I wonder if that overvolt is actually necessary for stability due to the reduced-frequency design of the MC (which really seems to have no headroom at all). Maybe that's also the reason even the HD6850 is using memory chips rated for 1.25GHz rather than 1GHz (another reason would be that they cost the same...).
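As a rough back-of-the-envelope illustration of why that 1.6V matters: dynamic power scales roughly with V² · f, so dropping the memory from 1.6V to 1.5V at the same clock should, to first order, trim memory power by about 12%. The scaling law is the standard CMOS approximation; the voltages are the ones quoted above, and none of this is a measured figure.

```python
# First-order CMOS dynamic power estimate: P ~ C * V^2 * f.
# Illustrative only; real GDDR5 power also has static and I/O components.

def dynamic_power_ratio(v_new, v_old, f_new=1.0, f_old=1.0):
    """Ratio of new to old dynamic power: (V_new/V_old)^2 * (f_new/f_old)."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

# Running the memory at 1.5V instead of 1.6V, same clock:
savings = 1.0 - dynamic_power_ratio(1.5, 1.6)  # roughly 12%
```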

In any case the HD6850 looks like a winner to me. Faster, cheaper, and with much lower power consumption than the anemic HD5830, it is the card with the best performance/price and performance/power ratios. Nvidia might be willing to compete with the GTX460 1GB (which really is a more natural competitor than the GTX460 768MB), but there is no way they are going to match the performance/power ratio of the HD6850.
The HD6870 isn't bad, but it's not much of an improvement over the HD5850, at least not if you're willing to overclock (in which case the HD5850 will easily beat any HD6870). Plus the performance difference to the HD6850 doesn't really justify the price difference.
 
Rage3d review states in part:
MLAA will be enabled via CCC using a check box, but is not included in the AMD Catalyst 10.10 WHQL driver....It is due to be released as part of a hotfix driver shortly after the AMD Radeon HD 6800 series launches.
link
Which makes me wonder if they were implying the 10.10a hotfix? But it's not clear whether that means just the 6800 series or not, as it doesn't say.
 
Plus, the cards actually took a step back from HD58xx with Blu-ray playback: they no longer underclock the memory, for some reason.
How much did they underclock the memory during blu-ray playback?

In any case the HD6850 looks like a winner to me. Faster, cheaper, and with much lower power consumption than the anemic HD5830, it is the card with the best performance/price and performance/power ratios.
One thing I've been wondering about: If I took a HD6870 and undervolted/underclocked it to HD6850 levels, would the power consumption be similar?
Also, is there a guarantee that all HD68xx models will actually undervolt? Some HD46xx cards, for example, did not have the ability to undervolt at idle.
 
Choose a different color temperature for your screen; the 10.10 betas default to 6500K or so.

Like I said, I use a hardware colorimeter with my monitor, using the Lacie Blue Eye Pro software. I already target a 6500K color temperature; Catalyst 10.10 completely screws that up. The official WHQL 10.10s are already up on the server, you just have to change the link.
 
Like I said, I use a hardware colorimeter with my monitor, using the Lacie Blue Eye Pro software. I already target a 6500K color temperature; Catalyst 10.10 completely screws that up. The official WHQL 10.10s are already up on the server, you just have to change the link.

Is CCC's "Use Extended Display Identification Data EDID" unchecked?
 
Tried both unchecked and checked. AMD did something screwy with 10.10.

That is strange. You're using Win7, right? If so, right click on the desktop:
-Select Screen resolution, then right click on the blue monitor.
-Tab over to Color Management and click on the Color Management button
-Tab over to Advanced
-Click on the Change system defaults button
-Tab back over to Advanced
Is "Use Windows display calibration" checked or unchecked for you? (Assuming you are using a specific monitor profile.)
 
That is strange. You're using Win7, right? If so, right click on the desktop:
-Select Screen resolution, then right click on the blue monitor.
-Tab over to Color Management and click on the Color Management button
-Tab over to Advanced
-Click on the Change system defaults button
-Tab back over to Advanced
Is "Use Windows display calibration" checked or unchecked for you? (Assuming you are using a specific monitor profile.)

I have "Use Windows display calibration" checked, with my calibrated monitor profile loaded. I check it so that the ICC profile doesn't get unloaded.

This all works perfectly fine on the 10.9s

In other news they finally added the OpenCL driver to the Catalyst package.

Introduction of AMD Catalyst™ Accelerated Parallel Processing
("APP") technology Edition
There will now be two variants of the AMD Catalyst package available:
1. AMD Catalyst (comparable to prior versions in features and components) –
currently includes the Direct3D®, OpenGL®, display driver and AMD
Catalyst Control Center components
2. AMD Catalyst Accelerated Parallel Processing (“APP”) technology Edition –
AMD Catalyst plus the OpenCL driver
Users can still obtain the individual AMD Catalyst components as well (which will
also include the OpenCL driver as well)

Full patch notes.

http://viewer.zoho.com/docs/lfdCB
 
Wow that's a vast difference for such a measly gap.
Saving die space is a bonus. Normally you only save on cheaper RAM. This is the right move for a part in this market, because you don't want to pay for 4.8GHz memory anyway.

Whatever AMD did, the total savings are substantial. Removing 30% of the SIMDs would only save around 10% of Cypress. Recall that RV790 was larger than RV770 to get the clock speed up.
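That "30% of the SIMDs only saves around 10% of Cypress" figure is easy to sanity-check with assumed numbers. The ~334mm² Cypress die size is the commonly cited figure; the one-third share for the SIMD array is my own assumption for illustration, not an official breakdown.

```python
# Rough sanity check of the area claim above (assumed figures, not official):
# if the 20 SIMDs occupy about a third of Cypress's ~334 mm^2 die, then
# dropping 6 of them (30%) only shaves about 10% off the total area.

cypress_mm2 = 334.0        # commonly cited Cypress die size
simd_fraction = 1.0 / 3.0  # assumed share of die taken by the SIMD array
simds_removed = 0.30       # Barts drops 6 of 20 SIMDs

area_saved_mm2 = cypress_mm2 * simd_fraction * simds_removed
saved_pct = area_saved_mm2 / cypress_mm2  # = simd_fraction * simds_removed
```

So most of Barts' area savings must come from somewhere other than the SIMD count alone, which is what makes the total reduction notable.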
 
How much did they underclock the memory during blu-ray playback?
Oh, I looked it up, and actually not that much. Seems the HD5850 lowered the mem clock from 1000MHz to 900MHz, so hardly worth it. Maybe that's the reason the HD68xx don't bother at all. In any case, it's worth noting that a GTX460 only draws half the power when doing Blu-ray playback, and that is probably the reason (the GTX460 lowers its mem clock from 900MHz to 162MHz for that), and the HD68xx seem to do slightly worse in that area than HD58xx - http://ht4u.net/reviews/2010/amd_radeon_hd_6850_hd_6870_test/index20.php.

One thing I've been wondering about: If I took a HD6870 and undervolted/underclocked it to HD6850 levels, would the power consumption be similar?
I'd guess so, though it should still be a bit higher due to the two additional SIMDs. Though, from the results published, power usage varies quite a bit.

Also, is there a guarantee that all HD68xx models will actually undervolt? Some HD46xx cards, for example, did not have the ability to undervolt at idle.
I think nowadays all cards undervolt at idle. Though of course it's possible someone builds a non-reference board with a fixed voltage, and it wouldn't be the first time some overclocked version loses the undervolt-at-idle ability due to a broken BIOS...
 