AMD: Southern Islands (7*** series) Speculation/ Rumour Thread

The TR graphs show frame times, so lower is better ;)
The 660 Ti renders fewer frames over the course of the benchmark, so its line on the plot ends earlier. It is slower.
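As an aside, since this keeps coming up: average FPS and frame-time plots answer different questions. Here is a minimal sketch in Python with made-up frame times (none of these numbers come from TR or HardOCP) of how a run with occasional spikes can have almost the same average FPS as a smooth run while the 99th-percentile frame time gives it away:

Code:
def summarize(frame_times_ms):
    """Return average FPS and the 99th-percentile frame time in ms."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    ordered = sorted(frame_times_ms)
    p99 = ordered[int(0.99 * (len(ordered) - 1))]
    return avg_fps, p99

smooth = [20.0] * 100                 # steady 50 FPS
spiky  = [18.0] * 95 + [60.0] * 5     # similar average, visible hitches

for name, ft in (("smooth", smooth), ("spiky", spiky)):
    fps, p99 = summarize(ft)
    print(f"{name}: {fps:.1f} avg FPS, 99th percentile {p99:.1f} ms")

Both runs land around 50 FPS on average, but only the second one would feel jerky, which is exactly what the spiky frame-time plots are showing.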
 
But that is still one sample out of the many thousands of cards sold and the at least few hundred cards reviewed by testers.
They tried different drivers, a different platform and two different 7950 cards.
http://techreport.com/review/23981/radeon-hd-7950-vs-geforce-gtx-660-ti-revisited/11
Our first instinct upon seeing these results was to wonder if we hadn't somehow misconfigured our test systems or had some sort of failing hardware. We test Nvidia and AMD GPUs on separate but identical systems, so to confirm our numbers, we switched the cards between the systems and re-tested. The Radeons still exhibited the same patterns of frame latency, with no meaningful change in the results. We wondered about the possibility of a problem with our Sapphire HD 7950 Vapor-X card or its Boost BIOS causing the slowdowns, but swapping in an older, non-Boost Radeon HD 7950 card from MSI produced very similar results.

We're also quite confident the problem isn't confined to a single set of drivers. You see, this article has had a long and difficult history; it was initially conceived as an update comparing Catalyst 12.8 and 12.11 beta drivers. However, driver updates from AMD and Nvidia, along with some additional game releases, caused us to start testing over again last week. I can tell you that we've seen the same spiky frame time plots in most of these games from three separate revisions of AMD's drivers—and, yes, Catalyst 12.11 is an improvement over 12.8, all told, even if it doesn't resolve the latency issues.
 
If you aren't sensitive to microstutter, yes:

To be honest, I can't manage to see microstutter in this game... It is not a fast shooter but a third-person-view game, though you would feel or see stutter either way. It would be interesting to test frame times on my setup, but then again I use v-sync with nearly maxed settings (4x MSAA, if I remember well), and the fps is really smooth and locked at 60 fps.

I have followed many threads about Hitman, and strangely the problem seems inverted for most people with 600-series cards (massive fps drops in some missions, and stutter, especially when you shoot in slow motion). Of course, it is always hard to know whether it is those people's setups or not.
 
Wow, it seems that all of you have issues understanding the TechReport measurements.
The results from both sites are pretty much in line when looking at average FPS only, and the tests done at 1080p have the HD 7950 ahead.
It is funny to note that with the TechReport settings, if one were to turn v-sync on, 30 FPS with v-sync is what one should aim for, with the Nvidia card doing significantly better.
Using the HardOCP settings lands in pretty unacceptable territory imo, barely above 10 fps.
The HardOCP measurements are far less interesting than the TechReport ones: a lot less fine-grained, and a lot less work for the reviewers too.

EDIT

I actually think people, especially enthusiasts, don't like the TechReport measurements because, no matter the card, they show their products in a less positive light: from "yes, I run the old games at 53 FPS" to "actually, the best I can hope for is 30 FPS with v-sync, and it will drop out of v-sync quite often".
From "as far as I can tell it is smoother" to facing the fact that it is actually a shaky ride, and so on.
 
Wow, it seems that all of you have issues understanding the TechReport measurements.
The results from both sites are pretty much in line when looking at average FPS only, and the tests done at 1080p have the HD 7950 ahead.
It is funny to note that with the TechReport settings, if one were to turn v-sync on, 30 FPS with v-sync is what one should aim for, with the Nvidia card doing significantly better.
Using the HardOCP settings lands in pretty unacceptable territory imo, barely above 10 fps.
If you refer to the Hitman: Absolution measurements with the frame times plotted above, I disagree. With v-sync, the 7950 will generally run at 30 fps with short episodes at 60 fps (the single-frame spikes in frame times are smoothed out by v-sync, just as a frame limiter tends to smooth microstutter with SLI or CF; it appears to be "just" some synchronization issue, and the actual render times of the frames probably don't spike at all, since a fast frame is followed by a slow frame or vice versa). The 660 Ti, on the other hand, is always limited to 30 fps with occasional drops to 20 fps. It would be clearly worse than the 7950 with v-sync on.
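To make the v-sync quantization concrete, here is a rough sketch assuming a 60 Hz display and ignoring buffering details (the numbers are illustrative, not measured): with v-sync a frame is shown on the first refresh after it finishes, so its effective display time is the render time rounded up to a multiple of ~16.7 ms. An isolated spike therefore still lands on the 33.3 ms (30 fps) step, while frames that consistently creep past 33.3 ms fall to the 50 ms (20 fps) step.

Code:
import math

REFRESH_MS = 1000.0 / 60.0  # 60 Hz display: ~16.7 ms per refresh

def vsync_display_time(render_ms):
    # Round the render time up to the next refresh boundary.
    return math.ceil(render_ms / REFRESH_MS) * REFRESH_MS

for render in (15.0, 25.0, 33.0, 34.0, 45.0):
    shown = vsync_display_time(render)
    print(f"render {render:4.1f} ms -> displayed every {shown:4.1f} ms (~{1000 / shown:.0f} fps)")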

Edit:
Sorry, I mixed it up a bit. The frame times for Hitman: Absolution are actually the numbers from HardOCP. But anyway, they are not just above 10 fps; they are in playable territory.
 
If you aren't sensitive to microstutter, yes:
http://techreport.com/r.x/7950-vs-660ti/hma.gif

I wonder whether other HDMI users can also affect performance by changing the color settings. For example:

Out-of-the-box settings
vs
Color Corrected

As shown in FC3, there is a 2 FPS reduction once the color matches AMD's. Also, the IQ, once changed (he didn't mention exactly what he did), is identical to AMD's. I'm not sure why that is; matching color shouldn't impact performance as far as I know. However, a deeper look seems to suggest there is more to it than just changing the color. For some reason an Nvidia user had to change his resolution from the SD/HD list to the PC list in order to correct the color problem. That option only shows up if you are using HDMI; DVI should default to PC.

But I'm not sure whether there is such an option in CCC, though. Shrug.
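For background on the SD/HD vs PC distinction: over HDMI, "SD/HD" usually means limited-range video levels (16-235 per 8-bit channel) while "PC" means full-range (0-255), and the mismatch is what makes blacks look grey and washed out. A minimal sketch of the level expansion (the function is mine for illustration, not a driver API; real drivers do this per pixel on the GPU, which is part of why matching the color shouldn't in itself cost 2 FPS):

Code:
def limited_to_full(value_8bit):
    # Expand limited-range ("SD/HD", 16-235) levels to full-range ("PC", 0-255).
    scaled = (value_8bit - 16) * 255.0 / (235 - 16)
    return max(0, min(255, round(scaled)))

# Black and white points: 16 -> 0, 235 -> 255.
for v in (16, 128, 235):
    print(v, "->", limited_to_full(v))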
 
Other sites have duplicated TTR's findings... so your claim is rather unsubstantiated.

Without going too deep into it, and without contesting the results:

- Borderlands 2: I would need to see the settings, as PhysX running on the CPU is a real problem on AMD cards in this game. If they used Low or Medium PhysX, that is an issue, since it creates thread stalls which would necessarily explain the problem with this game.

- The BF3 results are good compared to what TR has shown.

- Skyrim... hmm, something is definitely happening with this game that needs to be fixed by AMD.

At the same time, the closing explanation by NH seems more in line with what users report (if they can't tell which card is which in a blind test (lol), even knowing the problem exists, unlike someone who just fires up the game).
 
Other sites have duplicated TTR's findings... so your claim is rather unsubstantiated.

Thanks for the link, but after reading it I do see them showing major problems in Borderlands 2, no problem in BF3, and something in between in Skyrim.
No Hitman: Absolution was tested as far as I can see, and that was the game you referred to in the first place. Besides, the Nordic test platform was again based around the Intel X79 chipset, which adds little to the variety of configurations I was looking for.
As I said before, the issue is real and reproducible, but to claim everyone will have it on their system (even in problematic titles) is at this point premature.

The good thing is that hardware websites are finally starting to pay attention to frame metering (and microstutter), which will cause some pain for all graphics chip manufacturers, and therefore we, the end users, will gain the most.
 
You can also pick another set of titles and you may find very different results between the two companies as well.

You also have to question whether these things are going to affect you. I play a lot of Borderlands 2 (Lvl 50 baby!) on an 8-core Bulldozer + Tahiti XT2 or a 4-core Phenom II and Pitcairn XT, and I can't say that I've ever noticed the effects of this while playing; the only thing I did notice was the lag in displacement map loading, but that is a common thing with UE titles.
 
You can also pick another set of titles and you may find very different results between the two companies as well.

You also have to question whether these things are going to affect you. I play a lot of Borderlands 2 (Lvl 50 baby!) on an 8-core Bulldozer + Tahiti XT2 or a 4-core Phenom II and Pitcairn XT, and I can't say that I've ever noticed the effects of this while playing; the only thing I did notice was the lag in displacement map loading, but that is a common thing with UE titles.

Your CPU was a bottleneck. You should change to Jaguar octo cores with dot-product-specific registers. It will be the shit this year.
 
So now that we're about a year into the 7x00 series being out, how is it performance-wise over the 6x00 series? I had a 6950 BIOS-unlocked to a 6970, plus a bit on the clock speeds, and at launch I didn't find the 7950 worth buying. Has that changed at all?
 
So now that we're about a year into the 7x00 series being out, how is it performance-wise over the 6x00 series? I had a 6950 BIOS-unlocked to a 6970, plus a bit on the clock speeds, and at launch I didn't find the 7950 worth buying. Has that changed at all?
AnandTech allows you to compare various GPUs in a bunch of apps. I don't know what drivers were used. Here's HD6970 vs. HD7950 w/ Boost: http://www.anandtech.com/bench/Product/509?vs=645
 
AnandTech allows you to compare various GPUs in a bunch of apps. I don't know what drivers were used. Here's HD6970 vs. HD7950 w/ Boost: http://www.anandtech.com/bench/Product/509?vs=645

Yeah I don't think my 6950 is going away until Sea Islands or Maxwell. I like a 100% or better speed boost across the board, preferably with significant new features too.

I still have a 3870 in one machine, though, and that card could use a replacement. It's actually pretty usable with a lot of games, but yeah, it's iffy. I am hoping to get ahold of a free 4850....
 
So now that we're about a year into the 7x00 series being out, how is it performance-wise over the 6x00 series? I had a 6950 BIOS-unlocked to a 6970, plus a bit on the clock speeds, and at launch I didn't find the 7950 worth buying. Has that changed at all?

At TPU, at 1920x1200, the 7950 is ~25% faster than the 6970; I _think_ their 7950 is the one without boost clocks, though.
 
So now that we're about a year into the 7x00 series being out, how is it performance-wise over the 6x00 series? I had a 6950 BIOS-unlocked to a 6970, plus a bit on the clock speeds, and at launch I didn't find the 7950 worth buying. Has that changed at all?

It is good. I went from an unlocked 6970 to a 7970 NE, and in most new games I get 50-100% higher fps. This is comparing the max optimal Cayman overclock to the max optimal Tahiti overclock (i.e. without cranking up vcore/heat/noise): 900/1375 MHz vs 1100/1600 MHz. Tahiti responds to overclocking much better, especially the VRAM.

I gave this upgrade path an 8/10 on my pleasure scale, and I bought mine before the Never Settle bundle, still at a good price. It would be an 8.5/10 if AMD settles the stuttering problem that was discovered; at least Dave has owned up with an update.

But get the normal edition; the GE has too much vcore for normal users imho, and you would need a custom heatsink solution to tame it.

That said, we know Sea Islands is coming in another 3 months. The 8870 will be less than $299, with performance as good as, if not better than, the 7970 GE, and very likely it will be free of stutter. I'm not sure whether AMD can/will completely solve the stutter in GCN v1; you know, the talk about upgrades through obsolescence, or it being natively broken in hardware. Just speculation, though.
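A quick back-of-the-envelope on those clocks (the fps figures are the poster's; the percentages are just arithmetic): the quoted Tahiti overclock is only about 22% higher on the core and 16% higher on the memory than the Cayman one, so most of the reported 50-100% gain would have to come from the architecture rather than the clocks alone.

Code:
cayman_core, cayman_mem = 900, 1375   # MHz, quoted above
tahiti_core, tahiti_mem = 1100, 1600  # MHz, quoted above

core_gain = (tahiti_core / cayman_core - 1) * 100   # ~22%
mem_gain  = (tahiti_mem / cayman_mem - 1) * 100     # ~16%

print(f"core clock: +{core_gain:.0f}%, memory clock: +{mem_gain:.0f}%")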
 
Thanks guys, my gf wants to get a new purse, which means I can get a new video card. I'm debating holding out for the new stuff, which is hopefully coming soon? Dunno if I want to drop $300 for 25-50%.
 