AMD: Southern Islands (7*** series) Speculation/Rumour Thread

Homeles: The article is based on another article (benchmark.pl), which is in turn based on an alleged picture of the HD 7950 GE:

[image: alleged photo of a Radeon HD 7950 GHz Edition]


But the picture is fake - it's just a crop of an HD 7870.
 
I think they should go all-in with further significant discounts.


Ok, for clarity's sake: has any ISV, OEM, distribution house, or retail chain released actual or potential dates for when the HD 7970 GHz Edition will be readily available in the channel?

Yes, I know there are OC'd cards out now, but we are interested in these new SKUs as well.

Thx all
 
Source for the AF claim? As far as I know this was a hardware bug in the texture filtering units. This has been discussed to death.
Where does Nvidia change the image quality arbitrarily? HawX was an application bug. They don't do it with tessellation or AF. I also find it unconstructive to divert from the topic at hand by saying "but Nvidia does optimizations too". Maybe, maybe not; that is beside the point here. When it comes to optimizations, no matter what kind they are, my alarm siren always goes off. They are always for the benefit of the IHV and for winning benchmarks, never for the customer. Some may call this view overly pessimistic; I call it realistic.

Remember Nvidia making a lot of noise about AMD's AF issue on the road in TrackMania, and then, when people looked more closely at the screenshots, they saw that Nvidia's IQ was inferior all over the place?

http://img829.imageshack.us/img829/9648/edit01.jpg

Original for zooming - http://www.3dcenter.org/image/view/3747/_original
 
Not only were there those soft textures in the background, there was also a higher amount of moiré on the road. But some sites cared only about the line on AMD's hardware, because they were told to care about the line on AMD's hardware.
 
I mean, if true, what the ... for?? Why so late, and how many will buy it?
Why so late? AMD was waiting on process maturity to produce what's been dubbed "Tahiti XT 2" chips, and was also developing their boost tech.

I'd be willing to bet money at this point that it's finally on its way, but I highly doubt it will come in substantial quantities and I'm even more doubtful it'll beat the 690 in anything other than multi-monitor configurations.
 
HD 6870 owner here, and I appreciate being able to play Crysis 2 with tessellation on at playable frame rates, thanks to the adjustment available in the CCC.

Also, IIRC, aren't even the Jersey barriers tessellated to a degree where no discernible visual improvement is gained over a much less tessellated version?

If my memory of that is correct, then, imo, that is a decisive fact.

AMD did the right thing in addressing such a sad situation.

The only decisive fact that came out of that whining fest was that the tessellation in Crysis 2 was well within the range allowed by the DX11 SDK. What's even more hilarious is that the level of tessellation in Crysis 2 was almost exactly the same as in AMD's SubD11 sample, given to Microsoft to show what AMD thought was the ideal level for games just before DX11 was finalized.

So instead of being a respectable company and accepting that their early DX11 cards flat-out sucked at tessellation, AMD executives hoping to avoid blame and keep their jobs decided to invent a non-existent scandal in which Crytek and Nvidia conspired to give Crysis 2 triangles smaller than a pixel, or geometry that's never seen: visually useless, but a performance hit!

And finally: all of you who hopped on the BS wagon and joined in on the manufactured outrage, did you ever stop for a moment and think about how, in the months between Crysis 2 being released and the texture/DX11 pack being finished, you were the same people p*ssed off about Crytek lying about it not having DX11, or it being a console port, or it not following in the original Crysis' footsteps with system requirements that slapped your expensive computer around like a two-dollar ho?

Seriously though... unlike BioWare, who refused to listen to millions of fans, get rid of the hologram kid, and rewrite the ending so it had narrative coherence, thus ruining one of the best series ever made, Crytek actually did listen to the fans, went back to work, and gave them high-resolution textures, DX11 features, and far more demanding graphics that b*tch-slapped your computer... only to then get slandered in the media by AMD and all their fanboys to divert attention away from a failed product. Reminds me of how my five-year-old nephew drives his older brother nuts, and the moment a finger gets laid on him he cries for mommy.

/rant
 
Why so late? AMD was waiting...

Yeah, they can wait as long as they want; that's their right. The thing is that we are approaching the time for the second generation on 28 nm. See, AMD releases the HD 7990 in a few weeks*, while NV will wait another 2-4 months to destroy it again with the GTX 790.

Funny, no?
 
Wow, you sound like quite the apologist there. I think everyone (except you) agrees that the Crysis 2 DX11 patch was just terribly executed: tessellated water rendered underneath the ground in every level containing water, brick walls ludicrously over-tessellated, rock faces looking like ridiculous pincushions up close, and those concrete barriers with millions, if not tens of millions, of polys each, all completely wasted on FLAT surfaces...

Even Crytek themselves have said it wasn't much more than a quick hack, and it sure shows.
 
Wow, you sound like quite the apologist there. I think everyone (except you) agrees that the Crysis 2 DX11 patch was just terribly executed: tessellated water rendered underneath the ground in every level containing water, brick walls ludicrously over-tessellated, rock faces looking like ridiculous pincushions up close, and those concrete barriers with millions, if not tens of millions, of polys each, all completely wasted on FLAT surfaces...

Even Crytek themselves have said it wasn't much more than a quick hack, and it sure shows.

This.
 
The only decisive fact that came out of that whining fest was that the tessellation in Crysis 2 was well within the range allowed by the DX11 SDK. What's even more hilarious is that the level of tessellation in Crysis 2 was almost exactly the same as in AMD's SubD11 sample, given to Microsoft to show what AMD thought was the ideal level for games just before DX11 was finalized.
Of course it was within the range allowed by DX11. It's impossible to write a DX11 app that goes above tessellation factor 64. I don't know where you're getting that it's the same level of tessellation as in the SubD11 demo. That demo uses patches with more control points, and I believe it caps the factor at 32. It also has nothing to do with what AMD thought was ideal.
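
For what it's worth, that ceiling is baked into the API headers themselves. Here's a minimal C++ sketch: the D3D11_TESSELLATOR_MAX_TESSELLATION_FACTOR constant comes straight from d3d11.h, while the constant-buffer struct and clamp helper are hypothetical, just to illustrate where an app's requested factor hits the cap.

```cpp
#include <d3d11.h>    // defines D3D11_TESSELLATOR_MAX_TESSELLATION_FACTOR (64)
#include <algorithm>

// Hypothetical per-frame constants mirrored by a hull shader's cbuffer.
struct TessConstants
{
    float edgeFactor;   // fed to SV_TessFactor in the patch-constant function
    float insideFactor; // fed to SV_InsideTessFactor
    float padding[2];   // keep the cbuffer 16-byte aligned
};

// Whatever an app requests, the fixed-function tessellator operates in
// [1, 64]; anything larger is clamped, so "above 64" simply cannot happen.
float ClampTessFactor(float requested)
{
    return std::min(std::max(requested, 1.0f),
                    static_cast<float>(D3D11_TESSELLATOR_MAX_TESSELLATION_FACTOR));
}
```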
So instead of being a respectable company and accepting that their early DX11 cards flat-out sucked at tessellation, AMD executives hoping to avoid blame and keep their jobs decided to invent a non-existent scandal in which Crytek and Nvidia conspired to give Crysis 2 triangles smaller than a pixel, or geometry that's never seen: visually useless, but a performance hit!
Senior executives probably knew nothing about the minor drama that was Crysis 2's tessellation implementation.
 
PCB-wise they are identical, save for the memory devices. The improved "sensing" is actually part of PowerTune and comes via software. Anyone trying out "Boost" should make sure they are using the 12.7 beta drivers.
 
Does that mean Turbo will work with it, or only PTE?

Edit: after reading the results a bit more carefully, it looks like that is the case.
 
Does that mean Turbo will work with it, or only PTE?
DTE works with all cards and BIOS versions using the Cat 12.7+ drivers. For the boost thing you need one of the newer 7970 GEs, or you can try modding a normal one using the BIOS linked above.
 
DTE works with all cards and BIOS versions using the Cat 12.7+ drivers. For the boost thing you need one of the newer 7970 GEs, or you can try modding a normal one using the BIOS linked above.

I wasn't sure at first whether it also applied to PT Boost.

But anyway, as many have mentioned: just OC your 7970 and be done with it.
 
So if I already have my cards overclocked to 1.1GHz core / 1.6GHz memory, the only benefit I would see is reduced power consumption via boost's dynamic clock management, correct?
 
So if I already have my cards overclocked to 1.1GHz core / 1.6GHz memory, the only benefit I would see is reduced power consumption via boost's dynamic clock management, correct?
If you did the overclock at stock voltage, no; apparently a slightly higher voltage is applied for the 1050MHz boost state.
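
To see why the voltage bump can cancel out the lower clock, here's a back-of-the-envelope C++ sketch using the usual dynamic-power approximation P ∝ f·V². Both voltage figures are illustrative assumptions, not measured values.

```cpp
#include <cstdio>

int main()
{
    // Assumed stock-voltage manual overclock vs. the GE boost state.
    // Both voltages are guesses for illustration only.
    const double f_oc = 1100.0, v_oc = 1.175; // MHz, volts
    const double f_ge = 1050.0, v_ge = 1.218; // MHz, volts

    // Dynamic power scales roughly with frequency * voltage^2.
    const double ratio = (f_ge * v_ge * v_ge) / (f_oc * v_oc * v_oc);
    std::printf("Boost state draws ~%.0f%% of the manual OC's dynamic power.\n",
                ratio * 100.0);
    return 0;
}
```

With these assumed numbers the boost state actually draws slightly more dynamic power despite the lower clock, which is why a stock-voltage overclock wouldn't see a power saving.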
 