AMD: Pirate Islands (R* 3** series) Speculation/Rumor Thread

Interesting, though: with Far Cry 4, turning on 2x MSAA costs the Fury X 20%. That is a huge hit.

Is Far Cry 4 one of the games that is close to the 4GB limit?

I think AMD said they have a few engineers working on improving VRAM utilization now, so maybe it won't be as much of an issue later on.
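For a rough sense of scale, some back-of-the-envelope arithmetic (illustrative only; it assumes an uncompressed RGBA8 color buffer, and real games and drivers differ) shows the render targets themselves are a small slice of 4GB at 4K, even with MSAA:

```python
# Illustrative VRAM arithmetic for 4K render targets.
# Assumes an uncompressed RGBA8 color buffer; real usage varies.
WIDTH, HEIGHT = 3840, 2160   # 4K UHD
BYTES_PER_PIXEL = 4          # RGBA8

def color_buffer_mib(msaa_samples=1):
    """MiB for one color buffer at the given MSAA sample count."""
    return WIDTH * HEIGHT * BYTES_PER_PIXEL * msaa_samples / 2**20

print(color_buffer_mib(1))  # ~31.6 MiB without MSAA
print(color_buffer_mib(2))  # ~63.3 MiB with 2x MSAA
```

Even a handful of such buffers is well under 1GB, so a game brushing against 4GB at 4K is mostly textures and other assets, which is presumably where driver-side VRAM-utilization work could actually help.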
 
Well, if they can do anything, they have to do it before reviewers put up their reviews. This card is marketed to the 4K crowd; it's not good if it hits a VRAM wall.
 
Well, if they can do anything, they have to do it before reviewers put up their reviews. This card is marketed to the 4K crowd; it's not good if it hits a VRAM wall.
Yeah, who knows. I remember them stating the reason it took so long to launch these cards was to polish the drivers up. The hardware was ready months ago, I heard.
 
Would disabling the UEFI check not be enough in this case?
I just skimmed through that thread (yikes) and I don't see any firm answer. There is a post from when people were trying this and that, before Dell and nVidia provided the solution that did work. So maybe I was wrong in thinking UEFI was somehow related.
Troubleshooting details:
1) Install the A08 BIOS update from the Dell drivers page.
2) Remove the GPUs. Connect the display to the Intel (motherboard) GPU. Disable Secure Boot.
3) Plug in the GTX 970 with the display connected only to the GTX 970. It will boot exactly once.
4) Once you shut down or restart for any reason, it will enter an infinite boot cycle.
5) The only way to make it boot again is to remove the GPU, boot once on the Intel GPU, shut down the system, and connect the GTX 970 again. It will boot again exactly once. No mercy.
Ok, no more OT from me. :) Back to business!
 
Well, if they are making this card for overclocking and don't let people overclock the RAM, what do you think will happen?
Plenty of video cards in the past have had RAM that was already pushed so hard you basically couldn't overclock it at all, so it's not a big deal. My current Hawaii has factory-OC'd RAM that artefacts in 3DMark Fire Strike; any more on that knob and it just gets worse.
 
Ahh ... this is a fantastic disinformation campaign. Why not just lift the NDA if the Fury series is that good? We will find out about it sooner or later ... :yes:
Disinformation campaign is probably an overly strong term. Rather, this falls under the traditional general marketing umbrella: first you tease your new product, then you display it (which they did at CES earlier this month), then you unveil it (at E3 some days ago now), and then, to keep up the hype, the NDA on reviews expires only when the product is available to buy (or close enough anyway).

If you reveal everything all at once before it's in stores, all the hype dies, and the product already feels old when it finally is available to buy.
 
Guru3d's take on AMD's internal benchmarks ...
Secondly, and I want to make a strong note about this: these benchmarks were made by AMD themselves, they're at 4K only, and they paint a very narrow picture. Please understand that when a company shows performance results in a guide or document, only the positive numbers are included. It's a marketing strategy to make 'the other guys' look bad. Yes, cherry-picked results.

So please do take these things with a grain of salt and just wait for the real-world numbers from media outlets like Guru3D.com before you draw any conclusions, okay? The Radeon R9 Fury X packs 4096 GCN cores and 8.9 billion transistors.
http://www.guru3d.com/news-story/amd-radeon-r9-fury-x-official-benchmarks.html
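As a sanity check on those headline specs, the usual theoretical FP32 figure falls out of core count × 2 FLOPs per core per clock × clock speed. The ~1050 MHz clock below is an assumed, widely reported figure for the Fury X, not something taken from the Guru3D piece:

```python
# Theoretical single-precision throughput from the quoted specs.
# The 1050 MHz clock is an assumed/reported figure, not from the article.
GCN_CORES = 4096
FLOPS_PER_CORE_PER_CLOCK = 2   # one fused multiply-add = 2 FLOPs
CLOCK_HZ = 1050e6

tflops = GCN_CORES * FLOPS_PER_CORE_PER_CLOCK * CLOCK_HZ / 1e12
print(round(tflops, 2))  # ~8.6 TFLOPS
```

This is a peak-rate ceiling, of course; real game performance depends on bandwidth, drivers, and how well the shader array is fed.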

Also an interesting observation from a forum poster on AMD's internal benchmarks.
These cherry-picked games are also running at a bizarre range of settings. I mean, who would buy a high-end GPU to run a 4K display with in-game settings set to MEDIUM and no AF? Why not just put all games at maximum, set the 4K resolution with 16x AF and either AA on or off (throughout all tests), and see which is the better performer? Heck, even the numbers from AC Unity are poor on both cards with those settings, so why not just see what they can do flat out, since we all know we can run games at 4K if you adjust the settings.
 
Also an interesting observation from a forum poster on AMD's internal benchmarks.

These cherry-picked games are also running at a bizarre range of settings. I mean, who would buy a high-end GPU to run a 4K display with in-game settings set to MEDIUM and no AF? Why not just put all games at maximum, set the 4K resolution with 16x AF and either AA on or off (throughout all tests), and see which is the better performer? Heck, even the numbers from AC Unity are poor on both cards with those settings, so why not just see what they can do flat out, since we all know we can run games at 4K if you adjust the settings.

That is what people are actually doing out there.
I saw someone running Witcher 3 at 1080p at 100 fps, obviously not maxed out.
Of course people adjust settings when they game; why is that a surprise?
 
Correct. That's a hardware limitation. They hit the reticle limit, and they weren't going to be able to put in over 4GB of VRAM. It was never going to be an FP64 compute GPU.
Originally AMD never published it. When I was writing that table, I forgot that we finally got them to cough up a number when the 295X2 was launched (and I have since corrected the article). Even then, that's the only place you'll find it.

As usual, Wikipedia is wrong. Officially, Tonga doesn't have an HEVC decoder, though I wouldn't be surprised if it's the same UVD design with HEVC either broken or intentionally turned off. Fiji, however, does have an HEVC decoder.

Then someone (maybe you?) should correct it; after all, that's the spirit of Wikipedia :)
 
Also an interesting observation from a forum poster on AMD's internal benchmarks.
This is not strange at all. They use the settings that are playable. Yes, you could just push everything to the max and both cards would suffer, but that would do no good for AMD. They might still have won the benchmark, but for marketing purposes it would be stupid. It's at 4K because 4K is something AMD always talks about.
Sometimes people need to stop overthinking...
 
That is what people are actually doing out there.
I saw someone running Witcher 3 at 1080p at 100 fps, obviously not maxed out.
Of course people adjust settings when they game; why is that a surprise?

I haven't run anything with AF lower than 8x in over 10 years, I believe. Not even when I'm trying to play a recent game on my two-year-old laptop with a GF107.
I get that they wouldn't turn on MSAA if they're already rendering at 4K, but turning off anisotropic filtering and dropping some settings to Medium raises quite the huge red flag for me.

You don't get to price a card at $650 and then market it to run games on medium settings with AF turned off. That's just nonsense, IMO.
 
The settings AMD used for these benchmarks just show how useless manufacturer numbers are, and it doesn't only apply to AMD. The only useful numbers shown are for:
* 3DMark
* BF4
* FarCry 4
* Skyrim
* Sleeping Dogs
* Sniper Elite 3
* The Witcher 3

The rest of them are useless from a gamer's point of view. They might still be used for comparing hardware and finding bottlenecks, but they're not a representation of real-world usage.
 
I know 4K is all the rage, but anything below 60fps is unplayable to me. It's quite obvious no single card can do that @4K.

So GG AMD and Nvidia, let me know when we get there.
 
Plenty of video cards in the past have had RAM that was already pushed so hard you basically couldn't overclock it at all, so it's not a big deal. My current Hawaii has factory-OC'd RAM that artefacts in 3DMark Fire Strike; any more on that knob and it just gets worse.


Well, it isn't that. This card is marketed as a card for overclockers; if you can only overclock the GPU, that kinda defeats the purpose. And it looks to be bandwidth-limited when overclocking, judging by AMD's results; hopefully that will be resolved in future drivers.
 
That is what people are actually doing out there.
I saw someone running Witcher 3 at 1080p at 100 fps, obviously not maxed out.
Of course people adjust settings when they game; why is that a surprise?

Benchmarks are different; they're there to gauge a card's performance against other cards, so a clear picture forms of which card is a good purchase. Isolating certain instances where things are favorable to one card isn't a good measurement technique. But it did give a good deal of insight into where it might have issues.

And who would buy a $650 card to play at medium settings? The only time I did that was with Crysis, and that's because the game crushed hardware; Witcher 3 isn't like that. Then again, this is personal preference, so I guess it's kinda moot.
 
Oh, it's AMD doing this? Well, then it's all right, I guess. The positive side of it: just like the Tomb Raider TressFX thing, it will provide years of counter-fodder.
The most surprising part of it all is that AMD would explain to an editor why they retracted a review sample. Sometimes I think even I'd be better at this stuff than they are...
 
And who would buy a $650 card to play at medium settings?

Just speaking in general terms (not specifically about "medium") but there are certainly settings that are too expensive for what they contribute to the final presentation, plus some settings can be annoying to some people (various post-processing or even shadow settings).

e.g.
Someone might hate all forms of depth-of-field, so bokeh can bokeh a ticket out of here.
Egregious light shafts, motion blur, bloom, chromatic aberration.
Bioshock Infinite's contact hardening shadows are terrible in practice because the shadow cascades are literally cutting edges. Same thing happens in GTA5 IIRC.
 
Just speaking in general terms (not specifically about "medium") but there are certainly settings that are too expensive for what they contribute to the final presentation, plus some settings can be annoying to some people (various post-processing or even shadow settings).
I don't think there was ever a visual upside to disabling AF. ;)

But, yes, you're right.
 