AMD: Pirate Islands (R* 3** series) Speculation/Rumor Thread

I'm on a 7950. Do you think grabbing a 290X on eBay for under $200 is a decent move? I'd rather wait to drop real money until the next process-node shrink, but I kinda want a little more power now, and I'm seeing the 290s start to drop a lot in price.
 


Just depends on how much horsepower ya need, but yeah, for under $200 it's a good deal.
 

Is it a reference one? Those are kinda loud. I would go for aftermarket with dual axial fans.
 
I haven't picked one to bid on. I'd probably put my H60 with the Kraken bracket on whichever card I choose.

Do you guys think going from the 7950 to a 290X is worth it, though? I'm kinda wishy-washy on it.
 

It should do the trick. I can only encourage you to check some reviews and see the gains for yourself in the games or software you use.

Personally I haven't upgraded beyond my 7970s, but I have two of them and they are just incredibly fast, even for OpenCL 3D ray-traced rendering (LuxRender, V-Ray, or now Cycles, since AMD just patched Blender to enable OpenCL in Cycles). It's telling that it took AMD writing the patch for the Blender team to understand what they needed to do; I use OpenCL with Blender and exporters without any problem... maybe they should hire some people from the LuxRender team. Instead of complaining about kernel size, they never thought to simply split it into multiple kernels. The OpenCL fix for Cycles was so simple that, considering how long the Blender team went without doing it, a lot of time was wasted.
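To illustrate what I mean by splitting, here's a toy sketch in Python with pyopencl. It assumes you have a working OpenCL device, and the kernel names and math are made up; it has nothing to do with Cycles' actual code. The point is simply that two small kernels chained through an intermediate buffer are easier to compile and schedule than one megakernel that does everything:

```python
# Toy split-kernel sketch (hypothetical kernels, not Cycles code): two small
# enqueues chained through an intermediate buffer instead of one megakernel.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

src = """
__kernel void intersect(__global const float *rays, __global float *hits) {
    int i = get_global_id(0);
    hits[i] = rays[i] * 0.5f;        /* stand-in for ray/scene intersection */
}
__kernel void shade(__global const float *hits, __global float *radiance) {
    int i = get_global_id(0);
    radiance[i] = hits[i] + 1.0f;    /* stand-in for shading/lighting */
}
"""
prg = cl.Program(ctx, src).build()

n = 1024
rays = np.random.rand(n).astype(np.float32)
mf = cl.mem_flags
d_rays = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=rays)
d_hits = cl.Buffer(ctx, mf.READ_WRITE, rays.nbytes)
d_rad = cl.Buffer(ctx, mf.WRITE_ONLY, rays.nbytes)

# Each stage is a separate, small kernel launch.
prg.intersect(queue, (n,), None, d_rays, d_hits)
prg.shade(queue, (n,), None, d_hits, d_rad)

radiance = np.empty_like(rays)
cl.enqueue_copy(queue, radiance, d_rad)
print(radiance[:4])
```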

It's funny because, at exactly the same time, Blender announced support for multiple asset formats: Pixar's engine, the Alembic format (from Lucasfilm, if I'm right), and a lot of other well-known movie-studio formats.

Now, just saying, I'm quite tempted to grab a Fury X. I have about 8.4 TFLOPS with my 2x 7970, so for rendering I should end up at something like 17-18 TFLOPS just by adding one Fury.
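For what it's worth, the back-of-the-envelope math works out, assuming GHz Edition clocks on the 7970s and the announced 4096 shaders at ~1.05 GHz for Fury X (a rough sketch; real boards and clocks may differ):

```python
# Theoretical FP32 throughput: 2 FLOPs per shader per clock (FMA) x shaders x clock.
def tflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz / 1000.0

hd7970_ghz = tflops(2048, 1.05)   # ~4.3 TFLOPS per card
fury_x = tflops(4096, 1.05)       # ~8.6 TFLOPS

print(f"2x 7970:          {2 * hd7970_ghz:.1f} TFLOPS")            # ~8.6
print(f"2x 7970 + Fury X: {2 * hd7970_ghz + fury_x:.1f} TFLOPS")   # ~17.2
```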
 
I went from a 7950* to a GTX 970, and the difference is noticeable. ;) I think it's a roughly comparable upgrade.

I was really considering the 290Xs, but it turned out I'd have needed a rare one with a special BIOS in order to run it in my Dell XPS 8300, and even then it didn't sound 100% certain there wouldn't be issues. I spent hours Googling the problem, but my loyalty had a limit and I bought an EVGA GeForce GTX 970 SSC ACX 2.0+ (the one with the new cooler) on sale, and I ended up with two great games bundled with it. Not sure if that's OT, but it's the background to the switch.

*Btw, in the guru3d forums you can score drivers that allow downsampling on the 79x0 cards.
 

Heh, a good choice is never OT. The 970 is a really cool GPU, even with the memory thing. But why would it have been a problem with the 8300, if I may ask? It couldn't recognize the GPU? I'd be a bit surprised if that were the case.
 


We are saving for a house, so I have limited funds to play with. I was thinking of a 290X so that when I get my Rift or Vive I have enough oomph to run it decently, and then later that year I'd buy two new graphics cards based on the next node shrink and HBM2.

I won't lie though, a Fury X would be nice, but I can't afford it right now. A Nano would be nice too, but there's no idea yet of its performance or price.
 

I don't mean to be rude, but it's not $200 more or less that will decide whether or not you can buy a house... But don't send your wife to me for an explanation; I think she'd knock me out.

I'm joking, but I completely understand.

To be honest, I think the 290X would be a really good step up from the 7950, or an Nvidia card in the same performance ballpark. I'm not much of a brand loyalist, even though I've clearly owned more ATI/AMD GPUs than Nvidia ones.
 
I'm going to see if I can grab one real cheap then. I think in the next week or so, as the 390X and the Furys show up, prices will drop a lot.
 
Heh, a good choice is never OT. The 970 is a really cool GPU, even with the memory thing. But why would it have been a problem with the 8300, if I may ask? It couldn't recognize the GPU? I'd be a bit surprised if that were the case.

Dell has an amusing/infuriating attitude about upgrading the BIOS of their PCs to deal with newer video cards. They officially support what was available while the system was on sale, and no more. It got crazy, with older XPS systems like the 420 supporting newer Nvidia cards like the 6xx series while the fairly recent 8500 series didn't. Finally Nvidia had to team up with Dell to make a custom BIOS. There were some crazy threads in the Dell forums over that.

My Dells were great bargains. I got them for less than the cost of their parts (a large sale on the 8300, plus I lucked out with a variable discount and won an extra 30% off), so the OS, assembly, and goodies were a bonus. No overclocking, but with my i7 2600 I haven't had any need. But as you see, this came with a downside.

But to answer your question, what I read suggests it has something to do with AMD 290 cards being most compatible with newer PCs that use UEFI; they aren't built with older Dells and their older BIOSes in mind. I also read that MSI makes a 290X with a BIOS toggle to run in a compatibility mode for older hardware, but I had my heart set on the ASUS STRIX 290X. Long story short, it all added up to me reading about fellow 8300 owners buying Nvidia cards and having smooth sailing.

I might build my next one. :) I've disassembled this one enough, even reapplied thermal grease on the CPU, so it's not such a forbidding task.

Edit: I was just now reminded of the irony that clinched my buying a 7950. :) http://hardforum.com/showthread.php?t=1697618
I didn't have this issue with my 420. It came with an 8800 GT and saw upgrades to a 5700 and then a 6870; it's now hugging my 7950. Old hardware, the XPS 420, but a very large and expensive build, and it even looks more expensive than my 8300.
 
Ooh, OK, I see... The only Dell XPS I've used was an office workstation. At home, and generally at work, every computer I've used (well, since my first Athlon XP machine) has been my own custom build... so I've never had this type of "bad" surprise.
 
It's basically an apples-to-oranges comparison, but that advantage may be necessary to remain competitive....
Is somebody strong-arming Nvidia to not provide an AIO solution like that for the same price-point?

The comparison is apt as long as it's done on the basis of price.
 
It's unknown if the AIO cooler is really needed or just something they did for, uhm, coolness factor. ;)

We'll see with the non-X Fury whether there's enough headroom to overclock, or simply good average temps and noise levels in daily use.

About putting an AIO on a GTX 980 Ti: here's one at around 50 degrees Celsius, for example: www.evga.com/Products/Product.aspx?pn=06G-P4-1996-KR

And I'd like to add this on the Titan X's average power consumption: www.hardware.fr/articles/935-3/consommation-efficacite-energetique.html
 
How much is Fury X winning some of the benchmarks again?
Except, power draw isn't directly comparable to benchmark numbers, so I don't really see your point. :)

In any case it's simple: when they start steering the conversation away from performance and power usage vs. their competition, they're hiding something, or there's something they don't want to give up because it's going to hurt them in some way.
There's a lot of negativity radiating from you in this thread. What is it they're "hiding", do you think? They announced a 1.5x perf/watt improvement vs. Hawaii. Other figures for power usage are also circulating in this thread, so I have to assume those also come from AMD. Sounds like you've got an axe of some sort to grind, for reasons I can't quite figure out.

Dell has an amusing/infuriating attitude about upgrading the BIOS of their PCs to deal with newer video cards. They officially support what was available while the system was on sale, and no more.
Why would you need specific BIOS support for new video cards? All video cards support the same BIOS compatibility nonsense hailing right back to the early 1980s; to the BIOS, even the latest GPUs are just a dumb text display. So you should be totally fine with any video card you like, provided Dell gave you the appropriate auxiliary PCIe power connectors, which isn't a sure thing. (Saying this from experience; I had an XPS box with a 1000W PSU that only offered 6-pin power connectors. Blah.)
 
Is somebody strong-arming Nvidia to not provide an AIO solution like that for the same price-point?

The comparison is apt as long as it's done on the basis of price.

I think Nvidia's view is that the hybrid solution is not required to be competitive with current "reference" offerings, but that doesn't mean it won't change in the future. Current AIB hybrid offerings for the Titan X, 980 Ti, and 980 are about $100 more. I think a Fury X price point without the AIO solution would broaden product appeal and potential sales (lower price), but it's uncertain whether the Fury X's use of an AIO as the only cooling solution is necessary for its performance to remain competitive.
 
I haven't picked one to bid on. I'd probably put my H60 with the Kraken bracket on whichever card I choose. Do you guys think going from the 7950 to a 290X is worth it, though? I'm kinda wishy-washy on it.
Assuming you don't make frequent video card purchases, I'd save and wait for HBM2 offerings since that will be the real game-changer regarding a "future proof" purchase.
 
Well, I wouldn't assume that the first GPUs on 16/14nm FinFET will be 600 mm² monsters. (Also, design costs will rise, so low-volume products make less sense.) If I were to make a guess, I'd say the first products are more likely to top out at around 300 mm² or so, making performance increases versus today's top-end GPUs relatively modest. Of course, the amount of memory can increase, and I'd hope DisplayPort 1.3 is included to drive high-resolution displays, as well as 4K at higher frame rates. Also, I would assume the first products will carry novelty price tags for some time...

It has generally been a good idea to purchase GPUs in sync with lithographic node steps, and the upcoming step promises to be a major one. But these last 28nm behemoths are really ambitious, so if you want the performance now, it may actually be a decent approach to buy and enjoy while the question marks around what will be produced at 16/14nm, when, and at what price get sorted out. Personally, I'm holding on to my wallet until I see DP 1.3. :)
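As a rough sanity check on why DP 1.3 matters for high-resolution and high-refresh displays, here's a quick sketch. The link figures are assumptions on my part (~17.28 Gbps of effective payload for DP 1.2 HBR2 and ~25.92 Gbps for DP 1.3 HBR3 after 8b/10b coding), and blanking overhead is ignored, so real requirements are somewhat higher:

```python
# Uncompressed video bandwidth needed, ignoring blanking intervals.
def need_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

DP12_GBPS, DP13_GBPS = 17.28, 25.92  # assumed effective link payloads

for name, need in [("4K @  60 Hz", need_gbps(3840, 2160, 60)),
                   ("4K @ 120 Hz", need_gbps(3840, 2160, 120)),
                   ("5K @  60 Hz", need_gbps(5120, 2880, 60))]:
    print(f"{name}: {need:5.1f} Gbps | fits DP1.2: {need < DP12_GBPS} | fits DP1.3: {need < DP13_GBPS}")
```

Roughly: 4K60 fits in DP 1.2, but 4K at 120 Hz or a 5K panel at 60 Hz only fits in DP 1.3.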
 
AMD Radeon Fury X doesn't have HDMI 2.0 support

If you've had a peek at everything presented over the past few days, one thing you will have noticed: the new architecture doesn't seem to offer support for HDMI 2.0, instead still using HDMI 1.4a. This means that Fury GPU based products, like the new Project Quantum for example (which really is a small-form-factor PC), can't be fully used on an Ultra HD TV.

See, HDMI 2.0 offers the bandwidth for 4K at 60 Hz; on 1.4a it drops back to a measly 30 Hz. A big miss if you ask me, since HDMI 2.0 is the answer for products in the living room. Especially with the Nano and Project Quantum in mind, this might not have been the smartest move from the engineering team at AMD.

It's a bit of a thing for 4K gaming in the living room I'd say.

This question was answered by AMD Matt on our forums:

Q: Does the AMD Radeon R9 Fury X GPU have HDMI 2.0?
A: No. AMD recommends and uses DisplayPort 1.2a for 4K60 content.

http://www.guru3d.com/news-story/amd-radeon-fury-x-doesnt-have-hdmi-2-support.html
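For reference, a quick back-of-the-envelope check of the 30 Hz vs 60 Hz claim. The link numbers are assumptions (~8.16 Gbps of effective video bandwidth for HDMI 1.4a and ~14.4 Gbps for HDMI 2.0 after 8b/10b coding), and blanking is ignored, so actual requirements are a bit higher:

```python
# Uncompressed 4K video bandwidth at 24 bits per pixel, ignoring blanking.
def need_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

HDMI14_GBPS, HDMI20_GBPS = 8.16, 14.4  # assumed effective payloads

for hz in (30, 60):
    need = need_gbps(3840, 2160, hz)
    print(f"4K @ {hz} Hz needs ~{need:.1f} Gbps -> "
          f"HDMI 1.4a: {'ok' if need < HDMI14_GBPS else 'too much'}, "
          f"HDMI 2.0: {'ok' if need < HDMI20_GBPS else 'too much'}")
```

Which is the whole story: 4K30 squeezes into HDMI 1.4a, while 4K60 needs HDMI 2.0 (or DisplayPort 1.2).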
 