I'm on a 7950. Do you think grabbing a 290X on eBay for under $200 is a decent move? I'd rather wait for the next micron drop to spend real money, but I kinda want a little more power now, and I'm seeing the 290s start to drop a lot in price.
I haven't picked one to bid on. I would prob put my H60 with the Kraken on whichever card I choose.
Do you guys think the 7950 to 290X is worth it, though? I'm kinda wishy-washy on it.
I went from a 7950* to a GTX 970, and the difference is noticeable. I think the 290X would be a roughly comparable upgrade.
I was really considering the 290Xs, but it turned out I'd have needed a rare one with a special BIOS in order to run it in my Dell XPS 8300. And even with that, it didn't sound 100% certain there wouldn't be issues. I spent hours googling the issue, but my loyalty had a limit, and I bought an EVGA GeForce GTX 970 SSC ACX 2.0+, the one with the new cooler, on sale, and I ended up with two great games bundled with it. Not sure if that's OT, but it is the background to the switch.
*Btw, in the guru3d forums you can score drivers that allow downsampling on the 79x0 cards.
That should do the trick. Well, I can only encourage you to check some reviews and see the gain for yourself with the games or software you are using.
Personally I haven't upgraded my system past my 7970s, but I have two of them, and they are just incredibly fast, even for OpenCL 3D ray-traced rendering (LuxRender, V-Ray, or now Cycles, since AMD has just patched Blender to enable OpenCL in Cycles). (Damn, can you imagine, it took AMD writing the patch for Blender before the development team understood what they should do; I use OpenCL + Blender with exporters without any problem... maybe they should hire some people from the LuxRender team.) Instead of complaining about the kernel size, they never thought to just split it into multiple kernels... crazy. Fixing Cycles OpenCL was so simple that, when you see how long the Blender team went without doing it, it's a lot of time lost.
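For context, the "split kernel" fix being described is roughly this: instead of one giant path-tracing megakernel that stresses the OpenCL compiler and register allocator, you break the work into small stage kernels that hand state to each other through global buffers. Below is a minimal OpenCL C sketch of that shape; the kernel names, the PathState layout, and the stubbed logic are invented for illustration and are not Cycles' actual code (a host program would enqueue these stages in a loop per bounce).

```c
/* Invented illustration of a "split kernel" layout: three small kernels
   instead of one megakernel, sharing state through global buffers. */

typedef struct {
    float3 origin, dir;       /* current ray */
    float3 throughput;        /* accumulated path weight */
    int    depth;             /* bounce count */
    int    alive;             /* 0 once the path terminates */
} PathState;

/* Stage 1: spawn one camera ray per pixel. */
__kernel void generate_rays(__global PathState *paths, int width)
{
    int gid = (int)get_global_id(0);
    float x = (float)(gid % width) / width - 0.5f;
    float y = (float)(gid / width) / width - 0.5f;
    paths[gid].origin     = (float3)(0.0f, 0.0f, 0.0f);
    paths[gid].dir        = normalize((float3)(x, y, -1.0f));
    paths[gid].throughput = (float3)(1.0f, 1.0f, 1.0f);
    paths[gid].depth      = 0;
    paths[gid].alive      = 1;
}

/* Stage 2: intersect live rays with the scene (stubbed here). */
__kernel void intersect(__global PathState *paths, __global float *hit_t)
{
    int gid = (int)get_global_id(0);
    if (!paths[gid].alive) return;
    hit_t[gid] = 1.0f;                     /* pretend everything hits at t = 1 */
}

/* Stage 3: shade, accumulate, and decide whether the path continues.
   Because each stage is small, the compiler never sees one huge kernel. */
__kernel void shade(__global PathState *paths, __global const float *hit_t,
                    __global float4 *accum, int max_depth)
{
    int gid = (int)get_global_id(0);
    if (!paths[gid].alive) return;
    accum[gid] += (float4)(paths[gid].throughput * 0.5f, 1.0f);
    paths[gid].throughput *= 0.5f;         /* fake a 50% grey bounce */
    if (++paths[gid].depth >= max_depth)
        paths[gid].alive = 0;
}
```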
It's funny because it's at exactly the same time that Blender announced support for multiple asset formats: the Pixar engine, the Alembic engine (Lucasfilm, if I'm right), and a lot of other well-known movie studio engines.
Now, just saying, I'm quite tempted to get a Fury X. I have 8.4 TFLOPS with my 2x 7970s, so for rendering I should end up at something like 17-18 TFLOPS by adding just one Fury.
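A rough back-of-the-envelope check of that sum, assuming about 2048 shaders at ~1025 MHz for each (mildly overclocked) 7970 and 4096 shaders at 1050 MHz for the Fury X; the clock figures are my assumption, not the exact cards above:

```c
#include <stdio.h>

/* FP32 throughput estimate: shaders x clock x 2 FLOPs (FMA) per clock. */
static double tflops(int shaders, double clock_ghz)
{
    return shaders * clock_ghz * 2.0 / 1000.0;
}

int main(void)
{
    double hd7970 = tflops(2048, 1.025);   /* ~4.2 TFLOPS each (mild OC) */
    double fury_x = tflops(4096, 1.050);   /* ~8.6 TFLOPS                */
    printf("2x 7970:       %.1f TFLOPS\n", 2.0 * hd7970);
    printf("+ one Fury X:  %.1f TFLOPS\n", 2.0 * hd7970 + fury_x);
    return 0;
}
```

That lands right around 8.4 TFLOPS for the pair and ~17 TFLOPS with a Fury X added, matching the figures above.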
We are saving for a house, so I have limited funds to play with. I was thinking of a 290X so that when I get my Rift or Vive I have enough oomph to run it decently, and then later that year buy two new graphics cards based on the next micron drop and HBM2.
I won't lie though, a Fury X would be nice, but I can't afford it right now. A Nano would be nice too, but there's no word on performance or price yet.
Hey, a good choice is never OT. The 970 is a really cool GPU, even with the memory thing. But why would it have been a problem with the 8300, if I can ask? Would it not recognize the GPU? I would be a bit surprised if that were the case.
Is somebody strong-arming Nvidia to not provide an AIO solution like that for the same price-point?

It's basically an apples-to-oranges comparison, but that advantage may be necessary to remain competitive...
Tom's has the 290X in uber mode at 242 W compared to 233 W for the 980 Ti. What exactly do you mean to convey by that statement, then?
It's unknown if the AIO cooler is really needed or just something they did for, uhm, coolness factor.
Except power draw isn't directly comparable to benchmark numbers, so I don't really see your point. By how much is the Fury X winning some of the benchmarks, again?
There's a lot of negativity radiating from you in this thread. What is it they're "hiding", you think? They announced a 1.5x perf/watt improvement vs. Hawaii. Other figures for power usage are also circulating in this thread, so I have to assume those also come from AMD. Sounds like you've got an axe of some sort to grind, for reasons I can't quite figure out.

In any case, it's simple: when they start steering away from comparing performance and power usage against their competition, they are hiding something, or there's something they don't want to give up because it's going to hurt them in some way.
Why would you need specific BIOS support for new video cards? All video cards use the same BIOS compatibility nonsense going right back to the early 1980s; to the BIOS, even the latest GPUs are just a dumb text display. So you should be totally fine with any video card you like, provided Dell gave you the appropriate auxiliary PCIe power connectors, which isn't a sure thing. (Saying this from experience; I had an XPS box with a 1000 W PSU that only offered 6-pin power connectors. BLAH.)

Dell has an amusing/infuriating attitude about upgrading the BIOS of their PCs so they can deal with newer video cards: they officially support what was available during the time the machine was on sale, and no more.
Is somebody strong-arming Nvidia to not provide an AIO solution like that for the same price-point?
The comparison is apt as long as it's done according to pricing.
Assuming you don't make frequent video card purchases, I'd save and wait for HBM2 offerings, since that will be the real game-changer regarding a "future proof" purchase.
Well, I wouldn't assume that the first GPUs on 16/14 nm FinFET will be 600 mm² monsters. (Also, design costs will rise, so low-volume products are less viable.) If I were to make a guess, I'd say that the first products are more likely to top out at around 300 mm² or so, making performance increases versus today's top-end GPUs relatively modest. Of course, the amount of memory can increase, and I'd hope DisplayPort 1.3 is included to drive high-resolution displays, as well as 4K at higher frame rates. Also, I would assume that the first products will have novelty price tags attached to them for some time...
So if you've had a peek at everything presented in the past few days, there's one thing you will have noticed: the new architecture doesn't seem to offer support for HDMI 2.0, but instead still uses HDMI 1.4a. This means that with Fury GPU based products, like the new Project Quantum for example (which really is a small form factor PC), you can't fully use them on an Ultra HD TV.
See, HDMI 2.0 offers the bandwidth to support 4K at 60 Hz; on 1.4a it drops back to a measly 30 Hz. A big miss if you ask me; HDMI 2.0 is the answer for products in the living room. Especially with the Nano and Project Quantum in mind, this might not have been the smartest move from the engineering team within AMD.
It's a bit of a thing for 4K gaming in the living room I'd say.
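For a rough sense of the numbers, here is a back-of-the-envelope check using the standard 4K timings including blanking and the usable link rates after 8b/10b coding (roughly 8.16 Gbit/s for HDMI 1.4 and 14.4 Gbit/s for HDMI 2.0); treat these as approximations rather than spec-exact figures.

```c
#include <stdio.h>

int main(void)
{
    /* 4K timing including blanking (4400 x 2250 total), 8-bit RGB. */
    const double htotal = 4400.0, vtotal = 2250.0, bpp = 24.0;
    const double hdmi14 = 8.16e9;   /* usable bits/s, HDMI 1.4 */
    const double hdmi20 = 14.4e9;   /* usable bits/s, HDMI 2.0 */

    const double rates[] = { 30.0, 60.0 };
    for (int i = 0; i < 2; i++) {
        double bps = htotal * vtotal * rates[i] * bpp;
        printf("4K @ %2.0f Hz needs %5.2f Gbit/s -> HDMI 1.4: %s, HDMI 2.0: %s\n",
               rates[i], bps / 1e9,
               bps <= hdmi14 ? "fits" : "too much",
               bps <= hdmi20 ? "fits" : "too much");
    }
    return 0;
}
```

4K30 needs roughly 7 Gbit/s and squeezes into HDMI 1.4, while 4K60 needs roughly 14 Gbit/s and only fits HDMI 2.0, which is why the 1.4a-only output matters for living-room use.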
This question was answered by AMD Matt from our forums:
Q: Does the AMD Radeon R9 Fury X GPU have HDMI 2.0?
A: No. AMD recommends and uses DisplayPort 1.2a for 4K60 content.