New G-unit: Kepler vs GCN

Okay, so I'm after a new graphics card. I've narrowed it down to a GeForce 770 or a Radeon 280X.

The 760 I'm looking at is some fancy overclocked thing with 4GB of RAM (I ain't no 2GB scrub), and the 280X is a standard-clocked one (with a quiet non-stock cooler).

Bang for buck shows them to be close but with a slight edge perhaps going to the 280X. But here's the thing ...

Which architecture is more future-proof: Kepler or GCN??

My hardware lasts me years. Whatever I buy will likely see out my overclocked 2500K PC. And that will likely see out this gen of consoles. So I need to pick wisely. But I am not wise.

Which will age better, and be better suited to any incoming compute shiznit that gets added to games? (I'm not convinced that the lovely Mantle will take off, so I'm excluding it from consideration for the time being).
 
GF770 is 'only' DX11 on the hardware level; the 280X should be 11.2, unless that chip is a re-badged Tahiti core, in which case it's only 11.1. Not that it matters, as I don't think any PC games really use any of these additional features to any great extent.

Hard to say if any PC graphics hardware really ages all that well, TBH... They are all equally good until the next generation arrives and then they get shoved to the wayside. :p
 
Thanks for pointing that out! Just assumed they both offered the same level of DX support ...

So tiled resources will only be available on the 280X!

That should be in use by, like, 2018 right? So that could be useful to me! Too bad it'll mean I can't downgrade to Windows 7 like I was thinking of doing (I've spent some time on Win 7 machines recently, and even after a year with 8 I still prefer the feel of 7).
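(If anyone wants to poke at this on their own card: below is a rough, untested sketch of how a D3D11.2 app can ask the driver which feature level it got and whether tiled resources are exposed. It needs the Windows 8.1 SDK headers, and it's illustrative rather than anything production-grade.)

[code]
#include <d3d11_2.h>   // D3D11.2 headers (Windows 8.1 SDK) for the tiled-resources query
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main()
{
    ID3D11Device* device = nullptr;
    D3D_FEATURE_LEVEL level = D3D_FEATURE_LEVEL_9_1;

    // Create a device on the default adapter and see which feature level the driver gives us.
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                   nullptr, 0, D3D11_SDK_VERSION,
                                   &device, &level, nullptr);
    if (FAILED(hr)) { std::printf("No D3D11 hardware device.\n"); return 1; }

    // 0xb000 = feature level 11_0, 0xb100 = 11_1.
    std::printf("Feature level: 0x%x\n", (unsigned)level);

    // Tiled resources support is reported separately, via CheckFeatureSupport.
    D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                              &opts, sizeof(opts))))
    {
        std::printf("Tiled resources tier: %d\n", (int)opts.TiledResourcesTier);
    }

    device->Release();
    return 0;
}
[/code]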
 
Looking at the hardware on both new-gen consoles and how multi-platform games are usually developed (first consoles, then PC), I wouldn't purchase a new graphics card in 2014 that doesn't have a GCN GPU.

I bet that for the price of that 4GB GTX770 you could buy a custom-cooled R9 290 4GB, which should bring much better performance out-of-the-box and should be much more future-proof.
 
Damn. I meant to say it's a 4GB GTX 760. Spent all of yesterday looking at benchmarks ...

The 4GB 760 is about a 20% OC and £212.

The 3GB 280X is £235.

Very similar performance. One of the main reasons I want this in the short term is to stuff the memory full of insanely high-res Skyrim assets, and it's clear that 2GB won't hack it.

Long term ... I just want it to last without falling off a performance cliff and being frikkin useless like my 7900 GTX did back in 2006. I know that "in theory" GCN should be the more familiar architecture for developers long term, but in the PC space a lot depends on how hard the vendor pushes and supports developers, and how well the underlying architecture stands up to changes in usage.

GCN seems like it's better at mixing compute and rendering, which seems like a good thing to go forward with. But .... ?
 
Back in the day, I used the largest high-res texture mods for Skyrim I could find and it never went over ~1700MB out of my HD6950's 2GB.
That was 2 years ago, though.

As for the rest, are you sure the GTX760 has a similar performance to the HD7970 GHz, even if it clocks up to 1.2GHz?

What you gain with an nVidia card is PhysX, the ability to turn on constant video recording, and the hope that they'll turn the Shield's video streaming into a standard Android app.

But if you're concerned with longevity, I think getting an architecture that is essentially the same as the consoles' is definitely the way to go.
Besides, nVidia is set to move their architecture to Maxwell and embedded ARM CPUs during the next two years, so I don't think the current Keplers will stay in their driver development spotlight for much longer.
 
Is it possible for a 32-bit game to fill 2GB VRAM? Most 32-bit games aren't even large address aware AFAIK.
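(Side note: you can at least check whether a given exe is flagged large address aware by reading its PE header - dumpbin /headers will tell you the same thing. A rough, untested sketch, with the filename just a placeholder:)

[code]
#include <windows.h>   // IMAGE_* structs and the IMAGE_FILE_LARGE_ADDRESS_AWARE flag
#include <fstream>
#include <cstdio>

// Returns true if the exe's PE header has the Large Address Aware bit set,
// i.e. a 32-bit process that's allowed more than 2GB of address space.
bool IsLargeAddressAware(const char* path)
{
    std::ifstream f(path, std::ios::binary);
    IMAGE_DOS_HEADER dos = {};
    f.read(reinterpret_cast<char*>(&dos), sizeof(dos));
    if (!f || dos.e_magic != IMAGE_DOS_SIGNATURE) return false;   // "MZ"

    f.seekg(dos.e_lfanew, std::ios::beg);
    DWORD sig = 0;
    IMAGE_FILE_HEADER coff = {};
    f.read(reinterpret_cast<char*>(&sig), sizeof(sig));
    f.read(reinterpret_cast<char*>(&coff), sizeof(coff));
    if (!f || sig != IMAGE_NT_SIGNATURE) return false;            // "PE\0\0"

    return (coff.Characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE) != 0;
}

int main()
{
    // Path is just an example - point it at whatever 32-bit game exe you're curious about.
    std::printf("Large address aware: %s\n",
                IsLargeAddressAware("TESV.exe") ? "yes" : "no");
    return 0;
}
[/code]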

At this point both Kepler and GCN are pretty ancient. Both are about 2 years old. It's unprecedented. I wouldn't buy anything until 20nm and new architectures arrive. I suspect a major shake up is coming.

As for AMD being the supposed best option, let's also consider that the 360, Wii and Wii U are AMD but NV is still dominant. I think what happens in the future will depend on Maxwell vs. GCN 2.0, drivers and devrel. The usual. AMD seems to be doing some clever marketing/advertising these days though, hence the positive vibe going on. Of course in the future PC gaming will again, as usual, be anchored by obsolete consoles. The next PC GPUs should be vastly superior to them.
 
I'd definitely go with the 280x given that choice. AMD seems to be investing in GCN for the long term so it might have a brighter future than previous AMD architectures, plus there's the console link to keep it fresh, plus there's the fact that it's faster than the 760 out of the box. You'll also be set to take advantage of Mantle if it ever catches on.

Personally though, if I were buying now and wanted the GPU to see out the whole console generation I'd get a custom cooled 290. It's much faster than the 280, has a more advanced feature set that's a perfect match for the new consoles, features TrueAudio and has the extra memory which will be handy for this generation. But I guess £112 extra is a lot more to put down even if it is expected to last the next 6-8 years.
 
I wouldn't buy anything until 20nm and new architectures arrive. I suspect a major shake up is coming.

The GTX7xx and R9 2xx lines are fairly recent. It should take quite some time before they're replaced..
To be honest, I wouldn't expect any new cards using 20 nm GPUs to come before Q4 2014.
 
The GTX7xx and R9 2xx lines are fairly recent. It should take quite some time before they're replaced..
To be honest, I wouldn't expect any new cards using 20 nm GPUs to come before Q4 2014.

I'm fine with Q4 2014. ;)

The NV 700 series is essentially the 600 series, and the R9 series is essentially the 7000 series. I'm also not so sure about how new Hawaii really is, considering it seems like it was originally planned for early 2013 (remember that sudden Sea Islands roadmap alteration?). But anyway, they are all old but selling for lots of money. The 290X and 780 Ti look barely 2x faster than a 6970. Not so exciting at $700!
 
Back in the day, I used the largest high-res texture mods for Skyrim I could find and it never went over ~1700MB out of my HD6950's 2GB.
That was 2 years ago, though.

I've read folks talking about crashing when they go over their card's 2GB with some of the insane mod combinations, so I'm assuming it's a real issue. And I want this card to last a few years and allow for some crazy-ass texture mods ...

As for the rest, are you sure the GTX760 has a similar performance to the HD7970 GHz, even if it clocks up to 1.2GHz?

In the reviews I've seen, yeah, the superclocked 760s come very close to the stock 280Xs, although the superclocked 280Xs pull away again. Could just be the game and resolution combinations I've seen though.

But if you're concerned with longevity, I think getting an architecture that is essentially the same as the consoles' is definitely the way to go.
Besides, nVidia is set to move their architecture to Maxwell and embedded ARM CPUs during the next two years, so I don't think the current Keplers will stay in their driver development spotlight for much longer.

Good point about Kepler! Both in the PC and console spaces GCN seems to have a longer active future.

Personally though, if I were buying now and wanted the GPU to see out the whole console generation I'd get a custom cooled 290. It's much faster than the 280, has a more advanced feature set that's a perfect match for the new consoles, features TrueAudio and has the extra memory which will be handy for this generation. But I guess £112 extra is a lot more to put down even if it is expected to last the next 6-8 years.

The price is certainly a stretch right at the moment. And the other thing is ... the custom-cooled 290s are both rare and expensive. In fact, both the 280 series and 290 series are in short supply. Seriously limited numbers in the online shops and the prices of the available models are high. And that seems to be keeping NV 770 and 780 prices high ...
 
Sooo ....

Out of the currently available options it looks like GCN is the way to go. Only problem is that the 280X and 290 are in short supply and price-hiked.

Ooof. I wonder if I should hedge my bets and go with a 270X 4GB (seen one for £165) and just accept that I'll want to replace it in two or three years ...
 
What games are you playing and looking forward to? Did you ever mention what card you have now?

I would most certainly not pay the currently exorbitant prices for a 280 or 290. Plus you should consider that NV has the market share and that will affect what features get used. I'm not entirely convinced that AMD being in all the consoles is as significant as it's being pushed to be, because new architectures will change the landscape. Buy for today, not so much the future, because the future is not predictable IMO. Certainly be very skeptical about promises and advertising.
 
Ooof. I wonder if I should hedge my bets and go with a 270X 4GB (seen one for £165) and just accept that I'll want to replace it in two or three years ...

I'd probably skip the 270X just to guarantee a more stable 1080p60 (or higher + other settings etc) considering PS4 as a target.

Given the supply issues for 280/290X, maybe you ought to wait just a bit more... :p
 
Long term ... I just want it to last without falling off a performance cliff and being frikkin useless like my 7900 GTX did back in 2006.
Interestingly enough, those GPUs are the last of the terribly future-impacted GPUs NVidia made. An equivalent ATI card from that time would be around twice as fast on newer games. That behaviour seemed to go back a long way, too.

Since G80, NVidia has been solid.

Someone might whisper in your ear: mine Litecoin with the AMD card while you're not gaming and it'll pay for itself eventually. :runaway: This is also an option for NVidia, but can't be seriously argued as relevant.
 
What games are you playing and looking forward to? Did you ever mention what card you have now?

I'm looking forward to playing a fully modded up Skyrim. The idea alone is becoming an obsession!

I currently have a 560 Ti (non-448) overclocked thingy. Still performing well - only 1GB of RAM though. I scrubbed out on the 2GB version because the reviews at the time said it wasn't needed.

I would most certainly not pay the currently exorbitant prices for a 280 or 290.

That's seeming like pretty good advice.

I'd probably skip the 270X just to guarantee a more stable 1080p60 (or higher + other settings etc) considering PS4 as a target.

Given the supply issues for 280/290X, maybe you ought to wait just a bit more... :p

Yes, I think folks here have succeeded in pointing out the merits of waiting. Waiting it is!

It means missing out on the chance to sell my 560 Ti, but I'd rather make the right choice than get a disappointingly small upgrade that I'll possibly be stuck with for years.
 
Skyrim isn't particularly demanding but running out of video RAM is indeed a problem with those crazy mods. You could probably sell off the 560 Ti and get something with 2GB and break almost even.

My fastest cards are an unlocked 6950 2GB and a 560 Ti 1GB. ;)
 
Interestingly enough, those GPUs are the last of the terribly future-impacted GPUs NVidia made. An equivalent ATI card from that time would be around twice as fast on newer games. That behaviour seemed to go back a long way, too.

Since G80, NVidia has been solid.

My 8800 GT was incredible value. It's aged like fine wine. Although a 1GB version would have aged better ...

Someone might whisper in your ear: mine Litecoin with the AMD card while you're not gaming and it'll pay for itself eventually. :runaway: This is also an option for NVidia, but can't be seriously argued as relevant.

I hear from the interwebs that those miners are partly responsible for the 280 and 290 shortages. Can you really make back more money than it'd cost in electricity? Seems like a GPU-powered bubble might be forming ...
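(For the electricity side of that question, the arithmetic is at least easy to sketch. Every number below is a placeholder I've made up, not real mining data, so plug in your own card's power draw, your own tariff and whatever the going daily mining revenue actually is:)

[code]
#include <cstdio>

int main()
{
    // All placeholders - substitute real figures for your own setup.
    const double card_watts      = 250.0;  // rough full-load board power of the card
    const double price_per_kwh   = 0.15;   // your electricity tariff, in £ per kWh
    const double revenue_per_day = 0.0;    // £ worth of coin mined per day (look it up; it moves constantly)

    const double kwh_per_day  = card_watts * 24.0 / 1000.0;   // e.g. 250W for 24h = 6 kWh
    const double cost_per_day = kwh_per_day * price_per_kwh;  // ~£0.90/day with the numbers above

    std::printf("Electricity: ~£%.2f/day, revenue: £%.2f/day, net: £%.2f/day\n",
                cost_per_day, revenue_per_day, revenue_per_day - cost_per_day);
    return 0;
}
[/code]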
 
Skyrim isn't particularly demanding but running out of video RAM is indeed a problem with those crazy mods. You could probably sell off the 560 Ti and get something with 2GB and break almost even.

My fastest cards are a 6950 2GB and a 560 Ti 1GB. ;)

Argh! Should have gone for the 2GB 560, then I could have held out till 20nm ...
 
Interestingly enough, those GPUs are the last of the terribly future-impacted GPUs NVidia made. An equivalent ATI card from that time would be around twice as fast on newer games. That behaviour seemed to go back a long way, too.
Too bad that AMD killed driver support for R580 cards 4 years ago. Lots of D3D9 games from the past few years don't work right on them.
 