Value of Consoles vs PC *spin off*

Anyone looking to buy a GPU and keep it for the long haul would be much better served going with AMD. Nvidia GPUs have poor long-term value, IMO.

What would the similarly priced early 2014 AMD alternative have been?
Just wondering; I do believe the AMD architecture was more “future proof” with regard to async compute.
 
The R7 265, which was a rebranded and overclocked 7850. It was a direct, price-matched competitor. Don't get me wrong, it doesn't come remotely close to keeping up with a base PS4 and hasn't for a long time, but it's much better than anything comparable at the time from Nvidia. It actually performs much closer to a GTX 770 than to a GTX 750 Ti in most of the newer AAA multiplatform titles.
 
Btw, my 'Pro' upgrade was only 150 euros; I sold my launch PS4 for 250 euros. That was a pretty good upgrade for me.

Indeed, reselling components on any platform can significantly reduce your overall expenditure.

A friend actually built the early 2014 'Digital Foundry PC', with the Core i3 and the 750 Ti. I believe he pirated Windows and still paid upwards of 500-600 euros. Needless to say, he got f*****, because later in the generation the 'Digital Foundry PC' became more and more shitty, with the biggest joke being trying to run Red Dead Redemption 2 on it last year. The 2016 and 2018 Tomb Raider games were also embarrassing to watch.

In early 2014 the promise was that this 'Digital Foundry PC' would run the entire console generation with better settings, for only a 50% price increase over the console.

At least now nobody is trying to make that argument anymore.

The beauty of a PC is its modularity and flexibility. Realistically, a 750 Ti was never going to last an entire console generation. Not because it isn't technically more capable than what is in the consoles (which in many ways it isn't), but because it's entirely unreasonable to expect either Nvidia/AMD or games developers to support what is effectively an obsolete GPU architecture that no one is using, 7 years after its launch.

When the 750 Ti launched it was generally faster than the PS4, and the point of it was to give PC gamers who already owned decent CPUs a way to upgrade to console-matching or beating performance for an extremely low price ($149). A wise buyer at that point would have bought one with the intention of keeping it no more than a year or two and then upgrading to something newer and faster once available, such as a GTX 1060 or even a 1070. In that way they could have maintained console-matching or beating performance for the whole generation - including the mid-gen models - for as little as $520 (for a 1070) - far cheaper than the cost of a PS4, PS4 Pro and 7 years of online subscriptions. And that 1070 will even last a year or so into this new generation without sacrificing too much, until an economical replacement is available.
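
To put rough numbers on that claim (a back-of-the-envelope sketch only: the GPU figures below are approximate launch prices, the $60/year online fee is an assumption, and the rest of the PC is presumed already owned, as in the scenario above):

Code:
# Back-of-the-envelope cost sketch for the upgrade path described above.
# Prices are approximate launch prices, not a definitive accounting.
gpu_path = {
    "GTX 750 Ti (2014)": 149,
    "GTX 1070 (2016)": 379,  # ignores any resale value of the 750 Ti
}
console_path = {
    "PS4 (2013)": 399,
    "PS4 Pro (2016)": 399,
    "7 years of online subscription @ $60/yr": 7 * 60,
}
print("GPU upgrade path:", sum(gpu_path.values()))     # ~$528
print("Console path:    ", sum(console_path.values())) # $1218

Even before factoring in resale of the old card, the totals land roughly where the figures above suggest.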

Of course an upgrade to keep up with this gen would require a full platform and SSD upgrade too, so realistically you'd be looking to spend in the order of $1k. But if you were to do that 1 year from now, the likelihood is that you'll get something that could again deliver console-matching or beating performance for the rest of the generation.

Savvy PC upgrading over time doesn't have to be significantly more expensive than console gaming. But unfortunately, when these kinds of (IMO silly) comparisons come up, people always tend to focus on the absolute worst-case scenario from the PC's perspective: buying a brand new system from scratch on day one of the consoles' launch.
 
That wasn't the message that was repeatedly hammered home, though. It was that the 750 Ti, alongside the rest of the components to form a $600 or so PC, would match or beat a PS4 for the entire generation, and DF planned to show this throughout. This of course was parroted nonstop on gaming forums as an example of PC budget value. The second similar message was how a 970 could play multiplatform games at console or better settings at 1080p/60 throughout the generation. As this GPU began to falter it changed to a 1060 6GB. That GPU began to falter and these types of comparisons mostly died off.
 
Savvy PC upgrading over time doesn't have to be significantly more expensive than console gaming. But unfortunately, when these kinds of (IMO silly) comparisons come up, people always tend to focus on the absolute worst-case scenario from the PC's perspective: buying a brand new system from scratch on day one of the consoles' launch.
Yeah, I've always tended to micro-upgrade my PCs and occasionally do an overhaul. I currently have a 1060 and will upgrade to maybe a 3060 Ti when I can get one for ~£200-250 (taking into consideration selling the 1060).

I do think including PS+ is a bit of misdirection though, as it is just for online play and you also get games with it, so whilst there is a cost, it often pays for itself.

All formats have their pros and cons and resale values (etc.) - this is why I think USPs are so important... and why I have to own all formats! Lol
 
That wasn't the message that was repeatedly hammered home, though. It was that the 750 Ti, alongside the rest of the components to form a $600 or so PC, would match or beat a PS4 for the entire generation, and DF planned to show this throughout.

When did DF say that 750 Ti would outperform PS4 for the entire generation, and that they planned to show it across the generation?
 
Good luck trying to get old Bethesda games to work. Oblivion, Fallout 3 and New Vegas are crash fests - if you can even get them running on Windows 10 at all. They work fine on Xbox!

Polar opposite to my experiences thankfully! I played Fallout 3 plus expansions through on Win 10 a couple of years back - more than a hundred hours. Absolutely loved it! Struck me as being one of the least buggy Bethesda games I'd ever played funnily enough, and all the issues I can recall were in game glitches rather than crashing out of the game. Put about 15 hours into New Vegas before I gave up (preferred FO 3), and don't remember it crashing either (though admittedly on PC you can kind of filter that out if it's only very occasional).

Haven't tried Oblivion since Win 7 though; I'll dig out the disc and give it a go when I can, as I never played The Shivering Isles. Morrowind was a crashing hell no matter what I ran it on (hardware, OS, drivers, prayers, ritual sacrifices). Oh lordy did that damn game like bouncing me out to the desktop.

In general Win 10 has been absolutely bang on for the older games I've played, and I've done a fair few 360-era games on it. Maybe it helps that I'm running an archaic Nvidia GPU from the time of Moses.
 
That wasn't the message that was repeatedly hammered home, though. It was that the 750 Ti, alongside the rest of the components to form a $600 or so PC, would match or beat a PS4 for the entire generation, and DF planned to show this throughout. This of course was parroted nonstop on gaming forums as an example of PC budget value.

The original DF article didn't mention this build lasting the whole console generation. In fact the exact opposite is true. They call out the fact that it's unlikely to last the whole generation here:

https://www.eurogamer.net/articles/digitalfoundry-2015-budget-gaming-pc-guide

Digital Foundry said:
Of course, as developers get to grips with the consoles, we may find that the PC we've created falls behind (already there are issues with texture quality in a couple of games - the amount of VRAM on the GPU will probably become much more important this year), but the beauty of the platform is its upgradability - RAM, CPU and GPU can all be replaced with far more capable parts.

The second similar message was how a 970 could play multiplatform games at console or better settings at 1080p/60 throughout the generation. As this GPU began to falter it changed to a 1060 6GB. That GPU began to falter and these types of comparisons mostly died off.

I think there's an element of filtering out the past arguments that best suit the current narrative here. The 970 is still more capable than the base consoles today. And while it was far more capable when it launched, and that lead has been significantly cut into, if you'd bought it back when it launched only a year after the PS4, you'd have got a generation of console-beating performance out of it. As I said above, buying at the same time that a new generation of consoles launches is never the best idea from a cost perspective. It may well be possible to see out the whole generation on a top-end card. But if you're really that concerned about cost (not everyone is), then better to wait a year or so and buy a more modest one, or buy a lower-end card at the start of the gen and then something much better 2-3 years later.
 
That wasn't the message that was repeatedly hammered home, though. It was that the 750 Ti, alongside the rest of the components to form a $600 or so PC, would match or beat a PS4 for the entire generation, and DF planned to show this throughout. This of course was parroted nonstop on gaming forums as an example of PC budget value. The second similar message was how a 970 could play multiplatform games at console or better settings at 1080p/60 throughout the generation. As this GPU began to falter it changed to a 1060 6GB. That GPU began to falter and these types of comparisons mostly died off.

If you continue like that, a PS4 outperforms a 2080 Ti and above. The 750 Ti, or basically all NV GPUs, just didn't age well due to their disadvantage in compute compared to AMD's hardware back then.
An early 2012 AMD 7950 outperforms the base consoles. Even people left with 7870 GPUs can still play the latest games, on an 8-year-old GPU.

Now, NV products will surely age better than Kepler etc. did, seeing what architecture Ampere is now, with its ray tracing and DLSS capabilities.
Don't forget optimizations have improved a lot too, even on PC.

DF should have used an AMD GPU back then; those aged much, much better due to compute. Anyway, it's more 'fair' to compare late 2013 products to begin with, instead of low/mid-end 2012 hardware. The R9 280X/290X launched at the same time as the consoles, for example.
Also, a 750 Ti was never considered a great GPU, not even eight years ago.

PCs have never aged as well as they did this generation, and that trend only seems to be improving.
 
Polar opposite to my experiences thankfully! I played Fallout 3 plus expansions through on Win 10 a couple of years back - more than a hundred hours. Absolutely loved it! Struck me as being one of the least buggy Bethesda games I'd ever played funnily enough, and all the issues I can recall were in game glitches rather than crashing out of the game.

That was probably around my last playthrough as well. Between then and early this year I have not changed my PC hardware at all; there have only been Windows 10 updates, and I cannot run Fallout 3 outside a VM. This is why Google has millions of results for "fallout 3 windows 10". :yep2:
 
This of course was parroted nonstop on gaming forums as an example of PC budget value. The second similar message was how a 970 could play multiplatform games at console or better settings at 1080p/60 throughout the generation. As this GPU began to falter it changed to a 1060 6GB. That GPU began to falter and these types of comparisons mostly died off.

“but that’s the beauty of pc gaming”

also at that point the discussion usually becomes how useful a pc is and how you can use it to do anything!
 
Also for more than an entire year,
PC ran RDR2 at exactly 0 frames per second.

but in general for the past and current generation: a card which comes out a year after a console, and costs more than said entire console, will outperform the console.
No surprises there
 
Also for more than an entire year,
PC ran RDR2 at exactly 0 frames per second.

Exclusives and timed exclusives impact every platform. And some would argue that waiting a year to play the best version of a game is worth it. It's not as if there was absolutely nothing else to play during that year. Perhaps said PC gamer may have chosen to play Forza Horizon 4 instead?

but in general for the past and current generation: a card which comes out a year after a console, and costs more than said entire console, will outperform the console.

Or less. As was the case last generation with both the GTX 970 and the R9 285.
 
That was probably around my last playthrough as well. Between then and early this year I have not changed my PC hardware at all; there have only been Windows 10 updates, and I cannot run Fallout 3 outside a VM. This is why Google has millions of results for "fallout 3 windows 10". :yep2:

I went back to see exactly when I played it - four years ago. Egads, time flies! Just downloaded it to see how it fares for me now, aaaaand I can't get GFWL to work so it won't run.

GFWL really was a piece of shit, and one that no-one wants to clean up.
 
I went back to see exactly when I played it - four years ago. Egads, time flies! Just downloaded it to see how it fares for me now, aaaaand I can't get GFWL to work so it won't run.
There are ways around GFWL but even skirting those, the game has so many CTD issues.
 
There are ways around GFWL but even skirting those, the game has so many CTD issues.

Yeah, found that. It's working now, and I was able to fire up my old save without issue, but I didn't really play for long enough to be able to comment about stability on my current setup. Running around Megaton though, seeing the stash was in my house - what a nostalgia hit! Amazing game.

Anyway, as there are a few folks in this thread who like older games, this is really a PSA for anyone else who might want to ensure they can play GFWL-infested games in future years.

The GFWL installer fails every time for me. It seems to download the required .msi files, but it can't run them, and MS won't fix it even though they still distribute the installer.

This guide tells you what to do: https://steamcommunity.com/sharedfiles/filedetails/?id=2291332499

It's pretty straightforward, but I've decided to store a copy of these .msi files in a safe place so I can run them again if I need to, in a future OS install, after MS have stopped distributing them. I don't know for sure it'll work but it can't hurt!
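
If anyone wants to script that backup step, here's a minimal Python sketch of the idea. The folder paths are placeholders you'd need to point at wherever the installer actually drops its .msi files on your system, and msiexec /i is just the standard Windows Installer switch for running a package:

Code:
# Minimal sketch: back up the GFWL .msi installers and re-run them later.
# SOURCE and BACKUP are placeholder paths - adjust for your own system.
import shutil
import subprocess
from pathlib import Path

SOURCE = Path(r"C:\path\to\downloaded\gfwl_msis")  # placeholder
BACKUP = Path(r"D:\backups\gfwl_msis")             # placeholder

def backup_msis():
    # Copy every .msi from the source folder into the backup folder.
    BACKUP.mkdir(parents=True, exist_ok=True)
    for msi in SOURCE.glob("*.msi"):
        shutil.copy2(msi, BACKUP / msi.name)

def install_msis():
    # msiexec /i <package> runs a Windows Installer package.
    for msi in sorted(BACKUP.glob("*.msi")):
        subprocess.run(["msiexec", "/i", str(msi)], check=True)

if __name__ == "__main__":
    backup_msis()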

Okay, no more off topic, I'll shut up about GFWL now.
 
The original DF article didn't mention this build lasting the whole console generation. In fact the exact opposite is true. They call out the fact that it's unlikely to last the whole generation here:

https://www.eurogamer.net/articles/digitalfoundry-2015-budget-gaming-pc-guide

I think there's an element of filtering out the past arguments that best suit the current narrative here. The 970 is still more capable than the base consoles today. And while it was far more capable when it launched, and that lead has been significantly cut into, if you'd bought it back when it launched only a year after the PS4, you'd have got a generation of console-beating performance out of it. As I said above, buying at the same time that a new generation of consoles launches is never the best idea from a cost perspective. It may well be possible to see out the whole generation on a top-end card. But if you're really that concerned about cost (not everyone is), then better to wait a year or so and buy a more modest one, or buy a lower-end card at the start of the gen and then something much better 2-3 years later.
That was published two years in. I think I remember this happening far earlier in the generation, though it has been a while and I can't say it's not possible I'm mixing up forum zealots and DF claims. The 970 did of course outperform the base consoles the entire generation; however, it cost almost as much as an entire console, and the original promise was 1080p/60 at console or better settings, which absolutely did not happen.

When did DF say that 750 Ti would outperform PS4 for the entire generation, and that they planned to show it across the generation?

The Eurogamer search algorithm is quite poor unfortunately. Also see my reply to PJBLiverpool. I do think I remember this being stated in one of the various videos/articles in the early part of the generation.
 
IMO it was a much better investment at the time to go with an AMD GPU. Even the 970 isn't performing all that well relative to its specs, aside from the 3.5 GB VRAM limit.
In May 2012 I went with a GTX 670, which was a bad investment as it didn't age well. It was quite shocking to see a 7950 still rocking on when I tested one. Remember that at the time (2012), the GTX 670 was competing with the HD 7950... Now it doesn't even come close to a 7870 or even a 7850. Many call the old GCN GPUs 'fine wine'.

A 750 Ti was one of the worst choices back then for DF to go with, I think. Even at launch it wasn't that great of a serious gaming GPU.
Aside from raw performance, AMD GPUs also had more VRAM allocated to them than their NV counterparts: 3 GB for the 7950, 6 for the 7970.
 