Value of Consoles vs PC *spin off*

IMO it was a much better investment at the time to go with an AMD GPU. Even the 970 isn't performing all that well relative to its specs, quite apart from the 3.5GB VRAM limit.
In May 2012 I went with a GTX 670, which was a bad investment as it didn't age well. It was quite shocking to see a 7950 still rocking on when I tested one. Remember that at the time (2012), the GTX 670 was competing with the HD 7950... now it doesn't even come close to a 7870 or even a 7850. Many call the old GCN GPUs 'fine wine'.

A 750 Ti was one of the worst choices DF could have made back then, I think. Even at launch it wasn't that great as a serious gaming GPU.
Aside from raw performance, AMD GPUs also had more VRAM than their NV counterparts: 3GB for the 7950, 6GB for the 7970.
I'm surprised reviewers don't note the poor long-term value of Nvidia GPUs. After 18 months or so they just start to fall off a cliff.
 
This guide tells you what to do: https://steamcommunity.com/sharedfiles/filedetails/?id=2291332499

It's pretty straightforward, but I've decided to store a copy of these .msi files in a safe place so I can run them again if I need to, in a future OS install, after MS have stopped distributing them. I don't know for sure that it'll work but it can't hurt!
I've been through a few of these. This may work until the next Windows update, or the next graphics driver update. It's an ongoing investment of time to keep some games running, which is why I decided to have a Windows 7 blackbox VM for a bunch of old games and old drivers. Now I don't have to wonder about OS/driver updates breaking things. :nope:
 
Huh? MSRP of the GTX 970 was $329 and the R9 285 was $249.

That's what I am saying. In the same way that the MSRP of the 3080 is $719, in this case you would compare the MSRP instead of the actual price. Should the cards actually be available without markup, with a discount, then the discounted price would be compared.




Yeah, found that. It's working now, and I was able to fire up my old save without issue, but I didn't really play for long enough to be able to comment about stability on my current setup. Running around Megaton though, seeing the stash was in my house - what a nostalgia hit! Amazing game.

Anyway, as there are a few folks in this thread who like older games, this is really a PSA for anyone else who might want to ensure they can play GFWL-infested games in future years.

The GFWL installer fails every time for me. It seems to download the required .msi files, but it can't run them, and MS won't fix it even though they still distribute the installer.

This guide tells you what to do: https://steamcommunity.com/sharedfiles/filedetails/?id=2291332499

It's pretty straightforward, but I've decided to store a copy of these .msi files in a safe place so I can run them again if I need to, in a future OS install, after MS have stopped distributing them. I don't know for sure that it'll work but it can't hurt!
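For anyone who wants to script the backup step, here's a minimal sketch. The cache path and .msi file names below are placeholders for illustration, not the installer's actual locations; substitute wherever the GFWL installer drops its files on your machine.

```shell
#!/bin/sh
set -e
# Demo of the "store the .msi files in a safe place" idea above.
# NOTE: the cache path and file names are made up for illustration;
# point SRC at wherever the GFWL installer actually downloads to.
SRC=gfwl_download_cache
DEST=gfwl_msi_backup

# Stand-in for the installer's download cache:
mkdir -p "$SRC"
: > "$SRC/xliveredist.msi"
: > "$SRC/wllogin_64.msi"

# The actual backup step: copy every .msi into the safe place.
mkdir -p "$DEST"
find "$SRC" -type f -name '*.msi' -exec cp {} "$DEST"/ \;

# Show what got backed up.
ls "$DEST"
```

Stashing the backup folder somewhere off the OS drive (or on a second disk) means a future Windows reinstall won't take the copies with it.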

Okay, no more off topic, I'll shut up about GFWL now.

If you keep the same Windows install beyond 6-9 months then you are asking for trouble. Just learn to partition your drive and wipe the Windows partition from time to time.
Otherwise, deal with slower boot and shutdown times, stuff not working, programs lagging, your GFWL issues, and so on.
 
If you keep the same Windows install beyond 6-9 months then you are asking for trouble. Just learn to partition your drive and wipe the Windows partition from time to time.
Otherwise, deal with slower boot and shutdown times, stuff not working, programs lagging, your GFWL issues, and so on.

I've run the same install of Windows 10 on a number of machines for years without significant issue, right back to an old C2D and an old A64 machine. I've even taken an HDD out of one machine and put it in an entirely different machine (Intel -> AMD) after several years and it worked fine.

The GFWL issue was because I no longer had it on my machine. I used the "refresh" option when I was drunk, to see what it did. Answer: exactly what it's supposed to, and I had to reinstall my apps (but not Windows). It's many years since I've had to reinstall Windows on a machine - one of mine or one of the ones I've volunteered to help with.

I don't know what you're doing wrong with your PCs to need to wipe your Win 10 machines every 6-9 months. Are you regularly downloading malware?
 
That was published 2 years in. I think I remember this happening far earlier in the generation. Though it has been a while, and I can't say it's not possible I'm mixing up forum zealots and DF claims.

This is the original article; they just re-posted it 6 months after it was originally published because, in their words, "Six months on, the Core i3/GTX 750 Ti combo continues to hand in a worthy PC gaming experience for not much money." I'm sure some forum zealots did indeed claim the system would last the whole console generation, but that was obviously an ill-informed assumption, and a heck of a lot to expect from a $149 GPU!
 
A 750 Ti was one of the worst choices DF could have made back then, I think. Even at launch it wasn't that great as a serious gaming GPU.
Aside from raw performance, AMD GPUs also had more VRAM than their NV counterparts: 3GB for the 7950, 6GB for the 7970.

Yes, the irony is that they considered using an R9 285 instead, which was only a little more expensive and probably could have outperformed, or at least kept up with, the base consoles for the whole generation.

I'm surprised reviewers don't note the poor long-term value of Nvidia GPUs. After 18 months or so they just start to fall off a cliff.

18 months is a bit of an exaggeration. At that timescale it would still be Nvidia's latest architecture and well within both driver and developer support. It seems to me that developers and Nvidia offer good support for at least n-1 architectures, which would give a typical architecture 4 years of well-supported life. Pascal, for example, is still more than capable in any new game now, a little over four years from its launch. But I do expect it to start falling behind now that Ampere has launched and it's likely receiving less support from Nvidia.

That's what I am saying. In the same way that the MSRP of the 3080 is $719, in this case you would compare the MSRP instead of the actual price. Should the cards actually be available without markup, with a discount, then the discounted price would be compared.

Why wouldn't you compare MSRPs? It seems the most logical and fair way of doing things. Sure, you might pay more if you buy as soon as the product is launched, but that applies to both consoles and PC GPUs.

I've run the same install of Windows 10 on a number of machines for years without significant issue, right back to an old C2D and an old A64 machine. I've even taken an HDD out of one machine and put it in an entirely different machine (Intel -> AMD) after several years and it worked fine.

Same here, I couldn't even say how old my Win10 build is at present. It probably goes all the way back to when I bought my first SSD years back, and I have no problems with it whatsoever. The PC is no longer in the days of Windows XP, with frequent manual driver updates and patch downloads, manual defragmentation, regular rebuilds, etc. Most of that just happens automatically in the background these days.
 
Yes, the irony is that they considered using an R9 285 instead, which was only a little more expensive and probably could have outperformed, or at least kept up with, the base consoles for the whole generation.



18 months is a bit of an exaggeration. At that timescale it would still be Nvidia's latest architecture and well within both driver and developer support. It seems to me that developers and Nvidia offer good support for at least n-1 architectures, which would give a typical architecture 4 years of well-supported life. Pascal, for example, is still more than capable in any new game now, a little over four years from its launch. But I do expect it to start falling behind now that Ampere has launched and it's likely receiving less support from Nvidia.



Why wouldn't you compare MSRPs? It seems the most logical and fair way of doing things. Sure, you might pay more if you buy as soon as the product is launched, but that applies to both consoles and PC GPUs.



Same here, I couldn't even say how old my Win10 build is at present. It probably goes all the way back to when I bought my first SSD years back, and I have no problems with it whatsoever. The PC is no longer in the days of Windows XP, with frequent manual driver updates and patch downloads, manual defragmentation, regular rebuilds, etc. Most of that just happens automatically in the background these days.
18 months has been pretty close to the time between new GPU architectures this past console gen. My 1080 Ti started falling off back in 2018, almost exactly 18 months after release. More and more games started underperforming on it. I'm not saying it just turns into a dud, but 18 months seems to be roughly when you start to notice big titles underperforming on a regular basis.
 
The biggest value in console gaming still is in the way it all just works.

I have so many issues with PCs, to begin with there’s so many stores and so many issues when trying to get game running...and don’t start me on PCVR vs PSVR!

PC has made great strides but the immediacy of consoles just makes it worth the negatives when you have a busy lifestyle or kids.
 
Yes the irony is that they considered using an R9 285 instead which was only a little more expensive and probably could have outperformed or at least kept up with the base consoles for the whole generation.

Absolutely.

18 months has been pretty close to the time between new GPU architectures this past console gen. My 1080 Ti started falling off back in 2018, almost exactly 18 months after release. More and more games started underperforming on it. I'm not saying it just turns into a dud, but 18 months seems to be roughly when you start to notice big titles underperforming on a regular basis.

Yes, NV's products didn't age so well, but with Turing and beyond they changed strategy (compute). I don't think Ampere is going to age badly like Kepler did, for instance. But we will see.

The biggest value in console gaming still is in the way it all just works.

I have so many issues with PCs, to begin with there’s so many stores and so many issues when trying to get game running...and don’t start me on PCVR vs PSVR!

PC has made great strides but the immediacy of consoles just makes it worth the negatives when you have a busy lifestyle or kids.

True, those are still the strong points of owning a console: it's cheap(er) and it's just plug & play. Still, I think consoles and PCs have grown closer and closer as generations have passed. With internet, full-blown OSs (as opposed to a BIOS) and HDDs coming to consoles, they also introduced things like day-one patches, unstable OSs, crashes and security breaches, which were previously inherent only to PCs.
Comparing, say, W98 or XP to W10 with Steam... things have gotten much 'easier' on the PC; basically, with stores like Steam it's just click and play. Still not as easy as a console, but both have inherited advantages and disadvantages from each other.

Obviously there's a market for both: PC gaming is bigger than ever, and so is console gaming. Many have both; they've kind of co-existed since... forever. And they probably will until streaming kills both.
 
18 months has been pretty close to the time between new GPU architectures this past console gen. My 1080 Ti started falling off back in 2018, almost exactly 18 months after release. More and more games started underperforming on it. I'm not saying it just turns into a dud, but 18 months seems to be roughly when you start to notice big titles underperforming on a regular basis.

Do you have some specific examples of this? There will of course be corner cases where Turing fares better than we'd usually expect it to vs Pascal, and of course as Turing's more advanced features come into play we should expect to see Pascal fall heavily back. But in general it seems to me that Pascal still performs about where we'd expect it to vs Turing in the latest games. Both of the following reviews, which cover a large swathe of the very latest games, show the 1080Ti performing just a little behind the 2080 on average, which is pretty much exactly where we'd expect it to be. Ditto the 1080 and Vega 64.

Techspot even includes the 980Ti, which is over half a decade old now and yet still performs just behind the GTX 1070 - where it was when it first launched. The gap has perhaps opened up by a few percentage points, as you'd hope as new features become used and developers start to optimise specifically for a new architecture, but the general takeaway here for me is that Nvidia's GPUs haven't dropped off a performance cliff all the way back to Maxwell. If anything, Kepler was an exception rather than an indication of a rule. And the reason behind Kepler's performance bomb is clear and was even predicted way back when it launched, i.e. it deliberately and specifically cut back on GPGPU capabilities compared to its Fermi predecessor in order to create a more "gaming focused architecture". And in the short term it worked great for them, since they were able to beat AMD's technically superior, larger, and more power hungry architecture until Nvidia's new architecture (Maxwell) released. But then, thanks to console development, games gradually started to take advantage of GCN's superior feature set and compute capability, and Kepler went bye bye.

The same story really can't be told about Turing and Ampere as far as my limited understanding can see. Both are very forward-looking architectures, arguably more so than RDNA2 (and much more so than RDNA1).

EDIT: forgot to include links:

https://www.techpowerup.com/review/amd-radeon-rx-6900-xt/35.html
https://www.techspot.com/review/2160-amd-radeon-6900-xt/

and don’t start me on PCVR vs PSVR!

This surprises me, what PC VR do you use? Oculus is basically a self contained ecosystem that essentially behaves as if you have a console plugged into your PC. Granted you can do lots of tinkering with it outside the ecosystem but if you stick within the ecosystem it's a console like experience as far as I can see.
 
Do you have some specific examples of this? There will of course be corner cases where Turing fares better than we'd usually expect it to vs Pascal, and of course as Turing's more advanced features come into play we should expect to see Pascal fall heavily back. But in general it seems to me that Pascal still performs about where we'd expect it to vs Turing in the latest games. Both of the following reviews, which cover a large swathe of the very latest games, show the 1080Ti performing just a little behind the 2080 on average, which is pretty much exactly where we'd expect it to be. Ditto the 1080 and Vega 64.

Techspot even includes the 980Ti, which is over half a decade old now and yet still performs just behind the GTX 1070 - where it was when it first launched. The gap has perhaps opened up by a few percentage points, as you'd hope as new features become used and developers start to optimise specifically for a new architecture, but the general takeaway here for me is that Nvidia's GPUs haven't dropped off a performance cliff all the way back to Maxwell. If anything, Kepler was an exception rather than an indication of a rule. And the reason behind Kepler's performance bomb is clear and was even predicted way back when it launched, i.e. it deliberately and specifically cut back on GPGPU capabilities compared to its Fermi predecessor in order to create a more "gaming focused architecture". And in the short term it worked great for them, since they were able to beat AMD's technically superior, larger, and more power hungry architecture until Nvidia's new architecture (Maxwell) released. But then, thanks to console development, games gradually started to take advantage of GCN's superior feature set and compute capability, and Kepler went bye bye.

The same story really can't be told about Turing and Ampere as far as my limited understanding can see. Both are very forward-looking architectures, arguably more so than RDNA2 (and much more so than RDNA1).



This surprises me, what PC VR do you use? Oculus is basically a self contained ecosystem that essentially behaves as if you have a console plugged into your PC. Granted you can do lots of tinkering with it outside the ecosystem but if you stick within the ecosystem it's a console like experience as far as I can see.
Forza Horizon 4
Battlefield V
RDR 2
Star Wars Squadrons
Godfall
Dirt 5
Doom and Doom Eternal
Wolfenstein 2 and YoungBlood
The last several COD games
Immortals Fenyx Rising
Control
Hyperscape
World War Z
AC Valhalla
 
Forza Horizon 4
Battlefield V
RDR 2
Star Wars Squadrons
Godfall
Dirt 5
Doom and Doom Eternal
Wolfenstein 2 and YoungBlood
The last several COD games
Immortals Fenyx Rising
Control
Hyperscape
World War Z
AC Valhalla

I meant some specific examples of benchmarks showing Pascal performing significantly worse vs Turing than we would expect it to based on Turing launch time reviews. i.e. evidence that Pascal performance has dropped off a cliff.

Most of the games above are included in the two reviews I linked above and thus already accounted for in my conclusion that Pascal in fact hasn't dropped all that much performance at all since Turing launched, using the 1080Ti and the 2080 as examples.
 
I think in the PC GPU space it's much more evident that older hardware 'underperforms', due to something new coming every year or so.
We should not forget that, bar AAA Sony exclusives, most multiplat games actually don't run all that well on the 2013 base consoles either - not just in frame rate terms, but also with reduced settings and inhumane loading times. All hardware ages, but it's much more in your face in the PC space, where new and more powerful hardware arrives quickly and where things like Ultra settings exist.

This is not to say NV made the wrong decisions with Kepler, though; on the other hand, they seem to have made the right ones this time around. Ampere is more future-proofed than anything else.
I think one member here, Joey, commented rather a lot on this compute ability that GCN and modern GPUs have, and how much devs like it.
Also to note: I wish I had gone with a 7970 GHz Edition (6GB) instead of a GTX 670 back in 2012. As some said before here, one could have seen it coming, as there were already articles on NV botching GPGPU. That's not to say I feel cheated or something; I used that GTX 670 from 2012 to 2016 without feeling left behind all that much.
In those four years, I mostly had better performance than the base PS4 (BF4, the first Doom, etc.). A 7970 would have lasted all the way to today though, outperforming it - a 2012 product.
 
“I used that GTX 670 from 2012 to 2016 without feeling left behind all that much.
In those four years, I mostly had better performance than the base PS4 (BF4, the first Doom, etc.)”

Outside of some early badly optimized ports, is this really true? Or were you running textures below medium quality, for example? Just curious.
 
“I used that GTX 670 from 2012 to 2016 without feeling left behind all that much.
In those four years, I mostly had better performance than the base PS4 (BF4, the first Doom, etc.)”

Outside of some early badly optimized ports, is this really true? Or were you running textures below medium quality, for example? Just curious.

It's absolutely true. In that time frame Kepler was still performing strongly. I also had a GTX 670 across those years and when I upgraded to my current Pascal, I did so primarily because of a hardware issue with the 670 that was resulting in hardware reboots, not because the GPU itself lacked performance. Naturally it was a good way below the highest settings in many PC games but I was obsessive then about DF face offs as I am now and I don't once recall seeing one where the 670 performed worse than the base consoles up to that point. I will of course stand corrected if you locate one though - it was a long time ago!
 
The biggest value in console gaming still is in the way it all just works. I have so many issues with PCs, to begin with there’s so many stores and so many issues when trying to get game running...and don’t start me on PCVR vs PSVR!
Yes, as somebody who uses my PC as occasionally as my PlayStation, when I do want to use the PC it's the inevitable gauntlet of Windows security updates, Nvidia driver updates, Steam / EGS / uPlay / Origin updates and then game updates. My internet is slow arse so it's a bugger. Since 2007 and PS3 I've just become used to leaving the console in its low powered mode and the system and games updating silently when the console is "off" (*not technically off). The most you have to do is do a system install and reboot.

My 32GB i9 + 3080 destroys my PS5 technically, but the console experience is unbeatable.
 
Yes, as somebody who uses my PC as occasionally as my PlayStation, when I do want to use the PC it's the inevitable gauntlet of Windows security updates, Nvidia driver updates, Steam / EGS / uPlay / Origin updates and then game updates. My internet is slow arse so it's a bugger. Since 2007 and PS3 I've just become used to leaving the console in its low powered mode and the system and games updating silently when the console is "off" (*not technically off). The most you have to do is do a system install and reboot.

I'm not disputing your overall point that consoles are a slicker experience - they clearly are. But it's also worth highlighting how far PC's have come in that regard since the days when this kind of argument pretty much formed the backbone of any PC/console comparison.

Today, Windows updates happen silently in the background if you set them to, and simply prompt you to restart the system as required, which in my experience is probably around once a fortnight or less (yes, the major updates will require more time and a couple of reboots, but they're more like once or twice a year and are still automated).

Driver updates are actually more frequent if anything but again that happens fairly transparently through GeForce Experience which simply pops up a notification in the system tray when a new driver is available, which you then click and press install. Wait a few minutes and it's done. You don't even have to reboot most of the time. And that's assuming you choose to update every driver release in the first place which unless you're playing one of the new releases it's targeted at is usually not necessary (even then it's not necessary, but it is advisable for the best performance).

Steam and other game platforms download and install patches silently in the background similar to consoles - not as slick though as the PC has to at least be on, although mine is for other reasons most of the time anyway so that kind of thing is largely invisible to me. And then the game platform updates themselves, again, it's all automatic. When a steam platform update is available it simply tells you, runs the update, restarts itself and away you go. You have to wait for it to complete of course but we're usually talking seconds rather than minutes.

So a console experience is indeed slicker, but I don't really think the PC experience is a big concern anymore. Don't get me wrong though, I'd still like it to be more console like and there's definitely room for improvement.
 
It's absolutely true. In that time frame Kepler was still performing strongly. I also had a GTX 670 across those years and when I upgraded to my current Pascal, I did so primarily because of a hardware issue with the 670 that was resulting in hardware reboots, not because the GPU itself lacked performance. Naturally it was a good way below the highest settings in many PC games but I was obsessive then about DF face offs as I am now and I don't once recall seeing one where the 670 performed worse than the base consoles up to that point. I will of course stand corrected if you locate one though - it was a long time ago!

a 770 will not like console level textures in GTA V, as an example
 
a 770 will not like console level textures in GTA V, as an example

Digital Foundry disagrees:

Digital Foundry said:
With most console settings deduced from the PC menus, one question remains: just how does a budget PC cope with the exact same visual setup? Having seen our Core i3 4130 PC with a GTX 750 Ti hold close to 1080p60 using high presets across the board, these console-grade settings pose a huge challenge. Once we engage ultra post effects, that average drops to 50fps, and down again to 45fps after texture quality is placed at very high. With foliage bumped to very high too, and distance scaling and population sliders pushed to 100 per cent, frame-rates are clearly a lesser priority.

The resulting frame-rate range is between 30-50fps on this PC, where the biggest dips occur during our alpha-heavy shoot-out in the car park. The RAM overhead exceeds the card's 2GB limit here, taking it up 2.2GB overall, but overall the performance profile here gives us options. It glances 30fps at the very worst points, meaning a half-refresh cap (via the game's v-sync toggle) is perfectly suited for this setting list. Given the huge performance nose-dive incurred by pushing post effects up to ultra, the 30fps frame-rate cap on PS4 and Xbox One starts to make sense here - we're nowhere near 60fps at any stage.

Meanwhile, even a 1GB card like the R7 260X holds strong, only falling a little shy of 30fps during the same shoot-out sequence. Switching off the ultra presets smooths out performance for a capped experience, despite flying in the face of the game's suggested RAM limits.

https://www.eurogamer.net/articles/digitalfoundry-2015-grand-theft-auto-5-pc-face-off

And I can corroborate their results given I played and completed the whole game at comfortably higher than console settings on my then GTX 670.
 
They specifically showed the POM, which was tied to ultra settings, completely tanking the frame rate due to the GPU not having enough RAM.
 