Viability of old CPUs and value of upgrading *spawn

Hey, you're preaching to the choir. I enjoyed playing through Diablo II on that nForce1 / Athlon XP / Voodoo3 / Win2k setup earlier this year.

It's just that this place is mostly about chasing the rush of the latest and greatest, and they will tell you how your old stuff is clunky and wasting electricity. Because perceptions.

To be honest I'm guilty of that... go back 10 years and I HAD to have the latest and greatest, whereas now I'm content with what I have.

Heck, 10 years ago the thought of not being able to crank all the settings to Ultra made me sick; now I'm happy to turn them down.

Older hardware is just sexy as fuck too; modern hardware looks like the same old copy-and-paste crap, and don't even get me started on the "I must light my PC up like a rainbow" trend that's around now.
 

In fairness to this place, they were chasing the XP and the Voodoo 3 when they were out, and plenty before. No one here is against old hardware that I know of; I swear by living on the rusty edge of technology. I don't like upgrading to the newest tech; it's a much better investment to buy some good gear from the last generation that will serve me well for a while.
 
Yeah sure once upon a time it was hot new hardware. Avenger, Napalm, Rampage man. Gotta get my hands on Palomino! Will nVidia's Crush chipset finally put an end to the VIA nightmares? ;) Things were developing a lot more rapidly and obviously back then though. I had a 486 only about 4 years prior to that!

For my own desktops I usually look at new hardware (but use it for years). I do love digging for late-model everything else, though. Phones, tablets, notebooks. I suppose I'm less sold on exciting speed gains for those things and don't feel they have a lot of value. And they're more breakable, physically wear out, or just fall apart in general because they weren't built well, so why invest much in them?
 
I still have loads of vintage stuff lying around... I just can't part with it, and that Rampage II Extreme with Tri-Fire HD 4890s is just... :love:
I have a Powercolor PCS+ 4890 (950MHz). I've been casually watching for a cheap 3870x2 or 4870x2 to play with.

The unfortunate thing about the HD 4000 cards is that ATI killed driver support so early. And I remember 2012, the last year of support, having lots of broken releases for them.

I tried a little comparison of Dishonored on the GTX 285 and the 950MHz 4890, and the 285 runs it about twice as fast. I figure the drivers are the problem there. It's just a DirectX 9 game, but it came out after support was basically finished.
 
Last week I put my old HD 5770 in my cousin's HTPC because we were going to play some local co-op with his family. It had a GT 610, which couldn't even play Lara Croft and the Temple of Osiris smoothly, and it also stuttered in Overcooked.
All browsers except Firefox say hw acceleration isn't working anymore. I haven't played around with enabling it manually, but the day default settings no longer work is, I think, the point where it's hard not to justify an upgrade.
The 8800GT reported hw acceleration working, but that was back in early 2020. Never checked it on the even older 6600GT though, IIRC.

Five years ago, I probably would have tried my HD 4870 and 8800GT in the last games to support them, for example GTA V. I tried BF3 and BF4 on the HD 4870 to verify it was working before selling it, and performance was crap, though that might just as well have been the Q8200 in that build running too hot.
I agree that AMD's lack of support for their DX10 cards probably killed them prematurely. I saw plenty of Nvidia users reporting good performance where AMD users with the typical counterpart card reported bad performance and/or artefacts.
 
Yeah, the 5000 and 6000 series drivers were killed off back in 2015. They topped out at WDDM 1.3 too, so the Windows 10 drivers are rather Windows 8.1-ish. That was crazy because the third-gen AMD Richland APUs still had 6000 series VLIW GPUs and so only got about 2 years of driver support.

Fermi has WDDM 2.0 drivers, and they were updated through 2018. I have a GT 620 around (the GF119 version, like the 610). Not exactly a fast card, but it's great for desktop use compared to an older IGP.
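
If you want to check what driver model a card reports, dxdiag can dump it to a text report. A rough Python sketch (Windows-only; generating the report can take several seconds):

```python
import subprocess
import tempfile
from pathlib import Path

# dxdiag has no stdout mode, so dump the report to a temp file first.
report = Path(tempfile.gettempdir()) / "dxdiag_report.txt"
subprocess.run(["dxdiag", "/t", str(report)], check=True)

# The display-device section includes a "Driver Model" line,
# e.g. "Driver Model: WDDM 2.0" on a Fermi card with its last drivers.
for line in report.read_text(errors="ignore").splitlines():
    if "Driver Model" in line:
        print(line.strip())
```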
 
My 4770K is still delivering the goods. Sure, it's not giving me the 400-500fps a 12900K or 5800X3D will, but the number of games it can't give me at least 60fps in I can count on one hand.

I'm on the same chip, actually! Just one insignificant bin different.

4790K, but I run it at 4.5GHz on all cores with a ~120W power limit. It only ever throttles three hundred or so MHz below that when torture testing with an AVX "power virus", and it never seriously drops in practice.

I also know there's better, and I know there will be some stutters under 60, but 99%+ of the time I'm stuck against the limits of my GPU and monitor refresh.
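
If anyone wants to watch the throttling live, something like this is enough. A minimal sketch, assuming the psutil package is installed; the clock readout is whatever the platform reports:

```python
import time
import psutil

# Log the reported core clock once a second, e.g. while an AVX torture
# test runs, to see how far below the all-core target the chip drops.
for _ in range(60):
    freq = psutil.cpu_freq()  # MHz; may be None on exotic platforms
    print(f"{freq.current:7.0f} MHz")
    time.sleep(1.0)
```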

I think you'll both change your mind in about 5 minutes as soon as you upgrade to a new CPU. And you'll probably see that even games you thought were running well actually were not. The biggest change you'll get is not a higher framerate but more stable lows. Every game will run more fluidly. Check out the 0.2% lows for Cyberpunk below. All those CPUs except the bottom two are at 60 or above in averages.

(Attached image: Cyberpunk 2077 CPU benchmark chart)
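
For anyone unfamiliar with the metric, here's roughly how a 0.2% low is computed from a frame-time log. A quick Python sketch; the frame times below are made up (tools like CapFrameX or FrameView export the real thing):

```python
import numpy as np

# Hypothetical frame-time log in milliseconds.
frametimes_ms = np.array([16.7, 16.9, 16.5, 17.1, 40.2, 16.8,
                          16.6, 33.5, 16.7, 16.9])

avg_fps = 1000.0 / frametimes_ms.mean()

# One common definition of the "0.2% low": the FPS corresponding to the
# 99.8th-percentile frame time, i.e. only 0.2% of frames were slower.
p998_ms = np.percentile(frametimes_ms, 99.8)
low_0p2_fps = 1000.0 / p998_ms

print(f"average: {avg_fps:.1f} fps, 0.2% low: {low_0p2_fps:.1f} fps")
```

That's why two CPUs with the same average can feel completely different: the lows are where the stutter lives.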

Yeah, my 4790K is certainly not flawless, but overclocked as it has been for years, it's faster than a 7700K at stock. And I haven't played Cyberpunk yet; I don't have a raytracing card.

If the worst I get, in a game I don't play, is 30+ fps for 0.2% of the time, I can live with that for the time being.

Things will no doubt change once cross-gen is over, but so far, even in the most demanding games that I play, it appears to be one or two threads that limit the minimums the vast majority of the time. I disabled SMT years ago due to Meltdown and Spectre and all that, but if anything, that helped the minimums of the games I was playing (note: anecdotal, not reliably quantified). I've reactivated SMT a few times just to see how things are, but it's made no real difference on the whole...

... apart from compilation times. Compilation really does seem to like many threads.
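
Rough illustration of why: a build is lots of independent translation units, so it's embarrassingly parallel. A toy Python sketch, nothing to do with any particular compiler; the "compile" is just a CPU-bound stand-in:

```python
import multiprocessing as mp
import time

def compile_unit(n: int) -> int:
    # Stand-in for compiling one translation unit: pure CPU-bound work.
    total = 0
    for i in range(2_000_000):
        total += i * n
    return total

if __name__ == "__main__":
    units = list(range(64))  # pretend project with 64 source files
    for workers in (1, 4, 8):
        start = time.perf_counter()
        with mp.Pool(workers) as pool:
            pool.map(compile_unit, units)
        print(f"{workers} workers: {time.perf_counter() - start:.1f}s")
```

No shared state between units, so it scales close to linearly until you run out of cores, which is exactly where SMT finally earns its keep.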
 

Yeah, Fermi reports WDDM 2.3. Kepler is also legacy now, but according to folks it reports WDDM 3.0 on Win11. Kepler for modern games was horrible though; Vulkan in particular seemed to be broken. Horrible frametimes in the last Wolfenstein, and Doom Eternal jumped all over the place from 10-30 FPS, whereas even HD 7700 cards could get decent performance.

I wonder how severe the security vulnerabilities could be for the old cards though. I hadn't really thought about that. Nvidia has released plenty of security fixes since Fermi was abandoned, and their Linux driver for Fermi has also gotten these fixes.
 

Kepler went in the 'wrong' direction back then; GCN products (compute) aged much better. Even a 7850/7870 performs decently, especially compared to, say, a 660/670. Kepler is indeed legacy; I think the last driver was from April or May this year, with official W11 and WDDM 3.0 support (tested on a GTX 670).
 
I've read about the issues with Kepler and newer games. I haven't used Kepler on the desktop for games in ages, so I haven't been able to experience it firsthand. Maybe it is indeed related to some mitigations. The GT 650M was my only Kepler chip for games.

I still occasionally get out the Shield Tablet and gamepad and try to find something to play. GK20A GPU. That has OpenGL ES 3.2, OpenGL 4.3, and Vulkan support, but so few games ever took advantage of it. The ubiquitous Unity engine indie games tend to be problematic as well. Unity developers naturally don't test on the tiny Tegra installed base, and they seem to like 16-bit color formats. That is absolutely terrible on a modern Nvidia GPU: tons of banding. It looks fine on every other GPU though, with their fun dithering patterns giving a kind of added detail.
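
To illustrate the banding/dithering point, here's a toy numpy sketch: quantizing a smooth gradient to a 5-bit channel (like RGB565's red/blue) creates hard bands, and ordered dithering is the trick that hides them. Purely illustrative, not actual Unity or Tegra behavior:

```python
import numpy as np

# A smooth 8-bit gradient, standing in for a game's sky or fog.
gradient = np.tile(np.linspace(0.0, 255.0, 256), (64, 1))

def quantize_5bit(img):
    # Truncate to 5 bits (32 levels), like the red/blue channels of an
    # RGB565 target, then scale back up for comparison.
    levels = np.clip(img, 0, 255).astype(np.uint8) >> 3
    return levels * (255.0 / 31.0)

banded = quantize_5bit(gradient)  # plain truncation: 32 hard bands

# Ordered (Bayer) dithering: a small position-dependent offset before
# quantizing breaks the bands into a fine pattern the eye averages out.
bayer4 = np.array([[ 0,  8,  2, 10],
                   [12,  4, 14,  6],
                   [ 3, 11,  1,  9],
                   [15,  7, 13,  5]]) / 16.0
threshold = np.tile(bayer4, (16, 64))  # tile 4x4 over the 64x256 image
dithered = quantize_5bit(gradient + (threshold - 0.5) * (255.0 / 31.0))

# The eye blends nearby pixels: compare column averages to the original.
err_banded = np.abs(banded.mean(axis=0) - gradient.mean(axis=0)).mean()
err_dither = np.abs(dithered.mean(axis=0) - gradient.mean(axis=0)).mean()
print(f"mean error, plain: {err_banded:.2f}  dithered: {err_dither:.2f}")
```

Skip the dither step, as modern Nvidia apparently does with these formats, and you get the clean 32-step staircase, i.e. the banding.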
 
It's equal in raw performance to a GTS 450. I've had a laptop with that GPU, teamed with a 3630QM. It was extremely slow even at the time.
Eh, well, when it was new it was alright. I played Wolfenstein: The New Order on it. I remember being sure to get the GDDR5 version.

I moved to an 860M pretty quickly though. It's about 2x faster.

And then to a 980M a couple of years after that. And I still have a 980M notebook because I just don't need more there.

Notebook journey: Radeon 9250 > 9600 > GeForce 6800 > 7800 > Radeon 5870 > GeForce 650M > 860M > 980M. :D
 

I had the 2GB GDDR5 version (Dell 17R); the GTX 275 I had in 2009 was about twice as fast. Laptop GPUs have improved a lot these days. With Pascal they became equal to the desktop variants, I believe (a 1050 in a laptop is exactly like the dGPU, and so on).
The GTX 980M is quite close to a desktop GTX 970 if not throttled (I did some comparisons back then; 8GB GDDR5). Quite impressive.
 
Yeah, from what I understand the 980M has all of GM204's ROPs but more SMs disabled than the 970. I don't think it has the 3.5GB RAM quirk of the 970.

NV did release a full desktop-like mobile 980 later on.

They were still screwing around with desktop vs mobile model numbers back then though. Like a GTX 960 on the desktop is the 965M, and the 960M is much slower (GT 750 / 860M chip).
 