It might be. Lock ups happen - no-one said they don't. Again, this is about PCs being less stable/usable, not consoles being perfect.
PS4 is based on FreeBSD, which started in 1993. The PC has been around forever but has been shit for a lot of that time, and Windows 8 has had its own set of problems. Don't pretend that because Windows started in '85, it's had 30 years of refinement. It's had decades of rewrites and rejigs with massive legacy support requirements and a few fundamentally crap design choices like the Registry.
Sure, stuff like the Registry is an abomination, and I get that you realize there are issues console side. But it seems like many people still don't accept that, probably because so much of it is hidden from them. As a result, people usually just blame the game code for all errors, even when the fault lies with SDK or OS bugs. That's why when I read that "console issues are overblown", or get the usual anecdotes about how people only have game-related issues on console, I know some of that is actually due to other underlying issues and not the game code at all! We used to have examples back in the day - shame I can't remember them now because it's been 8 years or so - where you'd have errata listing stuff to avoid because it was buggy and/or would crash the console on firmware such-and-such. Games that shipped with those calls found out the hard way, had to patch themselves later, and took the "lazy devs" flak for it even though it wasn't their fault at all. And so you still have people thinking console issues are rare. Well, they aren't as rare as people think.
Any more to what? Who was making a historical comparison? Compare the permutations of PC hardware to those of console hardware (which still runs into a few when different components get used). Motherboard + graphics card == all sorts of potential issues on their own. Then you have CPU, southbridge, northbridge, RAM timings, audio, blah blah components. There are thousands if not millions of combinations of all these pieces.
This depends on how many years back you want to go. I'm thinking in terms of the life of the PS4, in which case there are not that many PC permutations. If you want to go back many years and compare 5 year old Sandy Bridge PCs running Nvidia 670s, then sure, you'll get more permutations. But comparing 5 year old PC hardware to a console which didn't even exist then hardly seems fair - if anything, it points to how much better the experience is on that 5 year old PC, since it can still play all current games at zero extra cost to the user.
Oh, please.
Windows has a long history of being great and shit. It's not like it's always been improving. You call Windows 7 a prehistoric OS. If it's so old, why is it still getting patches and fixes? Surely it was perfected long before Win 8 came out? And I presume Windows 8 is also nigh perfect, what with 30 years of development behind it, and never gets patches or fixes.
Constant patches are inevitable for an OS with as massive an install base as Windows, because it's attacked on an all but constant basis. Patches also add support for new hardware, which is a bonus and/or expected. And some patches just improve things, because why not? Microsoft is gradually shifting Windows to a patch model rather than a numeric release model, so frequent patches will be the norm.
The core problem of bringing together all sorts of different hardware (and software) into a platform that runs together, and runs all the legacy stuff too, makes it an incredibly complicated task prone to issues. Heck, the fact Windows powers PCs as well and as stably as it does is nothing short of a small miracle! It's an amazing achievement in its way, but it's not as stable/reliable as fixed hardware. It can't be.
Well, I'd argue that it can be, depending on the question. I'll pose you a simple example and let you give the answer. If you had to pick one in each case as the more stable of the two, which would it be:
CASE 1
1) A modern Windows 8 PC.
2) Rev 1.0 of a modern console built 80s style: no internet connectivity, no patching possible, significantly reduced functionality, no OS. On its own it does nothing and in game it adds nothing; it's purely an 80s style game player. Insert game and play, that's it.
CASE 2
1) A modern Windows 8 PC.
2) Rev 1.0 of a modern console built today's style: internet connectivity, patching possible, fully featured, a full OS, does lots of stuff in game, adds non-gaming features along with playing games.
In each case, if you had to choose which you thought would be the more stable bet, which would it be? By what you have been saying, you would make choice #2 in each case. But I wouldn't. In the first case I would choose #2, but in the second case I would choose #1. The fact that choice #2 in each case has more predictable hardware is not enough to sway my choice.
You can't have all that extra complexity and not introduce more chance of failure as a result. And that's reflected in Linux and Android as well as Windows - mixed hardware introduces issues for the OS.
On paper, yes, but it's a solvable software problem. Plus fixed hardware introduces new problems of its own, see below.
I have a game where the graphics broke after an Android update - it stopped drawing textures. That's what can happen on driver-powered abstracted hardware that can't happen on a closed box where the games drive the hardware directly.
It can and it does! In the old days, when console games were not "authorized", manufacturers just let games break when they revised the console hardware. Now that games are authorized, console makers test against them and patch accordingly as they make hardware revisions. These consoles go through many internal revisions as well, and without an OS to abstract it all away, that can in some cases introduce more issues than on systems where games talk via a software layer. One timing change can break everything when you are to the metal.
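To make that last point concrete, here's a minimal, purely hypothetical C sketch of the classic failure mode: a busy-wait calibrated against an assumed clock, the kind of to-the-metal timing code that silently breaks when a hardware revision changes the speed of the machine. The constants and names are invented for illustration and aren't from any real console SDK.

```c
#include <stdint.h>

/* Hypothetical "to the metal" delay, tuned on a specific dev kit clock.
 * Nothing here comes from a real SDK; the numbers are made up. */
#define ASSUMED_CPU_HZ  1600000000ULL   /* clock the game was tuned against */
#define CYCLES_PER_LOOP 4ULL            /* measured cost of one loop iteration */

/* Wait roughly `us` microseconds by burning cycles. Only correct while the
 * real clock matches ASSUMED_CPU_HZ and the loop still costs CYCLES_PER_LOOP. */
static void busy_wait_us(uint64_t us)
{
    uint64_t loops = (ASSUMED_CPU_HZ / 1000000ULL) * us / CYCLES_PER_LOOP;
    for (volatile uint64_t i = 0; i < loops; ++i) {
        /* spin */
    }
}

int main(void)
{
    /* e.g. "wait 100us for the hardware to settle" before poking a register.
     * A revision that raises the clock or retires the loop faster makes this
     * return early, and the code now races the hardware. */
    busy_wait_us(100);
    return 0;
}
```

With an OS or driver layer in between there's at least a place to hide that difference (a timer API that queries the real clock); on a fixed box where shipped games baked the assumption in, a hardware revision has to be made to behave identically or those games break.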
EDIT: Here, to steer this back, because this is taking way too long, the summary of my thoughts:
1) If you want to talk usability then you need to establish a timeline.
2) If the timeline is the PS4's life, then PCs in that timeframe are well sorted and I don't see huge issues on them anymore. There really aren't that many hardware permutations anymore.
3) If the timeline goes back 10 or so years, then PC always wins usability for the simple reason that it will play games that would be a black screen on PS4, because the PS4 simply can't play them at all. More issues? Sure. But you can play more games, which I presume is important to game players who would see a black screen as poor usability.
4) Hopefully people understand that consoles from the PS3 era and forward do have issues, as they have become more PC-like with less mature code. Much of this is hidden and comes across as "game code" issues.