Value of Consoles vs PC *spin off*

So the “comfortably above” claim is wishful thinking or revisionist history on your part. Sorry. Don’t let facts get in the way though, if you believe it then it is true; for you at least
 
This surprises me, what PC VR do you use? Oculus is basically a self contained ecosystem that essentially behaves as if you have a console plugged into your PC. Granted you can do lots of tinkering with it outside the ecosystem but if you stick within the ecosystem it's a console like experience as far as I can see.
Well I had the Rift CV1, which was largely OK, but it wasn’t as easy to set up - I gave up on the remote after 10 mins because it simply wouldn’t be detected, and setting up the sensors was a PITB too.

Recently got a Quest 2, which requires side loading to get wireless working (thanks Facebook!) - then it’s figuring out how it all works etc. I'm also having issues with buying the wrong version of games (i.e. I accidentally bought the PC version even though there’s a Q2 version) as the store is rubbish at explaining what’s what. PSVR is so much clearer and easier to use IMHO.

Having said that, now that I know the pitfalls I know (better) how to avoid them!
 
I'm not disputing your overall point that consoles are a slicker experience - they clearly are. But it's also worth highlighting how far PCs have come in that regard since the days when this kind of argument pretty much formed the backbone of any PC/console comparison.

I don't disagree. The norm for consoles is that things update without you doing anything, whereas most software on Windows requires user input. You can schedule your PC to wake and sleep on a routine, but it'll still only update on that fixed routine, and some updates simply won't happen unless you are past the Windows login screen. Windows updates, infamously, happen when they feel like it, but the issue for games is that user interaction is required for drivers, Steam, Uplay, EGS and - I think - Origin updates. Steam is the worst because when it detects and downloads a new client version, it won't allow any game updates until the client is restarted.

Maybe part of the answer is how Macs handle updates. Even on laptops you can enable 'Power Nap', which periodically wakes the Mac to check for new email, finish uploads and downloads, and auto-update software. It required a whole new API, but so much just happens in the background without you thinking about it.

Microsoft have it within their power to offer a console-like kiosk interface for Windows. They've already built it for Xbox.
 
They specifically showed the POM setting, which was tied to Ultra settings, completely tanking the frame rate due to the GPU not having enough RAM.

Did you read the article? It specifically says POM is tied to texture quality and matches the PC's Very High setting. They then go on to state that increasing texture quality from High to Very High decreases frame rate from 50fps to 45fps while everything else is set to High and post effects are on Ultra. Is 50→45fps how you define "tanking the frame rate"?
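For scale, here's what that drop works out to in frame time, using the 50 and 45 fps figures quoted above (the fps numbers come from the article; the code itself is just a quick illustration):

```python
# Convert the article's fps figures into per-frame render times (ms)
# to show the scale of the High -> Very High texture quality drop.
fps_high, fps_very_high = 50, 45

frame_ms_high = 1000 / fps_high            # 20.0 ms per frame
frame_ms_very_high = 1000 / fps_very_high  # ~22.2 ms per frame

drop_pct = (fps_high - fps_very_high) / fps_high * 100
print(f"{drop_pct:.0f}% fewer frames, "
      f"{frame_ms_very_high - frame_ms_high:.1f} ms more per frame")
# 10% fewer frames, 2.2 ms more per frame
```

A 2.2ms cost per frame is real but hardly catastrophic, which is the point being made.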

Given the consoles run this game at 30fps, I'm not seeing the problem here.

They even go on to say that the 1GB 260X can stay above 30fps with these same settings if post effects are moved down from Ultra to very high. They specifically state that this is despite the GPU's paltry 1GB.

So the “comfortably above” claim is wishful thinking or revisionist history on your part. Sorry. Don’t let facts get in the way though, if you believe it then it is true; for you at least

What are you talking about? The article shows a 2GB 750Ti achieving 30-50fps at console matched settings.

Are you really suggesting a GTX 670 which is over 50% faster wouldn't be able to best that? Because you would be wrong. Here's a 680 running the game at pretty much max settings and 2560x1600 resolution at 40 fps. The 670 is only a little slower.

https://kotaku.com/grand-theft-auto-v-benchmarked-pushing-pc-graphics-to-1698670906
 
PC has made great strides, but the immediacy of consoles makes them worth the negatives when you have a busy lifestyle or kids.

Yeah, and for families there's the bonus that kids are a lot less likely to b0rk the system. A dumb kid can't download haxx, or be tricked into installing something bad, and the parent doesn't have to worry about creating PC user accounts with reduced privileges (if they even know how to start along that road).

Consoles and PCs are both fantastic (never more so than now), but they're different, and trying to determine an inherently "better" option is a fool's errand.

Initial cost, cost over lifetime, performance, available software, flexibility, customisability, simplicity, robustness... even form factor... are all variables, and only an individual can pick the better option for their circumstances.
 
I don't disagree. The norm for consoles is that things update without you doing anything, whereas most software on Windows requires user input. You can schedule your PC to wake and sleep on a routine, but it'll still only update on that fixed routine, and some updates simply won't happen unless you are past the Windows login screen. Windows updates, infamously, happen when they feel like it, but the issue for games is that user interaction is required for drivers, Steam, Uplay, EGS and - I think - Origin updates. Steam is the worst because when it detects and downloads a new client version, it won't allow any game updates until the client is restarted.

I'm on my home PC every day and have fairly fast internet, so I don't really feel the impact, but I can see that for a less frequent home user it can be a pain. That's probably something PC gamers don't normally notice, because it's pretty routine: start up, click a couple of prompts, look at YouHub over dinner and then fire off into a game.

It's a bit hypocritical, because when I fire up my X360 every now and again and it wants to do a dash update I'm like "wtf console I'm sat here waiting!". Yeah, not very fair of me I know. It's actually pretty amazing that MS still support the 360 and its online stuff.

Maybe part of the answer is how Macs handle updates. Even on laptops you can enable 'Power Nap', which periodically wakes the Mac to check for new email, finish uploads and downloads, and auto-update software. It required a whole new API, but so much just happens in the background without you thinking about it.

MS are talking about introducing a new standby mode for Win 10 which is more like a phone's sleep mode. I guess it will need a new API for apps to benefit from it too.

It's supposed to be built on "Connected Standby" from a few years ago, which I disabled on my "huge mistake" Atom netbook because it was almost as bad for battery life as leaving the cheap POS running.

Edit: I think this is the boy:

https://docs.microsoft.com/en-us/windows-hardware/design/device-experiences/modern-standby
 
My 32GB i9 + 3080 destroys my PS5 technically, but the console experience is unbeatable.

When playing the games themselves, it's about the experience then, not what's outside of gaming. Anyway, if the PS5 is giving a better experience than that high-end PC, you're doing something wrong.

a 770 will not like console level textures in GTA V, as an example

My GTX670 PC did GTA5 much better than the base consoles at least, in all regards.
 
I meant some specific examples of benchmarks showing Pascal performing significantly worse vs Turing than we would expect it to based on Turing launch time reviews. i.e. evidence that Pascal performance has dropped off a cliff.

Most of the games above are included in the two reviews I linked above and thus already accounted for in my conclusion that Pascal in fact hasn't dropped all that much performance at all since Turing launched, using the 1080Ti and the 2080 as examples.
Forza Horizon 4
1 2 3
BFV
1 2
RDR 2
1 2 3
Star Wars Squadrons
1 2
Godfall
1 2
Dirt 5
1 2
Doom games
1 2 3
World War Z
1 2
Division 2
1

Ehh, this takes really long so I don't want to do every game, but that's a decent amount. I was more referring to Nvidia performance dropping off relative to its directly competing AMD GPU.
 
Sorry if the 670 did run all the games correctly.
A coworker who has an old GTX770 rig showed me last year how RDR2 ran like garbage on his system, as did some other games. I assumed the GTX670 was even worse.
Maybe the GTX670 is much stronger and games are better optimized for it compared to the GTX770. Maybe Nvidia killed the drivers so that the newer cards appeared to perform better? I honestly don’t know.
 
This is the first generation of consoles I'm sitting out since the PS1 days. It's a combination of shifting tastes and getting more from the PC than I can from the console. Like OLED was a game changer in the TV PQ space, RT is the same to me in the graphics space. This generation of consoles will not provide that and I can only get that from the PC upgrade cycle.

As my taste shifted towards sim racing in VR, the PC is simply a head and shoulders better platform for peripherals, game choice and customisation, with no glass ceiling so to speak.

As I predicted for quite some time before these consoles came out, they're effectively mid-range PCs with the benefit of on-box optimisation, and that's what you're seeing. I can simply get a better experience elsewhere for my needs.

The consoles are still great value and will obviously be cheaper than an equivalent PC due to subsidies, economies of scale and fewer middlemen needing to make a profit. That's how it's always been.
 
This is the first generation of consoles I'm sitting out since the PS1 days. It's a combination of shifting tastes and getting more from the PC than I can from the console. Like OLED was a game changer in the TV PQ space, RT is the same to me in the graphics space. This generation of consoles will not provide that and I can only get that from the PC upgrade cycle.

As my taste shifted towards sim racing in VR, the PC is simply a head and shoulders better platform for peripherals, game choice and customisation, with no glass ceiling so to speak.

As I predicted for quite some time before these consoles came out, they're effectively mid-range PCs with the benefit of on-box optimisation, and that's what you're seeing. I can simply get a better experience elsewhere for my needs.

The consoles are still great value and will obviously be cheaper than an equivalent PC due to subsidies, economies of scale and fewer middlemen needing to make a profit. That's how it's always been.

Did you primarily use Xbox as console?
 
We can add Cyberpunk to the long list of games where Pascal bombs out. Not surprising at all but yeah, Nvidia gonna Nvidia. 2080 almost 50% faster than 1080ti lol.
 
We can add Cyberpunk to the long list of games where Pascal bombs out. Not surprising at all but yeah, Nvidia gonna Nvidia. 2080 almost 50% faster than 1080ti lol.

Which generation of GPUs was it again when NV started focusing on compute for real?
 
I see a lot of friends who only had Xbox move to PC. It’s like having an Xbox but with better graphics and more games/genres :)

I've seen people move from both PS/XB to PC this 8th generation. It's not just about the best graphics for all multiplat games, but also games exclusive to PC, and of late, to some extent even Sony games make it to the platform. It's nice to be in the middle-land.
 
I’ll be honest, if my favorite games (which are more often than not GOTY-nominees, winners, PS-exclusives) were on PC, day one...

then I would not care about Bill Gates’ mess of a garbage, candy-crush-promoting start menu, telemetry-collecting, BS-performance abomination of an OS attempt :p
I would be ordering north of 1500 euro of PC parts and building it in an afternoon.

but without the games, I will only endure my Intel NUC, which allows Windows to shit the bed once every 6 months. Like now it again wants me to create a F- MS account; I've told it repeatedly over several months that I don’t want to. And I actually bought a Windows 10 license, mind you.
Anyway, I can see how people put up with Windows if it allows their favourite games to run at a higher resolution.
 
Yes, but then if you have a stance against MS products it's a totally different case. It's the same stance I see some having against Apple, Samsung etc.
I personally think W10 is the best Windows so far, very gaming friendly and no hassle. Also, gaming on PC gives you more than just a higher resolution; there are things like performance, ray tracing, settings, controls etc. For example, 2077 will most likely run at (much) lower settings, with heavily reduced ray tracing and lower performance.

Some don't want to pay extra for online play on top of their internet connections, or deal with closed-box environments. Different markets, and they both can exist, apparently.
 
Forza Horizon 4
1 2 3
BFV
1 2
RDR 2
1 2 3
Star Wars Squadrons
1 2
Godfall
1 2
Dirt 5
1 2
Doom games
1 2 3
World War Z
1 2
Division 2
1

Ehh, this takes really long so I don't want to do every game, but that's a decent amount. I was more referring to Nvidia performance dropping off relative to its directly competing AMD GPU.

Kudos on the effort you put in there. I feel like I should match it now. So here are the results from what you've posted above in terms of Turing vs Pascal performance:

Forza - 2080 is between 1% - 12% faster than the 1080Ti depending on resolution which is perfectly in line with TPU's 8% faster rating on their GPU spec database which is presumably taken at the point of launch.

BFV - 2080 is 33% faster than the 1080 which is in line with the TPU database

RDR2 - 2080 is 15% - 26% faster than the 1080Ti here so this one is definitely above where we would expect Turing to be in relation to Pascal

Starwars Squadrons - the 2080S is 14% faster than the 1080Ti - perfectly in line with the TPU database

Godfall - 2080 is 15% faster than the 1080Ti so a little more than the TPU DB

Dirt 5 - 2080 is 10% faster than the 1080Ti which is in line with the TPU DB

Doom Eternal - 2080 is 26% faster than the 1080Ti so well above where we would expect Turing to be in relation to Pascal

WWZ - 2080 is 4% - 9% faster than the 1080Ti which is in line with the TPU DB

Division 2 - 2060 is 5%-7% faster than the 1080 which is a little above the TPU DB which pegs them as even
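All of the per-game comparisons above come down to the same percentage calculation; a minimal sketch of the method (the fps figures here are made up for illustration, not taken from any review):

```python
def pct_faster(fps_a: float, fps_b: float) -> float:
    """Percentage by which card A's fps exceeds card B's in the same scene."""
    return (fps_a / fps_b - 1.0) * 100.0

# Hypothetical example: a 2080 at 63 fps vs a 1080 Ti at 56 fps
# would be a 12.5% lead, which you'd then compare against the
# launch-era gap from the TPU database.
print(f"{pct_faster(63, 56):.1f}% faster")  # 12.5% faster
```

If the measured lead today roughly matches the launch-era lead, Pascal hasn't lost ground in that title.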

So of the 9 games looked at, only 2 show a significant variation from what we'd expect based on launch performance. And there will of course always be outliers, especially where games take advantage of Turing's newer feature set (something I already called out in the post you were responding to).

I get that you're trying to show AMD's growing performance over time vs Nvidia, but if you cherry pick AMD-friendly games then of course you'll be able to show that. I'm sure the reverse could be shown by cherry picking Nvidia-friendly games. That said, I won't deny that AMD consistently seems to gain ground over time, but that's different to Nvidia performance falling off a cliff like it did with Kepler (where the equivalent today would be something like the RX580 performing in line with a 2080Ti). Those gains can probably be attributed to AMD picking up more console-level optimisations than Nvidia, as opposed to the Kepler situation of its architecture simply being unsuited to modern games.

We can add Cyberpunk to the long list of games where Pascal bombs out. Not surprising at all but yeah, Nvidia gonna Nvidia. 2080 almost 50% faster than 1080ti lol.

Let me remind you of what I said earlier in this post:

"It seems to me that developers and Nvidia offer good support for at least n-1 architectures, which would give a typical architecture 4 years of well-supported life. Pascal for example is still more than capable in any new game now, a little over 4 years from its launch. But I do expect it to start falling behind now that Ampere has launched and it's likely receiving less support from Nvidia"

It seems Cyberpunk falls into that description perfectly. Pascal is now 2 generations and more than 4 years old so we should expect some performance loss. Especially in a game known to be using DX12U features which we know Pascal lacks.

Remember, this discussion started with you claiming that Nvidia performance "falls off a cliff" after 18 months, which is what I'm disputing. I still see no evidence of that.
 