Digital Foundry Article Technical Discussion Archive [2015]

Just optimizations they've found and implemented since release. Ones that may not have been fully stable and bug-free and thus weren't released in prior patches. Basically just normal stuff with the better developers. CD Projekt Red has a history of doing this on PC, and it's good to see they continue it on console as well.

Regards,
SB
 
Hmmm? Locked 60 was always an option for a PC gamer during the PS2 era.
If your machine was capable, yes. But games weren't designed for your particular PC build, and generally weren't targeting a stable 60 fps on the majority of machines. They also didn't balance workload well, at least in the games I played. So in something like Neverwinter Nights, if I disabled vSync I had runaway screen tear at times and smoother gameplay at others. If I enabled vSync, the framerate would plummet at times. No amount of tweaking settings could get a stable 60 fps at all times (these days we'd call it a soft lock: vSync on to stop runaway framerates, unlocked when it drops below 60 fps). Contrast that with BG: DA on a much less powerful PS2, which was a constant 60 fps with no tearing because it was designed to run that way on fixed hardware. That's why I tended to prefer console play back then. Console games by design tended to favour a fixed refresh in a way PC games just couldn't.
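(For illustration, a minimal sketch of that kind of soft cap, assuming a hypothetical render_frame() call standing in for the per-frame work and a 60 fps target; this isn't any particular game's implementation:)

```cpp
#include <chrono>
#include <thread>

// Hypothetical stand-in for the game's per-frame work (update + draw).
static void render_frame() {
    std::this_thread::sleep_for(std::chrono::milliseconds(5));
}

int main() {
    using clock = std::chrono::steady_clock;
    const auto target = std::chrono::microseconds(16667); // ~1/60 s per frame

    for (int i = 0; i < 300; ++i) {
        const auto start = clock::now();
        render_frame();
        const auto elapsed = clock::now() - start;
        // Frame finished early: sleep out the rest of the 60 fps slot.
        // Frame ran long (below 60 fps): continue immediately, i.e. "unlocked".
        if (elapsed < target)
            std::this_thread::sleep_for(target - elapsed);
    }
}
```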
 
Just optimizations they've found and implemented since release. Ones that may not have been fully stable and bug-free and thus weren't released in prior patches. Basically just normal stuff with the better developers. CD Projekt Red has a history of doing this on PC, and it's good to see they continue it on console as well.

Regards,
SB

Optimization is one thing... but locking the frame-rate at a suboptimal level (one that still had major drops) doesn't make any sense. The XB1/PS4 hardware aren't all that different from each other... and their SDKs (OS/toolchains/APIs/etc.) aren't alien in nature either.

Yes, I agree optimizations are always made... but that doesn't explain their (CDPR) decision to lock one and not the other, when the two (XB1/PS4) have more in common than any previous generation.
 
If your machine was capable, yes. But games weren't designed for your particular PC build, and generally weren't targeting a stable 60 fps on the majority of machines. They also didn't balance workload well, at least in the games I played. So in something like Neverwinter Nights, if I disabled vSync I had runaway screen tear at times and smoother gameplay at others. If I enabled vSync, the framerate would plummet at times. No amount of tweaking settings could get a stable 60 fps at all times (these days we'd call it a soft lock: vSync on to stop runaway framerates, unlocked when it drops below 60 fps). Contrast that with BG: DA on a much less powerful PS2, which was a constant 60 fps with no tearing because it was designed to run that way on fixed hardware. That's why I tended to prefer console play back then. Console games by design tended to favour a fixed refresh in a way PC games just couldn't.

For any machine, you just had to configure it for the performance you wanted. If you had a lower end machine that just meant you had to sacrifice some IQ.

It's basically the difference between:

1. Fixed platform, fixed graphics. This is what you're going to get and you can't get anything better or worse.
2. Variable platform, variable graphics. We'll provide you with everything you need to take advantage of your platform at whatever speed and/or quality you desire.

There are benefits to both systems. If you don't want to muck around with settings, [1] is safer and less hassle than [2]. But [2] can always attain what [1] offers if needed/desired, while [1] can never attain what [2] provides.

As to the problems with Neverwinter, there are also myriad examples of console games that couldn't maintain consistent performance. So pointing to Neverwinter and saying it's representative of PC gaming would be as incorrect as pointing to something that was inconsistent on PS2 (like Shadow of the Colossus) and saying that's representative of console gaming. Or Heavenly Sword on PS3. The same situation exists in every console generation: there are always games that don't perform consistently. The same goes for PC, but on PC there are generally ways to mitigate the inconsistency or remove it entirely. Granted, there were situations where a particular game (say, Half-Life 2) combined with certain graphics hardware's Vsync implementation would offer less than optimal performance, while other hardware (from the same graphics vendors!) would perform smoothly with Vsync on. But even that is better than the game being inconsistent with no way to change it.

And with your Neverwinter example: what if it had been like Shadow of the Colossus, with inconsistent performance and Vsync forced on? That would be your only option, even if on average you would have preferred a higher framerate. That hypothetical console version would give you no options, but on PC you could turn off Vsync and adjust settings.

Of course, that means mucking around with settings. Not something everyone likes to do. And then we're back to the benefits of having a console at the cost of being able to configure graphics/performance to where you'd like.

Regards,
SB
 
Of course, that means mucking around with settings. Not something everyone likes to do.

People often cite this as a universal disadvantage of PCs without considering that many PC gamers thoroughly enjoy that aspect of the platform. Not everyone does, but it really bugs me when it's cited in the context of "always a bad thing" (not saying anyone here is doing that).

It's analogous to the Android (open but more complex) vs Apple (locked down but simple) situation. One isn't generally considered a universally better solution than the other (outside of the hard-liners, of course). But in the console/PC debate it almost feels a little taboo to argue against that preconception. As though, whether you agree with it or not, you just have to accept that messing around with settings is a "bad thing". I'm guilty of it myself, being drawn into arguments about configuring games; but rather than extolling the virtues of how fun and rewarding that level of customisation can be, I find myself resorting to arguments like "just use GeForce Experience for console-style settings configuration". It's a shame, but I don't see it changing any time soon.
 
People often cite this as a universal disadvantage of PCs without considering that many PC gamers thoroughly enjoy that aspect of the platform. Not everyone does, but it really bugs me when it's cited in the context of "always a bad thing" (not saying anyone here is doing that).

"Not something everyone likes to do," means there are people that like to do it, or at worse don't hate doing it. But for those that don't like doing it, it is a disadvantage.

People that claim it's a universal disadvantage are just extreme fans of their particular platform and you're unlikely to get anywhere with them if you attempt to explain why people might like to game on PC.

And holy carpe I'm spending too much time posting today. Need to get back to other stuff in RL. :D

Regards,
SB
 
"Not something everyone likes to do," means there are people that like to do it, or at worse don't hate doing it. But for those that don't like doing it, it is a disadvantage.

Yep, the post wasn't directed at you; it was just your raising of the concept that sent me off on a little sidebar ;)
 
Optimization is one thing... but locking the frame-rate at a suboptimal level (one that still had major drops) doesn't make any sense. The XB1/PS4 hardware aren't all that different from each other... and their SDKs (OS/toolchains/APIs/etc.) aren't alien in nature either.

Yes, I agree optimizations are always made... but that doesn't explain their (CDPR) decision to lock one and not the other, when the two (XB1/PS4) have more in common than any previous generation.
Honestly, this probably does a better job of pointing out how little we understand about the hardware, what the engine is trying to do, and what resources it's looking for.

We don't get it because we don't know enough about the situations that drove them to these decisions. I'm sure if they explained it to you, you'd have an a-ha moment.

PS4 and XBO are very different when it comes to memory structure. One works with two pools (the XB1's DDR3 plus its small, fast ESRAM), the other with a single unified pool (the PS4's GDDR5). One is similar to a PC, the other is not. Not having the right data available at the right time can severely harm your performance, just like not having enough space can harm your performance as well.
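(As a purely hypothetical illustration of the kind of placement decision a two-pool layout forces on an engine - the 32 MB figure, the surface names and the greedy rule below are made up for the sketch, not taken from any real title:)

```cpp
#include <cstddef>
#include <cstdio>

enum class Pool { FastSmall, LargeSlow };

struct Surface {
    const char* name;
    std::size_t bytes;
    bool bandwidth_critical; // e.g. a render target written every frame
};

// Greedy placement: bandwidth-critical surfaces go into the small fast pool
// while it has room; everything else (or overflow) falls back to the big pool.
Pool place(const Surface& s, std::size_t& fast_bytes_left) {
    if (s.bandwidth_critical && s.bytes <= fast_bytes_left) {
        fast_bytes_left -= s.bytes;
        return Pool::FastSmall;
    }
    return Pool::LargeSlow;
}

int main() {
    std::size_t fast_left = 32u * 1024 * 1024; // small fast pool (illustrative)
    const Surface surfaces[] = {
        {"g-buffer",      24u * 1024 * 1024, true},
        {"depth buffer",   8u * 1024 * 1024, true},
        {"shadow map",    16u * 1024 * 1024, true},  // won't fit: overflows
        {"texture pool", 512u * 1024 * 1024, false},
    };
    for (const Surface& s : surfaces) {
        const Pool p = place(s, fast_left);
        std::printf("%-13s -> %s\n", s.name,
                    p == Pool::FastSmall ? "fast pool" : "large pool");
    }
}
```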
 
For any machine, you just had to configure it for the performance you wanted. If you had a lower end machine that just meant you had to sacrifice some IQ.
But you couldn't, though. As I say, dropping quality (of which there were limited options back in the day) led to racing framerates, which could still crash in busy scenes.
Davros said PCs wouldn't lock the framerate. That's because they couldn't. They put the control in the PC owner's hands, while consoles let the devs tailor the game to the hardware. I think basically you're agreeing with me. ;) I was just explaining to Davros the difference, and why forced vSync wasn't a thing on PC the way it was on consoles. There were even faster-than-60 Hz monitors at the time, so capping a game at 60 Hz on PC made zero sense.
 
How the hell would that work on a display with a fixed 60Hz refresh?

See, the HDTV standard is to refresh the screen 60 times per second, but if the engine only produces 50 frames, you can't present a new image on every refresh. So you'd get doubled frames, and thus considerable judder, for 10 frames every second. The hold pattern would be like 1-1-1-1-2, 1-1-1-1-2... with every fifth frame stuck on screen for two refreshes. Pretty damn annoying.

With a 60Hz fixed refresh rate, you can only get proper frame pacing at rates that divide evenly into 60: 60fps, 30fps, 20fps, 15fps, 12fps, 10fps, etc. 50fps can't be done properly, and neither can 48 or 35 or any other number.

If the display could do 120Hz, you could also aim for 40fps as a compromise and then you'd get each frame displayed for 3 refreshes. But at 60Hz you're restricted a lot more.
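(A quick way to check that even-pacing argument for any refresh rate - it's plain arithmetic, with nothing engine-specific assumed:)

```cpp
#include <cstdio>

// A framerate can only be displayed with uniform frame times if the refresh
// rate divides by it exactly (each frame is held a whole number of refreshes).
void list_even_rates(int refresh_hz) {
    std::printf("%d Hz display, evenly paceable framerates:", refresh_hz);
    for (int fps = refresh_hz; fps >= 10; --fps)
        if (refresh_hz % fps == 0)
            std::printf(" %d", fps);
    std::printf("\n");
}

int main() {
    list_even_rates(60);   // 60 30 20 15 12 10
    list_even_rates(120);  // adds 40 and 24, among others
    // 50 fps on a 60 Hz display: 60/50 = 1.2 refreshes per frame, so every
    // fifth frame has to be held for 2 refreshes -> the 1-1-1-1-2 judder.
}
```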
 
I have yet to see a TV in Europe that doesn't support 50Hz. German TV is still broadcast at 50Hz (probably all of Europe is). Not sure how well they do it, but they all support it. I could try it with my PC... but I don't think I have the time/motivation to do it^^
 
Europe is probably quite a lot less than half the worldwide market. So what should game developers do?

- If they have a game running at 60fps, they could perhaps put in some more effort to have it run at 50fps with a few tiny extra graphical details. Most people wouldn't notice the difference; but those players couldn't play online with the rest of the world gaming at 60fps, so they'd need a second server park and separate playlists and all. It'd also add an extra line to the QA matrix, increasing costs significantly.

- If they have a game running at 30fps, they'd have to dumb down the graphics to a level everyone would notice, add another line to the QA matrix, and the EU players would still be unable to play with the rest of the world. It'd probably be a PR catastrophe.

Neither choice seems to be a good one - it'd increase costs and reduce sales either way. You'd get complaints and flame wars either about the graphics being dumbed down to get from 30 to 50fps, or simply anger about not being able to play with the rest of the world at 60fps. So it makes no sense to go for anything other than 30 or 60fps worldwide.

Some games could still go for an unlocked framerate, like God of War 3, but that has its own set of caveats...
 
As for what the QA matrix means... Let's say you want to release a game with single player only. Testing means you need to have a bunch of dudes playing through the game many, many times, in order to check whether there are any bugs that could cause the game to crash, or even keep it from being completed.

Now you want to release it on X1 and PS4; that means you have two different versions, so the QA effort has to double - people have to test it on both platforms just as hard.

Now, maybe you have both a Blu-ray disc release that caches to the hard drive, and a digital download that lives entirely on the HDD. That doubles your testing to 4 versions of the game.

Now, maybe you want to offer a choice between 30fps and 60fps, or between 720p and 1080p. Any one such choice again doubles the versions, and now you have 8.

So you can see where it's going - every additional significant choice basically doubles the number of versions that have to be certified by QA, and you can quickly get to 32 or 64 or even more. Adding extra options is going to be really expensive, so publishers would probably prefer to avoid it.
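(A tiny illustration of that doubling, with made-up option names; five binary choices already mean 32 passes through the full QA matrix:)

```cpp
#include <cstdio>

int main() {
    // {option, number of variants} - illustrative, not from any real title.
    const struct { const char* option; int variants; } options[] = {
        {"platform (X1 / PS4)",           2},
        {"distribution (disc / digital)", 2},
        {"framerate (30 / 60 fps)",       2},
        {"resolution (720p / 1080p)",     2},
        {"region timing (50 / 60 Hz)",    2},
    };

    int configurations = 1;
    for (const auto& o : options) {
        configurations *= o.variants;  // each independent option multiplies
        std::printf("after adding %-30s -> %d configurations\n",
                    o.option, configurations);
    }
}
```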
 
Now you want to release it on X1 and PS4; that means you have two different versions, so the QA effort has to double - people have to test it on both platforms just as hard [...]

er... no, it doesn't work that way, I believe.
In your game, what changes is (a lot of) #ifdef in the backend (of the backend) of your game engine, mostly - which has its own unit/smoke/whatever testing attached to it.
Of course you need to test the game on the platform, as the entire usermode dll/so changes, but mostly you need to test the coverage of the code on the XB1/PS4/whatever branches.
95%+ of your code will stay the same, given that the basic usermode code is bug-free enough.
Then I believe you adjust for each platform by tuning the engine's settings.
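(A minimal sketch of that kind of per-platform backend split behind one shared interface; the PLATFORM_* macros and present_frame() are placeholders, not any real SDK's API:)

```cpp
#include <cstdio>

// Shared engine-facing interface: the other ~95% of the code only sees this.
void present_frame();

#if defined(PLATFORM_XB1)
void present_frame() {
    // XB1-specific swap/present path would go here.
    std::puts("present via XB1 backend");
}
#elif defined(PLATFORM_PS4)
void present_frame() {
    // PS4-specific swap/present path would go here.
    std::puts("present via PS4 backend");
}
#else
void present_frame() {
    // Desktop fallback used for this sketch.
    std::puts("present via PC backend");
}
#endif

int main() {
    // Game code is identical on every platform; only the backend differs.
    present_frame();
}
```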
 
How the hell would that work on a display with a fixed 60Hz refresh?
The notion is that every TV these days supports both 50 and 60 Hz, as that's cheaper than making region-specific SKUs. If true, it's an option. If not, it's not really. I'm sure this has been discussed in a specific thread.

It's worth noting that console games all used to have separate PAL and NTSC versions running at different resolutions and framerates, and the devs managed. But only because the TVs forced it. Now that TVs can display both, there's little point in regional differences.
 
The notion is that every TV these days supports both 50 and 60 Hz, as that's cheaper than making region-specific SKUs. If true, it's an option. If not, it's not really. I'm sure this has been discussed in a specific thread.

It's worth noting that console games all used to have separate PAL and NTSC versions running at different resolutions and framerates, and the devs managed. But only because the TVs forced it. Now that TVs can display both, there's little point in regional differences.
You mean all TVs regardless of region support 50Hz, right?
 
That actually makes me wonder: what would game devs think about 50fps instead of 60? It's probably still reasonably fluid and much better than 30fps, but they could use up to 20% more computing power...
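(For reference, that 20% figure is just the frame-time budget: 1/50 s = 20 ms per frame versus 1/60 s ≈ 16.7 ms, and 20 / 16.7 ≈ 1.2.)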
 