The Witcher 3: Wild Hunt! [XO, PS4, NX, PS5, XBSX|S, PC]

I've asked the question many times, never got an answer as to why it's 60 or 30 but never anything else.
There have been lengthy discussions on it, including what the heck 'framerate' even is and people's use of the word 'average' when discussing it. The answer is that your screen is locked to 60 Hz (on a 60 Hz TV), and anything not rendering at an integer factor of 60 has significantly uneven frame pacing or screen tearing, both of which are very jarring for some (few? most?) users. Avoiding those artefacts and maintaining a constant framerate is easier on the brain. The introduction of G-Sync etc. means fixed framerates can be ditched, as the uneven timing becomes too small to notice and motion appears smooth.
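For the curious, the full list of framerates that pace evenly on a 60 Hz screen is just 60 divided by each whole number. A quick Python illustration (my own, purely to show the arithmetic):

# Framerates that map cleanly onto a 60 Hz display: 60 divided by an integer.
refresh_hz = 60
for n in range(1, 7):
    print(f"{refresh_hz}/{n} = {refresh_hz / n:.0f} fps, each frame held for {n} refresh(es)")
# -> 60, 30, 20, 15, 12, 10 fps. Anything in between (40, 45, 50...) cannot
#    hold every frame for the same whole number of refreshes.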
 
I've asked the question many times, never got an answer as to why it's 60 or 30 but never anything else.
TV video signals send a frame sixty times every second. That's just how the standards are defined. If you want evenly paced frame output (without tearing), the framerate has to be this refresh rate divided by an integer.

For instance, frames at 60fps (60/1) are output as:

A B C D E F G H ....

Frames at 30fps (60/2) are output as:

A A B B C C D D ....

Frames at 20fps (60/3) are output as:

A A A A B B B B ....

But how does 40fps look? There's no way to divide it evenly into the signal, so you have to split the difference through something like:

A A B C C D E E ....

Because frames are displayed for variable duration, it looks uneven and jerky. Judder.
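To make that concrete, here's a tiny Python sketch (my own toy model, not anything the game itself does) that works out how many 60 Hz refreshes each frame ends up being held for:

import math

def refreshes_per_frame(fps, refresh_hz=60, frames=8):
    # Frame i is ready at time i/fps and gets shown on the next refresh tick.
    presents = [math.ceil(i * refresh_hz / fps) for i in range(frames + 1)]
    return [presents[i + 1] - presents[i] for i in range(frames)]

print(refreshes_per_frame(30))  # [2, 2, 2, 2, 2, 2, 2, 2] -> A A B B C C D D, even pacing
print(refreshes_per_frame(20))  # [3, 3, 3, 3, 3, 3, 3, 3] -> every frame held 3 refreshes
print(refreshes_per_frame(40))  # [2, 1, 2, 1, 2, 1, 2, 1] -> A A B C C D E E, judder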
 
Except when the display supports other refresh rates. My display supports 40 Hz, so 40fps with vsync looks good. But TVs generally support 60 Hz (common) and 30 Hz (uncommon).
 
You want it to match the display framerate, or half or double the display framerate, or you'll get irregular screen updates, which is not smooth (EDIT: as htupolev shows). This is what G-Sync et al. address: make the screen update when the GPU says it should, which could also be at 45 fps, or at variable framerates (though the system apparently only works above 30fps or something like that). Additionally, the response time between controller/keyboard input and the display also becomes less predictable, so even there you don't want this value to fluctuate too much.
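To put a number on those 'irregular screen updates': with a steady 45fps render rate on a fixed 60 Hz screen with vsync, the presented frames still land at uneven intervals. A small Python sketch using the same toy model as above (an illustration, not a measurement of any particular game):

import math

REFRESH_HZ, FPS = 60, 45   # steady 45fps render rate on a fixed 60 Hz display

# Frame i is ready at i/FPS seconds and waits for the next vsync tick.
presents = [math.ceil(i * REFRESH_HZ / FPS) / REFRESH_HZ for i in range(10)]
intervals_ms = [round((b - a) * 1000, 1) for a, b in zip(presents, presents[1:])]

print(intervals_ms)
# -> [33.3, 16.7, 16.7, 33.3, 16.7, 16.7, 33.3, 16.7, 16.7]
# A variable-refresh display (G-Sync/FreeSync) could show every one of these
# frames 22.2 ms apart instead, which is why the motion looks smooth there.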
 
On consoles, sure. <golf clap> It could have been so much more, however, if it hadn't been held back by them. As evidenced by the version they had running on PCs before the final console builds went into production.

Regards,
SB

Well, they certainly had something running back then. Doubt it was much of a game, though. Yes, they probably could have made a great, massive game which adhered to those kinds of lofty standards, if they'd had the manpower to create the required assets for more than just a 10-minute snippet of gameplay. They'd then have had to be content with merely selling a hundred thousand copies to the select few who could actually run the damned thing. It's not like the game is light on the hardware as it is.

Having played the game for two days now, I think it's pretty fucking great. It's basically the anti-Dragon Age: Inquisition. Even the smallest quests are interesting and tend to come with their own little surprises and twists. Looks quite lovely too. Character models in particular simply crap all over the entire competition (if we're sticking to RPGs, of course). It wasn't love at first sight, though. Just like in TW2, calling character progression not very exciting (yeah, now I can deal 5% more damage with the fast attack!) would be putting it lightly. Combat is never more than functional either.
 
I've asked the question many times, never got an answer as to why it's 60 or 30 but never anything else.

I still play most of my games "old school" style with vsync off (unless they're hitting a fairly regular >60fps, which is just about never in the games I play at the settings I play at). Vsync off was the accepted standard on PC for decades before Digital Foundry started pushing its holy grail target of 30 or 60fps. Honestly, I think it's overblown. A framerate that varies between 35 and 50 fps 98% of the time has very little tearing that I can see (it only gets bad for me at higher framerates) and feels a damn sight smoother than locking the game to 30fps with vsync - which I almost always try, because I really do want to work out what this holy grail is all about, but I've never been able to find it!
 
I think I've been forcing VSync in the drivers for over 10 years.
I hate tearing, can't stand it.

I do hate it when I notice it (and at high frame rates it can be very noticeable). But at lower frame rates, for example the 35-45ish that I'm getting in both Far Cry 4 and AC: Unity at the moment, it's barely noticeable to me - far less so than the general sluggishness I get in those games from turning vsync on and applying a frame rate limiter.

On the other hand I wouldn't dream of playing Lego Marvel for example without vsync on, which could sustain 120fps (with vsync) on my system.
 
Yup, my sweet spot is also around 40fps. Lower than that is too stuttery and laggy; higher than that and I need to sacrifice graphics quality. But for Witcher 3 I think vsync is a must, because it has lots of "contrast" stuff on screen (trees vs sky, etc.).
 
Xbox One framerate analysis. (Apparently the day 1 patch improves the framerate and locks it to 34fps.)

I don't think "locks it" means what you think it means. V-sync is off, nothing is locked (or capped, which is the correct term). It also seems the dynamic scaling is another PR term; the game sits at 900p for almost all the gameplay, according to DF. They say it jumps to 1080p for menus and cutscenes.
 
To those that asked, GOG.com pre-load allows you to install the game, but then holds back a tiny piece of installation until release.

[image: GOG_Missing_Piece.PNG]

Also got the 'game ready' Nvidia driver installed today.
 
TV video signals send a frame sixty times every second. That's just how the standards are defined. If you want evenly paced frame output (without tearing), the framerate has to be this refresh rate divided by an integer.

For instance, frames at 60fps (60/1) are output as:

A B C D E F G H ....

Frames at 30fps (60/2) are output as:

A A B B C C D D ....

Frames at 20fps (60/3) are output as:

A A A A B B B B ....

But how does 40fps look? There's no way to divide it evenly into the signal, so you have to split the difference through something like:

A A B C C D E E ....

Because frames are displayed for variable duration, it looks uneven and jerky. Judder.
Many many thanks for the detailed technical explanations, as usual. :smile2:

Isn't there a way to make the framerate even without having to resort to 30 fps all the time, by using a mathematical algorithm? So at 40 fps it could be A A B+A C C D+C A+B B B B+C C C A+C D D D+E E E A+E A A B+C C C... well, you get the idea.
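If I read that right, the idea is blending neighbouring frames so the 60 Hz output stays evenly paced. A rough Python sketch of what such a blend schedule could look like for 40fps content on a 60 Hz screen (purely my own illustration of the idea, not something the game does):

# Map 40fps source frames onto 60 Hz refreshes by blending neighbours.
SRC_FPS, OUT_HZ = 40, 60

def label(i):
    return chr(ord('A') + i)

schedule = []
for k in range(9):                       # first nine 60 Hz refreshes
    pos = k * SRC_FPS / OUT_HZ           # fractional source-frame position
    i, frac = int(pos), pos - int(pos)
    if frac == 0:
        schedule.append(label(i))        # refresh shows a pure source frame
    else:                                # refresh shows a weighted mix of two frames
        schedule.append(f"{1 - frac:.2f}*{label(i)} + {frac:.2f}*{label(i + 1)}")

print(schedule)
# ['A', '0.33*A + 0.67*B', '0.67*B + 0.33*C', 'C', '0.33*C + 0.67*D', ...]
# The cadence is now regular, but every blended refresh is an invented
# in-between image - and as the reply further down notes, this still isn't
# the same thing as stable frame pacing from the game's point of view.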

Well, they certainly had something running back then. Doubt it was much of a game, though. Yes, they probably could have made a great, massive game which adhered to those kinds of lofty standards, if they'd had the manpower to create the required assets for more than just a 10-minute snippet of gameplay. They'd then have had to be content with merely selling a hundred thousand copies to the select few who could actually run the damned thing. It's not like the game is light on the hardware as it is.
Excellent comment. The tech demo was just that, a tech demo, much like the many CGI trailers of the past that pulled the smoke and mirrors trick some companies are known for.

I still play most of my games "old school" style with vsync off (unless they're hitting a fairly regular >60fps, which is just about never in the games I play at the settings I play at). Vsync off was the accepted standard on PC for decades before Digital Foundry started pushing its holy grail target of 30 or 60fps. Honestly, I think it's overblown. A framerate that varies between 35 and 50 fps 98% of the time has very little tearing that I can see (it only gets bad for me at higher framerates) and feels a damn sight smoother than locking the game to 30fps with vsync - which I almost always try, because I really do want to work out what this holy grail is all about, but I've never been able to find it!
Agreed. The irony of it is that the Xbox One version has vsync engaged.

People are becoming far too dependent on Digital Foundry articles, and it's getting tiresome. Look at their wording in the article (thanks to @DrJay24 for pointing it out - I didn't know there was a new DF article).

The framerate isn't a problem at all, but it does affect their fancy graph, so it must be fixed. :???::???:

http://www.eurogamer.net/articles/digitalfoundry-2015-should-you-install-the-witcher-3-day-one-patch

The big positive point for patch 1.01 is in performance, though it's not entirely ideal. What we get on Xbox One is an uncapped frame-rate that varies between 30-40fps, with v-sync engaged to avoid tearing. The unfortunate side-effect of not capping this at a straight 30fps is that frame-pacing wanders up and down the graph, causing the perception of stutter.

The sentence at the end of the article is suitable for framing and hanging on your room's wall. :cry:

On top of that, if CDPR can implement a 30fps cap with consistent frame-pacing, all the better.

Why would CDPR have to do that? Because you say so? Get over yourselves, you aren't gods.
 
I assume they mean adaptive vs capped vsync? There are 3-4 people doing analysis and they would benefit from standardizing some of the analysis.
 
I don't think "locks it" means what you think it means. V-sync is off, nothing is locked (or capped, which is the correct term). It also seems the dynamic scaling is another PR term; the game sits at 900p for almost all the gameplay, according to DF. They say it jumps to 1080p for menus and cutscenes.
I hope they will use the dynamic scaling more regularly, like tagging certain areas (although the game is huge...) where there aren't many shadows or swaying trees - indoor areas and towns. I think the dynamic scaling is a last-minute addition, so it needs maturing.

On a different note, The Witcher 1 ran at 40-50, in some cases 70-80 fps on my laptop, and the framerate seemed smooth, if a bit jerky because of the uneven frame pacing. I removed effects so the game would run as smoothly as it could - I think I forced vsync in the HD3000 GPU drivers, but I don't quite recall; it was a long time ago.

PC version at Ultra settings running side by side with the PS4 version.

 
I assume they mean adaptive vs capped vsync? There are 3-4 people doing analysis and they would benefit from standardizing some of the analysis.
Your idea of a unified approach would be helpful for them. If you delegate an article to a staff member, it's better to have some guidelines, or to reach a consensus on the use of common terms, because one DF staff member might call the technique G-Sync if they are PC-centric, while another would call it adaptive vsync. It's not the easiest thing in the world, but it's worth trying.

The problem is that they always favour the same platforms - usually the XO version gets a lot of flak - and they're not seeing the forest for the trees.
 
Isn't there a way to make the framerate even without having to resort to 30 fps all the time, by using a mathematical algorithm? So at 40 fps it could be A A B+A C C D+C A+B B B B+C C C A+C D D D+E E E A+E A A B+C C C... well, you get the idea.

This is far from stable frame pacing... You basically need to adhere to double-buffered v-sync rules (or G-Sync and equivalents). Even so, there will be variability in input response/lag.
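For what it's worth, 'adhering to double-buffered v-sync rules' with a 30fps cap just means presenting a new frame on every second 60 Hz vsync, so each frame is held for exactly two refreshes. A toy Python loop sketching that (my own illustration, nothing to do with CDPR's actual engine):

import time

REFRESH = 1 / 60        # 60 Hz display
STEP = 2 * REFRESH      # present on every 2nd vsync -> a locked 30fps

start = time.perf_counter()
next_present = start

for i in range(10):
    # ... simulate + render the frame here; it must finish inside ~33.3 ms ...
    next_present += STEP
    delay = next_present - time.perf_counter()
    if delay > 0:
        time.sleep(delay)               # wait for "our" vsync, then flip
    print(f"frame {i} presented at {time.perf_counter() - start:.3f}s")
# Frames land ~33.3 ms apart: even pacing and predictable input lag, at the
# cost of never running faster than 30fps even when there's GPU headroom.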
 
This is far from stable frame pacing... You basically need to adhere to double-buffered v-sync rules (or G-Sync and equivalents). Even so, there will be variability in input response/lag.
Okay then, I guess G-Sync sounds like a good idea for most games. I wonder... at more than 30 fps and with some variability, is input response/lag still better than at 30 fps?

CD Projekt are probably using some AMD variation of G-Sync and found it was beneficial, because if a locked 30 fps framerate like in the first version was so important to them, then they would cap it.

Bethesda locked the framerate of Skyrim to 30 fps in the unforgettable Xbox 360 version of the game, but it wasn't vsynced; the PS3 version was vsynced, yet the X360 version was completely superior. It was less blurry, it ran at 30 fps without drops, and the negative effects of having vsync off were nearly invisible.
 
Has the PC version received some colour correction? Article from a German magazine:

http://www.pcgameshardware.de/The-Witcher-3-PC-237266/Specials/Grafikkarten-Benchmarks-1159196/

Original:
[image: md7OHAU.jpg]

PC day one patch:
[image: JE91msi.jpg]

Modded colour:
[image: N46KzSz.jpg]

Bonus: some captures from the PS4 version :oops: by @Globalisateur
[images: _6hepmu.jpg, _2y2r20.jpg, _7opq2b.jpg]
 