Digital Foundry Article Technical Discussion Archive [2014]

Curious to know if the resolution on Xbox One is controlled by the level designer or something, instead of being engine driven. They could flag non-combat sections of levels to use higher resolution. It would explain why multiplayer is fixed at the lower resolution.

Put it this way, if it were engine driven you'd think that the resolution would scale up if you walked up to a wall where you could not see any combat going on, or if you went into a corridor in a multiplayer map where there was no action.
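
For what it's worth, a genuinely engine-driven scaler would normally key off recent GPU frame times rather than any level markup. A minimal sketch of that kind of feedback loop, with entirely hypothetical names and thresholds (nothing here is from the actual COD engine):

Code:
# Hypothetical engine-driven dynamic resolution controller.
# All names, thresholds and scale steps are made up for illustration.

TARGET_FRAME_MS = 16.6            # 60fps budget
MIN_SCALE, MAX_SCALE = 0.75, 1.0  # e.g. ~1440 wide up to the full 1920 wide

class DynamicResController:
    def __init__(self):
        self.scale = MAX_SCALE

    def update(self, gpu_frame_ms):
        # Close to or over budget: drop render scale before the frame-rate dips.
        if gpu_frame_ms > TARGET_FRAME_MS * 0.95:
            self.scale = max(MIN_SCALE, self.scale - 0.05)
        # Comfortably under budget: creep back toward native resolution.
        elif gpu_frame_ms < TARGET_FRAME_MS * 0.80:
            self.scale = min(MAX_SCALE, self.scale + 0.01)
        return self.scale

Under a scheme like that, staring at a wall in a combat area should push frame times down and the resolution back up, which is exactly the behaviour that doesn't seem to happen here.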
 
Digital Foundry said:
But what of PC performance? Right away, we're pleased to see Sledgehammer Games factoring in a range of setups. For example, at the top-end, a Core i7 3770K system with 16GB of RAM matched with a £350 GTX 780 Ti is capable of 60fps in both campaign and multiplayer at max settings. This is with 2x SSAA and SMAA T2X enabled, though adding 4x SSAA knocks the read-out down to the 35-45fps range for campaign cut-scenes. In this case, actual gameplay (such as the later encounter with a drone swarm) tends to run at between 45-60fps.

PS4 and PC are remarkably similar in campaign performance, albeit with the PC running higher AA sampling.

One still has to wonder why PS4 users weren't given the option of adaptive v-sync. If they had been, campaign frame-rates would no doubt have closely matched the XB1's. Then again, PS4 users would probably have shunned the option in order to avoid screen tearing.
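
For anyone unfamiliar, adaptive v-sync just means syncing only when a frame is ready within the refresh window, and tearing instead of stalling when it isn't. Roughly, as illustrative logic only:

Code:
# Rough per-frame logic of adaptive v-sync (illustrative only).
REFRESH_MS = 16.6  # one refresh interval at 60Hz

def present(frame_time_ms):
    if frame_time_ms <= REFRESH_MS:
        return "wait for vblank"   # synced: no tearing, capped at 60fps
    return "present immediately"   # late frame: tear rather than stall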
 
Curious to know if the resolution on Xbox One is controlled by the level designer or something, instead of being engine driven. They could flag non-combat sections of levels to use higher resolution. It would explain why multiplayer is fixed at the lower resolution.

Put it this way, if it were engine driven you'd think that the resolution would scale up if you walked up to a wall where you could not see any combat going on, or if you went into a corridor in a multiplayer map where there was no action.

I've had that thought since it was revealed that the game has a dynamic resolution but only in certain scenes.
 
Curious to know if the resolution on Xbox One is controlled by the level designer or something, instead of being engine driven. They could flag non-combat sections of levels to use higher resolution. It would explain why multiplayer is fixed at the lower resolution.

Something like trigger volumes to change engine settings? Might be better to have it done automagically within a campaign level, as even non-combat sections can still be filled with effects for cinematic purposes, hypothetically speaking.

Scene/level-dependent settings are not particularly new, though. The easiest examples are letterboxed real-time cut-scenes with different engine settings. Gears Judgment even had a couple of Survival/Overrun maps with FXAA disabled.
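
As a rough illustration of the trigger-volume idea: a designer-placed volume could carry a resolution flag that the renderer checks against the camera position. Purely hypothetical names, not any shipping engine's API:

Code:
# Hypothetical designer-placed volume that flags a region of a level
# for a given render resolution. All numbers are only illustrative.
from dataclasses import dataclass

@dataclass
class ResolutionVolume:
    bounds: tuple       # (min_xyz, max_xyz) axis-aligned box in world space
    render_width: int   # horizontal render resolution to use inside it

    def contains(self, pos):
        lo, hi = self.bounds
        return all(l <= p <= h for l, p, h in zip(lo, pos, hi))

# The quiet intro corridor gets flagged for full 1080p; everywhere else
# falls back to a sub-native default.
volumes = [ResolutionVolume(((0, 0, 0), (50, 10, 50)), 1920)]
DEFAULT_WIDTH = 1360

def render_width_for(camera_pos):
    for v in volumes:
        if v.contains(camera_pos):
            return v.render_width
    return DEFAULT_WIDTH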

Put it this way, if it were engine driven you'd think that the resolution would scale up if you walked up to a wall where you could not see any combat going on, or if you went into a corridor in a multiplayer map where there was no action.
They're possibly just prioritizing image stability, as your eyes adjust to the inherent upscale blur, while being somewhat conservative with framerate. Not too sure how much it takes to just trigger the resolution drop in campaign.
 
Something like trigger volumes to change engine settings? Might be better to have it done automagically within a campaign level, as even non-combat sections can still be filled with effects for cinematic purposes, hypothetically speaking.

Scene/level-dependent settings are not particularly new, though. The easiest examples are letterboxed real-time cut-scenes with different engine settings. Gears Judgment even had a couple of Survival/Overrun maps with FXAA disabled.

I imagine COD, like most games, uses some kind of streaming level setup. Maybe each "level" is flagged for a particular resolution.
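
If so, the flag could be as simple as a per-level entry in the streaming metadata, consulted once at load. Entirely made-up names again, just to show the shape of it:

Code:
# Hypothetical per-level resolution flags read at level load.
LEVEL_RENDER_WIDTH = {
    "sp_intro":   1920,  # quiet opening, full 1080p
    "sp_assault": 1360,  # heavy combat, sub-native width
    "mp_any":     1360,  # multiplayer fixed at the lower figure
}

def resolution_for_level(name):
    return (LEVEL_RENDER_WIDTH.get(name, 1360), 1080)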

They're possibly just prioritizing image stability, as your eyes adjust to the inherent upscale blur, while being somewhat conservative with framerate. Not too sure how much it takes to just trigger the resolution drop in campaign.

I'd be curious to know how the resolution changes across the campaign. It should be fairly easy to find a "dead" corner of a level to see if it can trigger a change in resolution. Rage would change resolutions on the fly very quickly; COD seems more like each area or level is given a particular resolution versus the next.
 
They're possibly just prioritizing image stability, as your eyes adjust to the inherent upscale blur, while being somewhat conservative with framerate. Not too sure how much it takes to just trigger the resolution drop in campaign.

According to the early DF articles, it basically dropped resolution in every combat scenario.
 
According to the early DF articles, it basically dropped resolution in every combat scenario.

It's like Sledgehammer wanted to avoid the Internet noise that the XB1 edition wasn't true 1080p, by making sure all the non-battle scenes were rendered at native 1080p. Smart move...

/tinfoil hat off.
 
I don't think you can adjust the gamma curve without tinkering with the TV's service mode settings (which isn't something you should be doing unless you have in-depth knowledge of color calibration, the internal workings of your TV, and the proper tools). No idea what kind of Samsung TV you have, but my brother and I both have Samsung plasmas which were calibrated damn near perfectly out of the box.

I have gamma adjustment options on my Samsung plasma; they're in the detailed settings.
I don't think my plasma supports full range, though, because when I set my X1 to full I can't see the sun in the X1 TV calibration screen. My plasma is an older model, though.
 
Most mid-range to high-end (and some low-end) TVs have a gamma control in the user menu. Some gamma controls are more limited than others: some only offer a gamma slider, which is basically a set of presets, while more high-end displays offer a full 10- or 20-point gamma control from 0-100% stimulus.
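
For reference, an N-point gamma control just lets you match the panel's measured output to the target curve at each stimulus step. For a gamma 2.2 target, the reference values at 10% steps work out as below (a quick sketch of the math, nothing display-specific):

Code:
# Target relative luminance for a gamma-2.2 curve at 10% stimulus steps,
# i.e. the points a 10-point gamma control lets you calibrate against.
GAMMA = 2.2

for stim in range(0, 101, 10):
    lum = (stim / 100) ** GAMMA
    print(f"{stim:3d}% stimulus -> {lum:.4f} relative luminance")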
 
Don't believe that. Always use full when possible!

If I remember correctly, imagine it like this. The game's graphics eventually end up in GPU memory as something like RGBA, with each channel having a range of 0-255. If you output limited range, those values need to be compressed into the 16-235 encoding range. So here, information is compressed: the number of combinations in 219^3 is lower than in 256^3. The TV then receives three 16-235 values but can display 0-255, or as is often the case these days, more than that (Wide Color Gamut etc.). So the information is expanded back to 0-255, and although less information is lost in that step, some still is, because the ranges aren't a clean multiple of each other (it's not, say, an exact x2 translation).

Then there's Deep Color support, which allows color information to be output at higher precision than 0-255 in the first place, like 12 bits per channel, and TVs can accept that too. But if you choose limited range, you not only disable that, you effectively drop below 8-bit precision.

I'm sure I'm not 100% correct, but close enough I hope. I tested it on mine, and even on my older TV I could see more color banding and less-deep blacks.

Minor correction: 16-235 should have 220 values

255 - 0 + 1 = 256
235 - 16 + 1 = 220
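
To make the loss concrete, here's a quick round-trip of the standard limited-range scaling (the exact rounding a given console and TV use may differ, so treat this as a sketch):

Code:
# Round-trip 8-bit full range (0-255) through limited range (16-235)
# to show the quantization loss described above.

def full_to_limited(v):  # compress: 256 codes squeezed into 220
    return round(16 + v * 219 / 255)

def limited_to_full(v):  # expand back out on the TV side
    return min(255, max(0, round((v - 16) * 255 / 219)))

survivors = {limited_to_full(full_to_limited(v)) for v in range(256)}
print(len(survivors))  # 220 distinct values survive, not 256

So per channel, 256 input codes collapse onto 220 after the trip, which is where the extra banding on gradients comes from.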
 
It's like Sledgehammer wanted to avoid the Internet noise that the XB1 edition wasn't true 1080p, by making sure all the non-battle scenes were rendered at native 1080p. Smart move...

/tinfoil hat off.

It's more likely that in combat situations you are less likely to sit still and stare at a character or wall or other bit of scenery long enough to notice the drop in resolution. Whereas in non-combat situations people are far more likely to take their time to really look at the graphics, a situation where full 1080p would pay off.

Of course, the drop in resolution will also be noticeable in screenshots of combat or potentially video of combat where you can sit and scrutinize. But in a situation where you are trying to avoid getting your butt shot off? Like actually playing the game? Not so much.

And the other situation where it would be an issue is sniping distant targets. Not sure how often that comes up, but in the limited time I've spent with the game, there hasn't been any distance sniping in the SP yet. CQC won't show the potential drawbacks nearly as much.

Regards,
SB
 
Digital Foundry article up on Eurogamer about Evolve's use of CryEngine 4, the first showing of the engine in a cross-platform title. Impressions taken from the current beta.

Digital Foundry said:
The good news is that from a rendering perspective at least, Evolve demonstrates that CryEngine has got what it takes to compete as a state-of-the-art multi-platform engine
 
They should show it fixed with the fully loaded version.

Looks more like a filtering problem than LOD; otherwise the near-distance texture wouldn't be identical either.
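
That reasoning holds up against how mip selection works: the mip level is picked from the screen-space texel footprint, so a genuine LOD problem (say, a global LOD bias or missing top mips) would soften nearby surfaces too, whereas a filtering problem mostly shows up at distance and glancing angles. A simplified sketch of the selection math:

Code:
# Simplified mip LOD selection: pick the level where one texel
# covers roughly one pixel on screen.
import math

def mip_level(texels_per_pixel, lod_bias=0.0):
    return max(0.0, math.log2(texels_per_pixel) + lod_bias)

print(mip_level(1.0))                # 0.0 -> full-res mip up close
print(mip_level(8.0))                # 3.0 -> 1/8th-res mip at distance
print(mip_level(1.0, lod_bias=1.0))  # 1.0 -> a real LOD problem would
                                     #        blur near surfaces too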

 