Digital Foundry Article Technical Discussion [2018]

Status
Not open for further replies.
It could be something as simple as a shrink of the XBO-X, which isn't so much a shrink as an Xbox with a similar level of performance but upgraded internals. I.e., would a shrink of the XBO-X SOC be cheaper than a new SOC with similar performance, based on whatever architecture the next "gen" console uses?

So for example, if we assume the "next gen" Xbox console is based on Zen/Navi, which would be cheaper for a "base"/mainstream Xbox?
  • Paying AMD to shrink the XBO-X SOC to 7 nm.
  • Continuing to use the XBO-X SOC as is.
  • Paying AMD to design a cut-down SOC with performance similar to the XBO-X, based on the "next gen" Xbox SOC.
    • Basically similar to, say, Polaris 10 versus Polaris 11.
    • So fewer "units" combined with lower clocks.
    • Either smaller than the full "next gen" SOC (cheaper manufacturing) or just the "next gen" SOC with units disabled (increasing yield by salvaging more usable dies).
There's a lot to say for something like that in a "rolling generations" scenario.
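To make the yield trade-off concrete, here's a rough sketch using the classic Poisson defect-density yield model. Every number here (wafer cost, die areas, defect density) is a hypothetical illustration, not a real 7 nm figure:

```python
# Rough die-cost comparison: why disabling units on a big die can compete
# with fabbing a separate small die. All numbers are hypothetical.
import math

WAFER_COST = 9000.0              # $ per 300mm wafer (hypothetical)
WAFER_AREA = math.pi * 150**2    # mm^2, ignoring edge loss for simplicity
D0 = 0.2                         # defects per cm^2 (hypothetical)

def poisson_yield(die_area_mm2, defect_density_per_cm2=D0):
    """Simple Poisson yield model: Y = exp(-A * D0)."""
    return math.exp(-(die_area_mm2 / 100.0) * defect_density_per_cm2)

def cost_per_good_die(die_area_mm2):
    dies_per_wafer = WAFER_AREA / die_area_mm2   # crude, ignores scribe lines
    good_dies = dies_per_wafer * poisson_yield(die_area_mm2)
    return WAFER_COST / good_dies

big, small = 360.0, 180.0  # mm^2, hypothetical "full" vs "cut-down" SOC
print(f"full die:  ${cost_per_good_die(big):.2f} per good die")
print(f"small die: ${cost_per_good_die(small):.2f} per good die")
# Salvaging partially defective full dies (units disabled) raises the
# effective yield of the big die, narrowing the gap without paying for a
# second mask set and design effort.
```

The sketch only captures manufacturing cost per good die; the list above is really weighing that against one-off design/NRE costs, which the model leaves out.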

Regards,
SB
 
Motion blur really only makes sense outside of cutscenes if you have eye tracking. Things are only blurred if they are moving relative to your eyes. If a car goes past the frame, it's only blurred if my eyes aren't following it. (The shape of the motion blur on the wheels also changes depending on whether my eyes are following the car. If my eyes follow the car, the wheels are blurred in a circular pattern around the axle, while if my eyes are stationary relative to the ground, the part of the wheel near the ground receives little blur, while the part farthest from the ground receives more.) If I focus my eyes on something in my periphery, then turn my head to bring it into the center of my field of view, the object does not become blurry while I do this. All motion blur implementations assume the player's eyes are focused on a fixed point on screen, and they all get this wrong.

The only acceptable places to use motion blur during gameplay are for things that move too fast or abruptly for eyes to follow, and maybe a little bit on the very edge of the screen to emphasize speed.
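The eye-relative rule described above can be sketched directly: treat the blur smear as the object's screen-space velocity minus the gaze velocity. This is a toy model with hypothetical values; a real gaze velocity would come from eye-tracking hardware:

```python
# Sketch: motion blur should be relative to the viewer's gaze, not the screen.
# blur vector = (object screen velocity - gaze velocity) * exposure time.
# Units are pixels per frame; all values hypothetical.

def blur_vector(obj_velocity, gaze_velocity, exposure=1.0):
    """Smear length/direction as seen by an eye moving at gaze_velocity."""
    return ((obj_velocity[0] - gaze_velocity[0]) * exposure,
            (obj_velocity[1] - gaze_velocity[1]) * exposure)

car = (30.0, 0.0)     # car moving right at 30 px/frame
ground = (0.0, 0.0)   # static background

# Eyes fixed on a point on screen: the car smears, the ground doesn't.
print(blur_vector(car, gaze_velocity=(0.0, 0.0)))     # -> (30.0, 0.0)
print(blur_vector(ground, gaze_velocity=(0.0, 0.0)))  # -> (0.0, 0.0)

# Eyes tracking the car: the car is sharp, the ground smears the other way.
print(blur_vector(car, gaze_velocity=car))            # -> (0.0, 0.0)
print(blur_vector(ground, gaze_velocity=car))         # -> (-30.0, 0.0)
```

Games without eye tracking effectively hard-code `gaze_velocity = (0, 0)` (or camera-relative motion), which is exactly the "fixed point on screen" assumption being criticised.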
 
https://www.eurogamer.net/articles/digitalfoundry-2018-hands-on-with-battlefield-5s-closed-alpha

Hands-on with Battlefield 5: how the small things matter in this massive-scale shooter
Frostbite evolved.

Players of the recent Battlefield 5 alpha have been witness to quite a treat. Building on DICE's excellent work in BF1 and Battlefront 2, we're looking at an exceptionally handsome game that, small bugs aside, almost feels like the finished article. It's visually outstanding in fact, the only disappointment - if you can call it that - being that the signs are pointing towards an evolution of the Battlefield formula and its Frostbite engine, as opposed to a full-on next-gen revolution.


Overall though, if this is the state of the PC game three months out from release, I'd say it's in fine shape. The game looks stunning in motion and the micro-level enhancements to visual fidelity are beautiful bearing in mind that Battlefield is fundamentally built on the concept of its vastness. Nothing has been lost in its sometimes-insane action, and the enhanced destruction model can only add further spice to the gameplay. The implications here for the mooted Battle Royale mode are also mouthwatering - titles like PUBG have the scale, but fall short in terms of localised detail, not to mention physics fidelity - areas where BF5 clearly excels. On a more general note, it'll be interesting to see when DICE decides to push the Battlefield model on to the next level, but that may well require a beefed-up next-gen console baseline to make that happen. In the here and now, the closed alpha PC game seems to suggest an iterative - but substantial - upgrade to an already impressive multiplayer experience.
 
There's something I find really odd-looking about Battlefield V. It's almost too colourful? Looks kind of cartoony to me. Not sure if that's the lighting, the materials, or just the colour choices.
 
To further elaborate, realtime graphics are so far from interactive photorealism that there is no point in going all out trying to achieve it. A consistent visual direction is far more important IMO than "realism", because in the end pushing for absolute realism just means a game's visuals age faster once a new rendering engine or tech is released. There is so much more that can be done with the limited compute power in terms of gameplay than the never-ending horizon of photorealism. AI behavior/pathfinding, collision detection, complex world-state bookkeeping, etc.: these are the things I look forward to more than graphics for the next-gen consoles. Until we reach interactive graphics that are indistinguishable from real life, many graphic art styles can age like milk, but great gameplay can last forever.
 
Motion blur really only makes sense outside of cutscenes if you have eye tracking. Things are only blurred if they are moving relative to your eyes. If a car goes past the frame, it's only blurred if my eyes aren't following it. (The shape of the motion blur on the wheels also changes depending on whether my eyes are following the car. If my eyes follow the car, the wheels are blurred in a circular pattern around the axle, while if my eyes are stationary relative to the ground, the part of the wheel near the ground receives little blur, while the part farthest from the ground receives more.) If I focus my eyes on something in my periphery, then turn my head to bring it into the center of my field of view, the object does not become blurry while I do this. All motion blur implementations assume the player's eyes are focused on a fixed point on screen, and they all get this wrong.

The only acceptable places to use motion blur during gameplay are for things that move too fast or abruptly for eyes to follow, and maybe a little bit on the very edge of the screen to emphasize speed.
This would be true for extremely low-persistence monitors, but on actual consumer monitors a single frame of animation stays on screen for many milliseconds, so, with or without eye tracking, your natural ability to "follow" a moving object with your eyes is already hindered. It will feel juddery unless each frame flashes only very briefly. So there is still a point in using motion blur as a form of motion anti-aliasing, despite the whole eye-movement rhetoric.
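The sample-and-hold effect being described reduces to simple arithmetic: a smoothly tracking eye smears each held frame across the retina for the full hold time. A back-of-envelope sketch with hypothetical speeds:

```python
# Back-of-envelope: on a sample-and-hold display, perceived smear is roughly
# tracking speed (px/s) times the frame hold time (s).
def sample_and_hold_blur(speed_px_per_s, refresh_hz):
    """Approximate retinal smear width in pixels for a tracked object."""
    return speed_px_per_s * (1.0 / refresh_hz)

# Hypothetical: an object crossing a 1920px-wide screen in one second.
speed = 1920.0
print(sample_and_hold_blur(speed, 60))   # -> 32.0 px of smear at 60 Hz
print(sample_and_hold_blur(speed, 144))  # ~13.3 px at 144 Hz
```

This is why the eye-tracking objection above doesn't fully apply on hold-type displays: even a perfectly tracked object is already blurred by the display itself, and rendered motion blur can trade that judder for smoother (if still imperfect) smear.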
 
To further elaborate, realtime graphics are so far from interactive photorealism that there is no point in going all out trying to achieve it. A consistent visual direction is far more important IMO than "realism", because in the end pushing for absolute realism just means a game's visuals age faster once a new rendering engine or tech is released. There is so much more that can be done with the limited compute power in terms of gameplay than the never-ending horizon of photorealism. AI behavior/pathfinding, collision detection, complex world-state bookkeeping, etc.: these are the things I look forward to more than graphics for the next-gen consoles. Until we reach interactive graphics that are indistinguishable from real life, many graphic art styles can age like milk, but great gameplay can last forever.
AAA games are already in the uncanny valley. Not even high-end CGI has managed to get out of it. Pursuing photorealism at this point is a fool's errand.
 
Yup, there's a reason that Wind Waker still looks spectacular.

Personally, I prefer stylised over photorealistic, and rendering limitations mean we're stuck with the former anyway.

It's strange though, how the bar keeps moving. When I was in my mid-teens, playing MGS3, there were times when I was tricked by how real The Boss looked. I remember thinking, "next generation, we'll finally be there."

Well, something like 15 years have passed, and we're still no nearer. Albeit with vastly greater resolution and fidelity.

Of course, part of this will be because I was a stupid teenager :p
 
Not really wowed by BFV MP honestly; aside from some nice terrain tessellation and cool GPU particles, the rest just doesn't stack up. The game environment is not as densely packed as I'd hoped, explosions are low-res, water is sub-par, the character models still look terrible, and the lighting seems off for some reason. Hope single-player ramps things up considerably.
 
This would be true for extremely low-persistence monitors, but on actual consumer monitors a single frame of animation stays on screen for many milliseconds, so, with or without eye tracking, your natural ability to "follow" a moving object with your eyes is already hindered. It will feel juddery unless each frame flashes only very briefly. So there is still a point in using motion blur as a form of motion anti-aliasing, despite the whole eye-movement rhetoric.
The eye tracking moving objects isn't rhetoric; that's how our visual system works.
Frame rates and displays have grave limitations, so using band-aids to alleviate them makes sense. The problem with motion blur is that without knowing what the viewer is looking at, it will be perceived as wrong, causing irritation or even breaking immersion completely, as in giving the impression that your eye is malfunctioning rather than reading as a limitation of the rendering tech.

In that sense it is similar to DOF effects, where not being able to focus wherever you want in the scene is bizarre. (Actually, DOF in games is just wrong on many levels, not least because a normal eye focused at a couple of meters actually perceives everything as sharp even without refocusing. Our FOV and pupil aperture take care of that.)

People simply find this immersion-breaking as it conflicts with how vision works in general.

There is a real problem that motion blur is trying to address, but I'd contend that I'd rather see band-aids that don't introduce jarring artifacts of their own. And of course, what we really should be doing is attacking the problem at its source.
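The DOF point can be illustrated with the thin-lens circle-of-confusion formula that game DOF passes approximate. The eye is not really a thin lens, so the "eye-like" numbers below (focal length, effective f-number) are rough hypothetical stand-ins purely to show the scale difference:

```python
# Thin-lens circle of confusion: how defocused a point at distance d appears
# when the lens is focused at distance s. Distances in mm; all hypothetical.
def circle_of_confusion(d, s, f, N):
    """CoC diameter (mm) for a point at d, focus at s, focal length f, f-number N."""
    A = f / N  # aperture diameter
    return abs(A * f * (d - s) / (d * (s - f)))

# A wide-open "cinematic" lens: 50mm f/1.8 focused at 2m smears a 10m background.
print(circle_of_confusion(d=10000, s=2000, f=50, N=1.8))   # ~0.57 mm blur spot
# Rough eye-like parameters: short focal length, modest aperture. The blur
# spot is an order of magnitude smaller, i.e. effectively sharp, which is why
# camera-style DOF feels wrong compared to normal vision.
print(circle_of_confusion(d=10000, s=2000, f=17, N=5.7))   # ~0.02 mm blur spot
```

In other words, heavy gameplay DOF simulates a fast camera lens, not an eye, and unlike a real eye it refuses to refocus where you look, which is the complaint above.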
 
Crash Bandicoot's Xbox, PC and Switch ports tested
And we look at the new PS4 patch too.

https://www.eurogamer.net/articles/digitalfoundry-2018-crash-bandicoot-nsane-trilogy-face-off

The Xbox One side of the situation can be covered very quickly. Playing the game on the base S model offers up an experience that is virtually identical to the standard PlayStation 4 game. The visual feature set is identical and resolution is the same at 1080p, with only the most minor fluctuations in performance setting it apart from its Sony counterpart. Put simply, base Xbox users can go in safe in the knowledge that they're getting an excellent experience - and that only ramps up on Xbox One X, where Crash retains its solid 30fps performance but ramps up the pixel count to a full 4K. That's an impressive 2.25x increase over the 1440p of PS4 Pro.


There are a couple of takeaways here. First of all, Vicarious has delivered a lean PC port here: the options are thin on the ground (there's no facility to run the game above 60Hz and there's no ultrawide support) and most of the scalability only comes through adjusting resolution, but regardless, overall system requirements are low enough to ensure that you should get a good experience on a range of hardware. Secondly, CPU utilisation is absurdly low to the point where a 60fps option - for the enhanced consoles, at least - should be viable. Based on our tests, GPU power is the primary limiting factor, but 1080p60 should be doable for Pro and X and we hope it's something that Vicarious considers for the future.

And there's certainly evidence that the developer listens to feedback. Loading times were my major issue with the original release on PlayStation 4 and Pro, taking forever to move between stages, impacting the flow of the game. This is fixed on Xbox One, with loading times so fast, it almost feels like you don't really need the loading screens. Switch is the next fastest, running a second or so faster than PS4 Pro, which is still massively improved compared to its initial showing - we're talking about frequent 13-16 second delays reduced down to five to six seconds with the new update.
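The 2.25x figure quoted above checks out with a quick pixel-count comparison:

```python
# Sanity check on the quoted 2.25x figure: full 4K (UHD) vs 1440p pixel counts.
uhd = 3840 * 2160   # 8,294,400 pixels
qhd = 2560 * 1440   # 3,686,400 pixels
print(uhd / qhd)    # -> 2.25
```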
 
^ I thought the exact same as John: there doesn't appear to be a reason for Crash not to run at 60fps on the X and Pro. Perhaps they can patch in the option?

CPU utilization in some 1080 Ti videos with the 8700K was about 10% or less, and that CPU isn't even 10x the PS4's CPU.

Switch holds up very well visually; latency appears to be the only significant issue. I'll probably skip the Switch version just because of performance.
 