Digital Foundry Article Technical Discussion [2018]

Status
Not open for further replies.
Man, Red Dead looks way more scripted and movie-like than the first one. I'll pass, especially since it needs a 55GB download on top of the disc. Not too impressed with the graphics either, tbh.
 
Well, it's the next generation for Rockstar. Their last open world game was GTA V for PS360.
Sure but I guess it's all relative, for people like me it's borderline trolling because I thought we should use "current gen" to describe what's to be released on current gen consoles.
 
It is relative. Nothing about the 'current gen' adjective specifies hardware as the noun being described. It's next-gen software running next-gen algorithms etc., as opposed to previous generation solutions reworked and reoptimised for a new platform.
 
Sure but I guess it's all relative, for people like me it's borderline trolling because I thought we should use "current gen" to describe what's to be released on current gen consoles.
Not sure if it would be borderline trolling, but Shifty is right that this is their first official release for this gen.
It's considerably better looking than the games that have arrived up to now, and I'm expecting, for the most part, that early next-gen games will look like this, perhaps not even matching the production values.
 
" How the hell do you see me in the middle of the grass?", and got this simple reply: "What grass?".
Yeap, you guessed, on lower detail there was no grass, and they had a player completely uncovered and exposed moving around. Time to lower detail!
These are the kind of things that cannot happen. And for this you must always have in mind that the specs of a lower machine cannot give you neither advantages, neither disadvantages, in terms of gameplay.

That's a huge problem with a lot of games. War Thunder comes to mind for me.
 

Funny enough, the technical gap seems larger between the PS4/XBS than between the Pro/X.

The game looks far worse than Spider-Man in my opinion. It is certainly more ambitious in many aspects, but the overall package is clearly less polished.
 
Funny enough, the technical gap seems larger between the PS4/XBS than between the Pro/X.
Not completely unexpected that PS4's lowest dynamic resolution is XBS's highest delivered dynamic resolution given the relative performance delta between the two machines. Wasn't this also true for Origins? It's also surely not that unexpected at this point that the similar relative performance delta between Pro and X is less noticeable at higher resolutions, given you're well into the diminishing-returns zone for this engine.

It's a very pleasing-looking game though, I'm playing on Pro having recently finished Spider-Man. In no way does this feel like any kind of graphical step down. :nope:
 
It's also surely not that unexpected at this point that the similar relative performance delta between Pro and X is less noticeable at higher resolutions given you're well into the diminishing return zone for this engine.

I don't know if it is less noticeable visually because the X version is quite sharp, but according to DF the resolution is on average 44% higher on X than on the Pro. The resolution gap seems higher between the XBS and the PS4. According to VGtech, the PS4 is the console that spends most of its time at its peak resolution: "PS4 Pro, Xbox One and Xbox One X seem to usually render natively below their maximum resolutions, but during relatively static scenes they can often reach their maximum pixel counts due to the temporal reconstruction. The base PS4 renders natively at its maximum resolution more often than the other consoles, but the native resolution can see sustained drops below 1920x1080."

Edit: actually it's the XB1 and not the XBS in the DF analysis.
 
The actual article is now out to go along with the earlier video … https://www.eurogamer.net/articles/...reed-odyssey-best-played-on-enhanced-consoles

Why Assassin's Creed Odyssey is best played on PS4 Pro and Xbox One X
Smoother, cleaner, faster.

Assassin's Creed returns once again with the excellent Odyssey, built upon the same technological revamp that successfully powered last year's Origins. By and large, it's a successful multi-platform deployment across consoles and PC, but similar to the last offering, it's best played on the enhanced '4K' consoles. There's an almost majestic scale and scope to this new title across all systems, but it's PS4 Pro and Xbox One X that deliver a quantifiably smoother, more consistent experience over base consoles.

Technologically, Odyssey follows Origins in adjusting rendering resolution according to load, improving image quality using a variant of the temporal anti-aliasing technology pioneered in the remarkable For Honor. So yes, if we look at the raw numbers, there is a clear resolution boost as we scale the console power ladder - base Xbox console at the bottom, followed by a 1080p-centric PS4, before we move on to Pro and Xbox One X at the top end of the scale. However, the traditional way we perceive resolution - edge jaggies, pixel-popping, etc - is circumvented via the use of TAA.
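The temporal accumulation idea behind this kind of TAA can be sketched very simply. This is purely illustrative and not Ubisoft's actual implementation: each frame samples the scene with a slightly different sub-pixel jitter, and blending each new sample into a running history converges on the true edge coverage, which is why jaggies and pixel-popping are suppressed even below native resolution.

```python
# Toy sketch of temporal accumulation (the idea behind TAA-style
# reconstruction), NOT the engine's actual implementation. A hard edge
# is sampled each frame with a different sub-pixel jitter; blending each
# new sample into a running history converges on the true coverage.

def sample_edge(x):
    """'Scene': a vertical edge at x = 0.5 (1.0 to the left, 0.0 to the right)."""
    return 1.0 if x < 0.5 else 0.0

def taa_accumulate(pixel_center, jitters, alpha=0.1):
    history = sample_edge(pixel_center + jitters[0])
    for j in jitters[1:]:
        current = sample_edge(pixel_center + j)
        # Exponential blend: mostly history, a little of the new frame.
        history = (1.0 - alpha) * history + alpha * current
    return history

# Pixel centred exactly on the edge; jitter alternates +/- a quarter pixel.
jitters = [0.25 if i % 2 else -0.25 for i in range(200)]
result = taa_accumulate(0.5, jitters)
print(round(result, 2))  # settles near 0.5 coverage (oscillating slightly)
```

A single jittered frame would flicker between full and zero coverage at that pixel; the history blend is what turns those alternating samples into a stable, anti-aliased value.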

The end result is that more GPU power simply translates into more clarity. It's an elegant solution that gives the developers the freedom to more easily deliver their vision without being limited by the host platform. If the scene is more complex, resolution drops, but the end result still looks fairly consistent, and even platform comparisons hold up fairly well in motion. Think of each system as having a specific resolution window designed - in theory - to keep the game running smoothly at 30fps and this effectively sums up how both Origins and Odyssey work. However, the implementation varies fairly dramatically.
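The "resolution window" idea above amounts to a feedback loop: watch how long the GPU took last frame, and grow or shrink the render target inside fixed bounds to hold 30fps. Here's a toy controller in that spirit; the window bounds, gain and frame times are made-up numbers for illustration, not anything measured from the game.

```python
# Toy dynamic-resolution controller: shrink the render target when the
# GPU overruns the frame budget, grow it back when there's headroom.
# All constants here are illustrative, not measured from the game.

FRAME_BUDGET_MS = 33.3           # 30 fps target
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # per-axis window, clamped

def next_scale(scale, last_gpu_ms, gain=0.5):
    # Proportional control: error is the fraction of budget left over
    # (negative if the last frame overran).
    error = (FRAME_BUDGET_MS - last_gpu_ms) / FRAME_BUDGET_MS
    scale += gain * error * scale
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for gpu_ms in [30.0, 38.0, 41.0, 36.0, 31.0, 29.0]:  # hypothetical frame times
    scale = next_scale(scale, gpu_ms)
    print(f"{gpu_ms:5.1f} ms -> render at {int(3840 * scale)}x{int(2160 * scale)}")
```

Real engines are more sophisticated (predicting load, quantising to step sizes the upscaler likes), but the shape is the same: a bounded scale factor chasing a fixed frame budget.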


With Xbox One X, resolution scales in a window broadly between 2944x1656 all the way up to full 4K - 3840x2160. That's a fairly wide range of potential values then - around 60 per cent of native 4K resolution up to 100 per cent. That range is much more constricted on PS4 Pro - values of 2227x1242 at the lower end to 2816x1584 at the top deliver 33 per cent to 54 per cent of a native 4K framebuffer. But the point is that the delta between lowest and highest measurements on both systems is in the 60 to 70 per cent range - that's a wide range of real estate Ubisoft can work with in maintaining performance.
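The percentages quoted above are pixel-count ratios against native 4K, which is quick to verify:

```python
# Check the article's figures: each measured framebuffer as a
# percentage of a native 3840x2160 pixel count.

def pct_of_4k(w, h):
    return round(100 * (w * h) / (3840 * 2160))

print(pct_of_4k(2944, 1656))  # Xbox One X low end -> 59 ("around 60 per cent")
print(pct_of_4k(2227, 1242))  # PS4 Pro low end    -> 33
print(pct_of_4k(2816, 1584))  # PS4 Pro top end    -> 54
```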

...
 
I don't know if it is less noticeable visually because the X version is quite sharp, but according to DF the resolution gap is 44% higher on X compared to the Pro on average. The resolution gap seems higher between the XBS and the PS4.

I'm not disagreeing and if this resolution difference was 15" from your face as might be experienced on a monitor in front of somebody at a keyboard and mouse, that could be quite noticeable, but 8-10ft from a TV? Probably not so much for the average person.

A material difference in actual gameplay terms is how quickly you switch back from Eagle vision when you've strayed from your character. On Pro this can be as long as five seconds, which feels like quite a while to be staring at a black screen mid-gameplay, and I know base PS4 and Xbox One are far worse.
 
A material difference in actual gameplay terms is how quickly you switch back from Eagle vision when you've strayed from your character. On Pro this can be as long as five seconds, which feels like quite a while to be staring at a black screen mid-gameplay, and I know base PS4 and Xbox One are far worse.

This is why I always fly back to my character.
 
It's probably a bit of everything. Modern game environments consume a ton of increasingly complex, more detailed non-static geometry with increasingly more unique textures. That's a ton of memory right there. As I said above, something has to give. There are answers to this (lower-resolution textures, more shared textures, smaller environments, less geometry, contrived environments revealing less of the distance), but I'm guessing that isn't what Shortbread craves.

Just for information on culling. It doesn't answer any memory question directly but it is easy to read and a good summary for those who don't know this.

"Z-Buffer: Prevents unnecessary pixels from executing but there's still potential for wastage since if you draw something in the distance first, then draw over it, you'll still write twice. This is why it's common practice to draw the sky last in most games, and why (on and off) it's been common to do a fast depth render of some/all of the scene before going to full-price pixels. This takes us to...
Tile-Based Rendering: Basically this is a solution for the problem above. Rather than immediately filling a triangle that's been drawn, the GPU works out which screen tiles it overlaps and sticks it in a bucket for each one. Then after many meshes have been drawn, it sorts the bucket and throws away any that are provably hidden by others. This used to only be common on mobile, but recent PC GPUs have taken it up too, though it's a completely hidden process from our perspective

Both of those only reduce GPU work after a lot of processing has already gone into positioning the triangles on screen. Ideally you'd want to kill a hidden object before the GPU hears about it at all. Occlusion culling gets you exactly that, the chance to kill an invisible object before anything has been said about it to the GPU, but it comes with one major problem: you're on the CPU, trying to work out if one object's hiding another before anything's been drawn at all. So, some of the options:

Frustum Culling: Basically everyone does this, work out if an object's not in front of the screen. This only really gets tricky if there are big landscape pieces that need to be chopped up and culled separately
Portal Culling: If you know things upfront about the structure of the environment, eg with indoor areas, you can pre-calculate which rooms can see each other and cull objects in the invisible areas. A step better than this, you can do the frustum culling within that room using a field of view that's been clipped to the size of the door/window
Occlusion Culling: Actually working with occlusion data is going to give the finest culling but the challenge is getting that information early enough to use it. One option is to have simplified versions of the models drawn with a low-res CPU rasteriser. Another is to pull data back from the GPU, though this means it's several frames out of date by the time it arrives. To compensate for the delay, developers often exclude all moving objects from the occlusion buffer and a CPU process does the best it can to correct for camera movement. Unfortunately this isn't perfect since parallax can open up holes anywhere that's been dis-occluded in the intervening time."

https://www.robertsspaceindustries....as-occlusion-culling-in-games-often-been-disa
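The frustum-culling step from that summary is small enough to sketch directly: test each object's bounding sphere against the planes of the view frustum and skip anything fully outside. The frustum and object positions below are made-up toy values.

```python
# Minimal frustum-culling sketch: cull an object if its bounding
# sphere lies entirely outside any frustum plane. Plane normals
# point inward; points inside satisfy dot(n, p) + d >= 0.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sphere_in_frustum(center, radius, planes):
    for normal, d in planes:
        # Signed distance from sphere centre to the plane; if the
        # whole sphere is outside any single plane, cull the object.
        if dot(normal, center) + d < -radius:
            return False
    return True

# Toy "frustum": a box from -10..10 in x/y and 0..100 in z.
planes = [
    (( 1, 0, 0), 10), ((-1, 0, 0), 10),    # left / right
    (( 0, 1, 0), 10), (( 0, -1, 0), 10),   # bottom / top
    (( 0, 0, 1), 0),  (( 0, 0, -1), 100),  # near / far
]

print(sphere_in_frustum((0, 0, 50), 1.0, planes))   # True: in view
print(sphere_in_frustum((30, 0, 50), 1.0, planes))  # False: off to the side
print(sphere_in_frustum((11, 0, 50), 2.0, planes))  # True: straddles a plane
```

Real engines extract the six planes from the camera's view-projection matrix and use axis-aligned boxes as often as spheres, but the test itself is this cheap, which is why (as the quote says) basically everyone does it.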
 
Haven't found a DF article or video on this yet, but Panic Button are like what Rare was in the SNES and N64 days, wringing out the console like no one else ever did. :oops::oops:

These Panic Button guys might be alchemists because the difference in framerate, definition and picture sharpness is so pronounced that you don't even need a still frame to compare.


Now this version can stand face to face with the Xbox One X version. One can only imagine how Doom Eternal will look on the Switch with these guys behind the project.
 
https://www.eurogamer.net/amp/digitalfoundry-2018-dark-souls-remastered-switch-analysis

Dark Souls on Switch is a current-gen port with last-gen visuals
But smoother performance, improved resolution and portable play set it apart.


This isn't quite what we expected! Strictly speaking, Dark Souls for Switch actually has more in common with From Software's last-gen original as opposed to the remastered versions released a few months ago. The new release misses all of the visual refinements found in the PS4, Xbox One and PC versions of the game and at its core, this is the Dark Souls you originally played back in 2011. There are key improvements though - like a smoother frame-rate and a higher rendering resolution - and there's a bonus increase to six concurrent players online too. The Switch game still manages to impress on its own terms then and of course, it's fully portable. This is a desirable feature that makes this version unique, and where it shines brightest.

The idea that the Switch game is based on the last-gen release was first raised as a possibility when checking out the file size of the game. Dark Souls on Nintendo's hybrid weighs in at only 3.9GB, compared to 7.5GB when installed on PlayStation 4 or Xbox One. It's the same size as the original PS3 or Xbox 360 editions, a profile that also helps to squeeze it onto one of Nintendo's lower capacity cartridges. Virtuos has also opted to keep the textures, shaders and effects from the original version, which no doubt aids in keeping the install size in the same ballpark. Side-by-side with the Xbox 360 version, it's clear there's very little between them in terms of the core visual feature-set. If it's the authentic Dark Souls experience you want, as it was presented back in the day, this is the version for you.


While Dark Souls on Switch presents as 'last-gen plus', there are actually a couple of areas where the original last-gen release exceeds the quality of this new port. Bandwidth is at a premium on the Tegra X1 processor, and so alpha effects resolution drops down to around 400p, notably on fire effects from the red wyvern on the bridge. It's a huge drop in quality next to PS4, and even the Xbox 360 edition produces a higher quality effect. Equally, fire shaders in the Quelaag boss battle are simply changed; it's a similar effect to the Xbox 360 version, but again with a lower quality haze surrounding it.
 