Digital Foundry Article Technical Discussion [2017]

But not dynamic CPU load.

Logically though, if it's locked on Pro and the CPU is improved in the X, we should expect the same.
It does appear that the GPU is carrying load there, though; the move from resolution mode to quality mode shows an additional hit to frame rate. Curious to see if they can get it to a locked 30. That's going from 24 -> 32/33 fps, so nearly a 40% improvement needs to be realized there.
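A quick back-of-the-envelope check of that "nearly 40%" figure, as a minimal Python sketch (the only inputs are the 24 fps and 32/33 fps numbers above; treating ~33 fps as a locked 30 plus headroom is an assumption):

```python
# Back-of-the-envelope frame-rate arithmetic (illustrative only).
observed_fps = 24.0        # worst drop reported by DF
bare_target_fps = 30.0     # a locked 30 with zero headroom
padded_target_fps = 33.0   # the 32/33 fps figure, i.e. 30 plus some margin

print(f"To a bare 30: +{(bare_target_fps / observed_fps - 1) * 100:.0f}% throughput needed")
print(f"To ~33 fps:   +{(padded_target_fps / observed_fps - 1) * 100:.0f}% throughput needed")
# -> +25% for a bare 30, ~+38% with headroom, i.e. 'nearly 40%'.
```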
 
But not dynamic CPU load.

Logically though, if it's locked on Pro and the CPU is improved in the X, we should expect the same.
Don't forget that the XB1 CPU is already nearly 10% faster than the PS4's (a similar difference exists between the XBX and Pro CPUs), and that doesn't prevent the XB1 version from performing a bit worse than the PS4 game. I think those drops are not CPU related, contrary to what DF thinks. I agree with @iroboto on this point.

It does appear that the GPU is carrying load there, though; the move from resolution mode to quality mode shows an additional hit to frame rate. Curious to see if they can get it to a locked 30. That's going from 24 -> 32/33 fps, so nearly a 40% improvement needs to be realized there.
I think they should lower the minimum dynamic resolution (currently 1890p; the Pro's recorded minimum is 1512p), or make the dynamic res more aggressive. If it's GPU related then that should improve things a lot. The XBX GPU has 43% more power than the Pro, not the 56% more that would be needed for 1512p -> 1890p.
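A minimal sketch of that pixel-count vs. compute comparison (the 1890p/1512p floors come from the posts above; the 16:9 widths and the 6.0 vs 4.2 TFLOPS figures are assumptions, being the commonly quoted specs):

```python
# Pixel-count scaling vs. GPU-compute scaling (illustrative only).
pro_floor = (2688, 1512)   # Pro's lowest recorded dynamic res (1512p, assuming 16:9)
xbx_floor = (3360, 1890)   # XBX's current dynamic-res floor (1890p, assuming 16:9)

pixel_ratio = (xbx_floor[0] * xbx_floor[1]) / (pro_floor[0] * pro_floor[1])
print(f"1512p -> 1890p is {pixel_ratio:.2f}x the pixels")          # ~1.56x, i.e. +56%

xbx_tflops, pro_tflops = 6.0, 4.2   # commonly quoted figures, assumed here
print(f"XBX compute is {xbx_tflops / pro_tflops:.2f}x the Pro's")  # ~1.43x, i.e. +43%
```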
 
Don't forget that the XB1 CPU is already nearly 10% faster than the PS4's (a similar difference exists between the XBX and Pro CPUs), and that doesn't prevent the XB1 version from performing a bit worse than the PS4 game. I think those drops are not CPU related, contrary to what DF thinks. I agree with @iroboto on this point.


I think they should lower the minimum dynamic resolution (currently 1890p; the Pro's recorded minimum is 1512p), or make the dynamic res more aggressive. If it's GPU related then that should improve things a lot. The XBX GPU has 43% more power than the Pro, not the 56% more that would be needed for 1512p -> 1890p.
Well, before they get to that point, I wonder whether they were already at near-final shader code. As per my graph, needing 4 weeks before gamescom, you're looking at code that hasn't changed since July and comparing it to code released this month. I think they were optimized pretty well, but I don't know if they are at final shader code. If the 1X is struggling at that part of the game and nowhere else, I assume all consoles struggled at that same spot.

So, let's wait and see for the final comparison, but I'm pretty confident that DF will check that scene out again and see whether Monolith cleaned that part up. Effects seem to have something to do with the performance drop. If 'prefer resolution' is running higher frame rates than 'prefer quality', then it's something one of their effects is doing that is bogging the system down. Perhaps alpha, perhaps something else.
 
Is there still a benefit to 4K textures even if the output res is (much) lower than 4K? If so, is there a resolution breaking point where it becomes useless to have 4K textures?
Yes and maybe.

Screen resolution is a 2D grid of colour values referred to as pixels. Texture resolution is a 2D grid of colour values referred to as texels. If you have a 1080p display and a 16:9 aspect texture of dimensions 1920x1080, and you draw that texture to exactly fit the screen, you will have a pixel-perfect texture. If you have a 16:9 aspect texture at 4x the resolution, 3840x2160, and display it on that display, you'll either view every other texel or average groups of four of them. If your texture is only 192x108, you'll have to interpolate pixel values between texels, with only one texel for every ten pixels.

That's when the texture is perfectly aligned with the screen, but textures rarely are. If you enlarge the texture to fill more of the screen, you reduce the number of visible texels being used to create the pixel data. Thus, walking towards a wall or shooting the wall with an Engorgeo Ray, the original 1920x1080 texture no longer provides a texel colour value for every pixel; instead you start to interpolate values and create blur. The '4K texture' will be able to provide per-pixel data up to viewing 1/4 of the texture, beyond which it'll start to blur. On higher resolution displays, the perfect texel:pixel ratio obviously requires higher resolution textures. A super-massive texture of 16k x 16k could fill both a 1080p and a 4K display with pixel-perfect data even on close zooms/large scales - say, zooming in on an object through a sniper scope or binoculars. Uncharted 4 has some pretty horrible low-res assets when zoomed in. If they were many times the size, you could zoom in and still see detail instead of blur (this goes for geometry detail too, of course).
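A minimal sketch of the texel-per-pixel argument above (the texture widths and screen-coverage factors are illustrative, not taken from any particular game):

```python
# Texels available per visible pixel once a texture is stretched (illustrative).
def texels_per_pixel(texture_width, screen_width, screen_widths_spanned):
    """Texels per pixel along one axis when the texture is stretched to span
    `screen_widths_spanned` times the width of a `screen_width`-pixel display."""
    return texture_width / (screen_width * screen_widths_spanned)

# 1920-wide texture exactly filling a 1080p screen: pixel perfect.
print(texels_per_pixel(1920, 1920, 1.0))   # 1.0
# Walk close enough that it spans 4x the screen width: blur territory.
print(texels_per_pixel(1920, 1920, 4.0))   # 0.25
# The '4K' (3840-wide) version holds up to 2x the screen width before blurring.
print(texels_per_pixel(3840, 1920, 2.0))   # 1.0
```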

However, if your game is such that you never need more texel data than your display requires, it'll go to waste. Imagine a top-down game where no textures are stretched larger than the screen - extra texture data will never be used.

It's also worth noting that '4K texture packs' aren't necessarily perfect for a 4K display, any more than the 1080p assets are perfect for a 1080p display. You are still likely to be interpolating texel values to fill the screen, because the full per-pixel data is too much to fit in RAM - unless you use tiled assets/megatexturing, which no-one is doing, despite it being the future. Which I guess it wasn't, though I don't know why.
 
Is there even such a thing as "4K textures"? Isn't it just higher and lower texture detail?

Higher texture detail would still be noticeable at 1080p; it's just that the difference is probably amplified at a 4K resolution.
 
Is there even such a thing as "4K textures"? Isn't it just higher and lower texture detail?
Well, technically yes - 4096 x 4096 textures instead of the old classic 2048 x 2048. But in this context it's just 'higher res textures'. I'm sure on PC these super high-quality textures were present at the same level on 1080p displays. You can't market 'higher quality textures' as well as you can '4K textures', though.
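For a rough sense of what that 2048 -> 4096 jump costs in memory, a sketch assuming a block-compressed format at about 1 byte per texel (BC7-class) plus a full mip chain; the format choice is an assumption, not something stated above:

```python
# Approximate memory cost of one texture at 2K vs 4K dimensions (illustrative).
def texture_mib(side, bytes_per_texel=1.0, with_mips=True):
    base = side * side * bytes_per_texel            # top mip level only
    total = base * (4 / 3) if with_mips else base   # full mip chain adds ~1/3
    return total / 2**20

for side in (2048, 4096):
    print(f"{side} x {side}: ~{texture_mib(side):.1f} MiB at ~1 byte/texel (BC7-class)")
# -> ~5.3 MiB vs ~21.3 MiB: 4x the memory for each texture you bump to 4K.
```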
 
Is there even such a thing as "4K textures"? Isn't it just higher and lower texture detail?

Higher texture detail would still be noticeable at 1080p; it's just that the difference is probably amplified at a 4K resolution.
It's exactly as Shifty writes

With megatextures/tiled resources this problem doesn't even exist.
 
It's exactly as Shifty writes

With megatextures/tiled resources this problem doesn't even exist.

It still exists. It just shifts from individual texture resolutions to the overall texture resolution and density. At some point you need to restrict the size (storage size) of your megatexture. And then you have to make sure it's of a size suitable for whatever game delivery mechanism you are using.

A classic case of this is id Software's Rage. Someone can correct me if I'm wrong, but their original megatexture was over 1 TB in size, and their artists still wanted greater texture fidelity and detail. They eventually pared that down to 100-150 GB. But for the shipping game they had to drastically reduce that yet again, which is why it looks great in some places and really blurry in others. They used an algorithm to determine which areas would be seen the most, and for the longest time, and reduced the texture quality there the least, while areas that wouldn't be seen as much were reduced far more drastically in quality.

Regards,
SB
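A rough sketch of why unique-texturing storage balloons like that; the world area and texel densities below are made up for illustration and are not id's actual numbers:

```python
# Why unique-texturing (megatexture) storage balloons (illustrative numbers only).
def megatexture_gib(world_area_m2, texels_per_metre, bytes_per_texel=1.0):
    texels = world_area_m2 * texels_per_metre ** 2
    return texels * bytes_per_texel / 2**30

area_m2 = 10_000_000   # a hypothetical 10 km^2 of uniquely textured world
for density in (64, 256, 1024):   # texels per metre along each axis
    print(f"{density:>4} texels/m: ~{megatexture_gib(area_m2, density):,.0f} GiB")
# Doubling texel density quadruples storage, so shipping builds end up far
# smaller (and blurrier in places) than what the artists originally authored.
```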
 
It's exactly as Shifty writes

With megatextures/tiled resources this problem doesn't even exist.
The weird thing, though, is that tiled resources are barely touched. Why is that? Why do we still need more and more RAM to accommodate assets?
 
The weird thing, though, is that tiled resources are barely touched. Why is that? Why do we still need more and more RAM to accommodate assets?

Because of older rendering engines that haven't been retrofitted with recent streaming capabilities and/or don't know how to keep track of which assets are or will be needed? The older engines are then limited by slow mechanical storage, which may not be fast enough to cope with the asset thrashing? If the mechanical storage were replaced with lower-latency storage (NVMe, hybercubes, etc.) then perhaps even the old engines could cope with less RAM?

Yes, I answered your last question with more questions (possible situations).
 
My impression was that tiled resources were somewhat impractical until Tier 3.

Streaming buffer?
IIRC, tiled resources only supports 64 KB tiles. So unless you developed your game and engine around that tile size, it's not an ideal scenario. Sebbbi preferred his own VT system; his tile sizes were 16 KB, IIRC.

There was another factor around keeping track of the tiles with his own system vs the one provided; he preferred his own method.

Tier 3 tiled resources is volume tiled resources, IIRC. I don't think they are related.
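To put those tile sizes in texel terms, a minimal sketch; the 64 KB figure is the D3D tiled-resources tile size, the 16 KB figure is the one recalled above, and the square-tile assumption happens to match the standard shapes for RGBA8 and BC7:

```python
# How a fixed tile size in bytes maps to square texel dimensions (illustrative).
import math

def square_tile_side(tile_bytes, bytes_per_texel):
    """Side length of a square tile holding `tile_bytes` worth of texels."""
    return math.isqrt(tile_bytes // bytes_per_texel)

for tile_kb in (64, 16):   # D3D tiled-resources tile vs. the 16 KB figure above
    for fmt, bpt in (("RGBA8", 4), ("BC7", 1)):
        side = square_tile_side(tile_kb * 1024, bpt)
        print(f"{tile_kb} KB tile, {fmt}: {side} x {side} texels")
# 64 KB -> 128x128 (RGBA8) or 256x256 (BC7); 16 KB -> 64x64 or 128x128.
```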
 
IIRC, tiled resources only supports 64 KB tiles. So unless you developed your game and engine around that tile size, it's not an ideal scenario. Sebbbi preferred his own VT system; his tile sizes were 16 KB, IIRC.

There was another factor around keeping track of the tiles with his own system vs the one provided; he preferred his own method.

Tier 3 tiled resources is volume tiled resources, IIRC. I don't think they are related.
Also, as I understood it, if you want to use tiled resources, you must deliver them in a specific format. All hardware that doesn't support that would need the "classic" format, so it is only useful if you really have DX12 feature-level hardware.
 
...

Honestly, looking back, it's actually sort of silly how much we debated the differences back then. Aside from resolution there were basically none. I feel silly for even having engaged in it.

...
That's historical revisionism.

Remember Call of Duty? 720p on XB1 and 1080p on PS4? The combined difference in clarity and geometry detail was bigger than it is here. Yes, it was.

And Tomb Raider 2013 running at 60fps on PS4 and 30fps on XB1, often with a mere 20fps advantage on PS4 when both dropped under their target framerates?
Well, before they get to that point, I wonder whether they were already at near-final shader code. As per my graph, needing 4 weeks before gamescom, you're looking at code that hasn't changed since July and comparing it to code released this month. I think they were optimized pretty well, but I don't know if they are at final shader code. If the 1X is struggling at that part of the game and nowhere else, I assume all consoles struggled at that same spot.

So, let's wait and see for the final comparison, but I'm pretty confident that DF will check that scene out again and see whether Monolith cleaned that part up. Effects seem to have something to do with the performance drop. If 'prefer resolution' is running higher frame rates than 'prefer quality', then it's something one of their effects is doing that is bogging the system down. Perhaps alpha, perhaps something else.
Well, you assumed wrongly. Even the peasant consoles don't struggle in that area.
 
Also, as I understood it, if you want to use tiled resources, you must deliver them in a specific format. All hardware that doesn't support that would need the "classic" format, so it is only useful if you really have DX12 feature-level hardware.
Yeah, if you are going multiplatform this isn't ideal.
 
Well, technically yes - 4096 x 4096 textures instead of the old classic 2048 x 2048. But in this context it's just 'higher res textures'. I'm sure on PC these super high-quality textures were present at the same level on 1080p displays. You can't market 'higher quality textures' as well as you can '4K textures', though.
Hence why I don't use the term '4K textures'; I call them 4K assets. This just means the textures are higher quality than the ones used for HD.

I remember when World of Tanks said they used 4K textures on the XO, and people thought the game ran at 4K, or that it was using textures made for 4K TVs.
I can understand the confusion though, even if it did bug me.
 
That's historical revisionism.

Remember Call of Duty? 720p on XB1 and 1080p on PS4? The combined difference in clarity and geometry detail was bigger than it is here. Yes, it was.

And Tomb Raider 2013 running at 60fps on PS4 and 30fps on XB1, often with a mere 20fps advantage on PS4 when both dropped under their target framerates?

Well, you assumed wrongly. Even the peasant consoles don't struggle in that area.
Aside from a few freak cases, which at the beginning of a new gen did get people's juices flowing, after that point games started to hit fairly predictable configurations.

We've not seen that sort of issue since then, that I can properly recall.

The exception is definitely the exception. And we found most of those exceptions early on in this generation, for a variety of reasons; but since those issues were hammered out, we haven't seen that type of thing since.

The 4Pro has been out for a while now; I don't think we will see the patterns from the start of the gen repeat with this mid-generation refresh. The maturity of the tools and the kits, and developers' skill and familiarity with the hardware at this point in time, should in many ways carry directly over to the mid-gen refresh.

I'm not expecting any surprises from this point forward.

And once again, it's useless to make a statement about performance based on a build that wasn't even in its optimization phase yet. By the time SoW is released they will have had 4 months of dev and optimization to improve those areas. I find it highly improbable that they would release the title in that state.

And the only reason you think the base models handle that scene fine is that you never saw earlier code releases of it. Pre-release code has only ever been Xbox.
 