Expected perceptual difference between Scorpio and PS4 Pro *spin-off*

4K intermediate buffers take 4x memory compared to 1080p intermediate buffers. You could lose more than 1GB just for that.

4K also requires 2x2 more detailed textures (= 1 mip = 4x memory). Otherwise graphics will look as blurry as 1080p up close. Far away graphics will look better, but only if you have enough memory to stream the textures at (2x2) higher quality. So you need 4x larger memory pools for texture streaming.

When we moved from 720p to 1080p we got 4x memory (for 2x pixel count increase). Now we are moving from 1080p to 4K (4x pixel count increase), but get only up to 1.5x memory. Needless to say, the extra memory is not going to be enough to even reach acceptable quality at 4K. Unless most engines are quickly adapting tiled resources / PRT / virtual texturing.
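The scaling being described can be sanity-checked with back-of-envelope arithmetic. The buffer count and bytes-per-pixel below are illustrative assumptions, not figures from any shipped game:

```python
# Hypothetical render-target budget: 20 full-screen intermediate buffers
# averaging 8 bytes/pixel (HDR colour, depth, motion vectors, history
# copies, etc.). Only the resolution changes between the two cases.

def buffer_mb(width, height, n_buffers=20, bytes_per_pixel=8):
    """Total memory (MiB) for n screen-sized intermediate buffers."""
    return n_buffers * width * height * bytes_per_pixel / (1024 * 1024)

b1080 = buffer_mb(1920, 1080)
b2160 = buffer_mb(3840, 2160)
print(f"1080p: {b1080:.0f} MB, 4K: {b2160:.0f} MB ({b2160 / b1080:.0f}x)")
```

With these assumed numbers the 4K case lands around 1.2 GB, which is the kind of loss the post describes; the 4x ratio holds regardless of the assumed per-pixel cost.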
Thanks, that shows just how important PRT and new rendering methods like your MSAA trick etc. will become.
The crazy thing is it sounds like even more reason to have split memory: a big, slow, cheaper pool to buffer textures and a faster, smaller one for render targets etc.
Although I fully expect that's 'very' unlikely now.
 
When we moved from 720p to 1080p we got 4x memory (for 2x pixel count increase). Now we are moving from 1080p to 4K (4x pixel count increase), but get only up to 1.5x memory. Needless to say, the extra memory is not going to be enough to even reach acceptable quality at 4K. Unless most engines are quickly adapting tiled resources / PRT / virtual texturing.
Interesting given how gung-ho Microsoft have been about "native 4K". All things being equal (and they rarely are) devs building Scorpio games are going to have to compromise somewhere to hit Microsoft's native 4K promise. Maybe it'll end up like their 720p AA promise for 360.
 
would 16GB make the difference? What spec is the spec that will actually meet MS' 4K VR goals?

Somehow i feel like the Scorpio spec is underwhelming for 2016 much less the end of 2017.
 
Interesting given how gung-ho Microsoft have been about "native 4K". All things being equal (and they rarely are) devs building Scorpio games are going to have to compromise somewhere to hit Microsoft's native 4K promise. Maybe it'll end up like their 720p AA promise for 360.
This, combined with many engines still sitting at DX11. Those twitchy 60 fps titles are certainly going to be hurt in quality, because my understanding is that virtual texturing isn't a good fit for that kind of fast, snappy camera movement.


 
Somehow i feel like the Scorpio spec is underwhelming for 2016 much less the end of 2017.

So what are your thoughts on the Nintendo Console Specs when they're announced? o_O
 
When we moved from 720p to 1080p we got 4x memory (for 2x pixel count increase). Now we are moving from 1080p to 4K (4x pixel count increase), but get only up to 1.5x memory. Needless to say, the extra memory is not going to be enough to even reach acceptable quality at 4K. Unless most engines are quickly adapting tiled resources / PRT / virtual texturing.

Since AF is some mythological thing in console land, maybe they can get away with relatively low texel density anyway. ;)
 
If I read it correctly, from an asset perspective, Sebbbi's post above just indicated that this would not be the case.

Going up a mip map level requires four times the memory, but you don't have to increase the resolution of all the textures that you use to see a visible improvement of scene detail. A game might choose to increase the quality of environment textures - as these typically suffer on consoles - but leave character and weapon textures the same. On PC, it's common for presets to balance quality for common sizes of video card memory. Texture improvements aren't all-or-nothing.

And with LOD, it's not just about how good your best texture can be, it's also about when you load them and how many of them you can use at once.

If we were to crudely assume that a Neo game used 3 GB (more than half of the memory available) for textures, and that an otherwise identical Scorpio game would now have 7 GB of memory for textures, you can see how you could take the 1/3 of your textures that you considered most in need of improvement and move them up a mip level. And let's not forget "lower than texture res" bump maps either.
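That crude budget can be written out; the 3 GB / 7 GB figures are the post's own hypotheticals, and the 4x-per-mip cost follows from a mip level doubling each texture dimension:

```python
# Taking 1/3 of a hypothetical 3 GB texture set up one mip level:
# that third costs 4x its old memory, the rest is unchanged.

neo_budget_gb = 3.0           # assumed Neo texture memory (from the post)
scorpio_budget_gb = 7.0       # assumed Scorpio texture memory (from the post)
upgraded_fraction = 1 / 3     # share of textures moved up one mip
mip_cost = 4                  # +1 mip = 2x2 resolution = 4x memory

new_cost_gb = (neo_budget_gb * (1 - upgraded_fraction)
               + neo_budget_gb * upgraded_fraction * mip_cost)
print(f"{new_cost_gb:.1f} GB of {scorpio_budget_gb:.0f} GB budget")
```

The unchanged 2 GB stays as-is and the upgraded 1 GB becomes 4 GB, so the result still fits inside the assumed 7 GB.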

My environment artist chums can produce some insanely good looking stuff. Just give them all the memory.
 
Thanks Sebbbi, learning a lot here. Just one question, but I recall your disappointment with hardware Tiled Resources fixed size on paging and the lack of control over it, does anything change now that the resolution is 4K? Or is it still better to run your own software variant of it?
Visible surfaces will contain on average 2x2 more continuous texture area on 4K. 64 KB pages (256x256, BC compressed) are viable at 4K. In comparison, software virtual texturing most commonly uses 16 KB pages (128x128, BC compressed) at 1080p.

Tiled resources are still lacking GPU indirect tile mapping update support. If this is introduced in future DirectX 12 version and all GPUs support it, then tiled resources is going to be much more useful. Right now tiled resources is not good enough to implement latency sensitive sparse dynamic rendering, such as sparse shadow mapping. The lack of indirect page mapping update will also limit volume tiled resource (tier 3) usability, as you'd likely want to generate/modify sparse 3d data sets on GPU.
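The page sizes quoted above fall straight out of BC block compression: BC3/BC5/BC7 store 1 byte per texel (16 bytes per 4x4 block), so the arithmetic is simply:

```python
# One BC-compressed virtual-texture page at 1 byte per texel
# (BC3/BC5/BC7; BC1 at 0.5 byte/texel would halve these numbers).

def bc_page_bytes(width, height, bytes_per_texel=1):
    """Memory for one page of a BC-compressed texture."""
    return width * height * bytes_per_texel

KB = 1024
assert bc_page_bytes(128, 128) == 16 * KB  # common SW virtual texturing page
assert bc_page_bytes(256, 256) == 64 * KB  # D3D tiled-resources 64 KB tile
```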
 
I find it hard to believe 4K buffers for current-gen games would take up to 1 GB on the jump to 4K. Aside from the occasional PS4 exclusive, don't most games try to stay close to 32 MB for ESRAM performance? I'd also think most games, using virtual texturing or not, still have enough headroom in their streaming systems that some extra mip levels are loaded in before they are ultimately necessary. There are still some texels left to show at 4K without the need for extra memory, let alone if Scorpio has 12 GB...
 
I find it hard to believe 4K buffers for current-gen games would take up to 1 GB on the jump to 4K. Aside from the occasional PS4 exclusive, don't most games try to stay close to 32 MB for ESRAM performance? I'd also think most games, using virtual texturing or not, still have enough headroom in their streaming systems that some extra mip levels are loaded in before they are ultimately necessary. There are still some texels left to show at 4K without the need for extra memory, let alone if Scorpio has 12 GB...

Even on the X1, games don't fit all their buffers into the 32 MB esram. You might have some significant part of your z-buffer and colour buffer in main memory (like Forza Horizon 2 does, iirc). You might have a buffer storing motion vectors for each pixel. You might have particles writing to their own full-sized buffer using compute shaders. You might have many tens of MBs of shadow map buffers, and you might have several previous copies of colour, depth and motion vectors stored for your temporal needs. If you're using virtual texturing, the memory required for that will scale with resolution too. HDR may require more bits per pixel also. Not all of it will require huge amounts of bandwidth, but it will need storing.

Even an X1 game could easily be using north of 100 MB of buffers, and that could easily scale up by 5x in the push for 4K.
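To make the "north of 100 MB" figure concrete, here is an entirely made-up inventory along the lines of the list above; the per-pixel sizes are plausible formats, not numbers from any shipped title:

```python
PIXELS_1080P = 1920 * 1080
MB = 1024 * 1024

# Hypothetical screen-sized buffers (bytes per pixel).
buffers_bpp = {
    "HDR colour (RGBA16F)": 8,
    "depth/stencil": 4,
    "motion vectors (RG16F)": 4,
    "compute particles (RGBA16F)": 8,
    "history colour (temporal)": 8,
    "history depth": 4,
}
shadow_maps_mb = 32  # assumed fixed size; doesn't scale with screen res

screen_mb = sum(buffers_bpp.values()) * PIXELS_1080P / MB
total_1080 = screen_mb + shadow_maps_mb
total_4k = 4 * screen_mb + shadow_maps_mb  # only screen-sized buffers scale
print(f"1080p: {total_1080:.0f} MB, 4K: {total_4k:.0f} MB")
```

With identical formats this lands just over 100 MB at 1080p and roughly 3x that at 4K; wider HDR formats, extra history copies, or larger shadow maps at 4K are what would push the ratio towards the 5x mentioned above.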

Sony making an extra 500 MB of memory available for the Neo - even with the same assets, techniques and shaders used - is probably a sign of the headroom that it's wise to give developers, should they need it.
 
X1 does remarkably well with its 32 MB of esram, but if you consider that MS's goals may require 4~5x the fast storage for buffers, while process advances mean only a ~2x increase in the size of esram would have been practical ... then it really looks like esram had to go in the bin.

If the focus had been on 1080p 60hz, then perhaps it would have been a different story.
 
A 6Tf console is underwhelming in 2016? I'm divided between :rolleyes: and :runaway:

The RX 480 does about 6 TF. The GTX 980 Ti does more than 6 TF, the 1070 does 6.5 or thereabouts and will probably appear in laptops this fall. The 480 is a mainstream $200 card. So my laptop this year will outperform a console that arrives next Christmas?

Next year the current 1080 should be mainstream/midrange, and it's about 9 TF.

I dunno, it just doesn't seem like, relative to PC, 6 TF next Christmas is all that great.
 
The RX 480 does about 6 TF. The GTX 980 Ti does more than 6 TF, the 1070 does 6.5 or thereabouts and will probably appear in laptops this fall. The 480 is a mainstream $200 card. So my laptop this year will outperform a console that arrives next Christmas?

Next year the current 1080 should be mainstream/midrange, and it's about 9 TF.

I dunno, it just doesn't seem like, relative to PC, 6 TF next Christmas is all that great.

480 does 5.8 TF at peak turbo and has yield, BW and fillrate issues - and there is no 12 GB, 384-bit model.

Your laptop is only going to outperform Scorpio if you spend a lot more than $400 on it.
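The teraflop numbers being traded here all come from the same back-of-envelope formula: shader cores x 2 ops per clock (one fused multiply-add) x clock rate. For instance, with the public boost clocks:

```python
def tflops(shader_cores, clock_ghz):
    """Peak FP32 TFLOPS: one fused multiply-add (2 ops) per core per clock."""
    return shader_cores * 2 * clock_ghz / 1000

print(f"RX 480:   {tflops(2304, 1.266):.2f} TF")  # 2304 SPs @ 1266 MHz boost
print(f"GTX 1070: {tflops(1920, 1.683):.2f} TF")  # 1920 cores @ 1683 MHz boost
```

This is why the 480 is quoted at 5.8 TF "at peak turbo": the figure assumes the boost clock is sustained, which laptop parts in particular rarely manage.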
 
I find it hard to believe 4k buffers for current gen games would get to take up to 1GB on the jump to 4k. Aside from the ocasional ps4 exclusive, don't most games try to stay close to 32MB for ESRAM performance? I'd also think most games, using virtual texture or not, still have enough headroom in their streaming systems that some extra mip levels are loaded in before they are ultimately necessary. There's still some texels left to show at 4k without ths need for extra memory, letalone if scorpio has 12GB...
Hahahaha, you find him hard to believe?! He's a dev who makes fairly advanced rendering engines. When he posts, he's not posting wild theories. He is stating facts backed up by experience working on these consoles.
 
480 does 5.8 TF at peak turbo and has yield, BW and fillrate issues - and there is no 12 GB, 384-bit model.

Your laptop is only going to outperform Scorpio if you spend a lot more than $400 on it.

You would have to spend a small fortune on such a laptop today, but still, the fact that a laptop can outperform a new ultra-high-end console more than a year before it actually releases, regardless of the cost, has to be something of a first. Isn't it usually a year or two before laptops catch up with new consoles? They don't tend to significantly outperform them over a year before they release.
 
I don't expect much of a difference at all if Microsoft demands all games render at native 4K on the Scorpio. Every bit of TFlop advantage over the PS4 Neo will evaporate. A PS4 Neo game rendered at 1440p and then checkerboarded up to 4K will look nearly identical to -- perhaps slightly softer than -- a native 4K game on the Scorpio, which may have between 6 and 7 TFlops.
 
Would be nice to have uncompressed screenshots of the same viewpoint w/ checkerboard rendering and native 4K. That said I don't have a 4K display to look at them, much less side by side ;)
 
I don't expect much of a difference at all if Microsoft demands all games render at native 4K on the Scorpio. Every bit of TFlop advantage over the PS4 Neo will evaporate. A PS4 Neo game rendered at 1440p and then checkerboarded up to 4K will look nearly identical to -- perhaps slightly softer than -- a native 4K game on the Scorpio, which may have between 6 and 7 TFlops.
Why wouldn't you use checkerboard rendering also on Scorpio? Games already use checkerboard rendering to reach 1080p on Xbox One and PS4. This tech is in no way limited to PS4 Pro.

4K pixels are 4x smaller (compared to 1080p). This makes checkerboard issues much less visible compared to 1080p. I think checkerboard is a no-brainer really. Unless of course you use some alternative reconstruction method to reach 4K (I presented one method at SIGGRAPH 2015). Brute force rendering 4K (4x more pixels) is insane. I wouldn't do that even on high end PC (GTX Titan). Checkerboard doubles the flops you can use to improve visual quality.
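A toy sketch of the checkerboard idea, assuming a static image and 2x2-quad granularity (real implementations also reproject the previous frame with motion vectors, which is where the residual artefacts come from):

```python
# Minimal checkerboard-rendering sketch: each frame shades half the pixels
# in an alternating quad pattern and fills the rest from the previous frame.

def checkerboard_mask(y, x, frame_parity):
    """True where this frame shades; the pattern alternates each frame."""
    return (y // 2 + x // 2) % 2 == frame_parity

def render_frame(scene, prev, frame_parity):
    """Shade half the pixels; reuse the previous frame for the other half."""
    h, w = len(scene), len(scene[0])
    return [[scene[y][x] if checkerboard_mask(y, x, frame_parity) else prev[y][x]
             for x in range(w)] for y in range(h)]

h, w = 8, 8
scene = [[y * w + x for x in range(w)] for y in range(h)]  # toy "image"
frame = [[0] * w for _ in range(h)]
for parity in (0, 1):            # two frames together cover every pixel
    frame = render_frame(scene, frame, parity)

shaded = sum(checkerboard_mask(y, x, 0) for y in range(h) for x in range(w))
assert frame == scene            # fully reconstructed for a static view
assert shaded == h * w // 2      # half the shading cost per frame
```

The two asserts capture the trade: per frame you only pay for half the pixels, and for static content the reconstruction is exact; errors appear only where motion breaks the reuse of the previous frame.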
 