Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Sorry, but pretty much every time people post that stuff, all I can think is that probably none of you have ever properly tested those kinds of sizes, distances and resolutions. I sit about 3 meters away from my 75" TV and I greatly appreciate native 4K resolution. I like to think my eyes are pretty good, but I don't think they are exceptional or anything like that. Those viewing distance charts are just BS, or don't translate well into real moving images. 4K brings out a ton more detail much sooner than your examples suggest.

I just started playing Arkham Knight on PS4 Pro, but apparently it does not have a Pro patch, and 1080p with jaggies produces an absolutely terrible-looking image in that game, while the assets themselves are pretty good. I need to play that on a PC. 1440p is good, but I'm not a huge fan of checkerboarding and similar methods. They seem to fall apart in many situations.
Those viewing-distance charts were established with film material (and most 4K films are actually upscaled from 2K masters; ideally the tests require uncompressed 4K sources).
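
For reference, here's a rough back-of-envelope sketch of where those chart numbers typically come from: they generally assume a cutoff around 60 pixels per degree (one arcminute per pixel). The screen size and distance below are taken from the post above; the cutoff itself is the contested part, since high-contrast edges, aliasing and motion can reveal differences beyond it.

```python
import math

# Back-of-envelope pixels-per-degree check for a 75" 16:9 panel viewed from 3 m,
# compared against the ~60 ppd (one arcminute per pixel) figure the common
# viewing-distance charts are built on. Not a rigorous acuity model.
diagonal_in = 75.0
viewing_distance_m = 3.0

screen_width_m = diagonal_in * 0.0254 * (16 / math.hypot(16, 9))   # ~1.66 m
h_fov_deg = 2 * math.degrees(math.atan((screen_width_m / 2) / viewing_distance_m))

for label, h_pixels in (("1080p", 1920), ("4K", 3840)):
    print(f"{label}: ~{h_pixels / h_fov_deg:.0f} pixels per degree (chart cutoff ~60 ppd)")
```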

In VFX we have hundreds of rays per pixel; some difficult scenes can go up into the thousands of rays per pixel to deal with specular shimmering.

In games you have very, very few samples per pixel. I see mention of intermediate buffers at half res, AA reducing detail in a bid to reduce jaggies, etc... If you compare a 1.84 TF console with Jaguar cores to a 10 TF+ PC, it's not the raw resolution of the frame buffer that's being compared.

Any technique reusing previous frame data multiplies the effective samples per pixel. It's ludicrous not to use those techniques, just like it was stupid not to use compression when media bandwidth was the bottleneck. We have a severe compute bottleneck compared to what would be needed to brute-force rendering. Full-scene native 4K is just wasting power that could have been used to improve lighting quality instead.
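
To make the "multiplies the effective samples per pixel" point concrete, here is a minimal sketch (not any specific engine's implementation) of exponential temporal accumulation: each new 1-spp frame gets blended into a history buffer, and at steady state the effective sample count is on the order of 1/alpha, which is why reusing previous frames is such a cheap multiplier, as long as the history stays valid.

```python
import numpy as np

# Minimal sketch of exponential temporal accumulation for a single static pixel:
# each frame contributes one noisy sample, the history buffer blends it in.
rng = np.random.default_rng(0)
true_value = 0.5
alpha = 0.1                              # blend weight of the new frame
history = rng.normal(true_value, 0.2)    # first frame: a single noisy sample

for frame in range(1, 64):
    new_sample = rng.normal(true_value, 0.2)               # this frame's 1-spp result
    history = (1 - alpha) * history + alpha * new_sample   # accumulate into history

print(f"accumulated estimate: {history:.3f} (true value {true_value})")
# For i.i.d. noise the steady-state effective sample count is (2 - alpha) / alpha.
print(f"effective samples per pixel ≈ {(2 - alpha) / alpha:.0f} (vs 1 without reuse)")
```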
 
I agree in practice… but I’m not sure CBR will be effective in all cases.
It's not perfect, but it's the best compromise out of all the compromises we have to make. Similar to RTX card owners preferring to disable RTX reflections in games to get a better framerate: even though the artefacts of screen-space reflections are obvious once you know them, the improvement in temporal resolution is deemed a more significant gain than the improvement in image quality from accurate reflections.

Where a 5% difference in rendering quality enables a 100% improvement in framerate, it's clearly the better choice. And the artefacts you complain of in Spiderman are in part due to it targeting 30 fps. At higher framerates, reconstruction becomes ever more robust.
 
If all CBR methods were equal (great), then I could agree with such a statement. But they aren’t.

A big reason for that is that they often target only the Pro and the X, and sometimes only the Pro, with the X just brute-forcing 4K instead (as in RDR2).
It just doesn't receive as much attention.
Come next gen, temporal reconstruction will be the default for the base experience. Add to that higher framerates (fewer temporal artifacts) and extra compute (more sophisticated reconstruction), and the baseline reconstruction IQ next gen might reach and surpass the best current examples.
 
Any technique reusing previous frame data multiplies the effective samples per pixel. It's ludicrous not to use those techniques, just like it was stupid not to use compression when media bandwidth was the bottleneck. We have a severe compute bottleneck compared to what would be needed to brute-force rendering. Full-scene native 4K is just wasting power that could have been used to improve lighting quality instead.

Sure.

But with pros there are cons.

And Dr. Evil is correct. Larger TVs will exacerbate certain issues with CBR more than smaller sets do. Example: my 4K 32-inch LG desktop monitor has less noticeable issues with Pro games using CBR. The problems haven't magically disappeared because of a smaller set; they're just less noticeable because of the smaller screen real estate (tighter pixel density). However, something 60 inches and over (like my 79" LG or 85" Sammy) will definitely show CBR artifacting, ghosting, and loss of detail on certain objects/textures (overall quality), so it isn't simply the user imagining these things.

Point being, CBR is a good step towards offering high resolutions and solid framerates (a middle-ground between the two), but it isn't a miracle method (not yet) without drawbacks.
 
And the artefacts you complain of in Spiderman are in part due to it targeting 30 fps. At higher framerates, reconstruction becomes ever more robust.

But this is the problem: developers aren't going to start targeting 60fps simply because CBR is more common. If anything, the resources freed by CBR will give developers more room to saturate the graphics pipeline with more visual eye candy (possibly with limited RT), rather than going towards higher framerates (60fps, 120fps, etc.). Sure, we'll get more 60fps (possibly 120fps) cross-generational titles, but once the new generation gets into full swing, we're going to see more of the same, just prettier.
 
Guru3D measured about 238W on the 5700 XT Strix OC at 1950MHz.
Still too high, but less than you'd expect.
That figure is for the whole board, which includes memory (30-40W), VRM overhead, and various board IO. It also runs at whatever voltage AMD set for it. We already know console makers fine-tune voltage profiles to hit their targets. This is how MS got a 6TF Xbox to draw the same 180W at the wall that a 6TF RX 580 is rated for.

They may not have N7+, but they’ll probably have N7P to save 5%.
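
To put some rough numbers on that (everything below is a ballpark assumption for illustration, not a measurement of any actual console part): subtract memory, VRM losses and board overhead from the Guru3D board figure to estimate ASIC power, then apply the usual P ∝ f·V² scaling to see what a modest downclock/undervolt buys a console SKU.

```python
# Ballpark sketch only - the overheads and voltages below are assumptions.
board_power_w = 238.0       # Guru3D figure for the 5700 XT Strix OC
memory_w = 35.0             # GDDR6 + PHY, assumed ~30-40 W
board_overhead_w = 15.0     # fans, I/O, misc board components (assumed)
vrm_efficiency = 0.90       # assumed ~90% efficient VRMs

asic_estimate_w = (board_power_w - memory_w - board_overhead_w) * vrm_efficiency
print(f"estimated GPU ASIC power: ~{asic_estimate_w:.0f} W")

# Dynamic power scales roughly with frequency * voltage^2, so a console-style
# clock/voltage tune pays off quickly (illustrative numbers only):
f_desktop, v_desktop = 1.95, 1.20   # GHz, V (assumed desktop operating point)
f_console, v_console = 1.80, 1.05   # GHz, V (assumed console operating point)
scale = (f_console / f_desktop) * (v_console / v_desktop) ** 2
print(f"same silicon at the lower point: ~{asic_estimate_w * scale:.0f} W "
      f"({scale:.0%} of the original ASIC power)")
```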
 
I agree in practice… but I'm not sure CBR will be effective in all cases. In most cases CBR can exacerbate aliasing issues (particularly on smaller object edges, like blades of grass) and add unwanted ghosting/motion blur not intended by the developers.

Example: Uncharted 4: A Thief's End is a visually beautiful game with a nice CBR method in place. However, the CBR method creates too much additional blur (even when you disable motion blur in the game's settings), to the point that I'm unable to finish the game because of the trailing blur created during image reconstruction. It makes me quite nauseated. And then there are games like Spiderman and Detroit: Become Human, which have the best CBR methods among Sony's first-party games, IMHO. However, both of those games (more so Spiderman) have issues with far-off objects with fine texture detail getting butchered or even lost during the reconstruction phase. It's not overtly bad, but once you recognize certain issues, you know there is much room for improvement.

Don't get me wrong, I'm not against CBR; I'm a big supporter of it. But if the developers' goal is to present pristine native 4K imagery instead, then I'm totally for that too.

The problem here is that you're looking at games that are mostly 30 Hz. 30 FPS is absolutely horrible for any reconstruction technique, especially for ones that leverage temporal reconstruction/accumulation.

60 Hz is the bare minimum for decent results and even then you may notice artifacts.

The large bump in CPU power in the next gen consoles should make temporal reconstruction/accumulation rendering methods far more feasible as the developers can more easily target 60 or 120 Hz rendering.

Many if not most current console games are likely CPU-limited in such a way as to make 60 Hz rendering not feasible. In other words, even if they are GPU-limited currently, if they reduced the GPU load by using reconstruction, they might still be limited by the CPU in such a way as to prevent 60 Hz rendering.

Regards,
SB
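
A quick way to see why framerate matters so much for temporal techniques (illustrative numbers, not taken from any particular game): the lower the framerate, the further everything moves between the frames you are trying to reuse, so more of the history gets disoccluded or rejected.

```python
# Illustrative only: how far an object moving across a 4K screen travels
# between consecutive frames at different framerates. Larger per-frame motion
# means more disocclusion and more history rejection for temporal reuse.
screen_width_px = 3840
seconds_to_cross = 2.0            # assumed pan/object speed: full screen in 2 s
speed_px_per_s = screen_width_px / seconds_to_cross

for fps in (30, 60, 120):
    motion_px = speed_px_per_s / fps
    print(f"{fps:>3} fps: ~{motion_px:.0f} px of motion between frames")
```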
 
I see a lot of people here using current-gen methods of upscaling to naysay next-gen upscaling, but you can't really do that.
Next gen it won't be 1080p being upscaled to 4K, but probably a higher base resolution, and possibly 60fps, which should lead to much better results.

No doubt native 4K will always look better, but for me 60fps upscaled beats native 4K at 30fps by a long way.
 
I agree in practice… but I'm not sure CBR will be effective in all cases. In most cases CBR can exacerbate aliasing issues (particularly on smaller object edges, like blades of grass) and add unwanted ghosting/motion blur not intended by the developers.

Example: Uncharted 4: A Thief's End is a visually beautiful game with a nice CBR method in place. However, the CBR method creates too much additional blur (even when you disable motion blur in the game's settings), to the point that I'm unable to finish the game because of the trailing blur created during image reconstruction. It makes me quite nauseated. And then there are games like Spiderman and Detroit: Become Human, which have the best CBR methods among Sony's first-party games, IMHO. However, both of those games (more so Spiderman) have issues with far-off objects with fine texture detail getting butchered or even lost during the reconstruction phase. It's not overtly bad, but once you recognize certain issues, you know there is much room for improvement.

Don't get me wrong, I'm not against CBR; I'm a big supporter of it. But if the developers' goal is to present pristine native 4K imagery instead, then I'm totally for that too.
It's not CBR. On Pro the game runs at good old native 1440p. What you see (when motion blur is off) is probably their custom TAA.

And Spiderman doesn't use CBR; it uses temporal injection, which is why it's less sharp than CBR and details are lost.
 
I think that CBR and Temporal Injection/upscaling should be discussed separately.
Or at least don't use CBR to cover every non-native rendering technique.
CBR gives certain types of artifacts that some people just can't stand when it breaks down. TI and temporal upscaling will just look like a lower-resolution image.
I think people are sometimes OK with TI and dynamic resolution, but when you say CBR they won't like it, due to the artifacts.

Seems like the industry is moving more towards the TI route than CBR too. Maybe because getting CBR right is a lot harder, especially when it breaks down?

VRS, dynamic res and TI are here to stay, and straight native will become less and less used, apart from the odd marketing bullet point.
 
I think that CBR and Temporal Injection/upscaling should be discussed separately.
Or at least don't use CBR to cover every non-native rendering technique.
CBR gives certain types of artifacts that some people just can't stand when it breaks down. TI and temporal upscaling will just look like a lower-resolution image.
I think people are sometimes OK with TI and dynamic resolution, but when you say CBR they won't like it, due to the artifacts.

Seems like the industry is moving more towards the TI route than CBR too. Maybe because getting CBR right is a lot harder, especially when it breaks down?

VRS, dynamic res and TI are here to stay, and straight native will become less and less used, apart from the odd marketing bullet point.
But isn't CBR just a form of temporal injection? The good CBR implementations we have seen happen to be good TI implementations. The bad ones causing artifacts are obviously to be ignored.
 
CBR is a specific form of reconstruction using a checkerboard pattern. We have no idea what TI involves as it's never been publicly discussed. It would appear people are using 'CBR' to mean 'reconstruction' because the industry hasn't provided a generic term for this new technique; to the point we have nonsensical naming like nV calling their reconstruction an AA method.
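
For anyone unfamiliar with what the checkerboard pattern actually means in practice, here is a toy sketch (deliberately ignoring the motion vectors, reprojection and ID/stencil tricks real implementations rely on): each frame shades only half the pixels in a 2x2-quad checkerboard, and the missing half is filled from the previous frame.

```python
import numpy as np

# Toy checkerboard-rendering sketch: shade half the 2x2 quads each frame and
# fill the holes from the previous frame. Real CBR adds motion-vector
# reprojection and ID buffers on top; this only shows the pattern itself.
H, W = 8, 8
ys, xs = np.mgrid[0:H, 0:W]
quad_parity = ((ys // 2) + (xs // 2)) % 2          # which 2x2 quad each pixel belongs to

def render_frame(scene, frame_index, history):
    shaded = quad_parity == (frame_index % 2)      # half the quads shaded this frame
    return np.where(shaded, scene, history)        # the rest comes from history

history = np.zeros((H, W))
scene = np.arange(H * W, dtype=float).reshape(H, W)   # stand-in for shading results
for frame_index in range(2):
    history = render_frame(scene, frame_index, history)

# After two frames every pixel has been shaded once, at half the per-frame cost.
assert np.array_equal(history, scene)
print("full image reconstructed from two half-cost frames")
```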
 
Assuming that design is functional and meant to support their cooling arrangement, what would the underlying layout be? There's no room for a large, slow fan.
 
The problem here is that you're looking at games that are mostly 30 Hz. 30 FPS is absolutely horrible for any reconstruction technique, especially for ones that leverage temporal reconstruction/accumulation.

60 Hz is the bare minimum for decent results and even then you may notice artifacts.

The large bump in CPU power in the next gen consoles should make temporal reconstruction/accumulation rendering methods far more feasible as the developers can more easily target 60 or 120 Hz rendering.

Many if not most current console games are likely CPU-limited in such a way as to make 60 Hz rendering not feasible. In other words, even if they are GPU-limited currently, if they reduced the GPU load by using reconstruction, they might still be limited by the CPU in such a way as to prevent 60 Hz rendering.

Regards,
SB

As I stated before, even with improved CBR methods allowing for higher framerates, most developers will put those additional resources towards more and more eye candy, consuming whatever headroom could have gone towards higher framerates (e.g., 60/120fps). The most we'll see is higher framerates in prior or cross-generational titles, but once the next generation of gaming gets into full swing, 30fps will be the bulk of AAA/AAAA gaming... just prettier.

It's not CBR. On Pro the game runs at good old native 1440p. What you see (when motion blur is off) is probably their custom TAA.

And Spiderman doesn't use CBR; it uses temporal injection, which is why it's less sharp than CBR and details are lost.

I stand corrected (see below). So maybe more dynamic-resolution games should be the key (or middle ground), rather than CBR?

https://www.eurogamer.net/articles/digitalfoundry-2018-marvels-spider-man-ps4-tech-analysis
This is coupled with generally excellent image quality. Spider-Man utilises Insomniac's temporal injection technique featured in Ratchet and Clank on PS4 Pro. While there is precious little information available on this technique, Insomniac seems to prefer this over typical checkerboarding. The results are mostly excellent - completely avoiding the stippling that can crop up with checkerboard rendering and producing detail that exceeds raw pixel count. But even here, Insomniac has pushed the boat out by implementing DRS - dynamic resolution scaling. On PS4 Pro, I saw a top-end of 3456x1944, with a 2560x1368 minimum. This lowest value appears to be extremely rare, however, with the average resolution presenting around 1584p instead.
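
On the dynamic resolution point: Insomniac's implementation hasn't been documented publicly, but the general idea behind DRS is a feedback controller that nudges the render resolution to keep GPU frame time under budget, clamped between the minimum and maximum DF measured. A minimal sketch under those assumptions (the gain and the fake workload model are made up for illustration):

```python
# Minimal dynamic-resolution-scaling sketch (generic controller, not
# Insomniac's actual method). Resolutions/budget taken from the DF article;
# the gain and the fake frame-time model are assumptions for illustration.
MAX_H, MIN_H = 1944, 1368      # vertical resolutions reported by Digital Foundry
FRAME_BUDGET_MS = 33.3         # 30 fps target

def update_resolution(current_h, gpu_time_ms, gain=0.25):
    # Scale resolution by how far the last frame landed over/under budget.
    error = FRAME_BUDGET_MS / gpu_time_ms
    new_h = current_h * (error ** gain)
    return int(max(MIN_H, min(MAX_H, new_h)))

def fake_gpu_time_ms(height, full_res_cost_ms):
    # Fake workload: GPU time roughly proportional to pixel count.
    return full_res_cost_ms * (height / MAX_H) ** 2

height = MAX_H
for full_res_cost_ms in (30.0, 38.0, 45.0, 34.0, 28.0):
    gpu_time = fake_gpu_time_ms(height, full_res_cost_ms)
    height = update_resolution(height, gpu_time)
    print(f"scene costing {full_res_cost_ms:4.1f} ms at full res -> next frame at {height}p")
```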
 