Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

I love the fact they're targeting 30fps, as appears to be the case here. Pushing all that scope, detail and destruction would surely require all the power you have. Raytraced reflections aren't needed, though, if the game is stuck at 1080p; it would take at least 1440p to clearly discern distant objects and thin alphas, otherwise there's no point pushing all those details if the output is a blurry mess.
Also, it seems the PS5 is more and more likely to be either 10.24 or 11.26 RDNA TF, depending on whether you assume 40 or 44 CUs, leaning towards the latter according to the same person who informed on that next-gen game.
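For reference, those TF figures fall out of simple back-of-the-envelope arithmetic (a rough sketch assuming 64 FP32 ALUs per CU, 2 FLOPs per ALU per clock, and the ~2.0GHz clock those rumored numbers imply; none of that is confirmed spec):

# FP32 throughput estimate: CUs x 64 ALUs x 2 FLOPs/clock x clock (GHz) -> TF
def teraflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

print(teraflops(40, 2.0))  # 10.24 TF
print(teraflops(44, 2.0))  # ~11.26 TF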
 
My conclusion is that next-gen consoles are too weak to handle 4K, ray tracing and 60fps at the same time.
David Cage has already said so.

Cage admitted that new technology does have a lot of strengths such as “a significant improvement in CPU power, which will mean significant improvements in physics and AI. GPU improvement should be enough to get Ray Tracing in Full HD or Full 4K resolution (without Ray Tracing).” Despite how great these features sound, David Cage was sure to state that “All in all, we believe that there will be serious improvements in next-gen games, but maybe not the ones that are currently promoted the most.”
 
I love the fact they're targeting 30fps, as appears to be the case here. Pushing all that scope, detail and destruction would surely require all the power you have. Raytraced reflections aren't needed, though, if the game is stuck at 1080p; it would take at least 1440p to clearly discern distant objects and thin alphas, otherwise there's no point pushing all those details if the output is a blurry mess.
Also, it seems the PS5 is more and more likely to be either 10.24 or 11.26 RDNA TF, depending on whether you assume 40 or 44 CUs, leaning towards the latter according to the same person who informed on that next-gen game.

What do resolution and thin alphas have to do with RT reflections? A mirror that can properly show dynamic objects behind the camera vs. a static cubemap is a transformative difference at any resolution, even at 480p, let alone the "blurry mess" of Full HD. Ughhh...
 
From the same guy again :).
So here is the deal.

I specifically asked about general teraflop performance for Scarlett and PS5.

He said, "from what I know, both final consoles should have double-digit TF."

...keep in mind, this conversation was in late June, after AMD had already announced that the 5700 and 5700 XT were shipping July 7th...

So then I specifically asked, "well, that would mean greater performance than the new AMD Navi GPUs, right?"

...he nodded his head yes!
One of them is 11.26 RDNA TF alright!
 
What do resolution and thin alphas have to do with RT reflections? A mirror that can properly show dynamic objects behind the camera vs. a static cubemap is a transformative difference at any resolution, even at 480p, let alone the "blurry mess" of Full HD. Ughhh...
If the power used for RT reflections is a big trade-off against resolution, then you compromise clarity. Everything is a compromise in a console environment, but interestingly enough I can see the mid-gen consoles being the perfect fit for RT, so there's hope.
 
4k is the biggest waste of pixels, especially at 30Hz.

A 40 degree viewing angle on an 85" tv is a seating distance of approximately 102". If you made a 2560x1440 display that was 85", you'd have to sit closer than 100" away to start to see the shape of the pixels. 4K just covers the situations where people sit abnormally close to their tvs, or have exceptional vision. It generally is not worth the rendering budget.

It's much smarter to render at lower resolutions and use something like temporal accumulation to a 4k output. Accumulation strategies work better at 60+ Hz. 60Hz has the byproduct of lowering motion blur massively. You can render 4k30, but it looks like garbage in motion and destroys all of that pixel detail anyway.

The new consoles sound like they're well positioned for 60Hz this gen, because the CPUs won't cripple performance. The smart play, especially with RT, is to run at 60Hz to take advantage of temporal information.
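As a rough sanity check on those viewing-distance numbers, here's a quick back-of-the-envelope calculation (a sketch using basic trigonometry and the common ~1 arcminute 20/20 acuity rule of thumb, not a rigorous model of vision):

import math

# 85" 16:9 panel: convert the diagonal to width
diag = 85.0
width = diag * 16 / math.hypot(16, 9)            # ~74.1"

# Seating distance at which the screen spans a 40-degree horizontal field of view
dist = (width / 2) / math.tan(math.radians(20))
print(round(dist, 1))                            # ~101.8", i.e. roughly 102"

# Angular size of one pixel of a hypothetical 85" 2560x1440 panel at that distance
pixel_pitch = width / 2560
arcmin = math.degrees(math.atan(pixel_pitch / dist)) * 60
print(round(arcmin, 2))                          # ~0.98 arcmin, right at the 20/20 acuity limit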
 
So you "really" believe Sony will try to market a 250-300W console as a worldwide mainstream product?:)

Let's not forget 5700XT TDP is 225W at 1905Mhz and you guys expect 2Ghz which is vastly outside the sweet spot of that design.

Unless EUV or AMD has some unexpected TDP improvements that makes no sense to me.
 
Let's not forget the 5700 XT's TDP is 225W at 1905MHz, and you guys expect 2GHz, which is vastly outside the sweet spot of that design.

Unless EUV or AMD has some unexpected TDP improvements, that makes no sense to me.

The 5700 XT's TDP is a data point and a snapshot of where 7nm and the current RDNA design are, not completely indicative of where the consoles will be next year. Do I expect a console in the 200+ W range? No, but one in the 150-200W range is possible (the X1X and PS3 were two such consoles).

I personally don't expect many features of RDNA2 to make it to the consoles, maybe some minimal RT. I think the majority of the work done beyond what has shipped in Zen 2 and Navi will be focused on tight integration, clock improvements, and power efficiency.

As with software for consoles vs. PCs, you can squeeze a lot out of a hardware design by targeting one spec/product. I think getting another 25-30% in efficiency gains is certainly possible, especially if they couple that with a more mature 7nm process.

And since this is the baseless thread, I think the rumors of HBM may actually hold some water due to the lower power consumption of HBM over GDDR6. That's probably another way to recover 10-20W.
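To put some purely illustrative numbers on that (a sketch using the 225W 5700 XT board power, my assumed 25-30% efficiency gain, the 10-20W HBM saving mentioned above, and a guessed CPU/platform share; none of these are leaked figures):

# Illustrative next-gen console power budget, not a leak
gpu_board_power = 225.0                  # W, stock 5700 XT total board power
hbm_saving = 15.0                        # assumed midpoint of a 10-20W GDDR6 -> HBM saving
cpu_and_platform = 45.0                  # assumed Zen 2 CPU portion + IO/storage/cooling
for efficiency_gain in (0.25, 0.30):
    total = gpu_board_power * (1 - efficiency_gain) - hbm_saving + cpu_and_platform
    print(round(total))                  # ~199W and ~188W: inside a 150-200W envelope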
 
I still believe native 4K is a big waste of resources. I doubt the big AAA games pushing tech will be native 4K, except maybe in the launch window.

If all CBR methods were equal (great), then I could agree with such a statement. But they aren’t.

Red Dead Redemption 2 on my X looks miles better (clarity and sampling wise) than my Pro edition. There is so much image noise, blurriness and murkiness dealing with hair, fur and shrubberies when compared to X’s pristine native 4K edition. And there are also certain issues with high frequency details (e.g., stitching, laces, etc.) being washed out with Rockstar’s awful CBR method. It’s complete trash when compared to other impeccable CBR methods such as Spiderman, GOW, and Detroit: Become Human.

So there are plenty of reasons for developers to use native 4K. Xbox One X games like Forza 4, RDR2, RE2, GoW 4 and plenty of others prove so – even in a 6TF package.
 
4k is the biggest waste of pixels, especially at 30Hz.

A 40 degree viewing angle on an 85" tv is a seating distance of approximately 102". If you made a 2560x1440 display that was 85", you'd have to sit closer than 100" away to start to see the shape of the pixels. 4K just covers the situations where people sit abnormally close to their tvs, or have exceptional vision.

Sorry, but pretty much every time people post that stuff, all I can think is that probably none of you guys have ever properly tested those kinds of sizes, distances and resolutions. I sit about 3 meters away from my 75" TV and I greatly appreciate native 4K resolution. I like to think my eyes are pretty good, but I don't think they are exceptional or anything like that. Those viewing distance charts are just BS, or don't translate well to real moving images. 4K brings out a ton more detail much sooner than your examples suggest.

I just started playing Arkham Knight on PS4 Pro, but apparently it does not have a Pro patch, and 1080p with jaggies produces absolutely terrible image quality in that game, while the assets themselves are pretty good. I need to play that on PC. 1440p is good, but I'm not a huge fan of the checkerboarding etc. methods. They seem to fall apart in many situations.
 
Help, I'm trapped in the wayback machine...

Let developers determine how best to use the given resources to make the game they want to make. Then you can vote with your wallet.
 
It’s complete trash when compared to other impeccable CBR methods such as Spiderman, GOW, and Detroit: Become Human.
The smart move would be to advance the art, with Sony et al sharing best practices and next-gen engines being developed from the ground up with reconstruction in mind. It's far and away the better solution than brute-forcing higher resolutions.
 
Sorry, but pretty much every time people post that stuff, all I can think is that probably none of you guys have ever properly tested those kinds of sizes, distances and resolutions. I sit about 3 meters away from my 75" TV and I greatly appreciate native 4K resolution. I like to think my eyes are pretty good, but I don't think they are exceptional or anything like that. Those viewing distance charts are just BS, or don't translate well to real moving images. 4K brings out a ton more detail much sooner than your examples suggest.

I just started playing Arkham Knight on PS4 Pro, but apparently it does not have a Pro patch, and 1080p with jaggies produces absolutely terrible image quality in that game, while the assets themselves are pretty good. I need to play that on PC. 1440p is good, but I'm not a huge fan of the checkerboarding etc. methods. They seem to fall apart in many situations.

Nowhere did I claim that a 1080p output is comparable to 4K in any way. You can't find a 1440p TV in that size anywhere; the best you can do is play a game with 1440p output that's scaled. Whether you can see the difference or not is going to depend on your eyesight, but if you're 20/20 or less you won't be able to see the difference. It still doesn't change the fact that some lower rendering resolution accumulating to a 4K output is a better choice than native 4K rendering.
 
Nowhere did I claim that a 1080p output is comparable to 4K in any way. You can't find a 1440p TV in that size anywhere; the best you can do is play a game with 1440p output that's scaled. Whether you can see the difference or not is going to depend on your eyesight, but if you're 20/20 or less you won't be able to see the difference. It still doesn't change the fact that some lower rendering resolution accumulating to a 4K output is a better choice than native 4K rendering.

The second paragraph was just something related to the topic. Regardless, there are situations where none of those non-native methods provide results as good, and where native higher resolution has large benefits, and that is pretty much any scene with tons of detail, close or far. 1440p simply won't be able to resolve the same amount of detail as 4K; looking at things like thick bushes and trees with leaves gives that away instantly.

Personally I'd be OK if games were native 4K 60fps and took a hit elsewhere. I'm not holding my breath for that to happen, but I'd make that trade-off... More often than not, the higher detail settings are the ones that bring nearly nothing to the table yet halve the frame rate, whereas 4K brings pristine image quality across the screen, reduces jaggies and is just pleasing to the eyes in general.

I hope next-gen consoles take the quality/resolution user-choice settings that the mid-gen consoles introduced even further...
 
1440p simply won't be able to resolve the same amount of detail as 4K; looking at things like thick bushes and trees with leaves gives that away instantly.
In static images, no, but lower resolutions mean higher framerates with better temporal resolution and better reconstruction. Would you pick 4K30 or FauxK60? Or 4K60 versus FauxK120? 120 Hz upscaled should have unnoticeable jittering and effectively the same spatial resolution in 1/60th of a second as 60 Hz, but with the added benefit of super smooth motion. The artefacts will be greatly diminished in the same way high-bitrate movies and JPEGs and audio are indistinguishable from the raw originals. Even then, do you really want 4K30 with pristine pixels and large judder on camera movement, or slightly softer pixels and significantly smoother motion? I know which I'd prefer!
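To illustrate the accumulation argument, here's a toy sketch (my own illustration, not any shipped reconstruction technique): a static scene rendered at half resolution, with the sample grid jittered by one pixel on alternating frames, rebuilds the full-resolution image after two frames, which at 120 Hz means within 1/60 s:

import numpy as np

scene = np.arange(16, dtype=float)          # ground-truth full-res pixel values (1-D for simplicity)
history = np.zeros_like(scene)              # full-res accumulation buffer
covered = np.zeros(16, dtype=bool)

for frame in range(2):                      # two consecutive jittered half-res frames
    offset = frame % 2                      # sub-pixel jitter: even pixels, then odd pixels
    history[offset::2] = scene[offset::2]   # simplest accumulation: keep the newest sample
    covered[offset::2] = True

print(np.array_equal(history, scene), covered.all())   # True True

Motion is where it gets harder: reprojection error and disocclusions are what produce the artefacts, and a higher frame rate means each stale sample is both less wrong and replaced sooner.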
 
The smart move would be to advance the art, with Sony et al sharing best practices and next-gen engines being developed from the ground up with reconstruction in mind. It's far and away the better solution than brute-forcing higher resolutions.

I agree in practice… but I'm not sure CBR will be effective in all cases. In most cases CBR can exacerbate aliasing issues (particularly with smaller object edges, like blades of grass) and add unwanted ghosting/motion blur not intended by the developers.

Example: Uncharted 4: A Thief's End is a visually beautiful game with a nice CBR method in place. However, the CBR method creates too much additional blur (even when you disable motion blur within the game's settings), to the point that I'm unable to finish the game because of the lagging blur created during image reconstruction. It makes me quite nauseated. And there are games like Spiderman and Detroit: Become Human that have the best CBR methods among Sony's first-party games, IMHO. However, both of those games (more so Spiderman) have issues with far-off objects whose fine texture details get butchered or even lost during the reconstruction phase. It's not overtly bad, but once you recognize certain issues, you know there is much room for improvement.

Don't get me wrong, I'm not against CBR – I'm a big supporter of it. But if the developers' goal is to present pristine native 4K imagery, then I'm totally for that too.
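As a toy illustration of where that ghosting comes from (again just a sketch, not how any shipped CBR actually works): if the missing half of the pixels is filled from the previous output without motion-vector reprojection, a moving object leaves stale samples behind it.

import numpy as np

width, frames = 16, 4
output = np.zeros(width)                   # reconstructed full-res output, reused as history

for frame in range(frames):
    scene = np.zeros(width)
    scene[frame] = 1.0                     # a bright dot that moves one pixel per frame
    offset = frame % 2
    output[offset::2] = scene[offset::2]   # freshly rendered half of the pixels
    # the other half keeps stale values from earlier frames -> trailing ghost

print(np.nonzero(output)[0])               # [2 3]: the dot is really at 3, pixel 2 is the ghost

Good reconstruction uses motion vectors and history rejection to clean that trail up, which is exactly the part that varies so much between the CBR implementations mentioned above.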
 
Guru3D measured about 238W on 5700XT Strix OC at 1950MHz
Still too high, but less than you expect.

How can it be "less" given the extra power requirements of a console with a CPU, more memory, IO, HDD, an optical drive and higher cooling requirements? A 2GHz 5700 XT would be 250W with its memory + fan at 7nm.
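FWIW, that 250W guess is roughly consistent with simple dynamic power scaling (a sketch assuming power scales with frequency times voltage squared, plus a small assumed voltage bump for 2GHz; the stock voltage figure is my assumption and the real V/f curve depends on the silicon):

# Rough dynamic power scaling: P ~ f * V^2, everything else held constant
base_power, base_clock, base_volt = 225.0, 1.905, 1.20   # stock 5700 XT board power/clock; voltage assumed
new_clock, new_volt = 2.0, 1.25                          # assumed settings needed for 2GHz
scaled = base_power * (new_clock / base_clock) * (new_volt / base_volt) ** 2
print(round(scaled))                                      # ~256W, in the same ballpark as 250W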
 