It’s not like how PS4Pro and X1X are designed purposefully to run games at 4K.
You're right or wrong depending on how you define "run games".
If by "run games at 4K" you specifically mean that the internal rendering resolution is 3840*2160, then you're right.
I agree with the devs who say the internal render resolution isn't a very important metric; what counts is the final image being substantially better than a regular 1080p output. From that POV, both the PS4Pro and XBoneX are consoles made for 4K output, and hence purposefully designed to run games at 4K.
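To put rough numbers on that (my own back-of-envelope math, not something anyone here posted): native 4K is 3840*2160 = 8,294,400 pixels per frame, exactly 4x the 1920*1080 = 2,073,600 pixels of 1080p. That's why the PS4Pro and X1X so often render internally somewhere in between (1440p-1800p, or a checkerboarded 2160p that shades roughly half the samples each frame) and then reconstruct or upscale to a 4K output.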
Honestly, if restaurants had a history of false hype and outright bullshitting even remotely close to the tech industry's, I wouldn't even sit on a chair without checking Digital Foundry's analysis of its sturdiness.
When did Sony or Microsoft falsely advertise features, say, in the last 10 years?
Honest question.
I remember features that were eventually dropped because no one cared for them, but atm I can't think of any blatant dishonesty on the order of what's being discussed here (Sony bringing up real-time raytracing every time they talk about the PS5, only for it to see token use because there's no actual hardware acceleration).
Maybe the only exception I do remember is Microsoft claiming cloud processing on XBone.
The issue isn't the presence but the degree. 8K output will be supported. Some games will be 8K. Most won't.
Yeah well, swap 8K for VR on the PS4.
Does that shock people less? VR is supported. Some games will be VR, most won't.
I don't think most games will run at 8K, at least until they release a half-gen refresh, but the capability to do so is there, so that's a feature.
But the PS5 will be an 8K-capable console just as much as the PS4 is a VR-capable console.
And no one is claiming that VR on the PS4 is some kind of dishonest marketing, are they?
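For scale (again my arithmetic, not a figure from the thread): 8K is 7680*4320 = 33,177,600 pixels, 4x the pixel count of 4K and 16x that of 1080p, so rendering natively at 8K costs roughly four 4K frames' worth of shading every frame. Offering an 8K output mode is cheap; actually targeting it in demanding games is not, which is exactly the VR-style "supported, rarely used" situation.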
I guess I’m not sold on the idea that we’re going to see RT 4K 60fps or 8K 30 FPS super fidelity graphics.
Discussing how much raytracing the PS5 / RDNA+ / RDNA2 can do, given a plausible X teraflops of GPU compute, might not be a fruitful conversation.
All we know is how much compute power the first implementation, on the first AIB cards, needs to apply an amount of RT effects that makes a discernible difference.
For all we know, nVidia's second RT implementation for GeForce cards could do 10x more in-game RT effects with the same amount of compute by pushing more of the RT work to fixed-function units and/or using lower precision, and AMD's / Sony's / Microsoft's implementations could be similar in that regard.
I've seen devs claim that Turing uses way more precision than their real-time raytracing implementations actually need, which honestly makes sense, because RT in Turing seems to have been designed for offline rendering, as many have claimed.
Don't expect this for the next-generation PlayStation based on one comment by the Sony CEO about realtime raytracing. The image is from a 40-GPU demo...
I wonder if he meant 40 "cores" (as in CUs) and that somehow got lost in translation.