Can't be worse than the first wave of PS3 and 360 games.
I think games still have a long way to go to get to good lighting. Properly ray-traced scenes look noticeably better than the best traditionally lit scenes.
Simplest answer is that the theory was entirely bullshit.
There are many reasons for this. The majority of console gamers are coming from a 1080p machine to one that will do native or near-native 4K. The increased burden there alone can eat up a lot of the compute gains. On top of that, per-pixel detail is harder to see at 4K, so you're in the land of diminishing returns when it comes to increasing graphical fidelity. Finally, there will be an expectation of increased framerates, which, while part of the graphical experience overall, doesn't increase the detail on a per-frame basis. Fancy bullshots won't see the big gains some might hope for. The increased fidelity of HDR can be hard to convey in screenshots as well.
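To put rough numbers on the resolution point, here's a back-of-the-envelope sketch. The 6x compute figure in it is purely an illustrative assumption, not any real spec, and it ignores bandwidth, fixed per-frame costs, and reconstruction tricks.

```python
# Rough sketch: how much of the GPU budget the 1080p -> 4K jump alone can consume.
# Assumes per-pixel work scales linearly with pixel count, so treat it as illustrative only.

def pixels(width, height):
    return width * height

p_1080 = pixels(1920, 1080)   # ~2.07 million pixels
p_2160 = pixels(3840, 2160)   # ~8.29 million pixels

scale = p_2160 / p_1080       # 4.0x more pixels to shade per frame
print(f"4K has {scale:.1f}x the pixels of 1080p")

# If the new GPU had, say, ~6x the compute of the old one (purely illustrative number),
# only ~1.5x would be left for richer per-pixel detail once native 4K is paid for.
compute_gain = 6.0
print(f"Budget left for extra detail: ~{compute_gain / scale:.1f}x")
```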
The guy that created this...
It's an indie with no real basis for his remark (whether true or not), coupled with a second link repeating the same story. I'm not sure why you linked them both. Does posting two links to the same quote add weight to it? Must it be true if the whole internet is parroting the same one line? I think they mean that jumps in graphics between generations aren't what they used to be. You don't have to be a developer to reason that.
Even if we get offline render quality during next gen, the jump from Pac-Man to Uncharted just feels larger in comparison. But that's the only 'wall' there is. We just don't have the tech jumps like before; it's all rather logical.
RT isn't going to change graphics that much.
Why are you reading Sony into it? Of course we're facing diminishing returns, but we don't know how much, or what the next-gen machines will be capable of. Do you really expect Sony devs to say 'we can't deliver the jumps we have had in the past'? There's no attack on PlayStation here; the same problem exists on Xbox, or let's say on all platforms, even with a slight difference in TF numbers. RT and the SSD are the new things now, maybe some CPU-related tasks too. RT isn't going to change graphics that much.
Why are you reading Sony into it?
Yup, MS decided to put in a 12TF GPU just for giggles.
I always maintained that 9+ TF is not possible without going well above what consoles have pushed in the past in terms of TDP and form factor. The fact that Arden has 56 CUs leads me to think it is in fact 12 TF, as anything less (say 9-10 TF) on that many CUs would mean clocking well below Navi's sweet spot, and in that case you are simply wasting silicon. I still see no official information saying xbsx is 12 TF.
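For reference, here's the CU/clock arithmetic behind that, as a rough sketch using the standard RDNA figure of 64 shaders per CU and 2 FLOPs per clock (FMA). The 56-CU count is the Arden rumour being discussed, not an official spec.

```python
# Back-of-the-envelope RDNA FLOPs math (a sketch, not official specs):
# FP32 TFLOPs = CUs * 64 shaders/CU * 2 FLOPs per clock (FMA) * clock in GHz / 1000

CUS = 56  # the rumoured Arden CU count discussed above, not a confirmed figure

def clock_for_tflops(tf, cus=CUS):
    """Clock (GHz) needed to hit a given FP32 TFLOPs number on `cus` CUs."""
    return tf * 1000 / (cus * 64 * 2)

for tf in (9, 10, 12):
    print(f"{tf} TF on {CUS} CUs -> ~{clock_for_tflops(tf):.2f} GHz")
# 9 TF -> ~1.26 GHz, 10 TF -> ~1.40 GHz, 12 TF -> ~1.67 GHz
```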
The latest devkit photo makes me question the theory that it's significantly above 200W.