Probably, but if the GPU power is there to ramp up and developers want to push the graphics envelope, they'll naturally push it to 30fps.
I mean, look at it this way, focusing specifically on the GPU.
60 FPS @ 12 TF = 30 FPS @ 6 TF. They do very close to the same amount of work if they run at the same resolution. If the PS5 mandated 60fps and had a 12 TF GPU, it couldn't deliver a generational graphical leap over the 1X; you'd barely notice the difference. At 30fps it's a different story.
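To make the arithmetic explicit, here's a minimal sketch in Python, under the simplifying assumption that GPU work scales linearly with framerate at a fixed resolution (real workloads won't scale perfectly, but it's fine for a back-of-envelope comparison):

```python
# Back-of-envelope per-frame GPU budget. Assumption (mine, not a
# hard rule): work scales roughly linearly with framerate at a
# fixed resolution, so TF / fps gives a comparable per-frame budget.

def tflops_per_frame(tflops: float, fps: int) -> float:
    """Compute budget available per frame, in TFLOPs."""
    return tflops / fps

print(tflops_per_frame(12.0, 60))  # 0.2 -> 12 TF at 60fps
print(tflops_per_frame(6.0, 30))   # 0.2 -> 6 TF at 30fps: same budget
```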
Well, that is if Sony uses Xbox One X settings. The other possibility is that they just use PS4 Pro settings, which would probably look like this:
30 FPS @ 4.2 TF = 60 FPS @ ~8.4 TF
And some of the limitations of PS4 Pro games were also a result of the relatively low memory bandwidth. AFAIR this was the reason why Uncharted 4 is "only" 1440p on PS4 Pro. With 8.4 TF, higher bandwidth, and other architectural improvements, I think they could get 4K checkerboarding at 60fps for a game like this.
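As a rough sanity check on that claim: checkerboard rendering shades about half of native 4K's 8,294,400 pixels per frame, so the pixel-rate jump from Uncharted 4's 1440p/30fps looks like this (a simple sketch with my own numbers):

```python
# Pixel throughput (pixels shaded per second) for each scenario.
# Assumption: checkerboard 4K shades roughly half of native 4K per frame.

def pixels_per_second(width: int, height: int, fps: int, cb: bool = False) -> float:
    pixels = width * height * (0.5 if cb else 1.0)
    return pixels * fps

uc4_pro = pixels_per_second(2560, 1440, 30)           # ~110.6M px/s
ps5_cb  = pixels_per_second(3840, 2160, 60, cb=True)  # ~248.8M px/s

print(ps5_cb / uc4_pro)  # 2.25 -> ~2.25x the pixel rate for ~2x the TF,
# so the bandwidth and architecture gains would need to cover the rest.
```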
But obviously there won't be an 8 TF GPU in PS5, probably something more like ~12.6 TF. That means Naughty Dog would have over 4 TF left over for better lighting. And if you think about it, that's still a massive amount of GPU power, basically one whole PS4 Pro. To summarize (rough budget math below the list):
Naughty Dog PS4 / PS4 Pro game:
1080p (2,073,600 pixels) @ 30 fps (PS4)
1440p (3,686,400 pixels) @ 30 fps (PS4 Pro)
Naughty Dog PS5 game:
4K CB (4,147,200 pixels) @ 60fps + 4 TF for improved lighting + 4K textures
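Here's that budget split worked out (a rough sketch; the ~12.6 TF figure and the linear fps scaling are assumptions from the post above):

```python
# Hypothetical PS5 GPU budget split. Assumptions: ~12.6 TF total,
# and per the scaling above, PS4 Pro settings (4.2 TF @ 30fps)
# cost ~8.4 TF at 60fps, with the bump to 4K CB assumed covered
# by bandwidth and architecture gains.

PS5_TF = 12.6
cost_4k_cb_60fps = 4.2 * 2                # ~8.4 TF

headroom = PS5_TF - cost_4k_cb_60fps      # left for lighting etc.
print(f"{headroom:.1f} TF headroom")      # 4.2 TF headroom
```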
This is for a 30fps game. But of course there are already quite a few 60fps games on PS4, like Call of Duty, Wolfenstein, and Star Wars: Battlefront, which are among the most popular games. Those would have an additional ~8 TF for improvements plus 4K textures on PS5. To me this seems like a pretty good generational leap.
And we shouldn't forget that the vast majority of PS4 owners just have a base PS4. Going from a base PS4 to a PS5 will be a massive jump for them, and they don't even need a 4K TV to see it.
I'm really sorry but 30fps is here to stay. Games that want to be 60fps are 60fps.
I don't really agree with this. We now have more 60fps games than during last gen. Titles like FIFA, Resident Evil, Metal Gear Solid, and Battlefield were 30fps last gen and are 60fps this gen. Next-gen consoles will have much more CPU power, which makes 60fps more attractive to devs because fewer compromises are needed.
Also, many of this gen's 30fps games are among the most expensive titles; games like Assassin's Creed or GTA have over 1,000 people working on them. Is it really such a good idea to spend all of next-gen's power purely on better graphics? I think there's an economic incentive to slow that escalation down, and 60fps is an easy way to do it while still delivering a big improvement. Also, if you look at GTA, its multiplayer mode is incredibly popular, and there are rumors that multiplayer is coming back to Assassin's Creed. That's why I think 60fps makes a lot of sense even for games like these.
All these wonderful technologies have been developed to offload work from the CPU to the GPU. If you use them, then you only need a bigger GPU.
Going by that logic, why not just use a 7nm Jaguar for next-gen?