PlayStation 5 [PS5] [Release November 12, 2020]

Yup. Every new combination/option requires more testing. On PC the game developer has no knowledge of or control over the hardware and OS, so performance is just expected to be at the whim of whatever settings the user chooses. But on a console, if you're targeting multiple resolutions, frame rates, HDR and RT, that's a bunch more testing developers need to do if they want to ensure consistent performance - which they do.
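Purely as an illustration of how that test matrix grows (the option names below are made up, not any real game's settings menu):

```python
from itertools import product

# Hypothetical console QA matrix: every display mode a game exposes has to be
# verified for consistent performance on its own.
resolutions = ["1080p", "1440p", "4K"]
frame_targets = ["30fps", "60fps", "120fps"]
hdr_modes = ["SDR", "HDR"]
ray_tracing = ["RT off", "RT on"]

combos = list(product(resolutions, frame_targets, hdr_modes, ray_tracing))
print(len(combos))  # 3 * 3 * 2 * 2 = 36 configurations to test
for combo in combos[:3]:
    print(combo)    # e.g. ('1080p', '30fps', 'SDR', 'RT off')
```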

Yeah, I look at the console side as a curation of "select" (LOL) settings for each device. It probably would be easier to test than PC, if it weren't for the consoles' certification programs, which are notorious for their stringency.
 
Yes, but I'd say with those CPUs it will be much easier for devs to reach the 60fps target. I think we can already see this, as some exclusive games are already targeting 60fps: Demon's Souls, Ratchet & Clank. And the previous exclusive Ratchet & Clank game was only 30fps on PS4.


With both Spider-Man games and Ratchet & Clank being 60fps on PS5, I don't see how Insomniac could even think of shipping their next game with only a 30fps mode. Same for all future Bluepoint games. And now with the advent of 120fps gaming on consoles (which I had predicted), it will become harder and harder to go back to 30fps. Sony has even indirectly acknowledged the threat of PC gaming.

You can always do more with 30fps than with 60 - not just in terms of graphics or visuals but actual game design. You're getting back roughly twice the frame time for the CPU and GPU, after all, depending on the workload.
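Back-of-the-envelope arithmetic behind that "twice the resource time" point (just frame-time math, nothing engine-specific):

```python
# Per-frame CPU/GPU time budget at each target framerate.
for fps in (30, 60, 120):
    print(f"{fps} fps -> {1000.0 / fps:.2f} ms per frame")

# 30 fps -> 33.33 ms, 60 fps -> 16.67 ms, 120 fps -> 8.33 ms:
# dropping a 60fps target to 30fps roughly doubles the time available each frame.
```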

I would not expect 60fps in Rockstar's GTA 6, for example.

Without a doubt there will be plenty of devs who push their engines and tech to the limit next gen, and that will necessitate 30fps caps.

And that is how consoles have always operated in service of getting the most out of the hardware.

Like I said, next-gen game design requires sacrifices from finite hardware resources regardless of how powerful the consoles are. PC is the ever-present option.

Otherwise there should be zero complaints about game design not changing from how it is now, because all of the CPU power gained from the new processors - power that could be used for more interactivity, higher NPC counts, smarter AI, denser worlds, etc. - gets eaten up by higher framerates.
 
Still doesn't explain the relevance: PS5 has its own custom controller and the M.2 SSDs used by the PS5 won't need another one.

Of course all M.2 SSDs need their own controller. The NVMe spec assumes there's a controller in the card. How else would several NAND chips send data through the PCIe bus?


These new controllers were mentioned because they support 7GB/s read speeds using only 8 channels, meaning there will soon be plenty of competition for M.2 SSDs that are supported by the PS5.
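Rough arithmetic on that figure, assuming read bandwidth is spread evenly across the channels (a simplification):

```python
# 7 GB/s sequential read spread over 8 NAND channels.
per_channel = 7.0 / 8 * 1000                    # MB/s
print(f"~{per_channel:.0f} MB/s per channel")   # ~875 MB/s each

# For comparison, the PS5's internal drive does 5.5 GB/s over 12 channels,
# i.e. roughly 458 MB/s per channel.
print(f"~{5.5 / 12 * 1000:.0f} MB/s per channel on the internal drive")
```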
 
Yes, you're right. I thought it would have been possible for the memory on the M.2 to use another controller.
 
The PCIe philosophy is centred on delegation of I/O, so every PCIe device (incl. M.2) has its own controller. It communicates with the host platform ("hi, I'm a solid-state-drive class device, I have x channels and y capacity", etc.) and, importantly, manages the actual use of the solid state cells. Any given array can be configured in various arrangements to hit a good balance of reliability/performance - you can take two identical arrays of cells and get very different performance depending on the quality of the controller.
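A minimal sketch of what that "hi, I'm an SSD" self-description looks like from the host side, assuming a Linux box that exposes the controller's identify data through sysfs (the real NVMe Identify exchange happens between the drive's controller and the host driver):

```python
from pathlib import Path

# What the drive's own controller reported to the host at enumeration time.
# Assumes an NVMe drive enumerated as nvme0 / nvme0n1.
ctrl = Path("/sys/class/nvme/nvme0")
for attr in ("model", "firmware_rev", "serial"):
    f = ctrl / attr
    if f.exists():
        print(attr, "=", f.read_text().strip())

size_file = Path("/sys/block/nvme0n1/size")     # reported in 512-byte sectors
if size_file.exists():
    print("capacity ~", int(size_file.read_text()) * 512 / 1e9, "GB")
```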
 
Christ, if I thought the 60fps-or-bust crowd were insufferable, the 120fps-or-bust crowd are going to put them in the halfpenny place, aren't they?
Yeah, I'm curious to see how many people are using monitors for their consoles (hence the move towards 60/120fps) versus the number of folks using TVs.
TVs are quite behind on supporting some of these features, and they are very costly. But if they're finding that a majority of players are enjoying monitors in the 1080p/1440p @ 120+Hz range, I can see why there was a sudden move to support those players, as that's the cheapest way to play compared with the large 4K OLED screens.
 
Generally 30fps on TV is fairly acceptable. 30fps on monitor feels really sluggish for whatever reason, it's just much more noticeable.

Very true, maybe also because of the controller and the larger distance to the TV. Playing 30fps games with a mouse does something too.
Though I never like 30fps, probably because as a high-end PC gamer I'm used to more. Even Spider-Man feels jarring when turning around, though strangely enough not so much in forward motion. Hard to explain. Or when watching someone else play at a distance.

Many were against 60fps or higher before, but now that consoles support it more widely it's got a very warm welcome. Next up, 144Hz as standard :D
 
The reason people seem to support it is because they think consoles are PCs, where you can just add hardware to create the overhead that allows for 60fps gaming.

But of course you can't, and the iterative machines were an aberration. The new machines are just using the overhead they have from the cross-gen period and from games created for the Jaguar consoles and earlier.

Now that engines will move back to being designed as CPU-limited affairs, since they no longer have to account for Jaguar, simply increasing GPU power or lowering the resolution won't bring the same dividends as this gen.

It'll be interesting to see how things go.

All I know is that if people start complaining about games not being next-gen enough, there is clearly something to point to and say they got what they asked for.

All I personally want is for games to be designed from the start with one framerate in mind and to take full advantage of it. Otherwise the arbitrary need for options will limit devs on consoles, which, unlike PC, aren't built around that kind of option. The CPU will always have to account for that 60fps (or 120fps) target whether you're using it or not, making any 30fps option almost pointless outside of the resolution boost.

On PC that may be fine, since that's how console ports work on PC, but on console, where these games are created from the design stage, it's a huge waste of resources that could go into the foundations of the game and instead gets eaten up just to have a silly resolution-boost setting and nothing else.

In an era where resolutions will always be fairly high and reconstruction and TAA methods already clean up the image very well, arbitrary options destroying game-design advancement just to exist is something I actively hate.
 

Monitors tend to have higher refresh rates, and they're built differently from HDTVs, so I'd figure those are the big contributing factors to 30fps feeling more sluggish on a monitor. Monitors have been doing 120Hz, 240Hz, etc. for years now; 4K televisions are only just starting to get into that area.

Even then, things like color balance differ between the two, and there's more pixel density on a monitor, so even at the same framerate a 30fps game probably looks more "sluggish" on a monitor due to the higher pixel density over the surface area. Whatever blurring features televisions have built in are also absent from the vast majority of monitors. And I wouldn't be surprised if televisions being more prone to burn-in inadvertently benefits 30fps games, since those games tend to apply a lot of motion blur to smooth animations (certain developers in particular excel at this).

Also, from my experience monitors seem to give better color saturation and sharpness; maybe 30fps games on a television benefit from the slightly less saturated pop of colors, since you aren't distinguishing color segmentation as sharply as you would on a monitor? Combined with the lower pixel density, the slightly higher proneness to burn-in and the other stuff, I can see why 30fps is generally the target for console games that aren't competitive-driven.
 
I think in the general case, sure, I would say that monitors are likely higher quality than most TVs. When you can buy a 30"-40" TV at the same price as, or less than, monitors in the 21" range... there is something to be said for that.

I think once you enter the high-end line of TVs, though, the quality is definitely in favour of TVs for everything except refresh rate and gaming features.
 
Ehh, I would probably only half agree with the point above about designing around a single framerate. It's very unlikely that you can load up a CPU so much that a game has to be designed around 30fps. I just don't see that happening this coming generation. The move to GPU-driven dispatch ensures that, as the generation goes on, more of the work the CPU would traditionally do will be done on the GPU, because it can be done there without having to go back to the CPU and wait for the next command.

You're actually going to see a lot of draw calls, render work, sorting, culling, etc. moved to the GPU, really freeing up the CPU, and that's a feature that needed to be implemented across the board for everyone. And that will, oddly, be more efficient for rendering despite being offloaded to the GPU.

That leaves the CPU in an interesting place. You'd probably design your game around the number of interactions that can happen in a single moment, with a large number of actors/effects going off all at once. But because that isn't happening all the time in the game itself, for the most part the CPU is waiting for those moments when there are 50 million things happening on screen at once, and then it crunches them. Aside from that, by decoupling the CPU from the renderer, the CPU will have significantly less work to do.

Frame rate will be determined largely by the GPU bottleneck.
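For a concrete picture of the kind of per-object work being moved off the CPU, here's a toy CPU-side visibility cull in the traditional style (standard sphere-vs-plane math, not any particular engine's code); in a GPU-driven pipeline this loop runs in a compute shader and the CPU just submits the whole object list once:

```python
from dataclasses import dataclass

@dataclass
class Plane:
    # Frustum plane n·x + d = 0, normal pointing into the visible volume.
    nx: float
    ny: float
    nz: float
    d: float

@dataclass
class Sphere:
    # Bounding sphere of one renderable object.
    x: float
    y: float
    z: float
    r: float

def visible(s: Sphere, frustum: list[Plane]) -> bool:
    # Cull the object only if it lies entirely behind some frustum plane.
    return all(p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d >= -s.r for p in frustum)

# Traditional path: the CPU tests every object each frame and issues a draw
# call per survivor. GPU-driven rendering moves this test (plus sorting and
# LOD selection) into a compute pass, leaving the CPU with one indirect submit.
frustum = [Plane(0.0, 0.0, 1.0, 10.0)]          # single plane, toy example
objects = [Sphere(0.0, 0.0, 5.0, 1.0), Sphere(0.0, 0.0, -20.0, 1.0)]
draw_list = [o for o in objects if visible(o, frustum)]
print(f"{len(draw_list)} of {len(objects)} objects drawn")
```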
 
Like I said, games being designed as GPU-limited only occurred this gen because of Jaguar. It was a necessity to get games working at all.

Devs wanting to push their games on much more powerful hardware will still always want for resources. Saying "it's unlikely the CPUs will be pushed hard" is strange to me, considering that resource demands always intensify as games become more complex, no matter the generation.

Simply put enough NPCs in a crowded area and of course the hardware will get a workout. I don't see how one argues that stops being the case this gen when it has never stopped being the case before.
 