What was the case with PS4?
> The PlayStation 4 ended up where games could access 4.5 GB of memory, meaning 3.5 GB used for the OS. The same as the PS5 currently uses.

Hmm... looks like we are staying with that then.
> The PlayStation 4 ended up where games could access 4.5 GB of memory, meaning 3.5 GB used for the OS. The same as the PS5 currently uses.

5 GB. At first it was 4.5 GB plus 0.5 GB of flexible memory (still 5 GB of game memory in total), but they eventually updated it and gave devs the full 5 GB without the flexible pool.
> Or possibly games driven by development time restrictions, with least common denominator kicking in, and many are still cross-gen, so it hasn't amounted to much.

Yeah, actually, this is much more likely. It makes you wonder what devs do with any surplus RAM on a particular platform.
> Didn't Microsoft increase the dashboard from 1080p to 4K recently? I'd expect that to have increased the allocation even if it wasn't by a full gigabyte.

Yes, increased dashboard resolution.
One of the reasons given for Xbox One X sticking with a 1080p dashboard was that games would have 9GB available rather than 8GB.
> Yes, increased dashboard resolution.

That's a good point. But if you increase your allocation in one place then you still need to reduce it in another.
No, it didn't change the reservation.
Don't ever expect resources to be taken from games once the console is released.
> But if you increase your allocation in one place then you still need to reduce it in another.

As Jay has already posted, the Xbox system reservations remain unchanged; games continue to have 13.5 GB on Series X and 8 GB on Series S.
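Putting the figures mentioned in this thread side by side, here is a quick sketch. The PS5 row is an assumption derived from the ~3.5 GB OS figure quoted above and 16 GB of total RAM; none of these are official breakdowns.

```python
# Game vs. OS memory split using the figures mentioned in this thread.
# The PS5 row is an assumption (16 GB total minus a ~3.5 GB OS reservation);
# none of these are official breakdowns.
consoles = {
    # name: (total RAM in GB, RAM available to games in GB)
    "PS4":      (8.0,  5.0),   # 4.5 GB + 0.5 GB flexible, later unified to 5 GB
    "PS5":      (16.0, 12.5),  # assumes ~3.5 GB OS reservation
    "Series X": (16.0, 13.5),
    "Series S": (10.0, 8.0),
}

for name, (total, game) in consoles.items():
    os_reserve = total - game
    print(f"{name:>8}: {game:4.1f} GB games / {os_reserve:.1f} GB OS "
          f"({os_reserve / total:.0%} of total)")
```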
> That's a good point. But if you increase your allocation in one place then you still need to reduce it in another.

They probably leave some unused reservation as a spare for potential future features.
> Didn't Microsoft increase the dashboard from 1080p to 4K recently? I'd expect that to have increased the allocation even if it wasn't by a full gigabyte.

Yes, they increased the res, but that is only the dashboard. You could increase the resolution of your OS on PC, for example, without needing hundreds of megabytes. There was also a time when cards had just 64 MB (or even less) of memory and you could still run your CRT at 1600x1200. All of that was possible with a minimal OS footprint. The memory the consoles use is more for other features like live recording, etc., and that consumes large amounts of memory.
> Yes, they increased the res, but that is only the dashboard. You could increase the resolution of your OS on PC, for example, without needing hundreds of megabytes. There was also a time when cards had just 64 MB (or even less) of memory and you could still run your CRT at 1600x1200. All of that was possible with a minimal OS footprint. The memory the consoles use is more for other features like live recording, etc., and that consumes large amounts of memory.

Fair points, though I'll note that Windows didn't have a 3D desktop compositor until Vista. I don't want to put my foot in my mouth, but IIRC those didn't become truly viable until GPUs were hitting 128MB+. And unless my math is off, double-buffered 4K at 32 bits per pixel with a 32-bit Z-buffer comes to roughly 95 MiB without MSAA (up from ~24 MiB at 1080p).
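As a sanity check on that arithmetic, a minimal sketch assuming two 32-bit colour buffers (double-buffered) plus a single 32-bit depth buffer, no MSAA, and ignoring the alignment and compression metadata real GPUs add:

```python
# Minimal sanity check of the buffer math above. Assumes 4 bytes per pixel
# for both colour and depth, two colour buffers (double-buffered) plus one
# depth buffer, no MSAA, and ignores alignment/compression metadata.
def framebuffer_mib(width, height, colour_buffers=2, depth_buffers=1, bpp=4):
    return width * height * bpp * (colour_buffers + depth_buffers) / 2**20

for label, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(f"{label}: {framebuffer_mib(w, h):.1f} MiB")
# 1080p: 23.7 MiB
# 4K: 94.9 MiB
```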
Also, OS assets could be quickly loaded from the SSD and unloaded again.
I already wrote that it is unbelievable how many resources the OS reserves for itself (on the PS5 the memory footprint is even bigger). But I guess that is the cost of all the services and security features of those consoles. At least, proportional to the available memory, the OS footprint shrank a bit with the current gen. But I guess that with the next gen, and maybe some new fancy features (that anyone actually uses), the footprint will rise again (measured in MB, not in %).
> Can't Sony or Microsoft render their OS UI at a 720p base resolution and use FSR 2 or FSR 3 to upscale the OS UI to 4K? Wouldn't that reduce the OS UI memory footprint, so more memory could be allocated to games? Every little bit of memory matters when it comes to consoles.

Upscaling techs don't decrease memory consumption. You still need the framebuffer for the full 4K output signal.
> Upscaling techs don't decrease memory consumption. You still need the framebuffer for the full 4K output signal.

You only need the final framebuffer to be native res. Everything else making up the 'back buffer' (the G-buffer in deferred rendering) can be at reduced res. Significant savings are possible.
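To put rough, purely illustrative numbers on that: assuming a deferred-style G-buffer of four 32-bit render targets plus a 32-bit depth buffer (a made-up layout, not any particular engine's), rendering the internal buffers at 1440p and keeping only the final output buffer at native 2160p saves a fair chunk:

```python
# Illustrative only: compare a hypothetical deferred-style G-buffer laid out
# at native 4K versus at a reduced internal resolution, with just the final
# output buffer kept at native 2160p. Assumes four 32-bit render targets
# plus a 32-bit depth buffer.
def gbuffer_mib(width, height, render_targets=4, bpp=4):
    return width * height * bpp * (render_targets + 1) / 2**20  # +1 for depth

native_4k = gbuffer_mib(3840, 2160)     # everything at 2160p
internal  = gbuffer_mib(2560, 1440)     # G-buffer at 1440p
final_4k  = 3840 * 2160 * 4 / 2**20     # one native 2160p output buffer

print(f"all native 4K : {native_4k:.0f} MiB")
print(f"1440p internal: {internal + final_4k:.0f} MiB (incl. native output buffer)")
# all native 4K : 158 MiB
# 1440p internal: 102 MiB (incl. native output buffer)
```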
> But I guess it is not the OS UI that gets all the memory. It is more the running apps, services and things like recording your games that make the non-game footprint so big.

Indeed. The UI is just a couple of 2160p buffers, roughly 33 MB per frame/buffer at 32 bits per pixel. That's not going to break the bank.
> How feasible is it to include a last-gen CPU or a midrange/low-end CPU to handle the OS? Is it cheap enough for Sony/MS to continue ordering Jaguar CPUs and including them in the Xbox Series S|X and PS5? Would that increase the BOM in a significant way?

That would require a significant re-engineering of the hardware architecture and the OS. And you'd still need the memory reserve, unless you added more RAM.
> How feasible is it to include a last-gen CPU or a midrange/low-end CPU to handle the OS? Is it cheap enough for Sony/MS to continue ordering Jaguar CPUs and including them in the Xbox Series S|X and PS5? Would that increase the BOM in a significant way?

Why?
> Why?

I'm struggling to find selling points for next gen. The PS3 gen introduced "HD" gaming, or rather sub-HD gaming for the most part; regardless, moving from SD TVs to HDTVs was a massive upgrade in itself. The PS4 gen was more of the same, just higher quality (much higher in some cases), and the mid-gen refreshes introduced 4K and HDR. This gen introduced RT and fast/no loading screens, though we haven't seen much of it yet.

I'm confident, and hope, that next gen won't target 8K and spend rasterization performance on that, as we'd be back to 30fps gaming again. I believe next gen will be another "more of the same" gen with faster SSDs and much better RT; the main difference will be that most games will be 120fps and a few heavy ones will target 60fps.

With that said, there don't always need to be big selling points, as people will buy into the new machines like they did with the PS4 gen (including myself), since we tend to forget that the selling point that has always existed is that the latest and greatest games gradually shift to the new hardware. If next gen could deliver 4K/120 with RT in most games, I'd be very happy.
I don't think we have yet seen a scenario in the current gen where the core count makes a difference, especially if the code is optimized for around 7 cores (or 6.5) plus SMT.

I would not really be surprised if we see 8 cores again in the next generation. It gets really hard to find use cases in games where you can actually use more than 6 cores, and with more cores the gains shrink further. A massive IPC increase, on the other hand...

But right now we are normally GPU or bandwidth bound anyway, especially if the next gen of GPUs takes on even more of the RT workload.