Sony's ReRAM plans - what can and can't ReRAM bring to a console? *spawn

If you wanted more memory, you could get 64 GB of DDR4 for less, and it would be faster.

You don't know that. You don't know if it will be cheaper - short term and long term.

A ReRAM solution would be a game-changer. Sony even has one rated at 51 GB/s. But it has to be cost-effective to make sense; we can all agree on that.

Is it cost-effective though? That's the question. But there is enough data to surmise that it might be cost-effective near-term and, more importantly, long-term.

If the cost to manufacture initially (minus R&D and CAPEX) is around $30 for 128 GB, then it's a no-brainer. Not to mention its ability to stack and scale and chase NAND flash on price. In fact, ReRAM is expected to do exactly that, because IT WILL SCALE DOWN below 15nm.
 
Plus, ReRAM would also solve the potentially problematic run-time decompression that eats CPU bandwidth and resources. You can decompress large amounts of data into ReRAM and never engage your CPU for streaming until you need to fetch more data from cold storage.
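That idea can be sketched as a two-tier store: pay the decompression cost once when staging an asset into the warm cache, after which reads are plain memory lookups. A minimal Python sketch, with made-up asset names and zlib standing in for whatever codec a console would actually use:

```python
import zlib

# Hypothetical layout: "cold" storage holds compressed assets,
# the warm cache (standing in for ReRAM) holds them decompressed.
cold_storage = {
    "level1/textures": zlib.compress(b"texture-data " * 1000),
    "level1/meshes": zlib.compress(b"mesh-data " * 1000),
}
warm_cache = {}

def prefetch(asset_id):
    """Decompress once, up front, into the warm cache."""
    warm_cache[asset_id] = zlib.decompress(cold_storage[asset_id])

def read(asset_id):
    """Later reads are plain memory lookups -- no CPU decompression."""
    return warm_cache[asset_id]

for asset in cold_storage:
    prefetch(asset)

assert read("level1/textures") == b"texture-data " * 1000
```

The CPU cost is paid only at prefetch time; steady-state streaming out of the cache never touches the decompressor.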
 
Again,

The 25.6GB/s probably has more to do with "8 chips and PCIe Gen5 x8" (it's right there in the slide) than with ReRAM itself...

This is the slide posted by megre:
I'll just post some interesting non-technical slides from the Sony presentations from 2017 and 2019.

Now, does anybody think that Sony will put the equivalent of PCIe Gen5 x8 in the PS5 APU for a "warm cache"? How many pins would that be? How much area of the chip would they need to dedicate to that?
 
That would make the PCB too expensive and complex.

Yet 128 GB of ReRAM, or even the 256 GB version, can be made in a "small form factor".

I would love 128 GB, or even just 64 GB, of DDR4 as GPU memory for next-gen, but we have enough data to show that it would be prohibitively expensive. ReRAM, although it has higher latency, is POTENTIALLY much, much cheaper.

We can all agree that it has to be cost-effective both near-term and long-term to make sense.

does anybody think that Sony will put the equivalent of PCIe Gen5 x8 in the PS5 APU for a "warm cache"? How many pins would that be? How much area of the chip would they need to dedicate to that?

I have no idea how much PCIe 5.0 would cost. I tried searching but found no meaningful data. I'd appreciate it if you could enlighten me on this.
 
This was an eye-opener for me.

What's the new bottleneck with regards to load times?

One poster laid it out very nicely.
  1. The bottleneck is the CPU.
  2. A big part of that CPU bottleneck is decompression. Even a fast 8-core CPU can only keep up with decompressing at roughly full SATA speeds from an SSD. We're overdue for hardware-accelerated decompression cores (or consumer FPGAs) to help with this issue, imo. Someone out there should be talking about this. :/
  3. The next problem is simply software: there's a lot of games that will only use a single thread during loading, for instance.
  4. Optane offers a slight performance increase over NVMe, thanks to lower latency between requesting data and getting it back.
  5. Combining the above two: async IO is rare outside server software typically, and this means that games will spend time CPU bound before the CPU thread just stops, submits a request for more data, and sleeps waiting for the response. I'm not even sure if it's the game dev's fault or if there are API problems presented by NVMe that OSes haven't really addressed. But it's possible to fix this issue in software by pipelining async IO requests. This should remove most of the difference between Optane and NVMe for loading times, and make both of them faster than they are now.
  6. Finally, software. Making loading times fast requires game devs to care about making loading times fast. There's tons of dumb stuff that happens at load time that could have been pre-computed or cached, and just isn't. So that's just "do less work." I suspect part of this is that hard drives used to be so slow that CPU time during loading was basically free, and that's still who they have in mind (e.g. consoles still have HDDs...)
So on the one hand, game devs could do a lot. On the other hand, I think there's a type of hardware acceleration we're presently missing.
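Point 5 above (pipelining async IO requests) can be illustrated with a toy loader. The blocking version waits out each read before decompressing; the pipelined version keeps reads in flight while the CPU decompresses. Everything here is simulated (a `time.sleep` stands in for storage latency); it's a sketch of the technique, not any real engine's IO layer:

```python
import time
import zlib
from concurrent.futures import ThreadPoolExecutor

# Eight compressed "asset chunks" (contents are arbitrary filler).
CHUNKS = [zlib.compress(bytes([i]) * 200_000) for i in range(8)]

def fake_read(chunk):
    time.sleep(0.02)              # stand-in for storage latency
    return chunk

def load_blocking():
    # Read, wait, decompress, repeat: the thread sleeps during every read.
    return [zlib.decompress(fake_read(c)) for c in CHUNKS]

def load_pipelined():
    # Submit all reads up front; decompress each chunk as its read completes.
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = [pool.submit(fake_read, c) for c in CHUNKS]
        return [zlib.decompress(f.result()) for f in futures]

t0 = time.perf_counter(); blocking = load_blocking(); t_block = time.perf_counter() - t0
t0 = time.perf_counter(); pipelined = load_pipelined(); t_pipe = time.perf_counter() - t0
assert blocking == pipelined      # same data either way
assert t_pipe < t_block           # overlapping reads hides most of the latency
```

The same data comes out either way; the pipelined loader just never leaves the CPU idle while a read is outstanding, which is exactly the fix point 5 describes.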

On the bright side, I think we're entering an era where we'll start to see a lot more specialized bits of hardware. This is happening in phones most prominently right now, but I think we'll start to see it filter back to normal CPUs before too long. Perhaps we'll actually get those decompression cores someday?


My speculation:

PS5 will achieve "no loading times" by using ReRAM as the cache where game assets are unpacked and decompressed. Either all game assets or just the immediately-needed ones will be unpacked to the cache, depending on the scope, ambition, and visual fidelity of the game.
At 25.6 GB/s, it can fill main RAM in less than a second without having to jump through the hoops of run-time decompression.
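The fill-time claim is simple arithmetic. A quick check, with hypothetical RAM sizes (25.6 GB/s is the figure quoted from the slide):

```python
# Back-of-envelope fill times for a warm cache feeding main RAM.
# The RAM sizes are hypothetical; 25.6 GB/s is the figure from the slide.
cache_bandwidth_gb_s = 25.6

for ram_gb in (8, 16, 24):
    seconds = ram_gb / cache_bandwidth_gb_s
    print(f"{ram_gb} GB fills in {seconds:.3f} s at {cache_bandwidth_gb_s} GB/s")
# e.g. 16 GB fills in 0.625 s -- under a second, as claimed
```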
 
What is used as cold game storage?

What will decompress from cold storage?

There will still be some CPU (or other processor) usage and time while the ReRAM is filled.

Will this be refilled for each game? 64 GB does not sound so big for uncompressed data.

I can see the performance when it's all in the cache, but filling from cold storage sounds as slow as any other tech.

What have I missed?
 
What is used as cold game storage?

The early leaks suggested an HDD, but there are rumors of 1 TB - 2 TB of SSD. They could probably go with a cheap SATA SSD.

What will decompress from cold storage?

It depends on the game. As the Control developer said, "If games would stay the same in terms of scope and visual quality it’d make loading times be almost unnoticeable and restarting a level could be almost instant [in PS5 games]". But if they become ambitious and start to use that ultra-fast SSD to cache immediately-needed assets to showcase insane details then memory management (same as today) must be employed.


There will still be CPU or some processor usage and time whilst the reram is filled.

I'm speculating that there could still be a dedicated separate ASIC (a Cell CPU?). But it could be a cheap one, because it doesn't need to be super fast.

Will this be refilled each game as 64gb does not sound so big for uncompressed data.

We don't know the amount yet, but we have already seen 128 GB and 256 GB versions of ReRAM from Sony.

But if you're worried about long load times when changing games, Sony could actually employ pre-caching so that a game boots immediately. How much memory Sony allocates for pre-caching versus for caching the currently played game will depend on the amount of ReRAM in there. There doesn't need to be a fixed split; it could be dynamic.

I can see the performance when it's all in the caches , but filling from cold storage sounds as slow.as any other tech.

Again, I will point you to the statement by the Control developer. How big a game will be depends on its scope, ambition, and visual fidelity. If it's really too big, then they will need to employ memory management again. I feel like there will be a separate dedicated ASIC decompressor so as to not bother the CPU.
 
If next-gen storage ends up truly being dozens of times faster than current storage, I don't think compression will be the biggest bottleneck. Lazy Devs™ will just compress data less, since the raw read speed can eat it up. As the balance of power changes, so do these engineering/algorithmic decisions.
 

I would say if the SSD is ultra-fast, even at just 4 GB/s, then why compress at all? Store assets uncompressed so that massive amounts of data don't have to be shuffled through decompression at run-time. The CPU/GPU can load them in place and read them directly, without friction. But storage capacity would then be a problem, I guess.
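Whether skipping compression wins depends on read bandwidth versus decompressor throughput. A toy break-even model (all numbers hypothetical, and assuming read and decompression are not overlapped, i.e. the worst case):

```python
# When does skipping compression win? A toy break-even model.
def load_time(size_gb, read_gb_s, ratio=1.0, decomp_gb_s=None):
    """Time to get size_gb of *uncompressed* data into RAM.
    ratio: compressed size / raw size; decomp_gb_s: decompressor throughput."""
    read = (size_gb * ratio) / read_gb_s
    decomp = size_gb / decomp_gb_s if decomp_gb_s else 0.0
    # worst case: read and decompression are fully serialized
    return read + decomp

raw = load_time(10, read_gb_s=4.0)                                  # no compression
packed = load_time(10, read_gb_s=4.0, ratio=0.5, decomp_gb_s=2.0)   # 2:1, slow CPU decompress
print(raw, packed)   # 2.5 s raw vs 6.25 s compressed
```

With a fast drive and a slow (CPU-bound) decompressor, the uncompressed path wins; flip the numbers (slow HDD, fast decompressor) and compression wins instead, which is the "balance of power" shift the previous post describes.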
 
Exactly why the cache approach makes sense even at 4 GB/s. At 25.6 GB/s it's a no-brainer (provided it's not prohibitively expensive).
 
So that you can have more than two games installed at once...

Well, are there gonna be any incentives for devs to be more mindful of storage, or are they gonna be free to pass the bill down to the consumer as they have been this gen?
 
Theoretically they could have shipped everything on 2 discs. But they didn’t. I think it will still matter as long as there are costs associated with media. And MS and Sony could force compression since they have hardware decompress and swizzle and there are security items to consider for encryption.
Since data has to be encrypted it’s also likely part of the chain is to decompress.
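That chain ordering is worth spelling out: you compress before encrypting (ciphertext is essentially incompressible), so the console has to decrypt before it can decompress. A toy sketch, with a trivial XOR standing in for real content encryption:

```python
import zlib

KEY = 0x5A  # toy XOR "cipher" standing in for real content encryption

def xor(data, key=KEY):
    return bytes(b ^ key for b in data)

def pack(asset):
    """Author side: compress first, then encrypt the compressed bytes."""
    return xor(zlib.compress(asset))

def unpack(blob):
    """Console side: decrypt, then decompress -- the order is forced,
    since encrypted bytes won't compress."""
    return zlib.decompress(xor(blob))

asset = b"game asset bytes " * 500
assert unpack(pack(asset)) == asset
```

This is why hardware decryption and decompression tend to live in the same pipeline stage: the data has to pass through both, in that fixed order, on every read.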
 

There are infinitely many flavours of compression and data organization: some faster, some more efficient, some more specialized, some for big blocks of heterogeneous data, some for small chunks of specific types of data, etc.
The popular choices are unquestionably gonna shift this gen. We've just yet to see the specifics of how.
 