Next Generation Hardware Speculation with a Technical Spin [2018]

Status
Not open for further replies.
The only reason they would use an ARM processor is to keep power usage low for rest-mode-type features. They'll be using the Ryzen for OS features. It'll probably be the usual two cores reserved for the OS at launch.
 
The only reason they would use an ARM processor is to keep power usage low for rest-mode-type features. They'll be using the Ryzen for OS features. It'll probably be the usual two cores reserved for the OS at launch.

Originally, the PS4's ARM processor was to handle anything related to DRM, streaming, downloading, applications and other OS functions. But that idea was scrapped (it wasn't robust enough) some time during development, in favor of having it assist the main CPU. That being said, there are some decent ARM processors now that can handle those tasks fairly easily. Even if Sony doesn't go the secondary-processor route, Sony's lightweight OS and security/DRM measures shouldn't be hogging two full cores on a Ryzen CPU. If so, Sony needs to hire some of Microsoft's OS/software developers to build a decent OS.
 
I feel they reserve two cores in the beginning just to cover themselves in case someone comes up with a game-changer type feature, so they can react to it. They don't want to nail down the performance usage of their OS early.
 
Since the console version of the Subor is supposedly using an IoT version of Windows, does that give them the ability to lock out installing apps unless it's through their distribution platform? Since it's a custom chip and a custom BIOS, could that also prevent people from getting drivers, which would also thwart attempts to format the drive and install a full version of Windows in order to get Steam?

Short of customizing and locking down a version of Linux like Sony did, how could Subor force customers to only buy games distributed through them (like console manufacturers do)?
 
I'm a bit pessimistic about future cost reduction for NAND. QLC is a nice 33% higher capacity per cell, but it comes at the expense of cell reliability, which in turn requires more overprovisioning (and it's only applicable to limited-write workloads; forget using it as a cache or for continuously recording gameplay). It's a great cost-cutting improvement, but it has already reached the point of diminishing returns. The number of layers will also be harder to increase from now on, and process nodes are reaching a limit on how small the cells can be.
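A quick sanity check on that 33% figure, as a sketch in Python (the voltage-level point is why each extra bit gets progressively harder):

```python
# QLC stores 4 bits per cell vs TLC's 3: that's where the 33% comes from.
tlc_bits, qlc_bits = 3, 4
capacity_gain = (qlc_bits - tlc_bits) / tlc_bits
print(f"capacity gain per cell: {capacity_gain:.0%}")  # 33%

# Each extra bit doubles the number of voltage levels a cell must
# distinguish, which is why reliability drops faster than capacity rises.
print(f"TLC levels: {2**tlc_bits}, QLC levels: {2**qlc_bits}")
```

Going from TLC to QLC adds a third of the capacity but doubles the voltage states to discriminate, hence the endurance hit.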

The Intel 660p QLC SSD at 512GB can write 100TB. That's 2,000 50GB game installs. If you were to install from a Blu-ray drive running at PS4 speeds, it would have to run continuously for 42 days to exhaust the write capacity of this drive. Continuous gameplay recording would buffer to DRAM, and even if it went to flash you could record for years before exhausting the write capacity.

More than enough.
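The arithmetic behind those figures, as a quick sketch (endurance and install size are from the post; the ~27.5 MB/s Blu-ray speed is my assumption for a 6x BD drive):

```python
endurance = 100e12   # Intel 660p 512GB rated endurance: 100 TB written
install = 50e9       # one 50 GB game install

installs = endurance / install
print(f"{installs:.0f} full installs before the drive is exhausted")  # 2000

bd_speed = 27.5e6    # assumed PS4 Blu-ray read speed, ~27.5 MB/s
days = endurance / bd_speed / 86400
print(f"~{days:.0f} days of continuous installing from disc")  # ~42
```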

As for cost: 240GB TLC SSDs now go for $39 in China on the OEM spot market. The retail US price for the Intel 660p is $99; the cost is probably two thirds of that.

As for future costs, we'll see 96+ layers this year from Toshiba. Micron has plans to use more than 200 layers, and Samsung's long-term goal is over 500 layers. Plus we had an excess in fab spending from late 2016 until now, which means we'll have a glut in flash starting ... now.

I wouldn't be surprised if 512GB of QLC flash can be had for $40-50 in late 2019/early 2020. USB memory sticks are already at around $0.10/GB.

Cheers
 
I disagree. That's not enough write survivability at all, not with games being patched as frequently as they are and with consoles being used for 7 to 8 years.
 
I disagree. That's not enough write survivability at all, not with games being patched as frequently as they are and with consoles being used for 7 to 8 years.

Are you expecting your console to stay alive for seven years?!!? Anyway, you'd need to install nearly 300 games per year for 7 years to exhaust the write capacity. What are the odds of you having 2,000 games and not expanding mass storage (or upgrading your console)?

As for patches: patches have to be downloaded. At a reasonable broadband speed of 10MB/s, you'd have to download nonstop for over 110 days.

Cheers
 
Hmmm, I burned out an OCZ TLC 480GB drive in my PS4 after only 4 years, so I'm not convinced that the write lifetime of QLC is going to be that hot either. IIRC, doesn't patching wind up having a sort of write-amplification effect, in that you need to download the patch and subsequently deploy it, leading to effectively twice the writes for anything downloaded? Are SSD controllers capable of understanding the data being written to them well enough to do proper wear levelling, given non-standard or encrypted file-system layouts?

I mean, if Sony or MS went with a scratch SSD I wouldn't expect them to use SLC or anything that expensive, but a TLC drive would at least be a known quantity for service life.
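On the write-amplification question: a toy model with made-up sizes. The best case stages the patch and applies it in place; the worst case, if applying a patch means rewriting the whole package, is far more punishing:

```python
patch = 2e9    # hypothetical 2 GB patch
game = 50e9    # hypothetical 50 GB installed game

# Best case: the patch is written once on download, once more on apply.
best = 2 * patch

# Worst case: applying the patch rewrites the entire game package.
worst = patch + game

print(f"best case:  {best / 1e9:.0f} GB written to flash")   # 4 GB
print(f"worst case: {worst / 1e9:.0f} GB written to flash")  # 52 GB
```

Either way the flash absorbs more than the patch's nominal size, before any controller-level write amplification on top.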
 
Are you expecting your console to stay alive for seven years?!!? Anyway, you'd need to install nearly 300 games per year for 7 years to exhaust the write capacity. What are the odds of you having 2,000 games and not expanding mass storage (or upgrading your console)?

As for patches: patches have to be downloaded. At a reasonable broadband speed of 10MB/s, you'd have to download nonstop for over 110 days.

Cheers

Of course a console should last that long. Even the PSone I bought a long time ago still works, as do my old PS2 and PS3. However, those consoles have new owners, as I gave them away to relatives.

One forgotten thing is that game sizes could grow, and people could also download a lot of game demos. There are also things like Netflix's offline mode, where people might download a bunch of stuff to watch later; for some people, offline mode might be the only way to get 4K HDR, given the bandwidth needed.

It's enough for a few loud people to wear down their flash, and it can become a huge headache due to social-media signal amplification.

Once the flash starts to get full, there is a lot less room to optimize where writes go, and this either wears out the flash sooner, or performance becomes a lot worse and there are extra writes.
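A crude model of that last point: if wear levelling can only rotate writes through the remaining free space (ignoring static-data rotation, so this is a worst case), per-block wear scales inversely with how much free space is left:

```python
def cycles_per_block(total_writes_gb, free_gb):
    """Erase cycles each free block absorbs when writes can only
    rotate through free space. Block size cancels out of the ratio."""
    return total_writes_gb / free_gb

lifetime_writes = 10_000  # assume 10 TB written over the console's life
for free in (256, 64, 16):
    wear = cycles_per_block(lifetime_writes, free)
    print(f"{free:3d} GB free -> ~{wear:.0f} erase cycles per free block")
```

With 16 GB free instead of 256 GB, the same workload hits each free block 16x harder, which is exactly the "full drive dies sooner" effect.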
 
I disagree. Thats not enough write survivability at all, not with games being patched as frequently as they are and with consoles being used for 7 to 8 years.
Same. If it were QLC, I would prioritize replacing it, personally. I want my console to last 10+ years.
 
If there's a writeable scratch-pad for games, for saving persistent world state in something like GTA, The Witcher or Elder Scrolls, you'll wear out a small portion of the storage very quickly. To get around that, you'd have to keep moving the allocation around, and the fuller the storage becomes, the more impact that'll have, as manux states. So durability can't be measured by 'capacity / writes' or similar. I can envision a situation where the available storage decreases over time as parts of the flash burn out. That'd be more graceful than a complete, sudden failure.

As for lasting 7 years, we've had plenty of consoles die before then from things like optical-drive failures. Cheap-ass thermal paste is known to deteriorate over time, leaving consoles running hotter and more prone to failure in later years. Basically, they aren't built with long-term durability in mind, so I could see a company considering a cheap, shorter-term solution. If you provide an upgradeable option, you can also monetise this lack of durability further on. It's something that won't impact sales in the first couple of years, and by then owners will be bought into the system, so they won't swap mid-gen out of outrage because the flash 'ran out'. It's the same way gamers don't buy an alternative console when their $40-$50 controller wears out in two years.
 
What's the expected lifespan of mechanical drives? In my experience, 5-7 years is pushing it for them too.
 
If there's a writeable scratch-pad for games, for saving persistent world state in something like GTA, The Witcher or Elder Scrolls, you'll wear out a small portion of the storage very quickly.
Only if the user changes the game they're playing very frequently. Otherwise, I don't see why the content of that scratch-pad would have to be erased between game sessions.

E.g. a 128GB scratch-pad would be sufficient for at least two games, and I'd say there aren't many users who switch between more than two games very often.
 
BF1 is 110GB, BF4 is 71GB, COD: Black Ops 3 is 71GB, and there's no sign these games are shrinking, so even 128GB could get thrashed with just two games if one of them is an MP game with DLC. That being said, 50-60GB seems to be a standard range for a lot of the single-player games I have on the PS4. With next gen I'd expect more assets again, to the point where I wonder if the next consoles will see games on BD100, or if they'll just use a BD50 with a big old download to boot. If Sony/MS build support into the next-gen APIs that allows fine division of assets, such that the disc has the opening hours of content while the rest downloads in the background, it might not be that bad, unless you have bandwidth caps of course.
 
Only if the user changes the game they're playing very frequently. Otherwise, I don't see why the content of that scratch-pad would have to be erased between game sessions.
That save data could be written to constantly. Elder Scrolls has a database of every object that's moved. In a living world with NPCs moving stuff around, the objects would have to be saved as they are changed. Or a massive megatexture of user-modified scenery (persistent bullet holes and footprints) would be constantly updated as the player plays. A save file could be overwritten hundreds of times. We've been talking about using flash as a substitute for RAM, saving RAM costs. In such a case it could be constantly overwritten with cached data from the HDD or wherever.

If you put in fast flash storage with the caveat that devs aren't allowed to write to it however they want, you'd gimp half its value to the system.
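Rough numbers (all of them assumptions, just to show the order of magnitude) for how fast that kind of persistent-state churn adds up against a 100 TB endurance budget:

```python
save_size = 100e6      # assume a 100 MB world-state save
saves_per_hour = 60    # assume an autosave once a minute
hours_per_day = 3      # assumed daily play time
years = 7

total = save_size * saves_per_hour * hours_per_day * 365 * years
print(f"~{total / 1e12:.0f} TB written from autosaves alone")  # ~46 TB

# Controller write amplification (2-3x is typical for small random
# writes) would multiply this further, blowing past a 100 TB budget.
```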
 
That save data could be written to constantly. Elder Scrolls has a database of every object that's moved. In a living world with NPCs moving stuff around, the objects would have to be saved as they are changed. Or a massive megatexture of user-modified scenery (persistent bullet holes and footprints) would be constantly updated as the player plays. A save file could be overwritten hundreds of times. We've been talking about using flash as a substitute for RAM, saving RAM costs. In such a case it could be constantly overwritten with cached data from the HDD or wherever.

If you put in fast flash storage with the caveat that devs aren't allowed to write to it however they want, you'd gimp half its value to the system.

Yup. If you put in flash to use only as an OS-controlled cache, with one write per game (re)cached, then you basically make the flash a loading accelerator that's no good for persistent, highly volatile game data. But it'll still burn out at an accelerated rate due to the frequent multi-GB (or multi-tens-of-GB) patches per game that are automatically downloaded.

If you want to be able to use the flash almost like a game-controlled page file, then you'll rapidly kill modern, high-density, low-cost flash. In this sense, a fast mechanical HDD that is fully game-read/writeable is preferable to a flaky SSD that can only be used as a slow, OS-controlled read cache for a super-slow 5400 RPM laptop drive.

I still have 8-, 16-, 32-, 128-bit ... erm ... Xbox and XB360 consoles. The thought of a console that can't last more than 3 years due to the failure of a single non-replaceable part is unacceptable to me.
 
Are you expecting your console to stay alive for seven years?!!?
Absolutely. My original generation Xbox, PlayStation, Dreamcast, X360 Slim, and Xbox One Launch Day editions still work to this day!

What are the odds of you having 2,000 games and not expanding mass storage (or upgrading your console)?

100% guaranteed because of Backwards Compatibility.

As for patches: patches have to be downloaded. At a reasonable broadband speed of 10MB/s, you'd have to download nonstop for over 110 days.

I'm at a reasonable broadband speed of 300 Mbit/s (or 37.5 MB/s), with the option to upgrade to Gigabit (or 125 MB/s) for $20 more a month today. I expect these reasonable broadband speeds to be even faster in 2021, and cheaper too.
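The conversion, plus what those speeds mean for a big patch (the 50 GB patch size is my example):

```python
def mbit_to_mbyte_per_s(mbit):
    return mbit / 8  # 8 bits per byte

for mbit in (300, 1000):
    mbs = mbit_to_mbyte_per_s(mbit)
    minutes = 50e9 / (mbs * 1e6) / 60  # time to pull a 50 GB patch
    print(f"{mbit} Mbit/s = {mbs:.1f} MB/s -> 50 GB in ~{minutes:.0f} min")
```

At 300 Mbit/s a 50 GB patch takes about 22 minutes; at Gigabit, under 7. At those rates the "110 days of nonstop downloading" ceiling gets a lot easier to approach over a console's lifetime.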
 