Next Generation Hardware Speculation with a Technical Spin [pre E3 2019]

Which defeats your request: if you can add a 'slow' consumer-grade device, the system must deal with caching and swapping in and out. If it has that baggage anyway, what is the gain of a 1TB SSD versus a smaller SSD plus a 1TB HDD out of the gate?

The same caching is employed either way, and still 1TB of storage?

I already explained my reasoning for the default 1TB non-replaceable ultra-fast SSD. I'll just copy-paste my post from another forum.

With 1 TB of ultra-fast SSD, about 8-10 AAA games can initially be installed on the non-upgradable drive while the user doesn't yet have an HDD expansion. But once an HDD is installed, all games will be transferred to that storage and the whole ultra-fast SSD will be turned into a scratch pad.

100 GB = OS and apps
250 GB = scratch pad for the game actually being played
650 GB = the first 20 GB of each installed game

When you start a game, the ultra-fast SSD can fill the RAM in maybe 5 seconds, and you'd be playing instantly while the HDD loads the rest of the game into the 250 GB scratch pad.

Up to 30 games (20 GB each) can be kept on the SSD, and there would be very quick loading times even on start-up or when you change games.

Additionally, they could let the latest save of the game, along with its assets, be kept within the allotted 20 GB. You could pick your game and be playing where you left off within 5 seconds. (I'm thinking 4-5 GB/s into 20 GB of RAM, so about 5 seconds.)
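To sanity-check that with some napkin maths (a sketch only; the drive speeds, the 20 GB resident set and the 100 GB full install are my assumptions from the post above, not real specs):

```python
# Back-of-the-envelope load-time estimates for the proposed split.
# All figures are assumptions from the discussion, not actual PS5 specs.
SSD_SPEED_GBS = 4.5    # hypothetical ultra-fast SSD read speed (GB/s)
HDD_SPEED_GBS = 0.15   # typical 5400 rpm HDD sequential read (GB/s)
RESIDENT_SET_GB = 20   # per-game chunk kept on the SSD
FULL_GAME_GB = 100     # full install streamed to the scratch pad

# Time to fill RAM with the 20 GB resident set from the SSD:
print(f"SSD -> RAM: {RESIDENT_SET_GB / SSD_SPEED_GBS:.1f} s")        # ~4.4 s

# Background time for the HDD to top up the scratch pad:
rest_gb = FULL_GAME_GB - RESIDENT_SET_GB
print(f"HDD -> scratch pad: {rest_gb / HDD_SPEED_GBS / 60:.1f} min") # ~8.9 min
```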

If you have more than 30 games, the OS will let you tag the ones you want as "instant loading".

And then I read function's idea and thought it's a better solution. But we both agree that 1TB as the default is the better choice. It's just that my caching idea means too much wear for the SSD. A 256 GB SSD + HDD is a bad idea, IMO.

Use the entire 1TB SSD as a cache for games from either external or optical storage - least recently used is the first to be evicted - with an option to 'lock' a game in. DD games are locked by default and need to be removed manually, with OS prompts for the user to OK cleanup when appropriate.

Covers all use cases and preferences, minimum user input required, minimises unnecessary writes!
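Something like this, say (a sketch only; the class, sizes and titles are made up to illustrate the eviction policy, not any real console OS API):

```python
from collections import OrderedDict

class GameCache:
    """Games cached on the SSD: least recently played is evicted first;
    'locked' titles (DD games by default) are skipped and need manual removal."""

    def __init__(self, capacity_gb):
        self.capacity_gb = capacity_gb
        self.used_gb = 0
        self.games = OrderedDict()  # title -> (size_gb, locked); order = recency

    def play(self, title, size_gb, locked=False):
        if title in self.games:
            self.games.move_to_end(title)       # refresh recency on each play
            return
        while self.used_gb + size_gb > self.capacity_gb:
            self._evict_one()
        self.games[title] = (size_gb, locked)
        self.used_gb += size_gb

    def _evict_one(self):
        for title in list(self.games):          # least recently played first
            size_gb, locked = self.games[title]
            if not locked:
                del self.games[title]
                self.used_gb -= size_gb
                return
        raise RuntimeError("only locked games left - prompt the user to clean up")

cache = GameCache(capacity_gb=1000)
cache.play("Disc Game A", 80)                   # cached from optical, evictable
cache.play("Digital Game B", 100, locked=True)  # DD title: locked by default
```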

And while you're at it, distribute games with an emulator/hiberfile-type "save state" at the start screen, so the game is ready to play after about three seconds. You'll need them anyway for your cloud streaming instances!
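Roughly this idea, as a toy sketch (pickle stands in for whatever snapshot format would actually ship; nothing here is a real console mechanism):

```python
import pickle, time

SNAPSHOT = "game_startscreen.state"   # hypothetical pre-baked 'hiberfile'

def bake_snapshot(game_state, path=SNAPSHOT):
    # Done once by the publisher: serialise everything the start screen needs.
    with open(path, "wb") as f:
        pickle.dump(game_state, f)

def instant_launch(path=SNAPSHOT):
    # On the console: one sequential read + restore instead of a cold boot.
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        state = pickle.load(f)
    print(f"restored in {time.perf_counter() - t0:.2f} s")
    return state
```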
 
Megre, your argument convinced me... Even 128 GB for the OS + a 128 GB scratch pad + a 3 TB mechanical HDD would work great...
 
A salvaged chip is not cost-effective. It would mean a cost reduction, but not something cost-effective.
Lockhart could have lower, the same, or even higher demand than Anaconda. If the chips were the same, and Lockhart was dependent on defective Anaconda chips, how would you respond to demand?
Nintendo will use the salvaged chips in their 2030 hardware.
 
Maybe PS5 games won't support external storage (but PS4 BC games will), so that Sony can sell different models of PS5 - for example, 1TB for $499 and 2TB for $599.
 
You've got this wrong. Consoles typically focused their hardware on specific gaming-related functions as opposed to general-purpose processing. That's why a Megadrive (1988) could nail a PC in terms of locked-60fps parallax scrolling joy well into the 90s.

And the Dreamcast from 1998 absolutely took a flying dump on anything from PC land in that year.

It's the ever-increasing dollar and power budget of the high-end PC market since then that has allowed PCs to support silicon beyond the wildest dreams of a console vendor. But until the latter 2000s, that comprehensive high-end ownership sure as shit wasn't a thing the PC had.

Yeah true, though it depends on whether the Amiga series could be seen as a PC, a console, or both.
Dreamcast probably had some advantages, yes, but 'took a flying dump'? I dunno. It did not last long, I can assure you.
That can't be true either; I mean, even in the late 90s and early 2000s one could build considerably more powerful PCs than the 6th-gen consoles.

Even if high-end PCs had more raw power than the console, what they delivered in games was inferior. Games targeting the console, instead of a weaker base PC spec, could do way, way more thanks to a large install base with a solid 'lowest spec' and low rendering overheads (there was far less scaling-up of assets and resources back then; higher res with tearing or judder, more shadow-casting lights and AA was about it). When you then consider what that console hardware was actually capable of - comparing end-of-lifecycle games versus what the best launch-era PC had to offer - it's obvious why consoles competed so strongly versus PCs in gamers' minds: producing better games on cheaper (if less flexible, and no use for doing your homework on) machines.

They were two different machines with two different evolutionary paths and the specialisation of games consoles made them better for the task.

Agree, the consoles offered a level of optimisation PCs still don't have, even if it has gotten much better. Whether PCs delivered something inferior depends on how you see it, maybe. I mean, things like Doom 3, HL2, Far Cry, Crysis, or even Quake 3, Quake 2 and UT99 were not really inferior to console games at the time.
The PS2 delivered a hell of a lot of graphics with MGS2; I remember everyone being impressed by its graphics, animations and interaction.
 
Yeah true, though it depends on whether the Amiga series could be seen as a PC, a console, or both.
Dreamcast probably had some advantages, yes, but 'took a flying dump'? I dunno. It did not last long, I can assure you.
That can't be true either; I mean, even in the late 90s and early 2000s one could build considerably more powerful PCs than the 6th-gen consoles.

Amiga was neither PC nor console; it was part of a wonderful but now-deceased non-x86 home computer scene! And while its large memory helped with 3D games, the Megadrive's arcade-derived 2D parallax-scrolling hardware could not be matched by the machine. Amiga sound, however, was fucking epic and probably beat even the Megadrive's sound overall. Though a god-king like Yuzo Koshiro could make it sing like an ecstasy-fuelled angel, beyond the capabilities of anything pre-Saturn...


In terms of polygon throughput and features, the Dreamcast was well beyond anything from 1998 in PC land. It probably took the GeForce 256 - paper launch in 1999, real availability in 2000 - before the DC was bested in any real sense, and that was a whole new paradigm of hardware T&L.

Hardware can only be judged at the time of its introduction. The 2005 Xenos had greater overall throughput across 5+ years than anything from 2005 in the PC space. And... wow, this is off topic. But basically, PC dollar and power budgets are now so high that the highest-end stuff can't ever be matched by a console, especially as you move away from the launch window of the console hardware.

The most timely questions right now are probably about acceptable losses in the launch months, about subsidising halo products, and about what you could do with a ~$600 BOM 18 months from now.

I'm not willing to rule out a 48+ CU part for a top end SKU at this point. Not saying it'll happen ... but not saying it won't.
 
And then I read function's idea and thought it's a better solution. But we both agree that 1TB as the default is the better choice. It's just that my caching idea means too much wear for the SSD. A 256 GB SSD + HDD is a bad idea, IMO.

Actually, it's not a bad idea. I'm expecting a 3D XPoint SSD to be used, where wear is basically nil compared to any other SSD, but the price per GB is still quite a bit higher. We already know that Intel and Micron are not going to be the only fabs producing it anymore; it's going to China, and the price will decrease for sure.
The fact is, to have a 1 TB drive the "scratch pad" would probably need to be SLC, and I think 200+ GB of ultra-fast SLC storage is just unviable.
But 128 or 256 GB would be viable and more sensible for the purpose.
Just remember that loading times are not the same thing as installation times; many are treating the two as the same. Installation times can be lowered but will not disappear: Blu-ray discs will still be a thing and will limit installation time for sure, and there's no guarantee that downloaded games will install faster either.
After installation, allocation will be done, and I don't think there will be fixed amounts for it.
It will be based on the game's needs and the compression used.
The very fast bandwidth of the internal SSD combined with a very fast processor will allow quick decompression on the fly into memory anyway.
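Something along these lines - a minimal sketch of streaming decompression, with zlib standing in for whatever codec the console would actually use, so data is unpacked chunk by chunk as it comes off the drive:

```python
import zlib

CHUNK = 1 << 20  # read compressed data in 1 MiB blocks

def stream_decompress(path):
    """Interleave reads and decompression block by block, so the whole
    compressed file never needs to sit in memory before unpacking."""
    d = zlib.decompressobj()
    with open(path, "rb") as f:
        while block := f.read(CHUNK):
            yield d.decompress(block)
    yield d.flush()

# e.g. feeding asset data straight into memory-side buffers as it arrives:
# for data in stream_decompress("level1.assets.z"):
#     upload_to_ram(data)  # hypothetical sink
```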
 
XQD cards have been on my mind as part of some bigger plan of Sony's. They have been strangely underused by Sony's own products - only ultra-high-end cameras; the A7 and A9 series do not use them, while Nikon's Z series and the latest full-frame Panasonic do.
I guess Sony has bigger plans for the format, and maybe they will be revealed in the next generation of consoles.
 
XQD cards have been on my mind as part of some bigger plan of Sony's. They have been strangely underused by Sony's own products... I guess Sony has bigger plans for the format, and maybe they will be revealed in the next generation of consoles.
More likely, requiring people to buy new, expensive proprietary storage is just a way to turn people off your product. Look at the ill-fated Memory Stick format. The only reason to use XQD at all is when other formats are incapable of providing the necessary bandwidth. Otherwise it's more expensive, and photographers aren't going to want it much if they don't gain anything from it.

If no-one's using XQD, the cost will potentially be higher due to low manufacturing volume, which in turn means people not wanting to use XQD (now CFexpress). It has nothing to do with 'big plans' and everything to do with whether Sony can sell the idea or not. So far, since its announcement in 2010, they haven't. PS5 isn't going to change that. However, it does offer the possibility of an NVMe card design, as opposed to NVMe modules, for adding/changing storage, which would be marginally more consumer-friendly.
 
After installation, allocation will be done, and I don't think there will be fixed amounts for it.
It will be based on the game's needs and the compression used.

Let's say Sony opted for 128 GB for cost savings: how many games do you think could be allocated/installed on it for caching? Do you think they'd use the entire 128 GB to cache as many games as possible, then just dump them when the game being played needs the space (and reinstall them in the background after exiting the game)?

And where do you expect the OS and apps to be installed?
 
Even though they can't compete on power budgets anymore, consoles still have something to offer. For the PS4bone it was the available graphics RAM. I think it's fair to say that minimum graphics RAM requirement pushed a fair number of upgrades?

Consoles will force PCIe4 / NVMe upgrades I feel, for those wanting to play AAA console titles at least.
 
To be fair, regarding the write speed, that's what you can get while hitting the SLC "cache" of the drive. Native TLC can't sustain that, even with a great controller like the Phison E12 (I guess that's what the Aorus uses...).
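For illustration, a crude two-phase model of that behaviour (cache size and rates are made-up round numbers, not measured Aorus figures):

```python
def write_time_s(write_gb, slc_cache_gb=40, slc_gbs=3.0, tlc_gbs=0.9):
    """TLC drive with a pseudo-SLC cache: full speed until the cache
    is exhausted, then the much slower native TLC rate."""
    fast_gb = min(write_gb, slc_cache_gb)
    slow_gb = max(write_gb - slc_cache_gb, 0)
    return fast_gb / slc_gbs + slow_gb / tlc_gbs

print(write_time_s(20))   # fits in the cache: ~6.7 s at full speed
print(write_time_s(100))  # cache blown: ~13.3 s fast + ~66.7 s slow
```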
 
So if Sony wants to use the SSD for caching, wouldn't it make more sense to use something like 3D XPoint or Samsung's Z-NAND to avoid wear?

It looks like something like Samsung's 983 ZET series is what you'd look at for that, but I doubt anyone can afford a $1000 console :D
 
Let's say Sony opted for 128 GB for cost savings: how many games do you think could be allocated/installed on it for caching? Do you think they'd use the entire 128 GB to cache as many games as possible, then just dump them when the game being played needs the space (and reinstall them in the background after exiting the game)?

And where do you expect the OS and apps to be installed?

I expect it to be installed on the slower drive, with only the specific parts needed for a fast boot kept on the SSD, just like Optane does; and if the rumours about 4 GB of DDR4 dedicated to the OS are true, some of the less important stuff can be quickly allocated to that memory.
Btw, I find it really amusing that a console OS takes almost 100 GB of storage on the base PS4; I'm not sure why - maybe much of that space is allocated to "virtual memory"? If anyone with more insight on this topic has the answer, it would be much appreciated.
I believe the system would fill the drive based on how many games you have on your console (supposing they are using 3D XPoint, which doesn't suffer from being kept fully utilised - another point in favour of it as the ideal cache drive).
If you have one game, it will fill the SSD with it; if you have two, the most-used game gets proportionally more space (for example, as sketched below), though some games will have heavier textures, and it will vary on a game-by-game basis.
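For example, something like this (a sketch; weighting purely by play time is my guess at what 'proportionally more' could mean):

```python
def allocate_cache(budget_gb, play_hours):
    """Split the cache budget across games in proportion to how much each
    is played; a real scheme would also weight per-game asset sizes."""
    total = sum(play_hours.values())
    return {g: budget_gb * h / total for g, h in play_hours.items()}

print(allocate_cache(128, {"GameA": 30}))               # {'GameA': 128.0}
print(allocate_cache(128, {"GameA": 30, "GameB": 10}))  # GameA 96 GB, GameB 32 GB
```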
I think studying how Optane works would make this easier to accept as the more ideal proposition.
Optane generally doesn't cache an entire game or the entire OS - sure, you can force it, but that's not ideal. With a 16 GB drive it's about using the space smartly, and the granularity of 3D XPoint handles that much better, at much higher speeds, than common SSDs.
I was searching for the dollar-per-gigabyte difference between SLC and 3D XPoint but haven't found anything meaningfully relevant.
 
Or... a standard NVMe SSD... and Cell is back, as a crazy-fast dedicated "unpacker".
No?
Oh well...
It's a custom solution.
And yes, that idea of Cell returning to do double duty passed through my mind as well. Maybe Sony will surprise us with PS3 back-compat; maybe they will have two console variants as well. Let's see.
 
More likely, requiring people to buy new, expensive proprietary storage is just a way to turn people off your product. Look at the ill-fated Memory Stick format. The only reason to use XQD at all is when other formats are incapable of providing the necessary bandwidth. Otherwise it's more expensive, and photographers aren't going to want it much if they don't gain anything from it.

If no-one's using XQD, the cost will potentially be higher due to low manufacturing volume, which in turn means people not wanting to use XQD (now CFexpress). It has nothing to do with 'big plans' and everything to do with whether Sony can sell the idea or not. So far, since its announcement in 2010, they haven't. PS5 isn't going to change that. However, it does offer the possibility of an NVMe card design, as opposed to NVMe modules, for adding/changing storage, which would be marginally more consumer-friendly.

I see it differently. I think Sony's approach this time is different: they know their reputation for "proprietary stuff", so I have a "conspiracy theory" that Sony is exchanging technology/information with Nikon, Panasonic, etc., getting them to adopt it before Sony's own products, for market acceptance. As Sony is the major sensor vendor, and probably the most advanced as well, they might try to push certain technological aspects of their sensors, like 24 fps bursts or higher; this will demand excessive memory bandwidth, and XQD cards with enormous speed could offset that need, for example. Anyway, I'll stop talking about cameras since it's off topic.

XQD cards could work as cache extenders, if the interface and card speed are there... the same idea of offsetting the price of the console while allowing you to upgrade to a more deluxe version.
 