Predict: Next gen console tech (9th iteration and 10th iteration edition) [2014 - 2017]

I think next-gen consoles are going to be radically different from what we are envisioning now.

Nvidia's new SHIELD and Valve's console will probably take hold in the next few years. I'm guessing we'll see consoles that use the cloud for a lot of things instead of trying to keep up with AMD/Nvidia's ongoing PC technology (which will clearly be 2-3 generations ahead by the time each new console releases). To do *good* VR on consoles, you'd need a lot of power (ideally 4K per eye instead of 1080p per eye @ 120Hz). I'm not seeing that kind of power next gen unless they use cloud computing.
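For a sense of the scale involved, here's a rough back-of-envelope sketch; the VR figures are the ones quoted above, and the 1080p/30fps baseline is my own assumption for comparison:

```python
# Rough pixel-throughput comparison: the VR target quoted above
# versus an assumed current-gen 1080p/30fps console workload.

def pixels_per_second(width, height, eyes, hz):
    return width * height * eyes * hz

vr_target = pixels_per_second(3840, 2160, eyes=2, hz=120)  # 4K per eye @ 120Hz
baseline  = pixels_per_second(1920, 1080, eyes=1, hz=30)   # assumed 1080p30

print(f"VR target : {vr_target / 1e9:.2f} Gpixels/s")  # ~1.99 Gpixels/s
print(f"1080p30   : {baseline / 1e6:.1f} Mpixels/s")   # ~62.2 Mpixels/s
print(f"Ratio     : {vr_target / baseline:.0f}x")      # ~32x the pixel rate
```

That's roughly 32 times the raw pixel rate of a current 1080p30 game, before any per-pixel quality increase.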
 
I don't believe in pure cloud gaming because of a business-model problem: you can't share a server CPU/GPU between many clients the way you can in other public cloud markets. Outside of HPC, a server needs nothing more than a cheap integrated GPU, or no GPU at all. If Google or Facebook needed one server per active user, they would not be economically viable. You can do some calculation in the cloud (non-gameplay physics?), but most of the computation will take place on local hardware. Many people are surprised by the PS Now price, but you are effectively renting the hardware; imagine the price of PS Now with new AAA games. At least this time there's no exotic hardware involved. Then there are the technical problems: connection quality is variable, so a pure cloud-based model means losing some current video game customers with poor connections, and fibre deployment is slow in Europe, for example.
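To illustrate the one-server-per-client problem with some deliberately made-up numbers (every figure below is a hypothetical assumption, not real pricing):

```python
# Toy amortization model: a web server multiplexed across thousands of
# users versus a streaming server whose GPU is pinned to one client.
# All numbers are illustrative assumptions.

server_cost = 3000.0         # $, assumed cost of a GPU-equipped server
lifetime_months = 36         # assumed amortization period
monthly_cost = server_cost / lifetime_months

web_users_per_server = 5000  # assumed: requests shared across many users
game_users_per_server = 1    # the GPU is owned by one client while playing

print(f"Web service : ${monthly_cost / web_users_per_server:.4f} per user/month")
print(f"Game stream : ${monthly_cost / game_users_per_server:.2f} per user/month")
# Before even counting bandwidth and power, the per-user hardware cost
# differs by orders of magnitude, which is the point made above.
```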

I don't believe in the success of Android consoles... Valve's model isn't the right one either, with multiple hardware configurations like a PC. I don't even see the advantage of Valve's hardware over a PC...

The next generation will come in 2019 or 2020 at best; before then it will be very difficult to create a console with enough power to make a difference. In terms of power, I think the minimum is a GPU as powerful as the Titan X or 390X with DX12 hardware features (and maybe DX13 ones), 32 to 64GB of stacked RAM, and an 8-core x86 CPU much more powerful than Jaguar, with a good SIMD ISA running the latest version of AVX at full rate.

Going the ARM route means backward-compatibility problems again. I think AMD will be the chosen partner once more, or maybe there will be a surprise and one console maker will work with Intel.

My concern about the next generation is the slowdown in process-node progression; we've spent so much time on 28nm...
 
It took another six months before that capacity clawed its way back into the discussion. It was considered a nice thing to have on the wish list.


There are diminishing returns. I suppose someone could imagine what they would do with hundreds of gigs of RAM, but I think at that point we would need to consider improving the IO of the platform to match.
Progress in non-volatile memories may be something that deserves more focus.

I agree. With 32GB or 64GB of memory, a hard drive will be a problem; maybe a hybrid HDD/SSD?
 
128GB of RAM for the PS5!

RAM requirements for games have begun to level off over the years: Crysis 3 (req. 3GB, rec. 4GB), Unity (req. 6GB, rec. 8GB), BF Hardline (req. 6GB, rec. 8GB), Witcher 3 (req. 6GB, rec. 8GB). It's hard to see these numbers increasing sixteen-fold in the next 3-4 years unless some new RAM-intensive software techniques become the norm, which isn't likely if you look at the actual hardware in use by the Steam community: over 85% of the Windows gaming community have 8GB (29%) or less (56%). 44% have 4GB or less! :runaway:

Instead of including an excessive amount of RAM, they could target a more conservative amount (say 32GB or 64GB) and use the savings to bolster the CPU and GPU. There's no point having a ton of RAM without the capability to use it, unless it's just going to act like a giant cache.
 
You also need to fill RAM from somewhere. How large will optical media be in 5 years? I suspect not all that different, as new codecs mean 4K video will not require it. That might triple or quadruple capacity, but not 8x. Broadband adoption might have improved, but speeds are unlikely to be that magnitude higher either.
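As a quick sanity check on the download alternative (disc sizes from current Blu-ray capacities; the connection speeds are assumptions):

```python
# How long a Blu-ray-sized download takes at various broadband speeds.
# Speeds are illustrative assumptions.

def hours_to_download(size_gb, mbit_per_s):
    return size_gb * 8e9 / (mbit_per_s * 1e6) / 3600

for size_gb in (50, 100):        # dual-layer BD, triple-layer UHD BD
    for speed in (10, 50, 100):  # Mbit/s, assumed typical connections
        print(f"{size_gb}GB at {speed:3d} Mbit/s: "
              f"{hours_to_download(size_gb, speed):5.1f} h")
# 50GB at 10 Mbit/s is ~11 hours, so discs stay relevant unless
# typical speeds rise by an order of magnitude.
```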

That is before you consider the artist cost to create that quantity of content. Procedural rules to alter base templates may make that more cost-effective? (No idea on the technology, but 64k PC demos always seem pretty enough.)

Is it more likely we will go big on local cache on the GPU for heavy local compute, and favour even higher memory bandwidth over capacity? Fluid and other complex simulation, superior lighting effects, and procedural techniques to fill the void.
 
The UE 4.0 open-world GDC 2015 demo uses some procedural techniques for some assets. I don't think it can replace all assets, but for vegetation it could be a good solution.
 
You also need to fill RAM from somewhere. How large will optical media be in 5 years?
While true, not all data in RAM will come from the install. A fair bit of RAM will be world-state simulation data that is calculated. Lots of things are now calculated and populated into datasets in RAM for lookup. But even allowing for this, 128GB is a lot even if you reserve something like 32GB for the OS and background apps.

Look at Minecraft: from a tiny binary you can generate a world gigabytes in size.
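As a toy illustration of that idea (a generic seeded height-map sketch, not Minecraft's actual terrain algorithm):

```python
import random

# A small seed expands deterministically into arbitrarily large world
# data: the "program" is a few lines, the output can be gigabytes.

def chunk_heights(seed, chunk_x, chunk_z, size=16):
    """Generate a size x size height-map for one chunk, reproducibly."""
    rng = random.Random(f"{seed}:{chunk_x}:{chunk_z}")
    return [[48 + rng.randint(0, 32) for _ in range(size)] for _ in range(size)]

world_seed = 1234567
# Any chunk can be regenerated on demand instead of being stored:
print(chunk_heights(world_seed, 0, 0)[0][:8])
print(chunk_heights(world_seed, 0, 0)[0][:8])  # identical every time

# Storing 60,000 x 60,000 such chunks at 2 bytes per height would take
# ~1.8 TB; generating them needs only the seed plus this function.
```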
 
But even allowing for this, 128GB is a lot even if you reserve something like 32GB for the OS and background apps.

More than 8GB for the OS and background apps is crazy.
 
But back to next gen: even 16GB seems like a lot to ask. Then again, the Titan X just dropped with 12GB of GDDR5... and they're talking about 16GB of stacked memory.

Since NVIDIA said Pascal will have a version with up to 32GB of stacked memory, and AMD will ship a board with HBM this year, I'm going to predict that we'll see 24-32GB of memory, all of the HBM variety.

I don't think there's a need to go for more memory, as I don't think developers will necessarily want to increase game size. I think art assets will still target 1080p and there won't be a new physical medium.
 
More than 8GB for the OS and background apps is crazy.
It sounds crazy now. When Microsoft launched Windows XP in 2001, it had a minimum RAM requirement of 64MB (128MB recommended). The notion of 1GB for the OS would have been outlandish and unbelievable, but Windows 7 launched in 2009 with a minimum RAM requirement of 1GB (32-bit) and 2GB (64-bit). That's a 16x/32x increase over eight years. Windows 10 hasn't upped this any, but you don't know what's around the corner in terms of technology, nor how well Windows 10 will really run with 1GB of RAM. My guess is: not well.

Obviously during the development of Longhorn, then Vista, then Windows 7, the concept of what the OS should do marched on. Stability, security and a modern UI overhaul took their toll on RAM. Nobody knows what consumers will expect a console OS to do in 3-4 years at launch, let alone over the 5-10 years it'll be in use. Predicting what we'll be using RAM for in almost fifteen years' time? :nope:

But even then, 128GB sounds nuts. :yep2:
 
It sounds crazy now.

A console doesn't need to run like Windows; I think it's more comparable to iOS or Android. With 4GB you can run Windows 7, 8 or 10 without big problems for non-gaming apps. 8GB is enough to run an OS and apps.
 
RAM requirements for games have begun to level off over the years.

The main area of needed benefit is bandwidth. Look at Microsoft's solution with ESRAM: very high bandwidth, but an extremely small size. That's why I think even just doubling to 16GB, of HBM, would be substantial enough.
 
Considering that the next consoles will almost certainly use an HBM2 derivative, neither memory size nor bandwidth will be a problem, and the OS will be able to take as much as it wants.
 
A console doesn't need to run like Windows; I think it's more comparable to iOS or Android. With 4GB you can run Windows 7, 8 or 10 without big problems for non-gaming apps. 8GB is enough to run an OS and apps.
Again, you're looking at the present and disregarding the fact that software, including the functionality of operating systems, has been in constant development for half a century.

When Windows XP launched, nobody had any expectation that the OS should provide kernel ASLR, a firewalled TCP/IP stack, malware protection or deep user permissions, but it does now. When the PS3 and 360 launched, nobody had any expectation that consoles should easily allow capture and upload of photos and video for sharing. But sharing is a thing now, as is tight integration with social networks. Early browsers worked in tens of kilobytes of RAM; now they'll use gigabytes.

Your proposition is basically predicated on: that's it, we're done. No memory-hungry technology or system interoperability will be developed necessitating more RAM. Nope, we're done! R&D stops. :nope: I can see the next generation of consoles supporting multiple users doing different things: somebody using the console while connected to the TV, others using console functionality from remote devices or a VR headset. That'll take more RAM.
 
More than 8GB for the OS and background apps is crazy.
No it isn't.

I hate it when the XB1 tombstones apps or a game because the wife has used Netflix/Xbox Video/Viaplay. It's really nice to turn on your console and continue on the Nürburgring in Forza 5 where you left off last night, in 5 seconds. Having to wait for apps or games to load is a pain in the ass once you've gotten used to instant switching.

Si scaling is coming to an end, which will affect the amount of DRAM going forward. With the integration of high-bandwidth DRAM onto the substrate/die, we will see a performance/price-optimized memory hierarchy: 4-8GB of tightly integrated high-bandwidth memory, 16-32GB of external memory (how it's done today), and a ton of flash memory (possibly on the memory bus for high-bandwidth transfers). Having a high-bandwidth pipe to flash memory would make switching back to tombstoned apps more palatable.
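A rough feel for why that flash pipe matters; the app footprint and bandwidths below are assumptions, not known specs:

```python
# Time to page a suspended ("tombstoned") app image back into RAM
# at different storage bandwidths. All figures are assumptions.

app_image_gb = 2.0  # assumed resident footprint of a suspended app

links = [
    ("HDD, ~100 MB/s", 100),
    ("SATA SSD, ~500 MB/s", 500),
    ("Flash on memory bus, assumed ~5 GB/s", 5000),
]

for name, mb_per_s in links:
    seconds = app_image_gb * 1024 / mb_per_s
    print(f"{name:38s} {seconds:5.1f} s")
# ~20s from an HDD versus ~0.4s from bus-attached flash: the latter
# gets close to the instant switching described above.
```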

With 3D stacking, flash memory can scale the price/bit downward a bit longer than DRAM, shifting the balance in its favour.

Cheers
 
Again, you're looking at the present and disregarding the fact that software, including the functionality of operating systems, has been in constant development for half a century.

Since Windows Vista, the RAM needed to run Microsoft's OS has decreased or stayed the same. We will see in 2020 whether we need more than 8GB for running an OS and basic apps (media, browser, OS gaming functions...), not only on consoles but on PCs too. I don't think it will be the case. We are at about 3GB now, and I can't think of any usage that would demand much more than that. With 8GB the console makers will have more than enough RAM to run the OS and apps.

I am on PS4, and launching Netflix is very fast; same for other apps like YouTube. Maybe Sony will allow launching two or three apps and one game. For now the restriction is one app and one game.
 
Look at some other OSes though, and you realise the picture isn't one of constantly increasing requirements.

Mac
Tiger (10.4), 2005, 256 MB
Leopard (10.5), 2007, 512 MB
Snow Leopard (10.6), 2009, 1GB
Lion (10.7), 2011, 2GB
Mountain Lion (10.8), 2012, 2GB
Mavericks (10.9), 2013, 2GB
Yosemite (10.10), 2014, 2GB

Android
Gingerbread (2.3), 2010, 512 MB
...
Lollipop (5.0), 2014, 512 MB

There's only so much you can have an OS do. Everything else is apps running on that OS. And as described before, several GBs of RAM is good for a lot of apps. We've all baulked at the reservations in XB1 and PS4 because they're basically overkill, and so far that's playing out. What is 3.5GB of RAM doing in the PS4? Contrast that with a 4GB Windows laptop running high-res photo editing and several other multitasked apps, or a 2GB phone running multiple apps, and it's clear that the PS4 is doing squat - there isn't really much you can need GBs of OS RAM for.

The progression of PC RAM has followed the advance of media quality, from digital video and 2 MP photos to HD video and 16 MP photos. That's moving to 4k and/or 3D/VR video and 40+ MP images...it's still firstly going to top out and secondly be content in apps rather than OS functions. As long as people aren't editing photos at the same time as playing their games, there won't need to be a significant separation in RAM again.

Not only is more than 8GB for OS and background tasks crazy, but more than 4GB, maybe even more than 2GB, is pretty crazy. You'd need apps with significant amounts of content, like Google Maps with the entire country resident in memory to save a few seconds of loading time. About the only 'legitimate' use for more RAM I can think of is something like a VR hub world as the base interface for the console, so you'd have a multi-GB 3D space to swap to at any instant. A slightly less legitimate use would be to cache a load of stuff to make things smoother, such as video thumbnails of games - finding a use for more RAM simply because it's there and you need to have it doing something.
 
Look at some other OSes though, and you realise the picture isn't one of constantly increasing requirements.

Maybe the PlayGo system is part of the reserved RAM on the PS4. Sony uses a lightweight BSD system; sometimes I ask myself what they are doing with all the reserved RAM...
 
But back to next gen: even 16GB seems like a lot to ask. Then again, the Titan X just dropped with 12GB of GDDR5... and they're talking about 16GB of stacked memory.

Actually, Nvidia's CEO confirmed that the maximum Pascal will support is 32GB, but such models [if they make them] will probably not arrive in the consumer space. :)

Five years from now, by 2020, I can see new consoles using 32GB of stacked memory: 4 HBM stacks of 8x1GB memory chips [HBM2 will support 512MB chips if I'm not mistaken]. Heck, I would not be surprised if they go beyond that to a maximum of 64GB. Add a 0.5-1 Tflops CPU and a 10-12 Tflops GPU and we will have a good 4K-ready console that will kick ass in 1080p games.

However, mass storage will be a problem. Hopefully there will be significant movement in the field of cheaper SSD drives. Next-gen consoles would benefit a lot even from 1-2TB SSD drives stuck at SATA2/USB3 speeds [250-350MB/s read/write].
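To put those speeds in context, here is a quick sketch of best-case RAM fill times (the 32GB pool matches the prediction above; the HDD baseline is an assumption):

```python
# Minimum time to fill a 32GB RAM pool from different storage tiers,
# ignoring decompression and seek overhead.

ram_gb = 32

for name, mb_per_s in [("HDD, assumed ~120 MB/s", 120),
                       ("SSD at 250 MB/s", 250),
                       ("SSD at 350 MB/s", 350)]:
    seconds = ram_gb * 1024 / mb_per_s
    print(f"{name:24s} {seconds / 60:4.1f} min")
# Even the faster SSD needs ~1.6 minutes for a full fill, so streaming
# and partial residency would still matter with a big RAM pool.
```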

You also need to fill RAM from somewhere. How large will optical media be in 5 years?
The Blu-ray Disc Association has already adopted the standard for a triple-layer BD disc that can hold 100GB of data. Those discs will be used for 4K films, and the first players able to read them will appear at the end of THIS year.
http://en.wikipedia.org/wiki/Blu-ray_Disc#UHD_4K_Blu-ray_Disc

In addition to that, Sony and Panasonic are working on a very fast rollout of the next generation of BD discs, named "Archival Disc". The format starts at 300GB [double-sided, though] and can be scaled up to 1TB. The laser is the same as in current BD readers.
http://www.engadget.com/2014/03/10/sony-panasonic-archival-disc/
 
Look at some other OSes though, and you realise the picture isn't one of constantly increasing requirements.
And nobody is claiming that, but bear in mind the minimum requirements really are the bare minimum RAM required to run the OS with very little on top. Windows 7, Windows 8.x and Windows 10 all have a minimum RAM requirement of 1GB (32-bit) or 2GB (64-bit); however, 2GB isn't enough to run a basic office setup (Outlook, Lync, Word, Excel, PowerPoint) unless you run a single application at a time. Right now, Outlook on my work PC is consuming over 700MB and Lync over 250MB.

The same is true of your Mac examples. My first Mac was a 12" G4 PowerBook with 256MB that ran Panther (10.3), and believe me, that thing ran slow as shit until I put another 256MB in there. When it comes to operating systems, minimum really does mean the bare minimum required for the OS to run, with little overhead left for applications unless you like disk-swapping. That's why in addition to the "minimum requirement" there is usually a "recommended" amount, and in my many decades of using microcomputers, "recommended" is really the starting point on which you should build higher.

There's only so much you can have an OS do. Everything else is apps running on that OS. And as described before, several GBs of RAM is good for a lot of apps. We've all baulked at the reservations in XB1 and PS4 because they're basically overkill, and so far that's playing out.

Sure there is, but the basics of what an OS is expected to do are forever changing. Windows 3.x didn't include a TCP/IP stack. Windows 95/98 didn't have memory protection. Windows XP had almost no defences against malware. Windows 7 has no full encryption or privacy settings.

The progression of PC RAM has followed the advance of media quality, from digital video and 2 MP photos to HD video and 16 MP photos. That's moving to 4k and/or 3D/VR video and 40+ MP images...it's still firstly going to top out and secondly be content in apps rather than OS functions. As long as people aren't editing photos at the same time as playing their games, there won't need to be a significant separation in RAM again.

If you look at what Windows (the core OS) is using memory for, it's mostly resource tracking, security measures, the UI, and API overheads, particularly the TCP/IP stack. This isn't going to diminish; it will only increase as security measures grow to keep pace with threats. At work, the server OS is [obviously] bespoke, as is the TCP/IP stack, which is hardened. Many of the security measures we've had for years will be operating in tomorrow's commercial products, and they are RAM-hungry. You can engineer security to be less RAM-hungry, but then it becomes CPU-intensive. Pick one. Heuristic algorithms, which remain the cornerstone of modern security measures, generally make this choice.
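As a toy example of that RAM-versus-CPU trade (a generic illustration, not how any particular security product works):

```python
import hashlib

# Space-time trade-off: keep known-bad signatures resident in RAM for
# instant checks (RAM-hungry), or rescan/recompute on demand (CPU-hungry).

# Hypothetical signature table; a real product would hold millions of
# entries plus heuristic state, all resident in memory.
KNOWN_BAD = {hashlib.sha256(bytes([i]) * 64).hexdigest() for i in range(3)}

def check_payload(payload: bytes) -> bool:
    """O(1) lookup against the in-RAM table; the cost is its footprint."""
    return hashlib.sha256(payload).hexdigest() in KNOWN_BAD

print(check_payload(bytes([1]) * 64))   # True: matches a "signature"
print(check_payload(b"harmless data"))  # False
# Shrink the resident table and you must recompute or rescan instead;
# that is the RAM-versus-CPU choice described above.
```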

Not only is more than 8 GBs for OS and background tasks crazy, but more than 4GBs, maybe even more than 2 GBs, is pretty crazy.

I don't mean to be rude, but this statement reeks of ignorance. Unless you know exactly what the RAM is being used for and how, and you also have good knowledge of or experience in that particular software discipline, calling the RAM usage "crazy" is incomprehensible. There are valid questions about the RAM reserves on the PS4 (and Xbox One), and it's frustrating that the purpose of the reserves isn't known, but you can't judge without knowledge :nope:
 