Predict: The Next Generation Console Tech

Do any of you really think there are millions of people out there just waiting to jump at the chance to strap on an expensive, nausea-inducing helmet? Even if you get past the expense and the nausea, I think you'd find a very limited market.

If the latency is low enough and the motion sensors good enough, I don't know why there'd be nausea. It would have to be a sit-down experience, though, and they'd have to figure out how to get around the terminal geek factor of putting on a gaming headset. People already tend to think of gamers as antisocial weirdos.
 
What the hell does a console maker care about average? He's looking to build something that is technically cutting edge, feasible and not priced beyond reasoning (but still prepared to eat a lot of the initial cost to encourage sales).

Sony and MS ate billion$ coming into this generation! Sony might not even make it all back!

And if you think that you will buy a new computer in 2016 with just 16GB of RAM you must be looking at the netbook section.

Plenty of people are buying 4GB systems now, and according to Steam, more of them than are buying 5GB+ systems. And these are gamers!

I bet there'll be 16GB systems being sold in 2016. And I bet that for most people (non-hardcore gamers) they'll be no better than 4GB systems. Most games run fine now on 2GB. In 2016 most games will run fine on 16GB systems (especially the console ports).
 
Where did I say I was talking about *average*? I distinctly remember saying that my calculations applied to what is possible at what time, not when it becomes average.

What the hell does a console maker care about average? He's looking to build something that is technically cutting edge, feasible and not priced beyond reasoning (but still prepared to eat a lot of the initial cost to encourage sales).

And if you think that you will buy a new computer in 2016 with just 16GB of RAM you must be looking at the netbook section.

An average PC consumes around 300 W at full load. A high-end PC, today, is around 600 W.
I bet that the next-generation consoles won't go over 150 W, which strongly limits the computational power... what do we need 8 GB for, when even 4 GB may be too much with streaming technology? (That's 4 GB of assets in a single frame, which is about half of today's average assets for an entire game!!)

As of today, I can't see that much need for 16 GB. The average user doesn't even use 4 GB... that's why tablets, netbooks, and smartphones are dwarfing PC sales.
Eventually we will use even 16 GB, maybe because we will all have "Retina Display"-like screens and 1-gigapixel cameras... but look how computing is changing: it's lighter, it's mobile. In 2016 I may not even need to buy a big, loud PC, if a much smaller form factor can provide all the computing I need. :)
 
If the latency is low enough and the motion sensors good enough, I don't know why there'd be nausea. It would have to be a sit-down experience, though, and they'd have to figure out how to get around the terminal geek factor of putting on a gaming headset. People already tend to think of gamers as antisocial weirdos.

Gaming has very much moved mainstream, and a helmet is a massive limitation on most of the innovation that has gotten it there.
 
Gaming has very much moved mainstream, and a helmet is a massive limitation on most of the innovation that has gotten it there.

True enough, but I still think that VR is the big remaining technological discontinuity that could really impress people... the big remaining 'new experience'.

Didn't anyone else read Snow Crash? ;)
 
See MfA's thread on this subject. VR headsets won't be massive helmets with screens on, but will use something amazing like laser projection. The future will look like classy goggles. There are still issues with VR becoming the selling point of a console, but it's not limited to the geeky fashion of the VR Arcades.
 
Where did I say I was talking about *average*? I distinctly remember saying that my calculations applied to what is possible at what time, not when it becomes average.

What the hell does a console maker care about average? He's looking to build something that is technically cutting edge, feasible and not priced beyond reasoning (but still prepared to eat a lot of the initial cost to encourage sales).
The only place console makers tend to care about "cutting edge" is graphics. Everything else is designed to be as cheap as possible without limiting the graphics too much. Neither the Xbox 360 nor the PS3 was cutting edge when it launched. Innovative, yes. Overpowered compared to the current tech? No. The 360 in 2005 and the PS3 in 2006 both had only 512MB of total RAM, at a time when people's graphics cards were starting to have that much memory on their own, never mind the main RAM. In-order CPUs, half-width buses, small CPU caches. These don't scream "cutting edge". Multiple cores were ahead of the game, but even those were chosen more because they save development cost and because there wasn't much gain in clock speed at the time.

The reason the consoles kept pace so well is simply resource allocation. If you open Task Manager, you'll often see hundreds of processes on your Windows (or Linux or Mac, for that matter) box. The Xbox 360 and PS3 tend to have basically one. When these machines are running your game, it's all they're running. There's no memory manager or paging, there's no exception handling, there's no process-switching overhead. It's like every app on your PC having 95% of the CPU, 480MB of dedicated memory (and no paging), and almost all of the GPU.

Combine that with developers coding to well-defined, static hardware, and that's why the consoles still look pretty good.

For next gen, we want more texture space, more CPU (for better AI), and a beefier GPU to handle all that extra data. I'd be extremely surprised if any of the next gen consoles goes over 4GB combined RAM, simply because that's about where graphics cards are now, it allows a 32 bit flat memory model, and it's at 8x the previous generation, which was 8x their previous generation. Even 2GB might be possible, since there's diminishing returns at a certain point with texture resolutions.
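To make the "8x the previous generation" arithmetic explicit, here's a quick back-of-the-envelope sketch. The 64 MB baseline for the earlier generation is my own assumption (using the original Xbox as the reference), not something stated above:

```python
# Rough check of the "8x per generation" RAM scaling argued above.
# Figures are total system RAM in MB; 64 MB (original Xbox) is assumed
# as representative of the ~2001 generation.
prev_gen_mb = 64                          # assumption: original Xbox
curr_gen_mb = 512                         # Xbox 360 / PS3
scale = curr_gen_mb / prev_gen_mb         # 8x
next_gen_mb = curr_gen_mb * scale         # 4096 MB = 4 GB, the ceiling argued for above
print(scale, next_gen_mb / 1024, "GB")    # 8.0 4.0 GB
```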
 
True enough, but I still think that VR is the big remaining technological discontinuity that could really impress people... the big remaining 'new experience'.

A lightweight set of head tracking VR glasses, with wide angle stereo Kinect style cameras mounted on the glasses for accurate POV based 1:1 arm/hand/finger tracking (and image mapping - LOOK IT'S MY HAND!) and gesturing would be my dream. I'd sacrifice any amount of RAM for that, and pay any amount for the system.
 
Considering Kinect's latest world-modelling tech, that'd be a perfect solution. 3D depth cameras on top building a 3D world, with stereoscopic displays. Would provide absolute immersion opportunities. You could use optical reference points too for cheap absolute positioning.
 
Gaming has very much moved mainstream, a helmet is a massive limitation to most of the innovation that has gotten it there.

[Image: Vuzix Wrap 1200 video glasses]


Hardly a helmet. These are coming to market soon. No idea if they are any good, but it's still a good example of how technology can improve to the point where an idea is at least worth revisiting.
 
Considering Kinect's latest world-modelling tech, that'd be a perfect solution. 3D depth cameras on top building a 3D world, with stereoscopic displays. Would provide absolute immersion opportunities. You could use optical reference points too for cheap absolute positioning.

Yeah, the possibilities are amazing when you stop to think about it. You could take the VR experience to so many places.

On the basic end you could do cool things with custom controllers (and hand/controller interaction in the VE - VR Time Crisis and Virtua Cop plz), or use your real environment with projected characters (cue YourShrink, where Sigmund Freud sits in your lounge with you).

You could even solve some of the mundane, safety related issues like accidentally hitting something or someone while immersed in the VR - the cameras can detect possible collisions and warn you.
 
[Image: Vuzix Wrap 1200 video glasses]


Hardly a helmet. These are coming to market soon. No idea if they are any good, but it's still a good example of how technology can improve to the point where an idea is at least worth revisiting.

So how well will those work for the 40% of the population that already wears glasses? (I'm excluding the ones who already wear contact lenses.)

How many times does a bad idea have to fail before it's just a bad idea?

VR has some great uses in concept, but I don't expect any of it to happen in my lifetime.
 
[Image: Vuzix Wrap 1200 video glasses]


Hardly a helmet. These are coming to market soon. No idea if they are any good, but it's still a good example of how technology can improve to the point where an idea is at least worth revisiting.

The problem with all of the modern solutions is the pathetic FOV; all of the Vuzix stuff is < 35 degrees. You really need > 70 degrees for it to feel immersive.

The problem is that with conventional optics and a decent FOV you can't get away with a compact form factor, and you're back to the helmets.

There are some interesting technologies that may or may not resolve the issue. Retinal projection is often thrown around as a solution, but in 10 years no one has overcome the tiny exit pupil. The current work with holographic waveguides might provide a solution, but no one is mass-producing them today.
 
Though true, as mentioned much, much earlier in this epic thread, cost reduction quite possibly won't be as pronounced. The release cost has to keep an eye on expected cost reductions. It may be safer to release at a lower price, fearing limited cost reduction, than to go for a $400 console hoping it'll come down to $99 in 5 years. As I see it, traditional die shrinks aren't yielding the returns that the past couple of generations saw, but at the same time we're getting new fabrication methods, so it's hard to call. Designing your console has to be more about the ongoing price for its entire life, rather than just the launch price.

The Xbox 360 has come down $100 on each SKU in almost 6 years, and the PS3 has come down to half its launch price, albeit with a lot of features taken out. Both consoles have sold pretty well. They both got that level of cost reduction from two die shrinks (65nm, 45nm), and a console released in 2013 on 32nm would still be able to expect the 22nm and 11nm die shrinks within 6 years. Even if die shrinks aren't as forthcoming as previously, we still benefit from a lower cost per wafer and increased yields over time. I don't believe anyone is going to forget to build in the ability to reduce component costs over time, so this ought to be considered a non-issue, about as much of an issue as whether or not they'll put a controller in the box.
 
Neither the Xbox 360 nor the PS3 was cutting edge when it launched. Innovative, yes.... In-order CPUs, half-width buses, small CPU caches. These don't scream "cutting edge". Multiple cores were ahead of the game, but even those were chosen more because they save development cost and because there wasn't much gain in clock speed at the time.
I wouldn't say early multicore CPUs weren't cutting edge just because they had to sacrifice some cache and single-threaded performance. Obviously they were cutting edge, because they were ahead of the curve in the consumer space in terms of multiprocessing, and in doing so they made tradeoffs that big iron wouldn't have to make. Unless I'm mistaken, there was no supercomputer-on-a-chip in 2005 with OoOE and large caches that would have made Cell look dated.
 
Well thought out, but your facts are inaccurate.

The Intel 286 launched at the beginning of 1982 and supported 1MB of RAM from day one. In 1983 the Apple Lisa launched with 1MB of RAM as standard.

Also, if memory serves, 1GB of RAM became available to the masses (in 2x512MB DDR configurations) in late 2004, and became the high-end standard in 2005 with the Athlon 64. So in fact this 1000-fold increase took about 23 years.

As for 2005 and onwards?

2005 = 1GB
2006.5 = 2GB
2008 = 4GB (in late 2008 this rose to 6GB due to the i7 900 series launch with its triple-channel memory)
2009.5 = 8GB
2011 = 16GB

The correct extrapolation of RAM sizes from 2011 onwards should look like this (giving 18 months for doubling):

2011 = 16GB RAM (gaining traction with Sandy Bridge already; built my first 16GB one back in March)
2012.5 = 32GB
2014 = 64GB (this is where the next gen consoles should hit the market, with 4GB?)
2015.5 = 128GB RAM
2017 = 256GB
2018.5 = 512GB
2020 = 1024GB

So ... in this light 4GB of RAM in the next gen consoles is woefully obsolete, and even at 8GB it would be very hard for the console makers to justify the raging 56GB difference in RAM capacity by claiming "Windows overhead".

EDIT:

There is one perspective which would make even the 4GB make sense, sort of. That would be if consoles used ultra-fast GDDR5 (graphics RAM). Then, of course, all the RAM would be unified, like in the X360. At the moment we have 2GB graphics cards available (refraining from bringing in the ultra-ultra high end). The extrapolation would then go like this, giving 2 years for doubling (since GDDR has been doubling more slowly lately):

2011 = 2GB
2013 = 4GB (this is probably where the specs are laid down for the next gen)
2015 = 8GB (this is where it would launch)
2017 = 16GB (this is where mainstream gaming PCs would approach the point where they supersede console hardware at a reasonable price point, but still more expensive than consoles, while the high end will be untouchable)
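For what it's worth, both extrapolations above are just doubling series. A minimal sketch, taking the post's own starting points and doubling periods as given:

```python
# Doubling-series extrapolation behind the two tables above: system RAM
# doubling every 18 months from 16 GB in 2011, and graphics (GDDR5) RAM
# doubling every 24 months from 2 GB in 2011.
def extrapolate(start_year, start_gb, doubling_years, end_year):
    points = []
    year, size = start_year, start_gb
    while year <= end_year:
        points.append((year, size))
        year += doubling_years
        size *= 2
    return points

print(extrapolate(2011, 16, 1.5, 2020))  # 16, 32, 64, ... 1024 GB by 2020
print(extrapolate(2011, 2, 2.0, 2017))   # 2, 4, 8, 16 GB by 2017
```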

I think your numbers are off. Yeah, in '83 the Lisa had 1 meg, but it was a total failure because it cost the equivalent of about $22,000 today, more than most cars of the time. A more average and successful computer would be the Apple Macintosh in '84, which cost about $5,000 in today's dollars and only had 128KB of RAM.

My numbers are based off personal experience. My first PC in '91 was an IBM 286 with one meg in it; it was about average for the time, maybe even a little higher end. In late 2000 I built a mid-to-high-end Athlon 1333MHz with 512 megs in it. So maybe one gigabyte for 2001 is pushing it a little, but even at 512 megs the point is the same, especially since my guesstimate of 7 earlier seems to be overstated; from the Steam results posted by others, it's closer to 5, so it ends up being the same 100x slowdown.

The next consoles are going to be about smaller and more profitable, not larger, hotter, bigger and faster. If they want to keep the same bandwidth-to-RAM ratio as they have now with the 360/PS3, then they top out at 4 gigabytes unless there's some RAM advancement I don't know about. At best, a 256-bit interface with high-end GDDR5 is gonna get you ~200 GB/s, which is about the same ratio for 4 gigabytes. For 8 gigs you'd need nearly 400 GB/s; there's no way to cheaply or easily do that, even by 2014.
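A rough sketch of the arithmetic behind that ~200 GB/s figure. The ~6 Gbit/s effective rate per pin is my own assumption (the post doesn't state one), and the Xbox 360 is used as the reference for the bandwidth-to-capacity ratio:

```python
# Peak bandwidth of a GDDR5 interface: bus width in bits / 8, times the
# effective data rate per pin in Gbit/s, gives GB/s.
bus_bits = 256
gbps_per_pin = 6.0                              # assumed effective GDDR5 data rate
bandwidth_gbs = bus_bits / 8 * gbps_per_pin     # ~192 GB/s, close to the ~200 GB/s quoted

# Bandwidth-to-capacity ratio, using the Xbox 360 (22.4 GB/s to 512 MB) as reference.
x360_ratio = 22.4 / 0.5                         # ~45 GB/s per GB of RAM
print(bandwidth_gbs / 4, bandwidth_gbs / 8, x360_ratio)  # 4 GB keeps the ratio; 8 GB halves it
```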
 
That's really only the case because developers are being held back by 32-bit Windows users. If PC developers didn't have to worry about all their potential customers who haven't upgraded to a 64-bit version of Windows, they would certainly use more RAM. You're confusing a technical limitation with a developer choice.

We have an interesting point here.

Just a thought... games on PC are most likely still in the 2-3GB range because a large share of users are still on 32-bit OSes, which have difficulty addressing more than 4GB. That won't be a problem for 64-bit OSes, so there may be an even greater boom in RAM use in the PC universe, and consequently an even greater need for the next gen consoles to have at least a considerable fraction of that.
 
We have an interesting point here.

Just a thought... games on PC are most likely still in the 2-3GB range because a large share of users are still on 32-bit OSes, which have difficulty addressing more than 4GB. That won't be a problem for 64-bit OSes, so there may be an even greater boom in RAM use in the PC universe, and consequently an even greater need for the next gen consoles to have at least a considerable fraction of that.

Actually, the limit is 2GB per application with a 32-bit OS.
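A quick illustration of where that 2GB figure comes from. This reflects the default 32-bit Windows user/kernel split; the /3GB boot option raises the per-process limit to 3GB:

```python
# A 32-bit pointer can address 2**32 bytes = 4 GiB of virtual address space,
# and 32-bit Windows reserves half of it for the kernel by default, leaving
# 2 GiB of user address space per application.
total_va = 2 ** 32                  # 4 GiB of virtual address space
user_va_default = total_va // 2     # 2 GiB usable by the application
print(total_va / 2 ** 30, user_va_default / 2 ** 30)  # 4.0 2.0 (GiB)
```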
 