Predict: The Next Generation Console Tech

Status
Not open for further replies.
I have several early MPEG-2 Blu-rays and the AVC HD DVDs of the same movies, and the AVC versions, with almost half the space, are generally much, much higher quality. MPEG-2 WAS a monumentally bad idea; we won't even get into the fact that it took Sony 2+ years after the Blu-ray launch for their encoding tools to not suck.

The reality is there is still a decent amount of quality/bandwidth headroom left in AVC. Really, it all comes down to how much computation you can throw at the encoding and decoding. A lot of devices still don't support the higher-end features of AVC, which can make a significant difference in bit rate per unit of quality.

The best 1080p picture I have ever seen was MPEG-2, but that was around 40 Mbps. Given enough bitrate, MPEG-2 is great!
 
SemiAccurate put out the transcript of the interview they had with Andrew Richards and Tim Sweeney.
Putting aside their disagreement about software rendering, they still make pretty interesting points.

One point they pretty much agree on is that having a discrete CPU and GPU doesn't make sense. Andrew Richards goes as far as stating that if you have to have two chips, he would favor two "fusion" chips over a CPU and a discrete GPU.

As architectures like Larrabee may not be a valid option for at least five years, the usual "random" questions arise in my mind (not all of them are console related; it's a bit more generic, about the needs of personal computing).
Soon we will have, on the same chip, the CPU, GPU, decoder, and encoder (in Sandy Bridge, for instance). The burden of quite a few tasks is being removed from the CPU. Pretty much everything the "average Joe" does with his computer will get accelerated in some way (even browsing will be GPU accelerated soon).
So the question that arises in my mind is: do SIMD units (like SSE, AltiVec, AVX) still make sense?

In the x86 realm you can't pass on them for compatibility reasons, but in a console (or in the embedded space; see Tegra 2), will it make sense to invest silicon in this type of unit?
They put quite some constraints on CPU designs and end up taking up a significant part of a CPU die (see here or here).
Looking at the Bobcat floor plan, for example, it looks like (huge assumption, I know) giving up the SIMD units could allow a Bobcat to be "bulldozerized" as far as die size is concerned.

Could the option be valuable for a console?
 

MMX is used in the Xbox 360, for example for skinning, among other things.
What you describe is, I think, Fusion's aim for the far future.
The problem right now with Fusion chips seems to be bandwidth. How do you connect the chips together and to the RAM pool?
 
Using MPEG-2 was an awful idea in the early Blu-ray days, as most of the releases were on 25 GB discs. 50 GB discs took a relatively long time to become feasible and commonplace.
 
MMX is used in the Xbox 360, for example for skinning, among other things.
What you describe is, I think, Fusion's aim for the far future.
(VMX ;) ) Indeed. My question is whether, with a flexible enough GPU, you could offload pretty much every FP calculation to the GPU.
The problem right now with Fusion chips seems to be bandwidth. How do you connect the chips together and to the RAM pool?
If there are two chips, they would be connected like any dual-CPU setup.
Bandwidth does look like a problem; AMD and Intel should consider the solutions in place in the mobile space and some form of tiled rendering. Maybe discrete and integrated GPUs should evolve along different paths :?:
 
Ultraviolet is excellent and is MPEG2.

Well, the movie was god-awful. But while it looks good, I've certainly seen just as good from the MPEG-4 compressed discs, and they use much smaller footprints.

No one is saying MPEG-2 can't look great. I'm sure giving an MPEG-2 movie encode 1 TB of space would make for a wonderful experience. The question is whether the new codecs can give the same experience using much less space.
 
The best 1080p picture I have ever seen was MPEG-2, but that was around 40 Mbps. Given enough bitrate, MPEG-2 is great!

Any codec will look good given a high enough bit rate. The best quality video I've ever seen was 4K MJPEG; granted, it was in the several-hundred-MB/s range... The trick with a video codec is maintaining quality as the bit rate is reduced. AVC contains all the functionality that is in MPEG-2 plus a bunch of additional tools, which enables it to give better video quality than MPEG-2 at effectively any practical bit rate. Likewise, HEVC will contain all the functionality of AVC plus additional tools, allowing it to further improve the quality-per-bitrate metric.
 
I'm very skeptical about HEVC being a big improvement over AVC. I predict at best 10% better compression than AVC with all features in use. Also, having travelled to Europe, I can definitely say that digital distribution is far off there. If a console is going to be released worldwide in the next 3 years, it will definitely use physical media.
 
I'm very skeptical about HEVC being a big improvement over AVC. I predict at best 10% better compression than AVC with all features in use. Also, having travelled to Europe, I can definitely say that digital distribution is far off there. If a console is going to be released worldwide in the next 3 years, it will definitely use physical media.


I don't know. A fully cloud-based console with minimal local storage, no hard disc, and no optical media sounds very attractive to me. Such a console is within the limits of possibility for a 2013+ launch, IMO.
 
I don't know. A fully cloud-based console with minimal local storage, no hard disc, and no optical media sounds very attractive to me. Such a console is within the limits of possibility for a 2013+ launch, IMO.
If OnLive is any indication, no, it isn't. The bandwidth problem may eventually be resolved over the coming decades, but the latency problem won't be. Pings haven't improved at all from 10 years ago.
 
If OnLive is any indication, no, it isn't. The bandwidth problem may eventually be resolved over the coming decades, but the latency problem won't be. Pings haven't improved at all from 10 years ago.
Indeed, and I wonder what the point would be. By 2013 any manufacturer should be able to pack enough power, cheaply enough, to justify most gamers upgrading. Why would they pay for the servers, the electricity bill, etc.?

The PS3 has shown this generation that a system can be "unhackable" (some ugly and temporary mistakes aside). I expect at least the MS and Sony next-gen systems to be completely bulletproof in this regard.
The main arguments for services like OnLive are no piracy and that everybody can play.
On the first point, bulletproof systems and the rise of digital distribution should lower the second-hand market's impact on the industry.
The people who will play anywhere are more likely to do it on ever more portable devices or commodity PCs (IMHO).
 
If OnLive is any indication, no, it isn't. The bandwidth problem may eventually be resolved over the coming decades, but the latency problem won't be. Pings haven't improved at all from 10 years ago.


I am not sure, but as far as I know the DVD drive that Xbox 360 games run from has around 15 MB/s bandwidth and 100 ms latency. That is comparable to a typical cable network's latency. Besides, I am not suggesting that the DVD drive should be replaced by the internet connection alone: the network plus limited local storage, used to buffer game data on the fly, can replace the need for optical media. Game levels can be temporarily installed in this local storage to overcome latency and bandwidth issues together. Cloud computing has so many advantages over physical media (distribution, DRM, price, ease of use for the end user, etc.) that it sounds very promising to me.
 
Local storage doesn't help you in a cloud computing scenario. You can't buffer actions that have yet to happen, other than things like pre-rendered videos where the latency doesn't matter, since the computation is taking place off-site, so to speak. It also means the latency is much higher than you think: send the action to the cloud, distribute it to the free resources on the cloud, then return the processed data. All of this takes time, and there is no way to shorten it other than reducing network latency.
 
Local storage doesn't help you in a cloud computing scenario. You can't buffer actions that have yet to happen, other than things like pre-rendered videos where the latency doesn't matter, since the computation is taking place off-site, so to speak. It also means the latency is much higher than you think: send the action to the cloud, distribute it to the free resources on the cloud, then return the processed data. All of this takes time, and there is no way to shorten it other than reducing network latency.


Game data will be placed in the cloud, not the computing. You will still have a CPU/GPU in the console. For online gaming there is no difference from today's situation, I believe: as of today your actions are shared over the network connection anyway, whether you read your game data from a disc or from the internet. The question is whether the network plus local storage can replace the DVD drive connection.
 
Game data will be placed in the cloud, not the computing. You will still have a CPU/GPU in the console. For online gaming there is no difference from today's situation, I believe: as of today your actions are shared over the network connection anyway, whether you read your game data from a disc or from the internet. The question is whether the network plus local storage can replace the DVD drive connection.

That doesn't make sense, as hard drive space is cheap these days; 2 TB 3.5-inch drives can be had for $100 by the end user. It makes more sense to download the game data than to stream it, as you'd certainly be very limited by bandwidth the other way.
 
That doesn't make sense, as hard drive space is cheap these days; 2 TB 3.5-inch drives can be had for $100 by the end user. It makes more sense to download the game data than to stream it, as you'd certainly be very limited by bandwidth the other way.

OK, let me explain why I think cloud storage has advantages over digital download.

1. Obviously, you eliminate the need for large local storage and for maintaining that storage.

2. Console manufacturers would lose the additional revenue from storage peripherals, but the solution is definitely cheaper for the end user. This decreases the entry price of the console significantly.

3. No physical limit for the user. It eliminates the need to uninstall games before new purchases. This may be a concern in a fully digital-download-based system, considering that the next generation of consoles will not launch with multi-terabyte local storage.

4. It makes sense if you don't want to hand the game over to the end user completely. Digital download dissolves the renting/used-games market, but it does not solve piracy completely. With cloud storage there is nothing to steal/copy/share.

5. How long do you think it would take to download a full next-generation game? Don't you think you would need a lot of bandwidth to shorten the download time to a reasonable level? Instead, you can start playing with a single click, the same way you insert the disc and play.


I am not arguing that next-generation consoles will adopt this model completely, but I predict that it will definitely happen sometime in the future. The transition from download to streaming has already started with music and movies, and it is bound to happen for games too. The biggest questions now are: how ubiquitous are high-speed networks? Are there enough people with such broadband connections? Is this population sufficient to sustain (economically) a cloud-storage-based console?

Maybe next-generation consoles will be somewhere in between and accommodate all kinds of distribution (physical media / digital download / cloud storage).
 