Predict: The Next Generation Console Tech

Who would maintain the cloud storage for the game?
The publisher, the developer, some third party?

For any non-subscription sales model, that's an ongoing expense with no end in sight (or they can stop paying and possibly strand the game).

That's true. Pushing computation and storage to the cloud could spell an early end of life for many games. Only the most popular titles would justify the expense to the publishers, even if gamers paid a per-title subscription.
 
It has been mentioned earlier (in this thread) that Sony might use a variation of Nvidia's GT300 architecture in their next console, and since I've been thinking about whether this is feasible, I thought it might be worth posting a comment.

It is a very interesting time for graphics rendering technology because it appears that there are several possible ways forward. As Intel is pushing more complexity (and processing capability) into the GPU cores (with Larrabee), it is rumored that Nvidia is going to do something similar with the GT300. This move from SIMD to MIMD appears to be a response to the inability of the PC's CPU to scale to the processing needs of the GPU. With the PS3, Sony introduced an alternative architecture with Cell, allowing the CPU to scale with the GPU. It seems likely that they will continue to try to derive as much performance-per-watt as possible from a ~500 mm² area (close to a 50/50 split), and so I'm led to ask:

Accepting my assumptions above, does it not seem likely that the PS4's GPU will be based on a more efficient design, with GPGPU capability being redundant in Sony's architecture?

And a somewhat related question:

Will we finally see the PlayStation emerge as a significant alternative computing platform to the PC? Cloud computing, 2 GB of memory, precise motion control?
 
It is a very interesting time for graphics rendering technology because it appears that there are several possible ways forward. As Intel is pushing more complexity (and processing capability) into the GPU cores (with Larrabee), it is rumored that Nvidia is going to do something similar with the GT300. This move from SIMD to MIMD appears to be a response to the inability of the PC's CPU to scale to the processing needs of the GPU.

That's not the reason. PC CPUs are more than capable of keeping up with GPUs, if you're referring to feeding said GPUs with data to process. It's a bit of a meaningless concept anyway without considering the types of games you're trying to run and the workloads involved.

It's simply the case that GPUs are naturally becoming more programmable and CPU-like, so Intel has decided to beat them to it and produce its own "GPU-like" CPU. Eventually CPUs and GPUs will be pretty much the same thing, but that was always going to be the end result as a natural progression of the technology. I don't think it's a matter of one keeping up with the other.
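To keep the terminology concrete: SIMD means one instruction stream applied in lockstep across many data elements, while MIMD lets each core run its own independent instruction stream. Here is a minimal, purely illustrative Python sketch of that distinction (the workload names are invented, and this is not a model of GT300, Larrabee, or any real GPU):

```python
from threading import Thread

data = [1.0, 2.0, 3.0, 4.0]

# SIMD-style: every "lane" applies the same operation to its own element,
# in lockstep -- one instruction stream, many data elements.
simd_result = [x * 2.0 + 1.0 for x in data]

# MIMD-style: each "core" runs its own, independent instruction stream.
def shade_pixels(out):          # hypothetical workload A
    out.append(sum(x * x for x in data))

def update_physics(out):        # hypothetical workload B
    out.append(max(data) - min(data))

results = []
threads = [Thread(target=shade_pixels, args=(results,)),
           Thread(target=update_physics, args=(results,))]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(simd_result, results)
```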


With the PS3, Sony introduced an alternative architecture with Cell, allowing the CPU to scale with the GPU. It seems likely that they will continue to try to derive as much performance-per-watt as possible from a ~500 mm² area (close to a 50/50 split), and so I'm led to ask:

If anything, the PS3 has a terrible CPU/GPU balance: too little power in the GPU and too much in the CPU (if you're constrained by a fixed number of transistors).

Accepting my assumptions above, does it not seem likely that the PS4's GPU will be based on a more efficient design, with GPGPU capability being redundant in Sony's architecture?

More efficient in what way? Non-programmable? Because as I understand it, GPGPU is for the most part just a side effect of the programmability of modern GPUs. If you have a big, powerful and highly programmable GPU, you can run GPGPU operations on it. Obviously there are some things, like double precision, that may not be necessary, if that's even a major factor in GT300's design.

And a somewhat related question:

Will we finally see the PlayStation emerge as a significant alternative computing platform to the PC? Cloud computing, 2 GB of memory, precise motion control?

Nope. Not unless it develops the ability to run x86 code all of a sudden, or people decide to start using it on a desk with a keyboard and mouse, because those things are fundamental to everyday computing. It may make a good alternative for web browsing, and some would argue the PS3 already does, but it's certainly not going to become a real competitor to the PC in the PC space.
 
Nope. Not unless it develops the ability to run x86 code all of a sudden, or people decide to start using it on a desk with a keyboard and mouse, because those things are fundamental to everyday computing. It may make a good alternative for web browsing, and some would argue the PS3 already does, but it's certainly not going to become a real competitor to the PC in the PC space.

I'd add RAM to that as well. Even Dell's bottom-of-the-barrel machines are shipping with 3 GB these days; by 2015, when the PS4 hits its prime, 2 GB is going to look pitiful in desktop terms.

Edit: Oops.
 
I'd add RAM to that as well. Even Dell's bottom-of-the-barrel machines are shipping with 3 GB these days; by 2015, when the PS3 hits its prime, 2 GB is going to look pitiful in desktop terms.

RAM is dirt cheap; consoles should not be wimpy in this very important area from now on. I would say that in 2012, 3 GB should be the bare minimum, with 4 GB+ preferred.
 
by 2015 when the PS3 hits its prime.

I'm not a native English speaker, but doesn't "hitting its prime" mean the strongest point in its existence, rather than being so close to death that you need instruments to know whether it's still breathing? Saying that the PS3 hits its prime in 2015 is like saying that Mike Tyson was in his prime when he was 38 :smile:

But yeah, the amount of memory these machines have is going to be pitiful very soon, and the new machines aren't going to look any better a few years after their launch.
 
I'm not a native English speaker, but doesn't "hitting its prime" mean the strongest point in its existence, rather than being so close to death that you need instruments to know whether it's still breathing? Saying that the PS3 hits its prime in 2015 is like saying that Mike Tyson was in his prime when he was 38 :smile:

But yeah, the amount of memory these machines have is going to be pitiful very soon, and the new machines aren't going to look any better a few years after their launch.

Ah crap, I meant the PS4 of course, assuming a 2012 launch timeframe.
 
I think you're better off looking at graphics cards' RAM amounts than at a low-end Dell for console RAM. 2 GB is the most likely, though I am hoping for at least 4 GB. Games like Crysis already used lots of RAM, albeit inefficiently, but I really want something better than Crysis for next-gen consoles.
 
I don't get the fascination with those newfangled graphics architectures bred in the PC space. In a closed box, paired up with a dozen SPEs at 4 GHz or more, why would you want to use GPU die space on vertex processing at all, let alone "geometry shading"? For what purpose do you need integer bit shifts in your fragment ALUs? Are we even sure we need all computations to be FP32?
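For what it's worth, the FP32 question is easy to put rough numbers on: half precision (FP16) is plenty for 8-bit-per-channel colour math, but it starts dropping whole units once values reach the low thousands, which is why positions and depth usually want full FP32. A quick numpy sketch, for illustration only:

```python
import numpy as np

# FP16 keeps a 10-bit mantissa (~3 decimal digits); FP32 keeps 23 bits (~7 digits).
offset = 1.0 / 4096                                 # ~0.000244

print(np.float32(1.0 + offset) - np.float32(1.0))   # ~0.000244: FP32 preserves the offset
print(np.float16(1.0 + offset) - np.float16(1.0))   # 0.0: below FP16's step size near 1.0

print(np.float16(4098.0))                           # 4096.0: whole units are lost at this range
```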

I'm a big fan of throughput. That always helps. Features can be nice, but there's a point of diminishing returns, and I think we passed it a good while ago.
 
I think you're better off looking at graphics cards' RAM amounts than at a low-end Dell for console RAM. 2 GB is the most likely, though I am hoping for at least 4 GB. Games like Crysis already used lots of RAM, albeit inefficiently, but I really want something better than Crysis for next-gen consoles.

8 × 4 Gbit RAM modules. They aren't shipping anything higher than 2 Gbit in volume at present, right? But the 4 Gbit modules should be available by the time the next console ships. I guess they could manage it, so long as they don't go back to, say, four RAM chips.
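For reference, the arithmetic here is just chip count times per-chip density, divided by eight to convert bits to bytes. A trivial sketch covering the configurations mentioned in this thread:

```python
def capacity_gb(chip_count, density_gbit):
    """Total capacity in GB from identical DRAM chips (8 Gbit = 1 GB)."""
    return chip_count * density_gbit / 8.0

print(capacity_gb(8, 4))   # 8 x 4 Gbit chips -> 4.0 GB
print(capacity_gb(8, 2))   # 8 x 2 Gbit chips -> 2.0 GB
print(capacity_gb(4, 2))   # 4 x 2 Gbit chips -> 1.0 GB
```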
 
I'm wondering at this early waking hour..............

If Nintendo was to release a Wii HD that really was a step up from the current system, how would they address those who don't upgrade? As gullible as the non-gamer public has been for the Wii, I don't think they'd be so naive as to upgrade to a new system so quickly, and I only see Nintendo making either a refresh of the Wii to support HD resolutions, or a system not so far off from the current spec that developers couldn't create games that would run nominally on either the Wii or the "Wii HD". I think the second option would be better for the long term, as it could give the Wii the graphical upgrade so many hardcore gamers have been clamoring for, and it would last a few years in that configuration. However, it leads me to ponder the cost of researching and developing such configuration options. I assume complete backwards compatibility would be necessary for the Wii HD.

What do you guys think the best solution would be? I was thinking of just another step-up hardware upgrade, like from the GC to the Wii, with some silicon changes such as a framebuffer size increase. In my world, however, the system would have a 45 nm G4 or G5 running at 2.0 GHz and a "doubled-up" Hollywood GPU with much the same silicon except at 45 nm: 300+ MHz, 8 pixel pipes, 8 texture pipes and 8 ROPs, the same fixed-function T&L and TEV unit for BC but with larger eDRAM for higher screen resolutions, 4 bolted-on vertex-shader-like units, the same 24 MB of 1T-SRAM on the GPU package for BC, 32-bit z-buffer capability, and 256 MB of GDDR3 main system RAM.

Sure, the RAM seems like overkill, but it's outrageously cheap these days, and it's useful if Nintendo wanted to let people run music at the same time or run a management system in the background a la the 360 or PS3. Plus it would open up doors for other media features (as much as I hate them); 256 MB is nice for a web browser. I wonder, though, if power management isn't an issue, whether it might just be easier to have a whole new GPU and bolt on the Hollywood for BC, in which case I'd put the equivalent of a Radeon 4550 in there. It's a vicious little brute of a GPU, and the desktop card versions use very little power.

Basically, I'm going for a doubled or tripled-up Wii here. BC would be preserved, and running an optimized port of Crysis would be doable!
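As a rough sanity check on why the eDRAM would need to grow for HD targets: assuming a single 32-bit colour buffer plus a 32-bit Z buffer, and ignoring antialiasing, tiling and any actual Hollywood implementation details, the on-chip footprint scales like this (back-of-the-envelope sketch only):

```python
def framebuffer_mib(width, height, color_bytes=4, depth_bytes=4):
    """Rough on-chip footprint of one colour buffer plus one Z buffer, in MiB."""
    return width * height * (color_bytes + depth_bytes) / (1024.0 ** 2)

print(framebuffer_mib(640, 480))     # ~2.3 MiB at SD resolution
print(framebuffer_mib(1280, 720))    # ~7.0 MiB at 720p
print(framebuffer_mib(1920, 1080))   # ~15.8 MiB at 1080p
```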
 
I'm wondering at this early waking hour..............

If you consider what the Wii is compared to the previous-generation consoles, its successor will probably relate to the current-generation consoles in much the same way. The main thing is that it will have more RAM than the current consoles, since at least 1 GB is a gimme considering that four 2 Gbit chips yield just that.

My guess is they will follow the Wii model yet again with a simple, cheap and cost-effective solution which is both unobtrusive for consumers and practical. This means a small, low-power, quiet console. Given a Wii-like form factor and a likely 20-30 W power budget, I can see them going for something along the lines of a quad-core PowerPC clocked in the realm of 1.5 GHz, along with an appropriately sized GPU about as powerful as an entry-level discrete laptop GPU.
 
Basically, I'm going for a doubled or tripled-up Wii here. BC would be preserved, and running an optimized port of Crysis would be doable!
I wonder who would buy such a thing? Are the Wii crowd going to be happy to spend another £200 just for upgraded graphics, or do they sit fairly outside the realm of graphical interest? Will existing HD console owners want a marginally improved console which looks set to receive the wrong sort of games compared to what the HD consoles are used to? Especially with the extended controls their current machines are getting, nullifying the Wii's advantage.
 
I'm pretty sure it would take a hell of a lot more than a doubled or even tripled Wii to play even a heavily optimised version of Crysis. After all, the 360 and PS3 must be well beyond 2-3x the power of the Wii, and they seem like the bare minimum to run the highly optimised CE3.
 
I think you're better off looking at graphics cards' RAM amounts than at a low-end Dell for console RAM. 2 GB is the most likely, though I am hoping for at least 4 GB. Games like Crysis already used lots of RAM, albeit inefficiently, but I really want something better than Crysis for next-gen consoles.


Now, 2 GB seems likely, but we still have a long way to go before next gen, years most likely. I think the longer that drags out, the more likely 4 GB becomes. I'd already put it at better than 50/50.

If we assume history means anything, 4 GB also seems likely: Xbox to 360 was 8x the RAM, and PS2 to PS3 was even more, 16x.
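Following that logic, here is what those historical multipliers would project onto the current 512 MB; a sketch under the stated assumption that past scaling simply repeats, not a prediction:

```python
current_mb = 512  # Xbox 360 / PS3 system RAM

# Generation-to-generation multipliers cited above
multipliers = {"Xbox -> 360 (64 MB -> 512 MB)": 8,
               "PS2 -> PS3 (32 MB -> 512 MB)": 16}

for transition, factor in multipliers.items():
    print(f"{transition}: same scaling next gen -> {current_mb * factor / 1024:.0f} GB")
```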

Also, I always think back to an interview about the Xbox where they said that, by the end, the most expensive components were the hard drive and the RAM. I think that's why RAM is always such a precious commodity in consoles; it seems to hold its cost longer than other components.
 