Predict: The Next Generation Console Tech

Status
Not open for further replies.
Therefore I think we will likely see at least 2 GB of RAM and possibly up to 4 GB, or somewhere in between, while other aspects of the system will not see the usual performance/spec increases. I still think the PS4 will be a good performer, though.

2 GB to 4 GB is too little for the PS4, I think. If you want real-time CGI-quality games, they'll need more than 16 GB for that stuff.
 
16 GB isn't going to happen without some totally new RAM tech. Consoles get around an 8x increase in RAM each gen, which would be 4 GB. 16 GB would be cost prohibitive, especially on fast buses.
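A quick sanity check on that 8x-per-generation rule of thumb (the 512 MB figure is the PS3's combined main + video RAM, as discussed later in the thread):

```python
# The PS3 ships with 512 MB total (256 MB XDR + 256 MB GDDR3).
ps3_ram_mb = 512

# Applying the rough 8x-per-generation pattern lands exactly on 4 GB,
# which is why 4 GB keeps coming up as the "natural" next-gen figure.
ps4_projection_mb = ps3_ram_mb * 8
print(ps4_projection_mb)  # 4096, i.e. 4 GB
```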
 
I think it would be better if we got a little more than 4GB, probably 4GB just for VRAM. Based on what we got this gen, it seems that 512MB for both main and video memory is limiting. Some say that 512MB should have been for VRAM alone. It has been a big bottleneck as I see it.
 
Well, with enough raw processing power I think quite a bit of procedural data and more complex shaders could be used instead of layers on top of layers of *maps.
 
Heh, I was expecting such a reply. But there should be at least some amount that doesn't cause too much pain and limitation for quite some time. I mean, if developers are struggling to fit basic things in memory, then there surely is a problem. This shouldn't happen so soon.
 
Are you saying that this demo is running on the PS3 by just using 1/3 of a SPU and no GPU?

That would be impressive

Just the lighting algorithm I think, not the whole demo itself.

My guess is that either RSX isn't programmable enough to handle the algorithm, or that it's just so well suited to Cell, which has all this relatively spare power, that they didn't bother making it compatible with the GPU.

Note that from the sounds of it the 360 can run it on only one CPU thread as well, so it doesn't sound like it takes much power. The difference between the 360 and PC, though, is that on PC you can choose to run the algorithm on either the CPU or the GPU depending on where you have the most resources (and, in the PC's case, on the level of GPU programmability: if my first assumption is correct, then systems with G7x and below would be limited to running it on the CPU).
 
Are you saying that this demo is running on the PS3 by just using 1/3 of a SPU and no GPU?

-> That's what they're saying, and look at their partners: Epic, Microsoft, Sony and others.
It's really impressive to use only 1/3 of an SPU for a radiosity lighting approach.



That would be impressive

edit: BTW, isn't what is shown here what Splinter Cell's developers claimed was not doable on the PS3, and the excuse for their decision to go exclusive on the 360 with Conviction?

If yes then their claim is pretty much debunked.

Those are very strange assumptions for the developers (Ubisoft) to make about the PS3 if Epic supports Enlighten (or Reality Engine) in their Unreal Engine 3.0, and we have excellent results in games like Unreal Tournament III.
 
That would be pitiful for 2011/12/13. I think the quantity of RAM and its bandwidth are among the easiest ways to improve fidelity and performance. As a result, spending your budget on more RAM is a better long-term investment than trying to cut corners with it.

Therefore I think we will likely see at least 2 GB of RAM and possibly up to 4 GB, or somewhere in between, while other aspects of the system will not see the usual performance/spec increases. I still think the PS4 will be a good performer, though.

And let's not forget that by the time we have the new consoles, Blu-ray drives will be cheap as chips, so just that in itself will be a huge cost saving for the PS4.
The PS3's design focus was on a high performance increase, HD content and a revolutionary CPU concept (Cell) compared to the PS2.

IMHO Sony should focus more on size, price, compatibility, software/content/services and new interfaces. How to do that? A price in the US$200 range can be achieved by not using all that Amdahl's Law predicts for a five-year difference (8x). Smaller size with cooler operation is a plus.
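For what it's worth, the "8x for a five-year difference" figure is easy to unpack (note the comment: this growth rate is really a Moore's-Law-style projection, even though the post credits Amdahl):

```python
import math

# An 8x gain over five years implies performance doubling roughly every
# 20 months, which is a Moore's-Law-style rule of thumb (the post says
# Amdahl's Law, but that law is about parallel speedup limits).
years, factor = 5, 8
doubling_time_months = years * 12 / math.log2(factor)
print(doubling_time_months)  # 20.0
```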

Now, more focus on better software development tools, better games, full backward compatibility, current HD content and new/better content should be a major concern. My guess is that developers are generally not using all the processing power they could from Cell.
 
I think it would be better if we got a little more than 4GB, probably 4GB just for VRAM. Based on what we got this gen, it seems that 512MB for both main and video memory is limiting.
It is limited for the hardware, but it's also a plus for the cost. Add some more RAM, and then you see the GPU is limiting. Add a bit better GPU and the disc is too slow. Add a faster disk and... It's all compromises. Relative to the PC where you see screenshots from mega-gig machines with gobs of VRAM, the RAM is a serious bind. But as a console experience 512MB isn't too bad. Things look good and are getting better. The relative experience will be much like previous consoles, plenty good enough and at a price people can or will be able to afford.
 
As far as I know, the PS4 project is up in the air right now. Cell BE and RSX in the PS3 are only showing 20-30% of their performance in exclusives, and multiplatform games are below 10%, as we see. If developers can unlock 60-80% of the PS3's performance now, you'll see clearer information about the PS4's architecture soon.

Going by the PS and PS2 eras, I think SCEI may announce some kind of PS4 shadow around 2010 or 2011.

For me, I think my PS4 must be some kind of Cell BE 2 architecture with a wide dynamic OoOE (8-way issue) 512-bit PPE core @ 6.4 GHz (based on POWER6 or beyond), with 64 OoOE SPEs (8-way issue, 512-bit-wide registers per SPE) @ 6.4 GHz, and an on-die Rambus terabyte/sec memory controller with 8-channel, 64-terabyte Rambus (TB/sec model).

FlexIO Advance from Rambus to push 500 TB/sec of bandwidth across the whole system, from low- to high-bandwidth components.

RAID-0 4x solid-state HDDs, 500 terabytes each with a 64-terabyte buffer each
64x next-generation Blu-ray drive technology with a 64-terabyte buffer

GPU: an nVidia GPU with 64-way SLI and 512-terabyte Rambus (TB/s model),
output 11520x4860 @ 120 FPS

Oh god, you'd all need to pay more than 3000 USD for that PS4 console.


That would cost at least $10,000, if not far more.

It's also 100% impossible, unless you want to make a console that's nearly as big as a frigging house. No solid-state drive can even begin to get close to 1 terabyte, let alone 500...
 
Keep in mind that resolution of standard televisions will not increase beyond 1080p for a very long time. I mean there are and will be monitors that go beyond 1080p but it won't be standard. Broadcasts will continue to be 720p and 1080i, maybe some doing 1080p, I don't know. Movies you buy on disc (or download) will still be 1080p. There is no way console games will go beyond 1080p for a very very long time, not until Ultra High Definition which is decades away.


Some general guidelines on next-gen; everything is a natural progression, nothing outrageous.

RAM: should be at least 4 GB - would be good if they kicked it to 8 GB -- 16 GB is too much to expect.
More RAM is always wanted but is always in short supply; that will probably be no different next-gen. 4 GB is only an 8x increase, the smallest increase we've seen from gen to gen.

Main system memory bandwidth is currently in the 20s (22-25 GB/s) and will no doubt move to at least the mid-100s of GB/sec. That would be greater than a 10x increase, and even more if that announced 1 TB/sec Rambus pans out. The widest *external* memory bus in consoles is currently only 128-bit. Obviously 256-bit buses will be the minimum next-gen, but I would like to see a 512-bit bus, since high-end PC GPUs already reached that milestone in 2007 (ATI's R600). Yet consoles always lag behind PCs in this area, so they may go with only 256-bit, supplemented with eDRAM.
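As a sanity check on those figures, peak memory bandwidth is just bus width times effective transfer rate. A minimal sketch (the 1400 MT/s rate is illustrative, roughly matching this generation's GDDR3):

```python
def peak_bandwidth_gb_s(bus_width_bits: int, transfer_rate_mt_s: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bytes) x (million transfers/sec)."""
    return (bus_width_bits / 8) * transfer_rate_mt_s / 1000.0

# A 128-bit bus at 1400 MT/s gives ~22.4 GB/s, matching the
# "low 20s" figure quoted for current consoles.
print(peak_bandwidth_gb_s(128, 1400))  # 22.4

# A 512-bit bus at the same data rate would quadruple that.
print(peak_bandwidth_gb_s(512, 1400))  # 89.6
```

So getting into the mid-100s of GB/sec needs wider buses, faster signalling, or both.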

Embedded memory bandwidth: currently at 256 GB/sec (Xbox 360); it needs to move into the multi-TB/sec range for both Xbox3 and PS4. This is an area where the PS3 lags far behind the Xbox 360: there's no eDRAM in RSX. Sony needs to move back to eDRAM with the PS4 for massive rendering bandwidth, something that's at least as impressive for the PS4's time as the PS2's 48 GB/sec was when it was announced in 1999 and released in 2000. Microsoft/ATI made a leap over the PS2 with the Xbox 360's eDRAM bandwidth; I'd like to believe Microsoft/AMD will leapfrog themselves with Xbox3's eDRAM bandwidth.

CPUs - will move from multi-core (2-8 cores) to many-core (10 cores or more). This will be true of next-gen Cell, Larrabee and anything Microsoft considers for the next-gen Xbox, be it Xenon2 or their own new CPU design. Floating-point performance could range greatly but in general will be in the TFLOP class, whereas today's console CPUs are in the 100 to 200 GFLOP class.


Polygons - still in use. Will go from the current several hundred million polys/sec to several billion; that's a given. I'm sure unique ways of rendering polygons and doing things with polygons will be implemented, but I don't know what.

Storage - while I would like to see all optical discs gone because of their slowness, moving parts, etc., I think the reality is that Blu-ray will be standard across Sony, Nintendo and Microsoft next-gen consoles. Hard drives will still be around. We'll probably see more use of solid-state media for storage and buffering, though.

Resolution - 1080p highest res, 720p still used; see above.

Framerates - hopefully a higher percentage of 60fps compared to this gen; however, 30fps will still be used for a large if not significant percentage of next-gen games. Forget 120fps, it's not happening.
 
main system memory bandwidth currently in the 20s (22-25 GB/s ), will no doubt move to at least the middle 100s of GB/sec. That would be greater than a 10x increase. Even more if that announced 1 TB/sec Rambus pans out...

embedded memory bandwidth: Currently at 256 GB/sec (Xbox 360). needs to move into the multi TB/sec for both Xbox3 and PS4. This is an area where PS3 lags far behind Xbox 360. There's no eDRAM in RSX. Sony needs to move back to eDRAM with PS4 for massive rendering bandwidth.
Why would anyone want so much extra rendering bandwidth? You're saying a TB a second BW for 1080p isn't enough?! :oops:
 
Sorry about the off-topic post:

Epic is adding radiosity lighting to UE 3.0:

"We've been looking at integrating some radiosity features into the lighting," said Sweeney. "We can have some combination of static and lighting radiosity. None of these are guarantees that we're 100 percent promising right now, but these are all active areas that we plan to work on further for our upcoming games."


More information here:

http://www.next-gen.biz/index.php?option=com_content&task=view&id=9840&Itemid=2&limit=1&limitstart=0
 
Maybe he's thinking of 1080p, 16xMSAA, FP32. :p
1080p at 60fps with 16x supersampling is 8 GB/s. Granted, that was for colour only at 4 bytes per pixel. But even with whatever crazy rendering modes you want, and assuming improvements in technology allow for smarter polygons and anti-aliasing, 1 TB a second would give so much BW for drawing 1080p that eDRAM would be totally redundant, unless you want PS2 levels of overdraw!
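The 8 GB/s figure checks out, assuming colour-only writes (no depth, no blending re-reads):

```python
# Colour-write bandwidth for 1080p at 60 fps with 16x supersampling,
# 4 bytes per sample (colour only; Z traffic and blending would add more).
width, height = 1920, 1080
samples, bytes_per_sample, fps = 16, 4, 60

bandwidth = width * height * samples * bytes_per_sample * fps
print(bandwidth / 1e9)  # ~7.96 GB/s, i.e. the "8 GB/s" quoted above
```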
 
I've read both threads about software rendering and SwiftShader with interest.

It looks like it will take quite some time before CPUs are able to catch up with current IGPs.
A note about the SwiftShader thread: it would be interesting to see the results with a Phenom, as they have access to more bandwidth than Intel chips but have smaller caches.

I've also read threads/articles about Atom, and there is nothing to write home about, IMHO.

Along the way I read a lot of Arun's comments (lately on the Ars Technica forum) and I think he's right (from my tiny window of limited knowledge): Larrabee is likely to get bashed both for graphics and for GPGPU calculations.

So at this time (some huge surprises aside) we don't have to bring IP into the discussion to dismiss the eventuality of a Larrabee-based system (acting both as CPU and GPU).

My feeling is: does that mean that the whole concept is wrong?

It also looks like Arun has shown a lot of interest in the latest Ambric and PowerVR technology.
Interestingly, both seem to use bunches of MIMD machines.

There are also R&D considerations: the cost of R&D per mm² increases, and having a bunch of homogeneous cores helps here, I guess. It's easier to optimize one core for thermal dissipation, size, etc. and duplicate it at will than to design a lot of different cores/execution units.

At this point a lot of people will be thinking: what is your point?
GPUs have dumped dedicated vertex and pixel processing for unified designs and replaced them with do-it-all execution units, etc. The GF8800 is made of 16 "processors" (it looks like the etymology has been discussed a lot lately :LOL: ); lower-end parts are made of 8, 4, etc. processors.

Then I read something even more interesting: Arun said in some thread that the serial part of the workload in games is either IO-bound or not that demanding.
He even went a little further, and to me he seemed to imply that GPUs could reach a level of flexibility and performance that could be good enough (in the same way that Larrabee could offer enough single-threaded performance).

I may have misunderstood, but it seems to match his interest in MIMD-based GPUs from PowerVR or Ambric technology.

So what I find funny is that we have considered a lot of possibilities, and some have even considered SLI setups, but what about an SLI design without a CPU, if GPUs have evolved enough?

CPUs are doomed! :LOL:

PS: I'm not really serious and I don't know if this would be possible, but some of Arun's comments have really grabbed my imagination... and yes, I have too much free time...
 
Here are my predictions for the next gen consoles:

Common to all platforms:

100% digital game distribution in markets that have the network infrastructure to support it, possibly with different SKUs to address the different markets.
Large amounts of flash RAM to provide sufficient local storage for downloaded content.

PS4:

CPU -> based on Cell, an evolution of current technology. Nothing will be introduced that requires Sony to spend large amounts of $$$ on R&D; I think they learned their lesson with the PS3.
GPU -> design will be outsourced to whatever company provides the best solution at the time
4 GB RAM

The Xbox will probably be similar.

Waaay in the future (like the 2018 timeframe) I expect that the PS5 will ditch local storage altogether and just stream content from servers directly, greatly reducing the cost of the console. This would obviously require the average household to be equipped with 100/100 lines, though, which is fairly reasonable to expect by 2018 IMO.
 
Lol, Nintendo has sold 22 million consoles; they haven't taken anything (apart from a good start). Let's not forget that before we had little girls, grannies, and parents enjoying their brain training and My Little Pony games, there was a 150-million-console industry in the last gen. So how on earth can you say that Nintendo has 'taken it'? I'm sorry, but it *is* there for the taking.

Yes, comparing the PS2's 100+ million sales after 7 years with the Wii's one-plus years of sales is completely fair.

Sony can talk about their '10 year plan' til the cows come home, but manufacturers do not decide how long their product lives, the market does. Sure, Sony CAN wait til 2016 before they pull the plug on the PS3, but there's no guarantee that it will be selling anywhere near PS2's sales (hell, even half of that by 2016 would be surprising given the current trend), but it would be about as smart as MS letting the original Xbox stay on the market this long.
 
Yes, comparing the PS2's 100+ million sales after 7 years with the Wii's one-plus years of sales is completely fair.

I'm not really sure what prompted you to say that. I'm not trying to compare consoles; I'm highlighting how big the industry is and that it's still wide open for the taking, considering the market has expanded since the PS2 gen. 22 million consoles sold is not enough, in the scheme of things, to prove that Nintendo has taken the market, as suggested by a previous post made by assen, which you can re-read here if it helps make it clearer for you.

http://forum.beyond3d.com/showpost.php?p=1141006&postcount=404

Sony can talk about their '10 year plan' til the cows come home, but manufacturers do not decide how long their product lives, the market does. Sure, Sony CAN wait til 2016 before they pull the plug on the PS3, but there's no guarantee that it will be selling anywhere near PS2's sales (hell, even half of that by 2016 would be surprising given the current trend), but it would be about as smart as MS letting the original Xbox stay on the market this long.

I agree that the market dictates a product's life cycle, but I also think you stand a better chance of extending a product's life if it has features that future-proof it. And the initial point I was making, and reiterating (again), is that Sony have at least afforded themselves a chance of doing that with Blu-ray. No one is guaranteeing anything. This aspect of the discussion is becoming too repetitive now, so I think we should move on to other aspects and look at the market perspective in a year's time. Hopefully we will have a clearer picture of things then, and more ground on which to have a meaningful discussion.
 