Predict: The Next Generation Console Tech

Why would anyone want so much extra rendering bandwidth? You're saying a TB a second BW for 1080p isn't enough?! :oops:

Why would anyone NOT want that much rendering bandwidth?


I am not the first one, or the only one, who thinks TB or multi-TB bandwidth will be needed for future 3D games.

http://www.tomshardware.co.uk/xdr2-to-quintuple-memory-data-transfer-speeds-by-2007,news-16613.html

"The bandwidth requirements of game platforms and graphical applications have been growing exponentially," Steven Woo, Rambus' senior principal engineer at Rambus, told Tom's Hardware Guide. "About every five or six years, it goes up by a factor of 10.

A statistical projection made in 2004 by NVIDIA's Vice President of Technical Marketing, Tony Tamasi (cited by Woo), anticipates that a top-of-the-line 3D game could conceivably require memory bandwidth of 3 TByte per second.
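
As a quick sanity check on that growth rate, here's a rough sketch in Python; the ~50 GB/s mid-2000s baseline is my own assumption, not something from the article:

import math

baseline_gb_s = 50.0   # assumed ~2005 high-end GPU bandwidth (my guess)
target_gb_s = 3000.0   # Tamasi's 3 TByte/sec figure
years_per_10x = 5.5    # midpoint of "every five or six years"

years = math.log10(target_gb_s / baseline_gb_s) * years_per_10x
print(f"~{years:.1f} years from {baseline_gb_s:.0f} GB/s to 3 TB/s")
# -> ~9.8 years, i.e. roughly the mid-2010s under these assumptions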
 
Could anyone dig up this statistical example, or what time frame they were talking about? I'd bet it was probably stated while trying to drum up excitement about a new compression and HSR scheme :p
 
Why would anyone NOT want that much rendering bandwidth?
Because it adds cost at no perceivable benefit. If it came free, sure. But you'd have to sacrifice GPU or CPU transistors to add eDRAM.

I am not the first one, or the only one, who thinks TB or multi-TB bandwidth will be needed for future 3D games...
At what resolution and quality settings? 128-bit HDR?! Ignoring what predictions someone made, if you think about 1080p with 4x MSAA and HDR, a terabit of system bandwidth would happily accommodate it. More than that is overkill, an imbalance in the system that will take more away from the final experience than it would contribute.
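
For what it's worth, a back-of-envelope on that scenario (a rough sketch; the FP16 colour format, overdraw factor, and 60 fps target are my assumptions):

width, height = 1920, 1080
samples = 4        # 4x MSAA
color_bytes = 8    # FP16 RGBA HDR target (my assumption)
depth_bytes = 4    # 32-bit depth/stencil
overdraw = 4       # assumed average overdraw
fps = 60

per_frame = width * height * samples * (color_bytes + depth_bytes)
traffic = per_frame * overdraw * 2 * fps  # x2 for read + write per touch
print(f"{traffic / 1e9:.0f} GB/s of framebuffer traffic")
# -> ~48 GB/s, inside even a terabit/sec (~125 GB/s) with room to spare
# for texture and geometry reads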
 
Bandwidth, if nothing else, can be used to stream high-res textures. Something that any game can benefit from. I'll take today's games with textures 10x sharper (and better variety thanks to more RAM) and any accompanying generational improvements, thank you very much. Toss in a large storage system for content of at least 1 TB, or allow the end-user to upgrade with perhaps a multi-drive system (or just hope HD/flash manufacturers don't hit some ~1TB wall). Full HD for all games. Motion-sensing control system using a hi-def cam system. Traditional wireless Sixaxis w/rumble or some variation. Give game devs enough toys to play with for at least 5 years, and watch the games and the profits flow.
 
Bandwidth, if nothing else, can be used to stream high-res textures. Something that any game can benefit from.
1) You need the storage for those high-res textures

2) Megadrive1988's 3 TB figure is for eDRAM for rendering, not texturing. We are not going to get 3 TB/sec main RAM bandwidth! We'll be lucky to get the TB figure.
 
I wonder what the latency would be. I somehow doubt it would be <100ms on average.

It would undoubtedly be lower than 100ms. But I think my original post was a little off; we'd need gigabit lines to actually make this work, maybe even more. The amount of data that would have to be transferred is way higher than an internet connection could provide. Maybe the next next-gen Wii will take this approach because of the super low costs and ease of use, but the next PSX and Xbox most definitely will not.
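
Some rough numbers on why gigabit lines would be the floor (a sketch; uncompressed video is the worst case, and the resolutions and frame rate are my assumptions):

def bitrate_gbps(width, height, bits_per_pixel=24, fps=60):
    return width * height * bits_per_pixel * fps / 1e9

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    print(f"{name}: {bitrate_gbps(w, h):.2f} Gb/s uncompressed")
# 720p -> ~1.33 Gb/s, 1080p -> ~2.99 Gb/s; even heavy compression leaves
# tens of Mb/s, beyond most of today's connections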
 
Just saw this. Perhaps racetrack memory could be the answer for the next generation of consoles, although it would be cutting it close whether they can have it working by then.
 
1) You need the storage for those high-res textures

2) Megadrive1988's 3 TB figure is for eDRAM for rendering, not texturing. We are not going to get 3 TB/sec main RAM bandwidth! We'll be lucky to get the TB figure.


Of course, it wasn't *my* figure; it was Nvidia's Tony Tamasi's figure, circa 2004.

And yeah, I pointed out that main system memory bandwidth won't be in the TB/sec ballpark, but should be in the hundreds of GB/sec range, while eDRAM rendering bandwidth should be in the TB or multi-TB/sec range, given that it's already at 1/4 of a TB/sec (256 GB/sec) with Xbox 360's eDRAM. Microsoft would "only" need a factor of 12x increase from where they are now to reach that 3 TB/sec figure Nvidia mentioned almost 4 years ago.
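
The arithmetic behind that 12x, plus a rough sense of how many process generations it represents (the 2x-bandwidth-per-node scaling is purely my assumption):

import math

current_gb_s = 256.0                 # Xbox 360 eDRAM bandwidth
target_gb_s = 3072.0                 # 3 TB/sec
factor = target_gb_s / current_gb_s  # exactly 12x
doublings = math.log2(factor)        # if bandwidth ~2x per process node
print(f"{factor:.0f}x gap, ~{doublings:.1f} doublings needed")
# -> 12x, ~3.6 doublings: three to four process generations away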
 
My half a cent:

First off, I believe the approach Sony took with the PS3 will pay off in the long run. The console had a rough start, but it's doing fairly well at the moment. Once some of the heavy hitters for the system are out, and once it's had one or more price cuts under its belt, forgeddaboutit. It'll do well. With that assumption in place, I don't believe Sony will be so turned off by the PS3's big up-front investment as to turn 180 degrees and attempt to pull a Wii for their next console.

The two most exotic/expensive components in the system were the BD drive and the Cell, correct? By the time PS4 comes around, BD drives will be dirt cheap, and Cells will hopefully also be more entrenched and have benefitted from economies of scale (assuming they actually start being used in TVs and other devices). I think this would afford Sony the ability to load up the PS4 with nice goodies and still have it retail for less than $450.
A setup like this maybe, just pure uneducated speculation and wishful thinking :smile::

multiple Cells (would leverage the existing experience of PS3 devs)
custom Nvidia part
4 GB system RAM
1-2 GB VRAM
??? MB eDRAM
BD drive
standard or laptop SATA HD
10 Gb LAN
external SATA port too, perhaps?

Cheers.
 
I've really been wondering lately about opening a topic about this.

Cell is cool, but for most workloads it looks like it gets beaten by GPUs.
And what about leveraging devs' experience, with all the middleware coming to the GPU?

This deserves its own topic, but in fact I have no hope that it would be interesting, for obvious reasons.
There are a lot of other topics on this board looking at what could be the cleverest design choices for MS, Sony, and Nintendo.
 
IMO, I don't think PS4 will have a GPU; personally I can see it being 100% Cell based :smile:
Both Sony and IBM would have to sink an incredible amount of money into both hardware and software to make this possible.
IBM has no reason to do so.
 
I've really been wondering lately about opening a topic about this.

Cell is cool, but for most workloads it looks like it gets beaten by GPUs.

What "most workloads"...?

GPUs are good for graphics-related processing & that's about it.. Cell, being much more general purpose, can handle pretty much anything else.. & the Cell can even process graphics tasks pretty adequately too..
 
What "most workloads"...?

GPUs are good for graphics-related processing & that's about it.. Cell, being much more general purpose, can handle pretty much anything else.. & the Cell can even process graphics tasks pretty adequately too..

GPUs are likely to handle physics, animation, and AI better than the Cell.

See the port of Ageia's PhysX to current Nvidia GPUs (current being the important word).

GPUs and tools will be readily available for that kind of work in the coming years; moreover, this will help portability between PC and consoles.

By the way, I never said that the Cell is "bad" or anything like that... I just question some certainties about what Sony should do, and as I stated, discussion will prove impossible.
 
GPUs are likely to handle physics, animation, and AI better than the Cell.

See the port of Ageia's PhysX to current Nvidia GPUs (current being the important word).

GPUs and tools will be readily available for that kind of work in the coming years; moreover, this will help portability between PC and consoles.

By the way, I never said that the Cell is "bad" or anything like that... I just question some certainties about what Sony should do, and as I stated, discussion will prove impossible.

I'm pretty sure the Cell is MUCH faster than most GPUs at physics & AI, & depending on the kind of animation you're doing, it could run much faster on Cell too..
 
I'm pretty sure the Cell is MUCH faster than most GPUs at physics & AI, & depending on the kind of animation you're doing, it could run much faster on Cell too..
Cell is not much faster at folding or particles; I would say otherwise.

I hope some more educated members will give their opinion about this.

But I'm happy that we can discuss this further ;)

From the beginning of this thread, almost nobody has dared to question the assumption that the next PlayStation has to be "Cell based".

But think about this:

GPUs will evolve greatly in the coming years, especially as the GPU manufacturers want to get a foothold in the high-margin market.

I think it is more than fair to state that future GPUs will be on par with Cell in regard to perf/mm² on heavily parallel workloads.

A lot of money will be put into the software side.

All this can lead to very, very low R&D costs for console manufacturers.
GPUs will enjoy higher volumes.
GPUs will benefit from more tools.
Intel owns Havok and Nvidia owns Ageia; AMD/ATI is lacking here, but these vendors are likely to make their software work better on their own hardware (no matter if it's free to use).

Economically "cell based" doesn't make that much sense (hardware or software).

And for ease of development?
I guess it would be easier to have some cores optimised for serial workloads and only one kind of multi-purpose core/multiprocessor intended to accelerate parallel workloads (whatever marketers call it :lol ) than some "multiprocessors" and a pool of SPUs.

In fact, as KK has left now, I wouldn't be surprised if Sony chose less proprietary hardware.


NB: I think this is true for all three manufacturers.
The differentiators next gen could be more about the size of the package, silicon budget, and input than about technical/design choices.
 
Cell is not much faster at folding or particles; I would say otherwise.

GPUs will evolve greatly in the coming years, especially as the GPU manufacturers want to get a foothold in the high-margin market... A lot of money will be put into the software side... Economically, "Cell based" doesn't make that much sense (hardware or software).
Cell development has existed for a couple of years now. Add another couple of years' development before GPUs are really offering effective all-purpose processing, and the reality is Cell will probably be in a much stronger development position than GPUs.

GPUs aren't going to offer a seamless, easy development system. Every new multicore architecture that's doing parallel processing is going to face the same issues. A Cell-based PS4 will offer in effect the same programming difficulties as Wii did this gen...none ;) Unless there's a radical shift, which is unlikely, code can be copied over exactly from PS3 to PS4. This maintains BC and ease of development, while more cores etc. provide an excellent scaling of the already developed techniques (assuming developers have got to grips with scheduling systems and work distribution models). Compare that to designing your engines from scratch for whole new hardware, and the advantage is obvious.
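
As a minimal sketch of what that scaling argument looks like in code (Python for illustration; the job/worker names are hypothetical, and the point is just that only the worker count changes between generations, not the job code):

import queue, threading

def make_job(i):
    def job():
        return i * i  # stand-in for real per-frame work
    return job

def worker(jobs, results):
    while True:
        job = jobs.get()
        if job is None:   # sentinel: no more work
            break
        results.append(job())

NUM_WORKERS = 6           # e.g. 6 usable SPEs on PS3; just bump for PS4
jobs, results = queue.Queue(), []
threads = [threading.Thread(target=worker, args=(jobs, results))
           for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()
for i in range(100):
    jobs.put(make_job(i))
for _ in threads:
    jobs.put(None)
for t in threads:
    t.join()
print(sum(results))       # same result no matter how many workers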
 