Predict: The Next Generation Console Tech

Couldn't have said it better myself :)

Here's a classic presentation (from Sony R&D, 2009):
http://harmful.cat-v.org/software/O...ls_of_Object_Oriented_Programming_GCAP_09.pdf

Slides 17 and 18 are especially notable. RAM latency, measured in CPU cycles, is now 400x higher than in 1980 (comparing the PS3 with, probably, the first x86 PCs). The same is true for memory bandwidth relative to CPU ALU performance, and the gap is widening all the time. Memory performance is now the most important consideration when you are designing efficient algorithms, for both CPU and GPU (both are memory-starved now, and both will be even more so in the future).
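
To see how lopsided this has become, here's a minimal pointer-chasing sketch (my own illustration, not from the slides); because each load depends on the previous one, the loop runs at memory latency rather than ALU speed:

#include <chrono>
#include <cstddef>
#include <cstdio>
#include <numeric>
#include <random>
#include <utility>
#include <vector>

int main() {
    const std::size_t N = std::size_t(1) << 24;  // 16M entries (128 MiB), far larger than any cache
    std::vector<std::size_t> next(N);
    std::iota(next.begin(), next.end(), std::size_t{0});

    // Sattolo's algorithm: a single-cycle random permutation, so the chase
    // visits every slot exactly once before returning to the start.
    std::mt19937_64 rng{42};
    for (std::size_t k = N - 1; k > 0; --k) {
        std::uniform_int_distribution<std::size_t> pick(0, k - 1);
        std::swap(next[k], next[pick(rng)]);
    }

    // Each load depends on the previous one, so no amount of ALU power
    // can hide the DRAM round trip.
    std::size_t i = 0;
    auto t0 = std::chrono::steady_clock::now();
    for (std::size_t step = 0; step < N; ++step) i = next[i];
    auto t1 = std::chrono::steady_clock::now();

    double ns = std::chrono::duration<double, std::nano>(t1 - t0).count() / double(N);
    std::printf("~%.1f ns per dependent load (i=%zu)\n", ns, i);  // printing i keeps the chase from being optimized away
}

On a typical desktop this comes out to tens of nanoseconds per dependent load, i.e. hundreds of CPU cycles, which is exactly the presentation's point.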

Surely this depends on how far IBM, AMD & Intel can take 3D memory stacking, through-silicon-via technology & the like going forward?

I'd imagine that if any technologies are going to radically improve latencies to and from RAM, it will be these.
 
I see the best use of huge amounts of RAM as user-deformable/adjustable levels. You'd have enough space to record lots of structural change, meaning bodies that stay around, walls that get destroyed, all persistent.
 

Sorry, but that could be done within a single megabyte (edit: for 100 enemies):

10 if body is <= 100m away goto 50
20 round body orientation to 1 of 32 directions (example)
30 round body location to a 1mm grid
40 set body damage to one of the presets
50 next body, goto 10

Of course the body interacts with the environment, but the physics only get calculated once, unless there is new interaction going on.

So yeah, you save on both memory and CPU time.

They could have 'next gen' right now with 200MB of RAM
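
For what it's worth, here's a rough sketch (my own illustration, with arbitrary field sizes) of how such a quantized body record might be packed:

#include <cstdint>

// Hypothetical packed record for one persistent body. Position is rounded
// to a 1mm grid (16 bits per axis covers a ~65m cube per level chunk),
// orientation to 1 of 32 directions, damage to 1 of 8 presets.
struct PersistentBody {
    uint16_t x_mm, y_mm, z_mm;  // quantized position
    uint8_t  orientation : 5;   // 1 of 32 directions
    uint8_t  damage_preset : 3; // 1 of 8 canned damage states
};
static_assert(sizeof(PersistentBody) <= 8, "one body fits in 8 bytes");
// 100 bodies -> well under 1 KB, so a megabyte budget is actually generous.

That said, storage is the easy part; as noted below, rendering all that persistent debris is where the real cost shows up.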
 
I don't see MS using anything but Blu-ray as optical media.
Anything MS chooses would have to have sufficient production capacity to produce 5M+ discs in < 30 days, and be price-competitive with Blu-ray.
That pretty much rules out any alternative.


My understanding of the GE technology is that the bulk of it is very similar to Blu-ray. It's the material of the disc that allows the volume of the disc to be utilized. GE is trying to leverage the infrastructure of Blu-ray with its design... at least that's what I conclude from the information I've read.
 

Still doesn't mean it will ever be cost-competitive with Blu-ray, which is an already widely established technology. What would MS or anyone else have to gain by going with anything other than Blu-ray, unless they just want to limit their potential installed base with a DD-only console?
 

Better graphical quality. I read somewhere that the art team at id was crushed by the compression ratios used on RAGE, and the rumored high-quality patch was supposedly around 150 gigabytes.

You have Square Enix talking about their next-gen engine and how a single Blu-ray might not be enough.

"Of course, it's too massive of a data to use in a game as-is, but I think the look and feel will probably remain," he went on. "If we had time, we could've compressed the data even smaller. We didn't have time to do that, so we just used the same master data - but it can definitely be reduced."

Asked by the site whether fitting everything into currently used disc formats was a struggle, Hashimoto confessed: "Yeah, that could be a challenge. There's a possibility that just one Blu-ray may not be sufficient."

http://www.oxm.co.uk/43256/square-enixs-next-gen-tech-may-be-too-big-for-blu-ray/


Next-gen hasn't even begun, but developers are already exceeding a 4-layer Blu-ray.


The other major aspect is security; a new optical drive should be more secure for at least a little while. That matters to publishers, and of course Microsoft.
 
Sorry, but that could be done within a single megabyte (edit: for 100 enemies):

10 if body is <= 100m away goto 50
20 round body orientation to 1 of 32 directions (example)
30 round body location to a 1mm grid
40 set body damage to one of the presets
50 next body, goto 10

Of course the body interacts with the environment, but the physics only get calculated once, unless there is new interaction going on.

So yeah, you save on both memory and CPU time.

They could have 'next gen' right now with 200MB of RAM

Stop thinking physics & start thinking rendering & I'm sure you'll see why you could never achieve what Shifty proposed with so little resource available...
 
Square Enix used CGI assets; they could make those a lot smaller, which would be the best thing for them to do in these times when DD is going to be a big part of the new consoles. The PS4's rumored specs even include a hardware zlib decompressor, which to me says they are preparing for some heavily compressed data & it should also help with the smaller amount of RAM.

(Just my thoughts on it, could be wrong)
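
To make the zlib point concrete, here's a minimal software sketch of the inflate work a dedicated hardware block would take off the CPU. It uses zlib's standard one-shot uncompress() call; the function name and the size parameters are my own assumptions for illustration:

#include <zlib.h>
#include <vector>

// Inflate a zlib-compressed asset into a caller-provided size budget.
// A hardware decompressor performs this same job without burning CPU time.
std::vector<unsigned char> inflate_asset(const unsigned char* src, uLong src_len,
                                         uLong expected_len) {
    std::vector<unsigned char> dst(expected_len);
    uLongf out_len = expected_len;
    if (uncompress(dst.data(), &out_len, src, src_len) != Z_OK) {
        dst.clear();  // corrupt stream or wrong size estimate
    }
    dst.resize(out_len);
    return dst;
}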
 
Surely this depends on how far IBM, AMD & Intel can take 3D memory stacking, through-silicon-via technology & the like going forward?

I'd imagine that if any technologies are going to radically improve latencies to and from RAM, it will be these.

Not really. The distance traveled isn't *that* big of an issue -- RAM is so far away today at least partly because it can be, as the natural latency is so high that a little more doesn't hurt that much.

DRAM is slow because the smaller you make your capacitors, the more effort you have to spend amplifying the signal coming from them to get useful results. Because of this, as DRAM has been scaled down, the sense latency has more or less stayed the same. (Well, it's halved in 15 years, but in this industry that's standing still.) A "perfect" interface could perhaps halve the latency, but that's still tape speed from a modern CPU's point of view.

To see any real improvement, we need to find completely new ways of storing bits.
 
Isn't this where the memristor, which until its discovery a few years ago was thought to exist only in the math, might help in the future?
 
The rough-hewn specs we have now look competitive, possibly superior to PS4 imo, and in the end that's all that matters. Not power, but power relative to the competition.

The media box stuff as well: easily criticized, but it could be quite compelling.

Now if it ends up being a turd sold out for set-top-box duty, I'll be the first one screaming, trust me.

No you wouldn't :p

hm... Judging by that list of board costs from AMD last year, it seems that 1GB of DDR3 would be about $6.35 (for DDR3-1600/1800, based on the reference clocks for the relevant cards sporting DDR3).

(Obviously not the Newegg price. :p ;) )

5.5Gbps GDDR5 seemed to be about 4.47x the cost per GB relative to DDR3;
4Gbps GDDR5 was about 3.39x.

It's kind of neat... At the time, you'd basically have 9GB of DDR3 costing about the same as 2GB of high-speed GDDR5. :p
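
(Quick sanity check on those figures: 9GB x $6.35 ≈ $57, while 2GB x 4.47 x $6.35 ≈ $57 as well, so the two configurations really do come out at essentially the same cost.)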

It is extremely neat, in fact, when you consider what we've heard about these consoles ;)

You have Square Enix talking about their next-gen engine and how a single Blu-ray might not be enough.

Next-gen hasn't even begun, but developers are already exceeding a 4-layer Blu-ray.


The other major aspect is security; a new optical drive should be more secure for at least a little while. That matters to publishers, and of course Microsoft.

I think all 3 consoles will be using bog-standard BD-ROMs, but will handle security in a different manner; for instance, locking purchases to consoles, similar to PC services like Origin/Steam/etc.
 
People are kidding themselves if they think Durango and PS4 will be in the same "class" if one of them has twice the performance on the GPU side. That's a lot of performance. Big difference, even if there were some kind of next-gen Cell-like vector monster of a CPU on the other side.
 

Which is exactly why no one really believes either GPU will have twice the performance advantage.
 

That's funny, because the Wii U GPU has about twice the performance of the PS3 & Xbox 360 GPUs, & 3X the RAM, & people think they are in the same class.
 
Isn't this where the memristor, which until its discovery a few years ago was thought to exist only in the math, might help in the future?

It's one contender. With flash scaling expected to end soonish, there is a hundred-billion-dollar stampede going on to find its successor. Players include:

- MRAM (Toshiba, Hitachi, Hynix, IBM, Everspin (Freescale spinoff), Samsung, NEC)
- FeRAM (Ramtron, IBM, TI, Fujitsu, Samsung, Matsushita, Oki, Toshiba, Infineon, Hynix, Symetrix)
- ReRAM (HP, ITRI, IMEC, Panasonic, Rambus)
- CBRAM (NEC, Sony, Axon, Micron)
- PRAM (Intel, IBM, ST Micro, Samsung, Numonyx)

I might have missed a few backers. Also, the companies listed after a technology are not all working together; in FeRAM especially there are multiple competing approaches.

Some of these technologies have access times low enough that they will be able to beat DRAM, and act as true "universal memory", where the entire system from the last cache level to the HDD will be built from the same stuff. Some of them are so fast that a large chunk of them pasted on the CPU could really change the way we program.

Note that while everyone is talking about HP's memristors, they are not necessarily the leading candidate; HP just has the best PR. It's still too early to say which tech will win, but if I had to bet, I'd probably put my money on PRAM.
 
That's funny, because the Wii U GPU has about twice the performance of the PS3 & Xbox 360 GPUs, & 3X the RAM, & people think they are in the same class.

Mostly a fair comparison. We're still comparing launch titles to mature AAA titles, though. Nintendo first-party titles also tend not to be bleeding-edge visually. But I agree with the essence of your point. Even if PS4 were 2X Durango, you'd really only see big differences in first-party titles.


I'd put my money behind the Intel-backed approach too ;)

Universal memory is pretty much the holy grail and would break memory paradigms that have essentially existed since there was a memory hierarchy, so that's quite a claim.

I'm interested in the densities of these techs relative to SRAM and DRAM, and I'm lazy. Do you have quick approximations?
 
Note that universal memory does not make the memory hierarchy go away -- because of physics, getting an answer out of a smaller pool of memory is always faster than getting it out of a larger one, even if they are made of the same stuff. Good universal memory would make the hierarchy much less steep.


As this is upcoming, under-development tech, getting real data is essentially impossible. In configurations with large pages, they are typically quite near NAND flash, that is, quite a bit denser than SRAM or DRAM. When fine-grained random access is desired, you cannot do much better than DRAM (at least without *heavy* use of 3D integration), as the access machinery takes most of the room.
 
That's funny, because the Wii U GPU has about twice the performance of the PS3 & Xbox 360 GPUs, & 3X the RAM, & people think they are in the same class.

Because so far the games have looked it, and we lack true confirmed specs that prove it is 2X anyway. What else are we supposed to do?

2X should and will make the games better looking; we haven't seen it yet with Wii U. But you're kidding yourself if you think the PS3 or 360 having 1GB of RAM instead of 512MB for this whole past generation wouldn't have made a considerable difference. And we're just talking about one parameter. Now what if the GPU was 2X, plus 1GB of RAM. Etc.
 
That's funny, because the Wii U GPU has about twice the performance of the PS3 & Xbox 360 GPUs, & 3X the RAM, & people think they are in the same class.
I don't know exactly how much faster the Wii U is (remember it also has a tablet to take care of, or two in some cases), but 2x in GPU performance is a lot. That's the kind of leap where your artists and programmers will have to find ways not to make the game look like an entirely different one in comparison to the other console.
 