Predict: The Next Generation Console Tech

That slide looks pretty low-rent.

The styling doesn't seem to match other IBM slides I've seen.
The IBM logo is rotated and pasted at the top corner for no good reason and probably violates a PR style guide or two.
(edit: apparently they do rotate the non-blue version)
There's an "enhancement" VMX unit.
 
One quick rule of thumb is to assume IBM has a better vocabulary than most net trolls with MS Paint.

If they wanted a company with professional PR that has stupid inaccuracies and typographical errors, they should have done a PS4 fake slide from AMD. (Mostly joking, but AMD has had some pretty poor slides and documents before.)
 
good one
much ado about nothing then

lol, I was shocked, because 80 MB of eDRAM for the CPU doesn't make any sense... it is the GPU that needs a lot of eDRAM, not the CPU :LOL: And after the resolution problems due to insufficient Xenos eDRAM (you need tiling) and the cost of eDRAM, I doubt Microsoft would go the same route again... Sony abandoned eDRAM after the PS2, and I predict Microsoft will do the same for its next Xbox.
 
If this is fake, is it possible that whoever started the original rumor about the 16 threads has seen this, and so we are talking about nothing?

If it's not the XCPU, what is it? A GPU?
 
lol, I was shocked, because 80 MB of eDRAM for the CPU doesn't make any sense... it is the GPU that needs a lot of eDRAM, not the CPU :LOL: And after the resolution problems due to insufficient Xenos eDRAM (you need tiling) and the cost of eDRAM, I doubt Microsoft would go the same route again... Sony abandoned eDRAM after the PS2, and I predict Microsoft will do the same for its next Xbox.
Depends what you want to do; DICE clearly stated in one of their presentations that they would want "the simulation" to be accessible to both the CPU and the GPU.
eDRAM might allow fast access for both the GPU and the CPU.

eDRAM is not only good for bandwidth. Actually, bandwidth is less of a problem than it was at the PS3/360 launch; modern GPUs do great with little bandwidth. Look at an A8 (not because of the rumors themselves): it outdoes the PS3 in every regard with 28 GB/s, actually 20+ GB/s when memory controller efficiency is taken into account.


Anyway it was a fake.
 
At least it has been a thrill in this boring day.
Without the original image I would have started wild dreams and speculation :D
 
eDRAM is not only good for bandwidth. Actually, bandwidth is less of a problem than it was at the PS3/360 launch; modern GPUs do great with little bandwidth.

Look at an A8 (not because of the rumors themselves): it outdoes the PS3 in every regard with 28 GB/s, actually 20+ GB/s when memory controller efficiency is taken into account.

I think it is true that GPUs have become very efficient these last years and require less bandwidth for the same operations than ancient GPU designs, but if that's true for a lot of types of rendering, it is not true for at least one type of rendering: texture rendering.

High-res textures, even with all the new GPU texture compression technologies, still require a lot of bandwidth to render. I bet a game rendering 4k*4k textures would instantly kill any modern GPU not benefiting from mammoth 200+ GB/s bandwidth... hell, even the 2k*2k Crysis, Battlefield and Metro 2033 high-res texture packs are a nightmare for any GPU lacking bandwidth. That's why consoles have always suffered from the syndrome of low-res textures: because bandwidth is very expensive, it is made scarce in console hardware.
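
To put rough numbers on that (the formats here are just assumptions for illustration, an RGBA8 layout vs a BC1-style compressed one): even block-compressed, a single 4k*4k texture is on the order of ten megabytes, so a scene full of them adds up fast in both RAM and in the bytes the texture units have to pull.

Code:
# Rough size of a single square texture, uncompressed vs block-compressed.
# Formats are assumed for illustration (RGBA8 = 32 bpp, BC1-style = 4 bpp).

def texture_mb(size: int, bits_per_texel: int) -> float:
    # size x size texels, bits_per_texel bits each, converted to MB
    return size * size * bits_per_texel / 8 / (1024 * 1024)

for res in (2048, 4096):
    print(f"{res}x{res}: uncompressed (RGBA8) ~{texture_mb(res, 32):.0f} MB, "
          f"block-compressed (BC1-style) ~{texture_mb(res, 4):.0f} MB")
# 2048x2048: ~16 MB vs ~2 MB ; 4096x4096: ~64 MB vs ~8 MB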

Sony tried to solve this problem with the PS2's 4 MB Graphics Synthesizer and its fast 48 GB/s eDRAM, but discovered that while eDRAM solves the rendering bandwidth issue, because it is expensive this comes at the expense of quantity of memory. Microsoft faced the same challenge with the Xbox 360 and its insufficient 10 MB of eDRAM.

My conclusion is that eDRAM is not at all relevant today for next-gen consoles, and Microsoft learned its lesson and will abandon it for its next Xbox like Sony did before with the PS3. Modern GPUs have become very efficient at doing more with less bandwidth except for texture rendering, which requires not only a lot of bandwidth but also a lot of memory, and eDRAM cannot solve this problem. The silicon budget would be better spent on more transistor logic, or on more and faster RAM.
 
but this can be said about any fast RAM ;)
Not really; CPUs are way more latency-sensitive than GPUs. If you want the CPU to do something related to rendering, it could have made sense to have the data always at hand instead of 1000 cycles away in RAM. Even if the eDRAM is super slow, that would be a good order of magnitude faster than the RAM wrt latencies.
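
Just to put that "1000 cycles" in perspective, a rough back-of-the-envelope conversion (the latency figures are assumptions for illustration, not measurements):

Code:
# Rough conversion of memory access latency into CPU clock cycles.
# The latency figures are assumptions for illustration, not measurements.

def cycles(latency_ns: float, clock_ghz: float) -> float:
    return latency_ns * clock_ghz

CPU_CLOCK_GHZ = 3.2  # Xenon-class clock

for name, latency_ns in [("external RAM (assumed ~150 ns)", 150.0),
                         ("on-die eDRAM (assumed ~15 ns)", 15.0)]:
    print(f"{name}: ~{cycles(latency_ns, CPU_CLOCK_GHZ):.0f} cycles at {CPU_CLOCK_GHZ} GHz")
# ~480 cycles vs ~48 cycles -- roughly the order-of-magnitude gap mentioned above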

There is also what you don't get if there are two memory pools: the VRAM would still be unlikely to be coherent with main memory, so if you want to access anything it would take ages (moving data, etc.).

If there is only one memory pool, you have something akin to the 360 and the RAM is ages away from the CPU. MS stated high latency is a hurdle to convenient use of GPGPU / fine interaction between the CPU and the GPU.
Either way, you have a SoC (or two).

Anyway, this rumor is fake; not much point discussing this further.
 
Microsoft faced the same challenge with the Xbox 360 and its insufficient 10 MB of eDRAM.
MS's eDRAM has no bearing on texture quality, other than freeing framebuffer bandwidth from the system RAM. With mipmapping, high-res textures should only exert bandwidth demands on the highest mip level, but then you'll be loading in fewer other textures because the high-res texture is filling a larger part of the screen. So I don't see that higher resolution textures require more BW. You just need more RAM. Of course, more textures, with multiple layers and a higher quality mip level will impact rendering performance.
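
A minimal sketch of the memory side of that argument (the 4-bits-per-texel compressed format is an assumption for illustration): a full mip chain only adds about a third on top of the base level, so going from 2k to 4k roughly quadruples the RAM footprint, while the mip level actually sampled is still chosen from the on-screen footprint.

Code:
# Memory footprint of a square texture plus its full mip chain.
# Assumes 4 bits per texel (BC1-style block compression) for illustration.

BITS_PER_TEXEL = 4

def mip_chain_bytes(size: int) -> int:
    total = 0
    while size >= 1:
        total += size * size * BITS_PER_TEXEL // 8  # bytes for this mip level
        size //= 2
    return total

for res in (1024, 2048, 4096):
    print(f"{res}x{res} + mips: ~{mip_chain_bytes(res) / (1024 * 1024):.1f} MB")
# ~0.7 MB, ~2.7 MB, ~10.7 MB -- the cost of high-res textures is mostly RAM;
# the texels actually fetched per frame are bounded by screen resolution.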
My conclusion is that eDRAM is not at all relevant today for next-gen consoles...
There's a whole thread dedicated to that discussion.
 
I think it is true that GPUs have become very efficient these last years and require less bandwidth for the same operations than ancient GPU designs, but if that's true for a lot of types of rendering, it is not true for at least one type of rendering: texture rendering.

High-res textures, even with all the new GPU texture compression technologies, still require a lot of bandwidth to render. I bet a game rendering 4k*4k textures would instantly kill any modern GPU not benefiting from mammoth 200+ GB/s bandwidth... hell, even the 2k*2k Crysis, Battlefield and Metro 2033 high-res texture packs are a nightmare for any GPU lacking bandwidth. That's why consoles have always suffered from the syndrome of low-res textures: because bandwidth is very expensive, it is made scarce in console hardware.

Sony tried to solve this problem with the PS2's 4 MB Graphics Synthesizer and its fast 48 GB/s eDRAM, but discovered that while eDRAM solves the rendering bandwidth issue, because it is expensive this comes at the expense of quantity of memory. Microsoft faced the same challenge with the Xbox 360 and its insufficient 10 MB of eDRAM.

My conclusion is that eDRAM is not at all relevant today for next-gen consoles, and Microsoft learned its lesson and will abandon it for its next Xbox like Sony did before with the PS3. Modern GPUs have become very efficient at doing more with less bandwidth except for texture rendering, which requires not only a lot of bandwidth but also a lot of memory, and eDRAM cannot solve this problem. The silicon budget would be better spent on more transistor logic, or on more and faster RAM.
I'm not sure that "rendering textures" is the proper wording here, nor that you understand it properly (I don't understand it perfectly either, far from it, but some of the stuff you say doesn't add up).

The bigger the texture, the more fetches you have to do, and cache efficiency may go down too.
Still, I'm not sure that, with the bandwidth available to modern GPUs, pure memory bandwidth is the bottleneck.

Textures are not stored in eDRAM on the 360, and if it were to store many 4k x 4k textures it would get full fast. I can assure you that, eDRAM or not, in next-generation systems the design goal won't be to store textures.

As Fehu said, virtual texturing is a solution to the problem of RAM usage by textures. Things like AMD's PRT (partially resident textures), which provides hardware support for it, are somewhat a proof that this is where the industry is heading.
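
For the curious, a toy sketch of the lookup idea behind virtual texturing (the page size, table layout, and names are made up for illustration and are not how any particular engine or AMD's PRT implements it): only the pages of a texture that are actually needed are resident in video memory, and a lookup falls back to a coarser mip when the requested page isn't there.

Code:
# Toy virtual-texturing lookup: only resident pages live in video memory,
# and a miss falls back to the next coarser mip. Names and sizes are
# illustrative, not any real engine's (or AMD PRT's) implementation.

PAGE_SIZE = 128  # texels per page side (assumed)

# Resident pages, keyed by (mip_level, page_x, page_y).
resident_pages = {(0, 3, 5), (1, 1, 2), (2, 1, 1)}

def lookup(u_texel: int, v_texel: int, mip: int, max_mip: int):
    """Return the (mip, page_x, page_y) to sample, falling back to coarser
    mips until a resident page is found."""
    while mip <= max_mip:
        page = (mip, (u_texel >> mip) // PAGE_SIZE, (v_texel >> mip) // PAGE_SIZE)
        if page in resident_pages:
            return page        # sample from this resident page
        mip += 1               # not resident: try the next coarser mip
    return None                # nothing resident: queue a page-fetch request

print(lookup(400, 700, 0, 2))  # (0, 3, 5) is resident -> sampled at full detail
print(lookup(900, 900, 0, 2))  # misses mips 0 and 1, falls back to (2, 1, 1)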
 
MS's eDRAM has no bearing on texture quality, other than freeing framebuffer bandwidth from the system RAM.

That's my point. It was Microsoft's idea for the Xbox 360 to solve the PS2 eDRAM problem by using eDRAM only as a framebuffer, and not at all for textures (using the 512 MB of unified GDDR for textures), but even this solution failed: even 10 MB for only basic framebuffer operations was insufficient to sustain any 720p + 2xAA image without tiling. They needed at least 14 MB of eDRAM. But that's the idea: for eDRAM bandwidth to solve rendering problems, you need a lot of eDRAM, which is too expensive and is against the logic of going with eDRAM...
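
The 14 MB figure is easy to sanity-check with simple arithmetic (assuming the usual 4 bytes of color and 4 bytes of depth/stencil per sample):

Code:
# Back-of-the-envelope framebuffer size for 1280x720 with 2x MSAA,
# assuming 4 bytes of color + 4 bytes of depth/stencil per sample.

WIDTH, HEIGHT = 1280, 720
BYTES_PER_SAMPLE = 4 + 4  # RGBA8 color + D24S8 depth/stencil (assumed formats)
SAMPLES = 2               # 2x MSAA

size_mb = WIDTH * HEIGHT * SAMPLES * BYTES_PER_SAMPLE / (1024 * 1024)
print(f"720p + 2xAA framebuffer: ~{size_mb:.1f} MB")  # ~14.1 MB, vs 10 MB of eDRAM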

With mipmapping, high-res textures should only exert bandwidth demands on the highest mip level, but then you'll be loading in fewer other textures because the high-res texture is filling a larger part of the screen. So I don't see that higher resolution textures require more BW. You just need more RAM. Of course, more textures, with multiple layers and a higher quality mip level will impact rendering performance.

I don't have enough technical knowledge on why high-res textures need a lot of bandwidth and not only more RAM; it is just my observation of modern GPU benchmarks. But if anyone here supports the idea, I would be glad to learn the technical reasoning behind it from them! :D
 