16x from 90nm to 32nm? Is that much really possible?
But isn't IBM's SOI eDRAM really different from the eDRAM in the daughter die?
http://blogs.gartner.com/andrew_white/files/2010/10/IBM-IOD-Information-Supply-Chain.JPG
Looks to be shopped from that. FAKE
good one
much ado about nothing then
> Depends on what you want to do. DICE clearly stated in one of their presentations that they would want "the simulation" to be accessible to both the CPU and GPU.

lol, I was shocked, because 80 MB of eDRAM for the CPU doesn't make any sense... it's the GPU that needs a lot of eDRAM, not the CPU. And after the resolution problems due to insufficient Xenos eDRAM (you need tiling) and the cost of eDRAM, I doubt Microsoft would go the same route again... Sony abandoned eDRAM after the PS2, and I predict Microsoft will do the same for its next Xbox.
Depends on what you want to do. DICE clearly stated in one of their presentations that they would want "the simulation" to be accessible to both the CPU and GPU.
eDRAM might allow fast access to both the GPU and CPU.
> It's nothing, it's just a fake slide with no relevance to reality.

If it's not the XCPU, what is it? A GPU?
eDRAM is not only good for bandwidth; actually, bandwidth is less of a problem than it was at the PS3/360 launch, and modern GPUs do fine with comparatively little bandwidth.
Look at an A8 (not because of the rumors per se): it outdoes the PS3 in every regard on 28 GB/s, really 20+ GB/s once memory-controller efficiency is taken into account.
And bandwidth isn't the whole story: CPUs are latency-sensitive, way more than GPUs. If you want the CPU to do something related to rendering, it could make sense to have the data always at hand instead of 1000 cycles away in RAM. Even if the eDRAM were quite slow, it would still be a good order of magnitude faster than RAM with respect to latency.

But this can be said about any fast RAM.
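To put rough numbers on that order-of-magnitude latency claim, here's a back-of-the-envelope sketch; the CPU clock and both latency figures are illustrative assumptions, not measurements of any real part:

```python
# Rough cost of one memory access in CPU cycles at a given core clock.
# All figures below are illustrative assumptions, not measurements.

CPU_CLOCK_GHZ = 3.2              # assumed Xenon-class console CPU clock

def stall_cycles(latency_ns: float, clock_ghz: float = CPU_CLOCK_GHZ) -> float:
    """Cycles the core sits idle waiting for one access to complete."""
    return latency_ns * clock_ghz  # GHz = cycles per nanosecond

main_ram_latency_ns = 100.0      # assumed external-DRAM round trip
edram_latency_ns = 10.0          # assumed on-package eDRAM, ~10x faster

print(f"main RAM miss: ~{stall_cycles(main_ram_latency_ns):.0f} cycles")
print(f"eDRAM miss:    ~{stall_cycles(edram_latency_ns):.0f} cycles")
# ~320 vs ~32 cycles: the order-of-magnitude gap described above.
```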
I'm not sure that "rendering textures" is the proper wording here, nor that you understand it properly (I don't understand it perfectly either, far from it, but some of what you say doesn't add up).

I think it is true that GPUs have become very efficient these last years and require less bandwidth than older GPU designs for the same operations, but while that holds for many kinds of rendering, it does not hold for at least one: texturing. High-resolution textures, even with all the new GPU texture-compression technologies, still require a lot of bandwidth. I bet a game rendering 4k×4k textures would instantly kill any modern GPU not benefiting from mammoth 200+ GB/s bandwidth... hell, even the 2k×2k high-res texture packs for Crysis, Battlefield, and Metro 2033 are a nightmare for any GPU short on bandwidth. That's why consoles have always suffered from the low-res-texture syndrome: bandwidth is very expensive, so it is made scarce in console hardware.
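To put rough numbers on those texture sizes, here's a quick sketch; the RGBA8 format and the 4:1 DXT-style compression ratio are assumptions for illustration, not a claim about any particular game:

```python
# Raw storage of a square RGBA8 texture, uncompressed vs. block-compressed.
# The 4 bytes/texel and the 4:1 ratio (DXT5/BC3-style) are assumptions;
# real formats and ratios vary.

BYTES_PER_TEXEL = 4      # RGBA8
COMPRESSION_RATIO = 4    # assumed DXT-style 4:1

def texture_mib(size: int, compressed: bool) -> float:
    """Size in MiB of one size x size RGBA8 texture (top mip only)."""
    raw = size * size * BYTES_PER_TEXEL
    return (raw / COMPRESSION_RATIO if compressed else raw) / 2**20

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {texture_mib(size, False):4.0f} MiB raw, "
          f"{texture_mib(size, True):3.0f} MiB compressed")
# 4096x4096 -> 64 MiB raw, 16 MiB compressed: streaming and sampling many
# such textures per frame adds up to serious bandwidth quickly.
```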
Sony tried to solve this problem with the PS2 Graphics Synthesizer's fast 48 GB/s eDRAM, but discovered that while eDRAM solves the rendering-bandwidth issue, its expense comes at the cost of capacity: only 4 MB. Microsoft faced the same challenge with its Xbox 360 and its insufficient 10 MB of eDRAM.
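For a sense of why 10 MB was insufficient, a worked example with the commonly cited 720p figures (4 bytes of color plus 4 bytes of depth/stencil per sample assumed):

```python
# Why the 360's 10 MB of eDRAM forces tiling at 720p with MSAA.
# Assumes 4 bytes of color (RGBA8) + 4 bytes of depth/stencil per sample.

import math

EDRAM_BYTES = 10 * 2**20
WIDTH, HEIGHT = 1280, 720
BYTES_PER_SAMPLE = 4 + 4         # color + depth/stencil

for msaa in (1, 2, 4):
    framebuffer = WIDTH * HEIGHT * BYTES_PER_SAMPLE * msaa
    tiles = math.ceil(framebuffer / EDRAM_BYTES)
    print(f"{msaa}x MSAA: {framebuffer / 2**20:5.1f} MiB -> {tiles} tile(s)")
# 1x fits (~7.0 MiB); 2x needs 2 tiles (~14.1 MiB); 4x needs 3 (~28.1 MiB).
```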
My conclusion is that eDRAM is not at all relevant today for next-gen consoles, and Microsoft learned its lesson and will abandon it for its next Xbox, as Sony did before with the PS3. Modern GPUs have become very efficient at doing more with less bandwidth, except for texturing, which requires not only a lot of bandwidth but also a lot of memory capacity, and eDRAM cannot solve both. The silicon budget would be better spent on more transistor logic, or on more, faster RAM.
There's a whole thread dedicated to that discussion. MS's eDRAM has no bearing on texture quality, other than freeing framebuffer bandwidth from the system RAM.
With mipmapping, high-res textures should only exert bandwidth demands at the highest mip level, but then you'll be loading in fewer other textures, because the high-res texture is filling a larger part of the screen. So I don't see that higher-resolution textures require more bandwidth; you just need more RAM. Of course, more textures, with multiple layers and a higher-quality top mip level, will impact rendering performance.
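To illustrate the mipmapping point with numbers: each mip level is a quarter the size of the one above it, so a full chain costs only about a third more than the top level, and sampling from lower mips touches far less data. A small sketch (RGBA8 assumed):

```python
# A full mip chain costs only ~4/3 of the top level, since each level is
# a quarter the size of the one above it. RGBA8 (4 bytes/texel) assumed.

def mip_chain_mib(size: int, bytes_per_texel: int = 4) -> list[float]:
    """Per-level sizes in MiB for a square texture's full mip chain."""
    levels = []
    while size >= 1:
        levels.append(size * size * bytes_per_texel / 2**20)
        size //= 2
    return levels

levels = mip_chain_mib(4096)
print(f"top level:   {levels[0]:.1f} MiB")     # 64.0 MiB
print(f"whole chain: {sum(levels):.1f} MiB")   # ~85.3 MiB, ~4/3 of the top
# A distant surface samples from a lower mip that is a tiny fraction of
# this, which is why only the highest mip level really stresses bandwidth.
```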