720p with FSAA is 14/28 MB for 2x/4x. I'm not sure what Megafenix is talking about, because it's the same everywhere, but you would have to tile with FSAA.
Do we know how much eDRAM is there?
In Wii U? It's 32MB.
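For a quick sanity check of those numbers, here's a minimal sketch, assuming a 1280x720 target with 32-bit colour plus 32-bit depth/stencil per sample (the exact buffer formats are my assumption, not confirmed):

```c
/* Back-of-envelope framebuffer sizes at 720p with FSAA.
   Assumes 4 bytes colour + 4 bytes depth/stencil per sample. */
#include <stdio.h>

int main(void) {
    const double MiB = 1024.0 * 1024.0;
    int w = 1280, h = 720;
    int bytes_per_sample = 4 + 4;  /* colour + depth/stencil */
    for (int samples = 2; samples <= 4; samples *= 2)
        printf("%dx FSAA: %.1f MB\n",
               samples, (double)w * h * bytes_per_sample * samples / MiB);
    return 0;
}
```

That gives ~14.1 MB at 2x and ~28.1 MB at 4x, so 4x at 720p would only just squeeze into a 32 MB pool, and anything smaller (like the 360's 10 MB) forces tiling.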
http://mynintendonews.com/2013/09/2...if-devs-cant-create-good-looking-wii-u-games/
“The Wii U eDRAM has a similar function as the eDRAM in the XBOX360. You put your GPU buffers there for fast access. On Wii U it is just much more available than on XBOX360, which means you can render faster because all of your buffers can reside in this very fast RAM. On Wii U the eDRAM is available to the GPU and CPU. So you can also use it very efficiently to speed up your application.”
“The 1GB application RAM is used for all the games resources. Audio, textures, geometry, etc. Theoretical RAM bandwidth in a system doesn’t tell you too much because GPU caching will hide a lot of this latency. Bandwidth is mostly an issue for the GPU if you make scattered reads around the memory. This is never a good idea for good performance.”
“I can’t detail the Wii U GPU but remember it’s a GPGPU. So you are lifted from most limits you had on previous consoles. I think that if you have problems making a great looking game on Wii U then it’s not a problem of the hardware.”
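That "scattered reads" point is easy to demonstrate in plain C; nothing below is Wii U specific, and the buffer size is an arbitrary illustration value:

```c
/* A minimal sketch of why scattered reads hurt: sequential traversal
   lets the cache/prefetcher hide memory latency, random-order
   traversal defeats it. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 24)  /* 16M ints (~64 MB), larger than any cache */

int main(void) {
    int *data = malloc(N * sizeof *data);
    size_t *idx = malloc(N * sizeof *idx);
    if (!data || !idx) return 1;

    for (size_t i = 0; i < N; i++) { data[i] = (int)i; idx[i] = i; }

    /* Fisher-Yates shuffle to build a scattered access order */
    srand(42);
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)rand() % (i + 1);
        size_t t = idx[i]; idx[i] = idx[j]; idx[j] = t;
    }

    long long sum = 0;
    clock_t t0 = clock();
    for (size_t i = 0; i < N; i++) sum += data[i];       /* sequential */
    clock_t t1 = clock();
    for (size_t i = 0; i < N; i++) sum += data[idx[i]];  /* scattered  */
    clock_t t2 = clock();

    printf("sequential: %.3fs  scattered: %.3fs  (sum=%lld)\n",
           (double)(t1 - t0) / CLOCKS_PER_SEC,
           (double)(t2 - t1) / CLOCKS_PER_SEC, sum);
    free(data); free(idx);
    return 0;
}
```

On a typical machine the scattered pass is several times slower than the sequential one, because the caches can no longer hide the memory latency - exactly the effect the quote describes.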
Do you mean on the WiiU? I think it has 32 MB of EDRAM according to rumours, iirc, but perhaps I am mistaken.
Developers Shin'en have been praising the WiiU recently, and they are very happy with how it performs. They speak highly of the console and don't quite understand why other developers can't use it as effectively and do justice to the platform.
http://mynintendonews.com/2013/09/2...if-devs-cant-create-good-looking-wii-u-games/
Shin'en said: On Wii U the eDRAM is available to the GPU and CPU. So you can also use it very efficiently to speed up your application.
That's good to know, right?
It's worth remembering that Shin'en are effectively 2nd-party Nintendo developers, although I don't personally think that changes the validity of what they are saying. They're a notoriously talented bunch of devs who (outside Shin'en) have a tonne of experience on all kinds of platforms. It's always great to hear their insights, as they actually give out quite a bit of info, plus they have a reputation for squeezing the most out of whatever platform they're working on. As some hip gangsta forum member once put it, they are "old school devs" known for their "tight coding", yo. (I'm paraphrasing, but that was the gist.)
I think it's worth remembering that there are "old school devs" capable of "tight coding" that are senior engineers in all big studios and all big engine makers.
Shin'en are currently getting fellated by the Nintendo crowd because they need a hero to believe in, while other talented developers are shit on because they don't support the narrative that morons want to propel. The truth is that all developers are constrained by the realities of their situation. Vague but positive comments - when comparing the Wii U to a seven-year-older system, ffs(!!!!) - don't really tell us much. They just help to promote that particular studio and their game to a userbase (understandable), but the flames of delusion can be fanned as collateral damage.
I think we more or less knew this, fail0verflow (Marcan) revealed a long time ago that there's 32MB of eDRAM mapped in the CPU's address space. Although that doesn't really say just how fast the CPU's access to it is. I wouldn't count on a really high speed bus between the CPU and GPU, but who knows.
If I remember correctly, the 32MB was questioned at some point to support the 55 nm (or 45 nm?) manufacturing-node hypothesis. But yes, most of it was known. It would be nice if they threw out some info on the shaders they use and polycounts.
[EDIT] BTW is 32MB@45nm possible given the die area?
I remember on PSP the VRAM (also eDRAM) was directly accessible by the CPU as well, and lower latency than the main memory.. I tried using it for normal buffers once but it didn't really help me ;p
The Wii U CPU has a total of 3MB of eDRAM as L2 cache, and the 32MB of eDRAM is embedded in the Wii U's GPU. They are separate chips, so in theory the CPU shouldn't have access to the GPU's eDRAM pool. Can anyone explain to me how the CPU can have direct access to the GPU's eDRAM?
http://en.wikipedia.org/wiki/POWER7
IBM mentioned POWER7 out of nowhere in connection with the Wii U; it was later "debunked". Still, POWER7 can use eDRAM as L3 cache, and 32MB is supported, though it should be on the same chip, right? Also, the maximum eDRAM per core as L3 cache was 4MB, so it is possible that the Wii U has something from POWER7 implemented, and we know the Wii U CPU is made by IBM on a 45 nm process, like the POWER7 chips. That leaves the possibility open, right?
Were the Xbox 360 and PlayStation 3 CPUs a mixture of the PowerPC and POWER series?
Anyway, after the latest update some users are reporting louder/faster fans and that their Wii Us are warmer, even close to hot. Can anyone measure the power consumption of the Wii U right now? If it is above 30 watts it would point to bumped clocks, and above 40 watts would confirm it.
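For what that inference is worth, dynamic power scales roughly with clock and the square of voltage (P ~ C·V²·f). A tiny sketch with hypothetical numbers - none of these are measured Wii U figures:

```c
/* Rough "watts => clocks" estimate via P ~ C * V^2 * f.
   All inputs are HYPOTHETICAL illustration values. */
#include <stdio.h>

int main(void) {
    double p_old   = 33.0;          /* assumed baseline draw, watts  */
    double f_ratio = 600.0 / 550.0; /* hypothetical GPU clock bump   */
    double v_ratio = 1.0;           /* assume voltage left unchanged */

    /* Crude upper bound: pretends the whole system draw scales with
       the GPU clock, so a real bump would move the needle even less. */
    double p_new = p_old * f_ratio * v_ratio * v_ratio;
    printf("estimated draw after bump: %.1f W\n", p_new);  /* ~36.0 W */
    return 0;
}
```

So a reading well past 40 W would be hard to explain with a modest clock bump alone; it would imply a voltage increase on top.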
The Wii U's GPU must be a Radeon HD 5550. I know some people will disagree, but Call of Duty: Black Ops 2's local co-op zombie mode proves that it has Eyefinity, and there are further hints/evidence:
http://www.ign.com/boards/threads/o...ower-thread.452775697/page-203#post-481660123
So... I am new in this forum so please, don't be harsh.
Been out of the loop for a good long while with regard to what's been discussed about Wii U, and what has been discovered nearly a year after release.
I was wondering if the level of quality seen in the updated UE3 demo from 2010 would be possible on Wii U at 720p/30fps?
http://www.youtube.com/watch?v=kjxL_J_7j9M
I realize UE4 is far beyond the scope of its capabilities, but since UE3 is fine, why not the improved version?
Direct access means that it is mapped in the CPU's address space, not that there is a direct data bus between the CPU and the eDRAM, if that's what you meant.
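To illustrate what "mapped in the CPU's address space" means in practice, here's a hedged sketch. The malloc stand-in is entirely hypothetical - on the real console the OS exposes the window at a fixed address, and nothing below is actual Wii U code:

```c
/* What a mapped eDRAM window looks like from the CPU's side: ordinary
   loads and stores to an address range. We back it with malloc purely
   so the snippet runs on a PC; the layout is an ILLUSTRATION only. */
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

#define EDRAM_SIZE (32u * 1024 * 1024)  /* the 32 MB pool */

int main(void) {
    /* Stand-in for the mapped window; on hardware this would be a
       fixed range exposed by the kernel, not a malloc. */
    volatile uint32_t *edram = malloc(EDRAM_SIZE);
    if (!edram) return 1;

    /* To the CPU it is just memory: plain stores and loads, no special
       bus commands, regardless of which die the cells sit on. */
    edram[0] = 0xCAFEBABE;
    printf("read back: 0x%08" PRIX32 "\n", edram[0]);

    free((void *)edram);
    return 0;
}
```

The point is that once the mapping exists, the CPU uses ordinary loads and stores; how fast those are then depends entirely on the bus between the two chips, which is exactly the part we don't know.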
Latte's bandwidth is 563.2 GB/s thanks to the 8192-bit, 32MB eDRAM from Renesas clocked at 550 MHz and embedded into the GPU.
Why would Nintendo want so much BW? It makes zero sense to have such a wide bus.
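For the record, the arithmetic behind that figure is just bus width times clock; whether Latte's eDRAM port is really 8192 bits wide is the poster's claim, not a confirmed spec:

```c
/* Where the 563.2 GB/s figure comes from: an 8192-bit port moving
   data once per 550 MHz clock. The bus width itself is unconfirmed. */
#include <stdio.h>

int main(void) {
    double bus_bits = 8192.0;
    double clock_hz = 550e6;
    double bytes_per_s = (bus_bits / 8.0) * clock_hz;  /* 1024 B/clock */
    printf("%.1f GB/s\n", bytes_per_s / 1e9);          /* -> 563.2 */
    return 0;
}
```

And that would be a theoretical peak anyway; sustained figures are always lower.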