Nice try, but seriously ... applying H.264 compression to lower the bandwidth without introducing latency?
480p @ 30 fps ≈ 15.5 MB/s (assuming YUV 4:2:0). Raw video won't work over 802.11g (though the controller was only 802.11g-like).
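A quick back-of-the-envelope check of that figure (a sketch, not from the thread; the 720×480 frame size is my assumption, since it is what reproduces the 15.5 MB/s number):

```cpp
// Raw-video bandwidth estimate for the GamePad stream.
// Assumptions: 720x480 frame, YUV 4:2:0 (1.5 bytes per pixel), 30 fps.
#include <cstdio>

int main() {
    const double width = 720, height = 480;
    const double bytes_per_pixel = 1.5;   // YUV 4:2:0
    const double fps = 30;

    const double bytes_per_sec   = width * height * bytes_per_pixel * fps;
    const double megabytes_per_s = bytes_per_sec / 1e6;        // ~15.6 MB/s
    const double megabits_per_s  = bytes_per_sec * 8.0 / 1e6;  // ~124 Mbit/s

    // 802.11g tops out at 54 Mbit/s nominal (far less in practice),
    // so an uncompressed stream clearly doesn't fit.
    std::printf("raw: %.1f MB/s = %.0f Mbit/s\n", megabytes_per_s, megabits_per_s);
    return 0;
}
```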
I don't think they use H.264. The system was explained in a recent Iwata Asks:
http://iwataasks.nintendo.com/interviews/#/wiiu/gamepad/0/1

Generally, for a video compression/decompression system, compression will take place after a single frame of image data has been put into the IC. Then it is sent wirelessly and decompressed at the receiving end. The image is sent to the LCD monitor after decompression is finished.
But since that method would cause latency, this time, we thought of a way to take one image and break it down into pieces of smaller images. We thought that maybe we could reduce the amount of delay in sending one screen if we dealt in those smaller images from output from the Wii U console GPU on through compression, wireless transfer, and display on the LCD monitor.
Generally, compression for a single screen can be done per 16×16 macroblock. And on Wii U, it rapidly compresses the data, and the moment the data has built up to a packet size that can be sent, it sends it to the Wii U GamePad.
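In other words, the encoder works on small slices and ships them as soon as a packet's worth of compressed data exists, instead of buffering a whole frame. A minimal sketch of that scheduling idea (my own illustration with made-up names and sizes, not Nintendo's implementation):

```cpp
// Slice-based streaming: compress rows of 16x16 macroblocks and transmit a
// packet as soon as enough compressed data has accumulated, rather than
// waiting for the whole frame to finish encoding.
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Frame { int width = 720, height = 480; /* pixel data omitted */ };

// Stubs standing in for the real codec and the wireless link.
std::vector<uint8_t> compress_macroblock_row(const Frame&, int /*row*/) {
    return std::vector<uint8_t>(600, 0);   // pretend each row compresses to ~600 bytes
}
void send_packet(const std::vector<uint8_t>& payload) {
    std::printf("sent packet: %zu bytes\n", payload.size());
}

void stream_frame(const Frame& frame, std::size_t packet_size = 1400) {
    std::vector<uint8_t> pending;
    const int rows = frame.height / 16;    // one slice = one row of 16x16 macroblocks

    for (int row = 0; row < rows; ++row) {
        // Compress a slice as soon as the GPU has produced it...
        std::vector<uint8_t> chunk = compress_macroblock_row(frame, row);
        pending.insert(pending.end(), chunk.begin(), chunk.end());

        // ...and ship it the moment a packet's worth of data is ready.
        while (pending.size() >= packet_size) {
            std::vector<uint8_t> packet(pending.begin(), pending.begin() + packet_size);
            send_packet(packet);
            pending.erase(pending.begin(), pending.begin() + packet_size);
        }
    }
    if (!pending.empty()) send_packet(pending);   // flush the tail of the frame
}

int main() {
    Frame frame;
    stream_frame(frame);
    return 0;
}
```

The latency win comes from overlapping compression, transmission and decompression of different slices of the same frame.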
Anyway, the problem I see here is that, as stated several times, Nintendo themselves highlight the feature. Why should they? They usually simply don't talk about tech; this was a very rare exception. Do you think they fell for some AMD snake oil? That's very hard to believe.
What's the point? With the DS, Nintendo simply didn't talk about the technology. Same with Wii. Same with 3DS. It's been their MO for roughly a decade. They usually do not consider technology a selling point. Why should they do it now? Why focus on this particular feature? Why not focus on the eDRAM instead, which is there, will be used, and has obvious benefits? And why highlight the feature in spec sheets aimed at developers, who would/should know better? Not to mention most consumers don't even know what GPGPU means, and neither do they care. Think about it for a second, and you'll probably agree that it makes no sense at all - unless there's more to it.

No, but I believe in Occam's Razor. Some consumers have fallen for Nintendo's snake oil of GPGPU.
And once again: Nintendo did mention it (I even posted when and where, so just look it up if you don't believe me). Not as a "savior", because there's no indication that the system needs one to begin with, and there definitely wasn't any such indication at the time they mentioned it.

There's nothing more to it. Too many people drank too much Nintendo Kool-Aid. Nintendo themselves have not mentioned GPGPU as being the savior of the WiiU. It's the NDFers that have been hyping it as some WiiU savior. Once again, GPGPU is no magic bullet. It is no savior.
Except the whole GPGPU stuff isn't based on rumors, it's a feature highlighted by Nintendo itself, both in public presentations and in the documentation for developers.
No, you don't understand the reality of the situation. I suggest you reread the posts of other developers and well-informed posters such as Sebbi, ERP, and Shifty Geezer. GPGPU is no magic bullet. Please stop with the silly dreams. Yours will just get crushed, as GPGPU will not amount to anything that vastly improves the performance of the WiiU.
[EDIT: Function, I truly hope you were being tongue-in-cheek sarcastic. If so, it was missed the first time I read your post.]
Of course he is joking.
The scary thing is that I have seen people state that and not be joking.
Not at all. Wuu's GPU is definitely 100% capable of GPGPU. Only we don't know to what degree, and how that helps in games. Xenos and RSX are also capable of GPGPU work - the origins of GPGPU were in using graphics functions on non-graphics data by formatting the data in a way that matched the graphics functions. It's a very clever, innovative solution to extracting performance from limited hardware. AMD and nVidia and friends (MS, Khronos Group) have taken these ideas and worked towards improving the flexibility of the GPUs to enable a broader range of workloads to be performed, but GPGPU itself isn't a feature that is or isn't present in a GPU. Like DirectX - a GPU isn't DirectX or not; a GPU has a degree of hardware support for various features in DirectX. A GPU isn't a GPGPU or not; it'll support a number of features that aid GPGPU work. We have no details on what Wuu's GPGPU level is, and can only guess by likely GPU architecture.
GPGPU is seeing its way into supercomputers. It's on all the GPU roadmaps. It's a feature that has the complete investment of the GPU IHVs as a necessary component to remain competitive, even if devs aren't using it particularly well in PCs yet.

I tend to believe that hardware manufacturers like AMD or Nvidia usually have to follow PC paradigms, even more so considering neither of the two is the market leader. It makes no sense putting too much time and money into a feature nobody will use, especially not if it requires costly and rarely used additions or compatibility-breaking changes to the hardware. In the embedded space, there's no reason to hold back.
They can also buy in people with experience, such as from AMD or nVidia. However, it's a field still in its early days, and there's no way Nintendo can be trusted to bring broad expertise that helps other devs refactor their game engines - there's no reason for them to have this superior expertise unless they've heavily invested in research for years prior to Wuu's release. Furthermore, a GPU cannot (yet) replace a CPU in terms of the types of code it can handle or the types of jobs it can do. I'm no expert on game code, so I could entertain the notion of things like physics being executed on Wuu's GPU, but the likes of ERP are telling us otherwise. Nintendo's commentary about GPGPU is thus pretty meaningless. Yes, they did highlight it, which is rare for Nintendo, but then they were facing an uphill struggle dealing with the game media, and throwing them a bone of optimism seems a likely PR move.

And I'm afraid a few LinkedIn profiles got changed/erased after one or two got too much attention, so you probably won't find the sources for my claims anymore. But Nintendo has (or had) people working on that stuff. I mean 3rd party middleware optimizations; I never found a concrete mention of GPGPU either.
And once again: Nintendo did mention it (I even posted when and where, so just look it up if you don't believe me). Not as a "savior", because there's no indication that the system needs one to begin with, and there definitely wasn't any such indication at the time they mentioned it.
Nintendo did mention compute shader support (I guess wsippel refers to that). But at the most basic level this just means the GPU can execute shader programs outside of the graphics pipeline (so you don't need to set up the complete pipeline with pass-through vertex/geometry shaders and render a quad of the desired size, with your shader disguised as a pixel shader, to an offscreen target). That basic level got introduced in the R700 generation (plus the LDS) and basically comes for free.
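For readers unfamiliar with the term, here is a minimal desktop OpenGL 4.3 sketch (GLFW and GLEW assumed) of what "executing a shader outside the graphics pipeline" looks like: no vertex/geometry/pixel shaders, no full-screen quad, no offscreen render target. It is only an illustration of the concept, not Wii U/GX2 code, and the GL 4.3 compute path shown here is newer than the R700 generation mentioned above.

```cpp
// Minimal compute-shader example: run a shader with no graphics pipeline at all.
// Desktop OpenGL 4.3 with GLFW + GLEW; purely illustrative, not Wii U code.
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#include <cstdio>
#include <vector>

static const char* kComputeSrc = R"(#version 430
layout(local_size_x = 64) in;
layout(std430, binding = 0) buffer Data { float values[]; };
void main() {
    uint i = gl_GlobalInvocationID.x;
    values[i] = values[i] * 2.0;   // trivial "GPGPU" work: double every element
})";

int main() {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 4);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_VISIBLE, GLFW_FALSE);          // no window shown; just need a context
    GLFWwindow* win = glfwCreateWindow(64, 64, "", nullptr, nullptr);
    glfwMakeContextCurrent(win);
    glewInit();

    // Compile and link the compute shader: no pass-through vertex shader,
    // no quad, no render target.
    GLuint cs = glCreateShader(GL_COMPUTE_SHADER);
    glShaderSource(cs, 1, &kComputeSrc, nullptr);
    glCompileShader(cs);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, cs);
    glLinkProgram(prog);

    // Upload a buffer of data, dispatch the shader, read the results back.
    std::vector<float> data(1024, 1.0f);
    GLuint ssbo;
    glGenBuffers(1, &ssbo);
    glBindBuffer(GL_SHADER_STORAGE_BUFFER, ssbo);
    glBufferData(GL_SHADER_STORAGE_BUFFER, data.size() * sizeof(float),
                 data.data(), GL_DYNAMIC_COPY);
    glBindBufferBase(GL_SHADER_STORAGE_BUFFER, 0, ssbo);

    glUseProgram(prog);
    glDispatchCompute(1024 / 64, 1, 1);                // 16 work groups of 64 invocations
    glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT);

    glGetBufferSubData(GL_SHADER_STORAGE_BUFFER, 0,
                       data.size() * sizeof(float), data.data());
    std::printf("values[0] = %f\n", data[0]);          // expect 2.0

    glfwDestroyWindow(win);
    glfwTerminate();
    return 0;
}
```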
Digital Foundry said:
What's interesting about the read-out overall is that similar events can stress all three engines, but it's the extent to which frame-rates are impacted that varies dramatically. The initial scene doesn't look too promising for Wii U: indeed, we see three distinct performance bands - Xbox 360 at the top, PS3 in the middle and the new Nintendo console right at the bottom. It's clear that plenty of characters and full-screen transparencies are particular Achilles' heels for the Wii U, a state of affairs that persists in further clips later on. However, beyond that we see a fairly close match for the PlayStation 3 version in other scenarios and occasionally it even pulls ahead of the Sony platform.
So then I thought (perhaps a little naively): "the WiiU has a 64-bit bus going to main memory, what if it had another 64-bit bus (the 'other channel') pointing at the edram?" The simplest way might be to run both channels at the same speed and simply address the different banks of memory sequentially.
... which is the possible (likely?) clock of the DDR3 the WiiU uses. Would this be possible? Can edram be accessed using DDR data pins? Could ~13 GB/s of video memory bandwidth be in the right kind of area for what we're seeing? Seems very low, but, yunno .... Nintendo
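For what it's worth, the arithmetic behind that ~13 GB/s figure works out as follows (a sketch; DDR3-1600, i.e. an 800 MHz I/O clock, is my assumption for the speed grade, since the exact number isn't quoted above):

```cpp
// Peak bandwidth of a single 64-bit channel at a DDR3-1600-style data rate.
#include <cstdio>

int main() {
    const double bus_width_bytes   = 64.0 / 8.0;   // 64-bit channel = 8 bytes per transfer
    const double transfers_per_sec = 1600e6;       // DDR3-1600: 1600 MT/s (800 MHz clock, DDR)

    const double bytes_per_sec = bus_width_bytes * transfers_per_sec;
    std::printf("%.1f GB/s per 64-bit channel\n", bytes_per_sec / 1e9);   // ~12.8 GB/s
    return 0;
}
```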
No to what? Can't access edram using something like a DDR3 bus?

You probably COULD, but it wouldn't make sense, as DDR3 and such are designed as off-chip interfaces, to tolerate use with memory board add-in slots (DIMM, SODIMM, etc.) and so on.