I remember some developers stating here that the GPU doesn't support shaders, so it's not a far-fetched possibility that the Wii's rasterizer doesn't support HDR. At best, and this is a far-fetched possibility too, it could support some kind of *basic* AA (there's a very interesting thread on the subject).

My last laptop, which had a GeForce Go 7200 (3 VS / 4 PS / 4 TMU / 2 ROP @ 450 MHz), handled Far Cry with patch 1.3 (which added HDR) pretty well, and it held up decently against HL2: Lost Coast too. Sure, the Wii is barely half of that on the raw pixel- and texture-pushing side (or is it?), but the Wii is a closed system, which allows full use of its hardware, and it only has to run at 480p max. Not to mention the Wii is a very efficiently designed console for its abilities, improving vastly over the GameCube with double the RAM, even if its theoretical fillrates are only 50% higher. Honestly, HDR on the Wii shouldn't be too difficult as long as its GPU cost is kept in check.
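Just to put rough numbers on that "barely half" remark, here's a quick back-of-the-envelope sketch. The Go 7200 figures are the ones from my laptop above; the Wii "Hollywood" and GameCube "Flipper" numbers (4 pixel pipes with 1 TMU each, at roughly 243 MHz and 162 MHz) are just the commonly cited figures rather than official specs, so treat them as assumptions:

```python
def fillrates(pipes, tmus, clock_mhz):
    """Return (pixel fillrate, texel fillrate) in Mpixels/s and Mtexels/s."""
    return pipes * clock_mhz, tmus * clock_mhz

go7200    = fillrates(pipes=2, tmus=4, clock_mhz=450)  # (900, 1800)
hollywood = fillrates(pipes=4, tmus=4, clock_mhz=243)  # (972, 972)  - assumed specs
flipper   = fillrates(pipes=4, tmus=4, clock_mhz=162)  # (648, 648)  - assumed specs

print("Go 7200   :", go7200)
print("Hollywood :", hollywood)
print("Flipper   :", flipper)
print("Wii vs GC :", hollywood[0] / flipper[0])  # ~1.5, i.e. the "50% higher"
```

By those (assumed) numbers the Wii would actually edge out the Go 7200 on raw pixel fillrate and only lose on texel fillrate, so maybe the "or is it?" is warranted.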
Quick question: were there any games with true HDR on the Xbox? Also, what about depth of field? I think Pokemon what's-it-called on the Wii used depth of field too. Even if the fillrates may not be all that large, it seems Nintendo might have anticipated the use of advanced techniques in low doses at 480p.
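By "true HDR" I mean roughly this: the scene gets rendered into a buffer with more than 8 bits per channel so highlights can go above 1.0, and then a tone-mapping pass squeezes it back into displayable range. Here's a minimal NumPy sketch of the idea using the Reinhard operator; the buffer contents and exposure value are made up for illustration, and I'm not claiming any of these consoles actually did it this way:

```python
import numpy as np

# Pretend this is a 480p frame rendered in linear, floating-point color, so
# highlights can go well above 1.0 (that's the "true HDR" part). The values
# here are random placeholders, not real scene data.
hdr = np.random.rand(480, 640, 3).astype(np.float32) * 4.0

exposure = 1.0           # made-up exposure setting
scaled = hdr * exposure

# Reinhard tone mapping: compresses [0, inf) into [0, 1), so bright areas
# roll off smoothly instead of just clipping to white.
ldr = scaled / (1.0 + scaled)

# Quantize to the 8-bit range a 480p display would actually show.
frame = (ldr * 255.0 + 0.5).astype(np.uint8)
print(frame.shape, frame.dtype)   # (480, 640, 3) uint8
```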
That's probably because the GPU is not a RISC processor but a CISC processor. Most modern processors are RISC (that's the present and the future), while others are CISC. That might explain some of the limitations of the console, as the Wii's processors are (most probably) CISC. RISC processors have always been better than CISC processors.
That's just my opinion; I'm not a developer, but a techie (well, kind of) and an interested gamer. I'm here to learn as much as I can from everyone about technology, and I hope that as I go along I can get to a point where I can help others. :smile: