HDR on the Wii.

Looks like I have to go edit my article some more. Here's Cursed Mountain on the Wii. The engine stuff says there's HDR, and the last few minutes of the trailer look to deliver that. I wonder, how is the Wii doing HDR? I remember when it was first revealed, it killed video cards. Is it one of those effects that becomes viable on a weaker system due to the lower geometry load?

[gt]38797[/gt]
 
Perhaps it's INT8-based HDR? I think the 360 uses INT8, FP10, and FP16, and the PS3 can use INT8, FP16, and FP32. Though I think FP32 is a huge power hog for the PS3, and FP16 is the same on the 360.
 
I didn't see anything in that trailer to 100% convince me that they're using a full-out render to HDR format->calc luminance->tonemap pipeline (not going to rule it out either, who knows what these guys can come up with).
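For reference, the "render to HDR format -> calc luminance -> tonemap" pipeline mentioned above can be sketched like this. This is a minimal illustration on a tiny float "framebuffer", assuming a log-average luminance and the simple Reinhard curve as the tonemap operator; none of it is from the Cursed Mountain engine, whose internals are unknown.

```python
import math

def log_average_luminance(pixels, delta=1e-4):
    """Classic log-average scene luminance; delta avoids log(0) on black pixels."""
    total = sum(math.log(delta + lum) for lum in pixels)
    return math.exp(total / len(pixels))

def reinhard_tonemap(lum, scene_lum, key=0.18):
    """Map an HDR luminance into [0, 1) with the simple Reinhard curve."""
    scaled = key * lum / scene_lum   # expose for the scene's average brightness
    return scaled / (1.0 + scaled)   # compress highlights instead of clipping

# A "framebuffer" of HDR luminances: mostly dim pixels plus one bright sun.
hdr = [0.05, 0.1, 0.2, 0.15, 8.0]
avg = log_average_luminance(hdr)
ldr = [reinhard_tonemap(l, avg) for l in hdr]
# The sun pixel compresses toward 1.0 instead of clipping; dim pixels keep detail.
```

The point of the full pipeline is that the sun pixel rolls off smoothly rather than clamping to white, which is exactly the behaviour that's hard to distinguish from faked overbright in gameplay footage.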

Also there's lots of ways to fake the effects of HDR, or to achieve only some of them. For example you have something like in Killzone where they render a scaled luminance value to the alpha-channel and use that to indicate overbrightened pixels, or you have what Team ICO used in Shadow of the Colossus.
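The Killzone-style trick mentioned above can be sketched roughly as follows: the back buffer stays 8-bit, but each pixel's alpha stores a scaled luminance so a post pass can tell which pixels are "overbright". The scale factor and bloom threshold here are invented illustrative numbers, not Guerrilla's actual values.

```python
LUM_SCALE = 4.0  # assumed: alpha of 1.0 means "4x brighter than white"

def shade_pixel(r, g, b):
    """Return an RGBA8-style tuple: clamped colour plus scaled luminance in alpha."""
    lum = 0.2126 * r + 0.7152 * g + 0.0722 * b      # Rec. 709 luma weights
    alpha = min(lum / LUM_SCALE, 1.0)                # encode brightness in alpha
    return (min(r, 1.0), min(g, 1.0), min(b, 1.0), alpha)

def bloom_weight(pixel, threshold=0.25):
    """Post pass: use the alpha channel to pick out overbright pixels."""
    return max(pixel[3] - threshold, 0.0)  # only pixels above the threshold bloom

sun = shade_pixel(6.0, 5.5, 4.0)   # an overbright pixel, clamped to white in RGB
wall = shade_pixel(0.3, 0.3, 0.3)  # an ordinary pixel, no bloom contribution
```

The colour information past white is lost, but the bloom pass still knows *how* overbright each pixel was, which buys you the most visible HDR effect without an HDR render target.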
 
Looks like I have to go edit my article some more. Here's Cursed Mountain on the Wii.
How do you know it's from Cursed Mountain? The title name wasn't clear from the trailer.
[/sarcasm]

The engine stuff says there's HDR, and the last few minutes of the trailer look to deliver that.
What's telling you it's HDR? I'm only seeing overbright, which can be faked comfortably enough. I'm skeptical of engines claiming HDR, as they may mean 'HDR effect' rather than 'extended brightness range', and it's hard to know which they're using.
 
What's telling you it's HDR? I'm only seeing overbright, which can be faked comfortably enough. I'm skeptical of engines claiming HDR, as they may mean 'HDR effect' rather than 'extended brightness range', and it's hard to know which they're using.

That's kind of the reason why I made this post. They made the claim, I ask if their claims are real/how they're doing it. Simple as that.
 
Well, I dunno how we can pin it down! Exceptional reflection detail in changing lighting conditions is one area I'd look for HDR, but there's little scope for that. Lots of banding/posterization in bloom could point towards a low memory HDR format.
 
Yeah, I guess it is a little hard to do it from gameplay footage. Usually, I think of eye adaptation effects and reflection when I think HDR, but I recently discovered that bloom (I think) can also do the eye adaptation thing.
 
GT3 had a brilliant 'eye adaptation' thing going on when driving around one of the city courses and in to the sun, all without HDR AFAIK. HDR was a Godsend for offline rendering, but I don't know how much it truly brings to the table on consoles. General lighting seems improved through it, or rather, all the best lit games use HDR. But from where I'm sitting, I think it's more about ease, not needing to fake solutions for common optical effects.
 
What's telling you it's HDR? I'm only seeing overbright, which can be faked comfortably enough.

Their PR release on their engine claims they have "HDR." I can't vouch for what they mean by that, but that's what they say. The last bit with the sun glaring off the walls looks a lot like sundown in Twilight Princess.
 
SOTC looked like bloom to me. Still, as long as it recreates the look, it's good enough if you ask me.

Well yeah, it was bloom, but the trick they used was to adjust the amount of bloom in certain pre-defined areas to fake the effect of automatic exposure adjustment (which is arguably the most recognizable effect of most HDR implementations). So for example when you're outside in a field you can see everything pretty clearly, but when you're inside the temple the windows are completely bloomed out. What Shifty is talking about with GT3 is probably the same thing: have certain areas marked as either "dark" or "bright" and adjust the bloom amount accordingly.
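The pre-authored "fake exposure" trick described above amounts to a per-zone lookup, something like the sketch below. The zone names, bloom values, and the linear blend on transitions are all invented for illustration; SOTC's actual authoring scheme isn't public.

```python
# Designers tag regions of the level with a bloom strength; the renderer
# just looks up whichever zone the camera is currently in.
BLOOM_BY_ZONE = {
    "field": 0.2,    # outdoors: scene reads normally
    "temple": 1.0,   # indoors: windows bloom out completely
}

def bloom_amount(zone, blend=1.0, from_zone=None):
    """Look up the authored bloom strength, blending across zone transitions."""
    target = BLOOM_BY_ZONE[zone]
    if from_zone is None:
        return target
    start = BLOOM_BY_ZONE[from_zone]
    return start + (target - start) * blend  # lerp to avoid a visible pop

# Walking from the field into the temple, bloom ramps 0.2 -> 1.0, which
# the eye reads as the camera "adjusting" to the dark interior.
```

No scene luminance is ever measured; the whole effect is data-driven, which is why it's so cheap on fixed-function-era hardware.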
 
In GT3 they also dimmed the normally lit geometry, the road and cars etc., increasing the contrast to simulate iris contraction. There was something similar in Rogue Galaxy when leaving buildings, but it was a bit coarse.
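The GT3-style variant is even simpler: rather than brightening the sky, dim the ordinary geometry while the camera faces the light. A minimal sketch, with the dim factor invented for illustration:

```python
def iris_dim(rgb, facing_sun=False, dim=0.6):
    """Scale down normally lit scene colours while looking into the light."""
    scale = dim if facing_sun else 1.0
    return tuple(c * scale for c in rgb)

road = (0.5, 0.5, 0.5)
dimmed = iris_dim(road, facing_sun=True)
# The road darkens while the sun stays at full brightness, so the contrast
# between them increases, reading as the eye's iris contracting.
```

Since only a constant multiplier changes per frame, this costs essentially nothing, which fits a PS2-era renderer.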
 
Considering the much higher amount of memory available as well as the pure efficiency of the entire system, I don't think pseudo or true HDR would be too difficult at 480p.
 
HDR or not, looks pretty neat to me. Better than that Conduit thing. I don't think it really matters if there is HDR or not. A lot of times HDR gets implemented poorly anyway, and all you end up with is overbright stuff that actually looks less realistic than if you'd turned off the HDR.

All in my opinion, of course.
 
HDR or not, looks pretty neat to me. Better than that Conduit thing. I don't think it really matters if there is HDR or not. A lot of times HDR gets implemented poorly anyway, and all you end up with is overbright stuff that actually looks less realistic than if you'd turned off the HDR.
It would be VERY interesting to know the performance impact of HDR on most systems. It must be very noticeable to developers, like a 70% or higher framerate drop, I guess.
 
It would be VERY interesting to know the performance impact of HDR on most systems. It must be very noticeable to developers, like a 70% or higher framerate drop, I guess.

My last laptop, which had a GeForce Go 7200 (3 V: 4 PS: 4 TMU: 2 ROP @ 450 MHz config), handled Far Cry w/ patch 1.3 (which added HDR) pretty well and held up against HL2: Lost Coast decently too. Sure, the Wii is barely half of that on the raw pixel and texture pushing side (or is it?), but the Wii is a closed system, which allows for full support of its hardware, and it's running 480p max. Not to mention the Wii is a very efficiently designed console for its abilities, improving vastly over the Gamecube with double the RAM, even if its theoretical fillrates are only 50% higher. Honestly, HDR on the Wii shouldn't be too difficult as long as it's kept in check on GPU usage.

Quick question, were there any games with true HDR on the Xbox? Also what about depth of field? I think Pokemon what's-it-called on the Wii did use depth of field too. Even if the fillrates may not be large enough, it seems Nintendo might have anticipated the use of advanced techniques in low doses at 480p.
 
Wii is a closed system, which allows for full support of its hardware, and it's running 480p max.

Exactly which HDRR implementation are you thinking of :?:

Quick question, were there any games with true HDR on the Xbox?

No, but there was one method described at GDC2004 to try and fake it on DX8-class hardware via programmable shaders.
http://www.daionet.gr.jp/~masa/archives/GDC2004/GDC2004_PIoHDRR_SHORT_EN.ppt


Also what about depth of field?
This is just a shader. Several games employed it, e.g. Halo 2, Conker, Chaos Theory.
 
My last laptop, which had a GeForce Go 7200 (3 V: 4 PS: 4 TMU: 2 ROP @ 450 MHz config), handled Far Cry w/ patch 1.3 (which added HDR) pretty well and held up against HL2: Lost Coast decently too.

I didn't know you could run either of those games @ 320x240 ;)
 