HDR on the Wii.

My last laptop, which had a GeForce Go 7200 (3 VS : 4 PS : 4 TMU : 2 ROP @ 450 MHz), handled Far Cry with patch 1.3 (which added HDR) pretty well, and held up decently against HL2: Lost Coast too. Sure, the Wii is barely half of that on the raw pixel- and texture-pushing side (or is it?), but the Wii is a closed system, which allows full use of its hardware, and it's running at 480p max. Not to mention the Wii is a very efficiently designed console for its abilities, improving vastly over the GameCube with double the RAM, even if its theoretical fillrates are only 50% higher. Honestly, HDR on the Wii shouldn't be too difficult as long as its GPU usage is kept in check.

Quick question: were there any games with true HDR on the Xbox? Also, what about depth of field? I think Pokemon what's-it-called on the Wii did use depth of field too. Even if the fillrates may not be large enough, it seems Nintendo might have anticipated the use of advanced techniques in low doses at 480p.
I remember some developers stating here that the GPU doesn't support shaders, so it's not a far-fetched possibility that the Wii's rasterizer doesn't support HDR. At best, and this is a far-fetched possibility too, it could support some kind of *basic* AA (there's a very interesting thread on the subject).

That's probably due to the fact that the GPU is not a RISC processor but a CISC processor. Most modern processors are RISC; it's the present and the future, while other processors are CISC. That might explain some of the limitations of the console, as the Wii's processors are (most probably) CISC. RISC processors have always been better than CISC processors.

That's just my opinion, as I'm not a developer but a techie (well, kind of) and an interested gamer. I'm here to learn as much as I can from everyone about technology, and I hope that as I go along I can make it to a point where I can help others :smile:
 
I thought the PowerPC 750CX (the hypothesized Wii CPU) was RISC? But I also remember seeing info on the Wii mentioning some 40+ new instructions added to the instruction set over the GameCube's.
 
I remember some developers stating here that the GPU doesn't support shaders, so it's not a far-fetched possibility that the Wii's rasterizer doesn't support HDR. At best, and this is a far-fetched possibility too, it could support some kind of *basic* AA (there's a very interesting thread on the subject).

The GPU supports register combiners, which can be used to program various effects. Also, by AA do you mean anti-aliasing? If so, that's not far-fetched; it's most definitely supported.
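
For anyone wondering what "register combiners" means in practice here: below is a minimal sketch using the homebrew libogc GX API (so the names match libogc, not Nintendo's official SDK), assuming a texture has already been loaded into GX_TEXMAP0. It just configures a single TEV stage with a canned modulate combine.

```c
// Minimal TEV ("register combiner") setup sketch, libogc names.
// One combiner stage multiplying texture colour by vertex colour.
#include <gccore.h>

void setup_basic_tev(void)
{
    // One TEV stage, fed by texcoord 0 / texture map 0 / vertex colour 0.
    GX_SetNumTevStages(1);
    GX_SetTevOrder(GX_TEVSTAGE0, GX_TEXCOORD0, GX_TEXMAP0, GX_COLOR0A0);

    // Canned preset: out = texture colour * vertex colour.
    GX_SetTevOp(GX_TEVSTAGE0, GX_MODULATE);
}
```

GX_MODULATE is just a preset; GX_SetTevColorIn/GX_SetTevColorOp expose the general per-stage (a, b, c, d) combine, and chaining up to 16 such stages is how "various effects" get programmed on this hardware.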

That's probably due to the fact that the GPU is not a RISC processor but a CISC processor. Most modern processors are RISC; it's the present and the future, while other processors are CISC. That might explain some of the limitations of the console, as the Wii's processors are (most probably) CISC. RISC processors have always been better than CISC processors.

I don't think RISC or CISC is really something that can be applied to a GPU; the Wii's system processor, however, is RISC based. The limitations of the Wii are simply due to the fact that it's a very inexpensive system.
 
The Wii is more expensive than the X360 to the consumer :p but yes, it's inexpensive to make. But another reason is that it's basically GameCube hardware, which allowed backwards compatibility and meant not having to learn a new system to develop on.
 
No need to sell at cost or at a loss when it sells so well; MS would do the same if they could :D Wonder how much money Nintendo are actually making per system.
 
It seems it has been a few years since anyone posted in here, but now that we've seen some good use of HDR on the Wii...

[image: 958247_20090601_screen005.jpg]


I wanted to know just to what extent it could do it.

Also, I found this picture on the internet that points to HDR in Red Steel. It had some very impressive lighting effects, but I haven't heard anything about HDR in it (there was never really a lot of talk about the game). Is this HDR?

[image: wiing1.jpg]
 
Bloom != HDR.

That is not what I asked.

I'm well aware that bloom is not HDR, or else I wouldn't have been asking such a question to begin with. My question pertained to a specific thing, i.e. the picture of Red Steel, which was answered already.

There is still the other question I asked: how well can the Wii do HDR? And yes, that is HDR in the first picture, in case you didn't know.
 
How do you know?

Because its developers said it. The Extraction engine used HDR on the Wii, and the Athena engine did as well.
Also, what do you mean by "How do you know?" Have you never looked into this yourself?

I even checked myself. You can see areas in the same vicinity using bloom and volumetric lighting, as if they were put there for no other reason than for people to be able to distinguish them.

http://www.youtube.com/watch?v=w8M_fnz6yjM

How do you make videos appear in the post?

Whether or not it can do it is no longer a question for me. What I want to know is to what extent, as in how much else it can do while using HDR.

This game is supposed to have it as well, but I can't quite pinpoint where.

http://t3.gstatic.com/images?q=tbn:.../uploads/2009/08/cursed_mountain_t-23.jpg&t=1
http://admintell.napco.com/ee/images/uploads/gamertell/cursedmountain1.jpg
http://i46.tinypic.com/33x8tpz.jpg
http://wiimedia.ign.com/wii/image/article/952/952682/cursed-mountain-20090209000326227_640w.jpg

It must certainly be in one of those pictures, as the lighting in each is distinctly different. I am unable to discern bloom from HDR at just a glance, so I need someone with a better eye for these things to check. I have not played the game, so for all I know none of these pictures could be where it was done.
 
Because its developers said it.
That would be good proof, although Googling, I can't find any comments to corroborate it. Checking the video you linked to, I don't see any particular evidence of HDR and tone-mapping, so I'm wondering if the comments you read were talking about the look of the game rather than its technical features? That wouldn't be the first time a developer has been inaccurate in describing the technical workings of their game. I'd have to see your reference.
Also, what do you mean by "How do you know?" Have you never looked into this yourself?
This game? No. The reason I ask is because many people look at 2D images and decide from the look if it's HDR or not, which is not possible. If that's what you were doing, and HDR wasn't implemented, then the basis of your theory would be unsound. If HDR is confirmed, then that gives a starting point to explore HDR on Wii.

How do you make videos appear in the post?
Surround the YT identification code, w8M_fnz6yjM, in media tags.
Looking at that, as I say above, I don't see any evidence of HDR. The high-contrast lighting can be achieved with LDR buffers as in the last generation. I would expect an HDR engine to have dynamic tone mapping that adjusts exposure. Now, that isn't conclusive in itself; the inner workings could be HDR, or could be LDR with simulated HDR lighting effects like bloom. Have you any clips of a Wii title that changes brightness dynamically? Although even then, as GT3 on PS2 demonstrated, you can get pretty convincing HDR-type effects on LDR with clever artistic use of the hardware! You really need some technical reference, or a strong example like a huge exposure change that preserves detail from dark to bright environments.
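
To illustrate what I mean by dynamic tone mapping, here's a hypothetical sketch in C. The function, the adaptation rate, and the 0.18 middle-grey constant are all made up for illustration, not taken from any Wii title; it just shows exposure tracking scene luminance and remapping HDR values into the displayable range.

```c
// Hypothetical auto-exposure tone mapper: the kind of behaviour you'd
// expect an HDR engine to exhibit, not any shipping implementation.
#include <math.h>

static float adapted_lum = 0.5f;  // smoothed scene luminance ("eye" state)

// Map one HDR value to [0,1), given this frame's average scene
// luminance and the frame time in seconds.
float tonemap(float hdr_value, float avg_scene_lum, float dt)
{
    // Adapt gradually, like eye/camera auto-exposure.
    adapted_lum += (avg_scene_lum - adapted_lum) * (1.0f - expf(-dt));

    // Bright scenes get exposed down, dark scenes up (0.18 = middle grey).
    float exposure = 0.18f / adapted_lum;

    // Simple exponential curve: output always lands in [0,1).
    return 1.0f - expf(-hdr_value * exposure);
}
```

A dark-to-bright transition run through something like this would visibly re-expose over a second or so while preserving highlight detail; that's the tell-tale I'd look for in footage.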

Whether or not it can do it is no longer a question for me.
Well, presently I'm still not convinced HDR (at least, HDR data formats) on Wii is proven. ;)

This game is suppose to have it as well but I can't quite pinpoint where.
Well, you'll have trouble, as HDR can't be discerned from stills. Every HDR scene is turned into a standard 24-bit colour image for viewing, so you have no way of knowing what data structures were used to create those stills. Actually, now I think about it, wouldn't colour posterisation on otherwise smoothly graduated specular highlights be indicative of tone-mapped HDR RGB?
 
Blown-out highlights in an image don't really mean anything other than artistic direction. Yes, they could be indicative of HDR, but that doesn't mean it actually is HDR.
 
Neither Red Steel nor Dead Space used HDR. I heard The Conduit used it. There are also Cursed Mountain and Sonic and the Black Knight, which used it.
 
Have we any decent references for what has and hasn't got HDR? More than just a dev saying they use it; a reference to what format. They could be using a clever, extended use of LDR to get HDR effects without actually having access to an HDR framebuffer format. As far as I'm aware, that's not possible on Wii. It wasn't on GC, and they don't have the programmability of hardware to fudge their own the way we do now in SM3+. There's the possibility of using multiple textures for different intensity levels, but I don't know what combiner options TEV has to try something like that.
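
For what it's worth, here is a purely speculative sketch of that multiple-textures idea in libogc TEV terms. The texmap assignments and the quarter-intensity authoring convention are assumptions for illustration, not a known shipping technique.

```c
// Speculative: textures authored at quarter intensity, recombined with
// a 4x output scale so the sum has headroom above "paper white" before
// the final clamp. TEXMAP0 = base colour, TEXMAP1 = overbright term.
#include <gccore.h>

void setup_split_range_tev(void)
{
    GX_SetNumTevStages(2);

    // Stage 0: pass the base texture straight through.
    GX_SetTevOrder(GX_TEVSTAGE0, GX_TEXCOORD0, GX_TEXMAP0, GX_COLORNULL);
    GX_SetTevOp(GX_TEVSTAGE0, GX_REPLACE);

    // Stage 1: out = (prev + tex1) * 4, clamped at the end.
    GX_SetTevOrder(GX_TEVSTAGE1, GX_TEXCOORD0, GX_TEXMAP1, GX_COLORNULL);
    GX_SetTevColorIn(GX_TEVSTAGE1, GX_CC_ZERO, GX_CC_TEXC,
                     GX_CC_ONE, GX_CC_CPREV);
    GX_SetTevColorOp(GX_TEVSTAGE1, GX_TEV_ADD, GX_TB_ZERO, GX_CS_SCALE_4,
                     GX_TRUE, GX_TEVPREV);
}
```

The catch is the final clamp: everything still lands in an 8-bit framebuffer, so this buys combine-time headroom rather than an HDR output format, which is rather the point above.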
 
"HDR" isn't exactly the most well-defined term. A lot of people tend to define it in terms of the common implementations employed by modern games (eg. floating point render targets, bloom/glare, post-process tone mapping, etc.), rather than what it's actually trying to accomplish. It's really just about having your lighting simulation and art content go beyond a standard mapping of [0,1] to the monitor's displayable range, which can be accomplished in a lot of ways. Obviously expecting a Wii game to have a similar HDR pipeline to a PS3 or PC game is unrealistic, for a lot of reasons. Bu there are definitely a few techniques from the PS2/Xbox era are well-documented, and certainly could be pulled off on the Wii. Or one of these developers could have come up with some other clever approach.
 
That's all true, and I certainly tried to allude to as much. IMO though, "HDR" as used means (or at least implies) a coherent range, rather than, say, a split lighting range with extended brightness at the top end. Perhaps it should be called Extended Dynamic Range when not using a coherent full range of values, to differentiate? So where [0...1000] would be HDR, [0...200 + 900...1000] would be EDR.

Also,

It's really just about having your lighting simulation and art content go beyond a standard mapping of [0,1]

...[0,1] can be HDR. It all depends on what you set 1 as, in terms of relative brightness. It comes down to tone mapping a sub-range to the standard range of a limited display device. HDR in terms of float values is all about increasing data resolution so you don't get black or white crush, or posterisation. However, HDR float formats do make a lot of optical effects a lot easier to implement and balance, streamlining pipelines, which, as you say, is what HDR is really about.
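
To make that HDR-vs-EDR distinction concrete, here is a toy C encoding (entirely hypothetical; the band boundaries just reuse the numbers above) that packs the split range [0..200] plus [900..1000] into 8 bits, instead of covering [0..1000] coherently.

```c
// Toy "EDR" packing: spend 200 codes on the low band and 56 on the top
// band; anything in the unused middle clamps to the low band's ceiling.
unsigned char edr_encode(float v)
{
    if (v <= 200.0f)          // low band: 0..200 -> codes 0..199
        return (unsigned char)(v * 199.0f / 200.0f);
    if (v >= 900.0f)          // top band: 900..1000 -> codes 200..255
        return (unsigned char)(200.0f + (v - 900.0f) * 55.0f / 100.0f);
    return 199;               // gap: values the format can't represent
}
```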
 