Halo 3 IQ discussion - Stay civil and polite, folks!

I think a lot of people are confused and think there's a bigger difference between 640p and 720p than there really is. It is much, much better than SD. Just to illustrate:

[Image: DifRes2.jpg]


Before I resized:

http://img.photobucket.com/albums/v333/hardknock/DifRes.jpg

Given that it's being upscaled to 720p, I'd reckon nobody would have been able to tell the difference between it and a native 720p image, given Halo's art direction (not-so-detailed textures).
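As a quick sanity check on the "much better than SD" claim, here's a back-of-envelope pixel count in Python. The 1152x640 render target for Halo 3 is the widely reported figure, assumed here rather than confirmed in this thread:

```python
# Back-of-envelope pixel counts. The 1152x640 figure for Halo 3 is an
# assumption (the widely reported one), not confirmed in-thread.
resolutions = {
    "480p SD":       (640, 480),
    "Halo 3 (640p)": (1152, 640),
    "720p HD":       (1280, 720),
}

base = 1280 * 720  # 720p pixel count as the reference
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:14s} {pixels:>9,} pixels  ({pixels / base:.0%} of 720p)")

# 480p SD          307,200 pixels  (33% of 720p)
# Halo 3 (640p)    737,280 pixels  (80% of 720p)
# 720p HD          921,600 pixels  (100% of 720p)
```

So 640p carries roughly 80% of 720p's pixels, but well over double SD's.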
 
Yes, like AA and AF. HD has typically been defined as 720p and above, unless the PR peeps are inventing new definitions for decade-old terms. Maybe the console guys need to stick to "Nextgen" if they can't get to at least 720p to use the term "HD".

HD is generally defined as 720p/1080i or above because there is no standard HD format that sits between 720p/1080i and ED resolutions.

There is no well-defined standard range of what constitutes HD, only three standard resolutions that revolve around broadcast signals and TV specs.

Just because H3 is not rendered at 720p doesn't mean you can't call it HD.
 
The lowest resolution I'd consider HD would be whatever bogus res CRT HDTVs used, because none of those actually hit their claimed resolutions. But I tend to stick with the hard and fast 720p/1080i and above as HD. While the jaggies might drive me crazy in the game, I still think it looks good. And while I'm all for jaggies going away, my biggest pet peeve in video is banding... but that would send me on a rant :smile:
 
I refuse to believe in reality! :devilish:

Thanks Hardknock.


Hm... I guess that (using the scaler) would still be a better choice than 480p & 2xMSAA for SD displays.
 
At the end of the day, I'm OK with whatever decision the devs make to create the best-looking, best-performing game. I really don't care that much if they decide not to hit some arbitrary marketing bullet point when hitting it would compromise their vision.

I mean, it seems like the devs on this forum say all the time that they generally hate PR people. It seems like too many of you are seduced by PR/marketing promises, and then you start hating on the content creators when they don't fulfill promises they themselves never made.
 
You cannot fit a 720p framebuffer with FP16 in EDRAM without tiling.
For some reason Bungie didn't want to, or couldn't, use tiling.
That's it.
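A minimal sketch of why, under some assumed numbers (FP16 RGBA colour at 8 bytes/pixel plus a 4-byte depth/stencil buffer, no MSAA; the exact buffer layout is an assumption, not something stated in this thread):

```python
# Why a 720p FP16 target doesn't fit in 10 MB of EDRAM. Assumptions:
# FP16 RGBA colour buffer = 8 bytes/pixel, depth/stencil = 4 bytes/pixel,
# no MSAA, 1 MB = 1024**2 bytes.
EDRAM_BYTES = 10 * 1024**2

def footprint(width, height, color_bpp=8, depth_bpp=4):
    """Total EDRAM bytes for one colour buffer plus one depth buffer."""
    return width * height * (color_bpp + depth_bpp)

for w, h in [(1280, 720), (1152, 640)]:
    size = footprint(w, h)
    verdict = "fits" if size <= EDRAM_BYTES else "needs tiling"
    print(f"{w}x{h} FP16 colour + depth: {size / 1024**2:.2f} MB -> {verdict}")

# 1280x720 FP16 colour + depth: 10.55 MB -> needs tiling
# 1152x640 FP16 colour + depth: 8.44 MB -> fits
```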
 
I think a lot of people are confused and think there's a bigger difference between 640p and 720p than there really is. It is much, much better than SD. Just to illustrate:

Maybe you should define (0,0) the same for all three resolutions; you minimized the difference by distributing the extra 720p pixels around all four sides, and using black also adds to the illusion.

Edit: Bohdy did it, thanks.
 
H3 is still considerably higher res than SDTV, and in that respect can be considered HD. Also, regarding the HD marketing, the PR peeps were pushing home the idea that HD wasn't just higher resolution, but everything being better. Thus a game that's better than last gen in every way other than resolution could be attributed the HD moniker.
Wow, I hope that definition is not going in the B3D glossary :) (might wanna stick in a definition for 'actual ingame screenshots' while you're at it).
So 640p is HD? Come on, seriously people, you can't keep shifting the goalposts.
Here's a couple of wiki pages for HD:
http://en.wikipedia.org/wiki/High-definition_video
http://en.wikipedia.org/wiki/High-definition_television
You might wanna correct the entries there.
 
A lot of unscrupulous flat-screen vendors passed off XGA panels as HD back in the day.

Not saying this is anywhere near that bad. But let's face it, the console companies tried to piggyback on HD marketing -- HDTV sales are still seeing double-digit growth.

If they hadn't, people wouldn't care whether this or any other game met the HD threshold.
 
I'm quite happy companies are allowed to opt for lower resolutions to keep the lighting and framerate. I would have preferred some AA and AF, though. Sure, the resolution is disappointing, no doubt, but in the end I don't much care; the game is wonderful and looks absolutely spectacular in many places.

Games should be judged on how they look and play, not some technical detail or marketing point a console manufacturer made years ago.
 
If Halo 3 can't reach HD then we bring HD to it! :LOL: How about, instead of trying to redefine an existing term, we just invent a new one: 'near-HD'. What do you say, fellas?
The lowest resolution I'd consider HD would be whatever bogus res CRT HDTVs used, because none of those actually hit their claimed resolutions. But I tend to stick with the hard and fast 720p/1080i and above as HD. While the jaggies might drive me crazy in the game, I still think it looks good. And while I'm all for jaggies going away, my biggest pet peeve in video is banding... but that would send me on a rant :smile:
CRT HDTVs are most definitely not HD. With a few rare exceptions, most notably Sony's SuperFine Pitch CRT HDTVs, most CRT HDTVs have a horizontal resolution of around 850 and take a 1080i signal vertically. And due to the native interlace effect you're only seeing about 800-900 lines out of that 1080i. There are some 480p EDTV displays with an almost comparable resolution, so CRT HDTVs are more like EDTV+. I have a Sony Trinitron HDTV and I don't mind not getting the full HD resolution; the beautiful colors, great contrast, and resolution scalability for classic console gaming more than make up for it.
 
All this 640p debacle could've been avoided if MS had used 12 MB of EDRAM instead of 10 MB. Then three 1280x720 buffers could fit quite comfortably without any need for tiling. These buffers could be used either for 2x MSAA or, as in Bungie's case, for HDR rendering.
Otherwise, with the current 10 MB, most developers opt for only two 720p buffers, or if they want MSAA they go for 640p. It seems nobody is going for tiling, not even MS's premier developer Bungie!
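For reference, a quick sketch of that arithmetic (assuming 32-bit, 4-byte per-pixel buffers such as FP10 colour or D24S8 depth; the exact buffer setup is an assumption for illustration, not taken from Bungie):

```python
# The poster's arithmetic, assuming 32-bit (4-byte) per-pixel buffers,
# e.g. FP10 colour or D24S8 depth. 1 MB = 1024**2 bytes.
def total_mb(width, height, n_buffers, bytes_per_pixel=4):
    return width * height * bytes_per_pixel * n_buffers / 1024**2

print(f"3x 720p buffers: {total_mb(1280, 720, 3):.2f} MB")  # 10.55 -> needs 12 MB
print(f"2x 720p buffers: {total_mb(1280, 720, 2):.2f} MB")  #  7.03 -> fits in 10 MB
print(f"3x 640p buffers: {total_mb(1152, 640, 3):.2f} MB")  #  8.44 -> fits in 10 MB
```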
 
If Halo 3 can't reach HD then we bring HD to it! :LOL: How about, instead of trying to redefine an existing term, we just invent a new one: 'near-HD'. What do you say, fellas?

CRT HDTVs are most definitely not HD. With a few rare exceptions, most notably Sony's SuperFine Pitch CRT HDTVs, most CRT HDTVs have a horizontal resolution of around 850 and take a 1080i signal vertically. And due to the native interlace effect you're only seeing about 800-900 lines out of that 1080i. There are some 480p EDTV displays with an almost comparable resolution, so CRT HDTVs are more like EDTV+. I have a Sony Trinitron HDTV and I don't mind not getting the full HD resolution; the beautiful colors, great contrast, and resolution scalability for classic console gaming more than make up for it.

Stop following me! :p The colors on CRT are what typically make me prefer component over HDMI (even though I am primarily HDMI now). There are definitely days where I miss my old Samsung HDTV; not the greatest in the world, and it probably only gave me about 825 lines, but man, the colors... they were awesome. Now, since joker's post and the hdguru.com review of set types, I think I have just sworn off LCD TVs. To bring this back around to Halo IQ: I need a new, larger TV so I can see how that affects my judgement of the IQ (as if I care, I just needed to mention Halo in the post to keep it from being deleted!).
 
You cannot fit a 720p framebuffer with FP16 in EDRAM without tiling.
For some reason Bungie didn't want to, or couldn't, use tiling.
That's it.
FP16 has no blending support either. That's reason enough not to use it.

The thing is that there doesn't seem to be a good reason for using two framebuffers, according to their slides. Rendering using just FP10 should be good enough. However, they wanted a little more range, and were very rigid about it instead of either flagging extra-bright objects or using a higher-contrast tone map. From what I can tell from their graphs, they wanted a ratio of 5000:1 between the light inputs that would map to fully lit and darkest grey on a TV, and then a factor of 32 (5 stops) of headroom on top.

Why so rigid about this requirement? Cameras have only a 300:1 or 400:1 ratio between the whitepoint and near-black, and going way beyond this can make images lose some punch. If they used that, there's almost 5 stops of headroom left in FP10.
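To put those ratios into photographic stops, here's the arithmetic as a sketch; the 5000:1, 32x, and 300:1 figures are taken from the posts above, the rest is just log2:

```python
from math import log2

# Contrast ratios expressed as photographic stops: stops = log2(ratio).
display_range   = 5000  # fully lit : darkest grey, per Bungie's graphs
headroom_factor = 32    # extra range above white (5 stops)
camera_range    = 300   # typical camera whitepoint : near-black

print(f"5000:1 display range: {log2(display_range):.1f} stops")   # ~12.3
print(f"32x headroom:         {log2(headroom_factor):.1f} stops") #  5.0
print(f"300:1 camera range:   {log2(camera_range):.1f} stops")    #  ~8.2

# Relaxing 5000:1 to a camera-like 300:1 frees up:
print(f"Freed: {log2(display_range / camera_range):.1f} stops")   # ~4.1
# ...which is roughly the 'almost 5 stops of headroom left in FP10'.
```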
 
Maybe it would be clearer with the photos they so neatly deleted from those slides... :???: I have the game, but I wouldn't know what to keep an eye out for that would require that much range. Going from inside to outside in one of the early levels with the Warthog? The Scarab explosion?
 
Wow, I hope that definition is not going in the B3D glossary :) (might wanna stick in a definition for 'actual ingame screenshots' while you're at it).
So 640p is HD? Come on, seriously people, you can't keep shifting the goalposts.
Here's a couple of wiki pages for HD:
You might wanna correct the entries there.
:p Okay, if you don't rate 1152x640 as HD, what do you call it? SD? 'HD' and 'SD' are binary badges, true or false, but resolution is on a sliding scale. If instead of calling H3's 640p High Definition we call it high definition, without the capitals, then we are describing it with a valid identifier.

Consider game scores. If we label any game with a score of 90 or more a Good Game and stick a shiny silver GG badge on the front to tell everyone, and we consider a game with 50 or more an Okay-Sorta-Game and stick an OSG label on the front, what do you call a game that scores 89? Is that just an okay-sorta-game, much like a 50%er? Or is it a good game that's being cramped by the labelling system? How about cars: if 150 MPH is considered an FC (Fast Car) and everything else is considered an SC (Slow Car), an 80 MPH top-speed motor wears the same SC moniker as a 140 MPH one. Would you say the 140 MPH car is just as slow as the 80 MPH car?

The HD entries in Wiki are regarding TV standards. TV needs standards for broadcasting, display, and storage, but computer displays, where the computer can output a range of resolutions, don't. How much is 'HD' used on PC displays? Is 640x480 HD? Is 1024x768 HD? What about 1280x1024? Or 1600x1200? Or 2560x1600? Resolution is variable, and from a technical POV it's naive and ignorant to group it into broad and disparate terms.

Instead of answering the question 'how long is a piece of string?' with either 'short' or 'long', doesn't it make more sense to answer with a measurement in cm/inches? Instead of answering the question 'what resolution does your game render at?' with either 'SD' or 'HD', doesn't it make more sense to answer with the measured resolution?
 