Stupid question alert! Why HD?

FireGoblin

Newcomer
This might sound like a really stupid question, but why do we need HD (720p) in order to make games look better?

I was watching the Japanese Grand Prix at the weekend on my SD Plasma and it looked lifelike (funny that!!), so why do the next generation console games (think GPR3) have to use HD in order to make them look more realistic?

Can't we just use the additional 'power' of 360/PS3 to process more lifelike images in SD?

Like I said, probably a stupid question (and I'm not putting down GPR3, which looks ace) but it just struck me as odd sitting watching an SD television transmission which was more lifelike than a demo HD video on my laptop!

Is the processing power required to produce 'lifelike' SD images more than that required to make 'graphic-ey' HD images, or maybe Nintendo have got a point!
 
If you do a search you can find a couple of long threads discussing this. That might be a better place to continue this discussion than starting a new thread :D
 
They do indeed - you can make realistic images at any size. However, by moving to a high-definition format we effectively get more on screen at once (not totally true) and can have more active detail: polygons below the pixel level are "lost" or amalgamated together, and similarly, if we want to reduce aliasing without using AA, more pixels = a better approximation of lines. Besides, PC gaming has easily been at 1280x1024 for years; consoles moving up to this level is good because it allows computers to easily treat a TV as another monitor... and more monitors is always good!
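Just to put rough numbers on the "more pixels = better approximation of lines" point, here's a minimal back-of-the-envelope sketch (plain Python; the resolutions are simply the ones mentioned in this thread, and the "samples across an edge" figure is only a crude proxy):

```python
# Rough pixel-count comparison for the resolutions mentioned in this thread
# (illustrative only -- actual console output formats vary).
resolutions = {
    "SDTV (480)": (640, 480),
    "HD (720p)":  (1280, 720),
    "PC monitor": (1280, 1024),
}

base_w, base_h = resolutions["SDTV (480)"]
base_pixels = base_w * base_h

for name, (w, h) in resolutions.items():
    pixels = w * h
    # A near-horizontal edge crossing the screen is sampled once per column,
    # so the horizontal count hints at how smoothly that edge can be drawn.
    print(f"{name:12s} {w}x{h} = {pixels:>9,} pixels "
          f"({pixels / base_pixels:.1f}x SD), {w} samples across a screen-wide edge")
```

720p works out to exactly 3x the pixels of 640x480, which is where the better approximation of lines comes from even before any AA is applied.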
 
MrSingh, why are you posting that link? It has nothing to do with the topic at hand, and is not even video game related.

FireGoblin, I have a 1280x768 monitor (LG L172WT LCD) for playing PC games, and playing games in widescreen makes such a huge difference I would never want to go back. Widescreen is just a more natural perspective for our eyes.
 
I have to agree that the widescreen profile should offer some nice perks, but the real reason I came to post was that I think people are generally moving to larger and larger screen sizes. So there is the impetus to desire more resolution to go along with the screen size. Going to 40/50/60" screen sizes, I think HD resolution really comes into play to keep image quality looking decent. Possibly the true IQ connoisseur will want to keep to 30"-ish at HD resolutions to really capitalize on the "hi-rez" look. I would hazard that SD is still adequate out to 30" or so for the average person. Then there's the PC crowd that is quite used to very high-dpi conditions, so there really isn't any screen size/HD res. combo that would satisfy them, other than the computer monitor sizes and ultra hi-res settings that they already play PC games on.
 
Even on a 17" monitor, 960x720 is going to look quite a bit more detailed than 640x480 to anyone with decent eyesight and within reasonable viewing distance of the screen. However, fine detail simply doesn't render well at low resolution and winds up with parts blinking in and out of the picture, as can be seen in most any current console game. So to allow a finer level of graphical detail throughout the whole game, developers are targeting higher rendering resolutions.

Granted, fine detail like power lines and the like can be cleaned up with various forms of anti-aliasing, and some games simply don't need that level of detail at all, so I tend to agree that Microsoft's push for 1280x720 could be a generation too early. But I do respect their intent and am sure we will see plenty of pretty graphics on the console. On the other hand, I am truly excited to see what the Revolution can pull off at 480p, and curious to see how many PS3 developers will target that resolution as well.
 
I have a 55" HDTV and playing PS2/XBOX @640x480 is just plain bad: jaggies, shimmering, etc.... a very washed-out 16bpp look and feel..... YUK!

Hooking my computer to my HDTV is amazing! The games look fantastic @1280x1024.

Xbox 360 and the PS3 are my next *toys* and @1280x720 (and above) it's going to be fun!


640x480 even makes baby Jesus cry.
 
Naturally it's better to have the source be as high in detail as possible but another huge consideration is the quality of the scaler in your display (or elsewhere in your HT chain).
 
EricVonZipper said:
I have a 55" HDTV and playing PS2/XBOX @640x480 is just plain bad: jaggies, shimmering, etc.... a very washed-out 16bpp look and feel..... YUK!
It's your punishment for having so much money that you could buy a 55" screen. ;)
 
There's a BIG difference in image quality between 640x480 on a console and 640x480 on a PC monitor. Look at a TV picture, a photo at 640x480 (ish), and it looks like a photo. On a PC it looks like a collection of coloured squares in the pattern of a photo (okay, slight exaggeration on my part :D). The fidelity of the pixels makes the image look artificial. The lack of fidelity on an SDTV screen helps produce an image that's easier on the eye. So a photorealistic console producing TV-quality images would look okay on a TV, but a PC producing TV-quality images would need higher resolution to look reasonable.
 
FireGoblin said:
This might sound like a really stupid question, but why do we need HD (720p) in order to make games look better?

I was watching the Japanese Grand Prix at the weekend on my SD Plasma and it looked lifelike (funny that!!), so why do the next generation console games (think GPR3) have to use HD in order to make them look more realistic?

Can't we just use the additional 'power' of 360/PS3 to process more lifelike images in SD?

Like I said, probably a stupid question (and I'm not putting down GPR3, which looks ace) but it just struck me as odd sitting watching an SD television transmission which was more lifelike than a demo HD video on my laptop!

Is the processing power required to produce 'lifelike' SD images more than that required to make 'graphic-ey' HD images, or maybe Nintendo have got a point!

I think the first flaw in your argument (well, question) is the assumption that fidelity equals realism in every case, which is simply not true.
 
Azrael said:
I think the first flaw in your argument (well, question) is the assumption that fidelity equals realism in every case, which is simply not true.

I think my first flaw is my brain, which is very small!

Nah, when I posed the initial question I was trying to think of the correct 'terms' to use, which have since come out in the follow-up posts - fidelity, realism etc. - but decided on 'lifelike' as a catch-all term.

Some very pertinent points have come up - especially increasing screen size etc, but what I was trying to ask was best posed by your post. Assuming fidelity drives screen resolution and realism drives image processing, which is more important for accurate game representation and which is easier to implement? Would people prefer a HD version of Splinter Cell, or a James Bond movie that they can control (as a simple example)?

Or to put it another way, hypothetically, what sort of increase in CPU/GPU power would be needed to get a realtime 480p output that imitates a live video feed? Is it GPU-based visual 'effects', or CPU-based 'physical' laws (Newtonian laws etc.), that make up that 'magical' quality that fools the brain into thinking we are watching live video?
 
Not sure this was covered, but I completely missed this thread, which is actually very interesting.

My opinion on this is that it's "easier" to increase resolution to get "better looking" images than to increase the level of detail to match, say, a real-life DVD image, which at 640x480 still looks more "real" than any computer-generated 1600x1200 image.

To match, say, LOTR even at DVD resolution will take decades, regardless of the resolution (within certain boundaries; obviously no one would want LOTR at 320x240).

Increasing resolutions gives us a "quick fix" for the time being. It makes images smoother, cleaner, and it gives us time to focus on the detail instead.

Also, resolution only tells one part of the story. You can have a 3200x2400 resolution, but if it's on a HUGE screen, it will look just as low-res as 640x480 on a smaller screen. DPI gives a more accurate measure of the kind of "jaggies" or other resolution-related precision flaws we see.
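To put a number on that, here's a minimal sketch of the usual pixels-per-inch calculation (plain Python; the screen sizes and resolutions are just illustrative picks from this thread, and this ignores viewing distance entirely):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal screen size."""
    return math.hypot(width_px, height_px) / diagonal_inches

# Illustrative combinations from the discussion above.
setups = [
    ("SDTV, 30\"",   640,  480,  30.0),
    ("SDTV, 55\"",   640,  480,  55.0),
    ("720p, 55\"",   1280, 720,  55.0),
    ("PC LCD, 17\"", 1280, 1024, 17.0),
]

for name, w, h, diag in setups:
    print(f'{name:14s} {w}x{h} on {diag:.0f}" -> {ppi(w, h, diag):5.1f} ppi')
```

Those numbers line up with the gut feeling above: 720p on a 55" set gives roughly the same pixel density as SD on a 30" set, while a 17" PC LCD is several times denser than either.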
 
FireGoblin said:
Or to put it another way, hypothetically, what sort of increase in CPU / GPU power would be needed to get a realtime 480p output that imitates a live video feed? Is it GPU based visual 'effects', or CPU based 'physical' laws (newtonian laws etc) that make up that 'magical' quality that fools the brain into thinking we are watching live video?

In order to generate a convincing "real" picture, you would have to generate/emulate a similar level of detail (geometrically as well as in terms of light interaction) to the real world. Doing that would probably require rendering internally at a significantly higher resolution in order to capture all these details and filter them down.

To put it another way, the reason F1 looks so good even on SD is that it starts with a very high quality source (i.e. the real world) and downsamples it to the capability of your display. To get the same quality from a computer source, the same approach would be appropriate - use a highly detailed world, great shaders, and render a hi-res image. In *both* cases the image will look better still if it is displayed at a higher resolution such as HD.
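As a rough illustration of that "start high-detail, filter down" idea, here's a minimal supersampling sketch (plain Python with NumPy; the gradient "renderer", the 2x factor and the box filter are just stand-ins for illustration, not how any particular console actually does it):

```python
import numpy as np

def render_scene(width: int, height: int) -> np.ndarray:
    """Placeholder for a renderer: returns an RGB image as a float array."""
    # A simple colour gradient stands in for whatever the GPU actually draws.
    y, x = np.mgrid[0:height, 0:width]
    return np.stack([x / width, y / height, np.full_like(x, 0.5, dtype=float)], axis=-1)

def supersample(target_w: int, target_h: int, factor: int = 2) -> np.ndarray:
    """Render at factor x the target resolution, then box-filter down."""
    hi = render_scene(target_w * factor, target_h * factor)
    # Average each factor-by-factor block of pixels into one output pixel.
    hi = hi.reshape(target_h, factor, target_w, factor, 3)
    return hi.mean(axis=(1, 3))

frame = supersample(640, 480)   # SD output built from a higher-res internal render
print(frame.shape)              # (480, 640, 3)
```

The point is just that the final 640x480 frame is built from more information than 640x480 samples - which is what the F1 broadcast gets for free from the real world.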

There's no reason *not* to have HD.

It's not a case of *needing* HD, but that it just plain makes sense as a good way of improving picture quality (others would be HDR-capable displays, higher frame rates, more reliable colour reproduction... but higher resolution is an easy one!)

In the coming generation I don't think you'll be frozen out of being able to play anything just because you don't have an HD screen. But it won't be nearly as good an experience and that's really "why HD" is good for you.
 