[BRAND NEW] High-Def Three Minutes MGS4 Trailer

Shifty Geezer said:
It only adds fidelity, not quality.

In a nutshell! Perfect! That is a wrap!

e.g. at Wimbledon this year I remember a ballboy's face being just a pinkish smudge. Rendered in only a handful of pixels on the TV screen, there was no room for facial detail. HDTV would add a lot of clarity and detail. But overall no-one would complain if a tennis game looked as good as a live broadcast, and indeed everyone would prefer that, even with faces blurred out by low resolution, than HD fidelity with jaggies, simplistic crowds, and unrealistic shaders.

And what is the point of HD if that pink smudge looks like this with the detail cranked up:

suspicion.gif


The quality has to be there for the resolution to be a significant benefit. Clearing things up is good, but not end of the world in many cases either.
 
I have to say, Shifty found the perfect example. These days, watching Wimbledon, the quality of the broadcast is sometimes SO bad that it's hard to see the ball travelling at those speeds, all because it's a small yellowish blur over a blurry greenish terrain.

Wimbledon seen in 720p will be bliss to my eyes. The detail will show very well on good HDTVs.
 
Acert93 said:
The quality has to be there for the resolution to be a significant benefit. Clearing things up is good, but not end of the world in many cases either.

That's why it's good that these games are being made with great emphasis on High Definition. The games should be made to show high detail. Also, when you think about it, there's no real disadvantage to HD (for the end user). Either someone with an HDTV gets really good fidelity, or a user with a standard TV gets a great amount of AA. Win-win situation.

I'm personally waiting for the 360, not only for the games, but to finally really utilize my Samsung HDTV (my father also has a 60" Sony HDTV; that's going to be fun :) ). Playing GT4 (I know it's not TRUE 1080i), I got spoiled by the sharpness of the image quality. Just to test it out, I went back to 480i... and my eyes really disliked the lack of detail and vibrancy. All in all HD is good, and I'm happy people in both the HD and non-HD worlds get benefits from it.
 
london-boy said:
I have to say, Shifty found the perfect example. These days, watching Wimbledon, the quality of the broadcast is sometimes SO bad that it's hard to see the ball travelling at those speeds, all because it's a small yellowish blur over a blurry greenish terrain.

Wimbledon seen in 720p will be bliss to my eyes. The detail will show very well on good HDTVs.

What, tennis is played with a ball? I always thought it was just two clowns swinging their rackets asynchronously... you're telling me that blur is a ball? :oops: :LOL:

I WANT HD in Europe (and I want the HD direct feed of MGS4, NOW!! :oops: )! :cry:

And yes, GO! Federer! (no don't respond to this, I don't want to derail the topic further...hehe)
 
london-boy said:
Wimbledon seen in 720p will be bliss to my eyes. The detail will show very well on good HDTVs.
Wimbledon will never be bliss as long as no Briton has a chance of winning, and Henman's useless. Even 1080p HDTV won't change that. :cry: Though with a bit of an English winning streak at the moment, maybe that 720p broadcast will give us an edge? :D

Um, and yeah...um...fidelity and colour balance and...technical relevant stuff, man. And black contrast ratios (always worth a mention). 15 thousand to one. Actually that's the odds of a Brit winning Wimbledon and not the contrast ratio at all...
 
If you don't have an HDTV, don't talk about how it's not needed. You have no idea how fugly and blurry SD broadcasts look after watching significant amounts of HD.

The US Open was SWEET in HD. You could almost see the individual sweat glands on Agassi's face. :)
 
God, don't give me this mental image of Agassi. Give me a Russian female player any day. :cool:
 
DemoCoder said:
If you don't have an HDTV, don't talk about how it's not needed. You have no idea how fugly and blurry SD broadcasts look after watching significant amounts of HD.

Really, are such comments called for? You are assuming that people who don't own HDTVs don't have family or friends who do. Your comment may apply to someone who has never seen one, but beyond that it is pretty out of place.

You also seem to have jumped in out of context. No one is saying there are no advantages to HDTV, and we were not even discussing TV broadcasts. I am not sure who you are arguing with.

As Shifty said

Shifty said:
It only adds fidelity, not quality.

The added fidelity on a real-world image is great because the source itself is of high quality. A lot of games are still pretty bare; additional resolution adds clarity to the image but does not always add a lot of detail.

This was being proposed in the context of gamers on SDTVs who will be getting that larger image scaled down. They lose some fidelity, but not to the same degree as, say, a real human like Andre Agassi. Upping the resolution of, say, Medal of Honor still gives you Medal of Honor, and an SDTV image of Agassi still kicks its butt.
 
I think of it like playing PlayStation games on your computer using "Bleem!" The resolution is a lot higher, but it's not going to make them look like PS2 games.
 
Atsim said:
I think of it like playing PlayStation games on your computer using "Bleem!" The resolution is a lot higher, but it's not going to make them look like PS2 games.

Hey, great example, man. I might use that in the future; that's pretty much how I view it too. A game made in HD on an HD console will look better than one on a console that can't display HD, right?
 
darkblu said:

Oh. Why not? Maybe the hype has taken me over, but it seems to me that HD media developed for HD hardware and displayed on an HDTV should look better than non-HD media running on non-HD hardware and displayed on a non-HDTV.

I don't know maybe it's just me.

Disclaimer: I'm not being sarcastic. I really do believe this.
 
I'm not following this at all. How is a movie comparable in any way to how 3D graphics are rendered for a video game? Remember the Myst games? They looked good at a moderate resolution, but you couldn't really interact with the game world. The mathematics of a 2D scene are different from those of a 3D scene. A movie isn't storing geometry detail at all; it's storing light data.
 
mckmas8808 said:
HD media developed for HD hardware and displayed on an HDTV should look better than non-HD media running on non-HD hardware and displayed on a non-HDTV.

what you say would be true iff everything else were equal and only the resolutions differed (both output and source art). this would not be the case in practice, as fillrate, the ultimate output of GPUs, is not a free resource, i.e. it's often a _limiting_ factor.
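The fillrate point above can be put in rough numbers. A minimal sketch in Python, assuming a 640x480 frame for SD output (actual SD modes vary) against a 1280x720 frame for 720p:

```python
# Back-of-envelope sketch: fillrate is fixed per GPU, so the pixel
# count alone decides how much of that budget each pixel gets.
# 640x480 for SD is an assumed figure; real SD output modes vary.
def pixel_count(width, height):
    """Pixels the GPU must fill every frame."""
    return width * height

sd = pixel_count(640, 480)    # 307,200 pixels
hd = pixel_count(1280, 720)   # 921,600 pixels

ratio = hd / sd
print(ratio)  # 3.0 -- at a fixed fillrate, 720p leaves each pixel
              # only a third of the budget an SD frame would get
```

So tripling the pixel count is not free; the same GPU simply spreads the same total output across three times as many pixels.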
 
HD just lets you see what's there better.

So if it's crappy quality, HD will expose it, if it's good quality then HD will make it look even better.

On the current XBOX, HD reveals jaggies everywhere, but at the same time the textures and details on everything just POP out. So it's still better, despite the jaggies.

Looking at the first wave of clips we're getting, these games look so good that I think HD will be required to really see all the details that are being put into these graphics.

I ran a 720p clip of COD on my XBOX the other night and it looked amazing. Maybe I'll try that same clip downscaled to 480p and see how big the difference is... I should really get around to turning that sucker into a DevKit so I can export screenshots for comparison... maybe sometime this week...
 
scooby_dooby said:
Looking at the first wave of clips we're getting, these games look so good that I think HD will be required to really see all the details that are being put into these graphics.

OMG!! Me and Scooby agree on something!! Dum, dum, duuummmm. j/k:D
 
I'm with Scooby; the source quality of most next-gen games will probably be detailed enough per pixel to give HD users a significant boost.
 
mckmas8808 said:
Oh. Why not? Maybe the hype has taken me over, but it seems to me that HD media developed for HD hardware and displayed on an HDTV should look better than non-HD media running on non-HD hardware and displayed on a non-HDTV.
720p has 3x the pixels of SDTV, so you need 3x the horsepower to shade, light, and texture a scene at 720p that you do at SDTV. Graphics cards have a finite amount of horsepower. If a graphics card's total performance is measured in booglefloogs, then a 1,000,000-booglefloog GPU (RSX; Xenos is rated at 1.3 megabooglefloogs, though there's a suggestion they rigged their drivers for the benchmark software :p) can apply about 1 booglefloog of power per pixel at 720p. But on SDTV that same megabooglefloog of power would drive 3 booglefloogs per pixel. That's three times as much quality.

As Acert points out, look at older software vs. new on PC. Old games written to run at 800x600 on lesser hardware can now be run at 1600x1200 on modern hardware, but they still look pants next to a modern game that's applying more shader power per pixel.

Some factors aren't really affected by resolution, such as poly counts for models: the GPU can transform n million vertices regardless of resolution. But shading is dependent on resolution. The end result is a balancing act between complexity of shaders and fillrate-limited effects vs. resolution. Going back to our test-case 1-megabooglefloog GPU, if we drop the resolution to 320x256 we get about 12 booglefloogs per pixel of shading power, but the end result will be such low resolution that even if the lighting and shading were photoaccurate, it'd look either blocky or blurred and be no good. So which would you choose...

1) Photoaccurate, super-low-resolution graphics at 320x256, indistinguishable from a photo
2) Good shaders and lighting, with realism and antialiasing, at 640x512
3) Simplistic shading and lighting, no AA, at 1280x720?

That's the sort of choice render resolution presents, and the difference is between the size of pixels and the rendering quality of pixels. Thankfully next-gen consoles have enough render power to drive quality visuals even at HDTV resolutions and provide supersampling for lower resolutions. But it's still true that if SDTV were targetable on XB360/PS3, more graphics power could be applied per pixel for nearer realism.
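The three options above can be sketched with the same hypothetical unit. The "booglefloog" and the 1,000,000 figure are the post's made-up numbers, not real specs:

```python
# Sketch of the hypothetical booglefloog budget: one fixed GPU
# throughput spread across every on-screen pixel. Both the unit and
# the 1,000,000 total are made-up illustrative numbers.
GPU_BUDGET = 1_000_000  # booglefloogs

options = {
    "1) photoaccurate": (320, 256),
    "2) balanced":      (640, 512),
    "3) HD, no AA":     (1280, 720),
}

for name, (w, h) in options.items():
    per_pixel = GPU_BUDGET / (w * h)
    print(f"{name}: {per_pixel:.2f} booglefloogs/pixel")
# 1) ~12.21, 2) ~3.05, 3) ~1.09 -- same GPU, very different
# per-pixel shading power
```

Same GPU in all three cases; only the choice of output resolution decides how much of its power each pixel receives.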
 
That doesn't explain why current-generation games like The Incredible Hulk run at 720p.

If they really could create a game with 3x the visuals at 480p, why don't they? They were not forced to run at 720p, but chose to, and it's one of the best-looking games ever to come out on the XBOX.

How do you explain that?

Also, Sony has not stated any requirements for devs, so as far as we know PS3 devs can target SD resolutions if they like.
 