Choice of rendering resolution *spawn

1080p at 60fps.


On the PC I'm lucky to have a Sony Trinitron display. I always go for higher resolution over effects. I like a crisp and hard image.


Lower resolution = more of a Vaseline-smeared-over-the-screen effect.
 
I've seen the comparison of Crysis 2 at 1280x720 on PC vs 1152x720 on 360, and the difference is very noticeable, to me at least; the poor AA implementation didn't help either.
You are comparing the PC version to the Xbox 360 version here. Most likely the Xbox version of Crysis 2 has no anisotropic filtering and uses optimized trilinear, while the PC version has proper angle-independent, high quality AF and high quality trilinear filtering. Many console games even use bilinear filtering to save some cycles (usually on further away objects). And it's likely that the PC version's antialiasing filter is also of slightly higher quality (causes less blurring).

For a real comparison, take some PC screenshots at both 1280x720 and 1152x720. You can do this easily by scaling the CryEngine Sandbox editor window to match the desired resolutions. Then scale these screen captures to 1920x1080 using bicubic upsampling in Photoshop (it produces quality very close to the filter used by the Xbox hardware upscaler). Now view these 1920x1080 upscaled images on a native 1080p monitor (or HDTV), and I bet the differences will be almost impossible to notice.
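
If you'd rather script it than do it by hand in Photoshop, here is a minimal sketch of the same procedure using Pillow's bicubic resampling as a stand-in for Photoshop's bicubic filter; the capture filenames are just placeholders:

Code:
# Upscale a 1280x720 and a 1152x720 capture to 1920x1080 with bicubic
# resampling, then compare the results side by side on a 1080p display.
# Pillow's BICUBIC is only an approximation of the Xbox 360 scaler.
from PIL import Image

TARGET = (1920, 1080)

for name in ("capture_1280x720.png", "capture_1152x720.png"):
    img = Image.open(name)
    img.resize(TARGET, Image.BICUBIC).save(name.replace(".png", "_1080p.png"))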

One "problem" in slightly sub HD resolution screenshots are that marketing departments tend to scale them to even 720p. In this case it's obvious that the (1:1) 720p image looks sharper than the upscaled slightly sub HD image. But this never happens when you play the game on your own console at home. Both 720p and 1152x720 are always scaled to your HDTV screen. Either to 1366x768 if you have an HD Ready TV or to 1920x1080 if you have an Full HD TV. Be sure to select the correct TV resolution from your Xbox dashboard. You should never select 720p, because it often causes double scaling, blurring the image a lot (Xbox scales image first to 720p and then your TV scales it to 1366x768 or 1920x1080). Not selecting your native TV resolution from the Xbox settings also introduces extra latency to the image (most TV sets add 1-2 frames of latency if the image must be scaled).
 
Mebibits != megabits

It's referring to 1024*1024 bits (a mebibit) rather than 1000*1000 bits (a megabit).

MiB is mebibytes. MB is megabytes, traditionally used to mean mebibytes before the mebi- prefix existed, and likely here to stay, because anyone who knows what they're talking about regarding computers knows the decimal prefixes have long been overloaded to mean binary quantities. For 10 mebibits, you'd write a lower case b: 10 Mib or 10 Mb. Hence look to the b/B for bits/bytes.
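
For the record, the numbers behind the prefixes are just this bit of arithmetic:

Code:
MEGABIT  = 1000 * 1000   # SI:  1 Mb  = 1,000,000 bits
MEBIBIT  = 1024 * 1024   # IEC: 1 Mib = 1,048,576 bits
MEBIBYTE = MEBIBIT * 8   # 1 MiB = 2**20 bytes = 8,388,608 bits

print(MEBIBIT / MEGABIT) # 1.048576, i.e. ~4.9% larger per "mega" step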

Oh now we're making up words? Mebibits and Mibibytes :eek: :p
 
Both 720p and 1152x720 are always scaled to your HDTV screen: either to 1366x768 if you have an HD Ready TV, or to 1920x1080 if you have a Full HD TV. Be sure to select the correct TV resolution from your Xbox dashboard.
Very interesting, thanks for sharing. I usually game on my 22" Samsung HDTV, which is, as you pointed out, HD Ready. The native resolution of the TV is 1680x1050p, and I usually select 1080i from the Xbox settings.

I am using an HDMI cable, and the option to select 1680x1050 is there, but when I try to set the console to that resolution a message appears on the TV saying that the "mode" is not compatible. :cry:

I could switch back to VGA, but then I lose the sound, and I like the sound of my TV. I have a Dolby Pro Logic surround system, which I switch on when playing music games like Rock Band 3 especially, but with VGA I really miss the treble from my TV speakers. Using HDMI I have the best of both worlds: sound from my TV and also the DPL system when necessary.

In the end I set my resolution to 1080i (HDMI), activating Just Scan, which they say keeps the original size and aspect ratio of the image from the console framebuffer. Everything looks a bit stretched, but I am 100% sure the TV fully displays the framebuffer image of the console.

When I set it to 16:9 instead of Just Scan I noticed I lose horizontal FOV.

The best image quality I ever experienced on this TV was using the VGA cable and selecting 1680x1050 in the Xbox dashboard, :cry: but as I said, there's the issue of sound, and I like how HDMI looks too. If I knew how to get my sound back with the VGA cable I would probably use VGA all the time. I calibrated the image on HDMI, something I can't do when using VGA, but I don't mind that much, actually.

Btw, I am sorry I couldn't reply to your and Shifty's messages in the resolution-and-framerate-for-next-gen thread. The thread is closed, but I don't know why.

cheers
 
One "problem" in slightly sub HD resolution screenshots are that marketing departments tend to scale them to even 720p. In this case it's obvious that the (1:1) 720p image looks sharper than the upscaled slightly sub HD image. But this never happens when you play the game on your own console at home. Both 720p and 1152x720 are always scaled to your HDTV screen. Either to 1366x768 if you have an HD Ready TV or to 1920x1080 if you have an Full HD TV. Be sure to select the correct TV resolution from your Xbox dashboard. You should never select 720p, because it often causes double scaling, blurring the image a lot (Xbox scales image first to 720p and then your TV scales it to 1366x768 or 1920x1080). Not selecting your native TV resolution from the Xbox settings also introduces extra latency to the image (most TV sets add 1-2 frames of latency if the image must be scaled).

Don't a lot of sub-HD games upscale in software to 1280x720 before compositing the HUD? I thought for sure that the Call of Duty games did this.

I'm sure most sub-HD PS3 games do that, since you're forced to output a display-friendly resolution.
 
I'm sure most sub-HD PS3 games do that, since you're forced to output a display-friendly resolution.
Yes, PS3 and Xbox 360 are different in that regard. Do the PS3 system settings even support output resolutions other than 720p and 1080p? The Xbox dashboard system settings list lots of different output resolutions (basically all common monitor, projector and TV resolutions) with plenty of aspect ratios (5:4, 4:3, 16:9, 16:10). The only limitation seems to be that it doesn't support output resolutions higher than 1920x1080, so 1920x1200 and bigger monitors are sadly out of luck (and require double scaling). But many 1920x1200 monitors support 1:1 output of 1080p input content without scaling (so if you can handle the small black bars, you do not need to rescale and get a slightly less blurry image).
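
As a quick sanity check on that 1:1 case (plain arithmetic, nothing more):

Code:
# A 1080p signal shown pixel-for-pixel on a 1920x1200 panel.
panel_w, panel_h   = 1920, 1200
signal_w, signal_h = 1920, 1080

bar = (panel_h - signal_h) // 2                         # 60 px black bar top and bottom
coverage = (signal_w * signal_h) / (panel_w * panel_h)  # 0.9 -> 90% of the panel lit
print(bar, coverage)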
 
On the PC I'm lucky to have a Sony Trinitron display. I always go for higher resolution over effects. I like a crisp and hard image.

Trinitrons aren't crisp......they're analog.
 
Do the PS3 system settings even support output resolutions other than 720p and 1080p?

The only options on PS3 are 1080p, 720p and SD resolutions. 720p should be chosen for the output because the scaling is usually poor.
 
For a real comparison, take some PC screenshots at both 1280x720 and 1152x720. You can do this easily by scaling the CryEngine Sandbox editor window to match the desired resolutions. Then scale these screen captures to 1920x1080 using bicubic upsampling in Photoshop (it produces quality very close to the filter used by the Xbox hardware upscaler). Now view these 1920x1080 upscaled images on a native 1080p monitor (or HDTV), and I bet the differences will be almost impossible to notice.
I reckon the best way to compare them is using a dev shot at 1280x720 on 360 vs the retail 360 shot, and viewing them on a Full HD TV, which is what mine is. But I do understand your point.
 
Well, I followed sebbbi's recommendation about selecting the native resolution your HDTV or monitor supports. I am so very glad I did, THANKS THANKS THANKS. It took me a bit to get used to the forums... but it's really nice here on Beyond3D.

The EDID info tells me that I can choose 1680x1050, among other resolutions, over HDMI, but when I select that resolution the TV shows a blue screen and a message saying "mode not supported".

I had to plug in the VGA cable. I selected 1680x1050 and... well, I am not going back to HDMI anymore, on my HDTV at least. The difference is staggering; it's night and day. Even upscaled, the extra resolution certainly helps.

For a little background on this, I tried something similar a while ago: very brief tests of half an hour or so, just out of curiosity. But I always went back to HDMI because they say it's digital and it's better. I am not going to argue that, but on my TV, VGA is certainly the best option by far. The upscaler creates a black bar at the bottom of the screen, but everything looks so good that I don't mind.

The console's upscaler works 1000 times better than my 22" HDTV's upscaler, and I notice details I had never discerned before. I had calibrated my TV using values from similar Samsung models (it didn't look that great to me, but I thought calibrated values should always be better). Maybe I was wrong... On HDMI I selected either 720p or 1080i, standard reference levels, etc., no black crush, and everything looked fine... I also chose Just Scan on the TV, so I didn't lose horizontal FOV.

The image quality was good; everything was just a bit stretched, but not bad, or so I thought. But now :oops: I am truly amazed: switching to a VGA cable and selecting 1680x1050 on my TV makes me realize how much I was missing. I didn't calibrate the TV at all, and I don't care; it looks almost perfect. I enabled Home Theatre PC (a special default setting that I recommend enabling) and Game Mode.

Mode: Standard, Backlight: 7, etc. I reset everything so the TV is using its default settings. The only thing I've changed is the contrast, which I increased to 100 from a default value of 80.

I played Rock Band 3 first. You wouldn't believe me if I told you that sometimes I felt like I was watching an animated film. :oops: Seriously, it was that good. The Xbox dashboard and Marketplace content looked a lot more enticing, clearer and sharper, and don't get me started on the colour... Then I tested the new settings playing Crimson Alliance.

This was another real eye-opener about the difference playing at the TV's native resolution makes, and how, even upscaled, the image quality is clearly superior at higher resolutions. I remembered Crimson Alliance looking fine, the typical XBLA game with decent visuals, like Torchlight. I was wrong. The game looks better than I initially thought. I had been playing it via HDMI until two days ago. The very moment I played it using VGA at 1680x1050, some things which were already there came to life.

They were there all the time, but I couldn't perceive or distinguish those details well enough. By that I mean the bumpiness of some surfaces, the shiny detail of some textures, walls hiding secrets being easier to spot... things like that. Certainly not easy to describe; you just notice it.

I also tested Halo Reach and so on... Sigh, too bad I played those games already, but I play a lot better this way and I want to enjoy and "re-discover" some situations and places again...

I tried Forza 3, too, with another quick test; no major changes there. The cockpit view looked a lot better to me, though, and the framerate seemed even smoother. The game was always smooth at 60 fps, but not having to deal with the double upscaling, together with Game Mode being selected (which I didn't use via HDMI, because enabling it automatically switches to Standard mode, and I had calibrated the HDMI values of the TV under the Movie settings), may have played a part because of the lower latency.

Following this simple, but at first not so obvious, advice breathed new life into my console and TV. It also makes me realize how much I was missing out on before. :cry:
 
If I set any of my displays to a contrast of 100 (percent, I guess) the picture gets HORRIBLY bad! But if you like it... well...

My old Samsung LCD has a HORRIBLE scaler, too. First, it scaled 720p to 1680x1050, cutting off the sides (to reach the 16:10 aspect ratio), and it did so poorly. Downscaling 1080p looked much better, but the display always complains about "not a supported resolution", which can be dismissed, yet has to be dismissed every time I switch to this input.

My new LCD (LG D2342P, with passive 3D) scales pretty well for a PC LCD. Even 720p doesn't get too blurry.

But still... 720p sets usually have to upscale to the moronic choice of 1366x768 or whatever, so there'll always be a loss of definition, especially with cheaper sets. Hence I hope that most, if not all, games will be 1080p next generation. Basically all televisions and PC LCDs that aren't bargain sets have a 1080p panel anyway, so a 1080p signal will be displayed as-is on screen. Even if upscaled 720p with 4xAA or whatever could look better to some, it might look a LOT worse to others, unless the console gets a good internal scaling option.
 
I had to plug in the VGA cable. I selected 1680x1050 and... well, I am not going back to HDMI anymore, on my HDTV at least. The difference is staggering; it's night and day. Even upscaled, the extra resolution certainly helps.

As you're aware, what you describe has nothing to do with VGA being better than HDMI, since your TV doesn't support 1680x1050 via HDMI...

If your TV did support that resolution via HDMI then VGA would certainly look worse. People tend to mistake resolution with signal quality.
 
If your TV did support that resolution via HDMI then VGA would certainly look worse. People tend to mistake resolution with signal quality.
I beg to differ. ;) I went from DVI connecting my 1680x1050 monitor, to my laptop's VGA, and the VGA is gorgeous. Everything is 'smoother' but the high DPI means nothing is fuzzy. There are no shortcomings in colour representation or video or anything.
 
I beg to differ. ;) I went from DVI connecting my 1680x1050 monitor, to my laptop's VGA, and the VGA is gorgeous. Everything is 'smoother' but the high DPI means nothing is fuzzy. There are no shortcomings in colour representation or video or anything.

Smooth = less sharp = fuzzy. Also, DPI is fixed regardless of whether you use a digital or analog connection. Look close enough and you'll see the difference between digital and analog, especially with black text on a white background. ;) Analog signals tend to display more edge and ringing artifacts around high-contrast edges.

http://en.wikipedia.org/wiki/Ringing_artifacts

At native display resolution, digital > analog... this is not opinion, it's physics.
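
If anyone wants to see the ringing numerically, here's a toy sketch (NumPy, with a crude band-limited channel standing in for an analog link, not a model of any particular cable):

Code:
# An ideal black-to-white edge pushed through a band-limited channel
# overshoots and undershoots near the edge (Gibbs phenomenon) -- that is
# the ringing visible around high-contrast edges.
import numpy as np

edge = np.zeros(256)
edge[128:] = 1.0                      # hard 0 -> 1 transition

spectrum = np.fft.rfft(edge)
spectrum[20:] = 0                     # keep only the low frequencies
limited = np.fft.irfft(spectrum, n=256)

print(limited.max(), limited.min())   # > 1.0 and < 0.0: over/undershoot = ringing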
 
As you're aware, what you describe has nothing to do with VGA being better than HDMI, since your TV doesn't support 1680x1050 via HDMI...

If your TV did support that resolution via HDMI then VGA would certainly look worse. People tend to mistake resolution with signal quality.

I don't think he's advocating using VGA over HDMI, all else being equal. He's saying that using a digital connection and upscaling is worse than using analog and not upscaling. If you read his post again, you'll see that he understands HDMI is digital and VGA is analog.

edit: Unless you use a cheap VGA cable or have a lot of interfering cables/appliances nearby, under normal use most people can't tell the difference between VGA and HDMI.
 
Smooth = less sharp = fuzzy.

At native display resolution, digital > analog... this is not opinion, it's physics.
But you said "VGA would certainly look worse" and look is subjective perception based, not physics, and I can tell you categorically that VGA 1680x1050 does not look worse to me. In fact I'd say it even looks better - I made mention on my MSN tag when I first connected my laptop that "1050p on VGA looks great!" Similarly I point to SDTV gaming, and Amiga, Sega Master System, and PS2 games on a CRT looked way better than those same games on fixed-resolution displays whether upscaled or native.
 
Similarly, I point to SDTV gaming: Amiga, Sega Master System, and PS2 games on a CRT looked way better than those same games on fixed-resolution displays, whether upscaled or native.

Looks better in a way that's quantifiable? I don't think so. NONE of those devices offered a digital output -> crap in, crap out. Blur filtering from a CRT to vaselinize the pixelation makes it "look better"... OK.
 