You are comparing the PC version to the Xbox 360 version here. Most likely the Xbox version of Crysis 2 has no anisotropic filtering and uses optimized trilinear, while the PC version has proper angle-independent, high-quality AF and high-quality trilinear filtering. Many console games even use bilinear filtering to save some cycles (usually on further-away objects). And it's likely that the PC version's antialiasing filter is also of slightly higher quality (causes less blurring).

I've seen the comparison of Crysis 2 at 1280x720 on PC vs 1152x720 on 360, and the difference is very noticeable, to me at least; that poor AA implementation didn't help either.
Mebibits != megabits
It's referring to 1024x1024 bits (a mebibit) rather than 1000x1000 bits (a megabit).
MiB is mebibytes. MB is megabytes, traditionally used to mean mebibytes before the mebi- prefix existed, and likely here to stay, because anyone who knows what they're talking about regarding computers knows the decimal prefixes have been overloaded to stand in for the binary ones. For 10 mebibits, you'd use a lowercase b: Mib or Mb. Hence, look to the b/B to tell bits from bytes.
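To make the prefixes concrete, here's a minimal Python sketch (my own illustration, not from anyone in the thread) that prints the binary and decimal interpretations side by side; the value 10 is just the figure used above:

```python
# Binary (IEC) prefixes vs decimal (SI) prefixes.
# Lowercase "b" means bits, uppercase "B" means bytes.
MEBI = 1024 * 1024   # Mi (binary)
MEGA = 1000 * 1000   # M  (decimal)

n = 10  # the "10" figure from the post above

print(f"{n} MiB = {n * MEBI:,} bytes")   # 10,485,760 bytes
print(f"{n} MB  = {n * MEGA:,} bytes")   # 10,000,000 bytes
print(f"{n} Mib = {n * MEBI:,} bits")    # 10,485,760 bits
print(f"{n} Mb  = {n * MEGA:,} bits")    # 10,000,000 bits
```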
Very interesting, thanks for sharing. I usually game on my 22" Samsung HDTV, which is, as you pointed out, HD Ready. The native resolution of the TV is 1680x1050, and I usually select 1080i from the Xbox settings.
For a real comparison, take some PC screenshots at both 1280x720 and 1152x720. You can do this easily by scaling the CryEngine Sandbox editor window to match the desired resolutions. Then you could scale these screen captures to 1920x1080 using bicubic upsampling in Photoshop (it produces quality very close to the filter used by the Xbox hardware upscaler). Now view these 1920x1080 upscaled images on a native 1080p monitor (or HDTV) and I bet the differences will be almost impossible to notice.
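If you don't have Photoshop handy, a rough equivalent of that bicubic upsample is easy to script; here's a small sketch using Pillow, with placeholder filenames standing in for the two PC captures (and bicubic here is only an approximation of the Xbox scaler's filter, as noted above):

```python
from PIL import Image

# Placeholder filenames for the 1280x720 and 1152x720 PC captures.
captures = ["crysis2_pc_1280x720.png", "crysis2_pc_1152x720.png"]

for path in captures:
    img = Image.open(path)
    # Bicubic upsample to 1080p for viewing on a native 1080p display.
    upscaled = img.resize((1920, 1080), Image.BICUBIC)
    upscaled.save(path.replace(".png", "_upscaled_1080p.png"))
```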
One "problem" with slightly sub-HD screenshots is that marketing departments tend to scale them up to an even 720p. In that case it's obvious that the (1:1) 720p image looks sharper than the upscaled, slightly sub-HD image. But this never happens when you play the game on your own console at home. Both 720p and 1152x720 are always scaled to your HDTV screen: either to 1366x768 if you have an HD Ready TV, or to 1920x1080 if you have a Full HD TV. Be sure to select the correct TV resolution from your Xbox dashboard. You should never select 720p, because it often causes double scaling, blurring the image a lot (the Xbox scales the image first to 720p and then your TV scales it to 1366x768 or 1920x1080). Not selecting your native TV resolution in the Xbox settings also introduces extra latency (most TV sets add 1-2 frames of latency if the image must be scaled).
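The double-scaling penalty is easy to demonstrate for yourself; here's a quick sketch (again Pillow, a placeholder filename, and bicubic standing in for whatever filters the console and TV actually use) that writes out both chains for a side-by-side look:

```python
from PIL import Image

src = Image.open("framebuffer_1152x720.png")  # placeholder sub-HD capture

# Single scaling: straight to the panel's native 1920x1080.
single = src.resize((1920, 1080), Image.BICUBIC)
single.save("scaled_once_1080p.png")

# Double scaling: first to 720p (as when 720p is selected on the dashboard),
# then the TV scales that 720p signal up to its native 1920x1080 panel.
double = src.resize((1280, 720), Image.BICUBIC).resize((1920, 1080), Image.BICUBIC)
double.save("scaled_twice_1080p.png")
```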
I'm sure most sub-HD PS3 games do that, since you're forced to output a display-friendly resolution.

Yes, PS3 and Xbox 360 are different in that regard. Do the PS3 system settings even support output resolutions other than 720p and 1080p? The Xbox dashboard system settings list lots of different output resolutions (basically all possible monitor, projector and TV resolutions) with plenty of aspect ratios (5:4, 4:3, 16:9, 16:10). The only limitation seems to be that it doesn't support output resolutions higher than 1920x1080, so 1920x1200 and bigger monitors are sadly out of luck (and require double scaling). But many 1920x1200 monitors support 1:1 display of 1080p input content without scaling (so if you can handle the small black bars, you do not need to rescale and get a slightly less blurry image).
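For the 1:1 case the black bars are simple to work out; a tiny sketch assuming a 1920x1200 (16:10) panel fed a 1080p signal without scaling:

```python
panel_w, panel_h = 1920, 1200    # 16:10 monitor
signal_w, signal_h = 1920, 1080  # 1080p input displayed 1:1

unused = panel_h - signal_h
print(f"Unused rows: {unused} px ({unused // 2} px black bar top and bottom)")
# Unused rows: 120 px (60 px black bar top and bottom)
```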
1080p at 60fps.
On the PC I'm lucky to have a Sony Trinitron display. I always go for higher resolution over effects. I like a crisp and hard image.
Lower resolution = more of that Vaseline-smeared-over-the-screen effect
I reckon the best way to compare them is using a dev shot at 1280x720 on 360 vs the retail 360 shot, and viewing them on a Full HD TV, which is what mine is. But I do understand your point.
I had to plug the VGA cable. I selected 1680x1050p and... well, I am not going back to HDMI anymore, on my HDTV at least. The difference is staggering, it's night and day. Even upscaled, the extra resolution certainly helps.
I beg to differ. I went from DVI connecting my 1680x1050 monitor, to my laptop's VGA, and the VGA is gorgeous. Everything is 'smoother' but the high DPI means nothing is fuzzy. There are no shortcomings in colour representation or video or anything.
As you're aware, what you describe has nothing to do with VGA being better than HDMI, since your TV doesn't support 1680x1050 via HDMI...
If your TV did support that resolution via HDMI then VGA would certainly look worse. People tend to mistake resolution with signal quality.
But you said "VGA would certainly look worse", and "look" is subjective, perception-based, not physics, and I can tell you categorically that VGA at 1680x1050 does not look worse to me. In fact I'd say it even looks better - I made mention on my MSN tag when I first connected my laptop that "1050p on VGA looks great!" Similarly I point to SDTV gaming: Amiga, Sega Master System, and PS2 games on a CRT looked way better than those same games on fixed-resolution displays, whether upscaled or native.

Smooth = less sharp = fuzzy.
At native display resolution, digital > analog... this is not opinion, it's physics.