Choice of rendering resolution *spawn

Looks better in a way that's quantifiable?
No, but perception doesn't follow scientific principles; at least not for subtle differences.
Blur filtering from a CRT to vaselinize the pixelation makes it "look better"...ok.
Yes. Given a choice between blurry images and chunky pixels, the blurry images looked better because they were more natural and gave the eye something to tween data from. Great big blocky pixels give no room for interpretation. That doesn't mean low-res blurry images are better than high-resolution images on a native display, and I was never saying that. I was just saying that non-perfect pixel representations can look (be perceived) as good as digitally perfect ones when conditions are suitable, and VGA doesn't equate to blurry images that are painful to look at and clearly inferior to HDMI.
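To make the "tweening" point a bit more concrete, here's a toy sketch (my own illustration, nothing rigorous, and the file name is made up) of the two ways of blowing up a low-res frame - block replication versus interpolation:

```python
# Toy comparison of "chunky" vs "blurry" upscaling. Requires Pillow.
from PIL import Image

src = Image.open("lowres_frame.png")          # hypothetical 320x240 capture
target = (1280, 960)                          # 4x in each direction

chunky = src.resize(target, Image.NEAREST)    # block replication: great big pixels
smooth = src.resize(target, Image.BILINEAR)   # interpolation: the softer, "tweened" look

chunky.save("chunky.png")
smooth.save("smooth.png")
```

The nearest-neighbour version is what a "digitally perfect" blow-up of chunky pixels looks like; the bilinear one is closer to what a CRT's spot gives you for free.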
 
Yes, look at how we can go nuts over off-screen captures of games before being a bit disappointed by direct feeds. The brain fills in those details for us.
 
sebbbi said:
Yes, PS3 and Xbox 360 are different in that regard. Does PS3 system settings even have support for other output resolutions than 720p and 1080p? Xbox dashboard system settings has support for lots of different output resolutions listed (basically all possible monitor, projector and TV resolutions) with plenty of aspect ratios (5:4, 4:3, 16:9, 16:10). The only limitation seems to be that it doesn't support higher than 1920x1080 output resolutions, so 1920x1200 and bigger monitors are sadly out of luck (and require double scaling). But many 1920x1200 monitors support 1:1 output of 1080p input content without scaling (so if you can handle the small black bars, you do not need to rescale and have slightly less blurry image).

The Xbox 360 dash is 720p native and just scales to 1080p if selected. PS3 has both 1080p and 720p native XMB outputs. I do remember that Xbox 360 supports more scaled resolutions and upscaled really well on many of my monitors even with funny resolutions by using black bars, which was really nice before I became OCD about scaling and bought a native 720p display.
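The black-bar arithmetic behind all of this is trivial, by the way; a rough sketch of my own, just to show the idea:

```python
def fit_with_bars(src_w, src_h, panel_w, panel_h):
    """Scale a source frame to fit a panel while keeping its aspect ratio;
    returns the scaled size and the black-bar thickness per side on each axis."""
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    return (out_w, out_h), ((panel_w - out_w) // 2, (panel_h - out_h) // 2)

# 1080p content mapped 1:1 onto a 1920x1200 monitor: 60-pixel bars top and bottom
print(fit_with_bars(1920, 1080, 1920, 1200))   # ((1920, 1080), (0, 60))
# A 720p game letterboxed into a 1680x1050 panel: roughly 52-pixel bars
print(fit_with_bars(1280, 720, 1680, 1050))    # ((1680, 945), (0, 52))
```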
 
Yes, look at how we can go nuts over off-screen captures of games before being a bit disappointed by direct feeds. The brain fills in those details for us.

Actually, direct feed video looks best if the game is rendered at a high enough resolution with enough graphical effects. Games that look like crap via direct feed are most likely the ones that need analog displays to smooth/blur everything to make them "look" better. In reality they "look" worse when using a proper display with a clean digital signal and 1:1 pixel mapping.

Direct feed video from games like Gears 3 and RE5 looks great on a proper digital display via HDMI/DVI and worse via VGA.
 
I've connected up both my 360 and PC via both VGA and DVI to the same monitor, and once you calibrate the colour there's practically no difference. Even for sharp, non-AA'ed text on a white background you can barely tell, and that's with switching between the two inputs from the same PC and squinting at pixels. If the monitor is sampling the right part of the signal for the right pixel there's no reason there should be huge or even noticeable errors, and if your monitor isn't sampling correctly then get a better monitor.

The blur created by upscaling is a billion and one times worse than any visual errors VGA needs to introduce.

As for what "looks best", that's all down to the end user. I'll go with CRTs for low resolution sources, thanks!
 
Yeah, but who's talking about low resolution in this topic? The original point was a 1680x1050 display hooked up to HD consoles. The only people talking about low resolution are the ones defending analog displays hooked up to crappy old consoles with crappy analog outputs converted from a digital source using D/A converters... ;)

BTW, scaling is digital, so whatever blur is there is clean, just like when you do bicubic resampling; it doesn't cause ringing artifacts around the edges of high-contrast images like analog VGA does. VGA gives a "dirty" blur as well as dirty edges.

As for CRTs, they have their own imaging problems like linearity and geometric distortion. Ever seen a grid of perfect circles displayed on a CRT? Ever seen a CRT with perfect RGB convergence, aka "focus", from edge to edge and corner to corner?
 
BTW, scaling is digital, so whatever blur is there is clean, just like when you do bicubic resampling; it doesn't cause ringing artifacts around the edges of high-contrast images like analog VGA does. VGA gives a "dirty" blur as well as dirty edges.
When was the last time you saw a VGA display? The one I'm looking at while typing this has no artefacts - it's pin sharp. And I'm not 'advocating' analogue displays. I just wanted to correct your assertion that analogue looks worse - it doesn't always, and someone choosing 1680x1050 VGA over 1680x1050 DVI or HDMI is making a valid choice that can give excellent results. So don't diss the analogue! :p

But that's all kinda OT so I'll drop it now. Choice of output port is different to choice of rendering resolution.
 
I beg to differ. ;) I went from DVI connecting my 1680x1050 monitor to my laptop's VGA, and the VGA is gorgeous. Everything is 'smoother' but the high DPI means nothing is fuzzy. There are no shortcomings in colour representation or video or anything.
Hi, Shifty Geezer... I think VGA is great, too, pin sharp on my HDTV. I was going gaga over VGA for a good reason. I went back to HDMI though. It gives me more freedom because, using VGA, some options of the TV are greyed out when I try to change them. Also having sound on the TV is a plus.

The great thing about choosing VGA and selecting the native resolution of my TV -1680x1050- is that the console upscales so well that even jaggy games look very clean. The jaggies are somewhat noticeable, of course, but not distracting at all. The downside of VGA is that the colour seems slightly washed out to me, and there is a soft blanket of mist over the whole image on the TV screen. The cause seems to be that you can't change some values of the main options menu, so they stay at factory settings. I like to set colour to 57, for instance, but it doesn't let me. HDMI colour using factory settings looks slightly richer to me -Movie mode, colour tone: Warm1. I select Just Scan because of the 1:1 pixel processing and mapping. Basically you see more of the picture.

I mean that, AFAIK, Just Scan directly maps the incoming image to the set's pixels, eliminating any overscan and other possible extra processing. Using VGA I can only select 4:3 or Wide, not Just Scan, I don't know why. The console upscales really well, as Dungeonscaper pointed out before, so it's fine.
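To give a rough idea of what overscan costs when a set doesn't offer something like Just Scan (my own ballpark percentage, real sets vary):

```python
def overscan_crop(width, height, percent_per_side=2.5):
    """Estimate how much of the frame a TV hides with typical overscan.
    The 2.5% figure is a guess for illustration; actual sets differ."""
    cx = round(width * percent_per_side / 100)
    cy = round(height * percent_per_side / 100)
    return (width - 2 * cx, height - 2 * cy), (cx, cy)

# A 720p game signal with ~2.5% overscan per side loses 32 columns and 18 rows
# on each edge, and the remaining 1216x684 gets stretched back out to the panel.
print(overscan_crop(1280, 720))   # ((1216, 684), (32, 18))
```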

So I finally set the console dashboard to 720p via HDMI and changed a couple of settings, especially sharpness, which was way too high; it looks absolutely amazing now. And I mean it. Factory settings with a couple of changes -brightness, sharpness- plus Movie mode, colour tone Warm1 and Just Scan -most accurate- is sheer heaven.

The thing is, at first glance, given the fact that I can select the native resolution of my TV by using VGA, HDMI seemed to look a lot worse than it actually does. The image fills the whole TV screen when choosing Just Scan, although it's slightly stretched because of the native resolution of my TV. Using VGA there's a black bar at the bottom of the screen, but the image sent to the TV is kept intact, I think.

When I first played, for instance, Street Fighter 3 Online Edition using the VGA cable, it looked gorgeous. I thought it looked 10 times better than using HDMI.

I thought that using HDMI the flaws were more noticeable; I saw jaggies and sharp edges around the characters. But under VGA I saw none of that.

I compared that game running on VGA vs HDMI, and it was like running an emulator with no filtering at all -HDMI- vs running the same emulator with that 2xSaI filter, although to me it looked more like no filtering at all vs 10xSaI or so. :oops:

(I know 10xSaI doesn't exist, I use it just to describe how gorgeous it looked to me)
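For anyone curious what those pixel-art filters actually do: 2xSaI itself is fairly involved, but its simpler cousin Scale2x/EPX gives the general flavour. A rough sketch of that simpler rule (not the real 2xSaI, just illustrative):

```python
def scale2x(img):
    """Scale2x/EPX upscale of a 2-D list of pixel values (any comparable type).
    Each source pixel becomes a 2x2 block; corners are copied from neighbours
    only where the surrounding pixels agree, so diagonal edges stay sharp."""
    h, w = len(img), len(img[0])
    out = [[None] * (w * 2) for _ in range(h * 2)]
    for y in range(h):
        for x in range(w):
            p = img[y][x]
            a = img[y - 1][x] if y > 0 else p          # above
            b = img[y][x + 1] if x < w - 1 else p      # right
            c = img[y][x - 1] if x > 0 else p          # left
            d = img[y + 1][x] if y < h - 1 else p      # below
            e0 = a if (c == a and c != d and a != b) else p
            e1 = b if (a == b and a != c and b != d) else p
            e2 = c if (d == c and d != b and c != a) else p
            e3 = d if (b == d and b != a and d != c) else p
            out[y * 2][x * 2], out[y * 2][x * 2 + 1] = e0, e1
            out[y * 2 + 1][x * 2], out[y * 2 + 1][x * 2 + 1] = e2, e3
    return out
```

2xSaI goes further and blends between neighbours rather than just copying them, which is why it looks smoother than this.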

I have to try Crimson Alliance one more time, but it looked really good using HDMI too. Maybe it was all self-suggestion on my part? :???:

I tested every game and the difference wasn't as staggering as I initially thought it was. Does that mean my TV handles scaling well? Questions, questions...

I was really shocked at first and I spent the last two weeks playing under VGA all the time.

Today I've taken some pictures, switching cables back and forth. It was very easy to do and took no time at all. The pictures aren't that great because my mobile phone is average at this, but they show some curious differences.

The black bar at the bottom of the screen is the most obvious one. But there are others, especially in Crimson Alliance.

Note: this is home made, don't expect Digital Foundry quality. :oops:

GEARS OF WAR 3

Set 1

VGA

http://imageshack.us/photo/my-images/824/gears3vga2.jpg/

HDMI

http://imageshack.us/photo/my-images/833/gears3hdmi1.jpg/

Set 2

VGA

http://imageshack.us/photo/my-images/716/gears3vga1.jpg/

HDMI

http://imageshack.us/photo/my-images/825/gears3hdmi2.jpg/

STREET FIGHTER 3

Set 1

VGA

http://imageshack.us/photo/my-images/716/sf3vga2.jpg/

HDMI

http://imageshack.us/photo/my-images/94/sf3hdmi2.jpg/

Set 2

VGA

http://imageshack.us/photo/my-images/830/sf3vga1.jpg/

HDMI

http://imageshack.us/photo/my-images/171/sf3hdmi1.jpg/

Set 3

VGA

http://imageshack.us/photo/my-images/23/sf3vga4.jpg/

HDMI

http://imageshack.us/photo/my-images/15/sf3hdmi3.jpg/

Set 4

VGA

http://imageshack.us/photo/my-images/43/sf3vga3.jpg/

HDMI

http://imageshack.us/photo/my-images/855/sf3hdmi4.jpg/

FUEL

Set 1

VGA

http://imageshack.us/photo/my-images/215/fuelvga3.jpg/

HDMI

http://imageshack.us/photo/my-images/59/fuelhdmi2.jpg/

Set 2

VGA

http://imageshack.us/photo/my-images/23/fuelvga2.jpg/

HDMI

http://imageshack.us/photo/my-images/842/fuelhdmi3.jpg/

Set 3

VGA

http://imageshack.us/photo/my-images/101/fuelvga1.jpg/

HDMI

http://imageshack.us/photo/my-images/717/fuelhdmi1.jpg/

CRIMSON ALLIANCE

Set 1

VGA

http://imageshack.us/photo/my-images/839/crimsonvga2.jpg/

HDMI

http://imageshack.us/photo/my-images/189/crimsonhdmi1.jpg/

Set 2

VGA

http://imageshack.us/photo/my-images/17/crimsonvga1.jpg/

HDMI

http://imageshack.us/photo/my-images/716/crimsonhdmi2.jpg/

I wonder what happens with Crimson Alliance. I didn't move the character at all, yet I lose vertical FOV under VGA at the bottom of the screen, when every other game looks the same (FOV-wise) in both formats, minus the black bars. I also like the colour in Crimson Alliance more using HDMI, but that isn't noticeable in the screenshots.

I could test Forza 3 and Rock Band 3, but maybe another day. :smile:
 
The Xbox is probably using the same 400 MHz DAC that has been in ATI cards for years, so really it should look excellent provided it's hooked up to a properly calibrated CRT.

In terms of IQ, that vaselination could otherwise be described as film-like, a quality ascribed to DLPs and, to a lesser extent, plasma sets. It's subjective, but many people, myself included, would prefer it over LCD, which could also be described as "harsh".

But for a fixed-pixel display it's pointless to convert to analog at the source and then back to digital at the display.
 
@Cyan, that's more of an issue with that display; of course image quality is going to suffer if the image is being stretched to 16:10.

I have a Samsung 1080p HDTV, and its default 1080p through HDMI is terrible for games: it crops for overscan and does a lot of image processing, which is fine for broadcast but terrible for games, with jacked-up sharpness and contrast. You have to remember that display defaults are set the way they are because that's how the sets are going to be demoed in bright electronics stores, and because customers are generally pretty stupid and simply equate brightest = best picture.

Edit: I'm dumb, I should have read the entire post. Samsung sets generally have good scalers.
 
Your HDMI ports don't support 1680x1050?
Yes, apparently so. I have the Display Discovery setting of the dashboard enabled, and 1680x1050 is in the list of supported resolutions for HDMI. 1080p is correctly greyed out because my TV doesn't support it, and the allowed resolutions based on the EDID information -whatever that means-, which automatically retrieves your video settings, are the correct ones.

For HDMI the supported output settings according to the X360 dashboard are:

480p, 720p, 1080i, 1024x768, 1280x1024, 1680x1050.

However, I can only select the ones that end with a letter, either the "i" from interlaced or the "p" from progressive.

When I select 1680x1050 the TV screen turns blue and a "Mode Not Supported" message appears. Letting the console select it automatically using the Optimal Resolution option produces the same result: blue screen and Mode Not Supported message.

I am not quite sure, I can't recall exactly but Digital Foundry created a brief article about this.

Unless an expert or an engineer on the matter explains to me what to do, step by step, with a trail of stones so to speak, I don't know why this happens or how to fix it.

That EDID feature is beyond my understanding.
@Cyan, that's more of an issue with that display; of course image quality is going to suffer if the image is being stretched to 16:10.

I have a Samsung 1080p HDTV, and its default 1080p through HDMI is terrible for games: it crops for overscan and does a lot of image processing, which is fine for broadcast but terrible for games, with jacked-up sharpness and contrast. You have to remember that display defaults are set the way they are because that's how the sets are going to be demoed in bright electronics stores, and because customers are generally pretty stupid and simply equate brightest = best picture.

Edit: I'm dumb, I should have read the entire post. Samsung sets generally have good scalers.
Of course. I can tell you that regardless of whether it's the console or the TV doing the upscaling, it looks fine to me in both cases. No artifacts or bad quality either way.

I wonder if, when switching from HDMI to VGA after 3-4 years, testing both and seeing everything a little smaller because of the black bar at the bottom, I was so surprised that I completely lost perspective and thought VGA looked a lot better.

Now that I am playing games under HDMI again, I love the richness of the colour and it looks amazing too. I think my sharpness setting was too high before.

Concerning what you mention about factory settings, I can tell you that the factory setting for Samsung HDTVs is Dynamic. It's very colourful and so on, but it tends to look green, the colours are way off and everything looks excessively bright.

I prefer Movie settings by default, Just Scan enabled, backlight 7, contrast 80 -Turngrap was right on this, I think, 100 is too high for contrast-, colour default or 57, and brightness between 45-50.

The Xbox is probably using the same 400 MHz DAC that has been in ATI cards for years, so really it should look excellent provided it's hooked up to a properly calibrated CRT.

In terms of IQ, that vaselination could otherwise be described as film-like, a quality ascribed to DLPs and, to a lesser extent, plasma sets. It's subjective, but many people, myself included, would prefer it over LCD, which could also be described as "harsh".

But for a fixed-pixel display it's pointless to convert to analog at the source and then back to digital at the display.
The simple, logical truth to me is that analogue can look as good as, if not better than, digital sometimes, especially VGA. The vaselination effect, or blanket of mist, is one of the most notable differences; even so, it was most obvious when I enabled Home Theatre PC in the TV settings.

Component output doesn't look that great though; in fact I remember that red looked like pink using component, and it was indeed very harsh.

There aren't extreme differences between VGA and HDMI -though a DF article would be a great idea to help people out- but everything is turning digital, taking into account the changes we are seeing lately.

VGA reminds me of my PC gaming days some time ago, and I loved the picture quality.
 
For HDMI the supported output settings according to the X360 dashboard are:

480p, 720p, 1080i, 1024x768, 1280x1024, 1680x1050.

However, I can only select the ones that end with a letter, either the "i" from interlaced or the "p" from progressive.

When I select 1680x1050 the TV screen turns blue and a "Mode Not Supported" message appears. Letting the console select it automatically using the Optimal Resolution option produces the same result: blue screen and Mode Not Supported message.

Mm... yeah, sounds like the TV is the problem. Is that the only HDMI port? I suppose you might try an HDMI to DVI adapter if you're hung up about it, but VGA is just fine. I know on my TV, only one of the HDMI ports supports PC mode, so I can get the resolution I need instead of using 720p.

Although it's really bizarre that the 360 detects 1680x1050 if the TV apparently doesn't accept it via HDMI. What's the model of your TV? Maybe it's faulty?
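If you're curious, the EDID block the 360 reads over HDMI is just a small table of modes the TV claims to support, and the "standard timing" entries in it are two bytes each. A rough decode sketch, going by my reading of the EDID 1.3 spec (treat it as illustrative rather than authoritative):

```python
def decode_standard_timing(byte0, byte1):
    """Decode one 2-byte EDID 1.3 'standard timing' entry."""
    if byte0 == 0x01 and byte1 == 0x01:
        return None                                    # unused slot
    h_active = (byte0 + 31) * 8
    aspect = {0b00: (16, 10), 0b01: (4, 3), 0b10: (5, 4), 0b11: (16, 9)}[byte1 >> 6]
    v_active = h_active * aspect[1] // aspect[0]
    refresh_hz = (byte1 & 0x3F) + 60
    return h_active, v_active, refresh_hz

# 0xB3 0x00 is how a 16:10 display would advertise 1680x1050 @ 60 Hz:
print(decode_standard_timing(0xB3, 0x00))   # (1680, 1050, 60)
```

So if the set advertises an entry like that but the HDMI input's video processor doesn't actually accept it, you could end up with exactly the "detected but Mode Not Supported" behaviour described above.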


Component output doesn't look that great though; in fact I remember that red looked like pink using component, and it was indeed very harsh.
That's probably due to conversion to YPbPr colour space.
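Roughly what that conversion does, as a sketch assuming the Rec. 709 matrix (the console may well use Rec. 601 for SD modes, and the real hardware quantises and clamps):

```python
def rgb_to_ypbpr(r, g, b):
    """Rec. 709 RGB -> YPbPr; inputs in 0..1, Y in 0..1, Pb/Pr in -0.5..0.5."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    pb = (b - y) / 1.8556
    pr = (r - y) / 1.5748
    return y, pb, pr

# Pure red gets spread across all three component channels, so any gain or
# level error on the analog side shifts the hue - e.g. red drifting pinkish.
print(rgb_to_ypbpr(1.0, 0.0, 0.0))   # approximately (0.2126, -0.115, 0.5)
```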
 
Mm... yeah, sounds like the TV is the problem. Is that the only HDMI port? I suppose you might try an HDMI to DVI adapter if you're hung up about it, but VGA is just fine. I know on my TV, only one of the HDMI ports supports PC mode, so I can get the resolution I need instead of using 720p.

Although it's really bizarre that the 360 detects 1680x1050 if the TV apparently doesn't accept it via HDMI. What's the model of your TV? Maybe it's faulty?
My TV is the Samsung LE22S86BD model:

http://www.play.com/Electronics/Ele...Ready-Freeview-Widescreen-LCD-TV/Product.html

I have the original manual of the TV but I also downloaded it here:

http://downloadcenter.samsung.com/content/UM/200709/20070907161605328_BN68-01182K-00Eng-0814.pdf

On page 6 it mentions the supported modes for HDMI/DVI and Component. For HDMI it supports 480p at 60 Hz but not 480i, 576p at 50 Hz but not 576i in any way, and finally 720p and 1080i at either 50 or 60 Hz.

A few paragraphs below on the same page it refers to the picture quality. To my surprise it says: "This LCD TV displays its optimum picture resolution in 720p mode". And then: "This LCD TV displays its maximum picture resolution in 1080i mode".

I am fine with the picture quality of HDMI and 720p, so I am not going to change it. Just Scan -which can only be enabled through HDMI- and 1680x1050 would be perfect though, but I will try to mess around with the hidden service menu of Samsung TVs and see if I can find something interesting:

http://www.avforums.com/forums/lcd-led-lcd-tvs/601553-samsung-service-menu-research.html

Even if I don't, I am really happy with the overall picture quality and upscale of the TV.

That's probably due to conversion to YPbPr colour space.
I knew those options I found beyond my understanding were there for some reason... Now I have proof! :smile:

I think I am not going back to component anymore, but I have another 360 at home connected via component to a 46" Samsung TV I bought with my siblings when we lived together -basically the same TV as the one I have, even both remotes work on either TV, but larger and with a few extra options, like 1080p, apart from better sound- and it might be interesting to see what can be done about that when I play there again, which I very rarely do.

Okay Alstrong. Have a pleasant day.
 
That's a quite strange TV set. 1680x1050 is 16:10, it's not a common aspect ratio in TV sets (there's no 16:10 format TV transmissions or DVDs). Do you have black borders in both 16:9 widescreen and 4:3 TV programs (side black borders in 4:3 and up/down black borders in 16:9), or does the TV set always crop the image a bit?

The VGA output to 1680x1050 native resolution should give (slightly) better quality (and reduced lag) for Xbox 360 use. The difference between analog and digital signal should be minimal, and by using the TV set native resolution there's no double scaling (picture should be sharper).

As the TV set seems to have a DVI input, you should get yourself an HDMI->DVI cable. That way you should be able to use the 1680x1050 image in digital format. But I am not sure if the HDMI->DVI cable transmits the sound. We have some HDMI->DVI cables at our office to connect Xboxes to monitors, but our monitors do not have any speakers, so I don't know about the sound (we connect separate headphones with a different adapter).
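If anyone wants to see the double-scaling effect for themselves, a quick toy test along these lines should show it (my own sketch, the file name is made up, and it assumes the TV resamples with something bicubic-like):

```python
# Compare scaling a 720p frame straight to a 1680x1050 panel (single pass)
# with bouncing it through 1080p first and then down to the panel (double pass).
# Note: 945 is the 16:9 portion of the 1050-line panel.
from PIL import Image, ImageChops

frame = Image.open("720p_frame.png")                    # hypothetical 1280x720 capture

single = frame.resize((1680, 945), Image.BICUBIC)       # console scales once for the panel
double = (frame.resize((1920, 1080), Image.BICUBIC)     # console scales for the TV input...
               .resize((1680, 945), Image.BICUBIC))     # ...then the TV rescales to its panel

diff = ImageChops.difference(single, double)
print("per-channel (min, max) error from the extra pass:", diff.getextrema())
```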
 
It'll be a TV/monitor. I have the same sort of thing, if not exactly the same model. It was introduced after the launch of this gen, when previous monitors used in gaming were 16:10 and stretched the picture. Arwin was playing with stretched pictures on his Sammy at that point. When I wanted a new monitor, Samsung had just introduced what I guess you could class as the 'bedroom-class' sets, which were 16:10 monitors with VGA+DVI, a TV tuner, HDMI and 16:9 letterboxing for consoles. Ideal for TV, computing and gaming at the same location.
 
1680x1050 displays get black borders when the Xbox 360 outputs to them natively.

I'm OCD about upscaling and blurriness, so I have an HDMI to DVI cable attached to a Gateway FPD1775W 17" 1280x720 display; it's the same display Ben Heck uses in his PS3 and X360 laptops. Sit 1.5 feet away and see the game exactly as it should look :)

I admire the Xbox 360's scaling. I only got the monitor when I got a PS3, as it doesn't fare well with quirky resolutions. I played on a Monster VGA cable for my 360 for a while on a 1366x768 Sony Bravia display and it looked very nice. CRT displays look really nice too. I find the worst quality is when connecting an ordinary VGA or component cable to a decent-sized HDTV; I had this feeling in my gut every time I played on that. Analog, I find, causes some pixels to twitch or flicker a bit when standing still if the cable isn't high quality, which really bothers me.
 
That's a quite strange TV set. 1680x1050 is 16:10, it's not a common aspect ratio in TV sets (there's no 16:10 format TV transmissions or DVDs). Do you have black borders in both 16:9 widescreen and 4:3 TV programs (side black borders in 4:3 and up/down black borders in 16:9), or does the TV set always crop the image a bit?

The VGA output to 1680x1050 native resolution should give (slightly) better quality (and reduced lag) for Xbox 360 use. The difference between analog and digital signal should be minimal, and by using the TV set native resolution there's no double scaling (picture should be sharper).

As the TV set seems to have a DVI input, you should get yourself an HDMI->DVI cable. That way you should be able to use the 1680x1050 image in digital format. But I am not sure if the HDMI->DVI cable transmits the sound. We have some HDMI->DVI cables at our office to connect Xboxes to monitors, but our monitors do not have any speakers, so I don't know about the sound (we connect separate headphones with a different adapter).
I have black borders in both 16:9 -up/down- and 4:3 -side black borders-, and I know I lose horizontal FOV compared to Just Scan. The TV doesn't crop the image at all, I think.

I followed your advice and went back to VGA again because it's the native resolution of the TV after all. I miss being able to select some of the fine-tuning options to change values like colour, sharpness, etc., but I managed to improve what I thought were washed-out colours by changing the Colour Space from Auto to Wide.

http://www.avsforum.com/avs-vb/showthread.php?t=845837

http://www.hdtvtest.co.uk/Samsung-LE40M86BD/Calibration.php

The improvement was rather dramatic, and playing games like Crysis 2 at 1680x1050 with a colour quality similar to HDMI is a sight to behold.

As you say, since I set the resolution to the native resolution of my TV, there is no double scaling, and that might be the reason why I could discern the bumps and cracks of the scenery in Crimson Alliance. :?: The quality of my VGA cable is okay -it's the original MS VGA cable for the X360-.

Regarding the HDMI to DVI cable, sadly, I have that cable already, but the HDMI/DVI input of my TV is just a regular HDMI connector for both HDMI and DVI, and I don't know which adapter or cable I could buy.

Concerning the sound, it would be fine; I don't need the TV sound at all, although it's more convenient for me because I can turn the volume up or down in a matter of seconds with the remote.

Thank you, sebbbi.
 