Samsung DLP

A bit of confusion here on my part on how things work...

My Samsung DLP is a 720p TV, and currently when I'm gaming on it I use the default resolution of 1280x720 @60Hz.

That's with my 6800NU. For some reason, my old 5900 wouldn't even allow 1280x720; I was stuck at 1024x768. (I don't know if that was due to upgrading drivers when I added the card, or what the deal was...)

Anyway, my old "max" resolution setting used to be 1024x768. After buying the 6800NU about a year and a half ago, suddenly other resolution settings were open to me: 1280x720 (which I use now, because I'm under the belief it's the native resolution of the TV), and 1600x900 and 1920x1080 are newly available as well.

Those resolutions are unviewable for normal use on a 56" DLP, so I really don't bother... but my question is, what happens when I select those settings? For example, during gameplay?

Is there actually any benefit to setting the display adapter's resolution above the display's native resolution? Is the TV simply going to down-sample the signal back to 720p anyway?

The reason I'm asking is that I want a video card upgrade here shortly (most likely an entire system upgrade... but the DLP ain't going anywhere any time soon), and I'm thinking that if I'm resolution-bound I should focus on cards that can push higher levels of AA and AF at lower resolutions (i.e., a less expensive card, since I'm not running at 1920x1200).

Basic questions, I know. But I searched the AVS forums and couldn't find the answer, so I thought I'd toss it out to you knowledgeable folks.
 
I would think that using a higher resolution would basically be like turning on AA, because when the image was resampled it would be filtered by whatever algorithm the TV uses.
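If it helps to picture it, here's a minimal Python sketch of that idea (assuming Pillow is installed; the resolutions and the toy line-drawing scene are just stand-ins): downscaling a high-res render averages several rendered pixels into each panel pixel, which is what smooths the jaggies, the same principle as supersampling AA.

```python
from PIL import Image, ImageDraw

NATIVE = (1280, 720)   # the DLP's native panel resolution
RENDER = (1920, 1080)  # a higher render resolution

def draw_scene(size):
    # A hard-edged diagonal line: the classic source of jaggies.
    img = Image.new("RGB", size, "black")
    ImageDraw.Draw(img).line((0, 0, size[0], size[1]), fill="white", width=4)
    return img

# Rendered at native resolution: the stair-steps stay sharp.
direct = draw_scene(NATIVE)

# Rendered high and downscaled: each 720p pixel becomes a weighted
# average of several 1080p pixels, blending the stair-steps.
down = draw_scene(RENDER).resize(NATIVE, Image.BILINEAR)

direct.save("native_720p.png")
down.save("downsampled_720p.png")
```

Open the two PNGs side by side and the downsampled one shows the softened edge you'd otherwise get from turning on AA.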

Of course, then you should ask why not just use AA to begin with? Well, try it with your current card and see what level of AA looks the same to you; then you might have a better idea.


anyway g'luck and g'nite.
 
Thanks... I figured that would be the ultimate end result.

That pushing a resolution higher than the DLP's native one would essentially result in extra AA, as the display then downsamples the image back to 720p.

I just wanted to make sure.

Is everybody in agreement that this is the case?

So there might be a benefit to running at higher resolutions with lower AA, or to running at the native resolution with higher levels of AA, and the choice would strictly depend on my personal tastes?
 
I'm not sure the TV would recognise the extra resolution; more likely it would be 1080i rather than 1080p that ends up displayed on the TV.
 
I would think you would want to run 1280x720, as (over DVI at least) it will be a pixel-for-pixel perfect reproduction with no scaling whatsoever.
 
Help!!

I have a 50 inch DLP, and somehow it appears to have a few small scuff marks on the screen. The manual says to only use a dry cloth to clean it. Does anyone know what I could use to safely remove them???
 
All you're doing by running a resolution higher than the native one is exercising the TV's scaler.

There can be reasons to do it for media; it largely depends on whether the scaler in your source is better than the scaler in your TV. For most sources, the TV will have the better scaler.

For PC games/work-type input, you have more information to start with, and if the scaler in the TV is half decent, it will likely look better at the higher resolution.
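To make the "scaler quality" point concrete, here's a rough sketch (Python with Pillow again; the screenshot filename is hypothetical, swap in any image you have) of how much the choice of scaling filter alone changes the result. The gap between the filters is essentially the gap between a cheap scaler and a good one.

```python
from PIL import Image

# Hypothetical 1920x1080 game capture.
src = Image.open("screenshot_1080p.png")
native = (1280, 720)

# The same downscale, done with three filters of increasing quality.
for name, filt in [("nearest", Image.NEAREST),
                   ("bilinear", Image.BILINEAR),
                   ("lanczos", Image.LANCZOS)]:
    src.resize(native, filt).save(f"downscaled_{name}.png")

# NEAREST simply drops pixels and looks harsh; LANCZOS filters
# properly and keeps fine detail. A TV whose scaler behaves closer
# to the NEAREST end is the one you don't want doing the work.
```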
 