RancidLunchmeat
Veteran
A bit of confusion here on my part on how things work...
My Samsung DLP is a 720p TV, and currently when I'm gaming on it I use the default resolution of 1280x720 @ 60Hz.
That's with my 6800NU. For some reason, my old 5900 wouldn't even allow 1280x720; I was stuck at 1024x768. (I don't know if that was due to upgrading drivers when I added the card, or what the deal was...)
Anyway, my old "max" resolution setting used to be 1024x768. After buying a 6800NU about a year and a half ago, suddenly other resolution settings opened up to me: 1280x720 (which I use now because I'm under the belief it's the native resolution of the TV), plus 1600x900 and 1920x1080.
Those resolutions are unviewable for normal use on a 56" DLP, so I really don't bother... but my question is, what actually happens when I select those settings? For example, during gameplay?
Is there actually any benefit to increasing the resolution for the display adapter above the display's native resolution? Is the TV simply going to down-sample the signal back to 720p anyway?
The reason I'm asking is because I want a video card upgrade here shortly (most likely an entire system upgrade... but the DLP ain't going anywhere any time soon), and I'm thinking that if I'm resolution-bound, I need to focus on cards that can push higher levels of AA and AF at the lower resolutions (i.e., a less expensive card, since I'm not running at 1920x1200).
Basic questions, I know. But I searched the AVS forums and couldn't find the answer, so I thought I'd toss it out to you knowledgeable folks.