720p w/ 4xAA = 1080p w/o AA?

kyleb said:
Most TVs have decent enough scalers to where the supersampling provided by running higher resolution will provide better image quality than running at the native resolution.

Considering the number of TVs that won't run a 720p signal, let alone convert it to 1080i...

I doubt many TVs will even know what to do with a 1080p signal unless they're advertised as able to convert it.
 
Well, my screen does just fine, and it's bound to become more common as time passes. Besides, quest's question was specifically directed at one that would.
 
If an hdtv really follows the hdtv standard, then it will recognize the input of a 1080p signal, because 1080p is specified in the standard. :? Of course, it may or may not show that on the screen. It only needs to be able to read the signal, convert the signal, and then display "something" on the screen to be compliant with the standards. What is displayed on the screen will be determined by what the native screen output is of the particular device. So any "true" hdtv will read a 1080p24/30 signal just fine- it will just then reformat the material to either 720p or 1080i for screen output, whatever the case may be (not to forget those very special hdtv's that really do display 1080p, but then there are additional conditions involved there, as well).

I have to agree that this whole 720p+AA vs. 1080i/p debate may well turn out to be pretty irrelevant, given all the smoothing, blending, and processing built into an hdtv by default. You will essentially get the "AA look" either way you go, just by virtue of what an hdtv must do to the input signal to display it correctly on the screen, for all this stuff to work in the first place.
 
randycat99 said:
If an hdtv really follows the hdtv standard, then it will recognize the input of a 1080p signal, because 1080p is specified in the standard. :? Of course, it may or may not show that on the screen. It only needs to be able to read the signal, convert the signal, and then display "something" on the screen to be compliant with the standards. What is displayed on the screen will be determined by what the native screen output is of the particular device. So any "true" hdtv will read a 1080p24/30 signal just fine- it will just then reformat the material to either 720p or 1080i for screen output, whatever the case may be (not to forget those very special hdtv's that really do display 1080p, but then there are additional conditions involved there, as well).

I have to agree that this whole 720p+AA vs. 1080i/p debate may well turn out to be pretty irrelevant, given all the smoothing, blending, and processing built into an hdtv by default. You will essentially get the "AA look" either way you go, just by virtue of what an hdtv must do to the input signal to display it correctly on the screen, for all this stuff to work in the first place.

ahhh but if you use a samsung hdtv with x360 it should look the best! :D

Again, 720p with 2x or 4x AA will look better than 1080p with no AA for high-motion sources such as videogames...

1080p with AA is the best you can ask for, though
 
quest55720 said:
How would 720p with 4x AA compare to 1080p that gets downscaled to 720p? I can't afford a large 1080p TV any time soon, so I am getting a 720p TV this fall.
Your TV probably wouldn't even get the chance to do it. The way it's done on the current Xbox is that you set in the dashboard what resolutions your TV can support (480p, 720p, 1080i), and the game automatically runs at the highest resolution it supports from that list. So if you set your X360 to 1080p but your TV can't accept that signal, you probably wouldn't see anything when you load up a game that supports 1080p (though I doubt we'll see any of those any time soon on the X360).
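The fallback behavior described above can be sketched roughly like this (a hypothetical illustration only; the names and mode list are assumptions, not a real console API):

```python
# Hypothetical sketch: the console runs each game at the highest resolution
# that both the game and the dashboard's supported-modes list allow.
PREFERENCE = ["1080p", "1080i", "720p", "480p"]  # highest first

def output_mode(game_modes, tv_modes):
    for mode in PREFERENCE:
        if mode in game_modes and mode in tv_modes:
            return mode
    return None  # no common mode: the user "wouldn't see anything"

print(output_mode({"720p", "1080p"}, {"480p", "720p"}))  # 720p
```

If the game advertises only 1080p and the TV only accepts 720p, there is no common mode, which matches the "blank screen" case described above.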
 
Obviously he is asking about a TV which can accept a 1080p signal; why is there difficulty understanding that?
 
kyleb said:
Obviously he is asking about a TV which can accept a 1080p signal; why is there difficulty understanding that?
Because I've never read anything about 1080p as being an FCC approved HDTV standard, so I wouldn't expect many TVs to have support for it, even if it's downsampling.
 
I highly doubt 1080p will be the standard resolution of PS3 games, as pixel shading requirements scale linearly with the number of pixels displayed. 720p roughly equates to 900,000 pixels/frame, while 1080p is close to 2,100,000.
Now imagine two GPUs, A and B, with about equal pixel shading capabilities, where A renders to a 720p buffer and B to a 1080p one at about equal framerates; A could compute shaders of more than double the complexity of B's. So in other words, if Sony made 1080p mandatory for the PS3, its output would be perceived as sub-par in comparison to an Xbox 360 by probably anyone unable to use resolutions larger than 720p.
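The arithmetic behind that claim is easy to check (assuming the usual 1280x720 and 1920x1080 frame sizes):

```python
# Pixels per frame at each resolution, and the resulting per-pixel shader
# budget ratio for two GPUs with equal throughput and equal framerates.
def pixels(width, height):
    return width * height

p720 = pixels(1280, 720)     # 921,600 pixels/frame ("roughly 900,000")
p1080 = pixels(1920, 1080)   # 2,073,600 pixels/frame ("close to 2,100,000")

# Rendering 720p instead of 1080p leaves this much more time per pixel:
budget_ratio = p1080 / p720
print(budget_ratio)  # 2.25
```

So at equal framerates, the 720p renderer has 2.25x the shading time per pixel, i.e. "more than double the complexity" as stated above.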
 
Iron Tiger said:
kyleb said:
Obviously he is asking about a TV which can accept a 1080p signal; why is there difficulty understanding that?
Because I've never read anything about 1080p as being an FCC approved HDTV standard, so I wouldn't expect many TVs to have support for it, even if it's downsampling.

Well, I can't say I follow FCC standards, but I can say that when I set Windows to 1920x1080@60Hz my screen displays the image without complaint. So "you probably wouldn't see anything" isn't rightly a valid answer to someone asking about a screen that can downsample such a signal.
 
Iron Tiger said:
Because I've never read anything about 1080p as being an FCC approved HDTV standard, so I wouldn't expect many TVs to have support for it, even if it's downsampling.

For all intents and purposes (and if you still do not care to actually look it up to verify for yourself), you are being informed right here and now that it is. :)
 
randycat99 said:
Iron Tiger said:
Because I've never read anything about 1080p as being an FCC approved HDTV standard, so I wouldn't expect many TVs to have support for it, even if it's downsampling.

For all intents and purposes (and if you still do not care to actually look it up to verify for yourself), you are being informed right here and now that it is. :)

Does this standard support 60Hz? Are the TVs supporting 60Hz?
 
blakjedi said:
Again 720P with 2 or 4AA will look better than 1080 p no AA for high motion sources such as videogames...

1080 p with AA is the best you can ask for though

On a computer monitor (a good one, that is- not the average LCD monitor that is severely mass-marketed as "SOA" these days), perhaps you have a point. As I discussed in my reply, this may hold very little water when it comes to hdtv's. Basically, "everyone" will be getting some sort of "AA" just by virtue of the operation of the hdtv itself. Whether or not it comes out of the console AA'd may not make a big difference at all (unlike the computer monitor scenario). That's what I was trying to tell you. Now if you want to split hairs, you could argue that 720p+AA will likely be "double AA'd" by the time it reaches the hdtv screen, while plain 1080p will transfer the most "visual information" to the hdtv, where it will get final smoothing and scaling (which acts as an "AA" of sorts). The former situation is not exactly ideal (in the same way you would not want to re-encode an mp3 one additional time if you can avoid it), while the latter is about the best you can hope for.

The 720p+AA is really only beneficial if the display device supports a direct "pass-thru" mode (and how many people know how to do that, let alone that such a mode exists?). Barring that, you can't really fault the conventional strategy of sending the highest amount of visual data you have to the hdtv and letting it do its thing from there. In "real use", what it comes down to is that you cannot really say one way is going to be better than the other. The end result may look slightly "different", but rather than jump to the conclusion that one way is obviously the "improvement", it is more likely that both ways are subject to give-some/take-some in different areas.
 
Acert93 said:
randycat99 said:
Iron Tiger said:
Because I've never read anything about 1080p as being an FCC approved HDTV standard, so I wouldn't expect many TVs to have support for it, even if it's downsampling.

For all intents and purposes (and if you still do not care to actually look it up to verify for yourself), you are being informed right here and now that it is. :)

Does this standard support 60Hz? Are the TVs supporting 60Hz?

"The Standard" [echoes loudly] supports 1080p24, 1080p30, and 1080i60. So if you are looking for 60 Hz refreshes at 1080 resolution, then there is only the 1080i60 that will be applicable. Ironically, it is 1080i60 that is most widely supported in actual hdtv equipment (and broadcasts), as far as those that actually output 1080 at the screen. ;) It's the "true 1080p" hdtv's that are hard to come by, hard to afford, etc. as things stand now (as it is for many things that exist on the "threshold", that category will struggle with the chicken-or-egg paradigm for a while).
 
Is there any plan to support 60Hz at 1080p?

If not I would prefer to stick with 720p and/or 1080i 60Hz.

For my preference:

60Hz > 30Hz/24Hz
Progressive Scan > Interlaced

If I have to pick at 1080 between progressive scan or 60Hz, then I would rather go to 720p and get both, because to *me* that offers better image quality.
 
1080p60 would seem to be the "holy grail" for hdtv-dom, I agree, but I don't know of any movement to include it in the standard, unfortunately.

As for 1080 or not, 60p or not, p or i, it's all a matter of personal subjective preferences, right? You know this has been a "chase your tail" sort of issue that has been done many, many times... (and ultimately, the only real absolute is that no one size will fit all) ;) Different genres of games will naturally benefit in certain areas and not others. It's all beside the point, because the user will be more at the mercy of the particular hdtv they bought (rather than what the game outputs), when it comes to the actual format they will be presented in a given game. ;) Some will never see the benefits of full 60 Hz refreshes and others will never see the benefits of a full 1080 resolution, by virtue of the hdtv they bought. So naturally some games will achieve a better effect on one and not the best on the other. I guess one can only hope that the game developer has the "hd savvy" to support the particular format that will best set off the particular type of game they are making. All hdtv's will still display the game, while the ones that happen to natively display the particular format chosen will be fortunate to display the game to its fullest potential. That's not so bad, right? :D
 
Sounds good...

Although I was hoping for the holy grail :( Especially on a wide screen, low refresh rates bother me some, so 1080p at 60Hz would have been delicious! Oh holy grail of HDTV, why dost thou taunt me!



;)
 
True 1080p HDTVs will be just as affordable as 720p TVs were last year. A number of 50" xHD3 DLP sets are coming out in the $3k price range, which is exactly the price range 50" DLP 720p sets were in last summer.


1080p@60Hz is being evaluated by the ATSC committee right now, but ATSC is a broadcast standard designed to fit HD signals into the same 6 MHz channels as NTSC anyway, and has nothing to do with the modulation used by the external connectors on the TV. For example, DirecTV has its own set of standard resolutions and completely ignores ATSC.

The DVI/HDMI spec is handled by a different standards organization. Technically, today's HDTVs are monitors, and can operate with any resolution that single-link or dual-link DVI can provide. In the case of single-link DVI (all digital HDs, LCDs, DLPs, etc) it can handle signals up to 1920x1200@60Hz. DVI is completely limited by bandwidth.

Most HD sets will thus be limited not by ATSC (which is only relevant for tuners) but by their scaler's input bandwidth. My TV, for example, has a VGA connector and can take 1600x1200@60Hz.
 
Will 1080p games be more or less common than 720p/1080i games this generation?

Will 60Hz games at any resolution be more or less common than in the current generation?
 
My guess is it is entirely up to the developers and what they want to do and what they feel will fit their project the best. The bonus this time around (for the hdtv era) is that everybody will get "a picture" on their screen, regardless of what they choose. So there is much less risk in picking any particular format to build a game around and ending up alienating some faction of your customer base.
 
kyleb said:
Iron Tiger said:
kyleb said:
Obviously he is asking about a TV which can accept a 1080p signal; why is there difficulty understanding that?
Because I've never read anything about 1080p as being an FCC approved HDTV standard, so I wouldn't expect many TVs to have support for it, even if it's downsampling.

Well, I can't say I follow FCC standards, but I can say that when I set Windows to 1920x1080@60Hz my screen displays the image without complaint. So "you probably wouldn't see anything" isn't rightly a valid answer to someone asking about a screen that can downsample such a signal.
What video card and output are you using? ATi cards only support up to 1080i through component output, so if you set it to 60Hz, the driver will interpret that as interlaced, and just refresh the fields at that frequency.
 