(HDTV) Console specs, what capabilities will be needed

How much processing power would a console need to display a game of any genre in 720p/1080i without it hindering the framerate?

What texture size (resolution) should we expect from next-gen consoles?

GameInformer magazine got the RE4 exclusive for their next issue, and it has been said that the graphics may be better than what we saw last year. I find that somewhat hard to believe; the game is already a jaw-dropper. Is it even possible? Could they be using the same streaming-from-disc technique that F5 and Retro used?
 
Ooh-videogames said:
How much processing power would a console need to display a game of any genre in 720p/1080i without it hindering the framerate?

What texture size (resolution) should we expect from next-gen consoles?

GameInformer magazine got the RE4 exclusive for their next issue, and it has been said that the graphics may be better than what we saw last year. I find that somewhat hard to believe; the game is already a jaw-dropper. Is it even possible? Could they be using the same streaming-from-disc technique that F5 and Retro used?

So 720p/1080i will always be slower than 480p: it's more pixels, and therefore requires more fillrate and more pixel-processing power. The only time there would be no penalty for the higher resolutions is if you were underutilising the hardware at 480p.
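
To put rough numbers on it (back-of-envelope only, assuming a single shading pass per pixel and no overdraw):

    #include <stdio.h>

    int main(void)
    {
        /* Pixels that must be shaded per displayed image. A 1080i field
           only draws half the lines, so one 60Hz field is 1920x540. */
        const long p480  = 640L  * 480;    /*   307,200 */
        const long p720  = 1280L * 720;    /*   921,600 */
        const long f1080 = 1920L * 540;    /* 1,036,800 per field */

        printf("720p  needs %.1fx the fillrate of 480p\n", (double)p720  / p480);
        printf("1080i needs %.1fx the fillrate of 480p\n", (double)f1080 / p480);
        return 0;
    }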

Having said that, I'd expect a lot of next-gen games to support HDTV resolutions.

IMO HDTV isn't necessarily a good thing for games: HDTV sets increase the latency. A game at 60fps already has a worst-case latency of 3/60ths of a second (some people would argue 2/60ths) between controller input and you seeing the result. Some HDTVs add an additional latency of between 3 and 9 60ths of a second. That's more than enough to have a significant impact on the way a game's controls feel. It's especially true for games that require continuous small adjustments, like realistic racing games.

Obviously things will look better visually on HDTV, but is giving up responsive control a good trade-off?
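
For anyone who wants those fractions in milliseconds, the arithmetic is simple (a sketch assuming the worst-case 3-frame game pipeline and the 3-9 sixtieths of TV delay quoted above):

    #include <stdio.h>

    int main(void)
    {
        const double tick = 1000.0 / 60.0;          /* one 60th of a second, in ms */
        double game   = 3 * tick;                   /* worst-case game pipeline    */
        double tv_min = 3 * tick, tv_max = 9 * tick;

        printf("game alone: %.0f ms\n", game);                      /*  50 ms     */
        printf("plus HDTV : %.0f-%.0f ms\n",
               game + tv_min, game + tv_max);                       /* 100-200 ms */
        return 0;
    }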
 
IMO HDTV isn't necessarily a good thing for games: HDTV sets increase the latency. A game at 60fps already has a worst-case latency of 3/60ths of a second (some people would argue 2/60ths) between controller input and you seeing the result. Some HDTVs add an additional latency of between 3 and 9 60ths of a second.

Huh? Why is that? I can see it happening for line-doubled games, but I doubt a progressive input would be delayed 9 frames. That would require caching 9 frames somewhere, and that's a lot of memory.
 
ERP said:
Ooh-videogames said:
How much processing power would a console need to display a game of any genre in 720p/1080i without it hindering the framerate?

What texture size (resolution) should we expect from next-gen consoles?

GameInformer magazine got the RE4 exclusive for their next issue, and it has been said that the graphics may be better than what we saw last year. I find that somewhat hard to believe; the game is already a jaw-dropper. Is it even possible? Could they be using the same streaming-from-disc technique that F5 and Retro used?

So 720p/1080i will always be slower than 480p: it's more pixels, and therefore requires more fillrate and more pixel-processing power. The only time there would be no penalty for the higher resolutions is if you were underutilising the hardware at 480p.

Having said that, I'd expect a lot of next-gen games to support HDTV resolutions.

IMO HDTV isn't necessarily a good thing for games: HDTV sets increase the latency. A game at 60fps already has a worst-case latency of 3/60ths of a second (some people would argue 2/60ths) between controller input and you seeing the result. Some HDTVs add an additional latency of between 3 and 9 60ths of a second. That's more than enough to have a significant impact on the way a game's controls feel. It's especially true for games that require continuous small adjustments, like realistic racing games.

Obviously things will look better visually on HDTV, but is giving up responsive control a good trade-off?

Is there any way to get around that issue?


Would the processing power come from the CPU or the GPU, and would 512 megs of memory be enough?
 
pcostabel said:
IMO HDTV isn't necessarily a good thing for games: HDTV sets increase the latency. A game at 60fps already has a worst-case latency of 3/60ths of a second (some people would argue 2/60ths) between controller input and you seeing the result. Some HDTVs add an additional latency of between 3 and 9 60ths of a second.

Huh? Why is that? I can see it happening for line-doubled games, but I doubt a progressive input would be delayed 9 frames. That would require caching 9 frames somewhere, and that's a lot of memory.

They all have frame buffer memory and perform significant processing on the image. The 9-frame number came from a document that has been floating around internally here; I believe it's from a current HDTV set. It could improve with later sets. I would imagine the majority of the processing they're doing is related to video scaling, since most sets (the CRT ones might be the exception) don't actually support any of the resolutions natively.
 
ERP said:
pcostabel said:
IMO HDTV isn't necessarily a good thing for games: HDTV sets increase the latency. A game at 60fps already has a worst-case latency of 3/60ths of a second (some people would argue 2/60ths) between controller input and you seeing the result. Some HDTVs add an additional latency of between 3 and 9 60ths of a second.

Huh? Why is that? I can see it happening for line-doubled games, but I doubt a progressive input would be delayed 9 frames. That would require caching 9 frames somewhere, and that's a lot of memory.

They all have frame buffer memory and perform significant processing on the image. The 9-frame number came from a document that has been floating around internally here; I believe it's from a current HDTV set. It could improve with later sets. I would imagine the majority of the processing they're doing is related to video scaling, since most sets (the CRT ones might be the exception) don't actually support any of the resolutions natively.

Yes, frame buffer memory for one or two frames, but caching 9 frames at 1080i requires 30 megs. That's a lot of memory.
And only plasma and LCD sets need scaling; CRT and projection TVs most definitely support native HDTV resolutions.
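
For reference, the arithmetic behind that figure (a sketch assuming 16 bits per pixel, e.g. YUV 4:2:2; the real number depends on the set's internal pixel format):

    #include <stdio.h>

    int main(void)
    {
        const long frame = 1920L * 1080 * 2;   /* ~4.1 MB per full frame at 16bpp */
        const long field = 1920L * 540  * 2;   /* ~2.1 MB per interlaced field    */

        printf("9 full frames: %.1f MB\n", 9.0 * frame / (1024 * 1024)); /* ~35.6 */
        printf("9 fields     : %.1f MB\n", 9.0 * field / (1024 * 1024)); /* ~17.8 */
        return 0;
    }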
 
maskrider said:
9/60 of a second is already 150ms; it may affect the lip sync even if you are just viewing a movie.

Good point. I can't believe this is true. 2/60 is the most I can see happening.
 
pcostabel said:
ERP said:
pcostabel said:
IMO HDTV isn't necessarily a good thing for games: HDTV sets increase the latency. A game at 60fps already has a worst-case latency of 3/60ths of a second (some people would argue 2/60ths) between controller input and you seeing the result. Some HDTVs add an additional latency of between 3 and 9 60ths of a second.

Huh? Why is that? I can see it happening for line-doubled games, but I doubt a progressive input would be delayed 9 frames. That would require caching 9 frames somewhere, and that's a lot of memory.

They all have frame buffer memory and perform significant processing on the image. The 9-frame number came from a document that has been floating around internally here; I believe it's from a current HDTV set. It could improve with later sets. I would imagine the majority of the processing they're doing is related to video scaling, since most sets (the CRT ones might be the exception) don't actually support any of the resolutions natively.

Yes, frame buffer memory for one or two frames, but caching 9 frames at 1080i requires 30 megs. That's a lot of memory.
And only plasma and LCD sets need scaling; CRT and projection TVs most definitely support native HDTV resolutions.

9/60ths is 4.5 HDTV frames: the progressive formats are all 30fps, the interlaced ones 60Hz but interlaced. These are the numbers I was given.
 
pcostabel said:
maskrider said:
9/60 of a second is already 150ms; it may affect the lip sync even if you are just viewing a movie.

Good point. I can't believe this is true. 2/60 is the most I can see happening.

If you're using an external decoder, most will let you delay the audio signal, and they introduce a delay while decoding anyway.
 
ERP said:
pcostabel said:
maskrider said:
9/60 of a second is already 150ms; it may affect the lip sync even if you are just viewing a movie.

Good point. I can't believe this is true. 2/60 is the most I can see happening.

If you're using an external decoder, most will let you delay the audio signal, and they introduce a delay while decoding anyway.

Many cheap decoders cannot adjust the delay.

I think the 9/60 sec figure is for interlaced sources, and that means 150ms. Progressive frames shouldn't need more than 2 buffered frames for scaling, unless they are doing interpolated frame-rate conversion.

Pretty good deinterlacing can be done with 3-4 buffered fields (with a running buffer of 5-6 or more fields). If the quality of the first output frame can be compromised, output can start with just 1 buffered field (or 2 for less compromise on quality).

Scaling shouldn't take more than 2 buffered frames (progressive frames, or deinterlaced frames from interlaced sources). I think 5 fields' time is a reasonable number, 6 at most I would say. 9 is a bit too much.
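
To turn those field counts into wall-clock delay (just the conversion; real pipelines overlap stages, so stage buffers don't necessarily add up linearly):

    #include <stdio.h>

    int main(void)
    {
        const double field_ms = 1000.0 / 60.0;  /* one field period at 60Hz     */
        const int counts[] = { 1, 3, 5, 6, 9 }; /* field counts discussed above */

        for (int i = 0; i < 5; i++)
            printf("%d fields = %5.1f ms\n", counts[i], counts[i] * field_ms);
        return 0;
    }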
 
ERP said:
So 720p/1080i will always be slower than 480p: it's more pixels, and therefore requires more fillrate and more pixel-processing power. The only time there would be no penalty for the higher resolutions is if you were underutilising the hardware at 480p.

Not only is it more pixels, but anti-aliasing will be pretty much mandatory next-gen. And of course, with an increase in screen resolution you need an increase in texture resolution. With that said, a good PC can already run at 1024x768 with good anti-aliasing, so a next-gen console should be able to pull it off too.
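
A rough sense of the combined cost (a sketch assuming 4x supersampling, where every sample is shaded and stored; multisampling would cost less shading work but a similar framebuffer footprint):

    #include <stdio.h>

    int main(void)
    {
        const long base = 640L * 480;           /* 480p, no AA     */
        const long hd4x = 1280L * 720 * 4;      /* 720p with 4x AA */

        printf("720p + 4xAA touches %.0fx the samples of plain 480p\n",
               (double)hd4x / base);            /* ~12x */
        return 0;
    }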
 
If that's true, and I don't doubt that it is, then that kinda sucks for gaming. The companies designing those TVs must know that there's a very good chance they might be used for games. This is weird. I hope this will be fixed in future models or that future TVs will support those resolutions natively.

:(
 
ERP said:
9/60ths is 4.5 HDTV frames: the progressive formats are all 30fps, the interlaced ones 60Hz but interlaced. These are the numbers I was given.

Then you were given incorrect numbers. Progressive modes run at 60Hz. The source material frequently isn't, but that doesn't give a TV set the right to skip frames. Mine doesn't.

Also, the extra processing/latency isn't a significant problem on my plasma. I'd love to know which set would buffer 9 whole frames of video at 1280x720 and do something sensible with them. I guess a video processor (such as a scaler unit) might, but even those shouldn't need that much data to play with. Either way, I don't see it as an issue that games have to deal with; it's more a problem for the screen manufacturer to sort out.

And although I hope all next-gen platforms support HDTV resolutions as standard, I'm sure they won't be able to get away from legacy interlaced support, so if you do have a problem with the lag you can play on an old set...
 
Of course I forgot to try and answer the original question :)

Fill rate requirements naturally go up. Geometry and general processing ought to be entirely irrelevant, unless you count the minor hit that an increased field of view has when going from 4:3 to 16:9.

Textures... hmmm. I guess on the whole you want to shoot for around 1:1 texel-to-pixel mapping at a reasonably close distance (i.e. where the action happens) and fill in with detail textures closer in. At most I'd say you need double the texture density that you do with EDTV. I think we're already below the ideal thresholds, so what I'm expecting is a general increase in texture density, but still requiring a lot of detail work to make things look good, especially at the higher resolutions. Adding more detailed detail textures doesn't actually increase the overall memory requirements though...
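
If "double the texture density" means double in each axis, memory per texture quadruples. A sketch with a hypothetical 256x256 base map at 32bpp (the 4/3 factor approximates a full mip chain):

    #include <stdio.h>

    int main(void)
    {
        const double mips = 4.0 / 3.0;               /* full mip chain overhead */
        const double edtv = 256.0 * 256 * 4 * mips;  /* hypothetical EDTV map   */
        const double hdtv = 512.0 * 512 * 4 * mips;  /* doubled in each axis    */

        printf("EDTV texture: %.2f MB\n", edtv / (1024 * 1024)); /* 0.33 MB */
        printf("HDTV texture: %.2f MB\n", hdtv / (1024 * 1024)); /* 1.33 MB */
        return 0;
    }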
 
MrWibble said:
ERP said:
9/60ths is 4.5 HDTV frames: the progressive formats are all 30fps, the interlaced ones 60Hz but interlaced. These are the numbers I was given.

Then you were given incorrect numbers. Progressive modes run at 60Hz. The source material frequently isn't, but that doesn't give a TV set the right to skip frames. Mine doesn't.

Also, the extra processing/latency isn't a significant problem on my plasma. I'd love to know which set would buffer 9 whole frames of video at 1280x720 and do something sensible with them. I guess a video processor (such as a scaler unit) might, but even those shouldn't need that much data to play with. Either way, I don't see it as an issue that games have to deal with; it's more a problem for the screen manufacturer to sort out.

And although I hope all next-gen platforms support HDTV resolutions as standard, I'm sure they won't be able to get away from legacy interlaced support, so if you do have a problem with the lag you can play on an old set...

I stand corrected; I didn't realise there were 60Hz versions of the 720p resolution.

We had an internal document circulated here recently describing some of the HDTV issues, like safe frame, differing native resolutions and color reproduction. I don't know its source, but it claims a minimum of 3/60ths and as much as 9/60ths. Internal tech docs like this are generally accurate.

I know my HD CRT projection set is delayed several frames (I would guess to do cadence counting for 3:2 pulldown). The only reason I can see for large delays is processing non-HD signals; however, the implication of the document was that the delay exists on any signal.
 
ERP said:
I stand corrected; I didn't realise there were 60Hz versions of the 720p resolution.

We had an internal document circulated here recently describing some of the HDTV issues, like safe frame, differing native resolutions and color reproduction. I don't know its source, but it claims a minimum of 3/60ths and as much as 9/60ths. Internal tech docs like this are generally accurate.

I know my HD CRT projection set is delayed several frames (I would guess to do cadence counting for 3:2 pulldown). The only reason I can see for large delays is processing non-HD signals; however, the implication of the document was that the delay exists on any signal.

AFAIK there is no 30Hz version of any signal - the minimum scanning frequency for 720p is 60Hz (I don't even think there's a PAL equivalent, like there is for 480p [576p at 50Hz]). For actual content I'd kind of expect a 30fps feed, however, so perhaps your documentation was referring to source formats rather than display modes.

However, I agree with you that some latency is inevitable. I'm pretty sure my panel will buffer at least one progressive frame before display. But I don't think it can be any more than that, except, as you say, for an interlaced source that it must deinterlace.

For pulldown detection and correction, I think 3 to 9 fields doesn't actually sound too unreasonable - and it's already an issue for movie fans, for whom sound delay is now an essential part of their setup. It's usually quite acceptable for the brain to perceive sound arriving sometime after the picture, but never before...
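
To see why cadence detection inherently needs a window of fields: telecined film repeats one field in every group of five, so a detector has to watch several field-to-field differences before it can lock on. A toy sketch (hypothetical names; assumes 8-bit luma fields of w x h samples):

    #include <stdlib.h>

    /* Hypothetical 3:2 cadence check: with pulldown, field n matches
       field n-2 once per group of five, so look for low same-parity
       differences recurring with period 5. */
    static long field_sad(const unsigned char *a, const unsigned char *b, long n)
    {
        long sad = 0;                       /* sum of absolute differences */
        for (long i = 0; i < n; i++)
            sad += labs((long)a[i] - (long)b[i]);
        return sad;
    }

    int looks_like_pulldown(const unsigned char *fields[], int count,
                            int w, int h, long threshold)
    {
        int first_match = -1;
        for (int f = 2; f < count; f++) {
            if (field_sad(fields[f], fields[f - 2], (long)w * h) < threshold) {
                if (first_match < 0)
                    first_match = f;
                else if ((f - first_match) % 5 != 0)
                    return 0;               /* repeats aren't 5 fields apart */
            }
        }
        return first_match >= 0;            /* saw the once-per-five repeat */
    }

You can't confirm period-5 behaviour until you've seen at least two of the repeats, i.e. five-plus fields of history, which lines up with the 3 to 9 field figures above.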

I *hope* delays will be shorter for more direct paths, especially in the better TVs. I still say this is not an issue console programmers or manufacturers can do anything about, however; it's one for which there must be consumer awareness, in order to convince display manufacturers to reduce this lag as much as possible.

It's an interesting point though, and if the lag you mention really happens with a true progressive signal, it's one that is more serious than I'd previously considered.
 