PC vs HDTV Resolutions

gurgi

Regular
I'm a bit confused here. I keep hearing that HDTV resolutions are going to bog down the Xbox 360 and PS3. I've even read a statement from Julian at Factor 5 that says as much. What I don't understand is the difference between 1280x720 and the PC resolutions I've been using for years.

My PC runs games fine at high res, so is there something I'm missing here?
 
gurgi said:
I'm a bit confused here. I keep hearing that HDTV resolutions are going to bog down the Xbox 360 and PS3. I've even read a statement from Julian at Factor 5 that says as much. What I don't understand is the difference between 1280x720 and the PC resolutions I've been using for years.

My PC runs games fine at high res, so is there something I'm missing here?

- 1280x720 typically doesn't run at a solid 60 fps with AA and AF enabled on newer titles.
- People expect higher quality from next generation consoles than what PCs have delivered "for years".
- PC graphics cards have higher bandwidth available to graphics alone than the next-generation consoles have for graphics and CPU together.
 
Well, if you took all the fillrate that the higher resolution eats up and used it for other effects, you could end up with better graphics, larger worlds, even more stuff on screen.

When the systems come out it won't be a problem, but towards the end of their lives, when people expect games to really push the system, all the power might be squeezed out already.
 
Entropy said:
- 1280x720 typically doesn't run at a solid 60 fps with AA and AF enabled on newer titles.
- People expect higher quality from next generation consoles than what PCs have delivered "for years".
- PC graphics cards have higher bandwidth available to graphics alone than the next-generation consoles have for graphics and CPU together.

I agree on all your points but the last: that's not true in all cases. G70, for example, reportedly has 38.4GB/s of bandwidth, while "graphics and CPU together" on PS3 have 47GB/s. PC cards also aren't directly comparable to the X360, given that its eDRAM will take some of the framebuffer traffic away from its 22GB/s.
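For what it's worth, here's the back-of-the-envelope sum behind that, sketched in Python. The 22.4 + 25.6 split for PS3 is my assumption about how the quoted total breaks down, not a verified spec:

```python
# Back-of-the-envelope sum using the bandwidth figures quoted in this
# thread. The PS3 split below is an assumption, not a confirmed spec.

g70_gddr3 = 38.4          # GB/s -- G70's local video memory, as quoted above

ps3_gddr3 = 22.4          # GB/s -- RSX's local GDDR3 pool (assumed)
ps3_xdr   = 25.6          # GB/s -- XDR main memory shared with Cell (assumed)
ps3_total = ps3_gddr3 + ps3_xdr

print(f"G70, graphics alone:        {g70_gddr3} GB/s")
print(f"PS3, graphics + CPU pooled: {ps3_total} GB/s")
# 38.4 vs 48.0 -- close to the ~47GB/s cited above, so the rule of thumb
# "PC graphics alone beats console graphics + CPU" doesn't always hold.
```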
 
Well, if you took all the fillrate that the higher resolution eats up and used it for other effects, you could end up with better graphics, larger worlds, even more stuff on screen.

Sure, but this isn't unique to consoles, right? I guess you won't have the option to run at a lower res or with different hardware configurations... but there still tends to be a lowest common denominator.
 
I disagree with every point.

Entropy said:
- 1280x720 typically doesn't run at a solid 60 fps with AA and AF enabled on newer titles.

This is not true at all. I have a 6800GT which chews through HL2, Doom3, and FarCry at 1280x1024 with 4xAA and 8xAF.

- People expect higher quality from next generation consoles than what PCs have delivered "for years".

The consoles are closed-box designs and more efficient than PCs. They have faster dedicated buses (FlexIO, the 20GB+ Xenos<=>Xenon connection, etc.), so even at the same level of features and chipsets a console will look better. This is why the Xbox, with a weak 733MHz Celeron, 64MB RAM, and a GF3 GPU, powers games that look MUCH better than what you see on a comparable PC.

- PC graphics cards have higher bandwidth available to graphics alone than the next-generation consoles have for graphics and CPU together.

Apples to oranges. Since the Xbox 360 is the one in question, the raw numbers can't be compared directly. Sure, a desktop GPU may have 35GB/s of bandwidth, which makes the Xbox 360's 23GB/s look very weak in comparison. But when we get into the details we learn that on the PC a lot of that bandwidth is eaten up by the framebuffer. According to one source, a scene rendered with 64-bit-per-pixel HDR, Z-testing, and alpha blending per sample can consume 144GB per second of graphics bandwidth.

Obviously 144GB/s > 35GB/s.
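To see where a figure on that scale can come from, here's a rough model of framebuffer traffic. Every parameter below is an illustrative assumption on my part, not that source's exact workload, but with heavy alpha-blended overdraw the quoted number falls out:

```python
# Rough model of framebuffer traffic. All parameters are illustrative
# assumptions, not the cited source's exact workload.

def framebuffer_bandwidth_gbs(width, height, msaa, color_bytes,
                              depth_bytes, overdraw, fps):
    """Bytes moved to and from the framebuffer per second, in GB/s."""
    samples = width * height * msaa
    # Alpha blending reads then writes color; Z-testing reads then writes depth.
    bytes_per_sample = 2 * color_bytes + 2 * depth_bytes
    return samples * bytes_per_sample * overdraw * fps / 1e9

# 64-bit (FP16) HDR color, 32-bit Z, 4xAA at 1600x1200, 60 fps:
print(framebuffer_bandwidth_gbs(1600, 1200, 4, 8, 4, overdraw=6, fps=60))
# -> ~66 GB/s; with particle-heavy overdraw of ~13 layers it reaches
#    the quoted figure:
print(framebuffer_bandwidth_gbs(1600, 1200, 4, 8, 4, overdraw=13, fps=60))
# -> ~144 GB/s
```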

Xenos goes another route: it takes that back buffer and isolates it on the eDRAM, which has 256GB/s of bandwidth. That way the 512MB of main memory does not become a glorified backbuffer. What is the point of having 512MB of memory if ALL the bandwidth is being saturated by a 60MB slice of framebuffer/backbuffer? That is a waste of memory--and that is exactly what PC GPUs do.
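As a rough sketch of where a "60MB" figure like that comes from (the 1600x1200 / 4xAA / 64-bit HDR settings are my assumptions, picked to land in that ballpark):

```python
# Where a "60MB" backbuffer figure can come from: assumed settings of
# 1600x1200 with 4xAA and 64-bit HDR color, 32-bit Z.

width, height, msaa = 1600, 1200, 4
color_bytes = 8                 # 64-bit-per-pixel HDR color
depth_bytes = 4                 # 32-bit Z

samples = width * height * msaa
print(f"color buffer: {samples * color_bytes / 2**20:.1f} MB")  # ~58.6 MB
print(f"depth buffer: {samples * depth_bytes / 2**20:.1f} MB")  # ~29.3 MB
# On a PC, every blended pixel hammers this region of the same GDDR pool
# that textures and geometry live in. Xenos instead isolates backbuffer
# traffic on its eDRAM daughter die (rendering in tiles when the buffer
# exceeds the 10MB), with its own 256GB/s path.
```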

Further, this is a closed box. MS has been keen not only to discuss bandwidth issues in their DX Next and XNA workshops, but to discuss solutions. One is the use of procedural textures (e.g. the link above discusses that some). Xenos also has some hard-wired features to reduce bandwidth and memory burdens. E.g. a model can be stored in memory as a HOS (higher-order surface), which is much smaller than a triangle mesh, and a CPU core can adaptively tessellate it--basically dynamic LOD--and then stream the resulting mesh directly to the GPU WITHOUT using memory bandwidth for the streaming.
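To make the HOS idea concrete, here's a toy sketch of store-small/tessellate-on-demand using a cubic Bezier curve. Purely illustrative, not the actual Xenos or XNA interface:

```python
# Toy sketch of "store a HOS, tessellate on demand": a shape kept as
# four Bezier control points (tiny in memory) is expanded into however
# many vertices the current view needs -- dynamic LOD.

def bezier_point(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier curve at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

def tessellate(control_points, distance):
    """Pick a segment count from view distance, then emit the vertices."""
    segments = max(2, int(64 / max(distance, 1.0)))  # nearer = denser mesh
    return [bezier_point(*control_points, i / segments)
            for i in range(segments + 1)]

curve = [(0, 0, 0), (1, 2, 0), (3, 2, 0), (4, 0, 0)]  # all that's stored
print(len(tessellate(curve, distance=1.0)))   # 65 vertices up close
print(len(tessellate(curve, distance=16.0)))  # 5 vertices far away
```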

Another point is that Xenos is designed AROUND 4xAA. The daughter die with the ROPs and eDRAM was designed with this very feature set in mind. You are not hitting the bandwidth limitations you hit on the PC.

And finally, Xenos looks to be more powerful than current GPUs (like the 6800GT mentioned above) and more efficient than traditional GPU designs. If the 6800GT and X800XL can easily chew through 1280x1024 with AA and AF in a "brute power" PC design, I am certain that Xenos--a faster GPU at pixel/vertex shading, in a closed box, with a more efficient design aimed at 720p with 4xAA--will do fine with next-gen games.
 
This is why the Xbox, with a weak 733MHz Celeron, 64MB RAM, and a GF3 GPU, powers games that look MUCH better than what you see on a comparable PC.

Do you really think the fact that Xbox games were made for 640x480 res had nothing to do with that?

Just to add the quote that was mentioned from Julian Eggebrecht:

"This is my single biggest worry," admits Eggebrecht. "Let's put it this way. At 640x480 [standard definition], we're at a point where we can do anything. Anything. Finally. But with high-definition, I think we're at about the same level of challenge when it comes to framerate as we are this generation. You can do a hell of a lot more polygons. You can do a hell of a lot more shaders. But the inherent fill-rate issues are still certainly there. Will it be a 30-frame time? Will it be a 60-frame time? It will be interesting to see."

Though he's certainly enthusiastic about HDTV overall.
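Putting rough numbers on the fill-rate point he's making (plain pixel counts, nothing more):

```python
# Pixel counts behind the fill-rate worry: every shaded pixel, AA
# sample, and blended particle scales with these numbers.

modes = {
    "640x480 (SD)":     640 * 480,
    "1280x720 (720p)":  1280 * 720,
    "1920x1080 (1080)": 1920 * 1080,
}
base = modes["640x480 (SD)"]
for name, pixels in modes.items():
    print(f"{name}: {pixels:>9,} pixels ({pixels / base:.2f}x SD)")
# 720p is 3.00x the pixels of SD and 1080 is 6.75x, which is why the
# same GPU that can do "anything" at 480 hits fill-rate walls in HD.
```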
 
Well, the same goes for the PC: just play a game at 640x480 and at 1280x1024 and compare the IQ and frame rate. Depending on the game, the framerate might double at the lower resolution, and that power could then be used to add more detail.

But in 4 or 5 years people might not want fuzzy pictures anymore and would rather have clear images with about the same amount of graphics as the year or two before. Either way, the graphics will still be a lot better than what is around now.
 
Teasy said:
This is why the Xbox, with a weak 733MHz Celeron, 64MB RAM, and a GF3 GPU, powers games that look MUCH better than what you see on a comparable PC.

Do you really think the fact that Xbox games were made for 640x480 res had nothing to do with that?

A good comparison might be to take HL2 or Farcry and run them on a 733 celery box with a GF3 in 640x480 and compare? =P
 
Teasy said:
This is why the Xbox, with a weak 733MHz Celeron, 64MB RAM, and a GF3 GPU, powers games that look MUCH better than what you see on a comparable PC.

Do you really think the fact that Xbox games were made for 640x480 res had nothing to do with that?

Of course it has something to do with it. But how much is the question.

Back in 2001 the Xbox clearly had games that looked better than PC games, and PC gamers who wanted similar levels of detail needed to buy GPUs that alone cost more than an Xbox. I have friends in a clan who run their PC games at 800x600 because their HW simply cannot do any more.

It took a few years before we started seeing a lot of games that put the Xbox to shame, even at similar resolutions. It was not until 2004, with the first DX9 games (FarCry, HL2), that we started to see a big difference in quality. Yet even then the Xbox got solid versions of Doom3, CoR, and now even HL2. Toss in console exclusives like Conker, Jade Empire, and some of the other better-looking Xbox games, and I think the picture is clear:

We never saw a PIII Celeron with 64MB of memory and a GF3 card output games that looked as good as the Xbox's, even at lower resolutions. You cannot even RUN Doom 3, HL2, FarCry, CoR, etc. on that system. And even on beefier systems (384MB memory, P4 1.4, GF4 4200) those games look like garbage and need to be run with a lot of features turned off.

So yes, the resolution had SOMETHING to do with it, but not much. Remember, there are good-looking Xbox games that support 720p. Sure, not all of them, but looking at the big picture, the Xbox was definitely ahead of the PC in quality for a while.

The same will apply this generation. While PC games are routinely running at 60fps with AA and AF at 1600x1200 or higher, the new consoles will target a lower resolution of 1280x720. Some games will go higher to 1080i/p (just like some current-gen games supported 720p or 1080i), but for the most part I think we will see console games go for 720p. So there will still be that gap between PCs and consoles. And I expect high-end console games to look better than anything on the PC for the next 18 months, even with comparable HW, because consoles are closed boxes and more efficient.
 
gurgi said:
A good comparison might be to take HL2 or Farcry and run them on a 733 celery box with a GF3 in 640x480 and compare? =P

Actually FarCry looks beautiful on Xbox, but it has been remade for it (I think); it is at least as good as the PC at a lower rez IMO.

HL2 also looks very good, but at a lower rez too.
 
Acert93 said:
Some games will go higher to 1080i/p (just like some current gen games supported 720p or 1080i)

Excellent post, agree with what you're saying.

Is there a list of current-gen games which support 720p/1080i? I'd be interested to see how they compare graphically (e.g. don't look as good, fewer effects, but run at higher res). This will be an interesting indicator for next gen: when games have to be 720p, will they lose some of the "gloss" that could have been applied at standard def?
 
PARANOiA said:
Acert93 said:
Some games will go higher to 1080i/p (just like some current gen games supported 720p or 1080i)

Excellent post, agree with what you're saying.

Is there a list of current-gen games which support 720p/1080i? I'd be interested to see how they compare graphically (e.g. don't look as good, fewer effects, but run at higher res). This will be an interesting indicator for next gen: when games have to be 720p, will they lose some of the "gloss" that could have been applied at standard def?

http://www.hdtvarcade.com/ has a pretty good list going. If you can't find a game on there, your best bet is http://hdgames.net/
 
pc999 said:
Actually FarCry looks beautiful on Xbox, but it has been remade for it (I think); it is at least as good as the PC at a lower rez IMO.

HL2 also looks very good, but at a lower rez too.

FarCry... yeah, it looks pretty good, all things considered. I think they nailed down most of the shaders (no big deal considering the hardware). The biggest drop in quality is just the textures and resolution. Gotta wait for the framerate details, but it looks like they ported the engine quite well.

HL2... they need to work on the framerate, but it looks like the closed-box environment is giving them a chance to work on the lighting system and make it better than the PC iteration.

In either game, the only thing we won't see for sure is the full-reflections setting, or whatever it is called, although there are reflections to an extent.
 
Another cool thing about consoles is that you get series of games that you'll never see on a PC (e.g. recent Final Fantasies and various RPGs and other games that are pretty much going to stay on consoles). Seeing a known RPG series in high resolution is very cool. I recently played Star Ocean III in 480p (not really high-res... but better than SDTV) and it looks so fricken sharp.

The only console game that I have experienced at 1080i is GT4. It tends to sparkle a lot :LOL: but it's a great preview of how console-specific games may look at higher resolutions. I just can't wait to see what a 1080p game looks like :devilish:

Shark Sandwich said:
PARANOiA said:
Acert93 said:
Some games will go higher to 1080i/p (just like some current gen games supported 720p or 1080i)

Excellent post, agree with what you're saying.

Is there a list of current-gen games which support 720p/1080i? I'd be interested to see how they compare graphically (e.g. don't look as good, fewer effects, but run at higher res). This will be an interesting indicator for next gen: when games have to be 720p, will they lose some of the "gloss" that could have been applied at standard def?

http://www.hdtvarcade.com/ has a pretty good list going. If you can't find a game on there, your best bet is http://hdgames.net/

WOW...I was looking at the Dreamcast section and was shocked....a BOATLOAD of games support 480p resolutions....wow...
 
WOW...I was looking at the Dreamcast section and was shocked....a BOATLOAD of games support 480p resolutions....wow...

Isn't that crazy? Almost every DC game supported the VGA adapter. Dreamcast was ahead of its time! If only GC, Xbox, and PS2 had followed...
 
Shark Sandwich said:
WOW...I was looking at the Dreamcast section and was shocked....a BOATLOAD of games support 480p resolutions....wow...

Isn't that crazy? Almost every DC game supported the VGA adapter. Dreamcast was ahead of its time! If only GC, Xbox, and PS2 had followed...

Well Xbox did...

BlueTsunami said:
The only console game that I have experienced at 1080i is GT4. It tends to sparkle alot :LOL: but its a great look to the future on how console specific games may look at Higher Resolutions. I just can't wait to see what a 1080p game looks like :devilish:
The 1080i mode in GT4 is not actually real 1080i, but it looks noticeably better than 480p.
 
Dr Evil said:
The 1080i mode in GT4 is not actually real 1080i, but it looks noticeably better than 480p.

I never figured that out... how is it different from true 1080i?
 
Alstrong said:
Dr Evil said:
The 1080i mode in GT4 is not actually real 1080i, but it looks noticeably better than 480p.

I never figured that out... how is it different from true 1080i?
It IS 1080i in the sense that it has 1080 interlaced lines of vertical resolution (nothing more than 540 lines per field). But it doesn't match the 1080i broadcast spec's resolution of 1920x1080. I think the horizontal resolution is 640, and thus it's really not representative of what we should expect from *true* 1080i games in the next generation.
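Taking that 640-pixel guess at face value (it's the poster's estimate, not a confirmed spec), the gap to broadcast 1080i is easy to quantify:

```python
# Quantifying how far GT4's mode falls short of broadcast 1080i,
# assuming the 640-pixel horizontal estimate above.

gt4_field  = 640 * 540     # one interlaced field as described above
spec_field = 1920 * 540    # one field of broadcast-spec 1080i

print(f"GT4 field:  {gt4_field:,} pixels")
print(f"1080i spec: {spec_field:,} pixels ({spec_field / gt4_field:.0f}x)")
# 345,600 vs 1,036,800 pixels per field -- a 3x shortfall, so the
# sharpness gain over 480p comes from the extra vertical lines, not
# from any extra horizontal detail.
```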

And if people want to go on talking about running a PC at Xbox specs, please note that the Xbox has 2 vertex shaders, so a comparable PC card would be an underclocked GF4. And even then, comparing HL2 on the Xbox to HL2 on a similarly spec'ed PC is unfair, as the Xbox version has much lower polycounts on the models. The PC version also doesn't stream, so there's no way 64MB is gonna cut it, even if you've got another 64-128MB on your video card.
 