HD rendering vs. more SFX

So I've been lurking here for a couple of years and thought I'd make a post about something that's been on my mind concerning Microsoft's decision to render everything at a minimum of 720p with the 360.

With this decision to render every game at 720p minimum, how much of a trade-off will we see with games using fewer special effects and graphical features in order to maintain a playable framerate?

How much "graphics horsepower" will gamers still using regular TVs (such as myself) have to sacrifice with each game being rendered internally at 720p, even if we are only seeing standard-res output?
 
Frame-rate not style

turkish a. punkass said:
So I've been lurking here for a couple of years and thought I'd make a post about something that's been on my mind concerning Microsoft's decision to render everything at a minimum of 720p with the 360.

With this decision to render every game at 720p minimum, how much of a trade-off will we see with games using fewer special effects and graphical features in order to maintain a playable framerate?

How much "graphics horsepower" will gamers still using regular TVs (such as myself) have to sacrifice with each game being rendered internally at 720p, even if we are only seeing standard-res output?

Based on what some companies have said before and the decisions made with the first Xbox's games, I think developers will prefer to compromise frame-rate to improve style and effects, so 30fps is likely for many games, with the benefit of improved graphics. In other words, if high resolution is required, I think they will drop the frame-rate to 30fps rather than cut SFX.
 
There's enough raw horsepower in x360 to do more SFX at HD res than we've ever seen so far on any current system. Programmers simply need to learn how to tap the potential, that's all.
 
This is an issue Carmack brought up before. Basically he said he'd prefer to cut resolution before cutting effects/power used per pixel, and that with his next engine this could prove problematic since he won't be able to reduce resolution (on X360 at least).

As an SDTV owner, the benefit you're getting is any additional AA due to downscaling of the image. But with AA supposed to be free on X360, I'm not sure how much you'd notice, especially versus the potentially 3x power per pixel you'd get from rendering natively at SD instead of 720p.
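For reference, the 3x figure is just pixel-count arithmetic. A quick back-of-the-envelope check (standalone C++, assuming 640x480 for SD versus 1280x720 for HD):

```cpp
// Back-of-the-envelope check of the "3x power per pixel" figure.
// Assumes SD = 640x480 and HD = 1280x720.
#include <cstdio>

int main() {
    const double sd_pixels = 640.0 * 480.0;   // 307,200 pixels
    const double hd_pixels = 1280.0 * 720.0;  // 921,600 pixels
    // With a fixed shading budget per frame, the per-pixel budget
    // scales inversely with the number of pixels rendered.
    std::printf("HD/SD pixel ratio: %.2fx\n", hd_pixels / sd_pixels);  // ~3.00x
    return 0;
}
```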

It's definitely a valid question. I suppose the only way to answer it is to see how many X360 owners wind up with HDTVs. On the other hand, sometimes technology needs to be pushed, you need drivers, and HD content is obviously needed to get HDTVs out there, even if we're only really getting started doing that. It's a sort of chicken-and-egg scenario.
 
Titanio said:
This is an issue Carmack brought up before. Basically he said he'd prefer to cut resolution before cutting effects/power used per pixel, and that with his next engine this could prove problematic since he won't be able to reduce resolution (on X360 at least).
I don't think that's accurate. As I recall, he was in favor of having options. So if he came up with a killer effect that would only work real time at 480p, he could go with that. I don't think he knows yet whether performance will be good enough at 720p.

I certainly find that reasonable. As long as the aliased mess that is current-gen games gets fixed, I'm not so enamoured with 720p that I *need* it.
 
I'm not sure what's stopping a developer from rendering an image at a lower resolution and then scaling it up to the output buffer.

Microsoft themselves used to recommend this to people as a way of getting around the Xbox's lack of fillrate compared to the PS2, in situations where a lot of particles were being used.

It won't look as pretty, but with decent filtering it might not look terrible either - especially if you've got some kind of "killer effect" that is making good use of the fillrate you're saving.

For example, rendering at 640x720 and scaling up to 1280x720 would give you double the fillrate (more or less) and is only a 2x scaling horizontally. I'd prefer the horizontal scaling because on a lot of display devices the horizontal resolution is less noticeable than the vertical, and in this case it's a neat ratio that ought not to create any crawling in a moving image.

GT4 does this on PS2 for its 1080i output. It doesn't output a true 1920 pixels, but simply gets the video output circuit to scale up a lower-res image. Doing it that way, you don't need much more memory than for a traditional progressive image (but you need to output at a constant framerate or you get line-doubling, which I think I saw in GT4...)
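To make the idea concrete, here's a minimal CPU-side sketch of the 640x720 -> 1280x720 case (plain C++, purely illustrative - on real hardware you'd let the GPU or the video output circuit do the stretch, but the filtering idea is the same):

```cpp
// CPU sketch of a 2x horizontal upscale: render at 640x720, output 1280x720.
#include <cstdint>
#include <vector>

const int SRC_W = 640, DST_W = 1280, HEIGHT = 720;

// Linearly interpolate each output pixel between its two nearest source pixels.
void upscale_2x_horizontal(const std::vector<uint32_t>& src, std::vector<uint32_t>& dst) {
    for (int y = 0; y < HEIGHT; ++y) {
        for (int x = 0; x < DST_W; ++x) {
            float sx = x * 0.5f;                      // position in source space
            int x0 = static_cast<int>(sx);
            int x1 = (x0 + 1 < SRC_W) ? x0 + 1 : x0;  // clamp at the right edge
            float t = sx - x0;                        // blend weight (0.0 or 0.5 here)
            uint32_t a = src[y * SRC_W + x0];
            uint32_t b = src[y * SRC_W + x1];
            // Blend each 8-bit channel separately.
            uint32_t out = 0;
            for (int shift = 0; shift < 32; shift += 8) {
                float ca = static_cast<float>((a >> shift) & 0xFF);
                float cb = static_cast<float>((b >> shift) & 0xFF);
                out |= static_cast<uint32_t>(ca + t * (cb - ca)) << shift;
            }
            dst[y * DST_W + x] = out;
        }
    }
}

int main() {
    std::vector<uint32_t> src(SRC_W * HEIGHT, 0xFF336699);  // stand-in for the rendered frame
    std::vector<uint32_t> dst(DST_W * HEIGHT);
    upscale_2x_horizontal(src, dst);  // half the pixels shaded, full-width output
    return 0;
}
```

You only shade half the pixels per frame, and because the ratio is an exact 2:1 the even output columns are straight copies of source pixels, so the scaling itself shouldn't add any frame-to-frame crawling.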
 
Honestly, if you own an SDTV I wouldn't recommend getting an Xbox 360 at launch. You are not going to notice a difference in the games' graphics worth the $400 price with these launch games.

For me personally, I am ecstatic that games are a minimum of 720p. I'm always an early adopter of technology and have had an HDTV for over a year now. Current-gen consoles look pretty assy on an HDTV set. Being able to finally play my games in HD is more than worth the price of admission. But if you're still on an SDTV I would just wait till a price drop or Gears Of War...

But to answer your question, From Software stated that without taking high-def into account, the Xbox 360 is 20 times more powerful than the Xbox. But once you take the resolution increase into account, it's more like 10 times more powerful. So the resolution increase is roughly halving the power the system could utilize at a standard-def resolution...
 
I think it is a shame that 720p is mandatory; they should give the option to the devs. After all, what happens if the "Carmack situation" becomes a reality? I (and I think most of us) would prefer the killer SFX. On the other hand, if the games we saw from GoW, WD, KZ2 and MGS are possible in 720p, then I suppose it would be very hard to have such a killer SFX unless it is an exotic one. Anyway, I think devs should have the choice.
 
Li Mu Bai said:
Nothing is free, the penalty simply isn't as taxing.

I know, that's why I said "supposed to" ;) But for example at 640x480, the cost should be limited to the daughter die's computational and bandwidth capacity alone - no tiling costs etc.
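For anyone curious, the tiling point falls straight out of the framebuffer footprint. A rough calculation (plain C++; the 4+4 bytes per sample and the 10 MB daughter-die figure are assumptions based on the commonly quoted specs):

```cpp
// Rough framebuffer footprint vs. the quoted 10 MB daughter-die buffer.
// Assumes 4 bytes colour + 4 bytes depth/stencil per sample.
#include <cstdio>

double footprint_mb(int w, int h, int samples) {
    const double bytes_per_sample = 4.0 + 4.0;  // colour + depth/stencil (assumed)
    return w * h * samples * bytes_per_sample / (1024.0 * 1024.0);
}

int main() {
    const double edram_mb = 10.0;  // commonly quoted daughter-die size
    const struct { int w, h, samples; } modes[] = {
        {640, 480, 4},   // SD with 4xAA
        {1280, 720, 1},  // 720p, no AA
        {1280, 720, 4},  // 720p with 4xAA
    };
    for (const auto& m : modes) {
        const double mb = footprint_mb(m.w, m.h, m.samples);
        std::printf("%dx%d %dxAA: %.1f MB -> %s\n", m.w, m.h, m.samples, mb,
                    mb <= edram_mb ? "fits, no tiling" : "exceeds the buffer, needs tiling");
    }
    return 0;
}
```

On those assumptions, 640x480 with 4xAA comes in just under the buffer, while 720p with 4xAA is roughly three times its size, which is where the tiling cost comes from.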

Inane_Dork said:
I don't think that's accurate. As I recall, he was in favor of having options. So if he came up with a killer effect that would only work real time at 480p, he could go with that. I don't think he knows yet whether performance will be good enough at 720p.

I'm working off memory, but I certainly recall him expressing the concern that there might not be enough power to do what he wants to do with the next engine at 720p. I'll try to find an exact quote regarding what he said here.

edit -

Here it is. He isn't clear whether there will or won't be, but... eventually it stands to reason there won't be, be it the next engine or the one after, etc. ;) I guess the point would then be whether the system will be around long enough for that to be a concern, but since PC games tend to outpace consoles in the mid-to-late cycle, it could well be.

"Obviously things like Quake 4 are running good at the higher resolutions but the next generation rendering technology there are some things like if it comes down to per pixel depth buffered atmospherics at a lower resolution, I’d rather take that than rendering the same thing at a higher resolution. But I’ll be finding out in kind of the next six months about what I actually can extract from the hardware."
 
Hardknock said:
Honestly, if you own an SDTV I wouldn't recommend getting an Xbox 360 at launch. You are not going to notice a difference in the games' graphics worth the $400 price with these launch games.

That's a joke, right?

I suggest comparing an Xbox Morrowind screenshot to a 360 Oblivion screenshot, and then try to say there won't be a substantial difference with a straight face.
 
And here I was, complaining that 720p was not going to be enough resolution, considering 1080p is slowly trickling into the market and will become mainstream in a few years. I guess 720p would be good enough if more titles employed AA. But without AA or clever level and asset integration, 720p does not produce enough visual fidelity, even in a situation where the gamer is many feet away from the display, IMHO. If a title opts for 640x480, it had better be making use of some incredible core rendering technology that taxes the hardware enough to warrant the lower res.

Personally, I prefer nice, clean, consistent graphics with excellent IQ, and see them as second priority, following core rendering technology (lighting, shading models) and taking precedence over additional cinematic effects. An example of how such prioritization results in a visual trade-off can be seen in Doom 3, where special effects take a backseat to core rendering technology, because a lot of hardware at the time of release was not ready for both. Don't get me wrong, I enjoy widespread employment of special-effects shaders, but I believe that if the use of a certain technique has an IQ trade-off, like shimmering or texture aliasing, that cannot be counteracted with an IQ-preserving technique, it should be removed for the sake of maintaining visual fidelity. For example, I would opt for no HDR if it means no AA in a game with lots of high-contrast edges. This is, of course, in situations where the employment of one technique is a detriment to the employment of the other. It all ends up being about an IQ/performance trade-off where priorities must be set. Unfortunately, we might have individual preferences that are not in line with those of the developers.
 