What resolution and framerate should next-gen target *spawn

So apparently I'm the only one that's sick of these so-called "HD" consoles that can't even do full HD graphics? After five years, I've accepted the fact that the current generation simply can't do it, according to developers, but I refuse to just sit back and say that it's okay for them to continue the trend with the next generation.

We're not talking an incremental change with next-gen hardware, these new systems are going to be ludicrously powerful compared to what's come before, just like it is every generation. I see no reason why we have to start limiting ourselves already, unless we're just so used to the consoles being so damned weak that it's okay to cut corners.

Is it really so little to ask that they give us both 1080p graphics and improved lighting, shaders, and effects? We're not in a situation where we have to choose one or the other, you know.
 
So apparently I'm the only one that's sick of these so-called "HD" consoles that can't even do full HD graphics? After five years, I've accepted the fact that the current generation simply can't do it, according to developers, but I refuse to just sit back and say that it's okay for them to continue the trend with the next generation.

We're not talking an incremental change with next-gen hardware, these new systems are going to be ludicrously powerful compared to what's come before, just like it is every generation. I see no reason why we have to start limiting ourselves already, unless we're just so used to the consoles being so damned weak that it's okay to cut corners.

Is it really so little to ask that they give us both 1080p graphics and improved lighting, shaders, and effects? We're not in a situation where we have to choose one or the other, you know.
I'm sick of people holding this "high definition" buzzword in such high regard. All I ever see are people complaining about "sub-HD" resolutions and whatnot, without giving a single thought to the fact that even 1152x640 is a lot higher than the 640x480 standard of last generation.

Obviously come next gen, the consoles will be able to deliver 1080p visuals with better shaders and whatnot than what we have with the current gen of consoles, but if these hypothetical consoles were pumping out 720p visuals instead, the difference between current and next gen would probably be even more pronounced.
 
Not only the 360; both consoles would probably end up sub-HD, especially the PS3. The PS3 has been getting worse multiplats since the beginning of this gen.
There are a lot of 1080p games to more than balance that out. It's far easier to fall below a 720p average when you only have about 5 to 7 total 1080p games to average in.

All of the 'street' games are 1080p, but who cares; most (all?) of the games that are at 1080p aren't very good looking. They make a great argument for using lower resolutions.

And just for completeness, the PS3 games all averaged together will be sub-HD as well.
Maybe they aren't good looking to you, but that's not the case for me. Super Stardust HD looks great to me. Wipeout HD looks great to me. Fat Princess looks great to me. Under Siege looks great to me. MLB: The Show (08 thru 11) look great to me. Etc, etc.

Show me, because the lion's share of PSN games seem to be native 1080p.


I'm not sure I understand the sentiment? Do you think a DVD movie looks worse than any current games on an HDTV? If they could come anywhere close to DVD quality I would be elated. The actual value of how many pixels are displayed should be of almost no importance.
Could I watch a DVD movie without too many problems on my HDTV? Yeah. A game version with that fuzziness/Vaseline look, I could not play. Would you rather watch a Superbit DVD or a low-bitrate HD stream from Netflix? That's the low rez with more per-pixel processing vs. high rez with less per-pixel processing argument right there. I would pick the HD Netflix stream every time.
 
So apparently I'm the only one that's sick of these so-called "HD" consoles that can't even do full HD graphics? After five years, I've accepted the fact that the current generation simply can't do it, according to developers, but I refuse to just sit back and say that it's okay for them to continue the trend with the next generation.
There is a big difference between not being able to do full HD graphics and not being able to do full HD under most conditions. I think you should be a little more accurate with your statements. There are around seven 360 games that are full HD. There are around 40 PS3 games that are full HD. That makes your statement a false one.
 
Maybe they aren't good looking to you, but that's not the case for me. Super Stardust HD looks great to me. Wipeout HD looks great to me. Fat Princess looks great to me. Under Siege looks great to me. MLB: The Show (08 thru 11) look great to me. Etc, etc.

Show me, because the lion's share of PSN games seem to be native 1080p.

http://forum.beyond3d.com/showpost.php?p=1113342&postcount=2

Obviously not a complete list, but there are far more non-HD games (I'd guesstimate more than 70) on that list than 1080p ones.


Could I watch a DVD movie without too many problems on my HDTV? Yeah. A game version with that fuzziness/Vaseline look, I could not play. Would you rather watch a Superbit DVD or a low-bitrate HD stream from Netflix? That's the low rez with more per-pixel processing vs. high rez with less per-pixel processing argument right there. I would pick the HD Netflix stream every time.

You keep missing the point or perhaps just avoiding it.
 
I'm sick of people holding this "high definition" buzzword in such high regard. All I ever see are people complaining about "sub-HD" resolutions and whatnot, without giving a single thought to the fact that even 1152x640 is a lot higher than the 640x480 standard of last generation.

+1... Every game that I've played on the X360 has a much better resolution than any Wii game. I don't even care if the game actually renders at native 720p or not. If it looks better, then it looks better, period.
 
I'm happy with 720p and 30 fps with good 1080p scaling and motion blur.

I would much rather have enhanced graphics and more interesting environments, more/better AI than higher rendering resolution.

I mean the 720p vs 1080p difference only becomes visually perceptible if you either sit close to a smaller HDTV or have a huge 50-inch+ set.

I mean on a 1080p 32-inch set you have to sit no more than 50 inches (about 4 feet) away to perceive all the possible detail available.
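
For what it's worth, that ~50-inch figure matches the usual 1-arcminute (20/20 vision) acuity rule of thumb. A rough Python sketch of the calculation (the function name and the acuity assumption are mine, not from the post above):

[code]
import math

# Rough sketch: at what distance do adjacent pixels on a 16:9 panel stop being
# resolvable, assuming ~1 arcminute of visual acuity (the usual 20/20 figure)?
def max_useful_distance_inches(diagonal_in, horiz_pixels, aspect=(16, 9)):
    w, h = aspect
    screen_width = diagonal_in * w / math.hypot(w, h)  # 32" diagonal -> ~27.9" wide
    pixel_pitch = screen_width / horiz_pixels          # ~0.0145" per pixel at 1920
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcminute)

print(round(max_useful_distance_inches(32, 1920)))  # ~50 inches, i.e. about 4 feet
[/code]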
 
So apparently I'm the only one that's sick of these so-called "HD" consoles that can't even do full HD graphics? After five years, I've accepted the fact that the current generation simply can't do it, according to developers, but I refuse to just sit back and say that it's okay for them to continue the trend with the next generation.
And you're free to complain all you like (as long as you don't spam the board ;)), but that doesn't change the fact that devs face a choice, and it won't stop those of us who may also want 1080p60 from presenting arguments for why it's not to be expected.

We're not talking an incremental change with next-gen hardware, these new systems are going to be ludicrously powerful compared to what's come before, just like it is every generation. I see no reason why we have to start limiting ourselves already, unless we're just so used to the consoles being so damned weak that it's okay to cut corners.
See the whole of the rest of this thread (SOTC S3D).

Is it really so little to ask that they give us both 1080p graphics and improved lighting, shaders, and effects? We're not in a situation where we have to choose one or the other, you know.
No, but we are in a position where choosing a lower resolution means getting better pixels, and we will be for decades to come, no doubt. Hence mandating lower-quality visuals for the sake of image quality that some people don't care about is folly. In fact, if 95% of people would rather have better 720p30 visuals than simpler 1080p60, you'd be in the extreme minority and your request extremely unfair. There's no sense in tying your platform and developers down. Instead, let the market decide what's a tolerable resolution. If people hated sub-HD that much, sub-HD games wouldn't sell.
 
I do not get what you mean about sub 720p in regards to frame rate. Could you expand please?

And I don't get the question I'm afraid. All I'm saying is if you have to lower the resolution to make something worthwhile happen on screen, by all means do it, in this generation or the next.

Also, did you really get consistent 60 fps on PC RE5? And you tried the PS3 version around the same time and didn't feel any difference?

I get something like 120 fps in that game (at 1080p). It's not a very demanding game and my PC isn't too shabby. I didn't try them back to back, but if it takes that kind of comparison to highlight the differences, it can't be such a huge deal.
 
I'm not opposed to 720p games if they use good AF, some AA and run without tearing. 60fps for action games (including shooters) and racers would be nice, but I get the feeling that minimum framerates are mostly limited by single-thread CPU performance and that may be hard to bump up significantly. 1080p with a solid 30fps is probably easier to pull off, even with nice AA and AF.

I don't see big problems with shading quality. Many of the advanced effects current games already use are of rather questionable appeal. For one, I don't want better DOF effects and more advanced colour grading; I want less DOF and less colour grading. Trade those cycles for something else, like better shadow mapping and pristine texture filtering.
 
What is the state of PC cards? Can't you get mid-level cards which can handle 1080p rendering with plenty of effects?
 
So apparently I'm the only one that's sick of these so-called "HD" consoles that can't even do full HD graphics? After five years, I've accepted the fact that the current generation simply can't do it, according to developers, but I refuse to just sit back and say that it's okay for them to continue the trend with the next generation.

We're not talking an incremental change with next-gen hardware, these new systems are going to be ludicrously powerful compared to what's come before, just like it is every generation. I see no reason why we have to start limiting ourselves already, unless we're just so used to the consoles being so damned weak that it's okay to cut corners.

Is it really so little to ask that they give us both 1080p graphics and improved lighting, shaders, and effects? We're not in a situation where we have to choose one or the other, you know.
You're definitely not the only one who thinks that way. You're not alone.

I also identify with your ideas, and just the thought that next gen we will have to buy a new machine only to play at 720p again (yes, there are quite a few sub-HD games, but most games run at, or very close to, 720p these days) makes me uncomfortable... it's not something I'm interested in.

What I find unbelievable is how very smart guys like Alstrong, Andrew Lauritzen, and others show such conformism. I didn't expect that kind of conformity, as if everyone must follow the same basic pattern.

Years ago, back in November 2005 when I bought the Xbox 360, I was in a cyber cafe with some of the regulars there, people who used to play PC games together over LAN, boasting about my brand new console and how it featured a unified shader GPU, a triple-core CPU with 6 threads, etc., etc.

A guy asked me how much it cost, and I said it was 400€, to which he replied: "I would never ever pay 400€ for a console, never in my life." I didn't reply, but maybe now his words make sense. I am not going to pay 300-400€ for a console which can only run games at 720p, AGAIN.

The software doesn't matter that much to me, nor do the online services (even fridges will be online a few years from now). I just want to buy a console that will last many years and can run modern games decently enough. In that sense I am happy with the PS3 and 360.

I just want 1080p; is that asking so much? All I'm asking for is to be happy playing. I don't need much, just a nice little console at home with crisp graphics.

For instance, looking at this game... Driver: San Francisco... It runs at 720p, but I just can't read most of the signs; it's muddy. At 1080p I could discern a lot more, like in real life (I borrowed these two pics from a Digital Foundry article as an example):

http://images.eurogamer.net/assets/articles//a/1/3/9/6/2/3/1/360_aa2.bmp.jpg

http://images.eurogamer.net/assets/articles//a/1/3/9/6/2/3/1/360_z.bmp.jpg

I share a lot of ideas with you. I just wanted to say that you aren't alone.

Nice to meet you, Jedi2016. ;)
 
You have to remember that 1080p is 2.25x the pixels of 720p. If you want to bump the frame rate to 60 fps as well (from the current 30 fps standard), you would need a total of (2.25*2=) 4.5x raw GPU performance just for that. If you want some noticeable extra eye candy on top, you'd likely need around 10x GPU power. But that's not a far-fetched dream, since current PC high-end GPUs already have over 10x the performance of the current consoles.

Personally I would hope for the 10x GPU performance boost. It would be enough for most developers to move to 1080p / 60 fps. Anything less than that, and I suspect we'll see many 30 fps games (and sub-1080p games). At 1080p it's likely that more developers will choose to partially use half resolution (most games already use half res for particles), since 960x540 is much better than 640x360.
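
For reference, a quick sketch of that arithmetic (the variable names are just illustrative):

[code]
# 720p -> 1080p pixel scaling, times the 30 -> 60 fps jump.
res_scale = (1920 * 1080) / (1280 * 720)  # 2.25x more pixels per frame
fps_scale = 60 / 30                       # 2x more frames per second
raw_scale = res_scale * fps_scale         # 4.5x raw GPU throughput for that alone

print(res_scale, fps_scale, raw_scale)    # 2.25 2.0 4.5
# Anything spent on noticeably better pixels comes on top of the 4.5x,
# which is what pushes the back-of-the-envelope estimate toward ~10x.
[/code]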

For instance, looking at this game... Driver: San Francisco... It runs at 720p, but I just can't read most of the signs; it's muddy. At 1080p I could discern a lot more, like in real life (I borrowed these two pics from a Digital Foundry article as an example).
In that case anisotropic filtering would help much more than increased resolution. Fortunately, all new graphics hardware is much better at anisotropic filtering than the current-generation consoles.
 
If you want some noticeable extra eye candy on top, you'd likely need around 10x GPU power. But that's not a far-fetched dream, since current PC high-end GPUs already have over 10x the performance of the current consoles.
Now go 5-8 years into the future, after the next-gen launch, and imagine still playing games with only 10x the current consoles' GPU power.
 
What I find unbelievable is how very smart guys like Alstrong, Andrew Lauritzen, and others show such conformism. I didn't expect that kind of conformity, as if everyone must follow the same basic pattern.
I'm not sure what you mean by conformism in this context... I think we've both just looked at a lot of interactive applications (and targeted tests) with different levels of AA at different resolutions. It's pretty clear when you make a direct comparison that the sampling patterns of good AA are superior to a larger, uniform grid (i.e. higher resolution). In fact, the offline rendering guys are laughing at us because they were publishing papers to this end 20+ years ago ;)

And we're not saying that you won't get 1080p for at least some games. We're just saying rendering is becoming much more complicated and doesn't fit into the simple box of "one resolution" anymore. As sebbbi has mentioned, most games render particle effects at half resolution. Some games render shadows at lower resolutions too. Other games already dynamically change their rendering resolution to maintain a given frame rate. It's going to become more and more common to evaluate various terms at different resolutions, particularly now that many games are using deferred shading (which makes that easier).

The key thing to note here is that if you have a shading term that is low frequency, it is a *waste of cycles* to compute it at full (say, 1080p) resolution. If you compute it at a resolution that more closely matches its frequency and then upsample, you get the same results and use less power.
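
A minimal numpy sketch of that idea: shade a smooth term at half resolution, then upsample it to the full 1920x1080 frame (the stand-in term and the simple 2x filter are my own illustration, not any particular engine's code):

[code]
import numpy as np

def low_frequency_term(h, w):
    # Stand-in for a smooth, slowly varying shading term (e.g. ambient/fog);
    # a real renderer would evaluate its expensive shader here instead.
    ys, xs = np.mgrid[0:h, 0:w]
    return np.sin(xs / w * 3.0) * np.cos(ys / h * 2.0)

def upsample_2x(img):
    # Very simple 2x interpolation by averaging neighbours (edges wrap in this toy).
    h, w = img.shape
    up = np.zeros((h * 2, w * 2), dtype=img.dtype)
    up[0::2, 0::2] = img
    up[0::2, 1::2] = (img + np.roll(img, -1, axis=1)) / 2
    up[1::2, 0::2] = (img + np.roll(img, -1, axis=0)) / 2
    up[1::2, 1::2] = (up[1::2, 0::2] + np.roll(up[1::2, 0::2], -1, axis=1)) / 2
    return up

half = low_frequency_term(1080 // 2, 1920 // 2)  # shade only a quarter of the pixels
full = upsample_2x(half)                         # then fill out the 1080p frame
print(half.shape, full.shape)                    # (540, 960) (1080, 1920)
[/code]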

So I think there's probably a good chance that we can compute visibility and do primary texturing at 1080p on future consoles (and hopefully some good 4x MSAA on top of that for visibility), but I doubt anyone but the lazy will just brute-force compute the entire shader at that rate. It's just not an efficient use of processing resources. And in some cases, even doing visibility and texturing at 1080p might be a waste compared to using the power in some other way of increasing visual quality.

My main point is to urge you guys to stop thinking of this stuff in such simple terms, like whether a game is "1080p" or whatever. It's never that simple and will become even less so in the future. You simply must trust the developers and art direction to make the best trade-offs in terms of getting the best image quality, because it's far more complicated than you are making it out to be.
 
http://forum.beyond3d.com/showpost.php?p=1113342&postcount=2

Obviously not a complete list, but there are far more non-HD games (I'd guesstimate more than 70) on that list than 1080p ones.
That doesn't prove it. However, I've done the math. The PS3's average resolution over 301 listed resolutions is ABOVE HD (1284 x 758, rounded to the nearest pixel). ;)

You keep missing the point or perhaps just avoiding it.
If I'm missing the point, then say what the point is. I don't think I'm missing the point, though.
 
However, I've done the math. The PS3's average resolution over 301 listed resolutions is ABOVE HD (1284 x 758, rounded to the nearest pixel). ;)
You want to compute a median/percentiles here, not average...
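
A toy example of why that matters (the resolution list below is invented for illustration, not taken from the Beyond3D list): a few 1080p titles can pull the mean above 1280x720 even when the median game renders below it.

[code]
import statistics

# Hypothetical mix: mostly sub-720p titles plus a handful of 1080p ones.
resolutions = [(1152, 640)] * 6 + [(1280, 704)] * 4 + [(1920, 1080)] * 3
pixel_counts = [w * h for w, h in resolutions]

hd = 1280 * 720
print(statistics.mean(pixel_counts) / hd)    # ~1.19: the *mean* is "above HD"
print(statistics.median(pixel_counts) / hd)  # ~0.98: the *median* game is below it
[/code]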

As an aside, can we please stop calling it "HD"? This is Beyond3D... we can handle the actual numbers :p
 
That doesn't prove it. However, I've done the math. The PS3's average resolution over 301 listed resolutions is ABOVE HD (1284 x 758, rounded to the nearest pixel). ;)

It's more important that PS3 games run at 1920x1080 because the PS3 doesn't have a proper hardware upscaler. That means anything less than that resolution is at the mercy of the TV's upscaler, which more often than not is of bad quality and results in a blurry image. Yes, legions of PS3 gamers out there are playing games blurrier than they would be on the 360 because of this very issue, and most have no idea about it. For example, Uncharted 2 looks very blurry on my Panasonic 1080p plasma if I just plug the PS3 directly into my TV, because my TV's upscaler is garbage. However, when I first plug the PS3 into an A/V receiver that has a high-quality HDMI 720p->1080p hardware upscaler, Uncharted 2 looks sharp again. My TV is less than 2 years old; imagine how blurry people are playing PS3 games on TVs that are even older!

It's not as important to be at 1920x1080 on the 360 because it has a proper hardware upscaler built into the console, so you don't have to worry about heavy upscale-induced blur.

In games like the Crysis 2 example that was mentioned, the blur is more likely the result of the AA implementation than the upscaling.
 
Why bother? Many likely won't have the seating distance or size of the set correct on their end to discern any difference.
 
You want to compute a median/percentiles here, not average...

As an aside, can we please stop calling it "HD"? This is Beyond3D... we can handle the actual numbers :p
I can understand the psychology behind wanting to use the actual numbers. Saying HD is a line in the sand. Using numbers attempts to erase that invisible line in the mind. HD is a technical term associated with an exact number (1280x720). Surely, we can use a shorter expression than 1280x720 without being looked down upon on Beyond3D. :)

It's more important that PS3 games run at 1920x1080 because the PS3 doesn't have a proper hardware upscaler. That means anything less than that resolution is at the mercy of the TV's upscaler, which more often than not is of bad quality and results in a blurry image. Yes, legions of PS3 gamers out there are playing games blurrier than they would be on the 360 because of this very issue, and most have no idea about it. For example, Uncharted 2 looks very blurry on my Panasonic 1080p plasma if I just plug the PS3 directly into my TV, because my TV's upscaler is garbage. However, when I first plug the PS3 into an A/V receiver that has a high-quality HDMI 720p->1080p hardware upscaler, Uncharted 2 looks sharp again. My TV is less than 2 years old; imagine how blurry people are playing PS3 games on TVs that are even older!

It's not as important to be at 1920x1080 on the 360 because it has a proper hardware upscaler built into the console, so you don't have to worry about heavy upscale-induced blur.

In games like the Crysis 2 example that was mentioned, the blur is more likely the result of the AA implementation than the upscaling.
The only reason MLB: The Show and many other PS3 games are full 1080p is because the PS3 has a standard vertical resolution scaler, huh? It's not because the hardware allows certain games to run at that resolution. Cool. Got it. 720p is blurry on your 1080p plasma TV? I've never had blurry high-bitrate 720p content on my 70" 1080p TV.

In other words, you wouldn't rather have a proper 1080p picture. You would rather have a 720p picture upscaled to simulate 1080p, right? No consumer worth his/her salt should opt for 720p upscaled over native 1080p. Plus, no scaler makes that kind of a difference within the price of a console. I would say correct color reproduction far outweighs the difference in scalers between PS3 and 360. ;)
 