What resolution and framerate should next-gen target?

Status
Not open for further replies.

wco81

Legend
If they can't do 1080p60 with more effects than this generation, maybe it's not time for next-gen yet.
 
My opinion with 720p vs. 1080p is... go for 1080p always. Most, if not all, TVs sold today have 1080p panels. And a lot of the cheaper ones have the worst scalers there are. Plus, if you downscale for a 720p set, you get free supersampling, too.

And framerate... yeah. I think on consoles, framerate is a MAJOR issue. I play mostly on PC these days (as I recently upgraded) and I can play most games at 45Hz and up, but at 1080p with 4xAA and sometimes even more. Usually, I try to dial in the settings so that I can average 60Hz... because I can. But I'd rather have a locked 30Hz than wild framerates jumping up and down all the time. Plus, input lag is also a major issue. I guess a lot of that stems from multithreaded engines today. Though it has gotten a lot better in the recent past.
 
If they can't do 1080p60 with more effects than this generation, maybe it's not time for next-gen yet.
The SOTC HD remake can't manage 1080p60 compared to an SD PS2 game, but I wouldn't say PS3 was early and gamers would have been better off waiting, having to make do with PS2 for another year or two. :p And if PS3 had been later, 2008, then next gen would need to come even later to get the same degree of performance increase. We'd be looking at next-gen in 2016!
 
The SOTC HD remake can't manage 1080p60 compared to an SD PS2 game, but I wouldn't say PS3 was early and gamers would have been better off waiting, having to make do with PS2 for another year or two. :p And if PS3 had been later, 2008, then next gen would need to come even later to get the same degree of performance increase. We'd be looking at next-gen in 2016!

I want next-gen sooner than later.

But it sounds like for any next gen console launched in the next 2 years, rendering would still be at 720p.

Right or wrong, most consumers are going to respond more to resolution than to AA or AF or other types of filtering and effects. Every TV sold, or for that matter any electronics with a display, is now advertised with resolution. People associate the higher resolution number with the sharpest/clearest HD pictures.

The whole render-at-720p-with-more-effects-then-upscale-to-1080p approach isn't going to be appreciated, unless those effects get you photorealism or cinema-quality CGI or something never seen in console or even PC game graphics.

Instead, you might have mobile devices touting 1080p games graphics and a lot of people might think 720p rendered graphics are inferior.
 
Maybe it's just me, but I don't even own a PC other than a couple of laptops and have no interest in gaming on those. I haven't even seen a game running on a PC in at least 6 years. I'm a convenience gamer and consoles are convenient. I believe that PC gaming lost a lot of popularity this gen and next gen will be even worse for PC gaming.

Games are nearly at the point of looking good enough, and if next gen consoles can pull off what I've seen BF3 doing but on a console... it will certainly be good enough. Anything more is too much for the average Joe to notice. Also, if Naughty Dog can do UC3 on a graphics card as crippled as RSX, next gen they should be able to do something incredible at 1080p (not 720 as some of you are suggesting).

I'm suspecting that the next gen Sony and MS systems will provide better development environments and will likely be the last consoles we see... As such they will need to have a long shelf life, and I suspect that a decent amount of RAM will go a long way to ensuring that.
 
It's late and I'm, let's say, a bit tired, but how about opening a real prediction thread where people get only one say on what they believe next gen could be about? No discussion, only one belief, no edits even for spelling or syntax, weigh your words!
It could be fun as next gen is at last getting closer. Time to bet the house :)
 
Right or wrong, most consumers are going to respond more to resolution than to AA or AF or other types of filtering and effects.
I disagree, other than via the silly marketing numbers game. Most gamers don't even know that the majority of games already render sub-720p and upsample, and I doubt the majority of the rest who are aware could single out the games that use higher or lower resolutions. We have an entire thread here dedicated to systematically working out what resolutions games use, after all ;)

That said, *everyone* notices swimming, flickering and other aliasing artifacts. Even though people are somewhat accustomed to it right now, it's still one of the major differences between offline/film and games. A 480p CG DVD still looks way nicer than any of the current games... it's low-resolution, but nicely anti-aliased. People are starting to take this artifact seriously, and rightfully so; it's obnoxious and really screams "game graphics" more than almost any other problem.

I'm a convenience gamer and consoles are convenient.
Me too, and man I can't stand installs/updates on PS3 ;) But that's another topic... in any case it's all really the same technology in the end.

I believe that PC gaming lost a lot of popularity this gen and next gen will be even worse for PC gaming.
Untrue. PC gaming is as popular as ever... in fact if you include stuff like flash/facebook games (and really there's no distinguishing reason not to) it's getting relatively *more* popular than other platforms. On the more "hard-core" games front, all that happened is a whole lot more people got into gaming, which is great :) Few people have actually stopped playing PC games in favor of consoles.

By the way the same situation applies to mobile games... a whole ton of new gamers are being born on those platforms, but it hardly invalidates consoles.

Games are nearly at the point of looking good enough, and if next gen consoles can pull off what I've seen BF3 doing but on a console... it will certainly be good enough.
That's what people said about this generation, and the generation before it. They aren't even close to "good enough" for me. For one thing, the lack of real-time global illumination is just obnoxious, and it's likely that we'll need at least 2 more orders of magnitude of computation power (and probably memory) to begin to acceptably address that. i.e. not this generation, and probably not the next (assuming another console generation even happens, which is up for debate).

Anything more is too much for the average Joe to notice.
Lol... how often would even a child mistake BF3 for real life? Never. It's not even close... there's a wide gap left. Movies barely just got there (in some cases), and they use a million or more times the computational power that games have to do it.

We still need a LOT more computational power to get graphics where they need to be, and even more still once you want good physics and AI.
 
If they can't do 1080p60 with more effects than this generation, maybe it's not time for next-gen yet.

I want next-gen sooner than later.

But it sounds like for any next gen console launched in the next 2 years, rendering would still be at 720p.

Right or wrong, most consumers are going to respond more to resolution than to AA or AF or other types of filtering and effects. Every TV sold, or for that matter any electronics with a display, is now advertised with resolution. People associate the higher resolution number with the sharpest/clearest HD pictures.

The whole render-at-720p-with-more-effects-then-upscale-to-1080p approach isn't going to be appreciated, unless those effects get you photorealism or cinema-quality CGI or something never seen in console or even PC game graphics.

Instead, you might have mobile devices touting 1080p games graphics and a lot of people might think 720p rendered graphics are inferior.
I love that, and I totally agree with you.

60 frames a second is a pipedream for the majority of console games still, outside the odd racer and a handful of games. It's pretty sad.

Developers these days mostly have no option but to squeeze out the best visuals they can, and that means 30 frames for most games so their game looks competitive in videos and screenshots, where true smoothness and speed aren't that important.

I want next gen consoles to be capable to run games like current PCs, which is the only place nowadays where you can play almost everything at 60 fps.

I don't think it's an impossible dream to hope that next gen consoles will be powerful enough to run games at 60 fps simply because they have excess power with nowhere else to go.

I want max res for my display, too: 1080p. Come on, is it so difficult? It has been 6 years!! :oops: since this generation started, and technology has evolved too much for us to still be counting pixels and playing at 720p all our lives. If I wanted to play at 720p regularly I would buy a mobile phone or a tablet 2 years from now and stick to them. But I don't like playing on those, I love consoles.

If there is one thing I've learnt, it's that 1080p is always a good thing. 720p + 4xAA is fine, but at 720p or less it's difficult to discern details in the distance. It just looks muddy at times.

I don't mind waiting 3 or 4 years for next generation to start, but if we are going to play an HD 1.5 version of the current consoles I'd rather keep my console and not buy a new one.

Here is a quote from John Carmack :smile: :mrgreen: :) :yes: :love: I totally agree with. He sums it all up in a couple of lines:

"Sure, we can increase the resolution a bit, maybe go from 30 fps to 60 fps, but we are going to need to see a significant increase in the hardware capabilities before anyone will be able to justify buying a whole brand new console"

What's the point of playing again at 720p with more graphical effects? For a handful of titles it might be a good thing, but for most 1080p is the way to go, imo.

And having to pay 300-400€ for a new console which is the same thing as the old one just with "online services"? -online is not that great, it's fine and all, but way too overrated-.

You don't need to be a tech head to appreciate the increased resolution and framerate. People love fine detail, not having to get closer to the screen to read a sign or any other thing on screen.

Graphics aren't the only thing that needs improvement; physics still has a long way to go. I am playing Crazy Machines Elements as of late and having a blast, because while the game has average graphics -so jaggy- the physics are amazing compared to any other game with mediocre physics out there (Halo, Gears, etc).

The Driver developer also wants 1080p and 60 fps, which means smoothness and eliminating jaggies:

Driver: San Francisco developer Reflections is already thinking about creating a next-gen game engine, and wants 1080p, 60 frames-per-second games as a mandate from the next Xbox and PlayStation 4.

Studio founder Martin Edmonson told Eurogamer this would eradicate a number of challenges developers face creating games on the Xbox 360 and PlayStation 3.

"I would like to see 1080p, 60 frames-per-second as a mandate," he said. "A level playing field where we don't have these dilemmas any more - do we go for detail, do we go for frame-rate? Do we go for effects, do we go for frame-rate? Do we go for resolution or do we go for memory?

"But to have this level playing field where you accept the game is going to be smooth as silk, you accept the resolution is going to be crisp and high, then it's all down to content. That would be something I would consider a load of dilemmas and nonsense out the way. "

Driver: San Francisco, out today in the UK, runs at a cool 60fps in single player - but drops down to 30fps in multiplayer.

The 60fps effect is maintained despite the game rendering a huge, open world for players to drive around in.

http://www.eurogamer.net/articles/2011-09-02-driver-dev-thinking-about-next-gen-now
 
Consider for a moment that from 720p to 1080p and a doubling of framerate requires about 4.5x the raw power ( a naive estimate given there are different bottlenecks as well). Also consider the number of sub-720p games right now, which effectively bumps that to around 6x jump in hardware specs just to maintain the graphical fidelity of this gen. (it'll be a higher multiplier considering the instability of 30fps as I imagine you're asking for zero drops to 60fps)
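The multipliers above are easy to sanity-check with raw pixel-throughput arithmetic; a naive sketch (as the post says, it ignores bandwidth, shader and CPU bottlenecks entirely):

```python
# Naive pixel-throughput multipliers: pixels shaded per second only,
# ignoring bandwidth, shader and CPU bottlenecks.

def pixel_rate(width, height, fps):
    """Pixels that must be rendered per second at a given target."""
    return width * height * fps

target = pixel_rate(1920, 1080, 60)  # the proposed 1080p60 mandate

# 720p30 -> 1080p60: 2.25x the pixels times 2x the framerate
print(target / pixel_rate(1280, 720, 30))   # -> 4.5

# Many current games render sub-720p, e.g. 1152x640 at 30fps
print(target / pixel_rate(1152, 640, 30))   # -> 5.625, i.e. the ~6x figure
```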
 
Forcing all developers to do 1080p60fps is stupid (unless the system is so powerful the requirement is trivial). There's tons of games where they are better off spending the resources in other areas whether it be increased AA or other effects and that option should be open to them.
 
Since 1080p has become the standard for TV makers, I'd say for next gen 1080p 30fps is the sweet spot for the maximum IQ and graphics jump. If you want a 60fps game you can always go down to 720p and still get fairly crisp IQ on current Full HD TVs. This resolution increase is actually in line with the previous console gen, so I don't see any major deviation; as for AA, you can go with either MLAA or FXAA rather than MSAA.
 
What about frame interpolation tech? I thought it looked pretty promising. Is there any chance that any of the big three will implement that kind of solution on a hardware level? Maybe if they cut the controller latency they could somehow squeeze the input lag from frame interpolation into that saved time? We could have games with the responsiveness of current gen 30fps titles that run at pseudo 60fps. For me it seems like a decent compromise since I assume that 1080p60 is pretty much out of the question for the majority of next gen games.
 
60 frames a second is a pipedream for the majority of console games still, outside the odd racer and a handful of games. It's pretty sad.
Framerate isn't tied to hardware performance. If next gen is delayed 3 years, it'll still have 30fps games, and then 20 fps titles at the end of its life like SOTC. Framerate is all about software and the devs picking a target that can be rendered in 1/30th or 1/60th of a second. This gen is perfectly capable of 60 fps, but with less fancy pixels than we currently have. And I wonder if then you'd complain about how unpretty games are? ICO can hit 1080p60 this gen (although it's capped to 30fps), but it's nothing like as good to look at as other lower resolution games.
 
Consoles should go for 1080p30 with 4xAA.

I haven't even seen a game running on pc in at least 6 years. I'm a convenience gamer and consoles are convenient.
One second after I send this comment I can right click on my little Steam icon down in the right hand corner and select "Deus Ex Human Revolution", which will launch right away. Can you do the same? In fact, I could browse this forum in an overlay while playing.
I don't drive to the store to buy a game. I buy, download and start playing.

I do all this in a leather armchair, 2.5 feet from a 30 inch 1600p screen. You say that you own "a couple laptops". I say that if you didn't have to be so mobile with your computing needs (which you clearly don't have to be, but everybody and their mother buys laptops), convenience would not be an issue.
 
Consider for a moment that from 720p to 1080p and a doubling of framerate requires about 4.5x the raw power ( a naive estimate given there are different bottlenecks as well). Also consider the number of sub-720p games right now, which effectively bumps that to around 6x jump in hardware specs just to maintain the graphical fidelity of this gen. (it'll be a higher multiplier considering the instability of 30fps as I imagine you're asking for zero drops to 60fps)

Ok, let's consider it. We'll use Xenos and a 6950 for comparison (a variant of that is within reason of what could be done next gen).

Pixel fillrate - 25.6 vs 4 GP/s = 6.4x
Texel fillrate - 70.4 vs 8 GT/s = 8.8x
Bandwidth - 160 vs 22 GB/s = 7.2x
GFLOPS - 2253 vs 240 = 9.4x
Memory - 2048 vs 512 MB = 4x

So even using your pulled out of somewhere multipliers everything is covered except memory and a jump of that to 4GB (8x) covers that.

Also, why should we consider poor-performing devs from this gen? Many of them have already paid the price for poor performance. Personally, I'll put my trust and give my money to the good devs.
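For anyone who wants to check the ratios in that list, a quick script (numbers exactly as quoted above; the units are my assumption: GP/s, GT/s, GB/s, MB):

```python
# Spec ratios from the post: Radeon HD 6950 vs Xenos (Xbox 360 GPU).
# Numbers as quoted in the thread; units assumed. Note 160/22 is ~7.27,
# which the post rounds to 7.2x.

specs = {
    "Pixel fillrate": (25.6, 4.0),     # GP/s
    "Texel fillrate": (70.4, 8.0),     # GT/s
    "Bandwidth":      (160.0, 22.0),   # GB/s
    "GFLOPS":         (2253.0, 240.0),
    "Memory":         (2048.0, 512.0), # MB
}

for name, (hd6950, xenos) in specs.items():
    print(f"{name}: {hd6950 / xenos:.1f}x")
```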
 
Ok, let's consider it. We'll use Xenos and a 6950 for comparison (a variant of that is within reason of what could be done next gen).

Pixel fillrate - 25.6 vs 4 GP/s = 6.4x
Texel fillrate - 70.4 vs 8 GT/s = 8.8x
Bandwidth - 160 vs 22 GB/s = 7.2x
GFLOPS - 2253 vs 240 = 9.4x
Memory - 2048 vs 512 MB = 4x

So even using your pulled out of somewhere multipliers everything is covered except memory and a jump of that to 4GB (8x) covers that.
Which enables 1080p60 with the same quality visuals we have now, which would be akin to using PS3/XB360 to render PS2/XB games at 1080p60.

Again, framerate isn't decided by hardware but by software. Devs pick a target and write for it. If they choose 1080p60, their graphics will look pretty poopy compared to a game using the same resources and targeting 720p30, which'll look inferior in per-pixel quality to the same hardware rendering 480p30. The devs pick a compromise between per-pixel performance and image fidelity and use the hardware as best they can to reach that.
 
Ok, let's consider it. We'll use Xenos and a 6950 for comparison (a variant of that is within reason of what could be done next gen).

Pixel fillrate - 25.6 vs 4 GP/s = 6.4x
Texel fillrate - 70.4 vs 8 GT/s = 8.8x
Bandwidth - 160 vs 22 GB/s = 7.2x
GFLOPS - 2253 vs 240 = 9.4x
Memory - 2048 vs 512 MB = 4x

Right, so on 40nm we need around a high end piece of hardware today just to replicate 360/PS3 generation visuals @ 1080p 60 (well in excess of 120W). 28nm will make that much more feasible of course. The later they launch, the better.

I assume you want zero drops with 60Hz as well, so considering how 30fps isn't the most solid in a number of games, there'll be some give or take there.

So even using your pulled out of somewhere multipliers everything is covered except memory and a jump of that to 4GB (8x) covers that.
1920x1080 is 2.25x pixels of 1280x720. Mandating 60fps puts double load on that, hence 4.5x. 1152x640 would bump that to about 5.6x. Like I said, it's a rough naive math, but that's just raw pixels.

Clearly you can see the highest jump with shader power, but it's just something to keep in mind. All these mandates are ridiculous because developers and games are not all equal nor have the same performance requirements. Even Crytek had to give up their GI for Crysis 2 on console.

---------

What I'm not so clear on is the logic in the particular case of 1080p + an edge post-process that potentially blurs the entire screen (defeating one of the purposes of higher res) rather than going with a more modest resolution* with 4xAA** whilst having more than double the resources for math and fillrate. Meanwhile, MSAA solves a number of sub-pixel issues that still exists even at 1080p. So then you might say, throw on MSAA @ 1080p and then you compound the RAM requirements that much more with MRTs and also per sample shading...

*Resolutions do exist between 720p and 1080p! And again perhaps there'd need to be some testing to see if people actually notice when you consider the higher levels of AA and more modern hardware scaling.

**Since 4x sample per clock ROPs would be the minimum to expect, and that sort of bandwidth should be doable in the general case.

I'm not trying to point out ignorance, but again these are just some considerations to take when trying to argue a mandated resolution and framerate.
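To put a rough number on how MSAA at 1080p compounds the RAM requirements with MRTs, here's a naive back-of-the-envelope sketch (it assumes 4 bytes per sample per target plus a same-sized depth buffer, and ignores compression and alignment, so real costs differ):

```python
# Rough multisampled G-buffer footprint: with 4x MSAA each render target
# stores 4 samples per pixel, and a deferred renderer multiplies that by
# the number of MRTs. 4 bytes/sample assumed; no compression modelled.

def msaa_bytes(width, height, samples, mrts, bpp=4):
    color = width * height * samples * mrts * bpp
    depth = width * height * samples * bpp
    return color + depth

MB = 1024 * 1024
print(msaa_bytes(1920, 1080, 4, 4) / MB)  # 1080p, 4xMSAA, 4 MRTs -> ~158 MB
print(msaa_bytes(1280, 720, 4, 4) / MB)   # same setup at 720p    -> ~70 MB
```

Even as a lower bound, that gap between the two resolutions is the compounding being pointed at: per-sample storage at 1080p eats the memory budget more than twice as fast.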
 
Which enables 1080p60 with the same quality visuals we have now, which would be akin to using PS3/XB360 to render PS2/XB games at 1080p60.
Who says it does? Al? He's just throwing multiplier numbers out there and I'm throwing them back. He says that you'd need a 6x bump to bring the poor performers up to snuff and I'm saying a 6950 gives you pretty much an 8x bump across the board. Now if Epic is only going to get mediocre results from an 8x bump then fine, that'll be their undoing and I don't have to buy the games made on their engine. I however have quite a few devs that I want to see what they can do with an 8x bump.

Again, framerate isn't decided by hardware but by software. Devs pick a target and write for it.
Devs define the limitations and work within them, if they can't they fall by the wayside.

If they choose 1080p60, their graphics will look pretty poopy compared to a game using the same resources and targeting 720p30, which'll look inferior in per-pixel quality to the same hardware rendering 480p30. The devs pick a compromise between per-pixel performance and image fidelity and use the hardware as best they can to reach that.

Again, who says it will? If an 8x bump across the board is only going to result in modest increases then we truly have reached the point of diminishing returns.

If that's the case, then maybe it's time for a revolution instead of (yet) another evolution.
 
1080p should be the de-facto standard for next gen.

1080p with no AA would still look better and sharper on a 1080p set than 720p with 2xAA.

Frame rate will and should be game specific. Games like Gears, Resident Evil, GTA and loads of others don't really need 60fps and won't benefit enough from running at 60fps to justify the performance drop.

1080p + 2xAA + 8xAF @ 30fps would be perfect for next gen and considering how much developers milk hardware even a mid-range PC chip should suffice.
 
Resolution, frame rate and everything else should be down to the developer to decide. Looking at a game like Black Ops, we can see how unnecessary a mandated 1280x720 framebuffer was.
 