What resolution and framerate should next-gen target *spawn

Who says it does? Al? He's just throwing multiplier numbers out there and I'm throwing them back. He says that you'd need a 6x bump to bring the poor performers up to snuff and I'm saying a 6950 gives you pretty much an 8x bump across the board.

I explain where the multipliers come from.

So ok, you have an 8x bump over the theoretical 4.5-6x increase in raw requirements. The difference will then have to take care of improvements to the rendering quality over the entire life cycle. Is that difference enough for the next 7-10 years? How's the framerate for a 6950 with Crysis 2 @ max? Should we expect that to be alright with the upcoming BF3 @ max settings @ 1080p? Are 2011 visuals ok with you for the next 10 years? Have you thought about the other jump in requirements brought about by FP16? Or are we doomed to 32bpp for the next generation?

Just saying... just asking. *shrug*
 
Right, so on 40nm we'd need roughly a high-end piece of hardware today just to replicate 360/PS3-generation visuals @ 1080p60 (well in excess of 120W). 28nm will make that much more feasible, of course. The later they launch, the better.

I assume you want zero drops with 60Hz as well, so considering how 30fps isn't the most solid in a number of games, there'll be some give or take there.

1920x1080 is 2.25x the pixels of 1280x720. Mandating 60fps puts double the load on that, hence 4.5x. 1152x640 would bump that to about 5.6x. Like I said, it's rough, naive math, but that's just raw pixels.
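
To make that back-of-the-envelope explicit, here is the raw-pixel arithmetic as a tiny Python sketch (pixel count times frame count only; it deliberately ignores bandwidth, geometry, CPU load and everything else):

```python
# Naive pixel-throughput multipliers: pixels per second relative to 720p30.
# Raw pixel count only -- ignores bandwidth, geometry, CPU, etc.

def pixels_per_second(width, height, fps):
    return width * height * fps

baseline = pixels_per_second(1280, 720, 30)   # common current-gen target
sub_hd   = pixels_per_second(1152, 640, 30)   # a typical sub-HD resolution

print(pixels_per_second(1920, 1080, 30) / baseline)  # 2.25x  (1080p30)
print(pixels_per_second(1920, 1080, 60) / baseline)  # 4.5x   (1080p60)
print(pixels_per_second(1920, 1080, 60) / sub_hd)    # ~5.6x  (1080p60 vs 1152x640 @ 30)
```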

Ok, since I'm not a "60 fps or bust" guy I'd say we're more in agreement than not as to what to expect. 16ms is a pretty steep order to meet with current clocks and multiplier bumps in those are just not coming (if ever). This is where revolution instead of evolution is needed.

Clearly you can see the highest jump with shader power, but it's just something to keep in mind. All these mandates are ridiculous because developers and games are not all equal, nor do they all have the same performance requirements. Even Crytek had to give up their GI for Crysis 2.

---------

What I'm not so clear on is the logic in the particular case of 1080p + an edge post-process that potentially blurs the entire screen (defeating one of the purposes of higher res) rather than going with a more modest resolution* with 4xAA** whilst having more than double the resources for math and fillrate. Meanwhile, MSAA solves a number of sub-pixel issues that still exist even at 1080p. So then you might say, throw on MSAA @ 1080p, and then you compound the RAM requirements that much more with MRTs and per-sample shading (rough numbers sketched after the footnotes)...

*Resolutions do exist between 720p and 1080p! And again perhaps there'd need to be some testing to see if people actually notice when you consider the higher levels of AA and more modern hardware scaling.

**Since 4x sample per clock ROPs would be the minimum to expect, and that sort of bandwidth should be doable in the general case.
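
To put very rough numbers on the RAM side of that trade-off, here's a minimal sketch; the four-MRT 32bpp G-buffer layout and 32bpp depth are illustrative assumptions, not any particular engine's setup:

```python
# Rough render-target footprints in MiB for three hypothetical setups.
# Assumes four 32bpp colour MRTs plus a 32bpp depth buffer for the deferred
# cases -- an illustrative layout only.

def mib(width, height, bytes_per_pixel, targets, samples=1):
    return width * height * bytes_per_pixel * targets * samples / (1024 * 1024)

deferred_1080p    = mib(1920, 1080, 4, targets=5)             # ~40 MiB, no MSAA
forward_720p_4x   = mib(1280, 720, 4, targets=2, samples=4)   # ~28 MiB, colour + depth at 4x
deferred_1080p_4x = mib(1920, 1080, 4, targets=5, samples=4)  # ~158 MiB, every MRT at 4x

print(f"{deferred_1080p:.0f} vs {forward_720p_4x:.0f} vs {deferred_1080p_4x:.0f} MiB")
```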

I'm not trying to point out ignorance, but again these are just some considerations to take when trying to argue a mandated resolution and framerate.

Personally, I'm not arguing for mandates, I'm just reacting to expectations of "720p 4xMSAA" as an acceptable target when supposedly the PC guys are already running higher than that and an 8x bump should clearly give us better.
 
Ok, since I'm not a "60 fps or bust" guy I'd say we're more in agreement than not as to what to expect. 16ms is a pretty steep order to meet with current clocks and multiplier bumps in those are just not coming (if ever). This is where revolution instead of evolution is needed.

Sounds like it. :p

---

On a side note, I'd be curious to see what devs do with those HDR compressed texture formats (BC6/7) and what further strain that will put on hardware performance.

And although proper gamma-correct lighting has gone a long way with current 32bpp solutions, I think it's time we all moved on, what with low-precision artefacts being difficult to avoid, especially with deferred setups becoming the norm.
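
As a crude illustration of what the FP16 jump alone does to colour-buffer footprint and traffic, a sketch (assuming one full write per pixel per frame at 1080p60, no overdraw, blending or compression):

```python
# Per-second colour-buffer write traffic at 1080p60 for an 8-bit-per-channel
# RGBA target vs an FP16 one. One write per pixel per frame assumed; real
# workloads add overdraw, blending, reads and (de)compression on top.

width, height, fps = 1920, 1080, 60

def gib_per_second(bytes_per_pixel):
    return width * height * fps * bytes_per_pixel / (1024 ** 3)

print(f"RGBA8   (4 B/px): {gib_per_second(4):.2f} GiB/s")
print(f"RGBA16F (8 B/px): {gib_per_second(8):.2f} GiB/s")   # exactly double
```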
 
Who says it does? Al? He's just throwing multiplier numbers out there and I'm throwing them back.
He gave his rough reasoning, and it's valid back-of-the-envelope calculations.
He says that you'd need a 6x bump to bring the poor performers up to snuff and I'm saying a 6950 gives you pretty much an 8x bump across the board. Now if Epic is only going to get mediocre results from an 8x bump then fine, that'll be their undoing and I don't have to buy the games made on their engine. I do, however, have quite a few devs I'd like to see make the most of an 8x bump.
And half of that bump is going to be spent on achieving the same quality at a higher resolution and framerate. That's just basic logic. Doubling the framerate requires twice the power. Doubling the resolution requires twice the power (twice the shader calcs and pixel draws and BW). So 4x the power nets you the same 720p30 game at 1080p60.

Devs define the limitations and work within them, if they can't they fall by the wayside.
The limits are whatever the hardware provides. Devs then choose how to use those resources. You do understand that 1080p60 isn't used this gen simply because it wouldn't look as good as 720p30, right? It's not as if the console hardware is hard-coded to 720p. And so by that same token, why would 1080p60 be a target next gen if rendering quality doesn't improve?

Again, who says it will? If an 8x bump across the board is only going to result in modest increases then we truly have reached the point of diminishing returns.

If that's the case, then maybe it's time for a revolution instead of (yet) another evolution.
Exactly! Recognising the limits of the brute-force approach is necessary to move on. Hence things like virtual texturing eliminating massive RAM requirements, and AA techniques to get around supersampling. The next major improvements needed for a generational advance in visuals are very processor intensive. Effective GI solutions are needed. Organic animation. You could take the current boxes and double them up just to enable CryEngine 3's GI at a decent quality. Double it again to achieve that at 60fps. Very quickly you're eating into your limited resources, and then it's down to very clever folk to find new solutions to get more from the hardware.
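
Playing that out with a toy budget calculation (the per-feature cost factors below are assumed round numbers for illustration, not measurements of CryEngine or anything else):

```python
# How quickly a fixed raw-power multiplier gets eaten by per-feature costs.
# All the individual factors are illustrative assumptions.

budget = 8.0  # hypothetical raw bump over current consoles

costs = [
    ("1080p instead of 720p", 2.25),
    ("60fps instead of 30fps", 2.0),
    ("realtime GI at decent quality", 2.0),   # assumed cost
]

for feature, factor in costs:
    budget /= factor
    print(f"after {feature:<30} remaining headroom: {budget:.2f}x")

# Whatever is left over is the entire budget for everything else that's
# supposed to look next-gen: materials, animation, density, physics...
```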

Improvements will come with software, and devs will choose their software and resolution according to what they want to achieve. There's no sense in mandating a resolution. Many games from last gen celebrated for their visuals would never have existed if they had been mandated to 720x576 60fps, but last gen we didn't have pixel counters and didn't have gamers grumbling about a number instead of actually looking at the quality of the visuals. Much as I love 60fps, and much as I'd personally mandate a 30fps minimum framerate on any console, the targets next gen will be picked by the developers as it should be.
 
I explain where the multipliers come from.

So ok, you have an 8x bump over the theoretical 4.5-6x increase in raw requirements. The difference will then have to take care of improvements to the rendering quality over the entire life cycle. Is that difference enough for the next 7-10 years? How's the framerate for a 6950 with Crysis 2 @ max? Should we expect that to be alright with the upcoming BF3 @ max settings @ 1080p? Are 2011 visuals ok with you for the next 10 years? Have you thought about the other jump in requirements brought about by FP16? Or are we doomed to 32bpp for the next generation?

Just saying... just asking. *shrug*

Can we add a 4x multiplier for being able to program to the metal, as Epic suggested would help Vita versus iOS?

Personally, I am all for revolution by the way. ;) I would generally expect next-gen to target 1080p @ 60fps though, leaving room for 30fps for 3D and split-screen. There will obviously always be exceptions, and much will depend on how capable the hardware is, but it should be possible. Who knows, the kind of dynamic resolution technique Wipeout uses could become standard, too.
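
A minimal sketch of what such a dynamic-resolution loop could look like (the width range, thresholds and step size are invented for illustration; Wipeout's actual implementation, which reportedly scales horizontal resolution to hold 60fps, isn't public):

```python
# Toy dynamic-resolution controller: nudge the render width up or down based
# on how close the previous frame came to its 16.7 ms budget. All constants
# here are invented for illustration.

TARGET_MS = 1000.0 / 60.0
MIN_WIDTH = 1280
MAX_WIDTH = 1920
STEP      = 64            # adjust in 64-pixel increments

def next_render_width(current_width, last_frame_ms):
    if last_frame_ms > TARGET_MS * 0.95:      # little headroom left: drop resolution
        return max(MIN_WIDTH, current_width - STEP)
    if last_frame_ms < TARGET_MS * 0.80:      # comfortable headroom: claw it back
        return min(MAX_WIDTH, current_width + STEP)
    return current_width                      # inside the comfort band: hold

print(next_render_width(1920, 17.1))  # a heavy 17.1 ms frame drops the next one to 1856 wide
```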
 
Can we add a 4x multiplier for being able to program to the metal, as Epic suggested would help Vita versus iOS?
Good question... and maybe repi and sebbbi et al. should comment; we certainly wanted to get an idea of how low-level access was desired on PC, going by the Huddy comment several months back, as well.

The trend on PC post-DX10 is towards levelling out feature sets whilst gaining greater exposure to the hardware, so I have to wonder how much more will or can be gained on the console side - probably less than in the past. There's also the tighter path between CPU and GPU on console, but that may no longer be as significant an advantage as it was this past generation once future iterations of Fusion etc. arrive.

At least for Microsoft, coding to the metal beyond the 360's version of the DX API seems to be a no-no. Sony may not be hindered similarly, but I'm not familiar with where OpenGL4.2+ stands as far as hardware abstraction goes (as a base for comparing API vs to-the-metal).
 
We can present this another way. PS1 to PS2 saw a resolution increase, same framerate. PS2 to PS3 was a resolution increase, same 30fps framerate. PS3 to PS4 moving to 1080p60 would be both a resolution increase and a framerate increase. The fact that next gen is coming later may enable that, but it looks an optimistic expectation to me. 1080p will probably be the target resolution for those with bigger displays who'd benefit from the fidelity, and it'd be nice if that gave 720p gamers the option of double framerate. Things rarely work out that well though. I'm still shocked at how poor SD IQ is this gen. PS360 on an SDTV should look incredibly smooth, but it doesn't.
 
I explain where the multipliers come from.

So ok, you have an 8x bump over the theoretical 4.5-6x increase in raw requirements. The difference will then have to take care of improvements to the rendering quality over the entire life cycle. Is that difference enough for the next 7-10 years? How's the framerate for a 6950 with Crysis 2 @ max? Should we expect that to be alright with the upcoming BF3 @ max settings @ 1080p? Are 2011 visuals ok with you for the next 10 years? Have you thought about the other jump in requirements brought about by FP16? Or are we doomed to 32bpp for the next generation?

Just saying... just asking. *shrug*

Yes I read your further explanation, thanks.

What BF3 runs at with a 6950 is a good question, where's Repi at when you need him????

Before we go into the minutiae of different systems/details, I think we need to know/come to terms with just how far along we are into diminishing returns. Would 2x a 6950 (or 16x current consoles) give us a significant boost or just modest returns? If just modest returns then why continue down the current pipeline?

I don't know, maybe it's time for change as opposed to incremental increases to the status quo. At least maybe it's time to start talking about change as opposed to the limitations of staying the same. *shrug*
 
He gave his rough reasoning, and it's valid back-of-the-envelope calculations.

LOL!! Mine were done on the actual back of an envelope too!! (It's all I had to write on :LOL:)

The limits are whatever the hardware provides. Devs then choose how to use those resources. You do understand that 1080p60 isn't used this gen simply because it wouldn't look as good as 720p30, right? It's not as if the console hardware is hard-coded to 720p. And so by that same token, why would 1080p60 be a target next gen if rendering quality doesn't improve?

The distinction I was trying to draw from yours was subtle but IMO important: setting a target (like 1080p/60) and then trying to hit it becomes a string of compromises, whereas defining your limitations and then determining how to fill them can lead to creative thinking.

Exactly! Recognising the limits of the brute-force approach is necessary to move on. Hence things like virtual texturing eliminating massive RAM requirements, and AA techniques to get around supersampling. The next major improvements needed for a generational advance in visuals are very processor intensive. Effective GI solutions are needed. Organic animation. You could take the current boxes and double them up just to enable CryEngine 3's GI at a decent quality. Double it again to achieve that at 60fps. Very quickly you're eating into your limited resources, and then it's down to very clever folk to find new solutions to get more from the hardware.

Improvements will come with software, and devs will choose their software and resolution according to what they want to achieve. There's no sense in mandating a resolution. Many games from last gen celebrated for their visuals would never have existed if they had been mandated to 720x576 60fps, but last gen we didn't have pixel counters and didn't have gamers grumbling about a number instead of actually looking at the quality of the visuals. Much as I love 60fps, and much as I'd personally mandate a 30fps minimum framerate on any console, the targets next gen will be picked by the developers as it should be.

What I love about the consoles is that revolutionary ideas can still happen, as opposed to current GPU thinking, which goes something like: the next iteration is a pipe-cleaner iteration of the old iteration, before the new iteration of the old iteration...

Yet, I'm looking at the proposals for the next gen of consoles and it's looking like another iteration.

Worse, there's good reason to believe that the next gen will be the last gen. How disappointing would it be if the last console were just a pipe-cleaner of the old one instead of a revolution into whatever comes next?
 
Would 2x a 6950 (or 16x current consoles) give us a significant boost or just modest returns? If just modest returns then why continue down the current pipeline?
Because no-one's invented a better alternative yet. ;) If you look at global illumination, the first take on that, identifying it as a requirement for realistic images, was a brute-force sampling of multiple light rays. The problems with performance led software engineers to consider alternatives, limited by what processing power has been available. Offline they could look into precomputing, or optimising the number of rays for bits of the scene, which were still too costly for GPUs to do in realtime. Eventually though, realtime engine developers come up with their own tricks thanks to hardware advances. Things like SSAO just weren't possible to invent last gen because we didn't have the programmable processing power, but now that we do, developers are starting to look at new approaches. So now we can explore ideas like virtual texturing or volumetric lighting.

But we are also limited by toolchains and current ways of thinking. A game developer can't really explore voxels or HOS when the tools don't exist to create and use that technology, and remember that developers' main aim is to create games to sell, not to develop new technologies. Few developers have the resources to invest in experimentation with completely new rendering paradigms! When that does happen, we often get a door thrown wide open that leads to a cascade of new developments. e.g. AA was a hardware solution, until Intel needed something to run on Larrabee. They invested in pure research and released a paper on post-AA, which got everyone thinking, and now there are a whole host of post-AA effects. FXAA could have been used day 1 on XB360, but it hadn't been invented. 5 years later we can add it to a game. No-one could have designed PS360 for post-FX AA, but the idea didn't really exist in 2003/2004 when these consoles were being designed. And following that, IHVs can think about changing the way AA is handled in hardware.
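
For flavour, the core idea behind those post-AA filters reduces to something like the sketch below: find luma discontinuities and blend across them. This is a deliberately crude version; real MLAA/FXAA do shape/pattern analysis and directional filtering rather than a plain neighbourhood blend:

```python
# Crude post-process AA sketch: detect luma edges and soften them by blending
# with the 4-neighbourhood. Nowhere near MLAA/FXAA quality -- illustration only.

def luma(rgb):
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def post_aa(image, threshold=0.1):
    """image: 2D list of (r, g, b) tuples in [0, 1]; returns a copy with softened edges."""
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighbours = [image[y - 1][x], image[y + 1][x], image[y][x - 1], image[y][x + 1]]
            # Treat a strong luma difference against any neighbour as an edge
            if max(abs(luma(image[y][x]) - luma(n)) for n in neighbours) > threshold:
                samples = neighbours + [image[y][x]]
                out[y][x] = tuple(sum(s[i] for s in samples) / len(samples) for i in range(3))
    return out
```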

Similarly you can't expect any company to design a console around a new graphics technology that doesn't exist yet! The best we can do is provide the best performance and programmability and see what developers do with it. Technologies will evolve and then, next gen, we can shift the hardware in a more effective direction. But like all human progress, it's a stepwise process, one advance at a time.
 
What I love about the consoles is that revolutionary ideas can still happen, as opposed to current GPU thinking, which goes something like: the next iteration is a pipe-cleaner iteration of the old iteration, before the new iteration of the old iteration...
Except that the revolutionary thinking is built on the same GPU hardware progress. The thing that enables consoles to lead to better innovations is fixed hardware in a viable platform over 5+ years. Developers have both the time and a need to think of new approaches. On the PC, as Crytek amply demonstrates, they can get away with conventional thinking because within 18 months whatever limits their game hit will have been broken by new hardware.

A new generation of consoles with access to DX11 or 12 level facilities will put a whole load of new tools in developers' hands, like geometry shaders and tessellators, that'll be fully explored. It'll be good. ;)
 
After a whole generation of (mostly) not being able to meet the promise of "full HD gaming", they need to give us 1080p next time.

Today, most developers go lower than that to get extra performance. I would hope that the next-gen consoles would remove the choice: That the hardware can only render at 1080p, period. It can then downscale it if it has to for older TVs, but no more taking shortcuts. Otherwise, the developers will do the same thing they're doing today.. taking the game down to 720p or lower so they can get some fancier effects or something.

Whether or not that's actually going to happen, I have no idea. I think the next-gen will see a huge jump in the quality of shaders and lighting, just like we got this gen. Beyond that, I can only hope they actually give us HD this time.

As for framerate, I'd be happy with 30fps for most titles. And that would be a fixed 30fps that never dips and always has v-sync on. Some games can target 60fps if they want, but 30fps works more than adequately most of the time, especially if you add in a little motion blur to smooth things out.
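
A locked 30 with v-sync basically amounts to a pacing loop like this sketch (simplified: it paces against a wall clock with sleep, whereas a real engine presents on the display's vblank, and a frame that runs long simply slips to the next 33.3 ms boundary):

```python
import time

# Simplified fixed-30fps pacing: never present early; a late frame slips to
# the next 33.3 ms boundary (a visible hitch). Real engines sync to vblank.

FRAME_BUDGET = 1.0 / 30.0

def run(frames, simulate_and_render):
    next_deadline = time.perf_counter() + FRAME_BUDGET
    for i in range(frames):
        simulate_and_render(i)
        now = time.perf_counter()
        if now < next_deadline:
            time.sleep(next_deadline - now)          # hold the frame for a steady 30
        next_deadline = max(next_deadline, now) + FRAME_BUDGET

run(3, lambda i: time.sleep(0.010))  # a 10 ms "frame" still presents every ~33 ms
```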

As for what they're going to show us... do you think EA will use the same video they used in 2004 to show us what next-gen Madden will look like? They could, really.. they didn't get anywhere close to that target this generation.
 
After a whole generation of (mostly) not being able to meet the promise of "full HD gaming", they need to give us 1080p next time.

Today, most developers go lower than that to get extra performance. I would hope that the next-gen consoles would remove the choice: That the hardware can only render at 1080p, period. It can then downscale it if it has to for older TVs, but no more taking shortcuts. Otherwise, the developers will do the same thing they're doing today.. taking the game down to 720p or lower so they can get some fancier effects or something.

I hope that never comes about. I want the devs to have more choices, not fewer. Anything which artificially limits their creative options should be abhorred.

I would like for next-gen to target higher resolution and framerate, but I'm only expecting 720p@30fps with 4x AA to be the typical setup used.
 
Frame rate will, and should, be game-specific. Games like Gears, Resident Evil, GTA and loads of others don't really need 60fps and won't benefit enough from running at 60fps to justify the performance drop.

Have you played games like RE5 and GTA4 at 60 fps (on the PC)? GTA in particular feels much better, especially the driving sections. I haven't played GeoW, but I sometimes dream about Uncharted at 60 fps. And then I wake up.....:(
 
Have you played games like RE5 and GTA4 at 60 fps (on the PC)? GTA in particular feels much better, especially the driving sections. I haven't played GeoW, but I sometimes dream about Uncharted at 60 fps. And then I wake up.....:(

RE5? Yes, I did. It didn't play any better or worse than my PS3 version, and it looked pretty much the same too. If achieving something special means you have to resort to sub-720p, even in the next generation, I certainly won't be holding it against you.
 
Forcing all developers to do 1080p60 is stupid (unless the system is so powerful the requirement is trivial). There are tons of games where they'd be better off spending the resources in other areas, whether it be increased AA or other effects, and that option should be open to them.

Not talking about forcing requirements.

Talking about HW capable enough that developers don't have to make the trade-off.

Current gen designs were locked in what, 6, 7 or 8 years ago probably? If the next gen doesn't launch for another 2 years at least, are we saying in 8-10 years, you can't get 1080p60 with effects for $300-400?

If they can't, maybe they should wait, or not bother. Maybe it won't be enough of a jump for people to buy consoles when they can spend money on other gadgets in this day and age.
 
I'm sure some racers could hit 1080p60; however, I wouldn't expect someone making a stealth-based FPS to even try, as they could benefit more from AA, better lighting and other effects. So I'm sure you'll see games with that, but should a launch title be mandated to do it? Not if they have to sacrifice anything they wanted to do to make it happen.

And just like developers shouldn't need to be burdened with a mandate, Sony and MS aren't going to delay because they can't meet someone's arbitrary target of what next gen should be. It just needs to be a significant jump in fidelity to justify its existence. 720p with AA, more effects, and better lighting and shadows can certainly be that jump.
 
Not talking about forcing requirements.

Talking about HW capable enough that developers don't have to make the trade-off.

That level of hardware capability doesn't exist and probably won't for a very long time. It's like asking for a number big enough that it doesn't get smaller when you divide it by 2.

Current gen designs were locked in what, 6, 7 or 8 years ago probably? If the next gen doesn't launch for another 2 years at least, are we saying in 8-10 years, you can't get 1080p60 with effects for $300-400?

Current systems can do 1080p 60fps with "effects" though!

With GPUs becoming more flexible, there may be additional demands on what were previously graphics-only devices, too.
 
... and as for AA, you can go with either MLAA or FXAA rather than MSAA.
Except MLAA and FXAA both look noticeably worse than MSAA. They are obviously better than no AA, but note that they are a compromise in and of themselves.


1080p with no AA would still look better and sharper on a 1080p set than 720p with 2xAA.
Debatable, and regardless we should be aiming for 4x MSAA as a baseline moving forward. It's a more symmetric pattern and really the point at which it starts hitting massive diminishing returns (i.e. 8x only looks a tiny bit better than 4x).

Can we add a 4x multiplier for being able to program to the metal, as Epic suggested would help Vita versus iOS?
It's not even close to 4x except for constructed cases. More importantly, the vast majority of the overhead is on the CPU, so it doesn't really add any additional GPU power.

e.g. AA was a hardware solution, until Intel needed something to run on Larrabee. They invested in pure research and released a paper on post-AA, which got everyone thinking, and now there are a whole host of post-AA effects.
MLAA was fairly unrelated to Larrabee for the record. It came from the labs/ray-tracing side. Larrabee can do MSAA quite well... in fact often better than conventional GPUs since it's a binning rasterizer (and thus can accumulate and resolve in-core).

I would hope that the next-gen consoles would remove the choice: That the hardware can only render at 1080p, period.
That's not possible, nor a feasible direction. The future is multi-frequency evaluation of different shading terms (already in your current games, particles are blended at lower resolutions and upsampled), so it's not as simple as "mandating a resolution" anymore, and rightfully so. Evaluating everything at pixel rate is wasteful.
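
A minimal illustration of that multi-rate idea (an expensive term, here particles, rendered at half resolution, upsampled and composited over the full-resolution opaque pass; nearest-neighbour upsampling is used for brevity where real engines use depth-aware filters to avoid halos):

```python
# Multi-frequency shading sketch: particles shaded at half resolution, then
# upsampled and composited over the full-res opaque buffer. Nearest-neighbour
# upsample only -- real engines use depth-aware (bilateral) upsampling.

def upsample_nearest(half_res, full_w, full_h):
    return [[half_res[y // 2][x // 2] for x in range(full_w)] for y in range(full_h)]

def composite(opaque_full, particles_half, full_w, full_h):
    particles = upsample_nearest(particles_half, full_w, full_h)
    out = []
    for y in range(full_h):
        row = []
        for x in range(full_w):
            pr, pg, pb, pa = particles[y][x]          # premultiplied-alpha particle colour
            r, g, b = opaque_full[y][x]
            row.append((pr + r * (1 - pa), pg + g * (1 - pa), pb + b * (1 - pa)))
        out.append(row)
    return out

# Pixel-shading work for the particle pass drops to a quarter of full rate:
print((960 * 540) / (1920 * 1080))   # 0.25
```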
 
Imagine this scenario: Far Cry @ 1080p 60Hz vs Crysis @ 720p 30Hz.

Crysis would look way better, and IMO would feel just as smooth, thanks to motion blur and per-object motion blur, which really take the edge off the jerkiness at lower frame rates.

Someone else mentioned the frame-interpolation thingy up there; if it takes significantly less time to interpolate a median frame in between rendered frames, then it would work well, almost like an extension of motion blur.
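
At its crudest, that interpolation amounts to the sketch below: a naive single-source warp along per-pixel motion vectors. The hard parts (occlusion and disocclusion handling, blending both neighbouring frames) are exactly what this toy version leaves out:

```python
# Naive motion-compensated mid-frame: for each output pixel, step back half-way
# along its motion vector and fetch the colour from frame A. Occlusions and the
# contribution of frame B are ignored -- illustration only.

def interpolate_midframe(frame_a, motion, width, height, t=0.5):
    """frame_a: 2D list of colours; motion[y][x]: (dx, dy) A-to-B motion in pixels,
    assumed valid at the mid-frame pixel (a simplification)."""
    mid = []
    for y in range(height):
        row = []
        for x in range(width):
            dx, dy = motion[y][x]
            sx = min(width - 1, max(0, int(round(x - dx * t))))
            sy = min(height - 1, max(0, int(round(y - dy * t))))
            row.append(frame_a[sy][sx])
        mid.append(row)
    return mid
```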

But in the end it comes down to compromises, so more options are better.
 