Could next gen consoles focus mainly on CPU?

I have a PS4 Pro connected to a Samsung KS8000 and played checkerboarded games like Horizon Zero Dawn, Rise of the Tomb Raider or Witcher 3. Image quality like that is incredible and imo completely sufficient for next-gen. If I had the choice between playing those games at 2160c/60fps or at 2160p/30fps, I would always choose the former. I recently played The Last of Us Remastered on my Pro, and it was such an incredible experience at 1800p/60fps. The high framerate really took the immersion to the next level, really amazing.
It's definitely better. But there are games that really need it and those that don't. The question here isn't whether we'll use Jaguar again for next gen (we won't). But when we switch over to the next CPU, does it need an _even larger_ allocation, such that we're changing the ratio of CPU to GPU silicon further in the CPU's favour? The real question is: we're going to get a huge boost in CPU power just by upgrading to Zen, so how much more does the CPU side really need to scale, even with all the technologies developed to offload CPU work to the GPU? That's always been my stance, and I think the same ratio, with Zen, is already more than sufficient. They could probably even tune it smaller, to be honest.

With checkerboarding, the GPU needs to render 50% fewer pixels compared to 2160p. I'm not a graphics programmer, but from what I have heard the only obstacle here could be memory bandwidth, because they might need a 4K framebuffer compared to 1440p-ish resolutions with a similar pixel count. And the games I listed above are using CB for both axes.
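Just as a back-of-the-envelope comparison of the pixel counts (rough numbers from my side, so take with a grain of salt):

```python
# Rough pixel-count comparison (my own back-of-the-envelope math, not dev figures)
native_4k = 3840 * 2160      # ~8.29 M pixels per frame
cb_4k     = native_4k // 2   # 2160 checkerboard shades roughly half of those from scratch
qhd       = 2560 * 1440      # ~3.69 M pixels

print(f"native 2160p : {native_4k/1e6:.2f} M px")
print(f"2160c        : {cb_4k/1e6:.2f} M px shaded (+ reconstruction cost)")
print(f"1440p        : {qhd/1e6:.2f} M px")
```

So the fresh shading work for 2160c lands in the same ballpark as 1440p, while the framebuffer (and hence bandwidth) is still sized for full 4K, which is where my concern comes from.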
It's not 50% less. You'd have to have 1/2 resolution in both the X and Y axes, which is not as common a technique from what we've read. With checkerboarding, it's: render X% fewer new pixels from scratch + render the remaining 100-X% with a less demanding algorithm (than net new) that takes the surrounding pixels and the values of the last frame and outputs a new pixel from that.

Which titles are those? Genuinely curious, can't think of any at the moment. Also not sure what you are arguing - are you saying that devs don't need twice the amount of GPU power to render a 2160c game at 2160p?
So, for instance, Witcher 3 is 4K CBR + dynamic resolution on PS4 Pro. It's native 4K on X1X and has more happening on it. There are quite a few titles like this if you are willing to scan through DF articles.

4Pro is 4.2TF, X1X is 6TF. 4.2 is not 50% of 6; it's 30ish% less power. In most games we've seen, the 4Pro sits about 30% or so lower in native resolution compared to the 1X. So checkerboarding is likely taking that number to 4K. So it's about a 30% benefit, not 50%, conservatively speaking.
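To put that arithmetic in one place (a rough sketch assuming pixel throughput scales linearly with teraflops, which it doesn't exactly):

```python
# Rough compute comparison; assumes pixel throughput scales roughly linearly
# with teraflops, which is a big simplification.
ps4_pro_tf = 4.2
x1x_tf     = 6.0

print(f"Pro vs 1X: {1 - ps4_pro_tf / x1x_tf:.0%} less compute")  # ~30%

# Naive guess at the Pro's native pixel budget if the 1X does native 4K:
native_4k  = 3840 * 2160
pro_budget = native_4k * ps4_pro_tf / x1x_tf
print(f"~{pro_budget/1e6:.1f} M px vs {native_4k/1e6:.1f} M px")  # ~5.8 M vs 8.3 M
```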

60fps is a lot more immersive than 30fps ever can be. All my 60fps single-player games have been among my most memorable gaming experiences.

I really think next-gen could be the generation where we see more 60fps games than ever, thanks to exciting tech like Ryzen at 7nm, VR and ever-increasing dev budgets.

I also think that if Sony were to announce a 60fps standard (or a mandatory 60fps option) for next-gen at their conference, it would be the ultimate mic-drop moment and would probably break the internet. I can't think of a more impactful announcement they could make; people would go completely crazy. Something like this would also really help distinguish next-gen from mid-gen consoles, because we wouldn't just see cross-gen games at higher res, but a completely improved gameplay experience.

An enforced standard would kill their platform hype imo. Developers know what is best for their game. The most you can wish for is two versions of the game to choose from. Enforcing 60fps would probably turn out really badly.
 
That'd be quarter res. Every other pixel is, by definition, half the number of drawn pixels.
OH whoops. Right. That’s true. Just realized that you just need to shift the next set over by one to make the pattern.

Still not 50% less resource intensive.
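For anyone picturing the pattern, here's a toy sketch of the idea (the names and the straight-copy resolve are mine, not any engine's actual implementation): even frames shade the "black" squares of the board, odd frames the "white" ones, and the missing half is filled from the previous frame.

```python
import numpy as np

def shade_half(frame_idx, width, height, shade):
    """Shade only half the pixels this frame: (x + y) parity alternates per frame."""
    img = np.full((height, width), np.nan)
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == frame_idx % 2:
                img[y, x] = shade(x, y, frame_idx)
    return img

def reconstruct(current, previous):
    """Fill the unshaded half from the previous frame. A real resolver would also
    use motion vectors, neighbour filtering and ID buffers, not a straight copy."""
    out = current.copy()
    holes = np.isnan(out)
    out[holes] = previous[holes]
    return out
```

Exactly half the pixels are freshly shaded each frame, which is where the "half the drawn pixels" figure comes from; the reconstruction pass is the extra cost that keeps it from being a straight 50% saving.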
 
Last generation they "could not do" 60 fps; this generation has a more powerful CPU and GPU and still isn't "doing" 60 fps.
What makes you think next generation with even more powerful CPU/GPU will do 60 fps? They will most likely spend the extra resources on more eye candy.

https://web.archive.org/web/2010011...iacgames.com/blogcast/blog/mike_acton/1503082

Or let me turn it around: what would the price of the console have to be to achieve this? And how would you get developers to stick to your 60 fps rule?
I said nothing about a 60 fps rule. You said 60 fps is a choice, and I'm saying it's not, not really. Due to the weak CPU this gen most games can't sustain 60 at any resolution. With a good CPU, you could have two settings, prefer resolution and prefer performance, and the latter would be able to maintain 60 with a dynamic scaler. That is a real choice.
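A "prefer performance" mode with a dynamic scaler is conceptually simple; a minimal sketch of the feedback loop (hypothetical numbers and names, not how any shipping engine does it) would look something like this. The catch this gen is that if the CPU part of the frame alone takes longer than 16.7 ms, no amount of resolution scaling gets you to 60, which is exactly the weak-CPU problem.

```python
TARGET_MS = 1000.0 / 60.0   # 16.7 ms frame budget for 60 fps

render_scale = 1.0          # fraction of native resolution per axis

def update_render_scale(gpu_ms):
    """Drop resolution when the GPU is over budget, creep back up when there's headroom."""
    global render_scale
    if gpu_ms > TARGET_MS:
        render_scale = max(0.5, render_scale * 0.95)   # shed pixels, floor at 50%
    elif gpu_ms < TARGET_MS * 0.85:
        render_scale = min(1.0, render_scale * 1.02)   # creep back toward native
    return render_scale
```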
 
Some games require 60, no one disagrees here. But not all games do, and not all developers want to be limited in that fashion.

Let's be hyperrealistic here, it's 1st/3rd person shooters/platformers/fighters/racers @ 60fps vs Final Fantasy @ cinematic rates. :p
or maybe FF framerate should be converted to the Richter Scale :mrgreen:

/s
 
What if you want to do cities like AssCreed Unity or Witcher 3, but with three times the crowds and four times the quality of the behaviours and the animation?

What if you want to do a GTA VI with more cars, more players, huge crowds and better AI, and destructible environments?

What if PUBG 3 has even greater scope with an even more complex sandbox? And what if people like that??

What if we dared think for a moment that gameplay doesn't stay at the same limits defined by 2013 CPUs, and that innovation will find a way even if console hardware and publishers initially try to shun it (e.g. PUBG)?

Even if you want to stick with 30 fps, today's CPUs offer a bleak-as-fuck outlook for large scale immersive sandbox experiences.

God forbid you should want the likes of The Witcher or GTA to run at 60 fps and increase in ambition.
 
No one is saying to stick with the old CPUs. I'm just being realistic here. Technology, both hardware and software, works in roadmaps; we know what's coming before it arrives. This idea that we jump straight from small groups of actors to huge ones isn't realistic; it will be gradual, like we would expect it to be.

Fun is always going to be the most important metric, and having more of everything is not necessarily more fun. Most AIs are tuned down so that they're easily beatable, for instance; it's not that we can't make them complex.

If we want tons of AIs and tons of behaviours, we already have online games that can do exactly this today, and we've seen it used as such. Why not offload it the way they do for Warzone in Halo 5, or the way they do for Titanfall? Or if Crackdown ever comes out soon? Why not like that?

There are a variety of ways to solve the same problem. The key is to find the right balance.
 
I said nothing about a 60 fps rule. You said 60 fps is a choice, and I'm saying it's not, not really. Due to the weak CPU this gen most games can't sustain 60 at any resolution. With a good CPU, you could have two settings, prefer resolution and prefer performance, and the latter would be able to maintain 60 with a dynamic scaler. That is a real choice.

60 fps is about choices. You can get 60 fps with any CPU as long as you design your game for it, but you cannot have 60 fps + the best of the best pixel IQ + lots and lots of physics/AI/crowds. And even next gen, developers will probably want to push the envelope, and then it's about what they want to push: more physics/AI, 60fps, or pixel IQ; choose two :)

If you had as much time and money as you wanted to make a game, sure, you could make a 60 fps mode and a 30 fps mode, etc. But the more options, the more bugs to fix, features to support and QA to be done. And the more you spend on making it, the lower the chance of making it back.

And when it comes to the console, if we go with a 400 USD console, how do you budget the hardware to be sure you get 60 fps for all the games that need it?
 
I would say no. CPU improvement is absolutely needed, but visuals sell because they're easy to market to your average consumer. The next set of consoles shouldn't be released until a noticeable graphics improvement can be seen.
This
And expect most games to be cross-gen early on, making it even harder to notice the difference right away. Both manufacturers will need their Killzone: Shadow Fall, Ryse or Deep Down at or close to launch. Those showed a solid difference.
My list for possible PS4 + 5/Xbone+Nextbox titles.
Cyberpunk 2077
Death Stranding
Final Fantasy 7 remake
Beyond Good and Evil 2
That Avalanche Studios project that has been confirmed to be in development for next-gen systems.
 
Or let me turn it around: what would the price of the console have to be to achieve this? And how would you get developers to stick to your 60 fps rule?
There will never be such a rule, and for the better, believe me. It would basically kill the whole idea of VG creation. How boring would it be if every game was 60 fps with average visuals?
I don't mind some typical 3rd person game @ 30. Uncharted 4 and AC: Origins performance has been close to excellent in my playthroughs (except for that one heavy stutter that lasted a few seconds in Origins, but what are a few seconds compared to an enormous story mode?).
Heck, I've played racing games @ 30fps on PS3 and got used to it pretty quickly. No biggie.
 
How boring would it be if every game was 60 fps with average visuals?
The same argument could be used for high resolutions.
"How boring would it be if every game was UHD with average visuals?"

UHD is much more expensive than 60fps. I think high resolutions at 30 fps are very strange, as if the resources weren't being used properly. On PC I would never raise the resolution so much that I get less than 60fps, even if I'm using a gamepad as in AC: Origins.

The CPU only has to feed the GPU, and as long as the game does not simulate very much or have very high NPC/player counts, a normal CPU should be able to handle 60fps. Accordingly, the frame rate in the future depends mainly on the GPU and what it is used for. If a game runs at a very high resolution, it should also be able to run at 60fps at a lower one. Why should typical linear games ever be 30fps again? The only games I've seen where even an 8700K at 5.2 GHz with 16GB RAM at 4266 MHz limits the frame rate when using a GTX 1080 Ti at 2.1 GHz are PUBG and Star Citizen Alpha 3.0. In the first, you want very high frame rates over 120fps and the engine is poorly optimized; the second is not yet optimized and very ambitious.
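To spell out that budget logic (simple arithmetic with made-up numbers, and a simplification of how engines actually pipeline CPU and GPU work):

```python
# Illustrative frame-time budget check (hypothetical numbers).
def max_fps(cpu_ms, gpu_ms):
    """With CPU and GPU work pipelined, the slower of the two limits the frame rate."""
    return 1000.0 / max(cpu_ms, gpu_ms)

print(max_fps(cpu_ms=8.0,  gpu_ms=14.0))   # ~71 fps: GPU-bound, lower resolution helps
print(max_fps(cpu_ms=22.0, gpu_ms=14.0))   # ~45 fps: CPU-bound, resolution can't save it
```

Resolution only shrinks the GPU side of that equation, which is why a game that hits a very high resolution at 30fps should usually be able to hit 60 at a lower one, unless the CPU is the limit.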

For example, a Wildlands jungle scenario uses 98% of this ~15-teraflop Nvidia GPU at about 65fps in 1440p, and 25-30% of the CPU even with TurfFX (8700K). In areas like that, this game already has next-gen GPU requirements on maximum settings, which look much better.
 
The same argument could be used for high resolutions.
"How boring would it be if every game was UHD with average visuals?"
Sure, good point!
UHD is much more expensive than 60fps. I think high resolutions at 30 fps are very strange, as if the resources weren't being used properly. On PC I would never raise the resolution so much that I get less than 60fps, even if I'm using a gamepad as in AC: Origins.
Yeah, I can understand that, and I think most PC gamers used to high framerates would do the same, although there will always be a handful who pick 30fps, since it's not a nightmare framerate, just like some console players pick higher res/downsampling over performance modes.

Heavily fluctuating framerates are what I'd consider trash.
 
There will never be such a rule, and for the better, believe me. It would basically kill the whole idea of VG creation. How boring would it be if every game was 60 fps with average visuals?
I don't mind some typical 3rd person game @ 30. Uncharted 4 and AC: Origins performance has been close to excellent in my playthroughs (except for that one heavy stutter that lasted a few seconds in Origins, but what are a few seconds compared to an enormous story mode?).
Heck, I've played racing games @ 30fps on PS3 and got used to it pretty quickly. No biggie.

60 FPS always looks better than 30 FPS, IMO. The judder in 30 FPS games just looks absolutely horrible. It's one reason why, even though HZD looks great in screenshots, it'd never win best looking game of the year for me. Once players start moving it just falls apart, and if they do rapid view changes, it's even worse.

30 FPS is absolutely better for static screenshots, however. :p

But I understand that not everyone is as affected by the judder and hitching inherent in 30 FPS (even with motion blur techniques; even more yuck there), not to mention the mushy feeling of controls at 30 FPS, and that screenshots look better for marketing purposes and forum wars. :p

Also, the same applies if you're used to gaming at 120+ Hz and drop down to 60 Hz games, which is why I haven't invested in high refresh displays. :) But at least there, even while noticeable, the erosion of fidelity due to judder and hitching isn't nearly as bad as going from 60 Hz to 30 Hz. Control responsiveness is also not affected quite as severely in feel.

I'd love to see a world where all games on console had the option of a less graphically intensive 60 FPS mode in addition to a more graphically intensive 30 FPS mode (hmmm, like a PC). But due to marketing and fixed hardware, it's unlikely to happen. A more powerful CPU could certainly help more titles attain 60, however. And hell, it'd even help more 30 FPS titles maintain 30 FPS more consistently.

Regards,
SB
 
Digital Foundry and the push to 4K is going to be a headache for devs and publishers. The casual gamers and fanboys are going to complain if their PS5/X2 games aren't native 4K, and doing that PLUS delivering the jump in photorealism expected of a console generation means we need a console more than twice as powerful as the Xbox One X.
They will complain about lazy devs and weak consoles blah blah blah if checkerboard rendering or temporal reconstruction is used instead of rendering at native 4K. They will compare the XOX rendering last-gen games like Gears of War 4 at native 4K, while Gears of War 6, with its large jump in photorealism over GOW4, only renders at 1600p on Xbox Two and uses temporal reconstruction to bring it to 4K.
 
The casual gamers and fanboys are going to complain if their PS5/X2 games aren't native 4K...They will complain about lazy devs & weak consoles blah blah blah if checkerboard rendering or temporal reconstruction is used instead of rendering at native 4K.
People always complain. So what? It's not going to affect sales of anything. Regardless of the noises people (loud vocal minority) make, if the products offer a suitable improvement at a suitable price, people will buy in, and if not, they won't. Switch is being enjoyed by PS4/XB1 owners despite a visual downgrade, and despite The Internet complaining the console wasn't powerful enough.
 
If they're swayed by some online analysis, then they already cared about such things. If the individual actually valued the social aspects (e.g. their friends only have the one system, so there is no illusion of choice), then it's a moot point.

Having access to more information to make an informed purchasing decision isn't new to human history.
 