Could next gen consoles focus mainly on CPU?

It's only a choice if you consider a marriage with no way out a choice. With enough CPU all you need is a dynamic scaling option and the end-user has a real choice.
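As a rough illustration of what such a dynamic scaling option does, here is a minimal feedback loop in Python. It is only a sketch under assumptions: the frame budget, scale limits, and the idea of a per-axis scale knob are illustrative, not any real console API.

```python
# Minimal dynamic-resolution feedback loop: a sketch, not any engine's real API.
TARGET_MS = 16.6                     # frame budget for 60 fps (assumed target)
MIN_SCALE, MAX_SCALE = 0.5, 1.0      # illustrative per-axis resolution limits

def next_render_scale(scale: float, gpu_ms: float) -> float:
    """Nudge the per-axis render scale so GPU frame time converges on the budget."""
    error = gpu_ms / TARGET_MS       # > 1.0 means the last frame was over budget
    scale /= error ** 0.5            # pixel cost grows with scale^2, so correct by sqrt
    return max(MIN_SCALE, min(MAX_SCALE, scale))

print(next_render_scale(1.0, 22.0))  # ~0.87: drop to 87% per axis after a 22 ms frame
```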

Last generation they "could not do" 60 fps, this generation has more powerful CPU and GPU and still are not "doing" 60 fps.
What makes you think next generation with even more powerful CPU/GPU will do 60 fps? They will most likely spend the extra resources on more eye candy.

https://web.archive.org/web/2010011...iacgames.com/blogcast/blog/mike_acton/1503082

Or let me turn it around: what would the price of the console have to be to achieve this? And how would you get developers to stick to your 60 fps rule?
 
Last generation they "could not do" 60 fps, this generation has more powerful CPU and GPU and still are not "doing" 60 fps.

Last gen CPUs were hindered by in-order shenanigans. There does seem to be a greater number of higher profile titles attempting 60fps this generation beyond Call of Duty.

How about we get a 32 core CPU and a 1 CU GPU? This is the obvious solution to make sure devs stop being CPU limited.

You're not thinking 4th dimensionally. Think how many Cell processors you could get & have fully programmable blending etc. :rolleyes:
 
How about we get a 32 core CPU and a 1 CU GPU? This is the obvious solution to make sure devs stop being CPU limited.

we just need Intel to deliver that 10GHz Pentium 4. /s

I say no to 32 cores, consoles should have 4 fast cores with SMT right now (Xbox One X is a huge missed opportunity, imagine if it had Zen cores, it would be running PS4 Pro 30FPS titles at 60FPS consistently, totally worth the delay they would need for Zen), maybe more of those cores in the future.



The situation now is that the Xbox One X, for example, is struggling at 15-20FPS in PUBG in places/settings where a mainstream PC CPU can get 100FPS.

On Ultra the i5 8400 is getting around 80FPS (GPU limited), and the Xbox is certainly nowhere near Ultra.

AMD is just releasing, for $99, a CPU with 4 Zen cores at 3.5-3.7GHz, 4MB L3 and an 8 CU GPU.
 
I had a look around at presumed developments of AMD's CCX in future iterations. I thought a 6 core CCX was already in development, but apparently not, and I've also read some compelling arguments as to why that may never be.

If that's the case, and assuming the home console manufacturers stick with AMD, it seems that we're going to get some configuration of 4 core modules in the next generation.

So that leaves me with some questions:
-- If they choose a one module, 4 core, 8 thread CPU, would that lead to much of an improvement over the current Jaguar cores?
-- Would a two module, 8 core CPU be more worthwhile with SMT for 16 threads, or without, for 8 threads?
-- Is it theoretically possible that a console may contain a two module, 8 core CPU, with the capacity for SMT, but the ability for developers to choose whether they use it?
 
You can sell screenshots etc., but it is really hard to sell e.g. 60fps or something like that in a magazine. E.g. selling HDR is really hard if you can't show it to somebody directly.

I don't think the industry is focusing that much on screenshots in magazines for marketing anymore. That might have been the case 10 years ago, but today videos on platforms like YouTube are much more important. And while video compression makes it harder to see improvements regarding image quality / resolution, it's no problem to have 60fps videos on there. I think that most people who are looking at high quality screenshots are probably enthusiasts, who wouldn't mind having more games running at 60fps.

The rest should probably be spent on GPU, as customers are going to find it hard to distinguish between a mid-gen refresh and next gen as it is.

There is a very easy solution to create a distinction between mid-gen and next-gen consoles: Make all next-gen games run at 60fps. This would be a massive, unique selling point, and the word of mouth from hardcore gamers would be incredible.

No matter how much grunt the next system's gonna pack, everyone will still try and make the prettiest game humanly possible. Seems to be working out for them too, so that's probably not going to change any time soon.

Many games which are still running at 30fps today are among the biggest games in the industry. There are over 1000 people working on games like Assassin's Creed, GTA and other open world games. I think we are reaching a point where it might make more sense from an economical point of view to slow down graphical development a bit, for example by choosing 60fps over 30fps for those games.

Last generation they "could not do" 60 fps, this generation has more powerful CPU and GPU and still are not "doing" 60 fps.
What makes you think next generation with even more powerful CPU/GPU will do 60 fps? They will most likely spend the extra resources on more eye candy.

This is wrong. Last gen we had games like FIFA, Battlefield, Metal Gear Solid or Resident Evil all running at 30fps. This gen they are running at 60fps. Some genres like sports games or linear shooters are mostly targeting 60fps nowadays. And then we have studios like Naughty Dog wanting 60fps for Uncharted 4, which they ultimately had to drop because the CPU just wasn't fast enough. Or Bungie, who would have loved to have 60fps for Destiny 2. And with the rise of VR, we will see many more 60fps games.

Or let me turn it around: what would the price of the console have to be to achieve this? And how would you get developers to stick to your 60 fps rule?

If there is a big CPU in next-gen consoles which is easy to develop for (=x86, not some cell-type CPU), then making a 60fps game becomes a much easier, maybe even an obvious choice. We kind of see it with PCs today. Most PC CPUs are massively more powerful than console CPUs, which means that there basically is a 60fps standard on PC, even on cheap PCs.
 
I say no to 32 cores, consoles should have 4 fast cores with SMT right now (Xbox One X is a huge missed opportunity, imagine if it had Zen cores, it would be running PS4 Pro 30FPS titles at 60FPS consistently
Not at 4K, at 1080p with high probability, at the same resolutions _maybe_. But @ 4K no, definitely not consistently.

There is a very easy solution to create a distinction between mid-gen and next-gen consoles: Make all next-gen games run at 60fps. This would be a massive, unique selling point, and the word of mouth from hardcore gamers would be incredible.

No, that doesn't seem like a trade-off that all developers would want. 6TF at 33ms is the same output as 12TF at 16.6ms: they put out the same amount of work per frame. If you went to 12TF @ 33ms, you're operating at the equivalent of 24TF @ 16.6ms. That's a massive jump in available compute per pixel.
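A quick sanity check of that arithmetic, as a sketch (the TF and frame-time figures are the post's own):

```python
def flops_per_frame(tflops: float, frame_ms: float) -> float:
    """Work available per frame = sustained throughput x frame time."""
    return tflops * 1e12 * (frame_ms / 1000.0)

print(flops_per_frame(6, 33.3))    # ~2.0e11 FLOPs/frame: 6TF at 30 fps
print(flops_per_frame(12, 16.6))   # ~2.0e11 FLOPs/frame: 12TF at 60 fps, same work per frame
print(flops_per_frame(12, 33.3))   # ~4.0e11 FLOPs/frame: 12TF at 30 fps, double the work per frame
```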

If we get Assassin's Creed Origins at 4K/30fps with 6TF, then with 12TF at 30fps you're going to get the same game with 6TF left over to do really fancy lighting and shadows that can dramatically improve the graphical fidelity of the games. That will sell way better than the 60fps variant.

Some games require 60, no one disagrees here. But not all games do, and not all developers want to be limited in that fashion.
 
Over-budgeting on CPU is going to be detrimental to the next console cycle. If you need more GPU power, the CPU taking up that silicon will not help, whereas the opposite can happen: we can continually offload more CPU tasks to the GPU. The direction of our technologies, for games and other industries, has largely been about moving those operations over to the GPU.

Similarly, under-budgeting will be detrimental. This is real crystal-ball territory, because you never know where technology is going to go in the 5+ years that your console tech architecture will be the [usually] low bar for gaming technology. Of course, if the console tech is common (like it is now with AMD APUs), then gaming technology is going to be limited by what is practically achievable on consoles; otherwise it's mid/high-level PC only.

What's the future? CPUs that do graphics? GPUs that do general compute? Specialist processors like DSPs or physics processors? A mix of all three? Who knows. Sometimes you look around and gaming tech looks quite mature, but the swings between doing task X on this type of core, then this type of core, then over on this type of core, then back to this type of core, show there's a long way to go! :yes:
 
Somehow an extra ~30mm^2 CCX @ 7nm is equivalent to +40% shader/tex, an appropriate increase in bandwidth & associated cost?

Of course not.
Which is why anyone would prefer 2*Jaguar quads at say 3 GHz and a 14 TFLOPs GPU over e.g. a full Summit Ridge at 3.5GHz and a 10 TFLOPs GPU.

Ah, so I'm lying.
That was totally not my point..



I use it for not playing games at 20 fps.
I just don't see where this strangely fixed idea that lower-end CPU == 30FPS or worse and higher-end CPU == 60FPS comes from.

Where are all the PS4Bone devs saying "the only reason we're not reaching 60 FPS is because of those damn slow CPU cores!" ?
What I see is statements saying dev X went for 30 FPS for higher IQ and dev Y went for 60 FPS for better gameplay.


Demanding AI (lots of good FSMs), simulation, and solid frame rates are pretty solid reasons as it is.
Same thing.
Who's saying "I couldn't do better AI because we didn't have enough CPU cycles for it", or "we couldn't do solid framerates because of the CPU"?



for example on PUBG
I never went anywhere near that thing and even I know it's a terrible example...

Regardless, is that the reason then? We should have better performing CPUs in consoles so that poorly optimized games can run better?

If I was a console maker, I would definitely prefer to spend more die-area on a larger GPU (heck, even NPU since those are so sexy nowadays) that allows teams like DICE or Naughty Dog to transcend their greatness than on a more powerful CPU so that Forever-On-Early-Access fads like PUBG can run a little bit better.
Besides, no one assures me that the PUBG devs, if given more CPU power on the XboneX, would actually take advantage of it to make the game run faster or just spend even less time optimizing it and I'd get the same 20 FPS we have today.



I don't think the industry is focusing that much on screenshots in magazines for marketing anymore. That might have been the case 10 years ago, but today videos on platforms like YouTube are much more important.

Are youtube videos, twitch sessions (great majority in 30 FPS AFAIK?) and screenshots in ads any different from magazines and TV spots?
 
If we get Assassin's Creed Origins at 4K/30fps with 6TF, then with 12TF at 30fps you're going to get the same game with 6TF left over to do really fancy lighting and shadows that can dramatically improve the graphical fidelity of the games. That will sell way better than the 60fps variant.

That's why they shouldn't use 4K for next-gen - at least not native 4K. With checkerboarding, they can get very similar quality while needing 50% less GPU power. Native 4K is a much bigger waste of GPU resources than 60fps, when your game is already 4K CB.

And if it weren't for the low memory bandwidth, the PS4 Pro could probably render most games at 4K CB. That means for 4K CB / 60fps, you need about 2 x 4.2 = 8.4 TF. If we assume a 12.6 TF GPU for PS5, this would still give devs 4.2 TF which they can fully use for things like next-gen lighting. Coupled with high-res textures, this would be a massive leap. And it's even better for games that already run at 60fps, because they will get 8.4 TF for next-gen effects.
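Back-of-the-envelope, that budget works out as follows (a sketch that takes the post's 4.2 TF Pro figure and assumed 12.6 TF PS5 at face value):

```python
PRO_TF = 4.2                  # PS4 Pro: assumed enough for 4K CB / 30fps (bandwidth aside)
PS5_TF = 12.6                 # assumed next-gen GPU, 3x the Pro
cb60_cost = 2 * PRO_TF        # doubling frame rate doubles per-second GPU work: 8.4 TF
print(PS5_TF - cb60_cost)     # ~4.2 TF left for next-gen effects in a former 30fps game
print(PS5_TF - PRO_TF)        # ~8.4 TF left if the game already ran at 60fps
```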
 
Sony should take quad Jaguar from PS4 and use it again for main OS work.
Eh, physically meshing together an ancient design like jaguar with modern cores like zen probably isn't trivial, either physically on-die or logically, with regards to memory snooping and whatnot. One zen core could do the work of a bunch of jaguar cores, so what would be the need to include old crud at all?
 
Not at 4K, at 1080p with high probability, at the same resolutions _maybe_. But @ 4K no, definitely not consistently.

I think 60FPS vs 30FPS would be an easier thing to sell than "4K" (which is often not really 4K), so I think lowering some other aspects would be worth it and not very noticeable. The X already offers a very substantial advantage on the GPU side; it's just the CPU that is too close. Ignoring the PS4 Pro for a moment, CPU would be the key to having 60FPS versions of Xbox One and PS4 base 30FPS titles (even if we are getting close to that already).

Destiny 2, for example, could totally run at 60FPS with similar visuals given a better CPU, as the PC version shows.
We have some games that are around 1440p30 on the PS4 Pro and close enough to 4K30 on the X; matching those resolutions at 60 (maybe with some other small compromises) would, I think, deliver very consistently on titles that are 30 on the PS4. Obviously, my point was a generalization and it varies from game to game quite a bit.



I never went anywhere near that thing and even I know it's a terrible example...

Regardless, is that the reason then? We should have better performing CPUs in consoles so that poorly optimized games can run better?

If I was a console maker, I would definitely prefer to spend more die-area on a larger GPU (heck, even NPU since those are so sexy nowadays) that allows teams like DICE or Naughty Dog to transcend their greatness than on a more powerful CPU so that Forever-On-Early-Access fads like PUBG can run a little bit better.
Besides, no one assures me that the PUBG devs, if given more CPU power on the XboneX, would actually take advantage of it to make the game run faster or just spend even less time optimizing it and I'd get the same 20 FPS we have today.

I take PUBG as a raw example, I guess, but I really think that if they had a CPU close enough to a PC CPU, they would deliver a mostly-60 experience on the console (XBX) with the same optimization effort they currently put in; with less effort they would still get a locked 30.

Isn't Destiny 2 a very well optimized title? It's locked at 30 on consoles because of the CPU; on PC you can get 60+ even with a dual core Pentium, and it transforms the game.
 
Eh, physically meshing together an ancient design like jaguar with modern cores like zen probably isn't trivial, either physically on-die or logically, with regards to memory snooping and whatnot. One zen core could do the work of a bunch of jaguar cores, so what would be the need to include old crud at all?
An entire core [or two] of the CPU will then be cordoned off only for OS work, which is a waste. Oh well, I'm sure devs will get enough.
 
Last generation they "could not do" 60 fps, this generation has more powerful CPU and GPU and still are not "doing" 60 fps.
What makes you think next generation with even more powerful CPU/GPU will do 60 fps? They will most likely spend the extra resources on more eye candy.

https://web.archive.org/web/2010011...iacgames.com/blogcast/blog/mike_acton/1503082
A 30 fps console game's assets will always make better looking marketing material, and the game will provide better looking screenshots. I agree with Mike Acton that 30 fps exclusive console games used to be easier to sell than 60 fps games. Back then YouTube didn't even support 60 fps videos. I remember that Digital Foundry's Trials HD video clips were not running smoothly (60 fps) on my Core 2 Quad. Worked fine on iPad 2 however. Online video services have progressed since, and 60 fps footage is nowadays common. But nowadays you don't have many console exclusives, so the marketing video footage tends to be captured from a high end PC anyway, and that allows 60 fps + better image quality than any console is capable of. So I would argue that 30 fps vs 60 fps console frame rate isn't as relevant to marketing anymore, since you are going to create high quality assets for the PC version anyway, and capture marketing footage with max settings + max frame rate, independent of your visual choices on the console.

I did some research on my own before we released Trials Evolution (2012). I took data from Major Nelson's daily active user top 20 charts monthly (he was Xbox 360 community manager at that time). One thing became clear: every month around 10-15 games in the top 20 most played list were 60 fps games. 30 fps games were the minority when you measured daily active users, even though many more 30 fps games than 60 fps games were released on Xbox 360. This clearly shows that 60 fps games retain their player base better than 30 fps games. People keep coming back to these games because they feel good to play (less judder, more responsive controls, etc). 30 fps might be a great way to increase sales (marketing material and visuals look better), but it certainly isn't a good choice if you want to create a game with long lasting appeal.

Trials series games have never sold as much as the biggest AAA titles (Trials HD and Evolution have both sold at least 2 million copies, so not bad either). But Trials games have an unbelievably high number of daily active users (years after launch) and unbelievably long tails in unit sales. Lots of players on the leaderboard have 1000+ hours of play time. I am confident that a 60 fps lock (almost zero frame drops) + carefully optimized input latency + 5 second level loading times play a part in this. These games just feel good to play.

We used to make handheld console (N-Gage, Nintendo DS, Sony PSP) games before we went to home consoles (examples: Pathway to Glory, Warhammer 40K: Squad Command). During this time I noticed that lower frame rate isn't nearly as bad on a smaller handheld screen compared to a large PC monitor or TV. We had PC debug builds for handheld games. If you played these games on PC at fullscreen, the lower frame rate felt pretty bad. But on the handheld device 30 fps was totally fine. Thus I believe that the decision made by many developers to drop from 60 fps to 30 fps when porting to Nintendo Switch is likely the correct one.
 
As far as I know, the unified memory of modern consoles doesn't allow too strong a CPU, because it would then "steal" bandwidth from the GPU. This is the first reason why, IMHO, even next gen will see a (maybe) 3 GHz Jaguar (maybe 12 cores). The second reason is backward compatibility, as with the PS4 Pro. I think "next" gen games will run at 720p@30fps on PS4, 1440p@30fps / checkerboard 4K on PS4 Pro, and true 4K@30fps on PS5.
 
So I would argue that 30 fps vs 60 fps console frame rate isn't as relevant to marketing anymore, since you are going to create high quality assets for the PC version anyway, and capture marketing footage with max settings + max frame rate, independent of your visual choices on the console.

30 fps might be a great way to increase sales (marketing material and visuals look better), but it certainly isn't a good choice if you want to create a game with long lasting appeal.

So then I am very naive and claim:

1. You want front-loaded, marketing-driven, high sales = 30 fps
2. Fewer sales, but a longer tail of daily active users = 60 fps

The question is, where do you make the most money for your title? Will DLC increase the tail for both 30 fps and 60 fps games?
To me it's obvious: #2 + loot boxes :p
 
That's why they shouldn't use 4K for next-gen - at least not native 4K. With checkerboarding, they can get very similar quality while needing 50% less GPU power. Native 4K is a much bigger waste of GPU resources than 60fps, when your game is already 4K CB.
I imagine you haven't seen the best of 4K gaming on the best 4K display yet. It looks pretty next gen to me. Assassin's Creed is killer. And if I had a 4Pro I'd feel that way about HZD, I'm sure.

50% less resources for similar output quality to native is a touch too aggressive. Checkerboarding does not absolve the renderer of all loads by the same amount. And not all checkerboard algorithms work on both axes; sometimes we just see checkerboarding along the x or the y, which means it's saving up to 25% less compute in that case.

Regardless, we've seen checkerboard solutions on PS4 Pro 4K vs X1X 4K native, and the X1X is still running higher settings than the 4Pro. That should be enough to debunk that argument, because the X1X is clearly nowhere close to 2x the GPU of the 4Pro.

And if it weren't for the low memory bandwidth, the PS4 Pro could probably render most games at 4K CB. That means for 4K CB / 60fps, you need about 2 x 4.2 = 8.4 TF. If we assume a 12.6 TF GPU for PS5, this would still give devs 4.2 TF which they can fully use for things like next-gen lighting. Coupled with high-res textures, this would be a massive leap. And it's even better for games that already run at 60fps, because they will get 8.4 TF for next-gen effects.

You'll still get 60 fps games. I mean, no one is saying next gen is going to be Jaguar again. But let's not get carried away on the CPU side of things. If developers want to focus on gameplay, they'll focus on 60fps. If they want to focus on story and immersion, you're going to see 30fps, better graphics and sound.
 
Why do we have to choose?

A 14nm Ryzen CCX with 4 cores / 8 threads is only 44mm², so the 8-core would be about 88mm². At 14nm you have a Ryzen 7 1700 @ 65W. So if you take this as the target and put it on 7nm, the 8-core, 16-thread chip should be around 60mm² and should use a lot less power. The performance gap from Jaguar @ 2.4GHz to Ryzen at 3GHz is insane, and would allow them to devote more resources to hitting 60fps.
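Rough numbers behind that shrink, as a sketch (the 2x density factor from 14nm to 7nm is an assumption; SRAM and I/O scale worse, which is why the ~60mm² estimate above is more conservative than the ideal):

```python
ccx_14nm_mm2 = 44.0                    # 4-core / 8-thread Zen CCX at 14nm
eight_core_14nm = 2 * ccx_14nm_mm2     # two CCXs: ~88 mm^2 (Ryzen 7 1700 class, 65W)
density_gain = 2.0                     # assumed 14nm -> 7nm logic density improvement
print(eight_core_14nm / density_gain)  # ~44 mm^2 ideal; ~60 mm^2 once non-scaling parts are counted
```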
 
I imagine you haven't seen the best of 4K gaming on the best 4K display yet. It looks pretty next gen to me. Assassin's Creed is killer. And if I had a 4Pro I'd feel that way about HZD, I'm sure.

I have a PS4 Pro connected to a Samsung KS8000 and have played checkerboarded games like Horizon Zero Dawn, Rise of the Tomb Raider and Witcher 3. Image quality like that is incredible and imo completely sufficient for next-gen. If I had the choice between playing those games at 2160c/60fps or at 2160p/30fps, I would always choose the former. I recently played The Last of Us Remastered on my Pro, and it was such an incredible experience at 1800p/60fps. The high framerate really helped to bring immersion to the next level, really amazing.

50% less resources for similar output quality to native is a touch too aggressive. Checkerboarding does not absolve the renderer of all loads by the same amount. And not all checkerboard algorithms work on both axes; sometimes we just see checkerboarding along the x or the y, which means it's saving up to 25% less compute in that case.

With checkerboarding, the GPU needs to render 50% fewer pixels compared to 2160p. I'm not a graphics programmer, but from what I have heard the only obstacle here could be memory bandwidth, because they might need a full 4K framebuffer compared to resolutions like ~1440p with a similar pixel count. And the games I listed above are using CB on both axes.
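The raw pixel counts, for reference (shading cost only; this ignores the reconstruction pass and the full-size framebuffer mentioned above):

```python
native_4k = 3840 * 2160       # 8,294,400 shaded pixels per frame
cb_4k = native_4k // 2        # 4,147,200 with checkerboarding: half the shading work
qhd = 2560 * 1440             # 3,686,400: a 4K CB frame shades only ~12% more than native 1440p
```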

Regardless, we've seen checkerboard solutions on PS4 Pro 4K vs X1X 4K native, and the X1X is still running higher settings than the 4Pro. That should be enough to debunk that argument, because the X1X is clearly nowhere close to 2x the GPU of the 4Pro.

Which titles are those? Genuinely curious, can't think of any at the moment. Also not sure what you are arguing - are you saying that devs don't need twice the amount of GPU power to render a 2160c game at 2160p?

You'll still get 60 fps games. I mean, no one is saying next gen is going to be Jaguar again. But let's not get carried away on the CPU side of things. If developers want to focus on gameplay, they'll focus on 60fps. If they want to focus on story and immersion, you're going to see 30fps, better graphics and sound.

60fps is a lot more immersive than 30fps ever can be. All my 60fps single-player game experiences have been among the most memorable experiences ever.

I really think next-gen could be the generation where we see more 60fps games than ever, thanks to exciting tech like Ryzen at 7nm, VR and ever-increasing dev budgets.

I also think that if Sony were to announce a 60fps standard (or a mandatory 60fps option) for next-gen at their conference, it would be the ultimate mic-drop moment, which would probably break the internet. I can't think of a more impactful announcement they could make; people would go completely crazy. Something like this would also really help to distinguish next-gen from mid-gen consoles, because we wouldn't just see cross-gen games with higher res, but a completely improved gameplay experience.
 