Do you think there will be a mid gen refresh console from Sony and Microsoft?

You need substantially better tech for a proper new gen and, most importantly, a much smaller node (which won't be the case even for a 2024 launch).
It depends on what features will define the next generation. I think for a lot of people, a faster CPU and a faster GPU will not be particularly noticeable, but if next gen is about ML and AI, and both are features that can be seen to materially make games better, then next gen could be much closer.

I'm not convinced either ML or AI are features that will bring an immediate revolution in gameplay, but who knows what's cooking behind closed doors.
 
The big problem with 'next gen techs' is the fact games are going to target the older hardware for years after next gen releases. Consoles have become like PCs in terms of dragging tech forwards instead of leaping. New hardware exists but isn't used because old hardware doesn't have it. New games don't use the new hardware. The only reason to buy new hardware with the new features is to run the old features faster. Eventually, there are enough people with the 'new' (now years old) features that they start to get used, we hope.

In the past, it was the consoles creating an influx of the latest ideas that started the next evolution of games all round. I can't see that happening ever again. Heck, even the console companies are no longer ditching their old machines and producing next-gen-only titles to drive adoption. Unless there's a real game-changer tech, something a developer can target that makes a game so spectacular that it drives sales of hardware and creates a large enough market to justify the game's development, the sensible economics now are cross-gen titles that are a bit shinier and smoother on the latest hardware, and that's it.
 
Sadly this is probably correct and a shift from what we'd gotten used to before.

A lot of people tried to blame the surprisingly long cross-gen cycle this time around on the pandemic and/or chip shortages leading to limited new console sales, but this never really made much sense to me. The PS5 was selling pretty much in line with the extremely successful PS4, and Xbox was also touting record console sales for a good while there. The chip shortage was never what the vast majority thought it was, as if these consoles had empty production lines or something. The shortages people saw mainly came from unexpectedly high demand rather than any severe supply constraint.

I think the main things driving a long cross-gen period are the extraordinarily high costs of game development nowadays, along with a shift in attitude from Sony and MS about the role they want their 1st-party games to play in their financial strategy. Before, early-gen 1st-party titles did not need to be big money makers on their own. Their main role was simply to attract people to the new consoles and show off what they could do. They used to be the 'true next gen' titles that would set the example and lead the way. But now Sony seems to want every game to be justified on its own in terms of sales and profit, and Microsoft is more concerned about Game Pass subscriptions and reach than about what specific console/hardware people use.

And nothing about this situation is likely to change for next gen, so yea, we can probably expect another long cross-gen period again, with another slow shift to any newer tech paradigms.
 
Which ends up happening too slowly for software. The hardware companies will expect software to go one way, design hardware for it, and get it all wrong. The ideal solution is probably pools of optimised processor types - branchy-code processors, stream processors, and 'ML' processors - and letting the software engineers decide how to use them.
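Purely as a hypothetical sketch of what that could look like from the software side (the pool names, the Task/Scheduler classes, and the fallback policy are all invented for illustration, not any real console API):

from enum import Enum, auto

class PoolKind(Enum):
    BRANCHY = auto()   # latency-optimised cores: gameplay logic, AI decisions
    STREAM = auto()    # wide SIMD / GPU-style units: physics, shading
    ML = auto()        # matrix/tensor units: upscaling, inference

class Task:
    def __init__(self, name, kind, fn):
        self.name, self.kind, self.fn = name, kind, fn

class Scheduler:
    def __init__(self, available_pools):
        self.available_pools = set(available_pools)

    def dispatch(self, task):
        # Fall back to the general-purpose cores when the ideal pool is absent,
        # which is exactly the cross-gen problem described above.
        pool = task.kind if task.kind in self.available_pools else PoolKind.BRANCHY
        print(f"running {task.name} on {pool.name}")
        return task.fn()

sched = Scheduler([PoolKind.BRANCHY, PoolKind.STREAM])   # older hardware: no ML pool
sched.dispatch(Task("npc_pathfinding", PoolKind.BRANCHY, lambda: None))
sched.dispatch(Task("frame_upscale", PoolKind.ML, lambda: None))    # silently falls back

The point of the sketch is just that software targets capabilities, not a fixed box, so new pools can be exploited when present and ignored when not.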
 
Unless there's a real game-changer tech, something a developer can target that makes a game so spectacular that it drives sales of hardware and creates a large enough market to justify the game's development
We've had real game-changer tech (this generation has significantly more than any previous generation, minus 3D), but it hasn't really moved the needle in terms of feature adoption.

We moved to RT, which is a completely separate field of rendering; we have super-fast IO and a completely new geometry front end.

I think the challenge is that the cost of tooling to support these features is too high; they can no longer build out the features and engine and build the game at the same time, because it's perhaps "too large" a leap. They would effectively have to pause all game content creation work until retooling is completed, and games are just too large for them to stop production.

Perhaps a strategy to employ is what Remedy did: make a much smaller title. That way you can progress the technologies forward until you have reached where you need to be, before making a large title.
 
I don't think zoomers will want a generation that is "mid". Console manufacturers should focus instead on a bussin' generation. Much more appealing.
 
I agree with rolling generations and think that when MS can put an X in an S-sized box at an S-sized price, they should put out another X-sized box that has whatever they can get into it for $499, and supplement that power with whatever cloud computing can realistically achieve.
 
The one thing that was really positive about this endless cross-gen strategy was the rise of 60fps modes in the eyes of both consumers and developers. Now almost all PS5/XSX exclusive games have a 60fps mode, and I think we can mostly thank the 3 years (and counting) of cross-gen for that. With Zen 2 CPUs, developers don't have to resort to the "it's more cinematic" excuse, as has been proven plenty of times this gen.
 
I feel quite the opposite personally. There's nothing wrong with 30fps gaming, and console users were pretty much ALL entirely happy with 30fps gaming as well up until super recently. Pretty much every blockbuster Sony 1st party title on PS4 was 30fps. Was there a SINGLE complaint in any of the reviews or user impressions of these games because they were only 30fps? Nope. Genuinely, I'd never once seen anybody complain about this. Yet now somehow 30fps is unplayable?

This mindset of using the extra CPU power to push framerate instead of game design aspects is very disappointing in my opinion. It's basically halving the potential of these new consoles. It is allowing developers to make the exact same sort of safe games they were making before instead of actually utilizing the new CPU power to push ambitions further.

If this keeps up, and 30fps games don't make any big return, just don't be surprised when people are asking at the end of the generation, "Why didn't this generation feel like that much of a leap?".
 
They won't ask this when they play a game at 30fps again.
Console gamers aren't new to 60fps gaming, ffs. They've had 60fps games since the '80s. Every generation is filled with plenty of 60fps games on console, too. But 30fps games have also always been acceptable. It is not some terrible framerate that makes gaming hard to stomach at all. Especially when you don't even have a choice to compare back to back, 30fps is super easy to adjust to, and if a game is good, you forget about it quickly enough. I'm currently playing Xenoblade Chronicles 3 on Switch, and not for a second have I thought, "Wow, I'm not having fun because the framerate isn't better". And I'm somebody who plays the vast majority of games at 60fps on PC.

30fps is entirely playable. Seriously, it is. It is not some 'unplayable mess' or some 'slideshow'. It may not be 'super smooth', but it's smooth enough to typically not get in the way, assuming there aren't further performance issues.

I think the main thing to understand here is this isn't an 'all else being equal' situation. Making every game playable at 60fps means developers are all playing it safe and not using the extra CPU power for anything more interesting.
 
What I noticed is that 30fps was OK playing at 720p. But the higher the resolution, the worse it looks to me in motion. I think the reason is the perceived resolution difference between a static image and one in motion, which gets bigger when the resolution is higher.

For me it started with 1080p 30fps gaming, when I noticed the incredibly big difference in apparent resolution between static and in-motion images. It was very different from 480p / 720p 30fps, where the difference was smaller and the image could still look OK even in motion.
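To put some rough, made-up numbers on that: between two 30fps frames, the image jumps by pan speed times frame time, so the same screen-relative motion covers far more pixels at higher resolutions. A quick illustrative sketch (the two-second pan across the screen is an arbitrary assumption):

def step_pixels(width_px, fps, seconds_to_cross=2.0):
    # screen-relative pan speed in pixels per second, sampled once per frame
    speed_px_per_s = width_px / seconds_to_cross
    return speed_px_per_s / fps

for width in (854, 1280, 1920, 3840):   # roughly 480p, 720p, 1080p, 4K widths
    print(f"{width:>4}px wide @ 30fps: ~{step_pixels(width, 30):.0f}px jump per frame")

The bigger the per-frame jump relative to the pixel grid, the bigger the gap between how sharp the image looks when static and how it resolves in motion.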
 
I'd bet the difference for you was that your 1080p display was probably bigger than your 720p display, if it wasn't just placebo or simply incorrect perception. The bigger the image, the more perceptible that lack of fluidity is. There are also other aspects of display tech that could affect the perception of fluidity, or even input lag if you switched TVs, especially back in the 2000s.

Either way, the point of my argument isn't whether 60fps is better than 30fps or not. Of course it is. It's about what we as gamers are giving up when developers need all their games to run at 60fps. And this isn't some small cost at all. Again, we are literally halving the potential of these new consoles in terms of what the new CPUs could do, which was one of the biggest improvements this generation because of the terrible CPUs of the previous one. And it's very disappointing that developers would just keep making games like before, as if those terrible CPUs still existed, except now they can do those same safe games at 60fps instead of 30fps.

It really ruins a lot of the point of this new generation's hardware leap, in my opinion.
 
I think we ideally need more data on how CPU use affects bandwidth on these consoles. Remember that slide from the PS4 that had the bar graph showing how increasing CPU use reduces bandwidth for the GPU in a non-linear way? I've been wondering for a while if this is the reason we have things like Arkham Knights having to drop to 30fps in co-op; there was another game with a co-op feature that did this as well, but I can't remember what that was. Is this why some of the UE5 games are having to drop to as low as a 680p base? The CPU gets hammered and the GPU is left somewhat bandwidth limited?

I wonder how Teardown runs on a PC with hardware around the console level. I'm guessing it probably does a little bit better, but then we have the issue of the PC CPU having larger caches, so once again the waters get muddied.
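As a purely illustrative toy model of the kind of contention that PS4 slide showed (the 448 GB/s figure is PS5's quoted memory bandwidth, but the penalty curve below is invented, not measured):

def gpu_bandwidth_left(total_gbps, cpu_gbps, base_penalty=1.5, growth=0.05):
    # Each GB/s of CPU traffic is assumed to cost the GPU more than 1 GB/s
    # (arbitration, DRAM page conflicts), and the overhead grows with CPU load,
    # which is what makes the curve non-linear rather than a simple subtraction.
    return max(0.0, total_gbps - cpu_gbps * (base_penalty + growth * cpu_gbps))

for cpu in (0, 5, 10, 20, 30):
    print(f"CPU pulling {cpu:>2} GB/s -> roughly {gpu_bandwidth_left(448, cpu):.0f} GB/s left for the GPU")

Real numbers would need profiling on the actual hardware, but it shows why hammering the CPU in co-op or heavy-simulation games could squeeze GPU bandwidth harder than the raw CPU traffic alone suggests.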
 
I see both sides of the fps debate. The way it resolves for me is that I don't think every game should get lambasted for not having a 60fps mode, but I feel like a lot of games should give the 60fps choice if they aren't trying to innovate too much.
 
Or at the very least a 40fps mode on PS5, and VRR on Xbox.
This, OMG this!

I feel like it was a real miss for this gen for MS not to make it a TCR - a technical requirement - for any games on the Xbox platform:
a 30fps game should always have a 30fps + unlocked VRR mode,
a 60fps game should always have a 60fps + unlocked VRR mode.

OK, I understand that in some games timing is more critical than latency, but those are the exception; just lock it to a solid 30/60 there.
I really feel like VRR is amazing tech, but it just isn't being used widely enough.
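Part of why the 40fps option mentioned above is attractive is simple frame-time arithmetic: 25ms sits exactly halfway between 33.3ms and 16.7ms, and 40fps divides evenly into a 120Hz refresh, which a plain 60Hz container can't accommodate. A quick sketch of the numbers:

for fps in (30, 40, 60):
    frame_ms = 1000 / fps
    print(f"{fps}fps = {frame_ms:.1f} ms/frame | "
          f"even fit at 60Hz: {60 % fps == 0} | even fit at 120Hz: {120 % fps == 0}")

So a 40fps mode is only practical on 120Hz (or VRR-capable) displays, which is presumably why it tends to ship as an extra option rather than the default.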
 
It's weird that there's this misconception that everyone, or almost everyone, was happy with 30 FPS gaming on PS4/XBO, when there was tons of complaining around the internet and among gamers about how they wished that X game had a 60 FPS mode because 30 FPS was just sooo bad. :p

This isn't getting into whether most were or were not happy with 30 FPS, but whether almost everyone was happy with it. Hell, there were game reviews where they applauded the game and then in the same breath complained about having to play it at 30 FPS and wondered how much better it would be if it was 60 FPS.

Hell, just enter a random game thread for a PS4 game at GameFAQs and you'll see tons of people complaining about having to play at 30 FPS, and then tons of people complaining about people complaining about having to play at 30 FPS. :p So this notion that everyone was happy with 30 FPS is just weird when lots of people were unhappy with 30 FPS but played the games anyway because there was no other way to play them. Basically, the game was good, but the experience could have been even better at a higher FPS. Not only does it play better at 60 FPS, but it looks much better in motion at 60 FPS. Not everyone wants to just look at screenshots of their games. ;)

Regards,
SB
 
I'd bet the difference for you was that your 1080p display was probably bigger than your 720p display, if it wasn't just placebo or simply incorrect perception. The bigger the image, the more perceptible that lack of fluidity is. There are also other aspects of display tech that could affect the perception of fluidity, or even input lag if you switched TVs, especially back in the 2000s.
Yes. On a 16" CRT, 30fps was a difference of say 0.1 degree FOV per frame. On a 50" 4K display, that's now 1 degree per frame. The delta between frames is significant. Motion blur probably decreases judder to make things feel smoother, but then you get a lack of clarity.
 