Do you think there will be a mid gen refresh console from Sony and Microsoft?

As a consumer, I don't see the appeal of a mid-gen refresh. A new gen = back to the drawing board and putting good, future-looking hardware together.

Mid-gen = boosted specs of what you have and a further delay to a new gen with all its benefits.

I don't see the appeal either. I think it comes from a romanticized idea of developers creating labors of love, pushing hardware to and past its limits with clever ideas, to deliver games people felt weren't possible on that console's hardware. At the same time, people want the higher resolutions and frame rates, which are typically the first things to get cut back when trying to shoehorn new games onto older hardware. So the refreshes are the solution.

You need substantially better tech for a proper new gen and most importantly much smaller node (which won't be the case even for a 2024 launch).
Sure, 2024 may not, but I think 2025 would give you yet another generation of Zen and a full generation of RDNA. So if we are getting more AMD-based consoles, a Zen 5/RDNA 4 based console would likely be a large enough jump over the Zen 2/RDNA 2 consoles, especially if you look towards higher core counts. Games like BG3 and others seem to be very CPU-limited on the current consoles, and ray tracing only adds to that load.
 
I feel quite the opposite personally. There's nothing wrong with 30fps gaming, and console users were pretty much ALL entirely happy with 30fps gaming as well up until super recently.
I gamed at 30fps during the 360/PS3 and PS4/XBO generations and accepted this as a compromise for hassle-free gaming, but I was never "entirely happy" with it. Many games, like GTA V on 360/PS3, were running closer to 20fps than 30fps, and whilst the 30fps target continued into the PS4/XBO generation, I personally felt like more games were actually hitting 30fps. Over fourteen years consumers either acclimatised to 30fps, or didn't but simply couldn't afford a PC that could deliver 60fps. This is akin to eating rice every day because it's food and better than starving.

This generation, modern engines allow console versions to accommodate a variety of low/medium/high/ultra settings and deliver both performance and fidelity modes, which meant that 60fps became an option on almost all titles. Now that people have got used to what 60fps looks and feels like, it's understandable that some are not willing to revert to 30fps.
 
Gears 6, Hellblade 2 and Indiana Jones will answer whether a midgen refresh is necessary at all.

They might be able to present something on December 7th.
 
This generation, modern engines allow console versions to accommodate a variety of low/medium/high/ultra settings and deliver both performance and fidelity modes, which meant that 60fps became an option on almost all titles. Now that people have got used to what 60fps looks and feels like, it's understandable that some are not willing to revert to 30fps.

Graphics fidelity can scale relatively easily, but other aspects are quite a bit more challenging and would effectively need to be designed around a specific frame-rate target.

Just as a crude example, let's say it's a scene with 10 NPCs. If it's purely a fidelity scaling issue, we can, say, halve the fidelity (through a combination of graphics settings) for a 60fps mode versus a 30fps mode. That just changes the visuals of the game but doesn't really fundamentally change the game itself. Actually halving the number of NPCs, however, would be a much more significant change, even just in terms of visual presentation. It would not be a relatively simple graphics configuration setting.
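A minimal sketch of that distinction in C, with hypothetical names (the settings struct, render_scale_pct and the mode values are all invented for illustration): the 60fps mode only touches presentation knobs, while the simulation side, here the NPC count, is identical in both modes.

```c
/* Sketch only: a "performance mode" that halves graphics cost but leaves the
   simulation (NPC count) untouched. All names and values are illustrative. */
#include <stdio.h>

struct GraphicsSettings {
    int render_scale_pct;     /* internal resolution as % of output */
    int shadow_quality;       /* 0 = low .. 3 = ultra */
    int reflection_quality;   /* 0 = low .. 3 = ultra */
};

struct Simulation {
    int npc_count;            /* changing this changes the game itself */
};

static struct GraphicsSettings fidelity_mode(void)    { return (struct GraphicsSettings){ 100, 3, 3 }; }
static struct GraphicsSettings performance_mode(void) { return (struct GraphicsSettings){  50, 1, 1 }; }

int main(void) {
    struct Simulation scene = { .npc_count = 10 };    /* same in both modes */
    struct GraphicsSettings gfx30 = fidelity_mode();
    struct GraphicsSettings gfx60 = performance_mode();

    printf("30fps mode: %d NPCs, %d%% render scale\n", scene.npc_count, gfx30.render_scale_pct);
    printf("60fps mode: %d NPCs, %d%% render scale\n", scene.npc_count, gfx60.render_scale_pct);
    /* Halving scene.npc_count instead would be a design change,
       not a graphics configuration setting. */
    return 0;
}
```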

At least to me, and I think the other poster was saying the same, a disappointment so far this generation is that despite the massive CPU gains we haven't really seen a change in scope in terms of simulation and logic. Really that front seems to have stalled (if not regressed in ways) since the PS3/360 era, as the PS4/Xbox One era itself didn't have very large CPU gains.
 
This is akin to eating rice every day because it's food and better than starving.
I just don't accept this analogy whatsoever. Console gamers were not merely 'tolerating' 30fps. If it was really that bad and begrudging, all the masses of blockbuster PS4 exclusives in that era would have had people saying 'the game is great BUT', yet we saw NONE of that. None. Absolutely nobody was saying, "I quite liked Uncharted 4, but only being 30fps really brought it down for me". It was simply all praise. Or look at Tears of the Kingdom. About as unanimously loved as games get, but it was only 30fps, and that was a game that released in 2023!

It's not that people wouldn't have preferred 60fps, it's just that 30fps is genuinely a perfectly playable framerate for most games. This elitism over 60fps is relatively new, coming largely from the extended cross-gen period for XSX/PS5, where these consoles were more or less just PS4 Pro + Xbox One X Mk2 - a way to play last gen-based games with even higher resolutions/settings/framerates. But this was never sustainable once true next-gen titles came around. Yes, graphics can be easier to scale in terms of performance, but there's more to a game's demands than just basic fidelity. And even with graphics, we'll likely see bigger compromises for 60fps performance modes when available, too. And people will complain about this, not understanding you simply can't have it all.
 
Yeah, it's a bit extreme. I played 30 fps, it was okay, wished for faster but it was only a few games where the low framerate really detracted. Of course, now I want 60 fps minimum. Want faster! So I'm mostly playing PS4-looking games at higher res.

A lot depends on the game style. Some games are far more accommodating of lower framerates.
 
Yeah, it's a bit extreme. I played 30 fps, it was okay, wished for faster but it was only a few games where the low framerate really detracted. Of course, now I want 60 fps minimum. Want faster! So I'm mostly playing PS4-looking games at higher res.

A lot depends on the game style. Some games are far more accommodating of lower framerates.

I mean, I capped BG3 at 60, but I could happily play at 30 or even less. I tried it at 120 and what's even the benefit? I'd rather my fans were slightly quieter instead.

On the other hand I'll probably set my monitor to 165 just for Hades 2.
 
I just don't accept this analogy whatsoever. Console gamers were not merely 'tolerating' 30fps. If it was really that bad and begrudging, all the masses of blockbuster PS4 exclusives in that era would have had people saying 'the game is great BUT', yet we saw NONE of that.
You know what acclimatise means, right? Your post suggests you don't.

It's not that people wouldn't have preferred 60fps, it's just 30fps is genuinely a perfectly playable framerate for most games. This elitism over 60fps is relatively new
I would like to see objective evidence that "30fps is genuinely a perfectly playable framerate for most games". Genuinely. I am really interested. And I don't see preference as elitism. It's preference. Preferring a hotdog over a burger isn't hotdog elitism. Preferring reduced resolution, lighting and detail over framerate isn't elitism. It's a preference. It's a choice.

It's the choice most PC gamers have had for decades and has become a choice for many console gamers. The outcry you get when games launch without a 60fps mode? That's disappointment that gamers' preference for 60fps isn't an option.

30fps is genuinely a perfectly playable framerate for most games
I completely agree. I will go 60fps or 40fps on PS5 for any game that involves fast movement and action because that is a superior experience for me, in part because a lot of games still feel like controller latency is tied to framerate. I have not played a 30fps game on my PS5 or Series X this generation, but there aren't many where there isn't a choice.
 
@DSoup Maybe there is some acclimatisation but it depends on the game really, and also if someone is acclimatised to high end PC gaming.
For example, I can't tolerate games like Wipeout Omega, GT, Devil May Cry or fighting games running below 60fps at all. The DmC reboot at 30fps lacked flow compared to the butter-smoothness of Bayonetta and DMC4.
Games that rely on superfast reflexes and flow just don't feel the same. The gameplay is totally transformed for the worse at low framerates.

On the other hand, games like Uncharted 4, God of War, Spiderman and The Last of Us 2 were so well made, and had such stable performance, that it more than compensated for the 30fps experience.
There was no comparable game on PC, even though PCs had their own selection of super detailed games.

And there is also another phenomenon that makes some 60fps (or higher) games have an uncanny valley feel.
In a lot of games some animations seem to have been created with lower framerates in mind, in both cutscenes and gameplay. At smoother framerates they seem unnaturally fast, strange and weightless, whereas in other games the animation works perfectly at 60fps or above. I get that a lot with Horizon Forbidden West's and FF7 Remake's cutscenes. They look weird. But MGSV's and Tekken 8's cutscenes look very natural.

On the subject of acclimatisation, there are many people who also tend to get so used to super-smooth framerates that they can't appreciate what's actually presented on screen and how good a title is in terms of gameplay.
A good example of this was DriveClub. The game is obviously stable 30fps. Technically it is a huge achievement even by today's standards, with tons of subtle details and crazy weather simulation being calculated in real time. But I remember the title being trolled by PC enthusiasts as being a 24fps/choppy-framerate title, while there was no racing title on PC that competed head to head in terms of visual and technical achievement at the time of release. The acclimatisation to higher framerates played its role there.

Of course it is only expected that a console would start struggling in its final years. Six to seven years in the market, with an architecture usually based on something slightly older, is bound to show its age. PS360 struggled a lot more with framerate and resolution than PS4/Xbox One did, and I would say that the PS4 ended much more gracefully than the PS3 did. The PS4 still got butter-smooth titles like Doom Eternal and Resi 2 Remake running at 60fps, and highly detailed games like GoW and TLOU2 running at a stable 30fps. But I would say consoles in their first and middle years perform at a level that is considered outstanding, and that impression diminishes as time passes, which is pretty normal.
 
@DSoup Maybe there is some acclimatisation but it depends on the game really, and also if someone is acclimatised to high end PC gaming.
It feels like a lot of folks have no perception, understanding, or just acceptance, that every individual's attributes span a vast spectrum in terms of tolerances, preferences and adaptability.

I've been gaming since the 8-bit era, where most games had to run at 50/60Hz because the hardware simply didn't have enough RAM for two display buffers, or there was only one fixed display buffer. This was true of early consoles like the Atari 2600, and many early 8-bit computers. If you had a game that couldn't do 50/60Hz on the TV, you made it work and it was a flickery mess, or, if the video hardware allowed, you gave up a chunk of RAM for a second buffer and did your best.

But I wholeheartedly reject notions like 'most games are fine at 30fps'. Maybe they are for some, but for others they are not. However, if you are on a budget and can only afford hardware where the games run at 30fps (often closer to 20fps in the 360/PS3 generation), then you either put up with that, or you go without entirely. That doesn't make it "OK" or even "fine", it just means people will tolerate something that is not their preference. And people's capacity for tolerance will vary greatly.
 
A lot of 8-bit games ran with far lower framerates. Notably 3D ones! I don't think anyone was playing Elite at 50/60 fps! Lack of RAM wasn't a problem forcing faster refreshes. If you could only update the graphics, say, 25 or fewer times per second, the computer just scanned out the un-updated frame each screen cycle. So locked 50/60 Hz output, but actual graphical updates less than that.
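A toy loop illustrating that decoupling (pure simulation with made-up numbers, no real video hardware): the scanout runs at a fixed 50Hz, the game only finishes a new frame every other refresh, and the same buffer simply gets shown twice.

```c
/* Toy model: display refresh is fixed at 50Hz, but the game only produces a
   new frame every 2nd refresh (25fps), so unchanged frames are re-scanned. */
#include <stdio.h>

int main(void) {
    const int refresh_hz = 50;     /* fixed TV scanout rate */
    const int update_divider = 2;  /* game redraws every 2nd refresh -> 25fps */
    int frame_id = 0;              /* identifies the current buffer contents */

    for (int refresh = 0; refresh < 10; ++refresh) {
        if (refresh % update_divider == 0)
            ++frame_id;            /* game managed to draw a new frame */
        /* the hardware scans out whatever is in the buffer, updated or not */
        printf("refresh %2d (%dHz): showing frame %d\n", refresh, refresh_hz, frame_id);
    }
    return 0;
}
```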
 
A lot of 8-bit games ran with far lower framerates. Notably 3D ones! I don't think anyone was playing Elite at 50/60 fps! Lack of RAM wasn't a problem forcing faster refreshes. If you could only update the graphics, say, 25 or fewer times per second, the computer just scanned out the un-updated frame each screen cycle. So locked 50/60 Hz output, but actual graphical updates less than that.

There's something to be said about something being new and novel and the only way to play it. It's a lot easier to find that acceptable. Just look at how some people are willing to go back to 30 FPS to experience RT (I'm obviously not one of them :p). It's new and novel so it's more acceptable.

Once more examples of higher-framerate 3D games became available to compare against, those older, lower-framerate 3D games started to look, play and feel worse and worse, to the point where people who used to find them acceptable could no longer stand to play those games at those framerates.

I started playing Doom at lower than 20 FPS and loved it because it was novel and my PC at the time couldn't do any better, so I had no idea there was a better way to play it. There were no settings to adjust or anything. Then I got a new PC which could do higher than 30 and suddenly I could no longer stand to touch Doom on that older PC. Exact same game, the only difference being framerate, but at the higher framerate not only did it play so much better, it also looked massively better without the low-framerate judders and stutters.

Basically, acceptance of X thing because you have no experience with Y version of X thing doesn't mean X thing is preferable, desirable or even acceptable anymore once exposed to a version of X thing that offers a better experience (personal experience as that's a very subjective thing), in this case Y thing or in terms of this discussion 30 versus 60 FPS. Sure some people might still find X thing (in this case 30 FPS) acceptable, but some people will also no longer find it acceptable once exposed to Y thing (in this case 60 FPS).

Regards,
SB
 
Basically, acceptance of X thing because you have no experience with Y version of X thing doesn't mean X thing is preferable, desirable or even acceptable anymore once exposed to a version of X thing that offers a better experience (personal experience as that's a very subjective thing), in this case Y thing or in terms of this discussion 30 versus 60 FPS. Sure some people might still find X thing (in this case 30 FPS) acceptable, but some people will also no longer find it acceptable once exposed to Y thing (in this case 60 FPS).
Some also will be oblivious. I forced my PS3 into 720p to play Age of Booty at a smooth 60 fps. My friend played in 1080p. I showed him by alternating between the two options and he didn't particularly notice. He could tell they were different, but didn't see any point in 720p60. I wonder if that's something we can get trained on though? Like HDTV - at first it felt barely any different but now there's no going back.
 
Some also will be oblivious. I forced my PS3 into 720p to play Age of Booty at a smooth 60 fps. My friend played in 1080p. I showed him by alternating between the two options and he didn't particularly notice. He could tell they were different, but didn't see any point in 720p60. I wonder if that's something we can get trained on though? Like HDTV - at first it felt barely any different but now there's no going back.

Oh absolutely, it's why I noted that it was a very subjective thing in parentheses. :)

It's why one can never make absolute statements about something like this across a broad spectrum of consumers.

Regards,
SB
 
A lot of 8-bit games ran with far lower framerates. Notably 3D ones!
A lot of 8-bit games on some 8-bit hardware, sure - the BBC Micro, Commodore 64, later Ataris (800 etc.), ZX Spectrum and so on had sufficient RAM and/or a flexible enough graphics chip to do that. But most early 8-bit hardware - the Atari 2600/5200, the VIC-20, early Ataris (400 etc.), Dragon 32, Atmos, BBC Electron, TI-99 - had one frame buffer. So if you couldn't update the display in the vertical blank you got a mess on screen. There was a lot of shit 8-bit hardware out there you're ignoring.

I played the crap out of Elite on the C64 and that game was a flickery mess because, even using two frame buffers, it was copying the newly generated frame into VRAM whilst the TV was displaying that frame. Mercenary did 3D on the C64 much better.
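A rough sketch of that single-buffer constraint, with timing numbers invented purely for illustration: the only "safe" window to touch display memory is the vertical blank, and once the drawing work takes longer than that window, the beam shows the buffer while it is still being modified.

```c
/* Toy model of a single-framebuffer machine: drawing must fit inside the
   vertical blank or the display shows a half-updated buffer. The timings
   below are invented for illustration, not real hardware specs. */
#include <stdio.h>

int main(void) {
    const double vblank_ms = 1.6;  /* assumed blanking window per 50Hz field */
    const double workloads_ms[] = { 0.8, 4.0, 20.0 };
    const char  *labels[]       = { "sprite tweak", "busy 2D scene", "wireframe 3D" };

    for (int i = 0; i < 3; ++i) {
        if (workloads_ms[i] <= vblank_ms)
            printf("%-14s %5.1f ms: fits in the blank, clean update every field\n",
                   labels[i], workloads_ms[i]);
        else
            printf("%-14s %5.1f ms: overruns the blank -> torn/flickery frame "
                   "(or skip fields, or spend RAM on a second buffer)\n",
                   labels[i], workloads_ms[i]);
    }
    return 0;
}
```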
 
But let's talk about Ocarina of Time running at 17fps (locked! :runaway:) or so on my PAL N64. I am actually doing another playthrough right now and it's surprisingly still OK-ish, and there are many reasons why it's still playable today. First, I am playing on a very good CRT monitor (Sony Trinitron :love:), then the game is 240p (I think), and finally the textures and assets are mostly clean, even AAed, but quite low resolution even for 240p.

All those parameters lead to the point that there is not a lot of difference between static resolution and perceived in-motion (temporal) resolution (thank god for that CRT magic!). Retrospectively, I think this is the main reason I didn't care that much about the low framerate back then.

Fast forward to me playing PS4 games at 1080p 30fps on my cheap LCD monitor. The gap between static (screenshot) resolution and perceived temporal resolution is not a gap anymore, it's a gulf. It's like comparing 180p to 1080p, and everything looks awful 95% of the time (notably with motion blur enabled). What I think happens for most people is that their brain adapts to that gulf (because this is what we are made for: adapt to our environment, or die). But it shouldn't be that way, and it wasn't that way before. Shockingly, the temporal resolution we get in a 1080p 30fps and highly detailed game (in screenshots) can be lower than what we could get playing on an N64 running at 240p 20fps.
 
I remember the blowback on the forum here when Insomniac said their 30fps games sell more and went with 30fps.
Which is also an interesting metric to track, since in the end it comes down to the cash. I do not know if that conclusion still holds up today, but back then they said their research told them prettier pixels were more important than fps with regard to sales.

Would be interesting to find out if that holds true in 2023 also.
 
Insomniac then backtracked and produced 60 fps games, offering users choice.

Actually, metrics on which mode people prefer to play would answer this well and truly! Games come with performance and quality modes on consoles. Which one are people choosing? That'll tell us precisely who cares about better framerates and who doesn't.
 