PS5 Pro *spawn

Doesn't anything that runs at 120Hz automatically accept 40Hz?

They just have to make the game flag the PS5 to output at 120Hz, I think?
Yup, it's a normal 120Hz mode.
Just display the same image for 3 frames while using vsync.

You could make a 24Hz mode by displaying each image for 5 frames. (For maximum cinematic experience.)
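The math only works when the target frame rate divides the output refresh rate evenly, which is exactly why 40fps needs a 120Hz container (40 doesn't divide 60). A quick sketch of that arithmetic, purely illustrative and not how any console firmware actually implements it:

```python
# Illustrative only: how sub-120 targets fit a 120Hz output signal by
# holding each rendered frame for a whole number of refreshes.
OUTPUT_HZ = 120  # refresh rate the console flags to the TV

def repeats_per_frame(target_fps: int, output_hz: int = OUTPUT_HZ) -> int:
    """How many output refreshes each rendered frame is held for."""
    if output_hz % target_fps != 0:
        raise ValueError(f"{target_fps} fps doesn't divide {output_hz}Hz evenly")
    return output_hz // target_fps

for fps in (120, 60, 40, 30, 24):
    n = repeats_per_frame(fps)
    print(f"{fps:>3} fps -> hold each frame for {n} refresh(es), "
          f"{1000 / fps:.1f} ms per frame")
```

At 40fps each frame is held for 3 refreshes; at 24fps, 5 refreshes, as above. Try `repeats_per_frame(40, 60)` and it raises, which is the whole reason a plain 60Hz signal can't carry 40fps with even pacing.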
 
I was kinda wondering this too. Anything HDMI 2 should "just work"; older HDMI versions, however, depend on the internal image processor in the TV itself. My 2014-era VIZIO M602i-B3 will accept a 40Hz input and automagically run it at 120Hz output.
Cerny said over 25% of PS5 owners own 120 fps-capable TVs, while around 1 in 10 PS5 players have variable refresh rate TVs.
 
Doesn't anything that runs at 120Hz automatically accept 40Hz?
I'm pretty sure any modern TV would work this way.

My question was centered around older TVs. 40Hz was never any sort of "standard" input method for HDMI; hell, the only reason it fits into modern sets is HDMI 2's support for VRR. My M602 is ten years old at this point, but it's a 120Hz LCD with 48-zone full-array local dimming. It's a pretty decent set, and there'd be no reason to replace it even plugged into a modern console. So, does a ten-year-old TV still work with 40Hz? I don't know that it's an assumption we can safely make, although in my Vizio's case, it seems to work just fine.

I have another 32" Samsung of the same era, and it has no ability to accept inputs other than 24, 30, and 60Hz. Anything else and it gives the blue "unsupported input" screen. At the same time, it cost like $200 ten years ago and would probably (but not certainly) have been replaced by anyone with the money to drop on a PS5 Pro.

So, I dunno... 🤷
 
I think there's an argument to be made that 60FPS is more important now than in the past because of the various temporal techniques in use. TAA and its derivatives produce fewer artifacts and less blur the higher the frame rate is. Ray tracing effects that use a radiance cache or temporal denoisers have less temporal lag and noise the higher the frame rate is. Lowering the internal resolution to hit 60 fps will obviously just produce even more artifacts, blur, and noise, though, so ideally developers would be conservative with geometry, shaders, and other performance-hitting factors so games can hit 60 FPS at a high internal resolution.
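To put rough numbers on the temporal-lag point: a TAA-style history buffer is typically an exponential blend, so stale data fades over a fixed number of frames, and higher frame rates shrink that window in real time. A toy model (the blend factor is an assumption, not any particular engine's value):

```python
# Toy TAA-style exponential history blend:
#   history = (1 - alpha) * history + alpha * current
# Convergence is fixed in *frames*, so more fps = less lag in *milliseconds*.
ALPHA = 0.1  # assumed per-frame blend weight, illustrative only

def frames_to_converge(threshold: float = 0.05, alpha: float = ALPHA) -> int:
    """Frames until the stale history's contribution drops below threshold."""
    error, frames = 1.0, 0
    while error > threshold:
        error *= 1 - alpha
        frames += 1
    return frames

n = frames_to_converge()  # ~29 frames with these numbers
for fps in (30, 60, 120):
    print(f"{fps:>3} fps: ~{n} frames = ~{n * 1000 / fps:.0f} ms of visible lag")
```

With these toy numbers the same 29-frame convergence costs roughly 970 ms of visible history at 30 fps but only about 240 ms at 120 fps, which is the intuition behind temporal techniques rewarding higher frame rates.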
 
Good point. I also find 60fps to be the minimum for Remote Play, so it's likely to be more important as streaming gains more traction.

I don't know if that's a limitation of the PS4/5's solution, but I'd hazard a guess that it's more to do with the responsiveness required.
 
I'm pretty sure any modern TV would work this way.

My question was centered around older TVs. 40Hz was never any sort of "standard" input method for HDMI; hell, the only reason it fits into modern sets is HDMI 2's support for VRR. My M602 is ten years old at this point, but it's a 120Hz LCD with 48-zone full-array local dimming. It's a pretty decent set, and there'd be no reason to replace it even plugged into a modern console. So, does a ten-year-old TV still work with 40Hz? I don't know that it's an assumption we can safely make, although in my Vizio's case, it seems to work just fine.

I have another 32" Samsung of the same era, and it has no ability to accept inputs other than 24, 30, and 60Hz. Anything else and it gives the blue "unsupported input" screen. At the same time, it cost like $200 ten years ago and would probably (but not certainly) have been replaced by anyone with the money to drop on a PS5 Pro.

So, I dunno... 🤷

I'd assume your TV is 1080p, though? I mean, if you need 40Hz for 1080p, I dunno if we want to go down that road on the software side.

But on the hardware side, I think the issue is that 120Hz at 4K is still priced as a premium feature rather than a commodity feature, so I don't know if we can even assume current sales favor those sets in terms of market share, much less the existing install base.

Good point. I also find 60fps to be the minimum for Remote Play, so it's likely to be more important as streaming gains more traction.

I don't know if that's a limitation of the PS4/5's solution, but I'd hazard a guess that it's more to do with the responsiveness required.

60 fps would cut latency down quite a bit over 30 fps. But I suspect the PS4/5 streaming solution isn't as optimized as it could be; improving it would likely require a more robust encoder and software stack (and possibly a first-party decoder on the client side as well), which might magnify the differences.
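For a rough sense of the frame-rate contribution alone (encode, network, and decode times aren't public, so this deliberately leaves them out):

```python
# Back-of-the-envelope: latency attributable purely to frame rate.
# Real Remote Play latency adds encode + network + decode on top.
def frame_time_ms(fps: int) -> float:
    return 1000 / fps

for fps in (30, 60):
    ft = frame_time_ms(fps)
    # Roughly one frame of render time plus up to one frame of wait
    # before the encoder sees a fresh image.
    print(f"{fps} fps: {ft:.1f} ms/frame, worst case ~{2 * ft:.1f} ms "
          f"before encoding even starts")
```

That's around 33 ms shaved off the pipeline's front end just by going from 30 to 60 fps, before any encoder improvements.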
 
Depends what your budget is, but the TCL TV I bought two and a half years ago cost me 650€ (well, 800€ before a 150€ rebate from TCL) for a 65" 4K/120 VRR-compatible set.

In comparison, my very first LCD TV, bought in 2006, was 900€ at the time for a 26".
 
Depends what your budget is, but the TCL TV I bought two and a half years ago cost me 650€ (well, 800€ before a 150€ rebate from TCL) for a 65" 4K/120 VRR-compatible set.

In comparison, my very first LCD TV, bought in 2006, was 900€ at the time for a 26".

I have a 65" TCL TV I bought last year at the same price, 4K/120 (or 144Hz) VRR compatible.
 
Depends what your budget is, but the TCL TV I bought two and a half years ago cost me 650€ (well, 800€ before a 150€ rebate from TCL) for a 65" 4K/120 VRR-compatible set.

In comparison, my very first LCD TV, bought in 2006, was 900€ at the time for a 26".

It's not that 4K 120Hz TVs aren't affordable, but that the feature itself is priced at a premium, which impacts the adoption rate. If you compare TVs within the same brand, 120Hz carries a rather noticeable premium. Meanwhile, the feature has a very specific usage and benefit case (even for console gaming it's not that widely supported at this point), so it isn't seen as much of a "must have" feature by the broader market, unlike something like FALD HDR.

Actually, if you look it up, even in the US you might be surprised at what the actual market share of 4K TVs is relative to 1080p, even today. It's not as high as you might think, even though 4K itself is basically "standard" at this point and cheap.
 

The PS5 Pro delivers a locked 60 fps at the same resolution and quality as the base PS5's fidelity mode in TLOU 2. Arguably better. This is very promising for PSSR before it has even officially shipped, and it points towards this hardware becoming standard in future consoles. It could also be very good news for those wanting to play GTA 6 at 60fps on console with good fidelity. I for one can't play games in performance modes on console because of how much the fidelity drops. Like, the Star Wars Jedi games look horrible to me in performance mode. I can see how I may end up buying a PS5 Pro now, but I think I will give it to my siblings and take back the Series X, then play with them on the PS5 Pro when I visit.
 

The PS5 Pro delivers a locked 60 fps at the same resolution and quality as the base PS5's fidelity mode in TLOU 2. Arguably better. This is very promising for PSSR before it has even officially shipped, and it points towards this hardware becoming standard in future consoles. It could also be very good news for those wanting to play GTA 6 at 60fps on console with good fidelity. I for one can't play games in performance modes on console because of how much the fidelity drops. Like, the Star Wars Jedi games look horrible to me in performance mode. I can see how I may end up buying a PS5 Pro now, but I think I will give it to my siblings and take back the Series X, then play with them on the PS5 Pro when I visit.
It isn't the same resolution as the fidelity mode. It's the same resolution and frame rate as the performance mode with a better reconstruction algorithm.
 
It isn't the same resolution as the fidelity mode. It's the same resolution and frame rate as the performance mode with a better reconstruction algorithm.
It's 4K 60 fps on the PS5 Pro; I don't get your point. Performance mode on the base machine is 1440p 60 fps!! Fidelity mode is 4K 30 fps, or did I get something wrong?
 
Of course people like 60fps versus 30fps, all else being equal. That's not a revelation. I am the same. But the thing is, I can still play 30fps games just fine. And I'd bet 99% of these people can too if they stop whining about it on paper and just focus on actually playing and enjoying a game instead of working from some preconceived notions. You get used to it very quickly, especially if it's a solid 30fps. It's not like 60fps is some new thing, either. Plenty of people were playing 60fps games last generation. The PlayStation 2 had tons of 60fps titles. People would regularly switch between 30fps and 60fps games. It was fine.

But with fixed-spec consoles, we're not in an 'all else being equal' situation. We're never gonna know what we're missing if we demand every developer make every game 60fps. Developers last generation would probably have made quite different games if 60fps had been some mandate. We'd have lost out on lots of great experiences. I simply don't think it's reasonable to demand that any developer water down their ambitions or throw them away entirely based on this lackluster notion of 30fps somehow not being acceptable anymore. I don't think most of these gamers really understand that this isn't necessarily just a case of scaling resolution or graphics a bit.

EDIT: I will admit I do forget about the OLED argument when talking about 30fps. I've done it a couple of times in this forum alone, lol. I'm still not sure what percentage of the console install base is using OLED, but I'd guess it's still a fairly small, though not insignificant, minority. I also consider it a straight-up major flaw of the technology for gaming, and I don't think the whole world of game development should have to rearrange its plans just to cater to it. Plus, there is still a platform for those who want both ambitious games and great framerates in any game...
It isn't just OLED; it's any good display with a low response time. I have a mini-LED with a very good response time, and 30 fps looks horrible.

IMO, 30 fps is barely usable anywhere, and I'd rather sacrifice visuals to get 60 (or ideally 120), but that's why I play on PC. I think the reason 30 fps is controversial now is that for a while consoles only had 30 fps, so everyone was used to it; after the PS4, people started getting used to 60, and at the start of the PS5 generation almost every game had a performance mode.
 
It isn't 4K 60. It's 1440p 60 reconstructed to 4K via PSSR.
So it's not 4K, but 1440p reconstructed to 4K via PSSR, which is what I was referring to, and that output is 4K!! It's 4K 60 fps. Maybe you could have said it's not native 4K. But you kind of missed my point that the output is 4K.
 
Yea, y'all are still not getting it.

30fps is not my 'preference' (I literally stated this outright in the post you responded to, smh). This has nothing to do with a preference for 30fps vs 60fps. I don't know why this is so hard to get, even though I feel I'm explaining it well enough. I'm merely stating that 30fps is still entirely playable/acceptable, and that if we demand every game be 60fps, we will miss out on developers being able to push the hardware and their ambitions. This isn't about 30fps Quality vs 60fps Performance. We're talking about '30fps Horizon Zero Dawn' versus '60fps some other game entirely, because 60fps was too limiting for their ambitions'.

I can't stress enough that if we look back and had demanded everything in the past be 60fps, we'd have lost out on countless classics and amazing games, all because gamers demanded developers throw away their ambitions and what they really wanted to make in order to hit this 'required' 60fps framerate. It'd be ridiculous.

And yes, it's a downside of OLED technology if it can't display 30fps content smoothly when other display types can. It just is, no matter how much you want to ignore it. Demanding every developer avoid 30fps simply because OLED is bad at it is asinine.
You bring up a good point, and it's why GTA 6 is going to come in at a standard 4K 30 fps. I personally can't stand how console titles look in performance mode; the textures all look nasty. The PS5 Pro really fixes that for me: I can enjoy 60 fps with the same or better fidelity as the 4K 30 fps mode on the base consoles. And as you said, I'm one of the people who really doesn't care for 60 fps when it comes to certain games, like action-adventure titles. Even in racing titles like Forza, I recall putting it into some sort of performance mode and the fidelity just dropped substantially; the improved fps didn't matter at all to me when things looked like potatoes. I don't blame developers for wanting to push visual limits in terms of fidelity.
 
So it's not 4K, but 1440p reconstructed to 4K via PSSR, which is what I was referring to, and that output is 4K!! It's 4K 60 fps. Maybe you could have said it's not native 4K. But you kind of missed my point that the output is 4K.
By that logic, the existing performance mode was 4K too. It upscales 1440p to 4K using TAA.
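For concreteness, the pixel arithmetic behind the disagreement: output resolution and internal render resolution are different quantities, and both TAA upscaling and PSSR emit a 3840x2160 image while rendering well under half of native 4K's pixels:

```python
# Output vs. internal render resolution in plain numbers.
NATIVE_4K = 3840 * 2160          # 8.29 MP output either way

for name, (w, h) in {
    "1440p internal": (2560, 1440),
    "native 4K":      (3840, 2160),
}.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MP ({px / NATIVE_4K:.0%} of native 4K)")
```

1440p renders about 3.69 MP per frame, roughly 44% of native 4K's 8.29 MP, regardless of which reconstruction fills in the rest.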
 
It isn't 4K 60. It's 1440p 60 reconstructed to 4K via PSSR.
That's the point of PSSR. rntonga is saying PS5 Pro delivers on better PS5 gaming - '4K quality modes at higher framerates' - because PSSR enables it. He did not say PS5 Pro is rendering native 4K60 and there's no need for it to.
 
That's the point of PSSR. rntonga is saying PS5 Pro delivers on better PS5 gaming - '4K quality modes at higher framerates' - because PSSR enables it. He did not say PS5 Pro is rendering native 4K60 and there's no need for it to.
He said, "The PS5 Pro delivers a locked 60 fps at the same resolution and quality as the base PS5's fidelity mode in TLOU 2. Arguably better." That is just not true. One may subjectively prefer the look of PSSR, but it has objective pros and cons when compared to fidelity mode. The quality is not the same, and 1440p with reconstruction is not the same resolution as native 4K.
 