So according to GN, screen tearing is new to the console audience. Eh? It's been there on consoles for 20 years, probably longer.
Edit: and I'm not comfortable with the way they're comparing the PC at 240 Hz to the PS5 outputting at 120 Hz and then captured at 240 Hz. As a comparison of how the PS5 and PC experiences can differ there's something to be said for it, but as a means of comparing the raw performance of PS5 vs PC, I think it's likely to skew results to some degree in favour of the PC.
A machine limited to vsync at 8.3 ms intervals like the PS5 - as opposed to 4.2 ms on a 240 Hz PC - is naturally likely to have worse 0.1% and 1% lows, as there will be more instances of long waits for synchronisation - particularly if you're not triple (or more) buffered all the way through the pipeline. Likewise, the PC's average fps gets a boost because there will be instances where fps can go above 120 (quick sketch below).
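To make that concrete, here's a minimal Python sketch. It assumes simple double-buffered vsync (a late frame waits for the next refresh) and a hypothetical workload of ~7 ms renders with some variance - both assumptions are mine for illustration, not measurements of any actual game. It feeds the same render times through 120 Hz and 240 Hz vsync quantisation and compares the resulting averages and 1% lows:

```python
import math
import random

def vsync_frame_times(render_times_ms, refresh_hz):
    """Quantise each render time up to the next vsync boundary
    (simple double-buffering: a late frame waits for the next refresh)."""
    interval = 1000.0 / refresh_hz
    return [max(1, math.ceil(t / interval)) * interval for t in render_times_ms]

def fps_stats(frame_times_ms):
    fps = sorted(1000.0 / t for t in frame_times_ms)
    avg = sum(fps) / len(fps)
    one_pct_low = fps[len(fps) // 100]  # crude 1% low: 1st-percentile fps
    return avg, one_pct_low

random.seed(0)
# Hypothetical workload: ~7 ms average render time with some variance,
# so a chunk of frames miss the 8.3 ms budget at 120 Hz.
renders = [max(1.0, random.gauss(7.0, 1.5)) for _ in range(10_000)]

for hz in (120, 240):
    avg, low = fps_stats(vsync_frame_times(renders, hz))
    print(f"{hz:>3} Hz vsync: avg {avg:6.1f} fps, 1% low {low:6.1f} fps")
```

Identical render times, but the 120 Hz run shows both a lower average and a lower 1% low purely because of where the vsync boundaries fall: a frame that just misses 8.3 ms drops to 16.7 ms at 120 Hz but only to 12.5 ms at 240 Hz. That's the skew I mean.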
The only fair test IMO is to run both on 120 Hz displays.