Digital Foundry: Microsoft Xbox One X & S supports Freesync 2 [2017-04-11]

Mmhm.

The key thing is consistency, since that is what your gaming mind wants when controlling your character. A fluctuating controller response feels bad regardless of how it's presented.

When you go from 50 fps to 40 fps you add 15 ms of input lag (assuming the game has 3 frames of input lag).
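That 15 ms figure is just the per-frame time delta multiplied by the pipeline depth; a quick sketch (hypothetical helper, only restating the arithmetic):

```python
# Added input lag when frame rate drops, assuming latency is a fixed
# number of frames deep (the pipeline depth). Hypothetical helper.
def added_lag_ms(fps_before, fps_after, pipeline_frames=3):
    frame_ms_before = 1000.0 / fps_before  # 20 ms at 50 fps
    frame_ms_after = 1000.0 / fps_after    # 25 ms at 40 fps
    return (frame_ms_after - frame_ms_before) * pipeline_frames

print(added_lag_ms(50, 40))  # 5 ms extra per frame x 3 frames = 15.0 ms
```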
 
Depending on the technology, tearing will occur if the frame rate exceeds the monitor's refresh rate, so you will need something to keep it under control in those situations.
http://www.144hzmonitors.com/list-of-freesync-monitors/

I thought that most Freesync displays can go up to 144 Hz, but apparently a 75 Hz maximum is pretty common. There even seem to be a few models with a 60 Hz maximum. V-sync instead of tearing would obviously be the right choice when you exceed 75 Hz (or whatever the maximum is).
When you go from 50 fps to 40 fps you add 15 ms of input lag (assuming the game has 3 frames of input lag).
If your game engine has 3 frames of input lag, you have already failed. Killzone 3 had 3 frames of latency on PS3. Most PS3 games had an additional frame of latency because they were doing lighting and post-processing on the SPUs: after the GPU frame ended, the intermediate results were read by the SPUs, so lighting and post-processing ran one frame late compared to the GPU.

I'd guess that one frame of engine latency is the most common case nowadays. I'd be surprised if any games had 3 frames of engine latency.
 
If your game engine has 3 frames of input lag, you have already failed. Killzone 3 had 3 frames of latency on PS3. Most PS3 games had an additional frame of latency because they were doing lighting and post-processing on the SPUs: after the GPU frame ended, the intermediate results were read by the SPUs, so lighting and post-processing ran one frame late compared to the GPU.

I'd guess that one frame of engine latency is the most common case nowadays. I'd be surprised if any games had 3 frames of engine latency.

IIRC, Killzone 2 was around 5 frames. Made it really hard to enjoy that game.
 
I'd guess that one frame of engine latency is the most common case nowadays. I'd be surprised if any games had 3 frames of engine latency.

Maybe I expressed myself sloppily, but I seem to remember DF interviews on games like Need for Speed where the devs were quite proud of their 2.5 frames of internal latency (from the controller, IIRC). I think Carmack stated somewhere that Rage had two frames and that they almost shipped with 3 frames due to a bug.
 
IIRC, Killzone 2 was around 5 frames. Made it really hard to enjoy that game.
Maybe I expressed myself sloppily, but I seem to remember DF interviews on games like Need for Speed where the devs were quite proud of their 2.5 frames of internal latency (from the controller, IIRC). I think Carmack stated somewhere that Rage had two frames and that they almost shipped with 3 frames due to a bug.
I also think I expressed myself sloppily. I should have said clearly that I was talking about additional engine latency. Obviously every single game will have at least one frame of latency (as you need to process and render that frame). The GPU runs asynchronously from the CPU and will usually finish roughly half a frame later (it could be a bit faster on consoles). So the baseline for the best experience is 1.5 frames of latency. At 60 fps (16.6 ms per frame) that is 25 ms.

If the baseline 1.5-frame latency is taken into account, then 2.5 frames is the gold standard for modern games. The Trials games also had 2.5-frame latency, as the physics was multithreaded: results were applied at the beginning of the next frame and rendering started immediately. 2.5-frame latency at 60 fps = 41.6 ms.

It is possible to reach roughly half a frame less latency if the game has relatively simple simulation and game logic. In this case you first simulate everything, run the game logic and then start rendering immediately. This means that you can't start pushing rendering commands at the beginning of the frame, because you don't know the location of your objects and camera yet. If we assume simple simulation and game logic, the GPU could start receiving draw calls halfway into the frame. In the fully buffered scenario the rendering starts at the beginning of the next frame, so we are talking about a 0.5-frame difference in total latency. Games like Overwatch could thus reach 2 frames of total latency.

The absolute minimum of 1.5 frames of total latency is only possible in games that have basically no game logic. You sample controls, move the character and start rendering immediately. No physics simulation or anything fancy is possible.
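The latency tiers above work out to the following milliseconds at 60 fps (a small sketch; `total_latency_ms` is a hypothetical helper that just multiplies frames by frame time):

```python
# Total input latency in milliseconds for a pipeline that is a given
# number of frames deep, at a given frame rate. Hypothetical helper.
def total_latency_ms(frames, fps=60):
    return frames * 1000.0 / fps

cases = [
    (1.5, "minimal: sample controls, move, render immediately"),
    (2.0, "simple simulation/game logic (the Overwatch-style case)"),
    (2.5, "fully buffered simulation (the 'gold standard')"),
]
for frames, desc in cases:
    print(f"{frames} frames at 60 fps = {total_latency_ms(frames):.1f} ms  ({desc})")
```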
 
Freesync works on HDMI 1.4a - 2.0x.
Supported. There are literally dozens of HDMI features in the specification that no TV supports. Support is definitely the first step, but celebrate only when manufacturers actually sell sets that fully provide the features of the standards they claim to support. Lots of TVs should support 4K at 60 Hz (or even higher), but they don't. :no:
 
Supported. There are literally dozens of HDMI features in the specification that no TV supports. Support is definitely the first step, but celebrate only when manufacturers actually sell sets that fully provide the features of the standards they claim to support. Lots of TVs should support 4K at 60 Hz (or even higher), but they don't. :no:

Are you saying that there are HDMI 2.0 supporting TVs that do not support 4K at 60 fps?
 
Are you saying that there are HDMI 2.0 supporting TVs that do not support 4K at 60 fps?

Yes.

https://hardforum.com/threads/4k-60hz-4-4-4-hdmi-2-0-tv-database.1837209/

Early in the HDMI 2.0 generation, HDCP 2.2 chips couldn't do 4:4:4 chroma at 4K60 over HDMI 2.0, so you'd either have to drop to 4K30 or do 4K60 at 4:2:0 or 4:2:2 chroma.

http://www.audiogurus.com/learn/news/4k-hdmi-2-0-compatible-hdcp-2-0/2718

Some manufacturers released sets that could do 4k60 (rare) while just not including HDCP 2.2.

And even then, you'd only get it on 1 HDMI port in many cases. The others would either not support 4k or not support 4k60 in any form.
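A rough back-of-the-envelope calculation shows why 4K60 4:4:4 sat right at the edge for that early silicon (my own arithmetic, assuming the standard 594 MHz 4K60 pixel clock and HDMI's 10-bits-on-the-wire TMDS characters; figures approximate):

```python
# Approximate TMDS bandwidth needed for 4K60 at different chroma
# subsampling levels. HDMI 2.0 tops out at 18 Gbps across 3 lanes.
PIXEL_CLOCK_HZ = 594e6  # 3840x2160 @ 60 Hz, including blanking intervals

def tmds_gbps(bits_per_pixel):
    # TMDS encodes 8 data bits into 10-bit characters on the wire
    return PIXEL_CLOCK_HZ * bits_per_pixel * 10 / 8 / 1e9

print(f"4:4:4, 8-bit (24 bpp): {tmds_gbps(24):.2f} Gbps")  # brushes the 18 Gbps cap
print(f"4:2:0, 8-bit (12 bpp): {tmds_gbps(12):.2f} Gbps")  # fits with headroom
```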

Regards,
SB
 
Are you saying that there are HDMI 2.0 supporting TVs that do not support 4K at 60 fps?

The TV I had before this one would accept a 4K60 signal, but the processor inside didn't have the bandwidth to handle it, so it would drop every other frame.
 
Are you saying that there are HDMI 2.0 supporting TVs that do not support 4K at 60 fps?

Beaten by others, but this disparity is not isolated to HDMI and TVs; it is common to connectors providing a data bus, like HDMI and USB. At work we've repurposed these hardware connectors for non-standard networking and discovered immediately that not all connectors and controllers are equal. Many devices declare themselves as HDMI XX or USB YY because they are electrically compatible and the controller can communicate with other devices as the specification requires. It definitely does not mean all features of the standard are supported by the host hardware. :nope:
 
http://www.144hzmonitors.com/list-of-freesync-monitors/

I thought that most Freesync displays can go up to 144 Hz, but apparently a 75 Hz maximum is pretty common. There even seem to be a few models with a 60 Hz maximum. V-sync instead of tearing would obviously be the right choice when you exceed 75 Hz (or whatever the maximum is).
Thanks for sharing that list. I've been kinda torn, but I think I will be getting a 4K Freesync monitor instead of a TV to use with a future PC or console -50/50 here, it depends on Scorpio's E3-. I like the image quality a bit more and I have enough 1080p TVs at home, plus I miss using a monitor to play console games. I think I can do without HDR, but we shall see.

I've seen some 4K Freesync monitors starting from 290€.
 
Thanks for sharing that list. I've been kinda torn, but I think I will be getting a 4K Freesync monitor instead of a TV to use with a future PC or console -50/50 here, it depends on Scorpio's E3-. I like the image quality a bit more and I have enough 1080p TVs at home, plus I miss using a monitor to play console games. I think I can do without HDR, but we shall see.

I've seen some 4K Freesync monitors starting from 290€.

Also note that is a list of only FreeSync branded monitors. There are more that support VESA adaptive sync but do not carry the FreeSync branding.

Regards,
SB
 
Where has anyone read that Scorpio will have HDMI 2.1?

HDMI 2.1 specifies a 48 Gbps-capable port, a special cable, and support for resolutions beyond 4K 60 Hz. At the moment there is no indication of Scorpio doing this; besides, many references in the DF article use the terminology HDMI 2.0.

Yet, they do speak about Freesync 2 and VRR, features known from HDMI 2.1. The thing is, this does not mean the HDMI port is 2.1!
Yes, these are future features of HDMI 2.1, but they are features that can theoretically also be present on HDMI 2.0, depending on manufacturers and their system designs (HDMI 2.0c?). If you check HDMI's webpage you can see this about VRR:

Game Mode VRR

Q: Does this require the new HDMI cable?

A: No

So, no need for extra bandwidth! Meaning that HDMI 2.0 can theoretically do it!

And why should it not? Well, VRR is not that different from Freesync (basically the same tech, different producers), although it's an improvement over it. More like Freesync 2!

Freesync is already available on AMD cards, and AMD claims all cards supporting Freesync will support Freesync 2. So unless the HDMI implementation prevents it, VRR can also be supported on some HDMI 2.0 devices (Freesync requires only HDMI 1.2, and since AMD claims all Freesync-supporting cards will support Freesync 2, this extends Freesync 2 support to AMD Radeon™ RX 480, AMD Radeon™ RX 470, AMD Radeon™ RX 460, Radeon Pro Duo, AMD Radeon R9 300 Series, AMD Radeon R9 Fury X, AMD Radeon R9 360, AMD Radeon R7 360, AMD Radeon R9 295X2, AMD Radeon R9 290X, AMD Radeon R9 290, AMD Radeon R9 285, AMD Radeon R7 260X and AMD Radeon R7 260).

So, as I see it, supporting these features does not make the Scorpio HDMI a 2.1!
 
Where has anyone read that Scorpio will have HDMI 2.1?

HDMI 2.1 specifies a 48 Gbps-capable port, a special cable, and support for resolutions beyond 4K 60 Hz. At the moment there is no indication of Scorpio doing this; besides, many references in the DF article use the terminology HDMI 2.0.

Yet, they do speak about Freesync 2 and VRR, features known from HDMI 2.1. The thing is, this does not mean the HDMI port is 2.1!
Yes, these are future features of HDMI 2.1, but they are features that can theoretically also be present on HDMI 2.0, depending on manufacturers and their system designs (HDMI 2.0c?). If you check HDMI's webpage you can see this about VRR:

Game Mode VRR

Q: Does this require the new HDMI cable?

A: No

So, no need for extra bandwidth! Meaning that HDMI 2.0 can theoretically do it!

And why should it not? Well, VRR is not that different from Freesync (basically the same tech, different producers), although it's an improvement over it. More like Freesync 2!

Freesync is already available on AMD cards, and AMD claims all cards supporting Freesync will support Freesync 2. So unless the HDMI implementation prevents it, VRR can also be supported on some HDMI 2.0 devices (Freesync requires only HDMI 1.2, and since AMD claims all Freesync-supporting cards will support Freesync 2, this extends Freesync 2 support to AMD Radeon™ RX 480, AMD Radeon™ RX 470, AMD Radeon™ RX 460, Radeon Pro Duo, AMD Radeon R9 300 Series, AMD Radeon R9 Fury X, AMD Radeon R9 360, AMD Radeon R7 360, AMD Radeon R9 295X2, AMD Radeon R9 290X, AMD Radeon R9 290, AMD Radeon R9 285, AMD Radeon R7 260X and AMD Radeon R7 260).

So, as I see it, supporting these features does not make the Scorpio HDMI a 2.1!

Why does it matter? Is anyone asserting that Scorpio having "HDMI 2.1" support is going to allow it to deliver on any of the additional features of HDMI 2.1 beyond VRR? Maybe people are just using it as shorthand because it's easier than typing, "supports the VRR feature of HDMI 2.1 when connected to a VRR-capable HDMI 2.1 display".
 
Also note that is a list of only FreeSync branded monitors. There are more that support VESA adaptive sync but do not carry the FreeSync branding.

Regards,
SB
So any VESA adaptive sync monitor is going to work with Scorpio or an AMD graphics card that supports Freesync? Some of the issues with early HDMI 2.0 TVs that you mentioned before aside, I am sure I will be getting a monitor, taking into account that I have very little space in my room and usually play close to the TV. If I need the TV experience in the living room I still have the 1080p TV I got in 2013 -too bad about the 3D- and a Samsung from late 2009 that I got along with my siblings.

Looking into 4K Freesync monitors I've found some interesting models like these:

http://www.lg.com/us/monitors/lg-32UD99-W-4k-uhd-led-monitor

http://www.pcworld.com/article/3182...date-heralding-a-new-era-for-pc-displays.html

http://4k.com/monitor/lg-27ud68-review-4k-uhd-ips-monitor-with-freesync-lg-27ud68-w-lg-27ud68-p/

This one costs about 500€ at Amazon, without HDR:

https://www.amazon.com/gp/product/B01CH9ZTI4/ref=as_li_qf_sp_asin_il_tl?tag=4k0e-20&ie=UTF8&camp=1789&creative=9325&linkCode=as2&creativeASIN=B01CH9ZTI4&linkId=c1fc63ae5854b61ad7a2da0d790bd2b0

This Samsung 28" model costs 383€, no HDR, and reportedly has great colours:

http://www.coolmod.com/samsung-u28e590d-28-4k-freesync-monitor-precio?virtuemart_category_id=12

The cheapest I found is the LG 24UD58-B monitor, at 290€ new and 260€ -second hand-:

https://www.amazon.es/LG-24UD58-B-M...=2025&creative=165953&creativeASIN=B01M64QNU2
 
You have to make sure it is Freesync over HDMI, not just that it's a Freesync monitor... most do it over DisplayPort, I think.
 
Thanks for sharing that list. I've been kinda torn but I think I will be getting a 4K Freesync monitor instead of a TV to use with a future PC or console -50/50 here, it depends on Scorpio's E3-. I like the image quality a bit more and I have enough 1080 TVs at home, plus I miss using the monitor to play console games. I think I can do without HDR, but we shall see.

I've seen some 4K Freesync monitors starting from 290€.

You might regret this. *Good* HDR is a really impressive and obvious upgrade over what we have now. You may need the differences between VRR and standard displays demonstrated with side-by-side videos, and with 4K you may need to be sitting right in front of the display to notice the difference. The benefits of HDR you can see immediately, from across the room.
 
Why does it matter? Is anyone asserting that Scorpio having "HDMI 2.1" support is going to allow it to deliver on any of the additional features of HDMI 2.1 beyond VRR? Maybe people are just using it as shorthand because it's easier than typing, "supports the VRR feature of HDMI 2.1 when connected to a VRR-capable HDMI 2.1 display".

Why does it matter?
1 - Because 2.1 has extra features, and claiming it's 2.1 when it's not is misleading. This is a tech forum, and this is the kind of thing that is discussed here. A "why does it matter" was not the answer I was expecting on this forum.
2 - Because since the R7 260 has it, PS4 Pro might also be able to get it. Maybe even the PS4, since it has already been shown that its HDMI implementation allows for extra features: it now supports HDR and has an HDMI port above 1.2. So I think it is quite relevant.
 
Why does it matter?
1 - Because 2.1 has extra features, and claiming it's 2.1 when it's not is misleading. This is a tech forum, and this is the kind of thing that is discussed here. A "why does it matter" was not the answer I was expecting on this forum.
2 - Because since the R7 260 has it, PS4 Pro might also be able to get it. Maybe even the PS4, since it has already been shown that its HDMI implementation allows for extra features: it now supports HDR and has an HDMI port above 1.2. So I think it is quite relevant.

Attempting to define a device's capabilities (especially a source device's) solely by the version of HDMI it supports is wrong. If you don't do that, you won't have these problems. This should be clear from the very thing you point out: the OG PS4 was able to add HDR support (an HDMI 2.0 feature). What HDMI revision is that device, then?
 