It's for forcing HDR10 and SDR (yes, it also converts SDR) into the DVLL (Dolby Vision Low Latency) profile. The HDR10 signal does not stay the same inside the DVLL container; it's converted into a DTM (Dynamic Tone Mapping, not to be confused with dynamic metadata) applied signal, so tone mapping is still improved vs vanilla HDR10 despite lacking dynamic metadata. Examples of DTM conversion are LG OLED dynamic tone mapping, Panasonic UHD Blu-ray players' HDR Optimizer, Lumagen Radiance's DTM, MadVR's tensor core assisted DTM (the standalone version sold for $5,000~$9,999 uses a GeForce 2060~2070), Oppo/Sony UHD Blu-ray players' DVLL conversion, DVLL conversion via an EDID exploit using the HD Fury Integral and above, and last but not least the Apple TV 4K and above.
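To make "DTM applied signal" a bit more concrete, here's a rough Python sketch of what a source-side tone map does once it knows the display's peak: pass everything below a knee through untouched and roll the highlights off smoothly instead of clipping them. This is only a toy knee-plus-roll-off curve for illustration, not Dolby's (or anyone else's) actual algorithm, and the 1,000-nit peak and 75% knee are made-up numbers.

```python
# Toy illustration of a source-side "DTM applied" conversion: instead of sending
# raw HDR10 PQ values and hoping the display copes, the source maps scene
# luminance into the display's known peak before encoding the low-latency signal.
# NOT Dolby's math, just a simple knee + Reinhard-style roll-off.

def tone_map_nits(scene_nits: float, display_peak: float = 1000.0,
                  knee: float = 0.75) -> float:
    """Map scene luminance (nits) into [0, display_peak].

    Below `knee * display_peak` the image passes through untouched;
    above it, highlights are compressed smoothly instead of clipped.
    """
    knee_nits = knee * display_peak
    if scene_nits <= knee_nits:
        return scene_nits
    # Reinhard-style compression of everything above the knee
    headroom = display_peak - knee_nits
    excess = scene_nits - knee_nits
    return knee_nits + headroom * excess / (excess + headroom)

if __name__ == "__main__":
    for nits in (100, 500, 900, 1500, 4000, 10000):
        print(f"{nits:>6} nits in scene -> {tone_map_nits(nits):7.1f} nits on a 1000-nit panel")
```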
The Xbox Series DVLL DTM conversion method is in the same vein as the Oppo/Sony BDPs, the HD Fury EDID exploit, and the Apple TV 4K. By utilizing Dolby Vision Profile 5, it can auto-convert SDR/HDR10 into a DTM-applied DVLL signal. It's especially useful for projector owners whose higher-end models support the PQ EOTF but can never handle HDR properly due to poor native peak brightness, and thus need a state-of-the-art tone mapping solution to get around that. Before the arrival of DVLL DTM auto conversion, owners of HDR-capable projectors had to either live with the mediocre DTM quality of the Panasonic HDR Optimizer or pay a pretty penny for a Lumagen Radiance. DVLL DTM, while still not as good as Lumagen and MadVR (both of those use frame-by-frame dynamic tone mapping by analyzing the luminance of each individual frame, which DVLL DTM can't do without dynamic metadata), will still be a superior DTM solution for the vast majority of HDR-capable TVs. For the Xbox Series the display does need to support Dolby Vision, but the HD Fury can get away with an EDID override, so it works even on non-Dolby-Vision displays such as Samsung TVs and many projectors.
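And for contrast, here's roughly what frame-by-frame DTM means: measure each frame's own peak and compress only as much as that frame needs, so a dim frame passes through untouched while a 4,000-nit frame gets squeezed. Again purely conceptual Python, not Lumagen's or MadVR's actual processing; the percentile trick and the linear compression above the knee are my own simplifications.

```python
# Rough sketch of the per-frame analysis that Lumagen/madVR-style DTM performs:
# measure each frame's actual peak and compress only as much as that frame needs.
# Purely conceptual, not either vendor's algorithm.
import numpy as np

DISPLAY_PEAK = 1000.0  # nits; assumed panel capability

def frame_peak_nits(frame_nits: np.ndarray, percentile: float = 99.9) -> float:
    """Estimate a frame's peak luminance, ignoring a few stray specular pixels."""
    return float(np.percentile(frame_nits, percentile))

def dynamic_tone_map(frame_nits: np.ndarray, knee: float = 0.75) -> np.ndarray:
    """Per-frame tone map: dark frames pass through, bright frames are compressed
    only as much as this frame's own peak requires."""
    peak = frame_peak_nits(frame_nits)
    if peak <= DISPLAY_PEAK:
        return frame_nits                      # frame already fits the panel
    knee_nits = knee * DISPLAY_PEAK
    # Map [knee, this frame's peak] onto [knee, display peak]; clamp the handful
    # of pixels that may sit above the estimated peak.
    compressed = knee_nits + (DISPLAY_PEAK - knee_nits) * (
        (frame_nits - knee_nits) / (peak - knee_nits))
    compressed = np.minimum(compressed, DISPLAY_PEAK)
    return np.where(frame_nits <= knee_nits, frame_nits, compressed)

if __name__ == "__main__":
    dim_frame = np.random.uniform(0, 300, size=(4, 4))      # never exceeds the panel
    bright_frame = np.random.uniform(0, 4000, size=(4, 4))  # mastered highlights way above it
    print(dynamic_tone_map(dim_frame).max(), dynamic_tone_map(bright_frame).max())
```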
It could also work wonders for displays that don't support tone mapping at all. The Sony Z9D, for example, puts out 1,800 cd/m² of peak luminance in a 10% window, but due to its age it doesn't support tone mapping at all, so Sony chose to clip highlights above 1,500 nits.
http://www.hdtvtest.co.uk/news/kd65zd9-201610164372.htm
"Even though
the Sony 65″ ZD9 had been judged to only resolve highlight detail up to 1500 nits in specialised test patterns, we actually saw very little – if any – loss of bright highlights in comparison with a Samsung UHD Premium TV deemed to be capable of resolving 4000 nits. Here are a couple of frameshots (photo exposure intentionally lowered to fully capture highlight detail) from
Pan 4K Blu-ray which has been mastered to 4000 nits:"
The vast majority of Sony Z9D owners seem to prefer Dolby Vision over HDR10 too, as Dolby Vision can tone map the entire 10,000-nit spectrum, whereas with HDR10 any highlight information above 1,500 nits gets discarded. Moving games from SDR/HDR10 to DVLL could also have a side benefit: the Z9D's game mode is well known to desaturate colors, so the DVLL conversion may or may not help with that desaturation.
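For a sense of how much signal a 1,500-nit hard clip actually throws away: the ST 2084 (PQ) inverse EOTF puts 1,500 nits at roughly 80% of the way up the code range, so very roughly the top fifth of the signal sits above the Z9D's clip point. The constants below are the standard PQ values; reading the result as "share of the signal range" is just my shorthand for the illustration.

```python
# SMPTE ST 2084 (PQ) inverse EOTF: how much of the HDR10 signal range sits above
# a given clip point. Constants are the standard ST 2084 values; interpreting the
# remainder as a "share of the signal range" is only for illustration.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Linear luminance in nits (0..10000) -> normalized PQ signal value (0..1)."""
    y = max(nits, 0.0) / 10000.0
    ym = y ** M1
    return ((C1 + C2 * ym) / (1 + C3 * ym)) ** M2

if __name__ == "__main__":
    for clip in (1000, 1500, 4000):
        e = pq_encode(clip)
        print(f"{clip:>5} nits = PQ {e:.3f} -> {100 * (1 - e):.1f}% of the signal range above it")
```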
My Panasonic GZ2000 OLED TV could also benefit. While my GZ2000 in game mode still keeps 100% identical killer color accuracy to its cinema mode, it chooses to clip instead of tone map (DTM can still be applied, but I don't like how Panasonic's DTM brightens up mid-tones and hurts dynamic range), and while clipping is fine as long as the display supports HGIG, mine unfortunately doesn't. Since my GZ2000 handles 1,000 nits of peak brightness perfectly, DVLL conversion could work as a great workaround for the lack of HGIG and provide better DTM than my TV's own... until I can pony up $10,000 for a MadVR Envy.
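To spell out why I'd rather have HGIG than the built-in DTM: HGIG leaves everything under the panel's peak completely alone and only clips what exceeds it, while a DTM curve also touches content well below the peak. The "DTM-ish" curve in this sketch is completely made up (it is not Panasonic's processing); it's only there to show the kind of mid-tone shifting I'm complaining about.

```python
# HGIG-style behaviour vs a generic DTM curve, purely as an illustration.
# hgig_style: no tone mapping below the panel peak, hard clip above it.
# dtm_like: an invented curve that raises darker/mid content and pulls
#           brighter content down toward the panel peak.

DISPLAY_PEAK = 1000.0  # nits

def hgig_style(nits: float) -> float:
    """HGIG behaviour: pass through untouched below the peak, clip above it."""
    return min(nits, DISPLAY_PEAK)

def dtm_like(nits: float, lift: float = 0.9) -> float:
    """Made-up DTM-ish curve: lifts low/mid content, compresses the rest."""
    x = min(nits, 10000.0) / 10000.0
    return DISPLAY_PEAK * (x ** lift) / ((x ** lift) + DISPLAY_PEAK / 10000.0)

if __name__ == "__main__":
    for nits in (50, 200, 800, 1000, 4000):
        print(f"{nits:>5} nits: HGIG -> {hgig_style(nits):6.1f}   DTM-ish -> {dtm_like(nits):6.1f}")
```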
It is true, though, that another calibration is likely to be needed. Owners of Sony BDPs and the Apple TV 4K have complained about color changes after engaging forced DVLL, and I expect the same from the Xbox Series. I'd better check it out myself next week too.