Energy loss artifacts in Nvidia's own demo?!
To be honest, in this case, things seem to be the opposite.
There is RTXGI in this demo, and since it's GI, it's additive to the lighting, i.e. it adds more light to the scene, so the brighter image with DLSS might well be caused by this.
In this case, the decoupled probes should be updated at roughly 2x the rate with DLSS, and the per-probe ray budget is usually something like 144 rays per probe per frame. This would obviously lead to faster lighting convergence with DLSS On, and probably a little more energy in the image as a result.
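To make the convergence argument concrete: DDGI/RTXGI-style probes blend each frame's ray results into the stored irradiance with a hysteresis factor, so a higher frame rate means more blend steps per second of wall-clock time. Here's a toy sketch of that (the hysteresis constant and target value are made up, this is not actual RTXGI code):

```cpp
#include <cstdio>

// Toy model of a DDGI/RTXGI-style probe texel: each frame the new ray-traced
// result is blended into the stored irradiance with a hysteresis factor.
// Constants (hysteresis = 0.97, target = 1.0) are invented for illustration.
int main() {
    const float hysteresis = 0.97f;   // fraction of the old value kept each frame
    const float target     = 1.0f;    // steady-state irradiance the rays report

    auto converge = [&](float fps, float seconds) {
        float irradiance = 0.0f;      // lights just turned on, probe starts dark
        int frames = static_cast<int>(fps * seconds);
        for (int i = 0; i < frames; ++i)
            irradiance = hysteresis * irradiance + (1.0f - hysteresis) * target;
        return irradiance;
    };

    // Same wall-clock time, different frame rates (e.g. DLSS Off vs On).
    printf("after 1s @ 60 fps:  %.3f\n", converge(60.0f, 1.0f));   // ~0.84, dimmer
    printf("after 1s @ 120 fps: %.3f\n", converge(120.0f, 1.0f));  // ~0.97, closer to target
    return 0;
}
```

So after the same amount of real time, the higher-framerate run is simply further along in the exponential blend, which reads as a slightly brighter (more converged) GI.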
Though, unlike RTXGI, which is decoupled from screen resolution and doesn't require screen-space denoising, there is also RTXDI, which relies heavily on screen-space sampling and on denoising.
Usually, modern denoisers filter out very bright signal to suppress the shimmering "firefly" artifacts that would otherwise be visible, hence the image becomes darker due to the energy loss you mentioned (though "bias" is the more common term here, since it covers both energy loss and other systematic errors in lighting).
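To show where the energy actually goes, here's a toy firefly clamp (the cap value and sample data are invented; real denoisers use fancier, often adaptive or soft clamps, but the loss mechanism is the same):

```cpp
#include <algorithm>
#include <cstdio>

// Toy firefly suppression: clamp each sample's luminance to a fixed cap before
// accumulation. Everything a firefly contributed above the cap is thrown away,
// so the accumulated image gets darker.
int main() {
    // A mostly-dim signal with one "firefly": a very bright single-sample spike.
    float samples[] = { 0.2f, 0.3f, 0.25f, 12.0f, 0.2f, 0.3f, 0.25f, 0.2f };
    const int   n   = 8;
    const float cap = 1.0f;   // hypothetical luminance clamp

    float sumRaw = 0.0f, sumClamped = 0.0f;
    for (int i = 0; i < n; ++i) {
        sumRaw     += samples[i];
        sumClamped += std::min(samples[i], cap);   // firefly energy is discarded
    }

    printf("mean before clamp: %.3f\n", sumRaw / n);       // ~1.71
    printf("mean after clamp:  %.3f\n", sumClamped / n);   // ~0.34 -> darker image
    return 0;
}
```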
The denoiser's blurring can also introduce some additional bias, and this bias is typically higher at lower resolutions, so reconstructing the signal from a lower-res render can add to the energy loss (see the sketch below).
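A rough illustration of why the same blur hurts more at lower resolution: a kernel with a fixed radius in pixels covers twice the screen area at half res, so a thin bright feature gets smeared and dimmed more aggressively. The 1D signal and kernel radius here are made up for illustration:

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Box-blur a 1D "image" with a fixed pixel radius (edge samples renormalized).
static std::vector<float> boxBlur(const std::vector<float>& in, int radius) {
    std::vector<float> out(in.size(), 0.0f);
    for (size_t i = 0; i < in.size(); ++i) {
        float sum = 0.0f; int count = 0;
        for (int k = -radius; k <= radius; ++k) {
            int j = static_cast<int>(i) + k;
            if (j >= 0 && j < static_cast<int>(in.size())) { sum += in[j]; ++count; }
        }
        out[i] = sum / count;
    }
    return out;
}

int main() {
    // Full-res signal: dark background with a 2-pixel-wide bright feature.
    std::vector<float> fullRes(16, 0.1f);
    fullRes[7] = fullRes[8] = 4.0f;

    // Half-res version of the same scene (averaged pixel pairs).
    std::vector<float> halfRes(8);
    for (int i = 0; i < 8; ++i) halfRes[i] = 0.5f * (fullRes[2 * i] + fullRes[2 * i + 1]);

    const int radius = 2; // same denoiser kernel radius in pixels at both resolutions

    std::vector<float> blurredFull = boxBlur(fullRes, radius);
    std::vector<float> blurredHalf = boxBlur(halfRes, radius);
    float peakFull = *std::max_element(blurredFull.begin(), blurredFull.end());
    float peakHalf = *std::max_element(blurredHalf.begin(), blurredHalf.end());

    printf("blurred peak at full res: %.3f\n", peakFull);  // ~1.66
    printf("blurred peak at half res: %.3f\n", peakHalf);  // ~0.88 -> more bias
    return 0;
}
```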
Unfortunately, we don't have unbiased path-traced reference images to compare against, so it's impossible to tell which image (DLSS On or Off) has more bias here, but I wouldn't worry about that anyway, since there are literally thousands of other things that introduce discrepancies from an unbiased path-traced render.