I wonder if you understand what HDR actually means; there seem to be some misconceptions here.
The main difference is the numerical range used to represent light. In the LDR case intensity is locked to a fixed range, typically 0 to 1.0 (sometimes 2.0), and a simple 8-bit integer stores it as a value from 0 to 255. For HDR the intensity can go from 0 up to any desired level, so you need a much wider range and precision that adapts to the magnitude, which is why the internal representation is a floating point number. Google it if you want a proper explanation; the short summary is that the decimal point can be "moved" around, trading fixed precision for a far wider range.
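To make that concrete, here's a minimal sketch (the helper name encode_ldr is just made up for illustration) showing what a fixed 8-bit range does to an intensity above 1.0, and why a float buffer keeps it:

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>

// LDR: intensity is clamped to [0, 1] and quantized to 8 bits.
uint8_t encode_ldr(float intensity) {
    float clamped = std::min(std::max(intensity, 0.0f), 1.0f);
    return static_cast<uint8_t>(clamped * 255.0f + 0.5f);
}

int main() {
    float sun  = 5.0f;  // an HDR intensity well above 1.0
    float wall = 1.0f;  // an ordinary fully lit white surface

    // In an 8-bit buffer the sun and the white wall both land on 255,
    // so the fact that the sun was 5x brighter is lost forever:
    printf("sun  -> %u\n", encode_ldr(sun));   // 255
    printf("wall -> %u\n", encode_ldr(wall));  // 255

    // In a float buffer the 5.0 survives and can be tone-mapped later:
    float hdr_buffer[2] = { sun, wall };
    printf("hdr  -> %.2f, %.2f\n", hdr_buffer[0], hdr_buffer[1]);
    return 0;
}
```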
Once you switch to HDR and floats, the only thing that depends on the exact format is precision. Even with only 10 bits for the entire number (the X360's format) you can still cover a far wider range of light intensities than an 8-bit integer provides, as the sketch below shows. So it is not "MDR", just low-precision HDR; there is no such thing as "MDR".
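Here's a rough sketch of decoding such a 10-bit unsigned float, assuming the 7e3 layout commonly attributed to the X360's EDRAM (3-bit exponent, 7-bit mantissa, exponent bias of 3); treat the exact constants as assumptions, but the point about range holds regardless:

```cpp
#include <cstdint>
#include <cstdio>

// Decode a 10-bit unsigned float: bits 9..7 = exponent, bits 6..0 = mantissa.
// Assumed layout: exponent bias 3, denormals when the exponent field is 0.
float decode_7e3(uint16_t bits) {
    uint32_t e = (bits >> 7) & 0x7;  // 3-bit exponent
    uint32_t m = bits & 0x7F;        // 7-bit mantissa
    if (e == 0)                      // denormal: no implicit leading 1
        return (m / 128.0f) * 0.25f;               // scale = 2^(1 - 3)
    return (1.0f + m / 128.0f) * (1 << e) / 8.0f;  // scale = 2^(e - 3)
}

int main() {
    printf("min nonzero: %f\n", decode_7e3(0x001)); // ~0.00195
    printf("max:         %f\n", decode_7e3(0x3FF)); // 31.875
    // Compare: an 8-bit integer tops out at an intensity of 1.0 with a
    // step of 1/255 everywhere. The 10-bit float reaches ~32x higher
    // while keeping fine steps near zero -- low precision, but still HDR.
    return 0;
}
```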