A format doesn't have to be an IEEE-recognized standard to be a recognized standard within one particular segment of the industry, among the devices pertaining to that segment, of course.
Meanwhile, mentions and uses of 24-bit (and 40-bit, and other "strange-looking" widths) are peppered all over the tech industry, and throughout IEEE itself. The word "standard" can be a bit strange in its own right, since it brings market penetration and similar factors into play: competing "standards" often headbutt all the way until one comes to dominate. And devices frequently use different bit widths in different areas, so which ones count as "most crucial"?
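For a sense of what these widths actually mean in practice, here's a minimal sketch comparing the relative precision of a few float layouts. The fp24 split (1 sign / 7 exponent / 16 mantissa bits) is the commonly cited DirectX 9 minimum for pixel shaders, but treat the exact layouts here as assumptions for illustration:

    import math

    # Mantissa bits for each assumed layout (excluding the implicit leading 1).
    FORMATS = {
        "fp16 (1s/5e/10m)": 10,   # half precision
        "fp24 (1s/7e/16m)": 16,   # the 24-bit format in question
        "fp32 (1s/8e/23m)": 23,   # IEEE 754 single precision
    }

    for name, mantissa_bits in FORMATS.items():
        # Machine epsilon: the gap between 1.0 and the next representable value.
        epsilon = 2.0 ** -mantissa_bits
        # Rough count of reliable decimal digits.
        digits = mantissa_bits * math.log10(2)
        print(f"{name}: epsilon = {epsilon:.3g}, ~{digits:.1f} decimal digits")

Run it and fp24 lands at roughly 4.8 reliable decimal digits versus about 6.9 for fp32, which gives some scale to the precision debate: a real difference, but hardly night and day for most shader work.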
As for nVidia's comments, it seems a bit strange to complain when we haven't remotely seen the full extent of 24-bit yet, when their higher-precision implementation comes at a massive performance cost, and when the entire industry they are releasing those products into has adopted 24 bits as a "standard" if anything is, something that was a known factor long before launch.