What if, instead of encoding a normal map as a point in Cartesian coordinates in 32 bits, you simply encoded it as a pair of angular deflections? Unless I misunderstand how normals are encoded, most of the representable space is just wasted. I'm basically assuming that the RGB data corresponds to a point on the unit sphere in the upper half-space, i.e. [255, 0, 0] decodes to (1, 0, 0), but [255, 255, 255] isn't a usable value since it's a point on the cube circumscribing the sphere, not on the sphere itself. Is this correct? If so, why not simply store the normal as a pair of angles, each effectively varying between 0 and pi? I guess it depends on how fast a GPU can compute sines and cosines.
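To make the idea concrete, here's a minimal sketch in Python of what a spherical-coordinate version of this might look like, assuming tangent-space normals with z >= 0 and 8 bits per angle. The function names and the exact angle ranges (azimuth over 2*pi, polar deflection over pi/2) are my own choices for illustration, not any engine's actual format:

```python
import math

def encode_normal_angles(n):
    """Encode a unit normal (x, y, z) with z >= 0 as two 8-bit angles."""
    x, y, z = n
    azimuth = math.atan2(y, x)                     # angle around +z, in [-pi, pi]
    polar = math.acos(max(-1.0, min(1.0, z)))      # deflection from +z, in [0, pi/2]
    a8 = round((azimuth + math.pi) / (2 * math.pi) * 255)  # quantize to [0, 255]
    p8 = round(polar / (math.pi / 2) * 255)
    return a8, p8

def decode_normal_angles(a8, p8):
    """Reconstruct the unit normal from the two quantized angles."""
    azimuth = a8 / 255 * 2 * math.pi - math.pi
    polar = p8 / 255 * (math.pi / 2)
    sp = math.sin(polar)
    return (sp * math.cos(azimuth), sp * math.sin(azimuth), math.cos(polar))

# Round-trip example:
n = (0.3, 0.4, math.sqrt(1 - 0.25))   # unit-length, z >= 0
a8, p8 = encode_normal_angles(n)
print(decode_normal_angles(a8, p8))   # roughly (0.3, 0.4, 0.866)
```

Every (a8, p8) pair decodes to a valid point on the hemisphere, so no codes are wasted, but decoding costs a sin/cos pair per texel, which is exactly the trade-off alluded to above.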
I'm kinda talking out of my ass here, though. Is that approach even feasible, or is it already being done?