DeanoC said:
To be pedantic, the nVIDIA example is NOT a normal map. A normal by definition always has normalized length. A normal is defined on the unit sphere.

Um, actually it is a normalized normal map in this case. The technique relies on the base texture having texel values normalized to length one. The rest is a mathematical abstraction approximating what would happen if you took enough samples of the base texture and averaged the lighting results — and that averaging is precisely why the MIP map levels do not have texels normalized to length one, so the resulting texture read is not normalized either.
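The denormalization described above is easy to demonstrate numerically: box-filter mipmapping averages texels component-wise, and the average of two distinct unit vectors is always shorter than one. A minimal Python sketch (the two example normals are made up for illustration):

```python
import math

def normalize(v):
    # Scale a vector to unit length.
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

# Two unit-length normals, e.g. adjacent texels in a bumpy region.
n1 = normalize((1.0, 0.0, 1.0))
n2 = normalize((-1.0, 0.0, 1.0))

# Box-filter mipmapping averages the texels component-wise.
avg = tuple((a + b) / 2.0 for a, b in zip(n1, n2))
avg_length = math.sqrt(sum(c * c for c in avg))
print(avg_length)  # ~0.707: the averaged "normal" is no longer unit length
```

This is why shaders typically renormalize after the texture read (or use a normalization cube map, as older hardware did) when lighting from mipmapped normal maps.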
Edit:
Oh, and by the way, the reason "normalized normal vector" is not redundant is that the word "normal" by itself implies only perpendicularity, not unit length.