Hi, just curious about the H.264 hardware decoding capability of the ATI X1K cards. I came across information that the X1K cards use their pixel shaders to provide hardware decoding of H.264 clips, with the X1300 handling 480p, the X1600 handling 720p, and the X1800 handling 1080p.
However, does anyone know whether increasing the number of pixel shaders actually provides better H.264 decoding? (e.g. the X1800 with 16 pipelines / 16 pixel shaders compared to the X1900 with 16 pipelines / 48 pixel shaders)
Any comments?