I have my two cents to add to this. I bought an ATi X1600 Pro AGP. I'm not impressed with its MPEG-2 decoding.
My PVR box grabs TV using a Hauppauge PVR-250 card, and playing that back with any DVD decoder, like PowerDVD or even nVidia's PureVideo decoder, on the ATi card produces a horrible smearing effect. The only way to eliminate it is to force "Bob" deinterlacing in CCC and switch DVD playback to VMR HQ mode. DVDs also exhibit this to a certain extent, where some things leave smear trails, like light sources. I don't think this is desirable in any kind of MPEG-2 playback. Software-mode playback is perfect, but then I lose the AVIVO tech.
In particular, some of my recordings of "Smallville" show hideous trails on Lex Luthor's bald head: bits of where his head used to be get overlaid, in a lighter colour, on where it is now. 'tis horrible. Same with Lana Lang's cheek, its previous position superimposed on the new one as a lighter shade. I honestly doubt 'tis something wrong with the card, and some colours seem to exhibit this more than others. Forcing the deinterlacing mode to "Bob" and switching to VMR HQ mode eliminates this altogether, producing decent playback.
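For anyone unfamiliar, "Bob" is about the dumbest deinterlacing method there is: each field is shown on its own, line-doubled, so nothing from the previous field can blend into the current frame, which is presumably why it kills the trails. A minimal CPU sketch of the idea (purely illustrative; the function name and grayscale buffer layout are mine, and the real thing runs on the GPU):

```cpp
#include <cstdint>
#include <vector>

// Minimal "bob" deinterlace sketch: build a progressive frame from ONE field
// by line-doubling it. Because only the current field is used, nothing from
// the previous field can smear into the output -- unlike motion-adaptive
// deinterlacers that blend fields and can leave trails when they guess wrong.
// Grayscale (one byte per pixel) to keep the illustration short.
std::vector<uint8_t> bobField(const std::vector<uint8_t>& frame,
                              int width, int height, bool topField)
{
    std::vector<uint8_t> out(frame.size());
    int first = topField ? 0 : 1;          // which lines belong to this field
    for (int y = 0; y < height; ++y) {
        // Nearest line of the chosen field (a real bob would interpolate
        // between field lines instead of just repeating the nearest one).
        int src = y - ((y - first) & 1);   // snap y onto a field line
        if (src < 0) src = first;
        for (int x = 0; x < width; ++x)
            out[y * width + x] = frame[src * width + x];
    }
    return out;
}
```

The trade-off is halved vertical resolution, which is the whole reason "advanced" motion-adaptive deinterlacers exist; they blend or weave fields and can produce exactly these trails when they guess wrong.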
It doesn't seem right to pay for advanced deinterlacing when it smears video all over the place.
This is the reason I've gone back to my 6600 GT. To be honest, even DVDs look clearer than on the Radeon X1600, irrespective of what the reviews say.
nVidia's PureVideo hardware has three components:
a) MPEG-2 decode engine
b) Motion estimation engine (used for encoding)
c) Programmable video processor (used for WMV HD and H.264 decoding; this is broken on the NV40)
The deinterlacing is done on the pixel shaders from what I've heard.
From what I've seen with the 6600 GT and the ATi X1600 (as reported by DXDIAG.EXE):
nVidia 6600 GT: ModeMPEG2_A, ModeMPEG2_B, ModeMPEG2_C, ModeMPEG2_D, ModeWMV9_B, ModeWMV9_A
ATi X1600: ModeMPEG2_C, ModeMPEG2_D, ModeWMV8_B, ModeWMV8_A, ModeWMV9_B, ModeWMV9_A
With the PureVideo decoder, the ATi card runs in Modes C and D (the more hardware-accelerated modes; as I understand it, C and D add IDCT offload on top of the motion compensation that A and B provide), while the nVidia card runs in Modes A and B, which could explain the differences in CPU utilisation. I also think the mode is picked in the order listed above: for unencrypted MPEG-2 clips, Mode A is used on nVidia and Mode C on ATi, with Modes B and D (the encrypted variants) used for DVD playback on the respective cards, even though the nVidia card reports Modes C and D too.
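If you want to pull the same list programmatically instead of squinting at dxdiag, the DXVA2 API on Vista exposes the decoder GUIDs directly; dxdiag just maps those GUIDs to the friendly ModeMPEG2_x names. A rough sketch (MSVC; link d3d9.lib, dxva2.lib and ole32.lib; error handling omitted for brevity):

```cpp
// Sketch: enumerate the DXVA decoder GUIDs a card's driver exposes,
// via the DXVA2 API (Vista onwards).
#include <windows.h>
#include <d3d9.h>
#include <dxva2api.h>
#include <cstdio>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed = TRUE;
    pp.SwapEffect = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat = D3DFMT_UNKNOWN;

    IDirect3DDevice9* dev = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, GetDesktopWindow(),
                      D3DCREATE_SOFTWARE_VERTEXPROCESSING, &pp, &dev);

    IDirectXVideoDecoderService* svc = nullptr;
    DXVA2CreateVideoService(dev, __uuidof(IDirectXVideoDecoderService),
                            reinterpret_cast<void**>(&svc));

    UINT count = 0;
    GUID* guids = nullptr;
    svc->GetDecoderDeviceGuids(&count, &guids);  // the driver's decoder modes

    for (UINT i = 0; i < count; ++i) {
        WCHAR s[64];
        StringFromGUID2(guids[i], s, 64);        // dxdiag maps these GUIDs
        wprintf(L"%ls\n", s);                    // to names like ModeMPEG2_C
    }

    CoTaskMemFree(guids);
    svc->Release(); dev->Release(); d3d->Release();
    return 0;
}
```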