PureVideo and AVIVO

DemoCoder said:
The fact that "Bad Edit Correction" and "Inverse Telecine" are available on the 6800U but not on any other 6x00 product, even the "fixed" 6600 series video processor, seems to suggest that inverse telecine/cadence detection is based on clock performance, which suggests that on the 6x series, it is implemented via "software" on the video processor.
Ok, but the 6600GT is 500 MHz, 100 MHz beyond the 6800U: perhaps these features are implemented via the pixel shaders (16 pipelines versus 8), not on the VP... :oops:
 
I don't believe NVIDIA are using the pixel shaders; rather, there are two elements to the PureVideo capabilities in hardware. There is some sort of fixed-function hardware that is programmable to a limited degree.
It sounds to me, and fits my previous understanding, that PureVideo needs to be updated later on for new capabilities to be accelerated in hardware.

I thought FS video confirmed that.
 
The fixed function parts are generally iDCT and motion comp. Those don't need to be updated to do cadence detection or noise reduction.
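For context, cadence detection itself boils down to spotting the 3:2 repeat pattern: in telecined film, one same-parity field out of every five interlaced frames is a repeat of the previous frame's field, so its difference score is near zero while the others are large. A toy Python sketch of that idea follows; the frames layout, the naive field_diff, and the 0.1 ratio are illustrative assumptions, not how NVIDIA's or ATI's hardware actually works, and the hard part in practice is doing this per pixel, in real time, on noisy sources.

```python
# Toy 3:2 cadence detector. "frames" is a hypothetical list of
# (top_field, bottom_field) pairs, each field a flat list of pixel values.

def field_diff(a, b):
    """Sum of absolute differences between two fields."""
    return sum(abs(x - y) for x, y in zip(a, b))

def looks_telecined(frames, ratio=0.1):
    """
    In 3:2 pulldown material, the same-parity field repeats once every five
    interlaced frames, so one phase out of five shows near-zero differences.
    """
    diffs = [field_diff(frames[i][0], frames[i - 1][0]) for i in range(1, len(frames))]
    if len(diffs) < 10:
        return False
    for phase in range(5):
        repeats = diffs[phase::5]          # candidate "repeated field" positions
        others = [d for i, d in enumerate(diffs) if i % 5 != phase]
        if repeats and others and max(repeats) < ratio * min(others):
            return True
    return False
```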
 
I must say that the release of the video and our discussion were so timely it was bizarre! Of course FS have been working on their video for at least two months.
At least it means that NVIDIA and ATI are fighting for the HTPC market with their GPUs as well as fighting in the 3D realm. Both companies seem to be putting a lot of effort into getting the best video playback quality. This is a big win for the consumer.

Another funny coincidence: we got a V7350 in at work to play with... sending her out for review in a workstation tomorrow, so not enough time to really put it through its paces.
 
And to go a little tangential, but still playing off Tahir's point above... we had a convo recently in IRC about the HQV benchmark. One member felt pretty strongly that at least the final numerical score was, shall we say, not particularly useful <add colourful descriptors here>.

Useful for what, though? I'm not so sure it hasn't been useful in lighting a fire under NV and ATI to ratchet up the competition for the SD IQ title; and as a consumer I find that quite useful... This is what popular benchmarks do: make companies want to win at them.

Sure, there can be abuse when that happens (objectifying any test makes it subject to abuse: sort of the Heisenberg Uncertainty Principle applied to benchmarks), and misinformation about what the final score is actually portraying.

But it seems to me that HQV has had a significant positive effect on the industry over the last year in spurring new efforts to improve SD IQ.
 
DemoCoder said:
The reality is, most DVDs created today are "clean" edits, use MPEG-2 flags appropriately, and don't benefit as much from non-standard cadence detection. Crappy TV series and el-cheapo produced DVDs do, but major film studios? No. King Kong, for example, does not need it, and will look the same on PureVideo, AVivo, and $26 CyberHome Chinese players.

This is because the DVD tools used by studios have continually evolved. Producing a high-quality encode with clean transitions between edits is practically fully automated now.

Interesting. When I had a Hauppauge WinTV-D card (with DScaler) about 4 years ago, lots of past TV series suffered from horrid bad edits. I saw it on CSI, X-Files, JAG, etc.: just about any TV production that was filmed (on actual photographic film), then edited/posted on video. Is the situation better now? (I got rid of my WinTV-D because it didn't like Win/XP.)

Now that TV box sets are a significant chunk of the DVD revenue pie, I'd make the counterargument that 'bad edits' are here to stay and, sadly, will be a burden for the customer-side playback/display hardware.
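To put DemoCoder's point about flags in concrete terms: on a cleanly authored disc the pulldown is signalled by the MPEG-2 picture flags (repeat_first_field and friends), so the player never has to guess; a bad edit is a cut that breaks that flag cadence mid-pattern, which is where detection and re-syncing come in. A rough sketch, where the Picture tuple and the strict 2-3 alternation are simplifying assumptions for illustration:

```python
# Illustration only: clean soft-telecined material can be reversed just by
# honoring the flags; a splice that breaks the 2-3 pattern needs detection.

from collections import namedtuple

# repeat_first_field (rff) and top_field_first (tff) come from each picture's
# coding extension in a real MPEG-2 stream; here they are just supplied values.
Picture = namedtuple("Picture", ["rff", "tff"])

def find_cadence_breaks(pictures):
    """In clean 3:2 flagged film, rff alternates every coded frame (2,3,2,3...
    fields emitted); report positions where that alternation breaks."""
    return [i for i in range(1, len(pictures))
            if pictures[i].rff == pictures[i - 1].rff]

clean = [Picture(rff=i % 2, tff=1) for i in range(8)]   # 2,3,2,3,... fields
edited = clean[:3] + clean[:5]                          # cut pasted mid-cadence
print(find_cadence_breaks(clean))    # [] -> flags alone suffice
print(find_cadence_breaks(edited))   # [3] -> break at the splice
```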

But it seems to me that HQV has had a significant positive effect on the industry over the last year in spurring new efforts to improve SD IQ.

That's a big plus for HTPC buffs. But honestly, I can't see HTPCs ever breaking out of their niche -- not with the big corporations (Sony, Microsoft, etc.) fighting to put closed consoles into the TV room. One can only hope the same open benchmarks (and future ones) can be run on closed console systems. The DirecTV HD-PVR used to report the MPEG-2 bitstream resolution, until users noticed DirecTV was sending out '1080i' programming at 1280x1080i (instead of ATSC's 1920x1080i). DirecTV's response? They removed the program resolution from the readout.

The fixed function parts are generally iDCT and motion comp.

I've noticed handheld 3D cores using the 3D shader pipeline to perform motion compensation. This seems like a blessing and a curse to me. On the one hand, the same silicon can be used for multiple purposes (lowering the total die area needed to handle both 3D and video processing). On the other, the pipeline(s) now have to clock to ridiculous frequencies to deliver the necessary motion-comp throughput. ATI and NVIDIA still have fixed logic to handle certain iDCT and motion-comp stages, but I wonder if they'll eventually adopt a more 'unified' approach.
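For what it's worth, the motion-comp step being talked about here is, at its core, a dependent read from the reference frame at a motion-vector offset plus an add of the iDCT residual, which is why it maps so naturally onto a texture fetch in a pixel shader. A rough sketch of one block, with the frame layout, block size, and clamping being illustrative assumptions rather than any particular decoder's implementation:

```python
# Block-based motion compensation for one block: predict from the reference
# frame at the motion-vector offset, then add the decoded residual.

def motion_compensate_block(ref, residual, mv, x, y, bs=16):
    """
    ref      : reference frame as a list of rows of pixel values
    residual : bs x bs block of decoded iDCT output
    mv       : (dx, dy) motion vector in whole pixels (no sub-pel here)
    (x, y)   : top-left corner of the block in the current frame
    """
    h, w = len(ref), len(ref[0])
    dx, dy = mv
    out = []
    for j in range(bs):
        row = []
        for i in range(bs):
            # Clamp the source coordinates so the prediction stays inside the frame.
            sx = min(max(x + i + dx, 0), w - 1)
            sy = min(max(y + j + dy, 0), h - 1)
            # Reference prediction plus residual, clipped to 8-bit range.
            row.append(min(max(ref[sy][sx] + residual[j][i], 0), 255))
        out.append(row)
    return out
```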

I miss the days when ATI held a complete lead over the rest of the industry.
A Pentium III/800 MHz equipped with an ATI Rage 128 Pro could play MPEG-2 1920x1080i (30 fps) at full speed.
 
Murakami said:
Ok, but the 6600GT is 500 MHz, 100 MHz beyond the 6800U: perhaps these features are implemented via the pixel shaders (16 pipelines versus 8), not on the VP... :oops:
This sounds pretty likely...

The 7800GS doesn't support inverse telecine or bad edit correction...
 
I studied the video review very closely. I noticed that when the earth's surface was in motion during NVIDIA's inverse telecine, I could barely see any pixelization, making the motion completely smooth... a lot smoother than ATI's. Is this because NVIDIA's inverse telecine was being output at 1080i, or was the output normal DVD quality?
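As a side note on why properly inverse-telecined material looks smooth regardless of the output format: the repeated fields are discarded and the original progressive film frames are woven back together, so there is no combing or bob judder on motion. A toy sketch of that reconstruction, where the exact field ordering and the fixed 3:2 phase are assumptions for illustration (real hardware has to find the phase first, as in the cadence-detection sketch earlier):

```python
# Reassemble 24 fps progressive frames from a 3:2 pulldown field sequence.

def weave(top, bottom):
    """Interleave a top and a bottom field (lists of rows) into one frame."""
    frame = []
    for t_row, b_row in zip(top, bottom):
        frame.append(t_row)
        frame.append(b_row)
    return frame

def inverse_telecine(fields):
    """
    fields: flat list of fields in the classic top-field-first 3:2 order
            At Ab  Bt Bb Bt  Cb Ct  Db Dt Db  (repeating every 10 fields).
    Returns four recovered film frames per group of ten fields.
    """
    frames = []
    for g in range(0, len(fields) - 9, 10):
        At, Ab, Bt, Bb, _Bt, Cb, Ct, Db, Dt, _Db = fields[g:g + 10]
        frames += [weave(At, Ab), weave(Bt, Bb), weave(Ct, Cb), weave(Dt, Db)]
    return frames
```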
 