Fudo tosses the 'c' word at NVIDIA again, this time re Pure Video

What happened in that Rydermark scandal?

edit: I know what it was about, but was it ever proved one way or the other?
 
Hmm, I don't know the specific details of the test in question, but adjusting your behaviour to look good on static frames is a fairly common strategy in video playback.

If, say, the video in question is interlaced, then there are several ways you can display it.
You can
a) Display the two fields woven together. (Known as weave)
b) Display the odd/even fields separately, scaling each up to fill the output. (Known as bob)
c) Use some kind of adaptive scheme where you interpolate between weave and bob depending on the characteristics of the video.

Now if you have a static scene, then weave is the way to display it, as it provides twice the vertical resolution of a bobbed image and will not wobble up and down the way bobbing can. If, however, the scene is moving, weave will produce combing artifacts that look quite unpleasant, in which case bobbing is the way to go. If you have the processing power, you can decide which to do on a per-pixel basis, which gives better results than either on its own.
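
If you're curious what the per-pixel version looks like, here's a rough sketch of the idea in Python (purely illustrative: the function name, the simple frame-difference motion metric and the threshold are all made up, and real hardware deinterlacers are far more elaborate):

[code]
import numpy as np

def deinterlace_adaptive(prev_frame, cur_frame, motion_threshold=12.0):
    """Toy per-pixel motion-adaptive deinterlacer.

    Both inputs are full woven frames (2-D greyscale arrays). The even
    lines are treated as the fresh field; the odd lines are kept as-is
    (weave) where they barely changed since the previous frame, and
    replaced by an interpolation of the neighbouring even lines (bob)
    where motion is detected.
    """
    out = cur_frame.astype(np.float32).copy()
    h, _ = cur_frame.shape

    for y in range(1, h - 1, 2):
        # Per-pixel motion estimate: how much this odd line changed.
        motion = np.abs(cur_frame[y].astype(np.float32) -
                        prev_frame[y].astype(np.float32))
        # Bobbed value: average of the even lines above and below.
        bob = (cur_frame[y - 1].astype(np.float32) +
               cur_frame[y + 1].astype(np.float32)) / 2.0
        # Blend factor: 0 = weave (keep the line), 1 = bob.
        alpha = np.clip(motion / motion_threshold, 0.0, 1.0)
        out[y] = (1.0 - alpha) * cur_frame[y] + alpha * bob

    return out.astype(cur_frame.dtype)
[/code]

On a static scene alpha stays near zero everywhere, so you keep full vertical resolution; where there is motion it goes to one and you get the combing-free bobbed result.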

There are other methods out there as well, but this demonstrates that adjusting behaviour when the image is static is a perfectly legitimate thing to do. If the result of this causes other artifacts, then it is the fault of the test for not picking them up, not the fault of the graphics card/drivers.

CC
 
More like a noise reduction issue. ;)

It definitely is. I've had serious problems with this myself on my GTS. I thought it was my 10m VGA cable at one point :oops:

Ghosting with max noise reduction is absolutely terrible and renders any movie completely unwatchable.

However... turn noise reduction down to 50% and the ghosting completely goes away! Still great picture quality as well.
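
That fits how a simple recursive temporal noise reducer behaves: each frame is blended with the previous, already-filtered one, so the stronger the setting, the longer stale pixels linger as trails on anything that moves. A toy sketch of the trade-off (the function and its strength parameter are invented; real implementations are motion-compensated and much smarter):

[code]
import numpy as np

def temporal_nr(prev_output, cur_frame, strength=0.5):
    """Toy recursive (IIR) temporal noise reducer.

    strength is the fraction of the previous filtered frame carried into
    the current one. More strength averages away more noise, but on
    moving content it drags old pixels along, which shows up as the
    ghosting/trailing described above.
    """
    prev = prev_output.astype(np.float32)
    cur = cur_frame.astype(np.float32)
    out = (1.0 - strength) * cur + strength * prev
    return out.astype(cur_frame.dtype)
[/code]

Turning the setting down makes the old-frame contribution die off much faster, which is consistent with the ghosting disappearing at a lower slider position while most of the noise still gets cleaned up.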
 
The more I hear about it, the less inclined I am to apply the 'c' word to it... though I think "overly aggressive optimization" sounds about right! ;)
 
It is worth mentioning that noise is not necessarily a bad thing. It is often added to computer-generated images to make them look more realistic.
Because noise does not compress well, some codecs include the ability to remove it at encode time and re-insert it at decode time.
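
Reduced to its simplest form, that decode-side step just regenerates plausible noise and adds it back onto the cleaned-up picture. A rough sketch (the function name and the flat Gaussian grain model are mine; real film-grain tools parameterise the grain per intensity level and with spatial correlation):

[code]
import numpy as np

def add_synthetic_grain(decoded_frame, grain_strength=4.0, seed=0):
    """Add synthetic grain to a denoised, decoded frame.

    The encoder strips the grain (it compresses badly) and signals a few
    parameters instead; the decoder then synthesises similar-looking
    noise. Here that is reduced to white Gaussian noise of a given
    standard deviation added to an 8-bit greyscale frame.
    """
    rng = np.random.default_rng(seed)
    grain = rng.normal(0.0, grain_strength, decoded_frame.shape)
    noisy = decoded_frame.astype(np.float32) + grain
    return np.clip(noisy, 0, 255).astype(decoded_frame.dtype)
[/code]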

CC
 
Actually, Alan's issues in that article stem from the Cyberlink player's behaviour. They also shouldn't be present on an HD 2600 XT, but we can't tell which board spec he used as he didn't list it fully - Charlie appears to be using an XT.
 
Charlie is the only Inq writer I can even stand to read. Besides, AMD needs supporters now more than ever.

I agree AMD needs supporters right now, but blatant disinformation about NV products isn't the way to go about it.

It just reeks of his spew about Vista (ME2) :rolleyes:
 