Mainstream Video Quality Shootout

Murakami said:
I'm tired of arguing with you; our points of view are opposite: I'm not interested in forum talk (which I've already read), speculation or anything like that... I'm looking for proof, an official statement, because my brain tells me that if nVidia admits the NV40 video processor is "borked", there would be no problem admitting the same thing for the NV45.
The difference between us is that you're losing your patience and insulting me about my "blind ignorance", while I'm not.
I'll do myself a favor: I'll test this infamous WMV9 acceleration feature on my own... the only real problem is that one of the 2 WMP10 patches that enable DXVA acceleration is not removable, and I don't want to format my OS over it.

Um.. even though the FX was POS hardware for its generation, NVidia only admitted it years later, once the 6800 series had fixed the bad PS 2.0 performance problem inherent in the FX architecture.

For the 6800 series.. it is still a staple selling point for the mid-to-high range, so getting ANY sort of official response from NVidia will not happen anytime soon. No company will admit to bad products unless that product is well on its way out. This is no different. Official statements HAVE come out... but not from NVidia to the public directly, only to reviewers such as Anandtech, who are relaying the info. You make it sound like it is speculation.. when really it has come from the horse's mouth, just under someone else's voice (the reviewer, in this case).

I'll do myself a favor: I'll test this infamous WMV9 acceleration feature on my own... the only real problem is that one of the 2 WMP10 patches that enable DXVA acceleration is not removable, and I don't want to format my OS over it.

Lots of things installed on your system are not exactly "easily" removable, such as DX9 (though there is a tool for that; the same has always applied to many patches/software MS has provided)... I don't understand what is stopping you from testing it yourself, given that you seemingly are very interested in the results. You really do need to test it for yourself to know the truth.
 
For a proper test, I'd need to format my PC, install everything minus the 2 WMP10 patches (DRM and DXVA), install Forceware 81.95 and monitor CPU % (and, probably, GPU temp) during WMV9 playback, then install the patches and repeat... IMHO, this is the only way to be sure that the NV45 video processor does not accelerate WMV9 decoding, because, unlike Ati's drivers, nVidia's don't have a tool to switch WMV9 acceleration on and off... oh, and do the same thing with a 6600 GT... :LOL:
Maybe, one day... :rolleyes:
 
Murakami said:
For a proper test, I'd need to format my PC, install everything minus the 2 WMP10 patches (DRM and DXVA), install Forceware 81.95 and monitor CPU % (and, probably, GPU temp) during WMV9 playback, then install the patches and repeat... IMHO, this is the only way to be sure that the NV45 video processor does not accelerate WMV9 decoding, because, unlike Ati's drivers, nVidia's don't have a tool to switch WMV9 acceleration on and off... oh, and do the same thing with a 6600 GT...
Maybe, one day...

Actually.. you don't have to go through all the steps you've suggested.

There are three parts to enabling WMV acceleration for NVidia cards...

Getting the Purevideo enabled drivers...

WMP patches of course...

The third part is enabling the feature in the player. If I recall correctly, WMV9 acceleration MUST be enabled in WMP9 itself in order for it to work.

So, in order to test it, all you actually have to do is turn the acceleration off in WMP9 and test... and then run the test again with the acceleration on. WMV acceleration is three-sided: WMP patch... driver support... then the app. Since you control the acceleration via WMP9, you can test it this way.

Just a side note: the tipoff as to why the NVidia Purevideo support page is WRONG (or at least not totally correct) is that the NV41 (PCI-E Geforce 6800 vanilla) is reported not to have the problem.
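
To make the "three-sided" requirement above a bit more concrete, here is a minimal Python sketch; the function and its inputs are purely illustrative (there is no such API), they just mirror the three prerequisites named in this post.

Code:
# Illustrative only: models the three prerequisites named above as booleans.
def wmv9_accel_available(purevideo_driver_installed, wmp_patches_installed, player_accel_enabled):
    """Report which prerequisite for WMV9 acceleration is still missing, if any."""
    if not purevideo_driver_installed:
        return "Missing: PureVideo-enabled ForceWare driver"
    if not wmp_patches_installed:
        return "Missing: the WMP DXVA/DRM patches"
    if not player_accel_enabled:
        return "Missing: acceleration enabled in the player settings"
    return "All three prerequisites met - acceleration can engage"

print(wmv9_accel_available(True, True, False))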
 
I have WMP 10 (not removable without a system restore; and I turned off that feature before installing it... :LOL: ) and I'm not able to find any setting about WMV9 acceleration: can you enlighten me? :oops:
P.S. The 2 patches are for WMP 10, not 9... :???:
 
Deathlike2 said:
Hints:
http://www.shacknews.com/ja.zz?comments=37332

Look at post #4 from Carnivac..

Start at Step 7, assuming you have installed drivers that support Purevideo and installed whatever MS requires of you.



This is the option to enable the acceleration for whatever it is supposed to accelerate...
Ok, I'll try; in my opinion, this option switches between hardware overlay support and VMR... maybe VMR is required to engage (I love this word, ala Star Trek... :oops: ) WMV9 acceleration... :cool:
 
Tests done! :D
My system is a Venice 3000+; the sample movie is T2_720.wmv.
Well, there's something strange: without acceleration the CPU % stays between 30% and 55%, with acceleration between 30% and 45%... but the GPU temp is always 55 °C, no matter whether "high quality mode" is on or off... :???:
For comparison, during MPEG-2 decoding (with PureVideo) the GPU temp goes up to 60 °C... :???:
 
Murakami said:
Tests done! :D
My system is a Venice 3000+; the sample movie is T2_720.wmv.
Well, there's something strange: without acceleration the CPU % stays between 30% and 55%, with acceleration between 30% and 45%... but the GPU temp is always 55 °C, no matter whether "high quality mode" is on or off... :???:
For comparison, during MPEG-2 decoding (with PureVideo) the GPU temp goes up to 60 °C... :???:

So.. what do the results tell you?

On a simple basis, if the GPU temp is not going up between the mode being on or off, there is some acceleration going on, but it is of the CPU kind (SSE/SSE2/SSE3)... you would think the GPU would be doing work, yes? Obviously this is not the case. Since you ran a test in which you know the GPU should do some work (and, as a result, you'd see a higher GPU temp), you can at least see something going on.

Do you think your little experiment is "conclusive" enough? I believe so.
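
As a rough illustration of the reasoning in this post, a small Python sketch follows; the measurements are the ones Murakami reported (peak CPU %, GPU temps), while the 2 °C threshold and the function itself are arbitrary assumptions, not an established methodology.

Code:
# Hypothetical helper mirroring the reasoning above. The numbers used below are
# the peak values reported in the thread; the 2-degree threshold is an assumption.
def interpret(cpu_pct_off, cpu_pct_on, gpu_temp_off, gpu_temp_on, gpu_temp_mpeg2):
    cpu_drop = cpu_pct_off - cpu_pct_on                   # how much the CPU load fell
    gpu_warms = (gpu_temp_on - gpu_temp_off) >= 2         # does the GPU heat up with "acceleration" on?
    gpu_can_warm = (gpu_temp_mpeg2 - gpu_temp_off) >= 2   # sanity check: the MPEG-2 run does warm it
    if cpu_drop > 0 and not gpu_warms and gpu_can_warm:
        return "CPU load fell but the GPU stayed cool: looks like CPU-side (SSE) optimisation"
    if cpu_drop > 0 and gpu_warms:
        return "CPU load fell and the GPU warmed up: consistent with real GPU offload"
    return "No clear difference between the two runs"

# 55%/45% peak CPU, 55 C in both modes, 60 C during MPEG-2 decoding
print(interpret(55, 45, 55, 55, 60))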
 
You're probably right, but... incidentally, a friend of mine (with the same software setup) was not able to measure any temperature change during WMV playback.
Incidentally, this guy has a 6600 GT.
 
Murakami, he may be right, but the temperature thing doesn't mean much at all. If the WMV acceleration relies only on a very small section of the die, then it is unlikely to increase the temp much at all. A more meaningful test would be to look at your friend's results and see what % improvement he saw; if you see a similar improvement with a similar CPU, then it is likely yours is working.
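
One way to put this suggestion into numbers is to compare the relative drop in CPU load on both machines, something like the Python sketch below; the friend's figures and the 5% tolerance are made-up placeholders.

Code:
# Compare the fractional CPU-load drop on two machines, as suggested above.
def relative_drop(cpu_off, cpu_on):
    """Fractional reduction in CPU load when acceleration is switched on."""
    return (cpu_off - cpu_on) / cpu_off

def similar_improvement(mine_off, mine_on, friend_off, friend_on, tolerance=0.05):
    return abs(relative_drop(mine_off, mine_on) - relative_drop(friend_off, friend_on)) <= tolerance

# 55% -> 45% peak CPU here, versus a hypothetical 50% -> 41% on the friend's 6600 GT box
print(similar_improvement(55, 45, 50, 41))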
 
Sxotty said:
Murakami, he may be right, but the temperature thing doesn't mean much at all. If the WMV acceleration relies only on a very small section of the die, then it is unlikely to increase the temp much at all. A more meaningful test would be to look at your friend's results and see what % improvement he saw; if you see a similar improvement with a similar CPU, then it is likely yours is working.
Of course I'll try, but it would be very strange if WMV acceleration were working and the GPU temp did not rise even 1 °C, when MPEG-2 decoding raises the temp by at least 5 °C... :???:
However, it's hard to monitor CPU % through Task Manager alone; it's too subjective (given the small difference between software decoding and hardware-assisted decoding): any suggestions?
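
One way this could be made less subjective than eyeballing Task Manager is to log CPU utilisation at a fixed interval during a playback run and average it afterwards. The sketch below assumes Python with the psutil package, neither of which is mentioned in the thread.

Code:
# Sample system-wide CPU utilisation once per second for the length of a
# playback run and print the average and peak, instead of watching Task Manager.
import psutil

def sample_cpu(duration_s=60, interval_s=1.0):
    samples = []
    for _ in range(int(duration_s / interval_s)):
        # cpu_percent blocks for interval_s and returns utilisation over that window
        samples.append(psutil.cpu_percent(interval=interval_s))
    return sum(samples) / len(samples), max(samples)

avg, peak = sample_cpu(duration_s=60)
print(f"average CPU: {avg:.1f}%  peak: {peak:.1f}%")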
 
Murakami said:
You're probably right, but... incidentally, a friend of mine (with the same software setup) was not able to measure any temperature change during WMV playback.
Incidentally, this guy has a 6600 GT.

I would suggest asking him to run the simple test I was discussing earlier and checking the CPU utilization... though the 6600 series (all variants) has no WMV9 decode acceleration issues.

Uh uh, I found this: WMV9 decoding, G70 next to the 6800 Ultra PCI-E... almost a draw!

Well, according to Dave's take on the results, he believes one of two things is possibly at work:

1) little or no improvement with the Purevideo decoding
2) unoptimized drivers

I believe it is probably closer to the latter (since there haven't been any reports of the 7800 series having the same hardware flaw)... It would be a very stupid thing indeed if the same flaw had not been addressed in the G70...
 
Deathlike2 said:
I would suggest asking him to run the simple test I was discussing earlier and checking the CPU utilization... though the 6600 series (all variants) has no WMV9 decode acceleration issues.



Well, according to Dave's take on the results, he believes one of two things is possibly at work:

1) little or no improvement with the Purevideo decoding
2) unoptimized drivers

I believe it is probably closer to the latter (since there haven't been any reports of the 7800 series having the same hardware flaw)... It would be a very stupid thing indeed if the same flaw had not been addressed in the G70...
You're kidding, right? First, you suggest I look at GPU temp variation, assuming that the CPU% difference I measured is due to software optimisation in high quality mode (MMX, SSE and so on)... second, you suggest that my friend look at the CPU% difference and not the GPU temp variation... :LOL: Please, a little consistency! :rolleyes:
Besides, the 7800 GTX preview I linked illustrates one simple thing: the 6800 Ultra PCI-E has no hardware flaw (a flaw this review does not mention; I suppose that, had it been present, the reviewer would have chosen a 6600 GT for the comparison).
 
You're kidding, right? First, you suggest I look at GPU temp variation, assuming that the CPU% difference I measured is due to software optimisation in high quality mode (MMX, SSE and so on)... second, you suggest that my friend look at the CPU% difference and not the GPU temp variation... Please, a little consistency!

Did I say ANYTHING about testing the GPU temp variation? I only made an analysis based on what I understood from the results (which includes what you said about the GPU temps). I didn't tell you to test your GPU temps, just the CPU utilization.

I hope you at the very least reread what I said.
 
Deathlike2 said:
Did I say ANYTHING about testing the GPU temp variation? I only made an analysis based on what I understood from the results (which includes what you said about the GPU temps). I didn't tell you to test your GPU temps, just the CPU utilization.

I hope you at the very least reread what I said.
"On the simple basis, if the GPU temp is not going up.. between the mode on or off.. there is some acceleration going on.. but it is of the CPU kind (SSE/SSE2/SSE3)... you would think the GPU do work yes? Obviously this is not the case. Since you did a test in which know the GPU should do some work (as a result, seeing higher GPU temp)... you can at least see something going on"
No comment; you're good at playing with words, but if I have to look only at CPU utilization and not GPU temp, then I have to conclude, based on my tests, that my card does assist WMV9 decoding in hardware.
 
Murakami said:
You're probably right, but... incidentally, a friend of mine (with the same software setup) was not able to measure any temperature change during WMV playback.
Incidentally, this guy has a 6600 GT.
Your friend might not be getting hardware accel.
Look at the clockspeed graph in RivaTuner.. it should clock up to 3D speeds :D
 
radeonic2 said:
Your friend might not be getting hardware accel.
Look at the clockspeed graph in RivaTuner.. it should clock up to 3D speeds :D
Excellent: so not even a 6600 GT does WMV9 hardware decoding... :LOL:
Besides, are you sure the clock rises during 3D and/or video decoding?
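
On the clock question, one way to check is simply to poll the GPU core clock while the video plays and see whether it ramps up. The period tool for this was RivaTuner's clock graph; the sketch below instead assumes a modern system with nvidia-smi on the PATH, which is purely an assumption for illustration.

Code:
# Poll the GPU core clock once per second during playback to see if it ramps up.
# Assumes nvidia-smi is available; RivaTuner's graph shows the same information.
import subprocess, time

def read_core_clock_mhz():
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=clocks.gr", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    return int(out.stdout.strip().splitlines()[0])

for _ in range(10):
    print(read_core_clock_mhz(), "MHz")
    time.sleep(1)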
 