NVidia decodes H.264 in Hardware

chavvdarrr said:
Why are we discussing ATI and NVidia? AFAIK both solutions will be available only for an additional sum, no?

"Only"? Probably not. Certainly early adopters for ATI have to shell out if they want it. It isn't clear to me the requirements on the NV side. Will the Forceware update enable this for previous owners of PureVideo, whether they bought it or received it bundled with their card?
 
geo said:
"Only"? Probably not. Certainly early adopters for ATI have to shell out if they want it. It isn't clear to me the requirements on the NV side. Will the Forceware update enable this for previous owners of PureVideo, whether they bought it or received it bundled with their card?

Just the capability to accelerate H.264 decode is free. You still need to purchase a decoder that is compatible with it, like the referenced InterVideo, CyberLink, and Nero packages. One can only hope that NVidia someday exposes its VPP API so open-source developers can use it to build truly free solutions.

And as for where this technology is useful, think notebooks. Lowered CPU usage = increased battery life.
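To put a rough number on that, here is a back-of-envelope sketch; the battery capacity and all the wattages below are assumptions for illustration, not measurements:

Code:
# Toy estimate of notebook battery life during H.264 playback.
# All figures are illustrative assumptions, not measurements.

BATTERY_WH = 55.0          # assumed battery capacity in watt-hours
BASE_PLATFORM_W = 12.0     # assumed draw of screen + chipset + disk

CPU_DECODE_EXTRA_W = 15.0  # assumed extra draw with the CPU pegged decoding
GPU_ASSIST_EXTRA_W = 7.0   # assumed extra draw when the GPU takes most of the work

def playback_hours(extra_watts):
    """Battery life in hours at a given extra power draw."""
    return BATTERY_WH / (BASE_PLATFORM_W + extra_watts)

print(f"CPU-only decode : {playback_hours(CPU_DECODE_EXTRA_W):.1f} h")
print(f"GPU-assisted    : {playback_hours(GPU_ASSIST_EXTRA_W):.1f} h")

Under those assumed numbers the GPU-assisted case gains close to an hour of playback; the conclusion flips if the GPU's extra draw turns out to be higher than the CPU's, which is exactly the caveat raised a few posts down.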
 
geo said:
No, I wasn't. I was under the impression that even the fastest CPUs were supposed to struggle with this thing, and by struggle I mean peg the CPU and still drop frames. Even my previous two-year-old P4 HT was doing it with no dropped frames. Edit: I'm thinking this is incorrect now -- what I was doing on that P4 was 1080p, but not H.264. They were .wmv files.

Hopefully ATI will get the X1600 up to snuff with driver tweaks. I believe some informal testing has shown it capable of 1080p -- but they haven't certified it.
Well, if my CPU alone can decode 720p WMV-HD, what's so much harder about decoding H.264?
 
radeonic2 said:
Well, if my CPU alone can decode 720p WMV-HD, what's so much harder about decoding H.264?

It is...

My 2600 MHz A64 (a 3000+ OCed) decodes 1080p WMV-HD no problem at about 40% CPU...
1080p H.264 is more like 20 fps... just 4 fps short of perfect...

Edit: with the player/codec max-pain posted, it's fluid :) I can officially watch 1080p H.264 movies on my PC with good quality :) at only 55% CPU.

Edit 2: well, I guess it's the codec's fault... my 2.0 GHz Dothan laptop also runs it perfectly now.

I guess we don't need a powerful CPU after all, just a good player with good codecs.
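For what it's worth, the gap described above is small in relative terms. A quick sanity check (treating the ~20 fps and 55% CPU figures as ballpark numbers, and assuming a 24 fps clip):

Code:
# Rough sanity check on the numbers above (ballpark figures, not benchmarks).
target_fps   = 24.0   # assumed frame rate of the 1080p clip
achieved_fps = 20.0   # roughly what the overclocked A64 managed at first
cpu_share    = 0.55   # CPU load reported with the better player/codec

speedup_needed = target_fps / achieved_fps
print(f"Decoder needed to be ~{(speedup_needed - 1) * 100:.0f}% faster to hit real time")

headroom = (1.0 - cpu_share) / cpu_share
print(f"At {cpu_share:.0%} CPU there is still ~{headroom:.0%} headroom before the CPU is pegged")

So a roughly 20% more efficient decoder closes the gap, which fits the experience that swapping the player/codec (rather than the hardware) fixed playback.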
 
dskneo said:
It is...

My 2600 MHz A64 (a 3000+ OCed) decodes 1080p WMV-HD no problem at about 40% CPU...
1080p H.264 is more like 20 fps... just 4 fps short of perfect...

Edit: with the player/codec max-pain posted, it's fluid :) I can officially watch 1080p H.264 movies on my PC with good quality :) at only 55% CPU.

Edit 2: well, I guess it's the codec's fault... my 2.0 GHz Dothan laptop also runs it perfectly now.

I guess we don't need a powerful CPU after all, just a good player with good codecs.
Yeah, I'm wondering what's so much harder about the codec that Mac ******s like to spout off about, since looking at WMV-HD and H.264 trailers, both appear nearly perfect block-wise.
 
And yet the screenshots from the article upstream show an Intel 830 firewalled at 100%. It's a bit curious, like a fact or two is missing in putting this puzzle together.
 
The H.264 codec used in QuickTime for Windows is not very well optimized; there are others that are much better.
 
mrcorbo said:
And as for where this technology is useful, think notebooks. Lowered CPU usage = increased battery life.

Assuming, of course, that the CPU uses more power than the GPU! ;)
 
After reading what was posted on bit-tech here
Bit-Tech.com said:
The GeForce video engine: GeForce 6 and 7 GPUs all have a video engine built into them. The engine, powered by the PureVideo software, will get an upgrade in ForceWare 85 that will enable the H.264 decoding. Most GPUs from the 6600GT and upwards will be able to handle full bitrate, 1080p content, according to NVIDIA.

The quality of the acceleration depends directly on the speed of the GPU. At 300MHz, the acceleration is going to be minimal. As you scale on up towards 500MHz GPU clock speed, acceleration gets to the point where less than 50% CPU is being used to deliver full speed playback.
This raises a question for me: if the acceleration depends directly on the GPU's speed, could it be that the optimization isn't fully decoding each frame, the way NV used to optimize 3D game speed? I mean, some detail might be left out for the sake of smooth video FPS.
 
satein said:
After reading what was posted on bit-tech here

This raises a question for me: if the acceleration depends directly on the GPU's speed, could it be that the optimization isn't fully decoding each frame, the way NV used to optimize 3D game speed? I mean, some detail might be left out for the sake of smooth video FPS.

That's funny. You know the same applies to ATi, right? The acceleration depends directly on the class of GPU. But now the witch hunt will start for Nvidia "cheating" in video acceleration "benchmarks". :LOL:
 
trinibwoy said:
That's funny. You know the same applies to ATi, right? The acceleration depends directly on the class of GPU. But now the witch hunt will start for Nvidia "cheating" in video acceleration "benchmarks". :LOL:
Yes, I agree with some of that, but if we look at what was recommended for each ATi card... it's a different decode resolution for each series; ATi never claimed that every R5xx part will play 1080p. That would be a big difference compared to NV, if NV is promising to deliver 1080p H.264 decoding on the 6600GT or even lower. And what kind of optimization will be used, since we know nothing about it yet? I didn't mean to say NV is "cheating" in video acceleration benchmarks, but I do ask what NV might be doing in its video acceleration, since people nowadays seem to accept that what used to be called "cheating" is now "a good optimization" :p .
 
satein said:
That would be a big difference compared to NV, if NV is promising to deliver 1080p H.264 decoding on the 6600GT or even lower.

I'm not entirely sure how PureVideo works, but I don't think its performance is as closely tied to pixel shader performance as Avivo's is.
 
trinibwoy said:
I'm not entirely sure how PureVideo works, but I don't think its performance is as closely tied to pixel shader performance as Avivo's is.
Maybe that's one of the reasons ATI is going for lots of shader power in R580? It might be especially useful when ATI moves its transcoding from software onto the hardware.

If Nvidia has dedicated transistors for PureVideo, they won't need that pixel power for video, whereas ATI's approach can use the pixel shaders for Avivo without needing so many dedicated transistors.
 
Yeah, but some stages of the H.264 pipeline don't map well onto pixel shaders. That's presumably why NVidia's scalar VP exists.
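A rough illustration of the point (purely a sketch of the two workload shapes, not PureVideo or VP code; the function names are made up): something like motion compensation is the same independent calculation for every pixel, which is what pixel shaders are good at, whereas CABAC-style entropy decoding is a serial loop where each step depends on state left behind by the previous one.

Code:
# Sketch of why some H.264 stages map to pixel shaders and others don't.
# Toy code only -- not how PureVideo or the scalar VP actually works.

# 1) Pixel-parallel work (shader-friendly): every output pixel is an
#    independent calculation, e.g. a motion-compensated copy of a block.
def motion_compensate(ref_frame, mv_x, mv_y, width, height):
    out = [[0] * width for _ in range(height)]
    for y in range(height):            # each iteration is independent,
        for x in range(width):         # so this parallelises across pixels
            out[y][x] = ref_frame[(y + mv_y) % height][(x + mv_x) % width]
    return out

# 2) Serial work (shader-hostile): a toy bit decoder where decoding bit N
#    needs the state produced while decoding bit N-1, standing in for CABAC.
def entropy_decode(bits):
    state, symbols = 0, []
    for b in bits:                     # this loop cannot be split up
        state = (state * 2 + b) % 97   # toy state update
        symbols.append(state & 1)
    return symbols

frame = [[(x + y) % 256 for x in range(8)] for y in range(8)]
print(motion_compensate(frame, mv_x=1, mv_y=0, width=8, height=8)[0])
print(entropy_decode([1, 0, 1, 1, 0, 0, 1]))

The first shape can be spread across all the pixel pipes; the second wants one fast serial unit, which is at least consistent with the "scalar VP" description.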
 
Bouncing Zabaglione Bros. said:
If Nvidia has dedicated transistors for PureVideo, they won't need that pixel power for video, whereas ATI's approach can use the pixel shaders for Avivo without needing so many dedicated transistors.
But if Nvidia has dedicated transistors for PureVideo, the core would be bigger and should cost more across the whole series. Also, I'm not really sure that hardware acceleration designed for MPEG-2 is suitable for H.264 decoding. If it were, most hardware that can play DVDs smoothly should have no problem playing 1080p H.264.
 
satein said:
Also, I'm not really sure that hardware acceleration designed for MPEG-2 is suitable for H.264 decoding. If it were, most hardware that can play DVDs smoothly should have no problem playing 1080p H.264.

Perhaps you could put this down to the programmable nature of PV vs fixed function?
 
geo said:
Perhaps you could put this down to the programmable nature of PV vs fixed function?
Do you mean that it functions like an FPGA? I can't say I completely understand the whole programmable-chip thing, but I know a little about it :smile:. AFAIK, a friend of mine who works on FPGA programming told me he can emulate a whole CPU on an FPGA, as long as we don't care how slow it is. That's why I'm thinking it might not be suitable.
 
satein said:
If it were, most hardware that can play DVDs smoothly should have no problem playing 1080p H.264.
Dude, do you have any idea how much more processing- and bandwidth-intensive 1080p H.264 is than a normal DVD?
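For scale, a quick bit of arithmetic (assuming NTSC DVD at 720x480, 30 fps, and 1080p film content at 24 fps):

Code:
# How much more raw picture data 1080p pushes compared to an NTSC DVD.
dvd_pixels_per_sec = 720 * 480 * 30      # ~10.4 million pixels/s
hd_pixels_per_sec  = 1920 * 1080 * 24    # ~49.8 million pixels/s

print(f"1080p is ~{hd_pixels_per_sec / dvd_pixels_per_sec:.1f}x the pixels per second")
# ...and that is before H.264's CABAC, quarter-pel motion compensation and
# in-loop deblocking, which all cost more per pixel than MPEG-2's tools.

So it is roughly 5x the raw pixel rate, multiplied again by a codec that does considerably more work per pixel.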
 