NVidia decodes H.264 in Hardware

Chalnoth said:
Dude, do you have any idea how much more processing and bandwidth intensive 1080p H.264 is than normal DVD?
I realized that, thank you; that was why I asked. I believe the 6600 hardware might not be up to running 1080p :smile: (though, thinking about it, it can even run some games at 1600x1200 smoothly). And if Nvidia claims that it can, it must be some kind of unknown optimization, which I believe is what trinibwoy was getting at in his reply to me :p .
trinibwoy said:
That's funny, you know the same applies to ATi, right, that the acceleration depends directly on the class of GPU. But now the witch hunt will start for Nvidia "cheating" in video acceleration "benchmarks".
Also, ChrisRay posted an explanation on Rage3D saying that Nvidia PureVideo uses the pixel shaders to do de-interlacing, which is one of the more expensive effects used in decoding [ChrisRay, Rage3d]. I am really not sure that the 6600 and 6200 will be up to the task of 1080p decoding.
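
For a rough sense of the gap Chalnoth is pointing at, here is a back-of-the-envelope comparison; the bitrates below are assumed typical figures for illustration, not measurements:

[CODE]
# Rough comparison of 1080p H.264 vs. standard-definition DVD MPEG-2.
# The bitrates are assumed "typical" values, for illustration only.

dvd_pixels_per_sec = 720 * 480 * 30        # NTSC DVD, ~30 fps
hd_pixels_per_sec  = 1920 * 1080 * 24      # 1080p film content, 24 fps

dvd_bitrate_mbps     = 6.0                 # assumed typical MPEG-2 DVD bitrate
hd_h264_bitrate_mbps = 18.0                # assumed typical 1080p H.264 bitrate

print(f"Pixel throughput ratio: {hd_pixels_per_sec / dvd_pixels_per_sec:.1f}x")
print(f"Bitstream ratio:        {hd_h264_bitrate_mbps / dvd_bitrate_mbps:.1f}x")

# On top of the raw ratios, H.264 adds CABAC entropy coding, quarter-pel
# motion compensation and an in-loop deblocking filter, all of which cost
# considerably more per pixel than MPEG-2's tools.
[/CODE]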
 
satein said:
I realized that, thank you; that was why I asked. I believe the 6600 hardware might not be up to running 1080p :smile: (though, thinking about it, it can even run some games at 1600x1200 smoothly). And if Nvidia claims that it can, it must be some kind of unknown optimization, which I believe is what trinibwoy was getting at in his reply to me :p .

Also, ChrisRay posted an explanation on Rage3D saying that Nvidia PureVideo uses the pixel shaders to do de-interlacing, which is one of the more expensive effects used in decoding [ChrisRay, Rage3d]. I am really not sure that the 6600 and 6200 will be up to the task of 1080p decoding.

I think you are the victim of some sloppy terminology in your referenced post. PureVideo absolutely does NOT run on the pixel shaders and instead runs on dedicated silicon NVidia calls the "Video Processor". See Dave's excellent (as usual) explanation here.



A quote:

'Wavey' Dave Baumann said:
While ATI have been mapping some of their video processing over the Shader Core for some time, NVIDIA have decided not to do this, as they feel the instructions required for video processing do not lend themselves well to the instruction set in the pixel shader pipeline; thus a dedicated unit may be more optimal for this type of work. When running video processing through the shaders, the 3D core is active and consuming power as well, which may not be desirable in all situations, especially where mobile devices are concerned - the NV4x VP is a smaller unit dedicated to video processing, so it should require less power for video processing than utilising the shader core. This is not to say, however, that NVIDIA won't utilise the shader core in conjunction with the VP in some instances, should they choose to do so.

As with most processors, the performance, and hence capabilities, of a unit such as this will be heavily dependent on clock speed. The NV4x line will span numerous parts and numerous clock speeds, yet the VP engine's clock can operate at a different frequency from the rest of the chip, which may mean that the VP in the slower NV4x models could have the same performance as the high-performance NV4x parts.
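
To illustrate that last point about the VP running off its own clock, here is a toy model; the cycles-per-macroblock cost and the 300 MHz VP clock are purely assumed placeholder numbers, not NVIDIA figures:

[CODE]
import math

# Toy model of the quoted point: VP throughput tracks the VP clock, not the
# 3D core clock. CYCLES_PER_MACROBLOCK and the clock values are assumed
# placeholders, not real NVIDIA specifications.
CYCLES_PER_MACROBLOCK = 800

def macroblocks_per_sec(width, height, fps):
    return math.ceil(width / 16) * math.ceil(height / 16) * fps

def vp_headroom(vp_clock_hz, width, height, fps):
    available = vp_clock_hz / CYCLES_PER_MACROBLOCK
    return available / macroblocks_per_sec(width, height, fps)

# A hypothetical budget part and a high-end part with different core clocks
# but the same VP clock end up with identical 1080p24 decode headroom:
for part, core_clock, vp_clock in [("budget NV4x", 350e6, 300e6),
                                   ("high-end NV4x", 425e6, 300e6)]:
    print(f"{part}: core {core_clock / 1e6:.0f} MHz, "
          f"1080p24 headroom {vp_headroom(vp_clock, 1920, 1080, 24):.2f}x")
[/CODE]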
 
Well, it seems like there is yet another advantage besides SM3 for 6x00 users compared to X8x0 users.

If ATI utilizes the shader core to do decoding, I wonder why they can't support the X8x0 series, beyond the obvious marketing reason. Maybe it is not worth the effort to support an outdated core, which would directly compete with the new X1600 series.
 
I think you will probably find that both do utilise pixel shaders for some elements of the video processing to some degree or another. However, some blocks reside in fixed-function hardware on ATI's chips (including elements of H.264 processing), while NVIDIA has the programmable PureVideo processor to perform similar functions - it is a giveaway that some processing blocks for H.264 are using shaders with NVIDIA, by virtue of the fact that full resolution is only accelerated down to the 6600.
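
A rough way to see why resolution is the giveaway: if some H.264 stages run on the pixel shaders, the per-second pixel workload those stages must keep up with grows steeply with resolution, and a 6200-class part has far less shader throughput to spare than a 6600. The resolutions and frame rates below are standard; the comparison itself is only illustrative:

[CODE]
# Per-second pixel workload for common stream formats, relative to DVD.
# The point is how quickly the load on any shader-assisted H.264 stage grows.

streams = {
    "480p (DVD-class)": (720, 480, 30),
    "720p":             (1280, 720, 30),
    "1080p":            (1920, 1080, 24),
}

base = 720 * 480 * 30
for name, (w, h, fps) in streams.items():
    load = w * h * fps
    print(f"{name:18s} {load / 1e6:6.1f} Mpix/s  ({load / base:.1f}x DVD)")
[/CODE]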
 
Are we assuming that 6800GT/Ultra AGP's problems re HD decoding are going to apply to h.264 as well? (No, MuFu, you can't have a refund! :p )
 
geo said:
Are we assuming that 6800GT/Ultra AGP's problems re HD decoding are going to apply to h.264 as well? (No, MuFu, you can't have a refund! :p )
Yes. There was a hardware bug in the decoder unit.
 
Dave Baumann said:
I think you will probably find that both do utilise pixel shaders for some elements of the video processing to some degree or another. However, some blocks reside in fixed-function hardware on ATI's chips (including elements of H.264 processing), while NVIDIA has the programmable PureVideo processor to perform similar functions - it is a giveaway that some processing blocks for H.264 are using shaders with NVIDIA, by virtue of the fact that full resolution is only accelerated down to the 6600.

That last bit seems to be contradicted by the 6150 integrated chipset, which has the broader feature set while only featuring 2 pixel pipelines versus the 4-pipe 6200 TC 128MB part. Doesn't the 6200 have better pixel shader performance than the 6150? And if PureVideo performance is dependent on PS performance, shouldn't the 6200 then have the better PureVideo feature set?
 
Okay, now a lame question: do these future ForceWare 85 drivers accelerate XviD/DivX videos? All I have seen are:

"The new version of PureVideo will also hardware accelerate VC1, the codec powering Microsoft's WMV HD"

"In order to take advantage of the H.264 decode acceleration you will need two things: 1) Compliant InterVideo WinDVD, CyberLink PowerDVD or Nero software, and 2) a NVIDIA driver enabling the support."
 
Dave Baumann said:
I think you will probably find that both do utilise pixel shaders for some elements of the video processing to some degree or another. However, some blocks reside in fixed-function hardware on ATI's chips (including elements of H.264 processing), while NVIDIA has the programmable PureVideo processor to perform similar functions - it is a giveaway that some processing blocks for H.264 are using shaders with NVIDIA, by virtue of the fact that full resolution is only accelerated down to the 6600.
Thank you for the clear explanation. If that is true for NV PureVideo, I would say it's great and impressive for NV to be able to design the 6600 to do H.264 1080p decoding better than my Dothan 2.1GHz with Mobility 9600 (64MB) :cool: . My laptop can play HD WMV 1080p at almost full CPU utilization, but it's a bit jerky playing QT HD MOV 1080p.
 