ATI GPU transcoding app?

Makes me wonder how AMD could get any third-party developer interested in using Stream when what they have to show is so terrible.

Aside from that, their timing isn't the best.
In 6-12 months we should have OpenCL support from both vendors... I think if developers haven't started using Stream yet, now is not the best time to start.
They could either go with the more mature and better-supported CUDA and move to OpenCL from there, getting ATI support for free in the process... or just wait for OpenCL to emerge before beginning a new GPGPU project.
 
Hasn't Stream been available (without the fancy name, perhaps?) on professional boards forever, though?
 
I've been having trouble with this 64-bit issue as well. It wasn't mentioned in the early-preview readme, but was then added to the final readme. According to my mail correspondence with AMD, not even all of their own people were aware of the 64-bit issue.

After the initial report on the AVC (Avivo Video Converter), I've tried some encodes under XP (32-bit this time, of course) and was getting strange results again.

Under XP, the GPU was used (albeit not very much), got warmer, and the transcoding speed scaled with core clock on a given card (4670, 4850). But surprisingly, the same settings proved faster on an unsupported card like the 3870 or even the X850, with the app obviously running on the CPU only (a dual core, mind you!). At first I thought this was down to my ub0r-anti-leet mobo, with its PCIe slot connected via only 4 lanes from the southbridge (ASRock 4Core Dual-SATA2), and thus more or less my own fault, but the same behaviour proved repeatable on a more modern setup with an X48 board. Plus, the ALU count didn't seem to matter as much as the ALU frequency, because a 4670 fixed at 750 MHz was faster than the 4850 fixed at 625 MHz.

Maybe the AVC uses only one SIMD (or no SIMDs at all, but other portions of the GPU?) at the moment - dunno. The same seems to be true for Badaboom, since it fails to scale even according to GFLOPS, let alone the architectural differences between G96/G92 and GT200. Right now I'd consider AVC more a proof of concept than even a tech demo (which is what I'd call Badaboom), sadly.
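
To put some rough numbers on that, here's a quick back-of-the-envelope Python sketch comparing theoretical peak shader throughput for the cards in question - the ALU counts, clocks and flops-per-clock figures are from memory, so treat them as approximations:

# Theoretical peak single-precision GFLOPS (specs from memory, approximate).
# ATI RV7xx ALUs do 1 MAD (2 flops) per clock at core speed; NVIDIA G9x/GT200
# SPs do MAD+MUL (3 flops) per clock at shader speed.
cards = {
    "HD 4670 @ 750 MHz": (320, 750, 2),
    "HD 4850 @ 625 MHz": (800, 625, 2),
    "9500 GT (G96)":     (32, 1400, 3),
    "GTX 280 (GT200)":   (240, 1296, 3),
}
for name, (alus, clock_mhz, flops_per_clock) in cards.items():
    gflops = alus * clock_mhz * flops_per_clock / 1000.0
    print(f"{name:18} ~{gflops:6.0f} GFLOPS")
# If AVC scaled with ALU count, the 4850 should beat the 4670 by ~2x
# despite its lower clock; scaling with core clock alone points to a
# small, fixed slice of the chip doing the work.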

Only Vista 32, Carsten. XP support is also up in the air, AFAIK. And they're only accelerating motion estimation, unless I'm remembering things wrong.
 
Thanks, but don't you think AMD should've communicated this (only one supported OS) as well? Especially after they became aware that 64-bit was somehow borked?
 

You are quite right. The provided info wasn't meant as an apology for AMD (they should handle that themselves); its goal was to provide you with info that may help in your testing :)
 
Thanks - hopefully I'll have time to re-test when I get back from vacation. ;)
 
CyberLink to Showcase Leading Technologies that Enrich the Connected Media Lifestyle at CES 2009

Unsurpassed hi-def video editing performance: CyberLink's award-winning video editing software PowerDirector 7 is now optimized for the latest in GPU and CPU technologies, elevating the performance of HD video production. PowerDirector 7 optimized for NVIDIA® CUDA and ATI® Stream technologies ensures accelerated performance by utilizing the power of the GPU to increase video rendering and transcoding speed, while the optimization for the Intel® Core i7 processor enhances overall HD video editing performance throughout the production process, from previewing and rendering to burning to DVDs or Blu-ray Discs.

When? :D
 
There is a leaked patch for it, apparently, that I can't find anywhere. But it's supposed to have the stuff in.
 
I read that the x264 folks were trying to use CUDA, but that it was a complete pain to do so and they gave up. x264 or DivX 7 would be the most interesting, most flexible choices, I think.
 

The "CUDA" update for PowerDirector was released today, so it looks like PD7 supports video encoding on GPU.
http://www.cyberlink.com/eng/press_room/view_1994.html

CyberLink PowerDirector 7 designed with NVIDIA CUDA Encoder technology achieves up to 274% performance gain when transcoding H.264 videos. By leveraging the multi-core parallel power of the GPU, PowerDirector 7 provides consumers with an accelerated video editing experience resulting in faster rendering of H.264 HD videos to AVCHD, M2T format and for viewing on iPod® and PSP®. PowerDirector 7 supports CUDA-enabled NVIDIA GeForce® processors with NVIDIA graphics drivers version 181.20 or higher.
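
As an aside, "274% performance gain" is ambiguous marketing-speak. Here's a tiny Python illustration of the two possible readings, using a made-up 10-minute CPU baseline (the press release gives no absolute times):

# Two readings of "274% performance gain" (baseline is hypothetical).
cpu_time = 600.0                    # assumed 10-minute CPU-only transcode
faster_by = cpu_time / (1 + 2.74)   # "274% faster" -> 3.74x speedup
ratio_of = cpu_time / 2.74          # "274% of original speed" -> 2.74x speedup
print(f"3.74x reading: {faster_by:.0f} s, 2.74x reading: {ratio_of:.0f} s")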
 
I read that the x264 folks were trying to use CUDA, but that it was a complete pain to do so and they gave up. x264 or DivX 7 would be the most interesting, most flexible choices, I think.

I believe it. You might be interested in this use of CUDA: http://forum.doom9.org/showthread.php?t=141104

DGAVCDecNV 1.0.9: GPU decoding on Nvidia
ATTENTION: If you do not use an Nvidia graphics card 8xxx or higher, this thread is not for you!

http://neuron2.net/dgavcdecnv/dgavcdecnv.html

If you want to assess the GPU performance, please read the section "Disable Display" in the user's manual. And set the playback speed to Maximum under Options/Playback Speed! I am still optimizing frame rates and CPU utilization. Be aware that when serving through AVCSource() the performance will be comparable to that obtained with the display disabled (Disable Display option enabled).

I'm also interested in finding out what streams fail with GPU decoding, so please report them, preferably with stream details and a sample.

His other programs are used by many.
 
The "CUDA" update for PowerDirector was released today, so it looks like PD7 supports video encoding on GPU.
http://www.cyberlink.com/eng/press_room/view_1994.html

Thanks for that, it's news to me.

It looks like they're unveiling CUDA-powered encoding, but they seem to go out of their way to be mealy-mouthed about it.

http://www.cyberlink.com/multi/products/item_4_5_2.html

I suspect this might just be CUDA for post-processing effects. Yes, if you're using post-processing then CUDA will speed up the encoding. I might be wrong, but I'm suspicious because that would be similar to what TMPGEnc announced not too long ago. I'd quote from the CyberLink page, but it's extensive and has to be read as a whole, IMO.

From the TMPGEnc page:

NVIDIA CUDA technology is now supported for processing the video filters and decoding. The multiple cores of the GPU can divide the workload and run the processes in parallel for a huge boost in processing speed over your computer's CPU*. This lets you apply multiple filters such as video noise removal (time), smart sharpen, color correction, and more without having to add hours to your output time!

*Speed improvements are dependent on your hardware specifications.
The GeForce8800 GTX/GTS video card using the G80 core is not currently supported. Thank you for your understanding.
Decoding benefits apply to MPEG-1/2 video only and may produce unexpected video output quality compared to regular CPU decoding.

http://tmpgenc.pegasys-inc.com/en/product/te4xp.html#tabs

Edited for grammar and some content.
 
Perhaps something about CUDA 1.0 [G80] vs CUDA 1.1 [G92+] functionality?
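
For reference, that split maps to CUDA compute capability. A small lookup sketch in Python (figures from memory - check NVIDIA's programming guide for the authoritative table):

# CUDA compute capability per GPU family (from memory, may be incomplete).
compute_capability = {
    "G80 (8800 GTX/GTS)":  (1, 0),  # no global-memory atomics
    "G92 (8800 GT/9800)":  (1, 1),  # adds 32-bit global atomics
    "GT200 (GTX 260/280)": (1, 3),  # adds double precision, more registers
}
for chip, (major, minor) in compute_capability.items():
    print(f"{chip:21} compute capability {major}.{minor}")
# TMPGEnc excluding G80-based cards would be consistent with the code
# relying on a >=1.1 feature such as atomic operations.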
 
Thanks for that, it's news to me.

It looks like they're unveiling CUDA-powered encoding, but they seem to go out of their way to be mealy-mouthed about it.

http://www.cyberlink.com/multi/products/item_4_5_2.html
...

Edited for grammar and some content.

CruNcher/Doom9 Forum said:
http://s11b.directupload.net/images/090110/8pz3j4we.png <- Software (CyberLink's software encoder)
http://s10.directupload.net/images/090110/dzeabnxp.png <- GPU? (Nvidia PureVideo encoder)
http://s10b.directupload.net/images/090110/ywb3gaev.png <- GPU? (Nvidia PureVideo encoder)

I'm not really sure it's actually running on the GPU with that high CPU utilization - it's definitely much higher (70%) than what a Badaboom encode would consume.

The final output is High Profile @ Level 4.0 (Badaboom can't do this yet).

Visually, Nvidia's encoder produced a much nicer result in less time here than CyberLink's encoder :)

Times (+ audio):
Cyberlink: 5:39
Nvidia: 2:02

I'm using Nvidia's pre-alpha 185 ForceWare - it might be that this caused a software fallback, or maybe an encoding option did; still have to figure that out :)

It's really encoding on the GPU.
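
For what it's worth, the times quoted above work out to roughly a 2.8x speedup - a quick Python check:

# Speedup from the quoted encode times (mm:ss strings).
def to_seconds(t):
    minutes, secs = t.split(":")
    return int(minutes) * 60 + int(secs)

cyberlink = to_seconds("5:39")  # CyberLink software encoder
nvidia = to_seconds("2:02")     # Nvidia "PureVideo"/CUDA path
print(f"{cyberlink / nvidia:.2f}x faster")  # ~2.78x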
 