Cheng on Avivo

Funny, I see DRM as a DISABLER. The only thing it enables is ATI making money by selling cards that can play back DRM'd formats. On the other hand, it sabotages playback by preventing you from hooking up a non-HDCP output device and getting full res. Moreover, when bit errors occur on HDMI, HDCP fucks up many displays. For example, on my Samsung DLP the screen goes green, and I have to switch video sources or power it on/off to get it to resync HDCP.


How about this question: can third-party and open-source video players like MPlayer take advantage of H.264 acceleration in the driver *WITHOUT* using DRM? In other words, I have my own private DirectShow filter (no DRM) and my own stream format, and I just want to use the codec?
 
Obviously ATI are supporting DRM because they don't want their cards to be the point of failure if DRM gets embedded in the OS/hardware.

While the concept of DRM can provide benefits, there is no doubt in my mind that the rapacious media cartels will instead attempt to use it to restrict user choice and deny consumer rights. I suspect they may well "innovate" their industry right down the toilet, as people will give up on DRMed media when it becomes too much hassle to use.
 
Last edited by a moderator:
_xxx_ said:
That's how I got it. Anything else would not make much sense IMHO.

I would hope you are correct, but I'd like to actually hear it from a credible authority on these kinds of things. :LOL:

And can we really be sure, unless we hear it, that the various filters these programs offer for manipulating photos take full advantage of the 10-bitness available as they work their mojo? I mean, mightn't some of them have intermediate buffers and intermediate storage that assume 8-bit is the best they'll have available?

At any rate, I hope you are correct --and I want to hear it from an ATI that has actually investigated the matter with the techies at Photoshop, et al.

Edit: Come to think of it, the dim reaches of my memory seem to suggest that the Photoshops et al of the world generally avoid relying on the viddy card for anything, and let the OS worry about whether there is something on the other end to actually show the darn thing. :p Which would tend to suggest they can have all the nice big intermediate storage they want --and hopefully do. Which does make one wonder dreamily what they could do performance-wise if they tried with 512MB of high-speed memory. . . tho I would guess the IHVs would have to provide those kinds of patches thru devrel efforts.
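A minimal sketch of the worry above, assuming a hypothetical app that quantizes through an 8-bit intermediate buffer: two tones that are distinct at 10 bits collapse to the same value on the way through. (The functions here are purely illustrative, not any real app's pipeline.)

```python
# Hypothetical round-trip through an 8-bit intermediate buffer:
# the low two bits of a 10-bit value are simply lost.

def to_8bit(v10):
    """Quantize a 10-bit value (0-1023) down to 8 bits (0-255)."""
    return v10 >> 2

def to_10bit(v8):
    """Expand an 8-bit value back to the 10-bit range (0-1020)."""
    return v8 << 2

# Two distinct 10-bit tones that an 8-bit intermediate collapses together:
a, b = 512, 515
assert to_8bit(a) == to_8bit(b) == 128
assert to_10bit(to_8bit(a)) == to_10bit(to_8bit(b)) == 512
```

So even if the source and destination are both 10-bit, one 8-bit buffer in the middle is enough to throw the extra precision away.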
 
Dave Baumann said:
If you have any more questions on AVIVO then put em down in here.

Do the first Avivo-enabled cards (the X1000 series) natively support the 1366x768 resolution used in almost every single LCD TV between 26" and 37"?

(The 9700, 9800 and X800 series didn't support it; they need resolutions that are multiples of 8 pixels.)
(Also, nVidia has an option that claims to support this mode, but in fact they cut 3 pixels from each side of the picture to get it to work, so basically you actually have 1360x768 with 6 pixels of underscan. grrr...)

So far, I have only a few reports from Matrox users claiming the mode is possible with Parhelia-core based cards. But because this claimed feature was faked by nVidia too, I strongly suspect that no one can do this right now. This is a real shame, because quite a few TV models have a DVI input, which would enable using the TV as a second monitor. (Plus, watching HD material on a big TV with pixel-perfect mapping is quite a different experience than on a monitor.)
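The multiple-of-8 constraint described above comes down to simple arithmetic (illustrative only; the actual driver behavior is exactly what's in question here):

```python
# Sketch of the multiple-of-8 width constraint: a panel's native width
# is rounded down to the nearest multiple of 8, and the leftover pixels
# end up as underscan split across both sides of the picture.

def nearest_supported_width(native_width, granularity=8):
    """Largest width <= native_width that is a multiple of granularity."""
    return (native_width // granularity) * granularity

native = 1366
supported = nearest_supported_width(native)
underscan = native - supported            # total pixels lost
print(supported, underscan, underscan // 2)  # 1360, 6 total, 3 per side
```

Which is exactly the 1360x768-with-6-pixels-of-underscan behavior reported for nVidia.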
 
1366x768 is the PAL HD resolution, isn't it? Not that the question is irrelevant coming from you, Nappe, of course.

I have a question regarding HDCP. It seems this feature is an extra and needs enabling by OEMs at a presumed cost. Is HDCP actually integrated into the GPU and turned off, or is it further down the architecture, needing a different or additional controller/chip/device on the PCB?
 
Mr. Rys trades riffs with Cheng on Avivo here: http://www.hexus.net/content/item.php?item=3702

This is muy interesting:

Addition: Godfrey mentions that ATI spent over three days trying to reproduce Anand's testing environment and results but with no joy, indicating that Anand's testing was flawed in some way. That backs up Godfrey's statement that the Intervideo decoder would have been the better one to use.


Who's got an AT account to helpfully engage in a little educational "let's you and him fight"?

Edit: If anyone would like to join in the fun, it's still polite (so far): http://forums.anandtech.com/messageview.aspx?catid=31&threadid=1716654&enterthread=y
 
Puttering thru the Hexus piece a bit further:

HQV is a good benchmark for SD DVD quality. We do encourage reviewers and discerning customers to use multiple benchmarks. Of course ATI is looking for benchmarks well beyond DVD and SD, and we will have more information for your readers shortly – we hope to help standardize the industry to a few reliable and objective benchmarks.

I wonder what they have up their sleeve there?
 
I think ATI is doing far too much boasting about video and seriously needs to demonstrate some quality first.

Let's see if ATI can deliver SD (DVD) quality that exceeds PureVideo's de-interlacing and cadence. Howsabout getting to basecamp, first?

Jawed
 
Jawed said:
I think ATI is doing far too much boasting about video and seriously needs to demonstrate some quality first.

Let's see if ATI can deliver SD (DVD) quality that exceeds PureVideo's de-interlacing and cadence. Howsabout getting to basecamp, first?

Jawed

Well, no one enjoys looking like a liar or a fool, so I like to see some confidence in public with a big name attached to it on these things --it gives them that much more incentive to actually perform or ding their own personal credibility.

But, yeah, if you're gonna talk the smack, you better bring your "A" game to the table sooner rather than later. It sounds to me like he's pointing at Cat 5.12, since that would be the last "before the fat man comes down the chimney".
 
There he is again. :LOL: Interesting point on the notebook side re h.264. It will be interesting to see how that impacts adoption of X1k notebook chips.

Edit:

The DVD royalty issues are the reasons why we need CDs. We cannot offer a decoder by download. You raise some interesting ideas that we have discussed but let me drop a hint…..what if in the near future, all this will be irrelevant.

If I had a nickel for every time I heard someone gripe about that one. . .Would be lovely to make it go away as an issue.
 
One thing I'm not clear about, now, is whether an X1600 or X1300 can decode H.264 as well as an X1800.

Jawed
 
Jawed said:
One thing I'm not clear about, now, is whether an X1600 or X1300 can decode H.264 as well as an X1800.

Jawed

The decode hardware is driven by the GPU clock, that much I've figured out. But I'm still not 100% sure how tolerant it is of the lower clocks without a driver to play with.

Heh, Brandon asked him the same thing about Floyd-Steinberg I did a while back:

Rys said:
From: Ryszard Sommefeldt
Sent: Monday, September 19, 2005 12:08 PM
To: Godfrey Cheng; Matthew Witheiler
Subject: RE: The dithering engine and AVIVO display block diagram

Spatial dithering using what algorithm? Floyd-Steinberg? More info!

Godfrey @ ATI said:
From: Godfrey Cheng
Sent: 19 September 2005 17:34
To: Rys; Matthew Witheiler
Subject: RE: The dithering engine and AVIVO display block diagram

Listen you biatch, do you really think we are going to tell anyone which
algorithms we use??

:LOL:
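Since Godfrey won't say which algorithm AVIVO's dithering engine uses, here's a sketch of the classic Floyd-Steinberg error diffusion Rys asked about. Purely illustrative: nothing here is ATI's actual implementation, and real hardware would use fixed-point math, not Python floats.

```python
# Floyd-Steinberg error diffusion: each pixel is quantized to the nearest
# output level, and the quantization error is pushed onto its right and
# lower neighbours with the classic 7/16, 3/16, 5/16, 1/16 weights.

def floyd_steinberg(pixels, levels=2):
    """Dither a 2-D list of grayscale values (0.0-1.0) in place."""
    h, w = len(pixels), len(pixels[0])
    step = 1.0 / (levels - 1)
    for y in range(h):
        for x in range(w):
            old = pixels[y][x]
            new = round(old / step) * step   # quantize to nearest level
            pixels[y][x] = new
            err = old - new
            # Diffuse the error: 7/16 right, 3/16 down-left,
            # 5/16 down, 1/16 down-right.
            if x + 1 < w:
                pixels[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    pixels[y + 1][x - 1] += err * 3 / 16
                pixels[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    pixels[y + 1][x + 1] += err * 1 / 16
    return pixels

# A flat 50% gray field dithers to a mix of pure 0s and 1s whose
# average stays close to 0.5 -- the eye blends it back to gray.
img = [[0.5] * 8 for _ in range(8)]
out = floyd_steinberg(img)
avg = sum(sum(row) for row in out) / 64
```

The same idea scales to dithering 10-bit output down to an 8-bit panel, which is presumably the job AVIVO's spatial dithering block is doing.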
 
geo said:
If I had a nickel for every time I heard someone gripe about that one. . .Would be lovely to make it go away as an issue.
And the only way they can do that should be glaringly obvious....:!:
 
Tahir2 said:
I have a question regarding HDCP. It seems this feature is an extra and needs enabling by OEM's at a presumed cost. Is HDCP actually integrated into the GPU and turned off or is it further down the architecture and needs a different or additional controller/chip/device on the PCB?

HDCP support is in the cards now (excuse the pun); it's there for getting content from the movie studios at high resolution --that's what it's enabling. They don't want to be ripped off, which is perfectly understandable. Basically, when you get your HD-DVD-ROM or BD-ROM drive in your computer and want to play HD content, you'll need both the video card and the monitor to be HDCP-compliant; video cards are set already, and monitors are starting to get it now.
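A toy illustration of the "every link must be compliant" behavior described above. The device names and the constrained fallback resolution are just assumptions for the sketch, not anything from a spec:

```python
# HDCP is a weakest-link chain: full-resolution protected playback only
# happens when every device in the output path is compliant; otherwise
# the source constrains (or refuses) the output.

def playback_resolution(devices, full_res=(1920, 1080),
                        constrained=(960, 540)):
    """Return the resolution the source will allow for this device chain."""
    if all(d["hdcp"] for d in devices):
        return full_res
    return constrained   # down-res fallback; exact figure is an assumption

chain = [{"name": "GPU", "hdcp": True},
         {"name": "monitor", "hdcp": False}]
print(playback_resolution(chain))   # non-compliant monitor: (960, 540)
```

Which is why a non-HDCP monitor on an otherwise compliant card still sinks the whole chain.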

I'm not sure where WMV fits into all that. If the security risk is at the DVI output, then WMV defeats it: you can view 720p/1080i movies with that and nab the DVI output (if you happen to have a DVI recorder/encoder, which doesn't exist in the market), which makes you wonder what the point of it all is. :)
 
Sweet Jaysus. :oops: Hopefully that'll make the apps boys sit up straight in their chairs and start clamoring for devrel to help them get that in their own apps.
 
What's most pertinent, I think, is that it's:
  • a killer app - the first for GPGPU
  • years away from being matched by x86
And with R580 on the horizon (twice as fast as R520?) x86 is looking even sicker.

It'll be interesting to find out if this is bandwidth or compute-limited.
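For a rough sense of the numbers behind the bandwidth-vs-compute question, a back-of-envelope pixel-rate calculation (the 4:2:2, 8-bit surface format is just an assumption for the sketch):

```python
# Rough arithmetic on the raw pixel rates involved in a 1080-line stream,
# before counting any of the reference-frame reads a transcode needs.
width, height, fps = 1920, 1080, 30
pixels_per_sec = width * height * fps
bytes_per_pixel = 2          # assume a 4:2:2 surface, 8 bits/component
stream_bytes = pixels_per_sec * bytes_per_pixel
print(pixels_per_sec, stream_bytes / 1e6)   # ~62.2M pixels/s, ~124 MB/s
```

The raw stream itself is small next to a modern card's memory bandwidth, so if transcoding turns out bandwidth-limited, it would be the repeated reference-frame traffic doing it, not the output stream.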

As it happens I'm personally not interested in video transcoding, and we've still gotta see whether it makes it to the bigtime...

Jawed
 