Purevideo and AVIVO

Tahir2

Veteran
Supporter
Now I really hope this is OK... one thing I can't stand is when there are too many discussions running concurrently in a thread and the thread gets closed because of some entirely unrelated issue.
This can leave questions unanswered about the other discussions and debates raised.

With that in mind I have created a new thread quoting from myself and Democoder.

Apologies if this causes any problems... I don't see why it should. I also think it should remain in this forum because the pixel pipelines normally reserved for 3D calculations are beginning to be used for video decoding.

"...[R520 and R580] has a much more competent Video Decoding engine." - Tahir

"Evidence? Last couple of times that they were compared, NVidia's DVD decoder still produced better images than ATIs, and NVidia eventually did deliver HD decoders, proving atleast some of the worth of having a programmable video processor onboard the chip. In fact, from a capability perspective alone (video programmable processor vs fixed function+pixel shaders), NVidia's looks better from a tech-geek architectural perspective atleast." - Democoder

http://www.beyond3d.com/previews/ati/avivo/awu/

http://www.beyond3d.com/previews/ati/avivo/awu/index.php?p=03

Links provided by Tahir


"You asking me? Go read Anand's DVD codec comparisons complete with screenshots showing artifacts that occur. Even if they have improved in recent drivers, where's the evidence that video engine is "much more competent" than NVidias? Seems like a pretty bold statement to make, especially since they were playing catchup with NVidia's PureVideo codec. Would you care to explain the competency issue differences between ATI and NVidia's HW and SW video processing? (not you geo, but Tahir)" - Democoder


"Well, that's after they updated their driver. Back when I bought a 6600GT for a cheap HTPC, this or this or this review page for example, showed that PureVideo did better than AVivo. Probably as a result of reviews like this, ATI fixed their codec, and does better for example now in the color test and cadence test. No doubt, NVidia will fix theirs. This is a software issue (cadence detection is not done by some fixed function HW unit that forever dooms you as unfixable)

This is an area where quality is going to flipflop. Just 4 months ago, people in AVSForum were praising PureVideo over ATI. I honestly haven't kept up, since I already bought my 6600GT back then. But most people understood the problem to be software, and would obviously be fixed.

With experience building H.264 codecs still in its infancy, there are going to be numerous issues with the first batch of HDDVD/BR movies and you are going to see ATI and nVidia go back and forth.

That's why I object to the claim of a much more competent video engine, since it seems to suggest a fundamental hardware inferiority. This is not at all like quality differences in AF or AA. If, back in November, I had said that NVidia's video engine is much superior to ATI's, there would be tons of "but but... wait for updated drivers!" posts. And of course, B3D moderators would rush in with quotes from ATI engineers to assure people. :smile:

To me, when you talk about video engine, you're talking about performance and accuracy of HW acceleration parts of your HW, like motion-comp, DCT/iDCT, etc. These fixed function units can have precision differences, which if a problem is found, can't be fixed except by switching to software decoding for that part of the codec pipeline, or some other "workaround" like maybe using GPGPU techniques." - Democoder
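
For concreteness, the precision point in that quote can be sketched like this (a toy in Python, not modelled on either vendor's actual silicon): a reference 8-point iDCT next to a fixed-point version whose cosine table is rounded to a few fractional bits. If a unit like the second one is etched into the chip, its rounding error is permanent, and the only escape is doing that stage in software or on the shaders:

[code]
import math, random

def idct_1d(coeffs):
    """Reference 8-point inverse DCT (orthonormal DCT-III)."""
    n_pts = len(coeffs)
    out = []
    for n in range(n_pts):
        acc = 0.0
        for k in range(n_pts):
            ck = math.sqrt(0.5) if k == 0 else 1.0
            acc += ck * coeffs[k] * math.cos(math.pi / n_pts * (n + 0.5) * k)
        out.append(math.sqrt(2.0 / n_pts) * acc)
    return out

def idct_1d_fixed(coeffs, frac_bits=6):
    """Same transform, but with the cosine table rounded to 'frac_bits'
    fractional bits - standing in for a low-precision hardware unit."""
    n_pts = len(coeffs)
    scale = 1 << frac_bits
    out = []
    for n in range(n_pts):
        acc = 0
        for k in range(n_pts):
            ck = math.sqrt(0.5) if k == 0 else 1.0
            basis = round(ck * math.cos(math.pi / n_pts * (n + 0.5) * k) * scale)
            acc += int(round(coeffs[k])) * basis
        out.append(math.sqrt(2.0 / n_pts) * acc / scale)
    return out

block = [random.uniform(-255, 255) for _ in range(8)]
worst = max(abs(a - b) for a, b in zip(idct_1d(block), idct_1d_fixed(block)))
print("worst-case mismatch vs reference: %.3f" % worst)
[/code]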
OK, Democoder, it seems your main issue with my first statement is that I said "[R520 and R580] has a much more competent Video Decoding engine."

You are no longer disputing that ATI has the better overall solution (both software and hardware) at this time; your point is that this may change as ATI and NVIDIA update their drivers.

First let me define Purevideo and AVIVO as simply as possible... you may disagree with this definition, but if the disagreement is strong then we will simply be talking about two different things.

"Purevideo and AVIVO are NVIDIA's and ATI's solutions for video playback incorporating in their GPU architectures currently. Both solutions require dedicated hardware and software as a minimum to work, with AVIVO also using the pixel pipelines.

Both solutions are designed for MPEG2 and H.264 decoding. I am not concerned with the encoding abilities of either architecture, if they exist."

With this in mind, Anandtech and others released articles around December 16th/17th 2005 when AVIVO was updated. HQV, a subjective but repeatable test with strong checks for best practices and consistent examination, was used as the benchmark by those sites.

In these initial tests AVIVO was no longer behind Purevideo with regard to decoding accuracy; it showed the highest image quality and the lowest CPU utilisation.

Since Anandtech's and Firingsquad's initial articles there have been no comparisons between the competing technologies, and there seems to be no update apart from Beyond3D's recent article.

Democoder used Anandtech's article as a guide, but it is now out of date.

Purevideo has been updated twice since then by NVIDIA and AVIVO has been updated several times as well.

My strong assertion is that Anandtech's December 16th article should not be used to compare Purevideo and AVIVO at this time. It is not standard practice to use old drivers to compare or review a graphics card on any site that has a good reputation for hardware reviews.

There is no Purevideo test after NVIDIA's improvements but there has been one for ATI by Beyond3D, published on February 14th.

Before the updates, the initial HQV tests resulted in the following scores:

October 5th 2005:

NVIDIA - 51 ATI - 38

http://www.anandtech.com/video/showdoc.aspx?i=2551&p=10

I believe Democoder was referring to this article. If there is a more recent comparison between Purevideo and AVIVO by Anandtech then please let me know. I use Anandtech because Democoder did initially.

If we are to use other sources, there is a more recent direct comparison here:

http://www.extremetech.com/article2/0,1697,1916968,00.asp (last page)

The scores are:

*NVIDIA (PV) - 51 **ATI - 103


*(concurs with AT)
**(concurs with Beyond3D's test of the Catalyst 6.2 drivers)

We have a final update from Beyond3D on 26th Feb giving ATI's AVIVO a score of 113.

http://www.beyond3d.com/previews/ati/avivo/awu/index.php?p=03


The question is: can NVIDIA close this gap? Purevideo has been improved for the G71 architecture, but because ATI have the superior hardware and software solution in AVIVO (video playback), I do not believe G71 and Purevideo will be able to touch it.

That is my argument, Democoder... please rip it apart ;)
 
I don't disagree with the data, I disagree with your conclusion (superior hardware). ATI gets a full 30 points from film cadence detection alone, but there is no special mojo ATI HW has that enables it to do better film cadence detection. This is purely a software issue. NVidia actually has a programmable onboard unit that can do cadence correction (used for 3:2 right now), it may or may not be possible for it to be used for other cadences, but regardless, for non-3:2 cadences, they could use the same technique ATI is using.
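
To illustrate why I call it a software issue, here is a toy 3:2 detector in Python (a sketch, not anyone's shipping algorithm). All it needs is a per-field difference metric, which can be produced by the CPU, by pixel shaders, or by a dedicated unit; the pattern matching on top of it is trivial:

[code]
def detect_32_pulldown(field_diffs, threshold, window=15):
    """Toy 3:2 (telecine) cadence detector.

    field_diffs[i] is a sum-of-absolute-differences between field i and the
    previous field of the same parity.  In 3:2 telecined material one field
    in every five is a repeat, so the 'near duplicate' flags should all land
    on a single phase modulo 5.  Returns the locked phase, or None.
    """
    if len(field_diffs) < window:
        return None
    dup = [d < threshold for d in field_diffs[-window:]]
    for phase in range(5):
        if all(dup[i] == ((i % 5) == phase) for i in range(window)):
            return phase
    return None
[/code]

A real implementation would add hysteresis and handle cadence breaks, but none of that needs special hardware, which is the point.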

The other score difference is adaptive noise removal, again, another non-hardware issue. It took many driver updates for ATI to get where they are today, so I suggest you cool the hyperbole and wait another couple of updates.

BTW, did ATI fix this http://www.anandtech.com/video/showdoc.aspx?i=2551&p=7 issue which is not covered by HQV scoring?
 
DemoCoder said:
I don't disagree with the data, I disagree with your conclusion (superior hardware). ATI gets a full 30 points from film cadence detection alone, but there is no special mojo ATI HW has that enables it to do better film cadence detection. This is purely a software issue.

According to ATI's multimedia team this is indeed a hardware function. This is backed up by the fact that when it was enabled in 5.13 the CPU utilisation decreased - if it were doing software detection then you'd expect the CPU utilisation to rise. The R520 delays put a lot of the software development behind schedule, and until 5.13 much of the video engine just wasn't being used.
 
This doesn't tell you if it is a special hardware unit, or whether it is just using ordinary GPU calls (which will work on any compliant DX9 HW) to accelerate a portion of the cadence correction or the detection.
 
DemoCoder said:
BTW, did ATI fix this http://www.anandtech.com/video/showdoc.aspx?i=2551&p=7 issue which is not covered by HQV scoring?

Please ask Anandtech.

I am not able to test this problem as I do not have the same movie as Anandtech used nor do I know what frame that is.

I have not been able to replicate the problem on an X1800XL, which I am borrowing, with any DVD material I have. Firingsquad said this:

Although (or because?) NVIDIA PureVideo cannot detect the unusual cadences, its 3:2 algorithm is faster and is able to lock onto cadences such as some of the anime test clips. ATI believes that NVIDIA PureVideo is faster because the PureVideo MPEG-2 decoder is able to provide more information to the GPU.

You keep going on about software issues and how they could be resolved by future updates... I won't follow in your footsteps, as this is probably a weakness ATI have tried to address but cannot fully.

Democoder you use Anandtech, but that article is many months old. You even referred to that article in the initial posts. The issue with doing that is... most of the data is woefully out of date.

FYI: I did not expect ATI's AVIVO to be anything other than a gimmick. I took home an X1800XL from work because I had upgraded to a new screen, and when playing 1080p T2 and XMEN III trailers (H.264) on my 6600GT I experienced a strange shimmering effect.

To make sure it was not my screen causing a problem at such a high playback resolution, I changed graphics cards for testing purposes. The issue resolved itself (it could have been software/drivers), but I also noted the superior playback of MPEG and AVI files.

AVIVO is very impressive and playing back movies on my PC is an area I am definitely interested in.

I believe my hyperbole is warranted; I certainly don't need a website to tell me what is best, since I work with the latest PC hardware all the time - and your reliance on "old data" from Anandtech (an OK tech review site most of the time) is a bit surprising to say the least.

I personally would like to see a new comparison between AVIVO and Purevideo soon, as NVIDIA have been making improvements as well. However, I stand by what I said: I do not think Purevideo is going to close the gap to AVIVO this generation.

Firingsquad's conclusion was:

Ten days ago, ATI had the worst video quality on the PC. With this new driver, ATI has jumped to the top of the class and then built a nice lead. For Hollywood films, ATI edges out NVIDIA's PureVideo. Although the cadence detection is faster with NVIDIA, ATI has noise reduction that will improve the quality of films once the cadence is locked on. In other words, for most Hollywood films, ATI will look better for the 2 hours whereas NVIDIA might look better for the few seconds after a bad edit, if such a bad edit exists. When it comes to processing interlaced video, ATI now surpasses anything that we've seen on the PC, with the best implementation of diagonal filtering we've seen yet. When it comes to unusual cadence detection, ATI AVIVO has no peer in its price range.

Even so, with Catalyst 5.13, ATI is making a serious challenge to the dedicated video processors that cost $3000+ and they're doing it as a free upgrade. With a fully programmable architecture, there is also room for ATI to grow. Companies like Faroudja, Silicon Optix, and Gennum should take notice of this new competitor to the market.

Now that is what I call real hyperbole!

But since we are on his site I will let Dave Baumann have the final words:

* Note: In later Catalyst drivers there is a "3:2 pulldown" option under the de-interlacing section of the video controls, enable this to get the best quality out of film playback as this appears to enable the cadence detection.

Finally, possibly the most important element to many users currently is that of playback performance of video as it stands now, and it's with the quality aspects that the Avivo Winter Update makes the greatest strides. For the most part it's with interlacing and cadence detection that ATI have significantly increased their quality, with de-interlacing benefiting any interlaced source, including live feeds from a TV tuner source (such as an add-in TV tuner board, a video input to the graphics card or a tuner on an All-In-Wonder board), and the cadence detection benefiting sources filmed at different FPS's, which is frequently going to occur with DVDs and their transfer from 24 FPS film.

It seems Dave is not that skilled in the art of hyperbole. ;)
 
Tahir2 said:
You keep going on about software issues and how they could be resolved by future updates... I won't follow in your footsteps, as this is probably a weakness ATI have tried to address but cannot fully.

Well, I disagree. ATI can address it. There is no magic to film cadence detection. MPlayer, for example, offers 4 different cadence detection algorithms, each with varying degrees of "speed of lock-on", accuracy, and CPU usage. If ATI wanted to address it, they could simply add an option to alter the algorithm used. The only reason not to do it "all the time" is that there are tradeoffs. The ultimate "cadence detection" algorithm is two-pass: you completely process the movie first, and then you play it back. However, your "latency" for live playback is 100% :)
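
A sketch of that "ultimate" two-pass idea, just to make the tradeoff visible (again a toy in Python, not MPlayer's actual filter code): pass one looks at every field difference in the clip and votes on the repeat-field phase, so the playback pass can weave fields with total confidence, but you cannot start watching until pass one has finished:

[code]
def two_pass_cadence_phase(field_diffs, threshold):
    """Pass 1 of a two-pass inverse telecine: vote, over the whole clip,
    on which phase (mod 5) the repeated fields fall on."""
    votes = [0] * 5
    for i, diff in enumerate(field_diffs):
        if diff < threshold:
            votes[i % 5] += 1
    return votes.index(max(votes))  # the playback pass weaves/drops fields using this
[/code]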


Democoder you use Anandtech, but that article is many months old. You even referred to that article in the initial posts. The issue with doing that is... most of the data is woefully out of date.

As I already said, I've been out of it for a few months, and I don't keep up to date on all driver releases. That said, I don't dispute the data; I dispute your erroneous conclusion that the differences between PV and AVivo in cadence and noise reduction have something to do with innate HW differences and are unresolvable. You are pronouncing conclusions, and the burden of proof is on you.

FYI: I did not expect ATI's AVIVO to be anything other than a gimmick. I took home an X1800XL from work because I had upgraded to a new screen, and when playing 1080p T2 and XMEN III trailers (H.264) on my 6600GT I experienced a strange shimmering effect.

Well, I have the T2 WMV9 edition, and I don't notice any shimmering on my 6600GT. I routinely run the T2 WMV9 version against the DVD version to show off the differences for visitors. I've been showing this off practically since it came out. I don't know what's wrong with your system, but mine has no shimmer issue.


I believe my hyperbole is warranted; I certainly don't need a website to tell me what is best, since I work with the latest PC hardware all the time - and your reliance on "old data" from Anandtech (an OK tech review site most of the time) is a bit surprising to say the least.

Well, I don't work on the "latest PC HW". I work on HT HW and I don't feel a need to upgrade constantly, since playback of MPEG-2 DVDs is a commodity, frankly. My HTPC is a small/cheap/quiet system that sits on a small shelf, connected to a DVI/HDMI switch (thru a device which removes HDCP), and onto an $8000 WXGA projector, which goes onto a 10ft diagonal screen. I am less interested in PC playback than in the need to run WMV9 playback temporarily until BRDVD/HD-DVD. Yes, it is impressive the strides that PCs are making compared to video processors (and why not, given the power), but there is still no HTPC that does what an iScan HD+ does. The major issue for me is converting analog outputs of older HW to HD resolutions and HDMI, since the "builtin" scaling of the display devices sucks.


However, I stand by what I said: I do not think Purevideo is going to close the gap to AVIVO this generation.

On what basis? Why don't you explain how cadence detection works, and why NVidia can't do it but ATI can.

The reality is, most DVDs created today are "clean" edits, use MPEG-2 flags appropriately, and don't benefit as much from non-standard cadence detection. Crappy TV series and el-cheapo produced DVDs do, but major film studios? No. King Kong, for example, does not need it, and will look the same on PureVideo, AVivo, and $26 CyberHome Chinese players.

This is because the DVD tools used by studios have continually evolved. Producing a high quality encode with clean transitions between edits is practically fully automated now.
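
For what "using the MPEG-2 flags appropriately" means in practice, a simplified sketch of soft telecine in Python (the field labels are illustrative, but top_field_first and repeat_first_field are the real MPEG-2 picture flags): the disc tells the decoder exactly which field to repeat, so there is nothing to detect:

[code]
def fields_for_display(frame, top_field_first, repeat_first_field):
    """Expand one decoded 24fps film frame into the fields a 60Hz display
    should show, driven purely by the MPEG-2 picture flags (soft telecine)."""
    first, second = ('top', 'bottom') if top_field_first else ('bottom', 'top')
    fields = [(frame, first), (frame, second)]
    if repeat_first_field:
        fields.append((frame, first))  # the third field of a "3" in the 3:2 pattern
    return fields
[/code]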
 
On what basis? Why don't you explain how cadence detection works, and why NVidia can't do it but ATI can.

Why don't I? I cannot; I simply don't have that kind of technical knowledge. Does that mean I am out of my depth making statements like "ATI have a superior video decoding engine"? Maybe, but the presented data does point that way. A lot of sites do agree that ATI is doing very well in this department.

I also think the fact that ATI is using the pixel shaders to some degree and not entirely fixed function hardware is very cool from a tech-geek perspective.

If ATI can address the cadence issue (and they may have done so with the latest 6.4 drivers, as there is a tick box for 3:2 pulldown in the Catalyst Control Center) then that is even better. I understand what you are saying with regard to playing a whole movie back for the best cadence detection. Nice for encoding software, bad for watching a movie.

As to pronouncing conclusions, I would love to see Dave or any other site do a review of Purevideo again. I don't have the skills needed to do it myself. From what I have shown though, the evidence does look good for my side at this time. If NVIDIA improve Purevideo to beyond AVIVO levels for the 6600GT that is great - I own one of those cards too.

The issue with T2 and XMEN III is unresolved, but I needed to check quickly that it wasn't my brand new screen causing the problem. I had just reinstalled Windows and didn't want to fiddle with it too much. I probably will have to when I put the 6600GT back in my system. Putting an X1800XL in my system was a real eye-opener for me. I was genuinely impressed, and since ATI have been making AVIVO noises, I concluded from circumstantial evidence that ATI has dedicated more resources to its playback engine and software than NVIDIA has. If this is untrue and it is all software related, my definition does include the software element as well.
It is also important to note for the consumer that Purevideo is not free whereas AVIVO is (apart from H.264 acceleration at this time).

I understand most people do not have the privileges I do with regard to getting to work with the latest hardware and taking it home. It is one of the reasons I am still at this job. I love this stuff and I get paid for it. It isn't rocket science; I just need to know how to solve problems logically, have a good memory and know how to use a screwdriver.

The reality for me is that I like to watch movies on my computer sometimes when I am not able to on my main TV, and right now ATI do this better across a variety of formats. The differences are not trivial, as noted by Firingsquad, and they apparently do employ an expert in the home theatre field.

I hope you are satisfied as to why I said what I did in the other thread, and I hope you have been at least partially updated on the benefits of AVIVO.
 
It looks like I have my Purevideo/AVIVO update as well:


We've seen now how well ATI does with DVD processing, and the benchmarks show that ATI does a better job at this overall than NVIDIA. Here are the final HQV benchmark scores.

ATI: 111 NVIDIA: 68

NVIDIA ForceWare 81.98
Catalyst 6.1
 
Hasn't there been some discussion tho of PV being more programmable than Avivo? I'm remembering that re H.264, given that PV is obviously somewhat older than Avivo and yet NV has been able to bring H.264 accel. So I'm not quite ready to declare Avivo more competent, etc. as a broad generalization. Tho I want to see NV bring some more IQ wood with low CPU utilization before I hand them the palm either.

But, at least for some months now, ATI has had the IQ award again, and unless NV wants conclusions to be drawn about their actual ability to match it, they need to get off their duff sooner rather than later. ATI took what? Not quite 3 months from Avivo release to fire it up with the enabling software? NV has had nearly 4 months since to match it if they can.
 
I seem to remember that H.264 acceleration was touted as a feature from the days of the GeForce 6 series.
Also, there are meant to be improvements to Purevideo in G71, but what these are... I don't know.

I don't think the GeForce FX series accelerates H.264 with its Purevideo, but I may be wrong there - if it doesn't, then there is likely to be different hardware within Purevideo depending on the generation of gfx card from NVIDIA.

Edit: nice table from NVIDIA

http://www.nvidia.com/page/purevideo_support.html#geforce

There was no Purevideo with the FX series.
 
Tahir2 said:
I also think the fact that ATI is using the pixel shaders to some degree and not entirely fixed function hardware is very cool from a tech-geek perspective.

Well, but it also means that the techniques they use would be more or less translatable to any sufficiently performant DX9 HW. I give NVidia slightly more props for trying to supply a more general-purpose unit that is more amenable to dealing with the video codec pipeline, which is only partially parallelizable to stream-based GPU paradigms. Although, I am somewhat disappointed that it appears not to be general enough to deal with quantization or, more importantly, entropy coding, where a big CPU chewer is CABAC/Arithmetic Coding.
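
To show why entropy decoding in particular resists being spread across shader pipelines, here is an idealized binary arithmetic decoder in Python (a toy, not the actual CABAC engine, which also renormalizes and adapts its contexts): every decision rescales the interval that the next decision is decoded against, so the bins can only come out one at a time:

[code]
def decode_bins(code_value, n_bins, p_lps=0.2):
    """Idealized binary arithmetic decoder: 'code_value' is the encoded
    number in [0, 1).  Each bin narrows the interval that the next bin
    is decoded against, which is what makes the process inherently serial."""
    low, high = 0.0, 1.0
    bins = []
    for _ in range(n_bins):
        split = low + (high - low) * (1.0 - p_lps)
        if code_value < split:   # most probable symbol
            bins.append(0)
            high = split
        else:                    # least probable symbol
            bins.append(1)
            low = split
        # Real CABAC would also update the probability model here,
        # adding a second serial dependency (context adaptation).
    return bins
[/code]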


As to pronouncing conclusions, I would love to see Dave or any other site do a review of Purevideo again. I don't have the skills needed to do it myself. From what I have shown though, the evidence does look good for my side at this time. If NVIDIA improve Purevideo to beyond AVIVO levels for the 6600GT that is great - I own one of those cards too.

Well, the evidence looked 180 degrees opposite back in November, didn't it? Which shows you the problem of relying on correlations based on a time series of driver releases. In computer software/hardware, we don't need to rely on such roundabout reasoning, since the exact answer can often be obtained from knowledge of the HW and SW and logical reasoning.

I was genuinely impressed, and since ATI have been making AVIVO noises, I concluded from circumstantial evidence that ATI has dedicated more resources to its playback engine and software than NVIDIA has. If this is untrue and it is all software related, my definition does include the software element as well.
It is also important to note for the consumer that Purevideo is not free whereas AVIVO is (apart from H.264 acceleration at this time).

Well, that may or may not be the case. Didn't NVidia hire the guy who wrote PowerDVD, or was it WinDVD? It's like those "DEC" guys we always hear about with their rewrite of the ATI pixel shader compiler. Sometimes it takes a long time for a rewrite to bear fruit.

You think getting charged for DVD playback is bad now? Wait until HD-DVD/BR-DVD. Those are *LOADED* with patent royalties that must be paid.


I hope you are satisfied as to why I said what I did in the other thread, and I hope you have been at least partially updated on the benefits of AVIVO.

Well, I have always been partial to ATI for HTPC; especially before NV40, ATI always had a more complete solution (AIW, etc). I just don't think, HW-wise, there is any reason why NVidia can't deliver something that passes all the cadence tests. It's a matter of dedicating resources to it, which we don't know if they are doing or not, or whether they are holding back for a "big" driver update at some point, or for "Vista"'s playback model. It could be they are putting all their resources into Vista and VC-1/H.264 and the new copy protection model, and are simply placing XP PureVideo in "maintenance mode", fixing only supercritical bugs (crashes, etc).


Frankly, H.264 is so hellishly intensive to decode (and worse to encode), and the copy protection of the AACS architecture so substantially more complex, that I can't see how they can possibly support next-gen formats without a substantial rewrite of PureVideo.
 
Interesting discussion chaps, especially as I intend to build an HTPC sometime this summer, but I wonder whether it should be moved to the "PC Video & HTPC" forum? That's certainly where I'd normally look for such a thread!
 
DemoCoder said:
This doesn't tell you if it is a special hardware unit, or whether it is just using ordinary GPU calls (which will work on any compliant DX9 HW) to accelerate a portion of the cadence correction or the detection.

DC, as others will attest, I was a skeptic about the entire video decoding side of AVIVO when X1000 was initially released - I thought the video hardware was the same as R4xx, and that AVIVO for X1000 basically amounted to the display pipeline from a hardware perspective. It wasn't until I did the Cat 5.13 testing that I realised it wasn't.

I did the CPU testing because I fully expected the CPU utilisation to rise, on the notion that they were doing some software-based cadence detection, which would logically have a CPU overhead. When I ran the test on standard videos and found the utilisation to go down, I then also looked at the CPU utilisation of the HQV cadence test itself, to verify that standard DVDs weren't decreasing CPU utilisation while the HQV test was increasing it. When it appeared that CPU utilisation was going down in all cases, I checked back with ATI to find out what was going on.
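
For anyone wanting to repeat that kind of check, a minimal sketch of the idea (assuming Python with the psutil package; Task Manager or perfmon gives you the same numbers):

[code]
import time
import psutil  # assumed available; any system monitor works just as well

def average_cpu_load(duration_s=60, interval_s=1.0):
    """Sample overall CPU load while a clip is playing.  Run it once with
    the driver feature enabled and once disabled: if utilisation drops when
    the feature is on, the work has moved off the CPU (to shaders or
    fixed-function hardware) rather than onto it."""
    samples = []
    deadline = time.time() + duration_s
    while time.time() < deadline:
        samples.append(psutil.cpu_percent(interval=interval_s))
    return sum(samples) / len(samples)
[/code]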

The video decoding part of AVIVO spans 3 areas: fixed function video hardware, shaders and software. The main elements that have changed with X1000 are both the software and the hardware, in that functionality from the Xilleon side of the business has been implemented in both. Here's ATI's answer to my question on this point:

With respect to your questions, the Avivo functions are a mix of shader code and fixed-function hardware. We've taken both silicon blocks and software algorithms from our Xilleon DTV team and either converted those software algorithms to shader code or used them directly on the same fixed-function hardware. This is how we were able to very quickly go from nothing to nearly everything in HQV: we already had much of the work done on the Xilleon side.

With respect to CPU load, I'm not quite sure what you're getting at. Are you referring to lower CPU load on a post 5.13 driver with the Avivo features? If that's the case, what you're seeing is that the GPU (not CPU) is analyzing the frames of the video, and from there it's deciding which cadence is being used. This is similar to the way a high-end TV would process a signal (not surprising, as it's partially Xilleon DTV algorithms in use).

You'll see for instance that a 3:2 pulldown detection takes about 3-4 frames (as few as is technically possible) before the cadence is detected (the HQV "race car" clip is a good example).
 
Figuring out what is hardware and what is software in the video world-let of PC graphics cards is a pretty frustrating piece of work, usually more full of hints than conclusive proof.

Having said that, there does seem to be one pretty good pointer, and that is not what the hardware does right after release, but what it does after it's been optimized. Both companies have shown this pattern, not just ATI with their late R520.

NV40 was announced April 14, 2004. PureVideo, the software to exploit the new PV hardware, wasn't announced until Dec 21, 2004 -- 8 months later. Did the busted implementation in NV40 play a role in that timeline being extended? Well, we haven't heard that conclusively, but it stands to reason, doesn't it? How long would it have taken if NV40 wasn't busted? Dunno, but I tend to think it still would have taken months, because if they'd known NV40 was busted before release, they would have fixed it. As it was, PV still took another 4.5 months after the 6600 announcement (Aug 12, 2004), which had the fully working hardware block. Is there "a hardware block" at all? Yes, clearly there is -- or the limitations noted in many NV documents re NV40 would not exist.

X1K was announced Oct. 23, 2005. The Avivo Winter Update, which actually enabled using the new hardware for video, came on Dec 17th/23rd, 2005, two months later. We have various statements that the bug in R520 put a sizeable monkey wrench in all software work for it during its delay. Is there a hardware block? The evidence is not quite as blatantly obvious without a broken implementation in one member of the family to put the flaming letters in the sky, but the clues seem consistent with one (even ignoring ATI's statements to that effect).

So, yeah, certainly even *after* the point where initial drivers for new video hardware come out, there can be improvement on the software side that results in new IQ/features. But the overall picture is certainly complicated by the fact that, at least in recent years from both IHVs, you're not even looking at what the hardware can do in initial examinations of the new parts, which makes baseline comparisons "at release" fairly useless.
 
geo said:
Figuring out what is hardware and what is software in the video world-let of PC graphics cards is a pretty frustrating piece of work, usually full more of hints than conclusive proof.
Very well said.

Some things that instantly came to my mind when reading this subject:

In the end, it isn't really that important that every IHV tells us exactly what is done on their hardware [and which part of it] and/or in their software. To put it fairly bluntly, I don't care as long as it's working and doing its job well, meaning it has some image-improving algorithms (not artificial and subjective enhancements, but pulldown detection, light noise reduction, you get the idea...), displays everything in the right colorspace, has adaptive algorithms which take content into account, is a little less CPU-bound, has good signal integrity, etc. These are the things which have a use and are a benefit.

I think that everyone who has some insight into programming and/or hardware in general pretty much knows that there has to be decent driver support (software), because no HW in this world would be able to function and be a good fit for some task without it. So there is usually "something" in every piece of HW which you could theoretically utilize better, but there are always things like time (timing) to take into consideration, like the adoption of H.264.

It's also one of the things that really bothered me, because I'm quite keen on all those video capabilities myself. Marketing "can" make everything look "awesome", but until you finally test it as a customer there's simply no way of knowing anything about its real usefulness, speaking from the consumer side. It doesn't help anyone out there "knowing" that theoretically there is HW for it (fixed function) but it isn't being used.

Speaking of NV in particular, they have a pretty bad history of bugged HW and software, especially with regard to NV40, on top of which comes the need to pay for a decoding filter to get the most out of it (or to use the really useful stuff at all), which I find somewhat ridiculous. Yes, there are royalties to pay, but why announce something that you cannot benefit from in the first place, at least not until you pay another fixed amount of money to finally get that benefit? Democoder already addressed the fact that this will get even worse when BD / HD-DVD start to appear. So in the future we'll have these 50 million transistors of fixed-function HW, knowing that we can't benefit from them until paying something like >50 dollars. That's a no-go for me. I'd rather stay away from it altogether and "just" use a good standalone player for that... (this shouldn't be the IHV's goal, should it?) At least then I know what I am paying for, and I have the opportunity to read about the "real" qualities of it in a couple dozen magazines to finally make a decision.

ATI, on the other hand, unfortunately introduced AVIVO quite late, while its announcement made it seem like all of it was already implemented when you bought an X1000. As it is now, at least it "seems" to be working OK as far as its theoretical usefulness goes. Their XCode software, however, is quite useless when you want quality transcodes, which is one of the things I really don't understand; it doesn't even seem to use their HW. Personally, I don't want to know how much time they spent coding this, since everyone I asked really didn't know what "market" they want to address with it. Consumers who don't care about quality? Why buy "high-end" HW in the first place? Because theoretical capabilities are becoming more important than really useful capabilities? Sorry, no thanks.

B3D's articles on this very subject thankfully shed some light on all of that, and - looking through the web - it's also one of the few sites who even care to test it in some real applications and are trustworthy. Sites like Anand are another story (not that I believe everything Kristopher writes, but it's a start...).

EDIT: Some typos fixed.
 
Avivo XCode seems to provoke fairly strong and opposing reactions in people. Bit-tech gave it an award not so long ago.

In terms of usability, there's also no doubt that the Avivo Converter is the best on test here. It's very fast, and requires just a couple of clicks to produce great output, with the aspect ratio and resolution being handled for you with the minimum of hassle.

http://www.bit-tech.net/bits/2006/04/05/psp_video_converters/5.html
 
But does Avivo XCode actually use the GPU at all? I seem to remember reading that it doesn't actually use any of the capabilities of the GPU but is an entirely software-based solution.

If this is the case, how feasible is it that we'll see Video Encode assistance from the current generation of GPUs? IIRC, NV40 was supposed to offer this but we've heard nothing further from the NV camp since the launch.
 