R420 Hints from people under NDA!

Re: Why not use an external encoder?

Scarlet said:
One thing I always wonder is how ATI does all the stuff they do considering the size of their company. It seems they have more diverse interests than nV does.
ATI's not that small. They were the leader in PC graphics sales for a few years before nVidia pulled ahead.
 
I dunno - ATI isn't hesitant about using external components on their AIW products.

I don't know why they would hesitate to use an external encoder solution if they thought it was in their customers' best interests.

Apart from the usual non-IHV components, the only non-ATI element is the tuner, as is the case with everyone else. They have been using Theater 200 for the conversion of the analogue stream to digital and for processing the audio I/O - AFAIK NVIDIA have been using a Philips chip for this processing.

Again, given ATI's background, which has always been heavily focused on video, if they did need to make changes I would think the natural assumption is that they would build their own solution rather than use someone else's.

Scarlet said:
One thing I always wonder is how ATI does all the stuff they do considering the size of their company. It seems they have more diverse interests than nV does.

ATI is a fair bit larger than NVIDIA.
 
I'm not saying anything, Dave, because I don't know the specs of the R420. On the other hand, to say that "ATI already has a programmable video processor, they use the shaders" (so can any other DX9 card) as a response to NVidia's VOP is just misleading. The two solutions do not help solve the same problems in the video codec pipeline.

It's not misleading if they are both doing similar things - using the programmable functionality to lift processing from the CPU, which they both are. Neither solution is completely removing the CPU from the equation.

And as for saying other cards can use shaders, the question is - do they?

I don't know. They could choose to make the Theater chip programmable, add fixed-function motion estimation, or simply buy a cheap encoder IC from one of the dozens of companies selling them.
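(Aside, to ground the motion estimation point: it is the brute-force search stage that makes encoding expensive, which is why it is the classic target for fixed-function hardware or an off-the-shelf encoder IC. Below is a minimal CPU-side sketch of full-search block matching with sum of absolute differences - the 16x16 block size, search range, and function names are purely illustrative, not any vendor's implementation.)

```c
#include <stdint.h>
#include <stdlib.h>
#include <limits.h>

/* SAD between a 16x16 block of the current frame and a candidate
   block of the reference frame. `stride` is the row pitch in bytes. */
static unsigned sad16(const uint8_t *cur, const uint8_t *ref, int stride)
{
    unsigned sad = 0;
    for (int y = 0; y < 16; y++)
        for (int x = 0; x < 16; x++)
            sad += abs(cur[y * stride + x] - ref[y * stride + x]);
    return sad;
}

/* Exhaustive search over a +/-range pixel window around the block's
   own position. The caller must ensure the whole window lies inside
   the reference frame. This O(range^2) loop per macroblock is why
   motion estimation dominates encoding cost. */
static void full_search(const uint8_t *cur, const uint8_t *ref, int stride,
                        int range, int *best_dx, int *best_dy)
{
    unsigned best = UINT_MAX;
    *best_dx = *best_dy = 0;
    for (int dy = -range; dy <= range; dy++)
        for (int dx = -range; dx <= range; dx++) {
            unsigned s = sad16(cur, ref + dy * stride + dx, stride);
            if (s < best) { best = s; *best_dx = dx; *best_dy = dy; }
        }
}
```

Real encoders use hierarchical or diamond searches rather than exhaustive ones, but the data-access pattern is the same, and it is a poor fit for the pixel shaders of this generation.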

One of the biggest stumbling blocks NVIDIA have had in this realm in comparison to ATI is the fact that they have had to have another vendor's software layer in there, which has been difficult to deal with. ATI doesn't have to deal with these issues because of Theater, and it's unlikely they want to.
 
McDusty - there is a huge difference in quality when watching poor quality or low resolution movies (at full screen on a CRT) with the DivX player and the R300+ feature enabled. It deblocks the image much better than any other program I use, and I use all the ones you listed. Maybe improving a clip past a certain source quality hits diminishing returns, so the benefit is less noticeable? Hopefully, if more vendors support this or a similar or better feature, and each ISV doesn't have to write a different plugin for each card's feature, we will see widespread adoption. It may go the way of headcasting though :)
 
Well, as far as I understand, the main thing ATI has been doing at this
"shaderday thing" in Canada last week is throwing mud at nVidia and waiting for them to make the first move. :?

That isn't how a market leader should behave!!
Grow up guys, do your own thing and don't act like cowards!!
 
hjs said:
Well, as far as I understand, the main thing ATI has been doing at this
"shaderday thing" in Canada last week is throwing mud at nVidia

literally :?: Just trying to picture it :LOL:
 
hjs said:
Well, as far as I understand, the main thing ATI has been doing at this
"shaderday thing" in Canada last week is throwing mud at nVidia and waiting for them to make the first move. :?
Can anyone confirm [or deny] this? :?:
 
hjs said:
Well, as far as I understand, the main thing ATI has been doing at this
"shaderday thing" in Canada last week is throwing mud at nVidia and waiting for them to make the first move. :?

Err, not really.
 
DaveBaumann said:
hjs said:
Well, as far as I understand, the main thing ATI has been doing at this
"shaderday thing" in Canada last week is throwing mud at nVidia and waiting for them to make the first move. :?

Err, not really.
Oh, you're just saying that because you like ATi!

C'mon and give us the real skinny! Did they curse and call nVidia names? ;)
 
hjs said:
Well, as far as I understand, the main thing ATI has been doing at this
"shaderday thing" in Canada last week is throwing mud at nVidia and waiting for them to make the first move. :?

That isn't how a market leader should behave!!
Grow up guys, do your own thing and don't act like cowards!!
Ok that is just plain FUD... :rolleyes:
 
I feel like I've been in a time capsule deal, kinda like in the Aliens movies. Time marches on, but you're pretty much out of the loop for a good while.

Anyhow, I've been consumed by work these last 2 years or so, but I have been keeping tabs on the good ole' video wars.

Quite frankly, I'm actually impressed with what nVidia has done with the NV40. Feature-wise, they seem to have really covered their bases well this time around. Just about all the things that needed fixing were addressed. Sure, I wish it would support RGMS beyond 4x, but I'm hopeful that maybe NV45 will address things like this.

Anyhow, back to the topic. Based on the above, I think it basically negates a lot of what the ATI camp has been saying the last few days. Maybe R420 will end up being the speed king, but it's not going to match the features. I mean, come on fellas. I have a 9800 Pro sitting in this rig that I slapped together 2 weeks ago (P4 @ 3.7 GHz) and there are but 2 titles that stress it: Halo (which has everything to do with the code) and Far Cry. Outside of that, this thing is more than capable. When you then consider what NV40 would do performance-wise, and then add in the cool new features...

What I'm afraid of is that, for all intents and purposes, R420 is going to end up being what a lot of people have been saying for quite some time now: an R3xx-based chip on steroids. Unless they really come out with some serious surprises, it will appear as if they haven't really advanced much beyond their last offerings. And let's be honest here: ATI has basically milked the R3xx lineup to death for a good period of time now.

Anyhow, I'm personally going to wait for the next round of refresh parts to come out to make any type of switch...I'm just glad that nVidia has been working hard these last 2 years rectifying things.
 
geo said:
DoS said:
martrox said:
DoS, there has been a real rise in the volume of fanboys in the last 2-3 months here.....and there has been an equal rise in responses..... It's happened before, many times...... all the way back to the original GeForce/Voodoo3 arguments. Nothing is going to stop it, so why add to the noise level, heh? There's no need to get personal about any of this..... It would be very nice IF everyone - yes, John, even ME! ;) - would stick to the subject.

Copy that.
I am really on the edge of my seat regarding R420. I can't wait to see it head to head with NV40. This is probably the first time I am going to get one of the high-end pieces, and not only that but also as soon as they are available. I have even quit playing FarCry (I have only played up to mission 4) because I want to play it at higher res with all the eye candy and more fps.
Regarding bandwidth efficiency and R420: if the leaked memory clock speeds for the XT part are true, then one can imagine that it has very good bandwidth utilisation and doesn't need the extra speed in order to be competitive with the 6800 Ultra; otherwise they would go for higher clocked chips (I hear 600 MHz chips have acceptable levels of availability).

Do we know if the 16-pipe version is going to be available for benchmarking in the first round (April 26th or May 4th, whichever it is)? I know we've heard that the 16-pipe version will be available later, end of May-ish. The reason I ask is that I think it is a mistake for ATI to put a 12-pipe card out against the 16-pipe NV40 as their first move. Sure, a certain amount of extrapolating can be done to try to get an apples-to-apples result for what the 16-pipe R420 can do, but first impressions are important and lasting, so why would you want to send out the B team against the other guy's first string?
Actually, that question about the *B* team should answer itself. Remember what happened with the 8500 launch? Let's just say that ATi is far from repeating a mistake like that. Just like in college football, some teams' *B* team is as good as or even better than others' *A* team.

Err, yeah, I think you'd be underestimating things if you thought that was all that was going to happen.
Borrowing this quote from Dave... I think this statement is a theme that is going to carry over to quite a bit of other things ;)
 
Typedef Enum said:
I feel like I've been in a time capsule deal, kinda like in the Aliens movies. Time marches on, but you're pretty much out of the loop for a good while.

Anyhow, I've been consumed by work these last 2 years or so, but I have been keeping tabs on the good ole' video wars.

Quite frankly, I'm actually impressed with what nVidia has done with the NV40. Feature-wise, they seem to have really covered their bases well this time around. Just about all the things that needed fixing were addressed. Sure, I wish it would support RGMS beyond 4x, but I'm hopeful that maybe NV45 will address things like this.

Anyhow, back to the topic. Based on the above, I think it basically negates a lot of what the ATI camp has been saying the last few days. Maybe R420 will end up being the speed king, but it's not going to match the features. I mean, come on fellas. I have a 9800 Pro sitting in this rig that I slapped together 2 weeks ago (P4 @ 3.7 GHz) and there are but 2 titles that stress it: Halo (which has everything to do with the code) and Far Cry. Outside of that, this thing is more than capable. When you then consider what NV40 would do performance-wise, and then add in the cool new features...

What I'm afraid of is that, for all intents and purposes, R420 is going to end up being what a lot of people have been saying for quite some time now: an R3xx-based chip on steroids. Unless they really come out with some serious surprises, it will appear as if they haven't really advanced much beyond their last offerings. And let's be honest here: ATI has basically milked the R3xx lineup to death for a good period of time now.

Anyhow, I'm personally going to wait for the next round of refresh parts to come out to make any type of switch...I'm just glad that nVidia has been working hard these last 2 years rectifying things.
Hey Type :)

I would not sell the farm on the idea that ATi is just sitting on the R300 core still. Steroids can mean a lot of things and can apply to a lot of areas. ;)
 
Hellbinder said:
Steroids can mean a lot of things and can apply to a lot of areas. ;)

Almost made me think: "Steroids can make you grow some weird anomalies"?
This is heading straight for the likes of the Matrox RSN engine; thread browsing after 4am comes with a certain health risk attached.

(wiping screen and emptying contents of keyboard into recycle bin before drying it with a modern graphics card cooling solution inside the open case. Never know what problems those screwless cases, hot, windy graphics cards and wireless keyboards might solve... :rolleyes:)

Kjetil
 
DaveBaumann said:
It's not misleading if they are both doing similar things - using the programmable functionality to lift processing from the CPU, which they both are. Neither solution is completely removing the CPU from the equation.

Only by the loosest definitions of "same things". Dave, this is like saying all 3D cards are roughly the same because they all do some hardware acceleration (lifting functions from the CPU), but some processing is always done by the CPU/driver (e.g. tessellation). But we know that cards with geometry acceleration (vertex shaders) are a different category from cards without, and that people recognize the value of shifting T&L onto the GPU, so to suggest they are comparable because the CPU is still used on both boards is disingenuous.

Why are you being so weaselly on the issue? Is it that hard to admit that pixel shaders can't do everything and that NVidia has added some video acceleration to the NV40 which previous generation cards don't have? You jumped into the middle of this thread and tossed out VideoShaders(tm) as if they were in the same equivalence class, when in fact they solve a much more restricted subset of the video codec pipeline. Your own NV40 review quotes NVidia statements as to why they didn't try to use pixel shaders for the problem. Have you ignored their arguments, or do you have a counter argument to NVidia's points?


The fact that the NV40 can only unload 95% of the work from the CPU when decompressing, with 5% still on the CPU, is irrelevant (or 60% for encoding). It's still way better than what went before, especially if you're trying to work with even medium-quality video (not crappy 320x240 or 480x480 streams).


And as for saying other cards can use shaders, the question is - do they?

Why is this relevant? We're talking about hardware. Maybe NVidia thought the gains from "AI" (heh, love that PR) deblocking algorithms in pixel shaders were marginal compared to what you could achieve if you had a more general-purpose unit on the core.

Any pixel shader card can use shaders to process 2D buffers.
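(To make that concrete: a deblocking filter is a pure per-pixel operation - each output pixel depends only on a small, fixed neighbourhood of the input frame - which is exactly the access pattern a pixel shader sampling a texture supports, and exactly what motion estimation's wide, data-dependent searches are not. Here is a minimal plain-C reference for the idea; the 8x8 grid, filter taps, and threshold are made up for illustration, not any driver's actual filter.)

```c
#include <stdint.h>
#include <stdlib.h>

/* Smooth vertical 8x8-block edges in a luma plane (out-of-place:
   dst must not alias src). Each output pixel reads only a fixed
   3-pixel neighbourhood of the input, so the same operation maps
   directly onto a pixel shader reading a video frame as a texture. */
void deblock_vertical_edges(const uint8_t *src, uint8_t *dst,
                            int width, int height, int stride,
                            int threshold)
{
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++) {
            int p = src[y * stride + x];
            /* Only pixels on either side of an 8-pixel block edge. */
            if ((x % 8 == 0 || x % 8 == 7) && x > 0 && x < width - 1) {
                int left  = src[y * stride + x - 1];
                int right = src[y * stride + x + 1];
                /* Smooth small steps (blocking artifacts); leave
                   large steps (real image edges) alone. */
                if (abs(left - right) < threshold)
                    p = (left + 2 * p + right + 2) / 4;
            }
            dst[y * stride + x] = (uint8_t)p;
        }
}
```

On a GPU, src becomes a texture, the loop body becomes the shader, and dst is the render target - which is essentially the VideoShader idea.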
 