"ATI is smug but Nvidia's the bug in the rug"

Re: Doom 3 issues

Proforma said:
Well, I don't have any problems with Far Cry or any Unreal 2004 game, and Doom 3's graphics aren't really good enough to justify it being this slow.

It's not exactly like I have a 486 trying to run Doom 3 here. What's the latest Pentium 4, 3.60 GHz? It's not like CPUs are at 6 GHz or something, and like I said, the graphics, physics, and gameplay aren't exactly out of this world. Unreal 3, Half-Life 2, and Far Cry look much better (I have not tried Unreal 3 yet), but still, come on.

I am waiting on a dual-core 64-bit CPU and PCI Express before I get a new system.

How have you established that the slowdowns are due to the VPU? As others have posted here, the Doom 3 engine is quite CPU-intensive. Also, while Far Cry, HL2, etc. may look better subjectively, are they doing the same type and amount of work on the CPU?

Have you tried the same test at different resolutions? If the drop is not there at lower resolutions, that would indicate the issue is with the VPU, but if the drop still exists even at low resolutions, then it's more likely a CPU/system limitation.
 
Instancing

jvd said:
I wish I could see the world like you see it. I see this as yet more FUD from ATI's marketing department to make it seem like they have SM 3.0 features, along with the branching demo from Humus, who just happens to work for ATI.

Well, they support instancing, which is an SM3.0 feature, so it's not FUD from ATI.

If they do instancing, they do it. Judging from the benchmarks, they do it faster than Nvidia does. So how are they spreading FUD?

The problem lies in MS tying instancing to SM3.0, even though all of ATI's DX9 cores support it, all the way down to the R300.

I have not seen benchmarks to support your claim. I have seen people posting results that differ on every machine because of other factors, but I have not seen a benchmark showing anyone being faster than anyone else. I think ATI being faster at instancing is all in your head.
 
Re: Doom 3 issues

Proforma said:
kyleb said:
lol, your CPU is what is slowing you down, not your ATI card.

It's not exactly like I have a 486 trying to run Doom 3 here. What's the latest Pentium 4, 3.60 GHz? It's not like CPUs are at 6 GHz or something, and like I said, the graphics, physics, and gameplay aren't exactly out of this world. Unreal 3, Half-Life 2, and Far Cry look much better (I have not tried Unreal 3 yet), but still, come on.

You're deluding yourself if you think the CPU is irrelevant to Doom 3 performance. Or did you miss this AnandTech article? That 2 GHz P4 (presumably paired with PC2100) isn't going to beat anything on that chart.

As for whatever you're supposed to be programming, exactly when do you plan on releasing this? How widespread do you think SM3.0 parts will be when you do? What percentage of that market do you expect to capture?
 
DX caps

Take a look at the DX caps list under DX9.0c – there is now a caps bit for it; this wasn't the case initially.

I will take a look at this to see if it's there.

Well, that was Humus's demo, not ATI's; he is not part of their marketing, nor was it released by ATI.

Well, it seems like he does his own marketing when he can, and he just happens to work for ATI. I guess ATI doesn't say much about it. He seems to me to be a big evangelist of both ATI and OpenGL.

However, ATI's instancing offers exactly what DX requires (other than VS3.0!) and provides the same benefit – and that is offered to every user from the 9500 to the X800; what's the issue if it provides the right benefits? Isn't it doing ATI users more of a disservice not to offer those benefits? It's not as though they enabled this just as a selling point for the X800; they enabled it for all boards capable of supporting it.

Well, I never said that ATI's instancing is a cheat; I don't believe that for a minute, not from the code that I have seen. I just have a problem with not being able to tap that power, but if you say it's in the DX caps now, then I will check it out and see for myself, as I have not seen it before.

Then, as a programmer, shouldn't you be requesting that MS offer a caps bit for it, as they have with centroid sampling?

I would rather they have a different system altogether. Let's hope and pray that DX10 will have some adjustments in this area.
 
Re: Doom 3 issues

Proforma said:
FUDie said:
DaveBaumann said:
Proforma said:
kyleb said:
lol, your CPU is what is slowing you down, not your ATI card.
Well, I don't have any problems with Far Cry or any Unreal 2004 game, and Doom 3's graphics aren't really good enough to justify it being this slow.
As I said before - vertex skinning and the vertex calculations in Doom 3 are done on the CPU, not the graphics core. The more characters you have on screen, the more vertex processing the CPU is doing (both character skinning and shadow extrusion for the extra objects).
C'mon, Dave, he's a programmer, certainly he must know what he is talking about! :LOL:
Well I don't know everything, but you don't either.
Must be called FUD-ie for a reason. duh.
It was a name I chose, not one that was given to me. And you have no idea what I know.
I understand that vertex skinning is on the CPU instead of where it should be, i.e. the GPU. I got that part; the issue is that I have had all this on the CPU before and it never ran as slow as Doom 3. Maybe it's because of the shadows.
What an amazing thought! MAYBE IT'S THE SHADOWS! Now why didn't anyone else think of that? Did you ever think that this hasn't been discussed before? Did you ever read NVIDIA's 3DMark03 optimization document? It discusses this very method. B3D has also had numerous discussions on DOOM 3's rendering.

-FUDie
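
For anyone wondering why the shadow pass in particular eats CPU time: Doom 3's actual code isn't shown in this thread, but the sketch below is a generic, minimal illustration of CPU-side silhouette extrusion for stencil shadow volumes under a point light. All type and function names are made up for the example; the point is only that this work has to be redone per light, per mesh, per frame, before the GPU ever sees the geometry.

[code]
#include <vector>

// Minimal vector type for the sketch.
struct Vec3 { float x, y, z; };

static Vec3 sub(const Vec3& a, const Vec3& b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// An edge shared by two triangles; face normals and one reference point per
// triangle are precomputed so the light-facing test is a single dot product.
struct Edge {
    Vec3 v0, v1;   // edge endpoints
    Vec3 n0, n1;   // face normals of the two adjacent triangles
    Vec3 p0, p1;   // a point on each adjacent triangle
};

// For every silhouette edge (one neighbouring face toward the light, the
// other away), emit a quad extruded away from the light. More characters on
// screen means more edges to test and extrude every frame.
void ExtrudeShadowQuads(const std::vector<Edge>& edges, const Vec3& lightPos,
                        float extrude, std::vector<Vec3>& outQuadVerts)
{
    for (const Edge& e : edges) {
        bool lit0 = dot(e.n0, sub(lightPos, e.p0)) > 0.0f;
        bool lit1 = dot(e.n1, sub(lightPos, e.p1)) > 0.0f;
        if (lit0 == lit1)
            continue;  // not on the silhouette, skip it

        // Push both endpoints along the light-to-vertex direction.
        Vec3 d0 = sub(e.v0, lightPos);
        Vec3 d1 = sub(e.v1, lightPos);
        Vec3 f0 = { e.v0.x + d0.x * extrude, e.v0.y + d0.y * extrude, e.v0.z + d0.z * extrude };
        Vec3 f1 = { e.v1.x + d1.x * extrude, e.v1.y + d1.y * extrude, e.v1.z + d1.z * extrude };

        // Near edge plus far edge form one side quad of the shadow volume.
        outQuadVerts.push_back(e.v0);
        outQuadVerts.push_back(e.v1);
        outQuadVerts.push_back(f1);
        outQuadVerts.push_back(f0);
    }
}
[/code]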
 
Re: I made some of the same observations and was labeled &lt

Proforma said:
...ATI is putting out a lot of FUD these days and they sound like Nvidia.

Examples? Otherwise please stop spewing your opinions as facts.
 
Re: Doom 3 issues

You're deluding yourself if you think the CPU is irrelevant to Doom 3 performance. Or did you miss this AnandTech article? That 2 GHz P4 (presumably paired with PC2100) isn't going to beat anything on that chart.

Do you see that CPU list? Do you see the lowest CPU score on that list, which is for an Athlon XP 2000+ (1.6 GHz) at 1280x1024?

Yeah, well, I am at 1024x768 and I am not getting those speeds, and I have a 2 GHz Pentium 4. I would say I get frame rates less than that, so your assumption would be correct, but it's so slow and choppy that it makes me think it's not just a CPU issue. I didn't say a faster CPU wouldn't make a difference; it would make a world of difference, since Carmack decided to do a ton of things on the CPU instead of the GPU.

As for whatever you're supposed to be programming, exactly when do you plan on releasing this? How widespread do you think SM3.0 parts will be when you do? What percentage of that market do you expect to capture?

It's going to be a few years; it's an offline RPG. By then maybe ATI might make an SM 3.0 part.
 
Re: Doom 3 issues

Proforma said:
Do you see that CPU list? Do you see the lowest CPU score on that list, which is for an Athlon XP 2000+ (1.6 GHz) at 1280x1024?


um.... those benchmarks are using a 6800 ultra btw......
 
Re: Doom 3 issues

Proforma said:
Do you see that CPU list? Do you see the lowest CPU score on that list, which is for an Athlon XP 2000+ (1.6 GHz) at 1280x1024?

...

Yeah, well, I am at 1024x768 and I am not getting those speeds

There are two charts - one for 800x600, the other for 1280x1024. The slowest scores at both resolutions are the same - and that's paired with a 6800 Ultra. It shouldn't be too hard to work out that the CPU is your problem - especially as the timedemo level isn't the most strenuous.

It's going to be a few years; it's an offline RPG. By then maybe ATI might make an SM 3.0 part.

Emphasis added 8)

[edit] Also note that the timedemo doesn't exercise the AI, sound propagation, or physics code - all CPU dependent.
 
Re: DX caps

Proforma said:
I just have a problem with not being able to tap that power,

You can - when you want to check for instancing, rather than just checking for VS3.0 in the CAPS, you check for VS3.0 or ATI's FOURCC mode. I do other forms of programming in my day job and the "or" operator doesn't give me too many difficulties! ;)

but if you say that its in DX caps now then I will check it out and see for myself as I have not seen it before.

To be clear - I was talking about centroid sampling now having a CAPS bit. Centroid sampling was initially in a very similar situation under DX9, until an issue turned up with Valve and MS gave it its own CAPS bit.

As for the performance of Instancing, while I haven't tested it myself, Driverheaven appears to have.
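
In D3D9 terms, the "VS3.0 or ATI's FOURCC" check described above boils down to something like the sketch below: query the device caps for vs_3_0, and otherwise ask the runtime whether the driver exposes the "INST" FOURCC as a (dummy) surface format. The helper name and exact wrapping are illustrative, not lifted from any SDK sample.

[code]
#include <d3d9.h>

// Illustrative helper: true if the adapter supports geometry instancing,
// either through full vs_3_0 support or through ATI's "INST" FOURCC
// exposure on its SM2.0 parts (R300 and up).
bool SupportsInstancing(IDirect3D9* d3d, UINT adapter, D3DFORMAT adapterFormat)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(adapter, D3DDEVTYPE_HAL, &caps)))
        return false;

    // Any part reporting vs_3_0 has to support instancing under DX9.0c.
    if (caps.VertexShaderVersion >= D3DVS_VERSION(3, 0))
        return true;

    // ATI advertises its instancing path by accepting a bogus "INST"
    // surface format; if the check succeeds, the driver supports it.
    return SUCCEEDED(d3d->CheckDeviceFormat(adapter, D3DDEVTYPE_HAL, adapterFormat,
                                            0, D3DRTYPE_SURFACE,
                                            (D3DFORMAT)MAKEFOURCC('I', 'N', 'S', 'T')));
}
[/code]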
 
Re: Doom 3 issues

Alstrong said:
Proforma said:
Do you see that CPU list? Do you see the lowest CPU score on that list, which is for an Athlon XP 2000+ (1.6 GHz) at 1280x1024?


um.... those benchmarks are using a 6800 ultra btw......

Right, but it shouldn't matter that much which GPU it is, should it? I have a decent GPU, a Radeon 9800 Pro with 128 MB.
 
Re: DX caps

DaveBaumann said:
Proforma said:
I just have a problem with not being able to tap that power,

You can - when you want to check for instancing, rather than just checking for VS3.0 in the CAPS, you check for VS3.0 or ATI's FOURCC mode. I do other forms of programming in my day job and the "or" operator doesn't give me too many difficulties! ;)

but if you say that its in DX caps now then I will check it out and see for myself as I have not seen it before.

To be clear - I was talking about centroid sampling now having a CAPS bit. Centroid sampling was initially in a very similar situation under DX9, until an issue turned up with Valve and MS gave it its own CAPS bit.

As for the performance of Instancing, while I haven't tested it myself, Driverheaven appears to have.

Well, just to be clear, I am looking for a way to detect instancing on all cards without having to check for a specific card ID. I want to use DX to check dynamically via the drivers.

The DriverHeaven performance proves nothing. It doesn't account for new driver results, as new drivers can change everything and drivers can make all the difference in the world; they only ran it on one machine, etc.
 
Re: DX caps

Proforma said:
Well, just to be clear, I am looking for a way to detect instancing on all cards without having to check for a specific card ID. I want to use DX to check dynamically via the drivers.

Yes, you simply check for VS3.0 capabilities or if the "INST" FOURCC mode is exposed in the drivers - simple, and caters for every card that has instancing capabilities.

The DriverHeaven performance proves nothing. It doesn't account for new driver results, as new drivers can change everything and drivers can make all the difference in the world; they only ran it on one machine, etc.

Well, it proves that it makes a difference, and one of the same order of magnitude as on NV40.
 
Re: I made some of the same observations and was labeled &lt

gkar1 said:
Proforma said:
...ATI is putting out a lot of FUD these days and they sound like Nvidia.

Examples? Otherwise please stop spewing your opinions as facts.

Examples that they aren't? Check out the latest crap out of the mouth of ATI's marketing department. No facts to back up what they say, and nobody has been able to give me those facts = FUD, à la Nvidia.

Why do a ton of you people love ATI so much? It's getting as bad as the love for Nintendo. Do they own you? Why are you so caught up with brand?

I bought an ATI Radeon; before that I had a GeForce, and then a GeForce 3. I buy what I need to; I don't do any favors for any company, as they aren't doing me any favors.

A lot of the people here, dare I say most, are ATI lapdogs instead of having minds of their own, and this is sad.

That's why I can hardly stand coming on here anymore, because the facts are not facts anymore, just FUD from both companies, and you guys don't have minds of your own.

I don't suck up to Nvidia; I just think that after two years of not being for me, they are for me this year. I hold no allegiance to them, nor am I going to do them any favors. I buy on the basis of what's fast and what also has the features I need. Unlike a lot of you, I don't get hung up on the fact that Nvidia lied for the last two years while ATI has been lying for the last six months. Neither company is any better than the other; they just make 3D cards, and I happen to use some of the features of those cards, not just for gaming. I am nobody's lapdog, and I find it sad that some of you are.
 
Re: DX caps

DaveBaumann said:
Proforma said:
Well just to be clear I am looking for a way to detect instancing on all cards and not have to check for a card ID specifically. I want to use DX to check dynamically via drivers.
Yes, you simply check for VS3.0 capabilities or if the "INST" FOURCC mode is exposed in the drivers - simple, and caters for every card that has instancing capabilities.
There's slightly more to it than that, but ATI Developer Relations will be glad to fill you in.
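
For what it's worth, once support is detected the drawing side is the standard D3D9 stream-frequency setup sketched below; on ATI's SM2.0 parts the instancing path reportedly also has to be switched on through a render state first (the "slightly more to it" the post above hints at - Dev Rel would have the exact details). Buffer names and vertex layouts here are made up for illustration.

[code]
#include <d3d9.h>

// Illustrative instanced draw with D3D9 stream frequencies. Assumes
// 'geomVB' holds one copy of the mesh, 'instVB' holds one entry of
// per-instance data (e.g. a transform) per instance, and the vertex
// declaration maps stream 1 to the per-instance elements.
void DrawInstanced(IDirect3DDevice9* dev,
                   IDirect3DVertexBuffer9* geomVB, UINT geomStride,
                   IDirect3DVertexBuffer9* instVB, UINT instStride,
                   IDirect3DIndexBuffer9* ib,
                   UINT numVertices, UINT numTriangles, UINT numInstances,
                   bool useAtiFourccPath)
{
    // On ATI's SM2.0 parts the instancing path reportedly has to be enabled
    // first by writing the "INST" FOURCC into an otherwise unused render state.
    if (useAtiFourccPath)
        dev->SetRenderState(D3DRS_POINTSIZE, MAKEFOURCC('I', 'N', 'S', 'T'));

    // Stream 0: geometry, replayed numInstances times.
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | numInstances);
    dev->SetStreamSource(0, geomVB, 0, geomStride);

    // Stream 1: per-instance data, stepped once per instance.
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
    dev->SetStreamSource(1, instVB, 0, instStride);

    dev->SetIndices(ib);
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0, numVertices, 0, numTriangles);

    // Restore default frequencies so later draws are not instanced.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
}
[/code]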
 
Re: I made some of the same observations and was labeled &lt

Proforma said:
gkar1 said:
Proforma said:
...ATI is putting out a lot of FUD these days and they sound like Nvidia.

Examples? Otherwise please stop spewing your opinions as facts.

Examples that they aren't?

You're kidding, right? Examples that they aren't saying something? :rolleyes:

Check out the latest crap out of the mouth of ATI's marketing department.

Where's this "latest crap" at? Post a link, and I hope it's not that oc-zone interview with Richard Huddy, because the only "FUD" in that interview is a direct result of the questions asked. Ask someone questions about their competition, and chances are their response will be about their competition. Seriously, what do you expect when you ask ATI's director of public relations a question like:

Everybody knows Nvidia puts a lot of resources and time into marketing their products aggressively, but how do you see NVIDIA's solution (NV40)?
 
Re: I made some of the same observations and was labeled &lt

Proforma said:
gkar1 said:
Proforma said:
...ATI is putting out a lot of FUD these days and they sound like Nvidia.

Examples? Otherwise please stop spewing your opinions as facts.

Examples that they aren't? Check out the latest crap out of the mouth of ATI's marketing department. No facts to back up what they say, and nobody has been able to give me those facts = FUD, à la Nvidia.

Why do a ton of you people love ATI so much? It's getting as bad as the love for Nintendo. Do they own you? Why are you so caught up with brand?

I bought an ATI Radeon; before that I had a GeForce, and then a GeForce 3. I buy what I need to; I don't do any favors for any company, as they aren't doing me any favors.

A lot of the people here, dare I say most, are ATI lapdogs instead of having minds of their own, and this is sad.

That's why I can hardly stand coming on here anymore, because the facts are not facts anymore, just FUD from both companies, and you guys don't have minds of your own.

I don't suck up to Nvidia; I just think that after two years of not being for me, they are for me this year. I hold no allegiance to them, nor am I going to do them any favors. I buy on the basis of what's fast and what also has the features I need. Unlike a lot of you, I don't get hung up on the fact that Nvidia lied for the last two years while ATI has been lying for the last six months. Neither company is any better than the other; they just make 3D cards, and I happen to use some of the features of those cards, not just for gaming. I am nobody's lapdog, and I find it sad that some of you are.


I meant to say: back your statements with quotes and facts. Stop spewing your opinions as facts. And for a "programmer," you certainly are a very lazy one when you're bitching about checking FOURCC formats to support instancing on the hardware of potential users of your "game".

Hopefully your game will get a publisher eh? ;)
 
Well, personally I am concerned that ATI not supporting SM3.0 at this point is going to bite them, and their users, on the ass about a year from now when various SM3.0 code written around NV parts between now and then has issues on ATI hardware. We've seen it before.
 
digitalwanderer said:
geo said:
We've seen it before.
When? (Not meaning to flame, I'm really curious)

ATI bitching during the R8500 days that a good chunk of their infamous "driver problems" were actually the result of developers writing to NV's bugs. If memory serves (and if I'm incorrect, I'll apologize to him), OGL Guy offered that explanation more than once on these boards at the time.

NV bitching in the NV3x days that DX9 (SM2.0) code was written around ATI and seriously disadvantaged them.

Was some of that sour grapes? Probably. Was some of it true? Probably. Having been on the smelly end of that stick once, it surprises me a bit that ATI willingly put themselves there again. As I said, not today, but soon.

"The race is not always to the swift, nor the battle to the strong --but that's the way to bet." --Damon Runyon
 