"ATI is smug but Nvidia's the bug in the rug"

Doom 3 Performance

As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce 3 video card that is two years old will deliver a solid gaming experience that will let you enjoy the game the way id Software designed it to be. That fact alone should let many of you know that you will not be left behind in experiencing DOOM 3.

Source: http://www2.hardocp.com/article.html?art=NjQy

You have seen my specs and I can't get a decent speed to save my life, and everyone tells me it's a CPU issue. hmmmmm
 
Re: Doom 3 Performance

Proforma said:
As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce 3 video card that is two years old will deliver a solid gaming experience that will let you enjoy the game the way id Software designed it to be. That fact alone should let many of you know that you will not be left behind in experiencing DOOM 3.

Source: http://www2.hardocp.com/article.html?art=NjQy

You have seen my specs and I can't get a decent speed to save my life, and everyone tells me it's a CPU issue. hmmmmm

Well, if you set your resolution to 640x480 and put your settings to low, you will get 60 fps no sweat.

So I don't see your point
 
Evildeus said:
http://www2.hardocp.com/image.html?image=MTA5MDc4NzE0M1RPNjJBTU9FV1hfOF8zX2wuZ2lm

Thanks, Evil. I don't know about you guys, but with an average of 49.8 fps out of a possible 60 (as the game is capped at that), it provides an enjoyable experience
 
That's not entirely true. If you look at the graph you will see some really big drops (sometimes more than 30 FPS), which should mean stuttering; not really enjoyable all the time, even if the average is pretty good :)
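To put rough numbers on that point, here is a minimal C sketch (the sample values are made up for illustration, not read off the linked graph) showing how a run can average close to 50 fps and still dip hard enough to feel like stuttering:

[code]
#include <stdio.h>

/* Illustrative only: hypothetical per-second fps samples over a ten second run. */
int main(void)
{
    double fps[] = { 60, 60, 58, 55, 22, 60, 59, 25, 60, 60 };
    int n = sizeof(fps) / sizeof(fps[0]);

    double sum = 0.0, min = fps[0];
    for (int i = 0; i < n; ++i) {
        sum += fps[i];
        if (fps[i] < min)
            min = fps[i];
    }

    printf("average fps: %.1f\n", sum / n); /* 51.9 - looks fine on paper      */
    printf("minimum fps: %.1f\n", min);     /* 22.0 - reads as a visible hitch */
    return 0;
}
[/code]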
 
geo said:
Was some of that sour grapes? Probably. Was some of it true? Probably. Having been on the smelly end of that stick once, it surprises me a bit that ATI willingly put themselves there again. As I said, not today, but soon.

Two things can happen. ATI's lack of SM3.0 could retard the development of SM3.0, and developers will continue to code for SM2.0b. This seems quite likely to me, given the very small number of high-end Nvidia cards capable of utilising SM3.0 at useful speeds, and the long development lag to catch up with hardware specs. The fact that ATI provides things like instancing and seems to run 2.0b faster than Nvidia's SM3.0 will help. In short, Nvidia is overselling the importance of SM3.0 and the speed with which it will become useful this generation.

The alternative is that Nvidia goes nuts and spends a *lot* of money to get game devs to do exclusive SM3.0 paths. These games will very likely be more than playable on ATI hardware, and as soon as ATI bring out SM3.0 hardware, ATI will be able to take advantage of all the money Nvidia has poured into getting SM3.0 supported.
 
Bouncing Zabaglione Bros. said:
The alternative is that Nvidia goes nuts and spends a *lot* of money to get game devs to do exclusive SM3.0 paths. These games will very likely be more than playable on ATI hardware, and as soon as ATI bring out SM3.0 hardware, ATI will be able to take advantage of all the money Nvidia has poured into getting SM3.0 supported.

Hopefully that... or those games will need heavy patching to work on ATI hardware, which the TWIMTBP program and Nvidia might disallow or delay.
 
Has anybody released a demo/game/app/whatever that showed SM3.0 benefits on nv40?

I mean:
a. stuff that can't be done using SM2.0 (a, b, ...) and can be done in real time on NV40, or
b. faster algorithms using SM3.0 on that specific piece of hardware.

Because in one year's time I can see Nvidia publishing another list of guidelines that starts with: "use the lowest shader version".

I mean SM3.0 IS great, but is it great on NV40?
 
Re: Doom 3 issues

Proforma said:
Alstrong said:
Proforma said:
Do you see that CPU list? Do you see the lowest CPU score on that list, which is for an Athlon XP 2000+ (1.6 GHz) at 1280x1024?


Um... those benchmarks are using a 6800 Ultra, btw...

Right, but it shouldn't matter that much given the GPU, should it? I have a decent GPU, a Radeon 9800 Pro with 128 MB.

Just to point out, I have the same GPU (actually not quite the same, a bit worse - the vanilla 9800) and 768 MB of RAM with an A64 3000+ - playing at 1024x768 with no slowdowns...

One more thing: you want to have at least 768 MB of RAM, though 1 GB would be better.
 
Mendel said:
Bouncing Zabaglione Bros. said:
The alternative is that Nvidia goes nuts and spends a *lot* of money to get game devs to do exclusive SM3.0 paths. These games will very likely be more than playable on ATI hardware, and as soon as ATI bring out SM3.0 hardware, ATI will be able to take advantage of all the money Nvidia has poured into getting SM3.0 supported.

Hopefully that... or those games will need heavy patching to work on ATI hardware, which the TWIMTBP program and Nvidia might disallow or delay.

It's the same situation that Intel is in with AMD with regard to 64-bit, and AMD is in with Intel with regard to multi-core chips. Right now each company is pushing the technology lead it has over the other, while quietly developing its own version of the other's technology. As they each launch their own version, they can take advantage of the investment their competitor has made in that area.

The main thing is that the programmer mindset has been changed, the tools have been developed, and techniques and software have caught up with the hardware.

In terms of graphics cards, the last generation was different for ATI and Nvidia: both companies offered SM2.0, but Nvidia's implementation was much worse.

This time around Nvidia have more features in the form of SM3.0, but it's slower and more costly in terms of manufacture and power. ATI have a faster, cheaper, cooler SM2.0b, but have to risk Nvidia being able to do something useful with SM3.0. If Nvidia manage to change programmers' mindsets to use SM3.0, and ATI bring out their own SM3.0 hardware before these games hit the market in 18 months, ATI will be able to take advantage of all that Nvidia investment.
 
vb said:
Has anybody released a demo/game/app/whatever that showed SM3.0 benefits on nv40?

Anandtech did some benchmarks in 'Far Cry' using the SM3.0 path. A lot of the time it didn't make much difference, but in particular cases (especially instances with multiple light sources) the performance improvement was significant - over 10%.

The main point that the performance numbers make is not that SM3.0 has a speed advantage over SM2.0 (the opposite may even be true), but that single-pass per-pixel lighting models can significantly reduce the impact of adding an ever-increasing number of lights to a scene.
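To make that concrete, here is a minimal sketch in plain C standing in for the shader logic (all names are hypothetical, and nothing here is taken from the Anandtech article or the Far Cry code). Under SM2.0 each light typically costs an extra rendering pass over the geometry, while under SM3.0 the same accumulation can live in a loop inside a single shader invocation:

[code]
#include <stdio.h>

/* Hypothetical types standing in for shader inputs. */
typedef struct { float r, g, b; } Color;
typedef struct { Color color; float intensity; } Light;

/* Contribution of one light to one pixel (placeholder math). */
static Color shade_one_light(const Light *l)
{
    Color c = { l->color.r * l->intensity,
                l->color.g * l->intensity,
                l->color.b * l->intensity };
    return c;
}

int main(void)
{
    Light lights[] = { { {1, 0, 0}, 0.50f },
                       { {0, 1, 0}, 0.25f },
                       { {0, 0, 1}, 0.25f } };
    int num_lights = sizeof(lights) / sizeof(lights[0]);
    Color result = { 0, 0, 0 };

    /* SM2.0-style engines run this loop as one rendering pass per light,
     * re-submitting geometry and blending into the framebuffer each time.
     * SM3.0-style engines can run the whole loop inside one shader
     * invocation, so per-pixel overhead grows much more slowly as lights
     * are added. */
    for (int i = 0; i < num_lights; ++i) {
        Color c = shade_one_light(&lights[i]);
        result.r += c.r;
        result.g += c.g;
        result.b += c.b;
    }

    printf("lit pixel: %.2f %.2f %.2f\n", result.r, result.g, result.b);
    return 0;
}
[/code]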

For next generation engines (and their developers) this will be a very important point. The only argument is really at what point it will become significant (I have a feeling ATI fans will only acknowledge that point has come once ATI release SM3 capable hardware ;) )
 
Re: I made some of the same observations and was labeled

gkar1 said:
Proforma said:
...ATI is putting out a lot of FUD these days and they sound like Nvidia.

Examples? Otherwise please stop spewing your opinions as facts.

#1 - ATI praising their Doom3 shader replacement in an investor conference call, contradicting all of the noise they made last year when Nvidia made shader replacements in games. And then again recently there was Huddy's FUD in an interview, accusing Nvidia of shader replacements in Doom3, despite there being no evidence that there are any for the NV40 chipset (in fact, coding experiments by users here support this) - and his overlooking of ATI's shader replacement that they praised in the conference call.

#2 - ATI making noise about Nvidia using partial precision in Doom3 reducing quality in an official statement to Team Radeon.com, while IQ comparisons have shown this is completely false and FUD. In fact, HardOCP said Nvidia had slightly better IQ than ATI in Doom3.

#3 - ATI making a big deal about their PCI-express cards being native, and claiming that it would give them a big performance advantage over the competition - this is shown to be FUD, as Nvidia's bridged chips are just as fast, and their SLI implementation is much faster.

#4 - Last year Nvidia's Brilinear was cheating and not real Trilinear according to ATI; this year Brilinear is "Optimized Trilinear" and better than full Trilinear according to ATI, as they don't offer full Trilinear - and they were caught instructing reviewers to disable brilinear and enable full trilinear on Nvidia cards, knowing full well that their own card was only doing brilinear.

#5 - ATI's FUD that Shader Model 3.0 gives nothing over plain old SM2.0 and is too slow to be used this generation - yet FarCry's SM3.0 gave Nvidia cards significant speed boosts, and that was only very basic usage of SM3.0 with none of the advanced features. Beyond3D details the potential of SM3.0 over SM2.0 - once more advanced features start getting used we may see the performance gap widen, as ATI themselves stated in Huddy's "Save the Nanosecond" powerpoint presentation. In this presentation, Huddy noted that ATI will attempt to hold back development of advanced SM3.0 coding until they have a SM3.0 part available to support their FUD, which is only bad for the gaming industry.

#6 - ATI's relationship with Valve, who has become one of the biggest FUD spreaders of them all.
 
PatrickL said:
Ruined, you were asked to answer with facts, not with bullshit, thanks.

Right, if you need any proof of the above statements, they are all available on the web - not bullshit, but truth. If you want, when I get back from work later I can provide links for *all of them*. The poster asked for examples of ATI's FUD; I pointed them out.
 
Diplo said:
For next generation engines (and their developers) this will be a very important point. The only argument is really at what point it will become significant (I have a feeling ATI fans will only acknowledge that point has come once ATI release SM3 capable hardware ;) )

Edit:
Both of our custom benchmarks show ATI cards leading without anisotropic filtering and antialiasing enabled, with NVIDIA taking over when the options are enabled. We didn't see much improvement from the new SM3.0 path in our benchmarks either.

Shader power differences are most visible without AF (it just takes more texture samples) or AA. A 10% gain going from the SM2.0 path using multiple shaders to the SM3.0 path using one shader with branching, while still performing lower than other PS2.0 hardware, doesn't sound too impressive.
/edit

SM3.0 is a big evolutionary step for shaders. SM4.0 will keep most PS3.0 requirements and apply them to the VS and PS equally.

To restate my question: do we have anything to prove or disprove that NV40 is a high-performing SM3.0 piece of hardware?
 
Ruined said:
#1 - ATI praising their Doom3 shader replacement in an investor conference call, contradicting all of the noise they made last year when Nvidia made shader replacements in games. And then again recently there was Huddy's FUD in an interview, accusing Nvidia of shader replacements in Doom3, despite there being no evidence that there are any for the NV40 chipset (in fact, coding experiments by users here support this) - and his overlooking of ATI's shader replacement that they praised in the conference call.

1 - ATI aren't doing any shader replacements; they have noted that performance can be improved. They have not put direct shader replacements in the driver for Doom3, unlike Nvidia, as pointed out by John Carmack. In the past when these issues have cropped up, ATI have asked the developer to make the changes to the code rather than doing it directly in the drivers - there is nothing to suggest this will not be the case here.

2 - John Carmack made the statements about the shader replacements in Nvidia's drivers; ATI just repeated that.

#2 - ATI making noise about Nvidia using partial precision in Doom3 reducing quality in an official statement to Team Radeon.com, while IQ comparisons have shown this is completely false and FUD. In fact, HardOCP said Nvidia had slightly better IQ than ATI in Doom3.

IQ comparisons do not prove this is FUD, since it may not be too noticeable - as John Carmack has stated before. The most notable differences in IQ are probably coming from the anisotropic filtering bug highlighted in Humus's thread.

#3 - ATI making a big deal about their PCI-express cards being native, and claiming that it would give them a big performance advantage over the competition - this is shown to be FUD, as Nvidia's bridged chips are just as fast, and their SLI implementation is much faster.

Read the recent Beyond3D review of the 6800 GT - the native PCI Express solution from ATI is faster in applications that require the bandwidth, and that is what ATI claimed.

#4 - Last year Nvidia's Brilinear was cheating and not real Trilinear according to ATI; this year Brilinear is "Optimized Trilinear" and better than full Trilinear according to ATI, as they don't offer full Trilinear - and they were caught instructing reviewers to disable brilinear and enable full trilinear on Nvidia cards, knowing full well that their own card was only doing brilinear.

Except that you could see it when Nvidia first did it; nobody saw ATI's for a year.

#5 - ATI's FUD that Shader Model 3.0 gives nothing over plain old SM2.0 and is too slow to be used this generation - yet FarCry's SM3.0 gave Nvidia cards significant speed boosts, and that was only very basic usage of SM3.0 with none of the advanced features. Beyond3D details the potential of SM3.0 over SM2.0 - once more advanced features start getting used we may see the performance gap widen, as ATI themselves stated in Huddy's "Save the Nanosecond" powerpoint presentation. In this presentation, Huddy noted that ATI will attempt to hold back development of advanced SM3.0 coding until they have a SM3.0 part available to support their FUD, which is only bad for the gaming industry.

1 - The later Far Cry patch, using Shader Model 2 again, puts the performance right back up there for ATI's cards as well.

2 - Huddy's presentation was actually suggesting that if you use branching on NV40 you are likely to hurt performance more than by using multiple shaders. There have been tests that show branching is quite costly on NV40.
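For anyone unsure what that trade-off looks like, here is a minimal, hypothetical sketch in plain C (standing in for shader and driver logic; nothing here is from Huddy's slides). Option A is a single "ubershader" with a per-pixel dynamic branch; option B selects one of several specialized shaders on the CPU before the draw call, which is the usual SM2.0 approach and can still be the faster one on NV40:

[code]
#include <stdio.h>
#include <stdbool.h>

/* Option A: one shader with a dynamic branch evaluated per pixel.
 * On early SM3.0 hardware the branch itself has a cost even when it is
 * coherent across the whole batch. */
static float shade_ubershader(bool in_shadow, float light)
{
    if (in_shadow)
        return 0.0f;
    return light;
}

/* Option B: two specialized shaders; the choice is made once per batch
 * on the CPU, so the GPU never branches. */
static float shade_lit(float light)      { return light; }
static float shade_shadowed(float light) { (void)light; return 0.0f; }

int main(void)
{
    bool batch_in_shadow = false;

    /* SM2.0-style: pick the specialized shader before drawing. */
    float (*shader)(float) = batch_in_shadow ? shade_shadowed : shade_lit;

    printf("ubershader : %.1f\n", shade_ubershader(batch_in_shadow, 0.8f));
    printf("specialized: %.1f\n", shader(0.8f));
    return 0;
}
[/code]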

#6 - ATI's relationship with Valve, who has become one of the biggest FUD spreaders of them all.

And what of Nvidia's relationship with id, who they even list in their financial disclosures? Funny that nobody questions the performance disparity here and the obvious financial ties.
 
Re: I made some of the same observations and was labeled

Ruined said:
gkar1 said:
Proforma said:
...ATI is putting out a lot of FUD these days and they sound like Nvidia.

Examples? Otherwise please stop spewing your opinions as facts.

#1 - ATI praising their Doom3 shader replacement in an investor conference call, contradicting all of the noise they made last year when Nvidia made shader replacements in games. And then again recently there was Huddy's FUD in an interview, accusing Nvidia of shader replacements in Doom3, despite there being no evidence that there are any for the NV40 chipset (in fact, coding experiments by users here support this) - and his overlooking of ATI's shader replacement that they praised in the conference call.

I seem to recall the guy said "sources on the web showed", didn't use the word "their". Carmack himself says NV are doing shader replacement, and you yourself have said on many forums that ATI can solve the Doom3 performance by fixing the application-controlled AF.

#2 - ATI making noise about Nvidia using partial precision in Doom3 reducing quality in an official statement to Team Radeon.com, while IQ comparisons have shown this is completely false and FUD. In fact, HardOCP said Nvidia had slightly better IQ than ATI in Doom3.

One site's opinion makes fact? Reducing precision can lose quality; whether it is perceivable to the end user is a different matter.

#3 - ATI making a big deal about their PCI-express cards being native, and claiming that it would give them a big performance advantage over the competition - this is shown to be FUD, as Nvidia's bridged chips are just as fast, and their SLI implementation is much faster.

Source on performance advantage? Sales performance perhaps? ;)

SLI faster compared to what?

#4 - Last year Nvidia's Brilinear was cheating and not real Trilinear according to ATI; this year Brilinear is "Optimized Trilinear" and better than full Trilinear according to ATI, as they don't offer full Trilinear - and they were caught instructing reviewers to disable brilinear and enable full trilinear on Nvidia cards, knowing full well that their own card was only doing brilinear.

Last year's brilinear is much different in image quality from this year's. I would think the reviewer document was taking this into account.

#5 - ATI's FUD that Shader Model 3.0 gives nothing over plain old SM2.0 and is too slow to be used this generation - yet FarCry's SM3.0 gave Nvidia cards significant speed boosts, and that was only very basic usage of SM3.0 with none of the advanced features. Beyond3D details the potential of SM3.0 over SM2.0 - once more advanced features start getting used we may see the performance gap widen, as ATI themselves stated in Huddy's "Save the Nanosecond" powerpoint presentation. In this presentation, Huddy noted that ATI will attempt to hold back development of advanced SM3.0 coding until they have a SM3.0 part available to support their FUD, which is only bad for the gaming industry.

NV themselves have said the dynamic branching in SM3 can have a major performance hit. Has it been utilized to its full extent? How many other SM3 features are not used because of the performance disadvantage at present?

#6 - ATI's relationship with Valve, who has become one of the biggest FUD spreaders of them all.

So what does that make Electronic Arts, then?
 