ATI Benchmarking Whitepaper

PiXEL_ShAdER said:
oh FFS, are you saying only nvidia can do the effects, I mean what the f*** are you taking about man, if deveopers got off there arses are put these effects in the games you would'nt be saying that.
Can you say "proprietary"? The effects in the Vulcan demo ain't a part of DX9 or openGL that I know of, they're kind of unique to nVidia...thus developers aren't going to waste their time on some effects that are going to run on only a very small percentage of their possible customers PCs. :) (The high-end FX cards ain't been selling so good. ;) )
 
digitalwanderer said:
PiXEL_ShAdER said:
oh FFS, are you saying only nvidia can do the effects, I mean what the f*** are you taking about man, if deveopers got off there arses are put these effects in the games you would'nt be saying that.
Can you say "proprietary"? The effects in the Vulcan demo ain't a part of DX9 or openGL that I know of, they're kind of unique to nVidia...thus developers aren't going to waste their time on some effects that are going to run on only a very small percentage of their possible customers PCs. :) (The high-end FX cards ain't been selling so good. ;) )

Am I bothed about nvidia sales, NO, obviously you've got nothing better to do.

Thats why Nvidia have a How to do volumtic textures, ultra shadow in the deveopers area, because I dont see any fire effects that come close to that.
 
PiXEL_ShAdER said:
Am I bothed about nvidia sales, NO, obviously you've got nothing better to do.
Uhm, I must have missed something there. This sentence doesn't make a whole lot of sense to me. Could you try explaining a little more comprehensibly, please?

Thats why Nvidia have a How to do volumtic textures, ultra shadow in the deveopers area, because I dont see any fire effects that come close to that.
Again, proprietary effects. The developers already have to spend way too much time coding around the shortcomings of nVidia's hardware; do you really think they want to waste more time coding in effects that won't really help their sales? (Since there just ain't that many high-end FX owners...period.)
 
PiXEL_ShAdER said:
The GF2 could do per pixel lighting and shadows but no deveopers used the technology.

Sure it could. It probably looked great in the demo too, but as with a lot of other Nvidia "advanced features" it did it so slowly it couldn't be used in a game engine. That's probably why it was never used. It's the same today: the FX architecture is capable of some amazing effects when run in a demo dedicated to showing off that one effect, but when used in a game engine (along with all the other things going on) they would drag the game down to the point of being unplayable. Witness the trouble with something as simple as PS 2.0 shaders or 32-bit FP precision. They're great at announcing wonderful new features (for PR value), and terrible at getting them to work fast enough in a game environment to be useful.
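To put a rough number on the precision part of that: below is my own C sketch, not anything from the thread, comparing the relative precision of the 32-bit floats mentioned above with the 16-bit "partial precision" floats the FX parts can also run shaders at for speed.

```c
/* Rough illustration of the precision gap: FP16 ("partial precision")
 * carries 10 mantissa bits, FP32 carries 23, so the smallest relative
 * step differs by a factor of 2^13 = 8192. */
#include <math.h>
#include <stdio.h>

int main(void) {
    float eps_fp16 = ldexpf(1.0f, -10);   /* 2^-10, about 9.8e-4 */
    float eps_fp32 = ldexpf(1.0f, -23);   /* 2^-23, about 1.2e-7 */

    printf("FP16 relative precision: %g\n", eps_fp16);
    printf("FP32 relative precision: %g\n", eps_fp32);
    printf("FP16 steps are %gx coarser\n", eps_fp16 / eps_fp32);
    return 0;
}
```

That coarser FP16 step is roughly where the image-quality complaints about reduced-precision shader paths come from.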
 
Am I bothed (...) volumtic textures (...) deveopers

That was supposed to be "Am I bothered", "volumetric textures", and "developers", but you seem to have optimized away a fair amount of the spelling. Then again, when you're just scrolling down posts, that's close enough...

Seriously, if you can't be bothered to at least proofread your posts a little, then don't bother posting.

You remind me of a guy I ran into a few years ago on a (now-defunct) Matrox-related French forum. He was busy explaining how great the GF2 was going to be (with the "per-pixel lighting"), and how the NV20 was going to kill everything... He was exposed by the webmaster and turned out to be a not-so-clever Nvidia PR employee... You wouldn't happen to work near Santa Clara?

The GF2 could do per pixel lighting and shadows but no deveopers used the technology.

Yep, and the NV1 could do those "quadratic textures". I suppose that was another great idea that the developers didn't want to use?
 
Oh come on, FFS, keep to the point. I cannot be bothered to proofread my posts; besides, you know what I mean, and I'm typing too fast, that's my problem.

Hmm, Doom3 will be using per-pixel lighting, and just look how good that is; it also uses a lot of old DX-style technology but still puts every other game on its arse for looks.
 
Again, please, tone the posts down.

However, Doom3 is a perfect illustration of the point: Doom3 is basically a match for the original GeForce feature set, and yet, despite modern cards being able to perform all the work in a single pass, the performance is still pretty low. Tech demos showing off specific things often aren't constrained by the same elements actual games are, which is one of the reasons you see a disparity between tech demos and today's games.
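As a side note for anyone wondering what "per-pixel lighting" actually buys: a per-vertex renderer evaluates the lighting equation only at the triangle corners and interpolates the colour, while a per-pixel renderer evaluates it at every pixel from an interpolated (or normal-mapped) surface normal. The plain-C sketch below is purely illustrative; it is not id Software's code, and all names and values are made up.

```c
#include <math.h>
#include <stdio.h>

typedef struct { float x, y, z; } vec3;

static vec3 normalize3(vec3 v) {
    float len = sqrtf(v.x * v.x + v.y * v.y + v.z * v.z);
    vec3 r = { v.x / len, v.y / len, v.z / len };
    return r;
}

static float dot3(vec3 a, vec3 b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

/* Lambert diffuse + Blinn-Phong specular, evaluated for one pixel. */
static float shade_pixel(vec3 normal, vec3 to_light, vec3 to_eye, float shininess) {
    vec3 n = normalize3(normal);
    vec3 l = normalize3(to_light);
    vec3 e = normalize3(to_eye);

    float diffuse = fmaxf(dot3(n, l), 0.0f);

    vec3 h = { l.x + e.x, l.y + e.y, l.z + e.z };   /* half vector */
    h = normalize3(h);
    float specular = powf(fmaxf(dot3(n, h), 0.0f), shininess);

    return diffuse + specular;   /* intensity before texture/colour */
}

int main(void) {
    /* Hypothetical pixel: a slightly bumped normal, light off to one side. */
    vec3 normal   = { 0.2f, 0.1f, 1.0f };
    vec3 to_light = { 0.5f, 0.5f, 1.0f };
    vec3 to_eye   = { 0.0f, 0.0f, 1.0f };

    printf("per-pixel intensity: %f\n",
           shade_pixel(normal, to_light, to_eye, 32.0f));
    return 0;
}
```

Running something like this for every pixel touched by every light is exactly the kind of work that, as noted above, keeps performance low even when the hardware can do it all in a single pass per light.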
 
PiXEL_ShAdER said:
Oh come on, FFS, keep to the point. I cannot be bothered to proofread my posts; besides, you know what I mean, and I'm typing too fast, that's my problem.
Friendly piece-o-advice two: No, not everyone will "know what you mean". That's why it is important to take the time to make sure that your post does get across your idea/meaning to the best of your ability.

Don't type faster than you can think; it can get you into some REAL trouble...trust me. ;)
 
PiXEL_ShAdER said:
Hmm, Doom3 will be using per-pixel lighting, and just look how good that is; it also uses a lot of old DX-style technology but still puts every other game on its arse for looks.
Depends.
Its lighting will look better than other games', sure, but right now compare screenshots of HL2 and Doom3 characters: the HL2 ones look vastly better.

Doom3 is not the "be-all, end-all" of PC graphics. It will have the best-looking lighting, but not the best-looking models. They are too low-poly.
 
PiXEL_ShAdER said:
oh FFS, are you saying only nvidia can do the effects, I mean what the f*** are you taking about man, if deveopers got off there arses are put these effects in the games you would'nt be saying that.

The GF2 could do per pixel lighting and shadows but no deveopers used the technology.

That's because developers only want to code for the standard APIs of OpenGL and DirectX. The whole industry has been moving for the last half decade towards these two APIs, where a developer's content is displayed the same on all hardware.

Nvidia came out with hardware that needs customised code paths outside of the standard API in order to get decent speed, and even then at the cost of image quality, and you wonder why developers don't want to use it? Even now, developers are reporting that Nvidia's drivers never do what they are supposed to do and change behaviour from leaked release to leaked release.

Trying to make developers jump through hoops they left behind years ago in order to support hardware that doesn't comply with standards, at a time when game development is more costly and lengthy than ever, was *never* going to work.

Nvidia thought they could force it to happen by being the dominant graphics chip supplier, but it's their bad luck that they were a year late while ATI shipped the far more capable R3x0 series. Maybe Nvidia should "get off their arses" and ship a graphics chip that can run standard API code with as much speed and IQ as their ATI competition?
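For what it's worth, the kind of "customised code path" being complained about usually starts with something like the sketch below: check the OpenGL extension string at startup and branch. This is a minimal, hypothetical example, assuming a GL context has already been created; the path names are invented, and GL_NV_register_combiners is just one real vendor extension used for illustration.

```c
#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

typedef enum { PATH_STANDARD, PATH_NV_SPECIFIC } render_path;

/* Pick a renderer back end based on what the driver advertises.
 * glGetString only returns useful data once a GL context exists. */
static render_path choose_render_path(void) {
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if (ext != NULL && strstr(ext, "GL_NV_register_combiners") != NULL) {
        return PATH_NV_SPECIFIC;   /* hypothetical vendor-tuned path */
    }
    return PATH_STANDARD;          /* plain OpenGL path, runs anywhere */
}

int main(void) {
    render_path path = choose_render_path();
    printf("using the %s code path\n",
           path == PATH_NV_SPECIFIC ? "NV-specific" : "standard");
    return 0;
}
```

Every extra branch like this is another renderer back end to write, test and maintain, which is exactly the cost the post above is pointing at.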
 
Bouncing Zabaglione Bros. said:
That's because developers only want to code for the standard APIs of OpenGL and DirectX. The whole industry has been moving for the last half decade towards these two APIs, where a developer's content is displayed the same on all hardware.

I'm sure many developers like to program towards a particular card's unique features, but they'd rather the extra coding go toward EXTRA performance than toward playing catch-up with other cards or just striving for playability. ;)
 
PiXEL_ShAdER doesn't need links or proof... apparently his fervent statements are enough....
 
In the beta version of HL2 I downloaded there was OpenGL support. I asked a friend of mine to try OpenGL on his Radeon 9700 Pro; it did not work.

Not really worth the 1.3GB download, but some levels were quite impressive, I may add. By the way, AA worked fine.

If you want proof of the OpenGL support I can get my friend to send me the shots, as I don't have it anymore because it was not worth keeping.
 