New ATi Demo

jvd, I have never "faulted" the R420 or put down ATI. Probably the most "damning" criticism you could find from me is my opinion that "temporal" AA isn't all that people were cracking it up to be. Other than that, I am and have been pleased by ATI.

The problem with the NVidia attacks is that not only do they not get any credit for the extra features, but people go out of their way to talk them down. I mean, I'd give ATI props for implementing HDR in the R300 even though it's not really used that much, if at all. But if I were to take the tone of NVidia opponents, I'd not only have to withhold credit for HDR in R300, I'd have to viciously attack it as a toy "unusable" feature, a PR checkbox, etc.

In people's attempts to argue against overhype of SM3.0, they often go too far in the other direction.
 
DemoCoder said:
But if I were to take the tone of NVidia opponents, I'd not only have to withhold credit for HDR in R300, I'd have to viciously attack it as a toy "unusable" feature, a PR checkbox, etc.

Well, that rather depends on whether something *is* a usable feature or a marketing tickbox, doesn't it? For instance, was PS2.0 usable in games on NV3x? Or was it a marketing tickbox? After all, Nvidia did keep telling people to use lower shader versions, even doing silent shader replacement in their drivers.

I know everyone has their biases, but you have to allow for what the facts of the matter are, not just assume that everyone who says something negative about any IHV (particularly your favourite one) is simply promoting their bias with no actual facts to back them up.
 
Bouncing Zabaglione Bros. said:
Well, that rather depends on whether something *is* a usable feature or a marketing tickbox, doesn't it? For instance, was PS2.0 usable in games on NV3x?

That all depends on the shader and how it is used. Even PS2.0 shaders beyond PS1.4 lengths become less than usable as length increases. We had lots of synthetic benchmarks, but no top-tier heavy-PS2.0 games. Even FarCry isn't really a PS2.0 title. But people routinely come up with theories like "ATI only adds features when they are usable, like 3dfx did with 16 vs. 32-bit color, vs. NVidia which adds features that are always unusable". No one seemed to question whether ATI's HDR was "usable", and it was used to bash Nvidia over the head when the NV3x came out and didn't have it. On the other hand, people preemptively questioned all the NV4x features before any evidence even existed.
 
DemoCoder said:
Bouncing Zabaglione Bros. said:
Well, that rather depends on whether something *is* a usable feature or a marketing tickbox, doesn't it? For instance, was PS2.0 usable in games on NV3x?

That all depends on the shader and how it is used. Even PS2.0 shaders beyond PS1.4 lengths become less than usable as length increases. We had lots of synthetic benchmarks, but no top-tier heavy-PS2.0 games. Even FarCry isn't really a PS2.0 title. But people routinely come up with theories like "ATI only adds features when they are usable, like 3dfx did with 16 vs. 32-bit color, vs. NVidia which adds features that are always unusable". No one seemed to question whether ATI's HDR was "usable", and it was used to bash Nvidia over the head when the NV3x came out and didn't have it. On the other hand, people preemptively questioned all the NV4x features before any evidence even existed.
Can you explain to me how R300's HDR worked? I'm sort of new to 3d graphics. Thanks
 
DemoCoder said:
Bouncing Zabaglione Bros. said:
Well, that rather depends on whether something *is* a usable feature or a marketing tickbox, doesn't it? For instance, was PS2.0 usable in games on NV3x?

That all depends on the shader and how it is used. Even PS2.0 shaders beyond PS1.4 lengths become less than usable as length increases. We had lots of synthetic benchmarks, but no top-tier heavy-PS2.0 games. Even FarCry isn't really a PS2.0 title. But people routinely come up with theories like "ATI only adds features when they are usable, like 3dfx did with 16 vs. 32-bit color, vs. NVidia which adds features that are always unusable". No one seemed to question whether ATI's HDR was "usable", and it was used to bash Nvidia over the head when the NV3x came out and didn't have it. On the other hand, people preemptively questioned all the NV4x features before any evidence even existed.


What are you trying to say? That we don't have DX9 games because the R3x0 series wouldn't run them well?

The 9700 Pro is running FarCry very well. In fact, it is running it better than the 5950 Ultra, and it is almost two years old.

If anything, we don't have SM2.0 games because Nvidia has held them back.


Perhaps they are doing it again with SM3.0. Perhaps they have sacrificed SM2.0 speed to put SM3.0 in, and neither will run as fast as the R420s will run SM2.0.

We don't have enough info right now, but based on the last card from Nvidia, most will lean towards it not being a usable feature.

Do you blame us?

If you touched a hot stove and it burned you, would you then go back to the hot stove two years later and touch it again?
 
DemoCoder said:
But people routinely come up with theories like "ATI only adds features when they are usable, like 3dfx did with 16 vs. 32-bit color, vs. NVidia which adds features that are always unusable".

People based their opinions of the future on (a) synthetic benchmarks, and (b) historical evidence. That's more valid than simply throwing away all historical evidence and criticising people because they are commenting on future developments. It has also proved to be very good at predicting what the future would bring us. People have pointed to a pattern of corporate behaviour that Nvidia have been showing for years, and your response is "ahh, but we don't know what will happen in the future".

My point is that people are critical of all IHVs, more so of the big two, and more so of the one that people perceive as having problems or offering lesser products. Why are people more critical of Nvidia in the last couple of years, when they were the powerhouse of consumer 3D graphics cards for several years before that? In my opinion, it's because Nvidia have actually deserved a lot of that criticism, because of their corporate behaviour and the products they have released.
You don't need to defend them from that, because a lot of it is well deserved and actually backed by facts. Instead you seem to be saying that what Nvidia does is okay because not enough people attack ATI.

BTW, can you explain to me why, if FarCry isn't a "real" PS2.0 game, Nvidia is showcasing it as an example of PS3.0? Or why, if long shaders become unusable, Nvidia has been pushing its very long shader lengths since NV30? Isn't that a marketing checkbox?
 
Bouncing Zabaglione Bros. said:
Or why, if long shaders become unusable, Nvidia has been pushing its very long shader lengths since NV30? Isn't that a marketing checkbox?
Simple answer: offline rendering and research stuff. Shaders can be used for a lot of applications, not just for computer graphics.
 
nAo said:
Bouncing Zabaglione Bros. said:
Or why, if long shaders become unusable, Nvidia has been pushing its very long shader lengths since NV30? Isn't that a marketing checkbox?
Simple answer: offline rendering and research stuff. Shaders can be used for a lot of applications, not just for computer graphics.

Nvidia sells its cards as gaming products. That's who they are addressing whenever they have pushed long shaders, not a tiny minority of researchers. Remember "cinematic computing"? I don't remember Nvidia announcing it was aimed at researchers only...
 
Bouncing Zabaglione Bros. said:
Nvidia sells its cards as gaming products. That's who they are addressing whenever they have pushed long shaders, not a tiny minority of researchers. Remember "cinematic computing"? I don't remember Nvidia announcing it was aimed at researchers only...

They also need to address developers with their cards. And both ATI and Nvidia use the same chips for the professional market.
 
Bouncing Zabaglione Bros. said:
Nvidia sells its cards as gaming products.

That's the primary target, not the only one.

Bouncing Zabaglione Bros. said:
That's who they are addressing whenever they have pushed long shaders, not a tiny minority of researchers.

Well... researchers seem to appreciate all these new, highly programmable GPUs with long shader support.

Bouncing Zabaglione Bros. said:
Remember "cinematic computing"? I don't remember Nvidia announcing it was aimed at researchers only...

I don't remember them announcing that their products are aimed at gamers only.

ciao,
Marco
 
Bouncing Zabaglione Bros. said:
People based their opinions of the future on (a) synthetic benchmarks, and (b) historical evidence. That's more valid than simply throwing away all historical evidence and criticising people because they are commenting on future developments. It has also proved to be very good at predicting what the future would bring us. People have pointed to a pattern of corporate behaviour that Nvidia have been showing for years, and your response is "ahh, but we don't know what will happen in the future".

That Nvidia has had a habit of adding unusable features is debatable though. The NV3X is surely a problem child, but what was the problem with the cards before that?

And what synthetic benchmarks show that the 6800 won't be able to use its additional features (SM3.0, FP blending)?
 
Bjorn said:
That Nvidia has had a habit of adding unusable features is debatable though. The NV3X is surely a problem child, but what was the problem with the cards before that?

Look at the introduction of 32-bit colour, 32-bit precision, FSAA, large textures, T&L, etc. All of these things were introduced as marketing tickboxes that took a further generation to become usable. It's one of the things that contributed greatly to Nvidia's success. By courting OEMs with these tickboxes, Nvidia was able to sell a lot of cards and make a lot of money.

Bjorn said:
And what synthetic benchmarks show that the 6800 won't be able to use its additional features (SM3.0, FP blending)?

We've already seen very poor AA performance above 4xAA, shader performance that seems a lot lower than it should be on Shadermark, no functionality of the 2D processor at all, and lots of SM3.0 evangelism on a game that is SM2.0 and seems to look a lot worse on NV40 than R300.

How about you show us the benchmarks that show NV40 *can* use these features well? Prove the positive, rather than asking me to prove the negative.

Of course it's all a bit moot until Nvidia can actually ship the damn things.
 
Bouncing Zabaglione Bros. said:
Look at the introduction of 32-bit colour, 32-bit precision, FSAA, large textures, T&L, etc. All of these things were introduced as marketing tickboxes that took a further generation to become usable. It's one of the things that contributed greatly to Nvidia's success. By courting OEMs with these tickboxes, Nvidia was able to sell a lot of cards and make a lot of money.

There are a lot of people who would disagree on the 32-bit colour thing. And T&L was very usable on the GF1 if I'm not mistaken, and don't bring up the piece-of-crap TestDrive game as some kind of example that it wasn't. And who was the first company with high-performance MSAA?

Nvidia has introduced a lot of features that perhaps haven't been used that much during the cards' lifetimes, but saying that "checkbox" features have been Nvidia's way to success is rather lame imo.

Bouncing Zabaglione Bros. said:
We've already seen very poor AA performance above 4xAA, shader performance that seems a lot lower than it should be on Shadermark, no functionality of the 2D processor at all, and lots of SM3.0 evangelism on a game that is SM2.0 and seems to look a lot worse on NV40 than R300.

I don't really see what poor AA performance over 4xAA has to do with this. Especially since it's using an MSAA/SSAA combination. And shader performance can probably be improved a lot, but it's hardly bad at this moment. And what game are you talking about that supports SM3.0?
I'm guessing that it's Far Cry, but we need to get the patch and the final DirectX 9.0c before making any judgement on that.
 
DemoCoder said:
Althornin said:
Oh Democoder - why is it that nVidiots feel compelled to bag on ATI demos?

And why do you always feel compelled to make nonsense arguments? I didn't "bag" on ATI's demo.
lol, I didn't say you did.
I said "nVidiots", who did, and have, in this thread. You always say you don't consider yourself one.
Now who is making the nonsense argument?
You are the one who labeled people as ATI fanatics simply because "they downtalk an NV40 feature".
I guess you didn't see the point of me using the same broad brush to paint others as nVidiots - it was supposed to show you how foolish your own comment was, but your huge flaming bias prevented you from seeing a possible point.
 
Bjorn said:
T&L was very usable on the GF1 if I'm not mistaken,

Giants was another. It was actually faster to turn off T&L on my GeForce SDR and on my sister's GeForce DDR.


Also, on the TNT1, 32-bit color took a huge hit in most games. At the time I was not as big of a PC gamer as I am now, but in the games I owned you'd turn on 16-bit color for stable, high framerates.

Other than that, I don't have any complaints until the GeForce 3 and SM1 (Doom 3 showing off the card), and of course the FX series.


For ATI, I'd have to go with the third TMU in the first Radeon, and PS1.4 in the 8500.

The R3x0 series, though, has no faults at all.

SM3.0 on the NV40s may turn out like PS1.4 on the 8500, or it may turn out like the R3x0s.

But considering the only SM2.0 game out right now that uses it often is FarCry, and the drivers are using the NV3x path for the NV40s, this is not a good sign at all.
 
So... if there were a Starcraft 2... would this (geometry instancing) help keep the idea of mass armies, assuming Blizzard uses a 3D engine? :D

4v4 BGH2 :oops: ... 1600 probes/drones/SCVs :oops:
 
Alstrong said:
So... if there were a Starcraft 2... would this (geometry instancing) help keep the idea of mass armies, assuming Blizzard uses a 3D engine? :D

4v4 BGH2 :oops: ... 1600 probes/drones/SCVs :oops:

I always thought it was the AI that limited the graphics of an RTS, and not the actual graphics cards.
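
For anyone wondering what geometry instancing actually buys an RTS here: the idea is that the GPU draws one shared unit mesh many times from a single draw call, with a small per-instance stream supplying each unit's world position and team colour, instead of issuing one draw call per unit. Below is a rough Direct3D 9 sketch of the technique under stated assumptions: the device and the vertex/index buffers are assumed to already exist, and names like UnitVertex, InstanceData, and DrawArmy are illustrative, not taken from any real engine.

#include <d3d9.h>

// Illustrative vertex layouts (assumed, not from any shipping game).
struct UnitVertex   { float x, y, z; float nx, ny, nz; float u, v; }; // shared mesh
struct InstanceData { float wx, wy, wz; D3DCOLOR teamColour; };       // one per unit

// Assumed to exist already: the device, the shared unit mesh (unitVB/unitIB),
// and a buffer holding one InstanceData entry per unit on the map (instanceVB).
void DrawArmy(IDirect3DDevice9* device,
              IDirect3DVertexBuffer9* unitVB, IDirect3DIndexBuffer9* unitIB,
              IDirect3DVertexBuffer9* instanceVB,
              UINT numUnitVertices, UINT numUnitTriangles)
{
    const UINT NUM_UNITS = 1600; // e.g. a 4v4 map full of probes/drones/SCVs

    // Stream 0: the shared unit mesh, repeated NUM_UNITS times.
    device->SetStreamSource(0, unitVB, 0, sizeof(UnitVertex));
    device->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | NUM_UNITS);

    // Stream 1: per-instance data, stepped forward once per instance.
    device->SetStreamSource(1, instanceVB, 0, sizeof(InstanceData));
    device->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);

    device->SetIndices(unitIB);

    // One call submits every unit; the vertex shader combines each mesh vertex
    // with the per-instance position/colour to place the unit in the world.
    device->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                                 numUnitVertices, 0, numUnitTriangles);

    // Restore the default, non-instanced stream frequencies afterwards.
    device->SetStreamSourceFreq(0, 1);
    device->SetStreamSourceFreq(1, 1);
}

In Direct3D 9 this SetStreamSourceFreq path is exposed on SM3.0-class hardware, which is part of why instancing comes up in the NV40 discussion. And as jvd notes, it only removes draw-call overhead on the graphics side - the per-unit AI and pathfinding cost stays exactly where it was.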
 