"An interview with Richard Huddy & Kevin Strange

jvd said:
For what? He doesn't tell us anything about his benchmarks.

And they are not consistent with what Anand shows us. He also has all optimizations on.

Chris Ray's results are better, but why optimizations on?


Oh well, I'm still waiting for my question in the other thread to be answered.

Maybe he had them on because his card is the 6800NU and he uses it like that, as other people with this card would, to get better frame rates? No doubt Chris will pop along so you can ask him.

I think this demonstrates it is not a checkbox feature. Sorry, I haven't seen your other comment.
 
Under PS3.0, Crytek has apparently implemented single-pass per-pixel lighting. With this per-pixel lighting model, a pixel shader is run that takes into account and processes all light sources in the level that affect a particular pixel in one pass. The PS2.0 implementation apparently uses multiple rendering passes (one for each light) for each affected pixel. This means that in heavily lit scenes, one (more intense) lighting pass can run, which eliminates the time it takes to set up and execute another pass, even if both implementations have the same result. It is unclear to us exactly why this is possible in PS3.0 and not in PS2.0 (we have even seen examples of technology like this running on PS2.0 hardware). We would really love the chance to go more in-depth with Crytek about their lighting algorithms.

I want to know, just like Anand wants to know: if this can indeed be done on PS 2.0 hardware, why wasn't it?

Is this a result of TWIMTBP?
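For illustration, here is a minimal sketch of the difference the Anandtech quote describes, with the per-pixel work modelled on the CPU. Everything here (Vec3, Light, the simple diffuse term) is a hypothetical simplification, not Crytek's actual shader code; real implementations run as pixel shaders on the GPU.

```cpp
#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Light { Vec3 dir; float intensity; };

// PS3.0-style: one shader invocation loops over every light that
// affects the pixel (SM3.0 allows data-dependent loop counts), so
// the geometry is submitted and shaded once no matter how many
// lights there are.
float shade_single_pass(Vec3 normal, const std::vector<Light>& lights) {
    float result = 0.0f;
    for (const Light& l : lights)
        result += std::max(0.0f, dot(normal, l.dir)) * l.intensity;
    return result;
}

// PS2.0-style: one rendering pass per light. Each pass shades with
// exactly one light and additively blends into the framebuffer; the
// final value matches the single-pass result, but every pass repays
// the cost of render-state setup and geometry resubmission.
float shade_multi_pass(Vec3 normal, const std::vector<Light>& lights) {
    float framebuffer = 0.0f;
    for (const Light& l : lights) {
        // --- begin pass: per-pass setup cost is paid here each time ---
        float pass = std::max(0.0f, dot(normal, l.dir)) * l.intensity;
        framebuffer += pass; // additive blend into the framebuffer
        // --- end pass ---
    }
    return framebuffer;
}
```

Both functions return the same value, which mirrors the quote's point: the images are identical and the saving is purely the eliminated per-pass setup, which is also why a multi-pass PS2.0 fallback producing the same result is plausible.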
 
jvd said:

For what? He doesn't tell us anything about his benchmarks.

And they are not consistent with what Anand shows us. He also has all optimizations on.

Chris Ray's results are better, but why optimizations on?


Oh well, I'm still waiting for my question in the other thread to be answered.

Not sure I understand your question, JVD. The performance differences in my tests were merely to show the differences between the SM 3.0 and SM 2.0 rendering pathways, not to create an apples-to-apples comparison, since it wasn't being compared against anything else. The only purpose of my tests was to show a real-time difference between SM 3.0 and SM 2.0 on my hardware.

They were done in real time because I preferred not to use the Nvidia demos sent to me.
 
1) The only negative issue with PS 3.0 and nVidia is where it is used to perpetuate rather distorted misinformation (unfortunately, this is coming to define PS 3.0 more than other factors because of choices nVidia is making).

2) Distracted by this, a lot of people seem to forget that the PS 2.0 performance of the NV40 is not at all slow. Slower than some X800s, perhaps... having some issues with drivers and very specific performance characteristics, perhaps... but not at all deserving of the classification of slow. Calling it slow is dramatically outside the realm of the reasonable.

...

Who is "taking it in the pants" for the extra expense of transistors spent on PS 3.0 is primarily card manufacturers and nVidia. It is efforts directed to avoid that by deception and misinformation campaigns that are negative for consumers, not PS 3.0 features directly.
 
jvd said:
Under PS3.0, Crytek has apparently implemented single-pass per-pixel lighting. With this per-pixel lighting model, a pixel shader is run that takes into account and processes all light sources in the level that affect a particular pixel in one pass. The PS2.0 implementation apparently uses multiple rendering passes (one for each light) for each affected pixel. This means that in heavily lit scenes, one (more intense) lighting pass can run, which eliminates the time it takes to set up and execute another pass, even if both implementations have the same result. It is unclear to us exactly why this is possible in PS3.0 and not in PS2.0 (we have even seen examples of technology like this running on PS2.0 hardware). We would really love the chance to go more in-depth with Crytek about their lighting algorithms.

I want to know, just like Anand wants to know: if this can indeed be done on PS 2.0 hardware, why wasn't it?

Is this a result of TWIMTBP?

Now that is a good point, and maybe it is because of the $$. Maybe they will fix it so it works that way with PS2.0 later?

Of course, if a company is giving you money then you prioritise your time in a certain way, I would imagine. :)
 
I think the improvements are quite interesting. We can see where the optimisation works best. It should be around a 5-10% improvement overall.
 
dizietsma said:
I think the new results from Far Cry mean you can now officially change your mind... if you want to?

If you mean the Anandtech report, even a heavily Nvidia-biased site is showing virtually no difference between the NV40 running both paths, and no difference in visuals. On a game with so much in the way of graphics, one heavily used as an NV40 marketing tool, I'm not too impressed. Where's the magic that Nvidia promised us with SM3.0? Or is it just another washout like their 32-bit rendering and "Cinematic Computing"?

These results are not a vindication of SM3.0 on NV40. If anything, they confirm it's another case of Nvidia putting in unusable support for a feature in order to get a marketing checkbox which they then base their marketing around.

Thought: What's going to happen when SM4.0 arrives? Does Nvidia claim it's the best thing since sliced bread and bribe developers not to support SM3.0 hardware, thus again orphaning their own customers? This time it's NV3x customers that get the shaft; next year it's NV4x customers?
 
Bouncing Zabaglione Bros. said:
These results are not a vindication of SM3.0 on NV40. If anything, they confirm it's another case of Nvidia putting in unusable support for a feature in order to get a marketing checkbox which they then base their marketing around.
From those benchmarks I see some pretty solid and remarkable results, not just a marketing checkbox.
Obviously we need more games and more tests to better understand SM3.0's benefits.

Thought: What's going to happen when SM4.0 arrives? Does Nvidia claim it's the best thing since sliced bread and bribe developers not to support SM3.0 hardware, thus again orphaning their own customers?
Only if Nvidia does it first... but I hope that's not the case.
It would be really funny if ATI released SM4.0 hardware before NVIDIA and we got to watch the sides switch opinions another time. :)

ciao,
Marco
 
Bouncing Zabaglione Bros. said:
Thought: What's going to happen when SM4.0 arrives? Does Nvidia claim it's the best thing since sliced bread and bribe developers not to support SM3.0 hardware, thus again orphaning their own customers? This time it's NV3x customers that get the shaft; next year it's NV4x customers?

What's your point? ATi pushed for SM2.0 when they had the hardware, and they'll do it for SM4.0 too. It's got nothing to do with orphaning their own customers; it's called progress. Even though bribery (if it's even taken place here) is hardly the way to go, in this instance it's been shown that it doesn't make much difference anyway.
 
Bouncing Zabaglione Bros. said:
dizietsma said:
I think the new results from Far Cry mean you can now officially change your mind... if you want to?

If you mean the Anandtech report, even a heavily Nvidia-biased site is showing virtually no difference between the NV40 running both paths, and no difference in visuals. On a game with so much in the way of graphics, one heavily used as an NV40 marketing tool, I'm not too impressed. Where's the magic that Nvidia promised us with SM3.0? Or is it just another washout like their 32-bit rendering and "Cinematic Computing"?

These results are not a vindication of SM3.0 on NV40. If anything, they confirm it's another case of Nvidia putting in unusable support for a feature in order to get a marketing checkbox which they then base their marketing around.

Thought: What's going to happen when SM4.0 arrives? Does Nvidia claim it's the best thing since sliced bread and bribe developers not to support SM3.0 hardware, thus again orphaning their own customers? This time it's NV3x customers that get the shaft; next year it's NV4x customers?


Really? I see a 3-25% increase in speed for the consumer, at no cost to them whatsoever, due to driver tweaks and the SM3 patch. Also, the quality bugs seem to be reducing, most obviously the floor patterning bug.

So faster and fewer bugs, but you are unimpressed? I wonder why?
 
PaulS said:
Bouncing Zabaglione Bros. said:
Thought: What's going to happen when SM4.0 arrives? Does Nvidia claim it's the best thing since sliced bread and bribe developers not to support SM3.0 hardware, thus again orphaning their own customers? This time it's NV3x customers that get the shaft; next year it's NV4x customers?

What's your point? ATi pushed for SM2.0 when they had the hardware, and they'll do it for SM4.0 too. It's got nothing to do with orphaning their own customers; it's called progress. Even though bribery (if it's even taken place here) is hardly the way to go, in this instance it's been shown that it doesn't make much difference anyway.

I don't recall ATI bribing developers not to write SM 1.x fallback modes, the way Nvidia is now doing with SM2.0 fallbacks for SM3.0 features.
 
dizietsma said:
Really? I see a 3-25% increase in speed for the consumer, at no cost to them whatsoever, due to driver tweaks and the SM3 patch. Also, the quality bugs seem to be reducing, most obviously the floor patterning bug.

So faster and fewer bugs, but you are unimpressed? I wonder why?

Because that's not what Anandtech concludes at the end of their article. There are a couple of FPS of difference and no visual difference. The only exception to that is Nvidia's own handpicked/tuned demo. How much of that is due to SM3.0, and how much is due to fixing the NV3x path that the NV40 was forced to run?

http://www.anandtech.com/video/showdoc.html?i=2102&p=11


That's not exactly a ringing endorsement from a site heavily linked to Nvidia, is it?

As for bugs, they should never have been there in the first place, but I guess that's what happens when your driver is overwriting the developer's code.
 
jvd said:
I want to know, just like Anand wants to know: if this can indeed be done on PS 2.0 hardware, why wasn't it?

Is this a result of TWIMTBP?

What I want to know is: why should they have done it in PS 2.0?
 
Bouncing Zabaglione Bros. said:
I don't recall ATI bribing developers not to write SM 1.x fallback modes, the way Nvidia is now doing with SM2.0 fallbacks for SM3.0 features.

I'm not sure I understand this claim. Are you saying that when Nvidia approached Crytek to support SM3.0, they explicitly forbade them from also implementing the new features in SM2.0?

Like an above poster, I would love to see ATI come out with SM4.0 first; all the people bitching now would be all grins.
 
Like an above poster, I would love to see ATI come out with SM4.0 first; all the people bitching now would be all grins.
A bit like the people that shouted DX9 was nothing but a cooked-up collaboration between MS and ATI, and yet now SM3.0 is the best thing since sliced bread? It swings both ways, you know.
 
digitalwanderer said:
Bjorn said:
Heavily linked to Nvidia?
You're right, he phrased that badly... it should be "Heavily biased towards nVidia?" 8)

I don't necessarily agree. Have you read their "Building a low, mid, high-end system" recommendations lately? I haven't seen an NV card there for quite some time. Though the last high-end system had a 6800 as the recommended card, it had been ATI all the way there before that.
 
Bjorn said:
I don't necessarily agree. Have you read their "Building a low, mid, high-end system" recommendations lately? I haven't seen an NV card there for quite some time. Though the last high-end system had a 6800 as the recommended card, it had been ATI all the way there before that.
Nope, I quit going there after they didn't take down the incorrect Quake3 results even though people were on them about it for WEEKS!

Lots of sites recommended ATI last generation, Scott, and for very good reason: they had the best card. The thing is, when they started recommending it and how they covered the FX scandal are a huge part of how I judge a site's bias.

Anananananda tech were whores during the scandal, and Anand himself just hid from the issue. :devilish:

Not the way I think you should be providing relevant and accurate information to your readers; at least, it was enough for me to lump them in the "sell out" group of big sites.

Why, would you disagree with that assessment? :|
 
trinibwoy said:
Bouncing Zabaglione Bros. said:
I don't recall ATI bribing developers not to write SM 1.x fallback modes, the way Nvidia is now doing with SM2.0 fallbacks for SM3.0 features.

I'm not sure I understand this claim. Are you saying that when Nvidia approached Crytek to support SM3.0, they explicitly forbade them from also implementing the new features in SM2.0?

http://www.beyond3d.com/forum/viewtopic.php?t=13622&highlight=

trinibwoy said:
Like an above poster I would love to see ATI come out with SM4.0 first and all the people bitching now will be all grins.

That's still a long way off, though. IIRC, SM4.0 is due with Longhorn, and there is talk that it will *only* be supported in Longhorn.

Even so, if SM4.0 came out first from ATI, there would be a certain number of developers who wouldn't support it until Nvidia brought out their version, just as there were developers who wouldn't support DX9 because Nvidia's NV3x series wasn't capable of doing so, despite ATI's large market of capable DX9 cards. It's those TWIMTBP dollars again, even if it's in the form of Nvidia writing code for the developers that does a deviceID check and only runs correctly on Nvidia hardware, as well as hard cash.
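As a rough illustration of what a deviceID gate of that sort might look like: this is a hedged sketch, not code from any actual title. query_gpu_vendor_id() and use_enhanced_path() are hypothetical names, though 0x10DE is NVIDIA's real, well-known PCI vendor ID.

```cpp
#include <cstdint>

// NVIDIA's well-known PCI vendor ID.
constexpr std::uint32_t VENDOR_NVIDIA = 0x10DE;

// Hypothetical stand-in for however an engine reads the adapter's
// vendor ID (e.g. from the graphics API's adapter identification).
std::uint32_t query_gpu_vendor_id() {
    return VENDOR_NVIDIA; // stubbed so the sketch is self-contained
}

// Vendor-gated path selection: the "enhanced" path is keyed off the
// vendor ID rather than reported capabilities, so other hardware is
// excluded even if it could run the path correctly.
bool use_enhanced_path() {
    return query_gpu_vendor_id() == VENDOR_NVIDIA;
}
```

The vendor-neutral alternative is to key the choice off the capability bits the driver reports, so any conformant hardware qualifies.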
 