The way it's meant to be patched...

kindbudmaster,

I agree with you completely. I think it has been borne out that GamersDepot's accuracy and relevance stem not from following journalistic principles, but from the fact that nVidia's actions are so amazing that sensationalism and non-objective commentary don't (at the moment) seem to depart from the truth. While the degree of problems in this regard is not nearly as extreme as at certain other sites that come to mind (though worse, IMO, than at certain others I also think have problems), I think they are distinguished mostly by having ended up picking the right company to crucify.

Perhaps their choice was based on a solid foundation, but that doesn't change that it displays a marked lack of objectivity in the way they are progressing.

Wavey,

On that note, I'm glad you modified your statement, Wavey. You know I decided long ago against buying nVidia products myself. Without any 180-degree changes in my outlook to confuse things, though, and despite feeling some measure of amazement, I continue to find it important to stick to objectivity when dealing with these issues, as amazing as they are.

But I don't think you should be hard on yourself... your statements weren't on the front page or in an article, and, true to expectations, your personal standards have shown through in a timely fashion in your follow-up to them.

Others have spoken up on the problems with silence, and I think alternative solutions and approaches have been mentioned. I will reiterate my suggestion concerning the Det 50s, as it applies to one specific part of the current situation: an objective evaluation and isolation of cheats, together with a separate evaluation of any confirmable, legitimate optimizations, would be the way to present a complete picture true to this site's spirit as I perceive it. I think something like this is even more important in defining the site, and as an opportunity to illustrate the virtue of thoroughness and of not being swayed by the "political" situation with nVidia, just in case there is something buried somewhere that nVidia is doing "right", even in this mess.

...

I try not to jump on bandwagons. I'm not going to tell people they shouldn't when they are faced with valid reasons to do so; I'm going to try to walk alongside and prevent the bandwagon from smashing through important things on its way to its destination. I think that's something Beyond3D has done successfully and seems prepared to continue doing, and I'd just like to encourage that.
 
When I talked to NVIDIA there didn’t seem to be a level of appreciation that we took both benchmarks in the "like for like" settings

nV doesn't like the fact that their cards don't perform as well as competing solutions in "apples for apples" comparisons, that's all.
 
A company that touts CineFX / 32-bit / "the fastest" etc., then expects reviewers to lower settings to create an even fight against its rivals, is hardly what I'd call wanting a balanced review anyway. All Nvidia have to do is change the DX9 sticker to DX8.1 and the problem would be neutralized.
 
Ack! Someone took your forum commentary and posted it on Anandtech's front page. Perhaps an expansion of your sig for such eventualities is in order, Wavey? :-?
 
They believed ATI tipped us off to there being a benchmark mode in the patch - I will state that we had no such information from ATI.

Not that NV had any moral problems tipping "guys with webpages" WRT Quake/Quack a while ago themselves...
 
Tim said:
http://www.anandtech.com/#20525

It seems that Anandtech has seen the light.

I read the comments.
Yowza :oops: !
I don't read any message boards besides B3D, and there's the explanation!
Admittedly I've seen far worse examples (e.g. at FiringSquad), but it's been so long between my visits to other message boards that I get surprised every time.
 
DaveBaumann said:
I apologise, I shouldn’t have said that as I haven’t actually verified it myself - there is at least one editor looking into this that I am aware of. However, as I said, I shouldn’t have said this, and it was an unprofessional emotional response to the events that have happened recently.

Well, you are still only human. ;)
But I think it would be wise to contact Anand about it (it seems they haven't noticed your new sig yet and have already quoted it...) and to add a short note to the post in question (to prevent any further quoting "out of context").
 
Core and Eidos believe that Tomb Raider: AOD performs exceptionally well on NVIDIA hardware.
It performs exceptionally well the way a retarded kid can read exceptionally well... FOR A RETARDED KID!!!

It should be:
Core and Eidos believe that Tomb Raider: AOD performs exceptionally well on NVIDIA hardware compared to homemade graphics cards.
 
DaveBaumann said:
Deathlike2 said:
That type of thing has already happened. Have you heard of the problem with the water shader in Tiger Woods 2003? It works on NVIDIA's boards, but not any others. Evidently there is a device ID detect in there - if you change another shader enabled board to an NVIDIA device id it runs the NVIDIA shader, not only that but it runs it fine - change the device ID of an NVIDIA board to a non-NVIDIA board and it breaks. Ask EA who wrote that shader code.

I wasn't aware of that... is there a source?

I apologise, I shouldn’t have said that as I haven’t actually verified it myself - there is at least one editor looking into this that I am aware of. However, as I said, I shouldn’t have said this, and it was an unprofessional emotional response to the events that have happened recently.

No, NVIDIA hasn’t "got to me", but this type of comment shouldn’t be what B3D is used for and I shouldn’t have made it. However, I am still rather astounded by the events recently.

Thank you for the update. However, my sneaking suspicion is that it will turn out to be true in the end anyway. I really just don't trust Nvidia anymore. And that's a sad position to be forced into, given the relative lack of competitors in the 3D market space outside ATI.
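For anyone unfamiliar with how such a device-ID detect would work, here's a minimal sketch - purely hypothetical, since the actual shader-selection code in Tiger Woods 2003 isn't public. Only the PCI vendor IDs (0x10DE for NVIDIA, 0x1002 for ATI) are real; the function and path names are invented:

```python
# Hypothetical sketch of device-ID-based shader selection as described
# above. The path names are invented; the vendor IDs are standard PCI IDs.
NVIDIA_VENDOR_ID = 0x10DE

def select_water_shader(reported_vendor_id: int) -> str:
    """Choose a shader path from the *reported* vendor ID alone."""
    if reported_vendor_id == NVIDIA_VENDOR_ID:
        return "nvidia_water_path"   # the path that renders correctly
    return "generic_water_path"      # the path that breaks on other boards

# Spoofing the reported ID (without swapping hardware) flips the path
# taken - which is exactly the test described in the quote above.
print(select_water_shader(0x10DE))  # nvidia_water_path
print(select_water_shader(0x1002))  # generic_water_path
```

Because the check keys off the reported ID rather than actual hardware capabilities, changing the device ID alone changes which shader runs - consistent with what Dave described observing.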
 
DaveBaumann said:
Deathlike2 said:
That type of thing has already happened. Have you heard of the problem with the water shader in Tiger Woods 2003? It works on NVIDIA's boards, but not any others. Evidently there is a device ID detect in there - if you change another shader enabled board to an NVIDIA device id it runs the NVIDIA shader, not only that but it runs it fine - change the device ID of an NVIDIA board to a non-NVIDIA board and it breaks. Ask EA who wrote that shader code.

I wasn't aware of that... is there a source?

I apologise, I shouldn’t have said that as I haven’t actually verified it myself - there is at least one editor looking into this that I am aware of. However, as I said, I shouldn’t have said this, and it was an unprofessional emotional response to the events that have happened recently.

No, NVIDIA hasn’t "got to me", but this type of comment shouldn’t be what B3D is used for and I shouldn’t have made it. However, I am still rather astounded by the events recently.

...

Since starting to write this I have had a long conversation with NVIDIA on some of the Tomb Raider stuff. They believed ATI tipped us off to there being a benchmark mode in the patch - I will state that we had no such information from ATI. We started working with Core to get some enhancements to the benchmark (which are in the shipping version of the game) back in July, and the first either NVIDIA or ATI knew that we were looking at this was when Reverend mailed about some driver issues for both parties - we did receive a reply from ATI, but AFAIK Reverend didn’t get anything back from NVIDIA.

There appears to be a slight impasse in the methods of benchmarking that NVIDIA would want, and also a lack of appreciation of what was actually benchmarked here. The problem is that at present there is a difference between the "out of the box" settings and the "like for like" settings. Reverend spent a lot of time talking with Core to get an understanding of what the "like for like" settings would be, which is what our settings documentation is based on. However, this like-for-like benchmarking isn’t necessarily how the game will be played out of the box - this is what I believe the last statement from Eidos is in relation to. When I talked to NVIDIA there didn’t seem to be a level of appreciation that we took both benchmarks in the "like for like" settings and that we also showed the default board performance as a current user would play it out of the box (also listing all the settings that were enabled or disabled). I specifically did this because I figured there would be complaints that it doesn’t represent how you would play it; however, this appears to have gone unnoticed. IMO reviews are also about understanding performance to base purchasing decisions on, not justifying (or getting annoyed at) the purchase you have already made.

Dave, if need be, I can send you all the emails I traded with Core where I made the initial suggestions about all the benchmark stuff and of which Core implemented.

I was also asked by an NVIDIA employee if ATI tipped us off about (or even provided us with) the benchmarking mode in TRAOD, right after Dave did his beta bench article. I shut him up.
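The "like for like" vs. "out of the box" distinction Dave describes can be sketched as follows - the setting names and per-vendor defaults here are invented for illustration, not Core's actual options:

```python
# Illustrative only: invented setting names and per-vendor defaults.
# "Like for like" means both boards run the exact same workload.
LIKE_FOR_LIKE = {"depth_of_field": False, "ps20_effects": True}

OUT_OF_BOX = {
    "nvidia": {"depth_of_field": False, "ps20_effects": False},
    "ati":    {"depth_of_field": True,  "ps20_effects": True},
}

def comparable(a: dict, b: dict) -> bool:
    """Two benchmark runs are 'like for like' only if every setting matches."""
    return a == b

# Default settings differ per board, so out-of-box numbers measure
# different workloads - hence benchmarking both ways, as described above.
print(comparable(OUT_OF_BOX["nvidia"], OUT_OF_BOX["ati"]))  # False
print(comparable(LIKE_FOR_LIKE, LIKE_FOR_LIKE))             # True
```

The point of publishing both sets of numbers is that neither alone tells the whole story: like-for-like isolates hardware capability, while out-of-box reflects what a buyer actually experiences.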
 
Getting a mention on Anandtech's front page is something (I guess some people DO care about what goes on at Beyond3D)... congrats :)

I was also asked by an NVIDIA employee if ATI tipped us off about (or even provided us with) the benchmarking mode in TRAOD, right after Dave did his beta bench article. I shut him up.

LOL, That's giving it to them.

Just to clarify: were you referring to Tiger Woods 2003 or Tiger Woods 2004?

Tiger Woods 2003. Tiger Woods 2004 will be out at some point (it would be an idea to contact BigBerthaEA about it - he's hyping it slightly, but he's also under an NDA)... Tiger Woods 2003 is partly under discussion.
 
TW2004 release date is set for the 22nd I think. I also think Tim's NDA has expired on the game as well, so he could probably answer some of these questions if he wanted to.
 
DaveBaumann said:
screw it

That type of thing has already happened. Have you heard of the problem with the water shader in Tiger Woods 2003? It works on NVIDIA's boards, but not any others. Evidently there is a device ID detect in there - if you change another shader enabled board to an NVIDIA device id it runs the NVIDIA shader, not only that but it runs it fine - change the device ID of an NVIDIA board to a non-NVIDIA board and it breaks. Ask EA who wrote that shader code.

Dave, look here in the Rev's review:

http://www.beyond3d.com/reviews/albatron/gffx5900pv/index.php?p=10

under the "Other Considerations" heading. To me, the R3x0 water shading is infinitely better than what nVidia's getting out of it... ;) The water shading with the Dets is flat - the trees don't even properly reflect in the water movement. The water shading with the Cats is mirrored, and you can see how the reflection of the trees distorts with the water rippling. It almost looks to me as if EA couldn't get the effect it wanted for the water with the Dets, so they put in some "rough water" animated texturing to mask it. That's the way the screenshots look to me...
 
Deathlike2 said:
Tiger Woods 2003, Tiger Woods 2004 will be out at some point (it would be an idea to contact BigBerthaEA about it.. (he's hyping it slightly, but he's also under an NDA)... Tiger Woods 2003 is partly under discussion
No, no he's not under an NDA anymore about it...he was discussing it over at Rage3D a bit yesterday. Hold on a sec...

Big Bertha EA over at Rage3D said:
I will be able to comment more fully on this issue as I am supposed to be getting final game code tomorrow.

From what I have seen on alpha builds, TW2004 ran pretty much neck and neck with the 9800 Pro on my rig.

When we reached Beta and more shaders were added to the game code, the 9800 Pro did not bat an eye and was roughly TWICE as fast in framerates at the same resolution as the 5900 Ultra. The build I have tonight is 3 versions shy of final and the same story holds true.

I have also emailed the developer to get some answers to the issues that Dave brought up. Not sure when I will get an answer as this is a TWIMTBP game and I am not sure how much the developer can comment. That being said, I will follow up here with more when I have something to report.

-Tim

Direct link to quote.
 
DaveBaumann said:
...Since starting to write this I have had a long conversation with NVIDIA on some of the Tomb Raider stuff. They believed ATI tipped us off to there being a benchmark mode in the patch - I will state that we had no such information from ATI.

In other words, the person at nVidia you talked to basically believes that you are a liar, involved in an active conspiracy with ATi to falsify information about their products. Dave, I would send whomever you spoke with a written letter and ask him to restate those assertions in writing. You have every right to ask - but I'll make a large wager they will never do so.

The one thing I learned long ago through bitter experience is that dishonest people and/or companies will often accuse everyone else of doing exactly what they themselves are doing - which is being dishonest. Somebody at nVidia is telling some whoppers, both to themselves and to anyone else who will listen. It is a classic example of conduct by professional crooks and con men.

We started working with Core to get some enhancements to the benchmark (which are in the shipping version of the game) back in July and the first either NVIDIA or ATI knew that we were looking at this was when Reverend mailed about some driver issues for both parties - we did receive a reply from ATI, but AFAIK Reverend didn’t get anything back from NVIDIA.

The sin in the eyes of nVidia here is that you didn't allow nVidia to "co-ordinate" all of your activities in that regard, I'm sure. How dare you be brazen enough to think you could run your own web site without their guidance? (Is probably what they'd like you to think.)

There appears to be a slight impasse in the methods of benchmarking that NVIDIA would want, and also a lack of appreciation of what was actually benchmarked here. The problem is that at present there is a difference between the "out of the box" settings and the "like for like" settings.

I fail to see a problem with that at all--that's true of almost any software you can name. The real problem, of course, is that nVidia did not want it exposed that their hardware is truly not capable of doing a good job with real DX9-engine 3d games. So they will throw off on you, Core, FutureMark and anybody else who dares to question their PR machine. Whether the criticisms are valid is of no concern to nVidia, obviously.
 

Hmm, page one of that thread claims that Stalker has been bought by Nvidia, and they are coding all the PS 2.0 stuff for the developer and making sure it runs badly on ATI hardware.

Is this a growing trend of Nvidia trying to buy games and make them "Nvidia Only"? If so, Stalker's developers are going to be very unhappy when all the ATI owners with high-end cards capable of running Stalker well don't buy the game because the visuals were deliberately crippled by Nvidia coders.

Does Nvidia really think that giving developers a load of cash and crippling code paths (which ATI cards will probably run as well as Nvidia cards - if not better - anyway) will actually have people buying 5900Us? Do they think we will throw away our R350s, which run 2-4 times faster than Nvidia cards with better IQ in 99.9 percent of games, just to play the one or two "Nvidia Crippled" games?

Just when you think Nvidia has reached an all-time low, they manage to drag themselves even deeper into the gutter....
 
/me puts on the aluminium hat

*Conspiracy mode on*

Is the TWIMTBP campaign something Nvidia set up so it could prevent the use of benchmarks in games whose pixel shaders show its hardware in a bad light?

Think about it. UT2003 is a TWIMTBP game, and benchmarks are OK in that, because Nvidia cards do well in it (with just a 'bit' of extra help from the drivers). Of course, it doesn't use PS 2.0 either. TRAOD, though, is a completely different story.

*Conspiracy mode off*

/me takes off the aluminium hat
 