The way it's meant to be patched...

CorwinB said:
Check the front page of Gamers Depot...

It seems that after Nvidia pulled some strings, Eidos removed the Tomb Raider AOD v49 patch, which contains the benchmarking code used by B3D.

I am totally speechless on this one. In addition to the benchmarking mode that reveals the GFFX's less-than-stellar DX9 performance, the patch also brought a large number of fixes that customers who *paid* for the game were expecting... In other words, on top of forcing developers to do additional work to hide the deficiencies of their under-performing hardware, the goons at Nvidia now go to extra lengths to make sure that if their "100 million" customers can't enjoy new games because of crappy hardware, then nobody will.
Revisionist history at its finest.

Developer - "We've found that NV3x performance on PS 2.0 shaders is much worse than similar Radeon products."
NVIDIA hands pile of money to Developer.
Developer - "Oh sorry, we were mistaken. The NV3x is a great platform to work with and we look forward to our future dealings with NVIDIA."

HEY NVIDIA: YOU CAN'T BRIBE EVERYONE!

-FUDie
 
FUDie said:
HEY NVIDIA: YOU CAN'T BRIBE EVERYONE!

-FUDie

True, but they can try. I think there is a great business opportunity in developing a few PS 2.0 shader demos right now... while Nvidia still has some cash left. As a side note, I'm wondering what Nvidia's accounting books will look like next quarter. There should be some really creative stuff in them. :p
 
http://www.filefront.com/r.b/page__...GUAGES.exe/_clear__page,filepath/_output.html

Tomb Raider: the Angel of Darkness by Core Design - retail v49 patch
Bugs Fixed and changes in V49:
Save game times are now in local time and not GMT.
Fixed lightning effects in final battle and hangar.
Added build version to start screen.
Changed conversation to use action key.
Multisampling and post process are no longer mutually exclusive. (Only possible
with DX9.0a or higher.)
Stopped ammunition decreasing too fast.
Added key remapping for cycle weapons.
Stopped quick-save from producing empty filenames.
Fixed incorrect lighting on statically lit animated objects.
Stopped failed controller profile from removing existing profiles.
Added user clip plane support to projected shadows.

Command Line
-level=             Load a specific level, e.g.
                    "-level=data\maps\paris5_3"
-selectlevel        Level select dialog
-fullscreen
-windowed           This is unsupported. The desktop colour depth should
                    match the colour depth in the settings.
-norun
-nofmv
-padplayback=       Plays back the basic input. When finished, the
                    application will end.
-padrecord=         Records the basic inputs until the application finishes.
-benchmark=         Gives a benchmark figure at the end of the run. There
                    must be a matching bin\<name>.pad file that was recorded
                    using 'padrecord'.
-settings           Displays a dialog that allows a user to change their
                    device settings.
-settings_override= Overrides the gfx settings for a predefined subset. Some
                    things, like resolution, are not changed. The profile
                    name must be in upper case.
-mode=xx            Overrides the gfx settings' screen mode value. Under
                    Windows 98/ME, or in any situation where D3D returns 0
                    for a mode's refresh rate, the refresh rate is ignored.
-multisample=       Overrides the gfx settings' multisample value. This can
                    be 0, or 2-16. A value of 1 is not supported.
-debugkeys
-godmode            Sets the power-ups to level 10 and gives a selection of
                    guns and ammo.

Benchmark Instructions
To record a demo of paris1 use the following shortcut:

Target: "[INSTALLDI]inTRAOD_P3.exe" -padrecord=binparis1.pad
-level=datamapsparis1 -debugkeys
Start in/working directory: "[INSTALLDIR]"

Once you have run around enough, use Alt-F4 to quit recording.
'paris1' can be swapped for any level name, but you have to change the
'-padrecord=?.pad' to the same name.

To play it back use:

Target: "[INSTALLDIR]inTRAOD_P3.exe" -benchmark=paris1
Start in/working directory: "[INSTALLDIR]"

This will record only the basic input data.
The debug function keys and other hardwired keys will not be recorded.
You should not use the menus while recording a benchmark demo.
'-benchmark' can then be combined with options like '-settings_override',
'-mode' and '-multisample' to create different benchmark setups.

The results of the benchmark will be appended to 'benchmark.csv'.
This feature is a gift and will not be supported by Eidos Interactive or Core
Design.
Thanks to Anthony Tan from http://www.beyond3d.com for all the useful feedback
on this feature.
<------ :LOL: Well at least Rev got credit.
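
For anyone who wants to compare settings once a pad file is recorded, the switches combine just as the readme says. For example (untested on my end, and assuming the same install layout as the shortcuts above):

Target: "[INSTALLDIR]\bin\TRAOD_P3.exe" -benchmark=paris1
Target: "[INSTALLDIR]\bin\TRAOD_P3.exe" -benchmark=paris1 -multisample=4

Each run appends another line to 'benchmark.csv', so you can compare AA levels (or a '-settings_override' profile) across identical playbacks of the same demo.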
 
CorwinB said:
Core and Eidos believe that Tomb Raider: AOD performs exceptionally well on NVIDIA hardware.

Well, I think the only thing left for them is to believe (I can imagine that if NV hardware actually performed well, they wouldn't have pulled the patch).

And to believe something like that, they must have smoked something mightily hallucinogenic...

Yes--I'd like to see corroboration on the Eidos statement myself--I have to say it does sound like a PR statement, though--notice how scrupulously Baldwin avoids using the term "DX9", and how he doesn't list the conditions under which Eidos "believes" what it believes (ostensibly not the DX9 conditions that can be set up within the game). If it's a legitimate statement, however, no doubt it was nVidia-inspired. Funny--you'd think Eidos would care more about its software than about a particular IHV's DX9 hardware. From what I recall, the game sets up the nV3x on the DX8.x path by default, so it's hard to see why Eidos would get worked up about 3D-card support they had nothing to do with. What, is Core "guilty" of supporting DX9???? It really makes no sense.
 
This is all getting really bizarre. I can't make myself believe that things are happening the way people say they are.

It's not because I have any preference for any hardware company, but because I can't believe there is a group of people who would actually think pulling the patch is a good idea.

There's got to be a better reason than Nvidia not liking benchmark results. My sense of disbelief tells me that Nvidia couldn't possibly be that arrogant, or Eidos be that spineless.

Or are we all in a lot more trouble than I think? :(
 
I only hope that it will not get even worse - TWIMTBP games getting some 'optimizations' to suddenly make them run slower on ATI hardware...
 
3dilettante said:
My sense of disbelief tells me that Nvidia couldn't possibly be that arrogant, or Eidos be that spineless.

Or are we all in a lot more trouble than I think? :(

Well, I think what Deano Calver had to say explains this a bit. I personally thought a lot of the anger directed at NVidia was unjustified rambling from fanboys, but every day it turns out to be more and more true.
http://www.beyond3d.com/forum/viewtopic.php?t=7873

...what wasn't perhaps so expected was how glad everybody else was that the HL2 results matched the results most of us had already seen ourselves. Somebody on the forums (sorry, can't remember who) asked why developers seemed quite shy in stating our results. I obviously can't speak for everybody, but the answer is probably a simple case of somebody had to go first, and whoever that person/company was, they had better be able to handle the heat it would produce.
 
I wonder how far Nvidia can go, then. What are they thinking??

Should we now make our game buying decisions based on our graphics hardware? Does TWIMTBP equal Needs a GF To Be Played?

At least Half-Life 2 is such a big title that I hope it has enough momentum to turn the tide. Though I wonder if we'll see a FutureMark-style sudden about-face from them... someone please assure me that Nvidia doesn't have the power to do that. Or would they release new Detonators that just plain crash with HL2?? Surely they can't do that... right? :oops: :oops: :oops: :oops: :oops:
 
DaveBaumann said:
Laa-Yosh said:
I only hope that it will not get even worse - TWIMTBP games getting some 'optimizations' to suddenly make them run slower on ATI hardware...

screw it

That type of thing has already happened. Have you heard of the problem with the water shader in Tiger Woods 2003? It works on NVIDIA's boards, but not on any others. Evidently there is a device ID detect in there: change another shader-capable board's device ID to an NVIDIA one and it runs the NVIDIA shader, and runs it fine; change an NVIDIA board's device ID to a non-NVIDIA one and it breaks. Ask EA who wrote that shader code.

WTF???

I think this is the time for a class-action suit against NVIDIA.
They have notoriously falsified EVERY bit of available information about the GFFX since last November
- everything that could be used BEFORE buying, before you make your call... they falsified everything that decision could be based on...

Mark my words: something just started...

EDIT: typos
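
For anyone wondering what a "device ID detect" actually looks like, it takes only a few lines of D3D9 code. Here's a minimal sketch - purely hypothetical, since nobody outside EA has seen the real thing (the shader file names are made up; 0x10DE really is NVIDIA's PCI vendor ID):

#include <d3d9.h>

// Hypothetical vendor check - NOT the actual Tiger Woods 2003 code.
bool IsNvidiaAdapter(IDirect3D9 *d3d)
{
    D3DADAPTER_IDENTIFIER9 ident;
    if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident)))
        return false;
    return ident.VendorId == 0x10DE;  // NVIDIA's PCI vendor ID
}

// Elsewhere, pick a codepath by vendor rather than by capability bits:
// const char *waterShader =
//     IsNvidiaAdapter(d3d) ? "water_nv.psh" : "water_fallback.psh";

The point is that nothing in there asks what the hardware can actually do - D3DCAPS9 never enters into it. Swap the reported ID and you swap the shader, which is exactly the behaviour Dave describes.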
 
Exactly what is going on???

When I try the link, the FTP server is full.
Now that would suggest the file is probably there.
So is it
A) a misunderstanding, because I'm using a browser that didn't tell me the FTP was maxed out
B) they pulled the link or moved it somewhere, or made a newer update?
C) they changed the content of the patch
 
Naw, they might have been able to pull that with FutureMark, but not with Half-Life 2. The reviewers would turn on them in droves. Not only that, but every single gamer with a GeForce card (which would now include the GF3 and GF4 cards out there) would scream bloody murder. Sure, it'd hurt Valve, but it'd decimate Nvidia. If people see that ATI cards can play the game, and hell, even Xabre, Matrox, and 3dfx cards can squeak by, it's Nvidia they'd be pissed at. Hell, look at the nvnews forums already. Support for Nvidia right now is either in the form of "nV1d14 r0x0Rs!" or "Well, things really aren't that bad... Not nearly as bad as it seems... Er... wait for Doom III!"

I don't think they could pull it off at this point. Too many people are already too fed up with them, and the news is starting to propagate out to the masses.

Nite_Hawk

 
Does their depravity know no bounds? [H] has a poll on the front page that asks which you would buy: ATI, nVidia, or other. Of ~4200 votes, >90% are for ATI. Maybe this problem will sort itself out. Has anyone heard any rumblings from board vendors that use nV GPUs, or OEM clients for that matter, about dropping nV? I can't imagine that they're all too happy lately.
 
I am extremely pissed atm... I'm just thinking about all the hard work the Core Design programmer and I went through, and I am very pissed.

It's not actually thoroughly surprising that this has happened, since I already suspected it might (read the last two paras before the "Developer Stuff" section, especially the last para), but nevertheless I am shocked and pissed that it actually did happen.

Did I say I'm pissed?
 
I thought this bit was particularly prescient, Rev - dated August 31st.

Reverend said:
It's only if certain DX9 games have been released and NVIDIA GFFX cards run them poorly that NVIDIA may tell (or perhaps persuade, with cash in hand) such developers to release patches that cut down on such DX9 features, or use other shaders that make NVIDIA GFFX cards run them better; that's when developers may be tempted to go "Damn NVIDIA". Speculation, of course. But we all know politics always exist in this industry.
 