The way it's meant to be patched...

Colourless said:
/me puts on the aluminium hat

*Conspiracy mode on*

Is the TWIMTBP campaign something that Nvidia set up so it could prevent the use of benchmarks in games whose pixel shaders show its hardware in a bad light???

Think about it. UT2003 is a TWIMTBP game, and benchmarks are OK in that, because Nvidia cards do well in it (with just a 'bit' of extra help from the drivers). Of course, it doesn't use PS 2.0 either. TRAoD, though, is a completely different story.

*Conspiracy mode off*

/me takes off the aluminium hat
/me takes Colourless' aluminum hat and runs off to hide under me bed!
 
digitalwanderer said:
/me takes Colourless' aluminum hat and runs off to hide under me bed!

Give that back! You should respect those who are higher in the organization. I am number 22, while you are only 357.
 
Bouncing Zabaglione Bros. said:

Hmm, page one of that thread claims that Stalker has been bought by Nvidia, and they are coding all the PS 2.0 stuff for the developer and making sure it runs badly on ATI hardware.

Is this a growing trend of Nvidia trying to buy games and making them "Nvidia Only"? If so, Stalker's developers are going to be very unhappy when all the ATI owners with the high-end cards capable of running Stalker well don't buy the game because of the visuals being deliberately crippled by Nvidia coders.

Does Nvidia really think that giving developers a load of cash and crippling code paths (which the ATI cards will probably run as well as the Nvidia cards - if not better - anyway) will actually have people buying 5900Us? Do they think we will throw away our R350s, which run 2-4 times faster than Nvidia cards with better IQ in 99.9 percent of games, just to play the one or two "Nvidia Crippled" games?

Just when you think Nvidia have reached an all-time low, they again manage to pull themselves even deeper into the gutter....



Remember this from me:

Hi,

In fact, the graphics engine is developed on a Radeon 9700 :)
We demand special support from the FX driver guys, 'cause in pure/standard DX9 it's impossible to run the engine on FX at the moment.
Yes, FX will be slightly faster, but only by a margin of several percent...

So, don't worry, your card will be an excellent performer in S.T.A.L.K.E.R.

--
Best regards,
Oles V. Shishkovtsov
GSC-Game World
oles@gsc-game.kiev.ua


Monday, February 17, 2003, 7:05:06 PM, you wrote:

OVJ> This is a forwarded message
OVJ> From: Anton Bolshakov <anton@gsc-game.kiev.ua>
OVJ> To: "Oleg V . Javorsky" <oleg@gsc-game.kiev.ua> (GSC Game World)
OVJ> Date: Monday, February 17, 2003, 7:00:11 PM
OVJ> Subject: Stalker Support

Then, when this went public on the net, 3DGPU wrote this:

http://www.3dgpu.com/comments.php?id=2323&category=9 <---Link is dead



We use all hardware for development of Stalker, although most are NVIDIA. We will produce a game with the best quality, compatibility, and performance on all supported hardware (T&L or better).
Here are the general things. As a programmer, I need to get access to the latest hardware and talk to its manufacturers, otherwise we may get way behind the competition. I want to give credit to NVIDIA for agreeing to be our technical partner and rendering us this kind of assistance (we contacted NVIDIA and ATI for several months, but ATI did not respond). NVIDIA offers me early hardware and very good support. Prior to GeForceFX I worked with a Radeon 9700, but I am currently developing the Stalker engine on NV35. Naturally, such close work with NVIDIA engineers allows me to come up with better optimizations and support the new technologies of NVIDIA boards.

In Stalker you'll be able to play fine on NVIDIA and ATI hardware--the gameplay and run stability should be the same. On ATI boards Stalker will run fast, but on NVIDIA boards it will run even faster, plus gamers will get a set of unique effects, namely due to close work with the company's engineers and support for NVIDIA hardware features.

Then ...ATI must have talked to them :LOL: :

I'd like to bring in certain explanations. In the previous message from Oles Shishkovtsov it was mentioned that "we contacted NVIDIA and ATI for several months, but ATI did not respond". I'd like to add some details here: it probably wasn't put in a proper way, for it's not that ATI refused us technical help, it's more that not everything went as we expected. We got our 9700 board with a delay, in early October. That was already the release variant of the board; we couldn't get hold of earlier versions. The ATI guys provided us with access to the developer forums and were ready to answer our questions.


We wouldn't like you to get the wrong impression from that previous message that ATI's developer relations department works badly. We did receive two 9700 boards, and we'd like to thank ATI for that. It's just that everything didn't go as fast as we expected; the boards came after quite some time, and we did not get the opportunity to demonstrate the game at ATI's booth at ECTS. So, everything's fine with ATI's developer relations; probably something just went wrong on our side.

Oleg Yavorsky
PR Manager
GSC Game World
www.gsc-game.com
www.stalker-game.com


Keep in mind the hypocritical comments in the last email exchange: they got a DX9 board in October, prior to even the API's release... then didn't get an FX board until January of the following year.

Then they claim things didn't happen as fast as they wanted, instead of admitting they took an undisclosed cash settlement to ensure Nvidia looks better. :LOL:
 
Pixel Shaders/Vertex Shaders used in S.T.A.L.K.E.R:

We won't use 1.2, 1.3, or 1.4 shaders. We will use only versions 1.1 and 2.0. Version 1.1 allows doing everything and, at the same time, has no compatibility issues, while 1.2, 1.3 and 1.4 do. So we opted for 1.1.

Best regards,

Anton
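
For what it's worth, here's a minimal sketch of what that "1.1 plus 2.0 only" policy looks like in practice: the same basic lightmapped combine written once within ps_1_1 limits and once as a ps_2_0 shader with per-pixel math that 1.1 can't express. The function names, sampler names, and the effect itself are invented for illustration; this is not code from S.T.A.L.K.E.R.

Code:
sampler DiffuseMap;
sampler LightMap;

// ps_1_1-class path: a handful of texture and arithmetic instructions,
// runs on any PS 1.1 part (GeForce3 and up, Radeon 8500 and up).
float4 PS_11(float2 uv0 : TEXCOORD0, float2 uv1 : TEXCOORD1) : COLOR
{
    return tex2D(DiffuseMap, uv0) * tex2D(LightMap, uv1) * 2.0;
}

// ps_2_0 path: the same combine plus a per-pixel lighting term.
float4 PS_20(float2 uv0 : TEXCOORD0, float2 uv1 : TEXCOORD1,
             float3 N : TEXCOORD2, float3 L : TEXCOORD3) : COLOR
{
    float ndotl = saturate(dot(normalize(N), normalize(L)));
    return tex2D(DiffuseMap, uv0) * (tex2D(LightMap, uv1) * 2.0 + ndotl);
}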
 
How can you blame GSC? They're in the Ukraine AFAIK and probably have VERY limited resources. It ain't exactly the lap of luxury over there. They didn't even have a publisher until the last E3.
They probably didn't have much of a choice given what dev costs are today.
 
Colourless said:
digitalwanderer said:
/me takes Colourless' aluminum hat and runs off to hide under me bed!

Give that back! You should respect those who are higher in the organization. I am number 22, while you are only 357.
/me gives Colourless back his aluminum hat abashedly :oops:
 
It goes back to the old argument: it is in the developer's best interest to ensure their customers get the maximum experience out of their hardware.

How can PS 2.0 'run faster' on Nvidia hardware... I mean, really.
It is too bad it has come down to this: payoffs to devs vs. devs working with everyone to ensure optimal performance on all cards.

That's WHY I like Valve's approach; they certainly tried their best to give Nvidia users good performance in HL2.
 
indio said:
How can you blame GSC? They're in the Ukraine AFAIK and probably have VERY limited resources. It ain't exactly the lap of luxury over there. They didn't even have a publisher until the last E3.
They probably didn't have much of a choice given what dev costs are today.

Yes, it's understandable if Nvidia ponied up a lot of cash. However, there is a difference between writing an NV3x-optimised path, like D3/Valve did, to get the best out of the poor NV3x, and actively trying to cripple the game's performance on a different card.

Given the massive difference in PS capabilities between Nvidia and ATI, I would expect the R3x0 performance to be able to *at least* keep very close to the NV3x performance, *unless* the code deliberately cripples ATI performance. If this is the case, I would hope that ATI sorts out Nvidia-style shader replacement.

However, it is always possible that all this talk of Stalker is for PR purposes. When it arrives, we may find that the developers have quietly done a DX9 path that the ATI cards will excel at.
 
digitalwanderer said:
Colourless said:
digitalwanderer said:
/me takes Colourless' aluminum hat and runs off to hide under me bed!

Give that back! You should respect those who are higher in the organization. I am number 22, while you are only 357.
/me gives Colourless back his aluminum hat abashedly :oops:

/Number 17 grabs hat and administers 10 lashings with a wet noodle to the subordinates!
 
Bouncing Zabaglione Bros. said:
However, it is always possible that all this talk of Stalker is for PR purposes. When it arrives, we may find that the developers have quietly done a DX9 path that the ATI cards will excel at.

Of course, we'll never know...because nVidia would at that time have any benchmarking facility in STALKER removed, with another round of "FRAPS is unreliable" PR....
 
Something I still don't get:
While Eidos and Core appreciate the need for modern benchmarking software that utilizes the advanced shading capabilities of modern graphics hardware, Tomb Raider: AOD Patch 49 was never intended for public release and is not a basis for valid benchmarking comparisons.
Why the hell did Eidos need a fortnight to realise they published a patch that was "never intended for public release?"

What's scarier--that NV strongarmed Eidos to remove the patch because of its benchmarking feature, or that Eidos's QA needs two weeks to realise they spread a patch that's not meant to be used by gamers? ;)

93,
-Sascha.rb
 
Joe DeFuria said:
Of course, we'll never know...because nVidia would at that time have any benchmarking facility in STALKER removed, with another round of "FRAPS is unreliable" PR....

Not to mention getting those pesky DX9-ish features removed from the game....;) If not for DX9, nVidia could have played at DX8.x forevermore...ah, now M$ and DX9 (shades of xBox2!) have become the bane of nVidia's existence--the roadblock against "lighting every pixel on every screen", as the nVidia CEO dreams each night while listening to Wagner and thinking about needing a bigger "living room."

:devilish: --Just kidding!....;)
 
nggalai said:
Something I still don't get:
While Eidos and Core appreciate the need for modern benchmarking software that utilizes the advanced shading capabilities of modern graphics hardware, Tomb Raider: AOD Patch 49 was never intended for public release and is not a basis for valid benchmarking comparisons.
Why the hell did Eidos need a fortnight to realise they published a patch that was "never intended for public release?"

What's scarier--that NV strongarmed Eidos to remove the patch because of its benchmarking feature, or that Eidos's QA needs two weeks to realise they spread a patch that's not meant to be used by gamers? ;)

93,
-Sascha.rb

As I pointed out in an earlier post, generally the publisher thoroughly checks out a patch before placing it on its web site--this happens at EA and Atari and everywhere else I know about--and I'm quite certain it happened in this case as well.

I think ATi should put out some feelers here...the story nVidia told Dave B may be the story they've put out to Eidos--that Core's developers were influenced through the B3d back door and that B3d was serving as the corrupt minion of ATi.

"I say, Holmes! Who's the mastermind? Who's pulling the strings?"

"Watson, my dear chap. It's Moriarty, of course--always Moriarty! Curse the fiend!"

"But how do you know?"

"Watson--you see but you do not observe! Notice that cleverly hidden in the name 'Moriarty' are the letters 'ATi'.....!"

"By jove, Holmes, you've got it!"

"Let's be off, Watson, the game is afoot..!"

The two men pause at the door and look at each other...

"I say, Holmes, where are we going?"

"Watson, that's a good fellow...Kindly hand me my pipe, my violin, and my syringe case...I need to think about this a bit more...let's not be hasty. My nemesis may be more cunning than I assume....Hmmmm...."

:devilish:
 
From what I understand of what you'd do to optimize for NV3x cards, I wouldn't think it would be exceedingly difficult for ATI to reorder and optimize the shaders quite substantially to improve performance.

For the NV3x you want to first reduce register usage. That would mean that any NV3x-optimized code would have lots of free registers. Registers that an ATI optimizer could use to its benefit.

One way you'd free up registers for NV3x is of course to only sample textures when you need them. Now, this will somewhat hurt ATI cards. However, with all the unused registers, ATI could easily move all/most of the texture accesses to the start of the shader as they like.

In addition, they could do various other optimizations to the shader code as well. NV3x-optimized code is probably going to recalculate values rather than store them in registers for re-use. Again, this will hurt ATI by costing extra clock cycles. ATI could just reuse the values by using extra registers and removing the unneeded repeated code. Something that compiler optimizers usually like to do.

So, I have to wonder, how effectively could Nvidia, or a developer working with Nvidia, write shaders that would run slowly on ATI cards without actually hurting themselves too much in the process?
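
To make that concrete, here's a rough HLSL sketch of the two layouts, using a hypothetical shader with invented sampler and function names, not code from any shipping game. The first version keeps as few temporaries live as possible (sample right before use, recompute the shared N.L term), which is what you'd hand the NV3x; the second is the reordering a driver-side optimizer could produce for R3x0, hoisting the fetches and holding the shared term in one of the registers the first version leaves free.

Code:
sampler BaseMap;
sampler DetailMap;

// NV3x-leaning layout: each texture is sampled right before it is needed,
// and the shared N.L term is recomputed instead of being kept in a register.
float4 PS_FewRegisters(float2 uv : TEXCOORD0,
                       float3 N : TEXCOORD1, float3 L : TEXCOORD2) : COLOR
{
    float4 c = tex2D(BaseMap, uv) * saturate(dot(normalize(N), normalize(L)));
    c += tex2D(DetailMap, uv * 8.0) * saturate(dot(normalize(N), normalize(L))) * 0.25;
    return c;
}

// The reordering a shader optimizer could do for R3x0: hoist both fetches to
// the top and compute the shared term once, using the registers left free above.
float4 PS_Reordered(float2 uv : TEXCOORD0,
                    float3 N : TEXCOORD1, float3 L : TEXCOORD2) : COLOR
{
    float4 base   = tex2D(BaseMap, uv);
    float4 detail = tex2D(DetailMap, uv * 8.0);
    float  ndotl  = saturate(dot(normalize(N), normalize(L)));
    return base * ndotl + detail * (ndotl * 0.25);
}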
 
Colourless said:
From what I understand of what you'd do to optimize for NV3x cards, I wouldn't think it would be exceedingly difficult for ATI to reorder and optimize the shaders quite substantially to improve performance.

For the NV3x you want to first reduce register usage. That would mean that any NV3x-optimized code would have lots of free registers. Registers that an ATI optimizer could use to its benefit.

One way you'd free up registers for NV3x is of course to only sample textures when you need them. Now, this will somewhat hurt ATI cards. However, with all the unused registers, ATI could easily move all/most of the texture accesses to the start of the shader as they like.

In addition, they could do various other optimizations to the shader code as well. NV3x-optimized code is probably going to recalculate values rather than store them in registers for re-use. Again, this will hurt ATI by costing extra clock cycles. ATI could just reuse the values by using extra registers and removing the unneeded repeated code. Something that compiler optimizers usually like to do.

So, I have to wonder, how effectively could Nvidia, or a developer working with Nvidia, write shaders that would run slowly on ATI cards without actually hurting themselves too much in the process?

In addition to this, it would give ATI the reputation of having the miracle drivers that increase performance by ~25% or so, killing off what is left of the myth of Detonator performance increases.
 