ATI & Nvidia Game Tampering??

Dimahnbloe

Newcomer
After analyzing the situation in the graphics market today, I must say that I am both disappointed and worried. The reason I say this is because within the past few months I think we have been going backwards. I have noticed that game developers have started developing not for the benefit of all hardware, but for features or code specific to one IHV.

When I first saw it in Half-Life 2 the thought crossed my mind, but now it is overwhelmingly apparent with the situation in Far Cry. "Get in the Game" and "The Way It's Meant to be Played" are the modern-day version of "Glide vs. OpenGL vs. DirectX": game developers coding for specific hardware enhancements instead of the greater good of the industry. With game developers taking money from hardware manufacturers, what will be the difference between a game and a demo for Nvidia or ATI?

Take the demo that Humus wrote, for example: while it gave a considerable boost to ATI hardware, the gains were not nearly as large for Nvidia's hardware. Reading further down his post, he stated that it could easily be optimized for Nvidia hardware as well. The reason I bring this up is not to shun his work or to suggest that he had ill motives (sorry if it seemed that I did), but simply to state that if things continue the way they are going right now, it would be easy for a game developer to write un-optimized, or de-optimized, code and still have the end result (the graphics that we view) look the same. In that situation I would not be able to catch them, because I am not a game developer, and I don't think too many people have access to the code in a way that would let them do so. Instead of cheating in the drivers, they would be cheating in the game. In my opinion, that is even worse.
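To make that last point concrete, here is a minimal, hypothetical sketch (plain Python/NumPy standing in for shader code; nothing here is from Far Cry or the Humus demo) of how two mathematically equivalent formulations can produce the exact same image while doing very different amounts of work per pixel:

```python
# Hypothetical illustration only -- not actual shader code from any game.
# Two equivalent ways to compute a specular highlight term, (N.H)^32.
import numpy as np

n_dot_h = np.linspace(0.0, 1.0, 1_000_000)  # stand-in for per-pixel dot products

# Form A: a general pow() per pixel -- costly on some shader hardware.
spec_a = n_dot_h ** 32

# Form B: five successive squarings -- identical math, far cheaper per pixel.
spec_b = n_dot_h.copy()
for _ in range(5):
    spec_b = spec_b * spec_b

# The rendered result is the same either way; only the per-pixel cost differs.
print("max difference:", np.abs(spec_a - spec_b).max())
```

If a developer shipped the slow form only for one vendor's cards, screenshots would never reveal it; only the frame rate would.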

I believe that certain of Crytek's image enhancements are possible with PS 2.0 as well as PS 3.0, but what troubles me is that we have not seen any information from Crytek stating that they are going to work on better PS 2.0 image enhancements or performance optimizations.

The problem is that while someone on the Nvidia side of things may be happy with Far Cry, they may not be as happy when a game like Half-Life 2 comes out that is completely optimized for the R420, leaving the NV40 in "run it if you can" mode.

What's to say that Crytek is not going out of their way to ensure that performance does not increase for the R420?

I would take performance increases in Far Cry and Half-Life 2 with a grain of salt, and in my opinion neither should be used for benchmarking purposes. They are both tainted: optimized for one vendor and possibly, purposely, de-optimized for the other.

Yes, I do know that PS 3.0 will eventually be supported by both manufacturers, but it may not be supported in exactly the same way, so there will be opportunities for enhancements that could easily have been adopted for both but are only adopted for one, because the game developer is being paid to do so. Again, this could be done without visual defects, so we would never know whether one piece of hardware is not as good as the other, or whether the game developer simply isn't supporting the other hardware with the same effort. Would it not piss you off if your card ran 20-30 FPS slower in a game because the game developer deemed that it was not in their "financial best interest" to write a few extra lines of code? (Not to say it's always simply a few extra lines, but you get the point.)

In the end, we the consumers will lose. For the folks with an NV40, maybe not in Far Cry but in Half-Life 2; and for those with an R420, maybe not in Half-Life 2 but in Doom 3, etc.

*****
On a side note: Am I the only one who noticed that, with the new patch or drivers in the X-bit Labs review, the GeForce FX 5950's scores plummeted by a considerable margin? Is this the shape of things to come for those who purchased the NV3x generation of cards? :(
*****
 
Dimahnbloe said:
*****
On a side note: Am I the only one who noticed that, with the new patch or drivers in the X-bit Labs review, the GeForce FX 5950's scores plummeted by a considerable margin? Is this the shape of things to come for those who purchased the NV3x generation of cards? :(
*****
No, you weren't the only one, and yes, it is the shape of things to come for NV3x owners.
 
Dimahnbloe said:
On a side note: Am I the only one who noticed that, with the new patch or drivers in the X-bit Labs review, the GeForce FX 5950's scores plummeted by a considerable margin? Is this the shape of things to come for those who purchased the NV3x generation of cards? :(

It looks that way to me. But this should be nothing new to those who have followed the PC Graphics realm. With every new generation of hardware Nvidia has released, their older hardware generations have become slower with each successive software driver released. They shift their focus from coding manual optimizations for older generations to doing so for the current generation. At least that's the way it seems to be with the TNT, GeForce 1, GF2, GF3, and GF4 cards from what I've seen.
 
BRiT said:
With every new generation of hardware Nvidia has released, their older hardware generations have become slower with each successive software driver released.
And also consider that the previous generations didn't require hand-coded workarounds to compete. The future looks pretty damn bleak for anyone thinking of hanging onto their NV3x card.
 
hovz said:
It's weird how ATI dropped by as much as 14 FPS...


Yes, I'd like this explained as well. It seems strange that a patch causes a loss in performance this big on the X800 cards.

Crytek are a company concerned about the whole graphics market, aren't they? They wouldn't purposely screw ATI owners, would they? Because if they are doing this, it may backfire on a new company in a massive way.
Imagine the uproar if it was found out that they are only interested in Nvidia improvements and don't give a toss how ATI cards run their flagship game.

Far Cry could be the one and only game that Crytek sells in a profitable way. If I find they're not interested in ATI users, I will never buy a single thing from Crytek. I'm sure there are a lot of people who will feel the same way.
 
I think the wait is going to be over when D3 and HL2 arrive; anyone with an NV40 or R420 is going to enjoy them. Marketing and branding belong to any kind of business. (Sorry if I spell things wrong.)
 
Most likely, the performance hit on the 5950 is caused by "fixing the image quality bugs", i.e. dropping the hacked PS 1.x/2.x mode and enabling true PS 2.0 mode.

It seems obvious to me that PS 3.0 was not going to be enough to match ATI's PS 2.0 speed, so they had to find additional ways to force ATI cards to run even slower.

Developer favoritism is nothing new. John Carmack is probably the most guilty of this. He chooses a company he likes and tries to push the gaming industry in their favor. You need look no further than Quake 3 for that. Not only did he go out of his way to favor nVidia, he also screwed over the entire rest of the sound card industry in favor of one manufacturer: Aureal. He didn't give a flying crap about what card you owned; he was only going to support A3D, and the rest of the sound cards were forced to use plain stereo sound. He also purposely did not support the new force feedback devices simply because he decided he did not like the idea, and used his influence to try and kill it.

The thing that really gets me is that I still read posts from people stating that developers "would never favor one manufacturer over another... that would be stupid... there would be a backlash from the people they screwed over...", etc., even though there are so many examples of developers doing just that.
 
Dimahnbloe said:
On a side note: Am I the only one who noticed that, with the new patch or drivers in the X-bit Labs review, the GeForce FX 5950's scores plummeted by a considerable margin? Is this the shape of things to come for those who purchased the NV3x generation of cards? :(

I'm guessing they've dropped the NV3x-optimized PS 1.x path completely now, since it was giving the NV40 problems. IIRC the NV40 was using the NV3x path by default and suffered lower image quality as a result.
I imagine the NV3x cards are now using the PS 2.0 path, which, as we all know, is not exactly one of their strong points :LOL:

I'll do some testing on my NV35 when the patch (and DX 9.0c) gets released.
 
I think the NV3x cards are now running FP16 PS 2.0 / PS 1.4 instead of the FX integer precision, thus fixing the bugs but slowing them down.

I have no clue what's going on with the Radeons.

The only thing I can think of is that there is something fishy with the patch, i.e. it's slowing them down on purpose, or the drivers had an optimization that is now turned off.

Oh, there is a mipmap bug, from what Anand said. Maybe that bug shuts off ATI's trylinear scheme?
 
jvd said:
I think the NV3x cards are now running FP16 PS 2.0 / PS 1.4 instead of the FX integer precision, thus fixing the bugs but slowing them down.

I have no clue what's going on with the Radeons.

The only thing I can think of is that there is something fishy with the patch, i.e. it's slowing them down on purpose, or the drivers had an optimization that is now turned off.

Oh, there is a mipmap bug, from what Anand said. Maybe that bug shuts off ATI's trylinear scheme?

That's essentially what's happening. This is mostly in the areas with lighting; that banded floor bug is gone as it's moved to FP16. I heard the drop in that room was like 4-5 FPS.

Keep in mind that room (the archive) runs slowly on my GeForce FX 5900 in patch 1.1 as it is. Then again, I wouldn't call it remarkably fast on my 6800NU either. It seems to slow down a lot compared to most scenes in Far Cry.
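For what it's worth, here is a rough sketch of why a low-precision fixed-point path tends to produce that banded floor while FP16 smooths it out. The numbers (an FX12-style format with steps of 1/1024, and a dim attenuation gradient) are my own assumptions for illustration, not anything taken from Far Cry:

```python
# Illustration with assumed numbers -- not Crytek's code or data.
import numpy as np

# A dim, slowly varying lighting term across a floor (e.g. distance attenuation).
light = np.linspace(0.001, 0.010, 10_000, dtype=np.float32)

# FX12-style fixed point: ~12 bits over roughly [-2, 2), i.e. steps of 1/1024.
fx12 = np.round(light * 1024.0) / 1024.0

# FP16 keeps relative precision, so small values still get many distinct steps.
fp16 = light.astype(np.float16)

print("distinct FX12 levels:", np.unique(fx12).size)  # a handful -> visible bands
print("distinct FP16 levels:", np.unique(fp16).size)  # far more -> smooth gradient
```

On bright surfaces the coarse steps are invisible, which would fit with both the banding and the slowdown showing up mainly in dim, lighting-heavy rooms like the archive.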
 
I don't have that problem on my Radeon, though. I see it on my FX 5800 Ultra (I haven't tried it on my dad's 6800 GT).

I want to know what's going on with the Radeons.
 
jvd said:
I don't have that problem on my Radeon, though. I see it on my FX 5800 Ultra (I haven't tried it on my dad's 6800 GT).

I want to know what's going on with the Radeons.

What problem wouldn't you have? The banded floor? I wouldn't suspect you would.
 
ChrisRay said:
jvd said:
I don't have that problem on my Radeon, though. I see it on my FX 5800 Ultra (I haven't tried it on my dad's 6800 GT).

I want to know what's going on with the Radeons.

What problem wouldn't you have? The banded floor? I wouldn't suspect you would.
No, a drop in FPS in that area.

I actually don't see any drop in the Radeons' performance over 1.1.

So I don't get what's going on in the benchmarks.
 
jvd said:
ChrisRay said:
jvd said:
I don't have that problem on my Radeon, though. I see it on my FX 5800 Ultra (I haven't tried it on my dad's 6800 GT).

I want to know what's going on with the Radeons.

What problem wouldn't you have? The banded floor? I wouldn't suspect you would.
No, a drop in FPS in that area.

I actually don't see any drop in the Radeons' performance over 1.1.

So I don't get what's going on in the benchmarks.

So you have patch 1.2? I'm not saying there was a performance drop anywhere but on the NV3x. That room is actually 5-6 FPS faster with 1.2 than with 1.1 on the NV40, though, but there's no difference between SM 3.0 and SM 2.0, so the improvement was made elsewhere.
 
Yes, I was able to get the patch this morning through a friend.

I don't think it was SM 3.0 alone that provided the boosts. I think it was the right mix of new drivers and SM 3.0 that provided the boosts.

Anyway, I do not understand why there would be a drop in the Radeons' performance. Nothing should have changed. If anything, since this is obviously a performance/quality patch, it should have boosted either the quality or the performance of the Radeons, and it seems to have done neither.
 
ChrisW said:
It seems obvious to me that PS 3.0 was not going to be enough to match ATI's PS 2.0 speed, so they had to find additional ways to force ATI cards to run even slower.

Any proof of this accusation?

Seems like a pretty significant claim that, if proven true, would be VERY difficult for Crytek to defend, dontcha think?
 
Big Bertha EA said:
ChrisW said:
It seems obvious to me that PS 3.0 was not going to be enough to match ATI's PS 2.0 speed, so they had to find additional ways to force ATI cards to run even slower.

Any proof of this accusation?

Seems like a pretty significant claim that, if proven true, would be VERY difficult for Crytek to defend, dontcha think?

So far we just have the performance drop and no explanation. Perhaps when the patch is released, the notes will tell us more.

Right now we shouldn't make statements like that, but we can certainly ask if that's what's going on.
 
jvd said:
So far we just have the performance drop and no explanation. Perhaps when the patch is released, the notes will tell us more.

Right now we shouldn't make statements like that, but we can certainly ask if that's what's going on.
Seconded. Speculation is all well and good, but no accusations should be levelled until the patch is out and we hear what they have to say.
 