The way it's meant to be patched...


LOL, at this point, wouldn't all reviewers doing some ATI/NV shader performance/IQ analysis be saying that by now?


That type of thing has already happened. Have you heard of the problem with the water shader in Tiger Woods 2003? It works on NVIDIA's boards, but not on any others. Evidently there is a device ID detect in there - if you change another shader-capable board's device ID to an NVIDIA one, it runs the NVIDIA shader, and not only that, it runs it fine - change the device ID of an NVIDIA board to a non-NVIDIA one and it breaks. Ask EA who wrote that shader code.
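
For what it's worth, a detect like that is only a few lines of code - under Direct3D an application can read the adapter's PCI vendor ID directly, which is why spoofing the device ID flips which path runs. A minimal sketch of what such a check might look like (the vendor ID is NVIDIA's real PCI ID, 0x10DE; the function name and the shader-selection logic are my own illustration, not EA's actual code):

    // Hypothetical sketch of a Direct3D vendor detect - not the game's code.
    #include <windows.h>
    #include <d3d9.h>

    const DWORD VENDOR_NVIDIA = 0x10DE;  // real PCI vendor ID for NVIDIA

    // Returns true if the primary adapter reports itself as an NVIDIA part.
    bool UseNvidiaWaterShader(IDirect3D9* d3d)
    {
        D3DADAPTER_IDENTIFIER9 ident;
        if (FAILED(d3d->GetAdapterIdentifier(D3DADAPTER_DEFAULT, 0, &ident)))
            return false;  // can't identify the adapter - take the generic path

        // Keying the rendering path off the reported vendor ID is exactly
        // why changing a board's device/vendor ID changes which shader runs.
        return ident.VendorId == VENDOR_NVIDIA;
    }

If a game picks its water shader from a check like this rather than from the caps the device actually exports, every non-NVIDIA board gets the broken path regardless of whether it could run the shader fine.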

I wasn't aware of that... is there a source?

Perhaps BigBerthaEA (from the NVNews boards and others) could comment on that..

At the moment, I am thinking about refusing to review any NVIDIA-based products, not because I don't like what NVIDIA is doing, but because I honestly do not know how to review their products accurately anymore.

Well, you obviously have to hold one company to the same standard as the other... I'd say wait till the official Det50s come out and do some analysis of them... if they totally break all faith in "redemption", that's probably the best way to go... although I don't think it's a great idea to exclude stuff like this in the long run (then people would think you are biased against NVidia)...

It WOULD be an idea NOT to review any more "FX series" cards... until a radically new solution (architecture) is available (I believe NVidia will keep releasing cards based on the current architecture for a while longer, but that's just a guess).

(The suggestion I mentioned is in reference to something the head editor of 3DVelocity stated in his forums.. I think)
 
I think they confused some words in that press release by accident; it should read:

Core and Eidos believe that Tomb Raider: AOD performs well except on NVIDIA hardware.

not:

Core and Eidos believe that Tomb Raider: AOD performs exceptionally well on NVIDIA hardware.

:LOL: ;)
 
Kristof said:
SimonF came up with it... credit goes to him 8)
Ummm Errr... I deny all knowledge of that.... must have been a spammer forging my email address.
 
Reverend said:
I am just totally amazed by the lengths NVIDIA are going to in trying to mask the deficiency of their hardware relative to ATI's. I understand it, but I am amazed they are actually doing it. This TRAOD/Eidos incident is very probably the last straw for me when it comes to any remaining respect I may have for a company that I once admired.

Even Microsoft isn't so obviously blatant in doing dirty tactics.

I'm thinking along the same lines, but how on earth can nVidia even begin to think that their company can afford the same luxury of arrogance and strong-arming as Microsoft? Did their last sales figures make them think that they are a monopoly without competition? That customers have no choice? That game developers will accept anything?

I think we all understand that they have to protect their market share and promote their products as well as possible, but words like high-handed, smug and self-satisfied come to mind when one has to describe their management decisions since the whole NV30 cover-up started.

Maybe nVidia should remember Nixon: It is not the lie itself, but the damn cover-ups, that will kill you in the end. :|

... and thus we have a word for the whole affair:

CineGate!
 
I feel sorry for the Core guy... Core have lost Tomb Raider, they've been pretty much humiliated by the Eidos board... so that guy's job is probably not looking pretty good at the moment, and now he's probably gonna get a roasting...
Though I can't understand how a patch can come out without Eidos knowing what was in it!!!

Stunned by the Tiger Woods issue - that's incredible, and M$ need to get involved here; this is ruining DX. Mind you, I had a V5, and although the box of ANOTHER EA game said "needs dx8.0", the EA online FAQ said "needs dx8.1", and the game promptly didn't work correctly on my V5. Apparently this was unfixable (yeah, right); luckily I had a Kryo2 board I could use instead...

-dave-
 
It seems to me the reason Nvidia is happy to continue with cheat drivers and persuading publishers to lower or alter game quality in their favour is that, while this has been condemned by online review sites and forums, very little of it is seeping down to the general public.
I very much doubt that Joe Bloggs who pops down to PC World has any clue what's been going on over the last year.
Until the hardcopy PC mags start slamming this kind of behaviour, Nvidia will be safe to continue. I also believe it will continue even when the NV40 is released - after all, why would the GPU still require Int16 at all in the next generation?
A positive move by the online review community would be to collate all the year's information, put it in dated, point-by-point format, have all the contributors sign it, then pass it on to PC mag and newspaper editors - it would only take one major newspaper to get interested and it would blow wide open.
 
THe_KELRaTH said:
A positive move by the online review community would be to collate all the year's information, put it in dated, point-by-point format, have all the contributors sign it, then pass it on to PC mag and newspaper editors - it would only take one major newspaper to get interested and it would blow wide open.

it would be better to send the information to the OEM companies like Dell...

hit them where it hurts
 
gokickrocks said:
it would be better to send the information to the OEM companies like Dell...

hit them where it hurts

None of this affects companies like Dell at all. Now, if end users aren't buying their systems because they have an Nvidia card inside, then they'll produce more product using ATI - simple supply and demand.
 
NVidia is getting nasty.

What does this make me think? That their next-gen chips are either nowhere near ready or they already know that their performance will be nowhere near good enough. So they panic.

It will be interesting to see how much influence the "network" of websites and enthusiasts who read and post on the web has in damaging the sales of NV hardware, given that they are seeing through a lot of this crap.

I'm hoping however that somehow NV pull through this - I don't want to be left with an effective ATI monopoly in cutting edge 3D tech, as that would just stifle progression as they sit back. We've already essentially lost Matrox, we might gain XGI if they turn out to be any good (I'm sceptical), and PowerVR-based tech has been too quiet for too long (although there's always hope/rumours...).

Sort it out NV. Spend more on R&D and less on PR spin. Compete by being competitive, and not by trying to mislead your customers. Don't treat us like idiots - it's insulting.
 
Gnep said:
That their next-gen chips are either nowhere near ready or they already know that their performance will be nowhere near good enough. So they panic.

Deja vu.

What would happen if they screw up the next gen? Perhaps we should all go out and buy nv30-boards after all...
 
Ollo said:
What would happen if they screw up the next gen? Perhaps we should all go out and buy nv30-boards after all...

Maybe they would collapse like 3Dfx did...
That wouldn't be a big problem; we'll simply have PowerVR and ATI, both firms being much nicer and having more respect for their customers.
 
xGL said:
we'll simply have PowerVR and ATI

Except PowerVR won't have anything for 2003, whereas XGI does :rolleyes:

The idea was that nVidia would collapse after releasing a poor NV4x chip, so PowerVR don't need to have something ready yet :) Only by the time NV4x is released.

But yes, XGI can produce something interesting.
 
XGI, PowerVR, add the DeltaChrome + successors, add that Creative might let the 3DLabs boys design for the consumer market, consider that nVidia engineers will be soaked up by these new players...

No, we really don't need nVidia, no more than we needed 3Dfx, or Matrox, or NumberNine, or Tseng Labs, or any of the rest who have come and had a place in the sun only to sink back into obscurity. nVidia is eminently replaceable in the graphics market, given a couple of years or so.

Consumers, reviewers, developers, OEMs - no one needs an IHV that deceives them and makes their lives awkward.

Entropy
 
Deathlike2 said:
That type of thing has already happened. Have you heard of the problem with the water shader in Tiger Woods 2003? It works on NVIDIA's boards, but not on any others. Evidently there is a device ID detect in there - if you change another shader-capable board's device ID to an NVIDIA one, it runs the NVIDIA shader, and not only that, it runs it fine - change the device ID of an NVIDIA board to a non-NVIDIA one and it breaks. Ask EA who wrote that shader code.

I wasn't aware of that... is there a source?

I apologise; I shouldn't have said that, as I haven't actually verified it myself - there is at least one editor looking into this that I am aware of. However, as I said, I shouldn't have said it, and it was an unprofessional, emotional response to recent events.

No, NVIDIA hasn't "got to me", but this type of comment shouldn't be what B3D is used for and I shouldn't have made it. However, I am still rather astounded by recent events.

...

Since starting to write this I have had a long conversation with NVIDIA about some of the Tomb Raider stuff. They believed ATI tipped us off to there being a benchmark mode in the patch - I will state that we had no such information from ATI. We started working with Core on some enhancements to the benchmark (which are in the shipping version of the game) back in July, and the first either NVIDIA or ATI knew that we were looking at this was when Reverend mailed both parties about some driver issues - we did receive a reply from ATI, but AFAIK Reverend didn't get anything back from NVIDIA.

There appears to be a slight impasse in the methods of benchmarking that NVIDIA would want, and also a lack of appreciation of what was actually benchmarked here. The problem is that at present there is a difference between the "out of the box" settings and the "like for like" settings. Reverend spent a lot of time talking with Core to get an understanding of what the "like for like" settings would be, which is what our settings documentation is based on. However, this like-for-like benchmarking isn't necessarily how the game will be played out of the box - this, I believe, is what the last statement from Eidos is in relation to.

When I talked to NVIDIA there didn't seem to be an appreciation that we took benchmarks both at the "like for like" settings and at each board's default settings, as a current user would play the game out of the box (also listing all the settings that were enabled or disabled). I specifically did this because I figured there would be complaints that it doesn't represent how you would play it; however, this appears to have gone unnoticed. IMO reviews are also about understanding performance to base purchasing decisions on, not justifying (or getting annoyed at) the purchase you have already made.
 
DaveBaumann said:
Deathlike2 said:
That type of thing has already happened. Have you heard of the problem with the water shader in Tiger Woods 2003? It works on NVIDIA's boards, but not on any others. Evidently there is a device ID detect in there - if you change another shader-capable board's device ID to an NVIDIA one, it runs the NVIDIA shader, and not only that, it runs it fine - change the device ID of an NVIDIA board to a non-NVIDIA one and it breaks. Ask EA who wrote that shader code.

I wasn't aware of that... is there a source?

I apologise; I shouldn't have said that, as I haven't actually verified it myself - there is at least one editor looking into this that I am aware of. However, as I said, I shouldn't have said it, and it was an unprofessional, emotional response to recent events.

No, NVIDIA hasn't "got to me", but this type of comment shouldn't be what B3D is used for and I shouldn't have made it.

I for one figured it was second-hand information, and maybe you shouldn't have posted it. However, you posted what you heard in the forums and not as news, so don't be so harsh on yourself.
:)
In any case, if the information is true, the implications are of an IHV breaking the law (at least in the US). AFAIK intentionally breaking another company's product is an anti-competitive practice and a felony.

There appears to be a slight impasse in the methods of benchmarking that NVIDIA would want, and also a lack of appreciation of what was actually benchmarked here.

You might as well talk to a brick wall in this respect. De nile is a river and Nvidia is drowning in it.
 