State of 3D Editorial

To continue the theme of my last post, I would like to add that in the future it will be even more important to have reviewers see, understand and appreciate the differences in image quality. We are getting to the point of diminishing returns as far as image quality goes. If a reviewer is not aware of these differences and the processing involved in producing them, we will be left with simple FPS shoot-outs, which in turn will probably lead to image quality being shortchanged in order to win these types of battles. IMO
 
stevem said:
g__day said:
V) OEMs - throw your weight around more when it comes to truth in marketing or practices of the IHVs in benchmarks - more of the "We'll have your nuts if you misrepresent your capabilities to our buyers in leading benchmarks, cheat or cut corners, and we'll read you the riot act" :)
Not likely. The OEMs are pissed for various reasons, but have to take the IHV's spiel at face value. Apart from board level & general architecture briefs, they're as much in the dark as everyone else. In fact, the big OEMs have sought diversification for a long time, only for it to be realised relatively recently (again, for various reasons). The name of the game is to be in the game. Hedging is only a security net.

...

The trouble here is that if IHVs lie to OEMs and OEMs make any claim to the public relying on the IHV's information, then they carry liability for selling a product that doesn't deliver what is promised. Not too many OEMs say "Nvidia claims X"; they say the fastest box, or DX9 compliant, etc. So if they're lied to or fed partial truths, they have an increased exposure not of their making. This could lead to increased returns, or to becoming connected to an eventual class action against an IHV whose materials they used. Unless they themselves refute or omit the IHV's untrue claims, they are being made party to a risk they just don't want.

So, as a whole, investors, OEM distributors and consumers all have the power to say: play fair and honest. Maybe an industry watchdog will send them a friendly warning to play honestly.
 
DX History and Nvidia

I have enjoyed reading and lurking on this site for a while now, but I have a question regarding Nvidia's opting out of early DX9 development. It is a critical piece of history, and I have not heard or seen a lot of corroborative information or public record. I have heard it said by some revisionists that Nvidia was frozen out of early DX9 development. I would like anyone who can add to the public record to post their information here. I emailed Josh Walrath and he quickly responded. He gave the following information:


These decisions not to attend were not documented by NVIDIA. I had a friend that attended many of the initial DX9 meetings that fleshed out the standard, and NVIDIA was absent. From all indications, this was a voluntary move by NVIDIA. I have just read a post from a former MS employee who worked with developers extensively, and he mentions that NVIDIA wasn't happy with many of the initial specifications of DX9, and wanted their own implemented. The person goes on to say that NVIDIA essentially tried to blackmail MS with Xbox chip allocations to try to get MS to change their minds on DX9. By the time the dust settled, DX9 was firmly entrenched, and NVIDIA already had the NV3x series of chips in advanced planning and engineering stages. In other words, NVIDIA followed their own ideal of DX9, but it turns out that much of that ideal was not implemented into DX9. So basically they had a part that did not match up well with the specification.

Microsoft legally would not be able to lock NVIDIA out of the DX9 discussions, because this could be labeled as anti-competitive behavior (and MS is very sensitive to that word now).

I think it is important to have a public record, and if you do too, could you add to what is already known?

Thanks
 
Hellbinder said:
I tell ya... The more I learn about Nvidia the more I wonder why anyone would EVER want to work there. It reminds me of a concentration camp or something. Really messed up.

I would also like to comment that I have personally considered Brian Burke the single biggest low-life, unethical, lying *beep* since the days he worked at 3dfx. He has only gotten worse since he went to work for Nvidia.

Good pay, I guess. Plus, for engineers, the possibility of access to an absolutely ridiculous budget - AFAIK, NVIDIA's emulation infrastructure, as well as some other things, is still beyond what ATI can offer its engineers.

Brian Burke? Nah, I believe Dan Vivoli is most likely worse ;) Not that I ever had the "pleasure" of talking with someone like him :LOL:


---

Spam: I personally think The Inq's explanation for that makes a lot of sense, although it's hard to prove it:
http://www.theinquirer.net/?article=7781
After a very short while, Intel and Nvidia both walked away from the whole thing and now we know why.

It seems that Microsoft wanted both companies to agree to reveal any patents they had relating to the technology the Vole was building. Once revealed, the Vole expected the companies to hand the patents over for good. Intel and Nvidia walked away. Only recently has it signed an altered agreement. That is why the GeForce FX does not follow the DirectX 9 specifications to the letter.

Another revelation that our friend managed to overhear was about the recent arbitration between Microsoft and Nvidia. We all know that the Vole was trying to get Nvidia to cut its prices but now we may know why. It was nothing to do with patents or Nvidia's ability to supply chips in volume. It was all to do with missed milestones and the chipset not performing as fast as Nvidia had promised. Microsoft is losing money at an incredible rate on Xbox and has been trying every trick in the book to get Nvidia to cut its prices.

The last thing our friend managed to eavesdrop on as he was packing up his kit was that Nvidia staff knew nothing about the GeForce FX being dropped.

That last sentence I put in just because I thought it was a splendid example of how, even in early 2003, nobody knew what was going on at nV.


Uttar
 
Hello, this will be my first post here. While I normally prefer to just lurk and keep to myself, there are a few things I want to say as someone who is nothing more than a computer enthusiast. Oh yeah, just wanted to say sorry about the ranting ahead of time.

First, I think Josh deserves a lot of credit. I know that if I wrote something with as many inaccuracies as his article apparently has, I'd be too embarrassed to ever post under the same handle again. I think it takes some balls to not only admit his mistakes but then ask for help fixing them from the very same place that was trashing his work.

Second, on this talk about whether the NV34 is a DX-whatever card: I think the answer is pretty simple. For a part to be a DX-whatever card, it must at minimum meet the DX-whatever specs and be capable of running the code with no alterations; any additional abilities are nice, but have no impact on whether it is or is not a DX-whatever part. The R3x0 meets these requirements, so it is a DX9 card. The NV34, however, does NOT meet these requirements and therefore is not a DX9 card. In fact, as far as I can tell, Nvidia does not have a DX9 card at all, since none of the NV3x line meets these requirements.
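To make "meets the specs" concrete, here is a rough sketch (my own illustration, not anything from the posters - it just assumes the standard Direct3D 9 caps query and treats VS/PS 2.0 as the baseline) of the kind of minimum-spec check an application can run:

// A rough sketch of the minimum-spec test being described: query the
// Direct3D 9 caps and check that the part reports at least the shader
// models the DX9 baseline requires (VS 2.0 / PS 2.0 assumed here).
#include <d3d9.h>
#include <cstdio>

int main()
{
    IDirect3D9 *d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        d3d->Release();
        return 1;
    }

    // Caps report what the driver claims the hardware supports.
    bool dx9 = caps.VertexShaderVersion >= D3DVS_VERSION(2, 0) &&
               caps.PixelShaderVersion  >= D3DPS_VERSION(2, 0);
    printf("Baseline DX9 shader support: %s\n", dx9 ? "yes" : "no");

    d3d->Release();
    return 0;
}

Of course, caps bits only tell you what a part claims to support, not how fast or at what precision it actually runs it - which is where most of the argument lives.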

Third, if the tables had been turned and the NV30 had been out first, and if Nvidia had not hyped it so much, no one would be bashing the 5600's performance, as it WOULD have been the fastest card out until the 9700 came out. Nvidia just made a few bad architectural decisions and had a $h|# load of bad luck. This has happened before and it will happen again. Anyone remember the Voodoo 5? No hardware T&L; what the hell were they thinking? The only difference is that by that point 3dfx was already seen as lacking, as Nvidia had been on top since the TNTs, and as I remember, the only ones who thought the V5 was going to put 3dfx back on top were the same fanboys (in some cases literally) who today blindly insist that the NV3x is better than the R3x0 and choose to ignore the facts because Nvidia could never mess up.

As far as the future is concerned, assuming the R420 and NV40 are an equal progression from the current-gen chips, and since PS3.0 and FP32 are not predicted until DX10, does anyone really think the NV40 will be a substantial enough break from the NV30 to put it ahead of the R420, seeing as neither will be a DX10 card?

Lastly, does anyone else find it ironic that the FX in GeForce FX supposedly comes from the fact that this is the first card to use 3dfx tech? Nvidia killed 3dfx, and now the GeForce FX is killing Nvidia. A few old 3dfxers have got to be laughing at this. :)
 
g__day said:
The trouble here is that if IHVs lie to OEMs and OEMs make any claim to the public relying on the IHV's information, then they carry liability for selling a product that doesn't deliver what is promised. Not too many OEMs say "Nvidia claims X"; they say the fastest box, or DX9 compliant, etc. So if they're lied to or fed partial truths, they have an increased exposure not of their making. This could lead to increased returns, or to becoming connected to an eventual class action against an IHV whose materials they used. Unless they themselves refute or omit the IHV's untrue claims, they are being made party to a risk they just don't want.

So, as a whole, investors, OEM distributors and consumers all have the power to say: play fair and honest. Maybe an industry watchdog will send them a friendly warning to play honestly.
I agree that the market redresses these issues in the long run. The IHVs pimp their wares, but the bottom line is that OEMs become complicit in the generation cycle for continued revenue. Market forces then dictate the deals signed. OEMs are "happier" in this round of negotiations. Things are rarely B&W, though...
 
IMHO nVidia have only really made one bad decision that has led to the current situation - they purchased the wrong company.

It's no secret that ATi would be up a certain creek without a paddle if not for ArtX - ATi today IS ArtX in everything but name, for all intents and purposes.

ArtX had its origins inside SGI, where most of nVidia's talent also originated. The potential threat should have been obvious.

Unfortunately it seems nVidia was blinded by its war with 3dfx (no real need there - 3dfx was doing a fine job of self-destructing all by itself, with no outside help required) and the GigaPixel TBR IP held by 3dfx (nice to have, I guess, but I prefer IP useful today rather than potentially useful in the future - especially considering the price tag and the lack of impact TBR has had thus far).
 
Who wrote that thing?


It takes the cake for ignorance. Nvidia put themselves in this situation and they and they alone will have to get out of it.
 
ArtX had its origins inside SGI, where most of nVidia's talent also originated.

It's been said to me that Dave Orton likes to occasionally rib David Kirk (they used to work with each other at SGI) about the fact that ATI currently has more ex-SGI employees than NVIDIA :!:
 
I'll just state what andypski did:

I believe that the point being highlighted was that, according to the article, nVidia was apparently taking 'great pains' to make sure Cg ran well on competitors' cards. Clearly if, in order to run well, it required us to write the whole back-end of the compiler, then that is hardly nVidia taking great pains - that seems to me to be them leaving the pains entirely up to us.

Instead of writing back-ends for an unnecessary and divisive additional shading language, we were busy concentrating our efforts on providing high-quality support for the two industry-standard high-level shading languages.

I don't see that you're making any relevant point here - perhaps instead of automatically repeating some tired, irrelevant rhetoric about ATI's lack of 'support' for Cg, you should read the thread more carefully.

radar1200gs said:
CorwinB said:
As ATi would be well aware, any hardware vendor is free to create their own Cg backend, and nVidia actively encourage this. Of course, ATi has never taken the time to actually do this, being far too busy slagging off Cg instead...

Sure. Nvidia encourages other hardware manufacturers to write back-ends for a language of which Nvidia controls all specifications. Upside of doing this when compared to using HLSL and GLSlang? None that I can think of.

To continue your reasoning, you could say that Nvidia was far too busy slagging off the R300 technology ("A 256-bit bus is overkill", "You can't build a true next-generation part on 0.15 microns", "I personally think 24 bits is the wrong answer") to build a competing part...

nVidia controlling the specifications for the Cg language makes no difference whatsoever. Every back-end implementation must successfully compile the Cg program handed to it in the first place, or it isn't doing its job properly... What the back-end does is allow the hardware vendor to optimise the output for their own architecture and take full advantage of the features found in that architecture.
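For the curious, here is roughly where a back-end plugs in, sketched against NVIDIA's public Cg runtime (the shader source and names below are invented for illustration; CG_PROFILE_ARBFP1 is the stock vendor-neutral ARB fragment profile, not a vendor-specific back-end):

// A rough sketch of where a Cg back-end fits: the front-end parses the
// common Cg source, and the profile argument selects the back-end that
// generates target-specific output.
#include <Cg/cg.h>
#include <cstdio>

static const char *kSource =
    "float4 main(float2 uv : TEXCOORD0,\n"
    "            uniform sampler2D baseMap) : COLOR\n"
    "{\n"
    "    return tex2D(baseMap, uv) * 0.75;\n"
    "}\n";

int main()
{
    CGcontext ctx = cgCreateContext();

    // A vendor writing its own back-end would expose its own profile
    // here in place of the generic ARB one.
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, kSource,
                                     CG_PROFILE_ARBFP1, "main", NULL);
    if (prog)
        printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));
    else
        printf("compile failed: %s\n", cgGetErrorString(cgGetError()));

    cgDestroyContext(ctx);
    return 0;
}

The profile argument is exactly what the argument is about: nVidia ships profiles for its own hardware plus the generic ARB/DX targets, and any vendor wanting better output on its parts has to supply its own.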

If ATi is unhappy with how Cg currently runs, they only have themselves to blame. nVidia is under no obligation to make their competitors look any better than they have to...
 
What exactly is it that the NV35, for example, is missing to be a DX9 card? I realize their cards may run at 4 fps, but that has little to do with whether it can be classified as a DX9-compliant card (although it has to do with its utility as one).
 
Re: DX History and Nvidia

Spam said:
I would like anyone who can add to the public record to post their information here. I emailed Josh Walrath and he quickly responded. He gave the following information:


These decisions not to attend were not documented by NVIDIA. I had a friend that attended many of the initial DX9 meetings that fleshed out the standard, and NVIDIA was absent. From all indications, this was a voluntary move by NVIDIA. I have just read a post from a former MS employee who worked with developers extensively, and he mentions that NVIDIA wasn't happy with many of the initial specifications of DX9, and wanted their own implemented. The person goes on to say that NVIDIA essentially tried to blackmail MS with Xbox chip allocations to try to get MS to change their minds on DX9. By the time the dust settled, DX9 was firmly entrenched, and NVIDIA already had the NV3x series of chips in advanced planning and engineering stages. In other words, NVIDIA followed their own ideal of DX9, but it turns out that much of that ideal was not implemented into DX9. So basically they had a part that did not match up well with the specification.

Microsoft legally would not be able to lock NVIDIA out of the DX9 discussions, because this could be labeled as anti-competitive behavior (and MS is very sensitive to that word now).

I think it is important to have a public record, and if you do too, could you add to what is already known?

Thanks
The info I heard was that nVidia was trying to get M$ to make DX9 FP32, and tried to pressure M$ by holding out Xbox chips on 'em... which M$ did NOT like and didn't play ball with.

nVidia kept themselves out of those DX9 talks and tried to force Cg onto the industry so they would have control over the standard, and lost big time. :)

phoenix666 said:
if the tables had been turned and the NV30 had been out first, and if Nvidia had not hyped it so much, no one would be bashing the 5600's performance, as it WOULD have been the fastest card out until the 9700 came out. Nvidia just made a few bad architectural decisions and had a $h|# load of bad luck.
No, it's actually been/being argued that nVidia would have fared even worse if they'd gotten the NV30 out on time. They wouldn't have had all that lead time to discredit 3DMark03, and the benchmark would have absolutely killed their card.
 
DaveBaumann said:
ArtX had its origins inside SGI, where most of nVidia's talent also originated.

It's been said to me that Dave Orton likes to occasionally rib David Kirk (they used to work with each other at SGI) about the fact that ATI currently has more ex-SGI employees than NVIDIA :!:

Hehe, good ole Orton ;)
I'm wondering how much of the Rampage core team went to ATI, though. I'd estimate that if NVIDIA got 50 of them (a fictitious number), ATI should have at least 20 of them, too...

I'm always amused to see how it's the 3dfx influence (4x2+4x0 supposedly being an idea from some ex-3dfxers) that will give NVIDIA the Doom 3 superiority in the NV3x vs. R3xx battle.

Too bad these same 3dfx employees seem to have assumed they were the only ones capable of delaying their products for more than 12 months :LOL:

If Doom 3 had been released in H2 2002 or H1 2003 as originally expected, the 5800 might have looked pretty good to the eye of "Mr Joe Consumer"...

---

BTW, regarding SGI: is it just me, or did those guys (the SGI employees working at NVIDIA) most likely help a lot with the GF3? NVIDIA's strategic alliance with SGI makes that likely.


Uttar
 
Uttar quoted this above: http://www.theinquirer.net/?article=7781
After a very short while, Intel and Nvidia both walked away from the whole thing and now we know why.

It seems that Microsoft wanted both companies to agree to reveal any patents they had relating to the technology the Vole was building. Once revealed, the Vole expected the companies to hand the patents over for good. Intel and Nvidia walked away. Only recently has it signed an altered agreement. That is why the GeForce FX does not follow the DirectX 9 specifications to the letter.

Another revelation that our friend managed to overhear was about the recent arbitration between Microsoft and Nvidia. We all know that the Vole was trying to get Nvidia to cut its prices but now we may know why. It was nothing to do with patents or Nvidia's ability to supply chips in volume. It was all to do with missed milestones and the chipset not performing as fast as Nvidia had promised. Microsoft is losing money at an incredible rate on Xbox and has been trying every trick in the book to get Nvidia to cut its prices.

The last thing our friend managed to eavesdrop on as he was packing up his kit was that Nvidia staff knew nothing about the GeForce FX being dropped.
That Inquirer quote makes no sense to me. The Vole?? I assume they mean Microsoft, but I've never heard that term. Also, why would they need Intel and Nvidia to reveal patents? Patents are public documents. And the GFFX was never dropped, so what is the last sentence talking about? Did they mean delayed instead of dropped?
 
3dcgi said:
That Inquirer quote makes no sense to me. The Vole?? I assume they mean Microsoft, but I've never heard that term. Also, why would they need Intel and Nvidia to reveal patents? Patents are public documents. And the GFFX was never dropped, so what is the last sentence talking about? Did they mean delayed instead of dropped?

The Vole is a nickname for Microsoft, possibly invented by The Inquirer themselves. It's similar to nicknames such as "Graphzilla" for NVIDIA or "Chipzilla" for Intel.

Why would they need these patents?
*cough* XBox 2 *cough*
Didn't you hear MS is simply designing with ATI IP? They're doing quite a bit of the designing themselves, I believe.
They already had a GPU patent dated 1998, as noted by The Inq in another article.

If Microsoft had managed to get these patents, their job for XBox 2 would have been greatly simplified - though of course not as much as an alliance with ATI does.
But the first time you design a chip, having an IP portfolio can help a lot, since it allows you to bypass a lot of "hey, we've got a patent on this, and it doesn't matter that there's no efficient way to do it that hasn't been patented!" BS.

I don't know if that stuff is true or not; I'd give it medium reliability - but it certainly makes more sense than most of the other ridiculous theories I've heard.


Also, in that context, The Inq was talking about the NV30 when saying "GeForce FX" - considering only 100,000 cards were made, of which 50,000 were for the workstation market.
I'd say it's relatively safe to say that 50,000 GeForce FX 5800s, compared to the production of Ti4600s for example, is a pretty small number - which is why "cancelled" is accurate IMO.


Uttar
 
Yes; The Vole == MS. That's their pet name for them, and they always refer to them as such.

I think they were referring to the 5800s. Technically it shipped in minuscule quantities, but for all intents and purposes it was dropped.
 
One of the Inquirer's schticks is nicknaming just about everything out there that's commonplace, tech-wise. To help reduce your confusion, you can keep this for handy reference. ;)
 
As if we don't already have enough acronyms in the technology industry, we have a site that has to give nicknames to everything. :rolleyes: At least acronyms can often be figured out without a dictionary.
 
Hi - first-time poster, longtime lurker (the usual song and dance ;) ).

radar1200gs said:
IMHO nVidia have only really made one bad decision that has led to the current situation - they purchased the wrong company.

It's no secret that ATi would be up a certain creek without a paddle if not for ArtX - ATi today IS ArtX in everything but name, for all intents and purposes.

^ I find this fascinating. I'm curious - how much of the R3x0 design would you say ATI owes to ArtX? If I remember correctly, ArtX was the company that won the Nintendo GameCube contract, which ATI was in the bidding for. Was the GameCube contract the only reason for ATI buying ArtX, or was it deeper than that?

Sorry if I'm steering this thread off topic. If it's any consolation, I have been along for all 10 pages of the ride, and it's been very interesting and informative! I've learned a lot about Cg and other Nvidia "projects" (i.e. arrogant and monopolistic decisions made in the last couple of years).
 
jiffylube1024 said:
^ I find this fascinating. I'm curious - how much of the R3x0 design would you say ATI owes to ArtX? If I remember correctly, ArtX was the company that won the Nintendo GameCube contract, which ATI was in the bidding for. Was the GameCube contract the only reason for ATI buying ArtX, or was it deeper than that?

It's not so much the design itself, but when Dave Orton of ArtX joined ATI in the role of running the company, he changed the aims, attitudes and expectations of ATI. He turned the company into something that was willing and able to jump from building average mass-market cards to the best graphics cards in the world, more than a year ahead of their competitors at Nvidia.
 