State of 3D Editorial

Discussion in 'Graphics and Semiconductor Industry' started by PaulS, Oct 30, 2003.

  1. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
To continue the theme of my last post, I would like to add that in the future it will be even more important to have reviewers see, understand and appreciate the differences in image quality. We are getting to the point of diminishing returns as far as image quality goes. If a reviewer is not aware of these differences and the processing involved in producing them, we will be left with simple FPS shoot-outs, which in turn will probably lead to image quality being shortchanged in order to win these types of battles. IMO
     
  2. g__day

    Regular

    Joined:
    Jun 22, 2002
    Messages:
    580
    Likes Received:
    1
    Location:
    Sydney Australia
The trouble here is that if IHVs lie to OEMs, and OEMs make any claim to the public relying on the IHV's information, then they carry liability for selling a product that doesn't deliver what is promised. Not too many OEMs say "Nvidia claims X"; they say "the fastest box" or "DX9 compliant", etc. So if they're lied to or fed partial truths, they have an increased exposure not of their making. This could lead to increased returns, or to becoming connected to an eventual class action against an IHV whose materials they used. Unless they themselves refute or omit the IHV's untrue claims, they are being made party to a risk they just don't want.

So, as a whole, investors, OEM distributors and consumers all have the power to say: play fair and honest. Maybe an industry watchdog will send them a friendly warning to play honestly.
     
  3. Spam

    Newcomer

    Joined:
    Nov 2, 2003
    Messages:
    2
    Likes Received:
    0
    DX History and Nvidia

I have enjoyed reading and lurking on this site for a while now, but now I have a question regarding Nvidia's opting out of early DX9 development. It is a critical piece of history, and I have not heard or seen a lot of corroborative information or public record. I have heard it said by some revisionists that Nvidia was frozen out of early DX9 development. I would like anyone who can add to the public record to post their information here. I emailed Josh Walrath and he quickly responded with the following:


    [These decisions not to attend were not documented by NVIDIA. I had a friend that attended many of the initial DX9 meetings that fleshed out the standard, and NVIDIA was absent. From all indications, this was a voluntary move by NVIDIA. I have just read a post from a former MS employee who worked with developers extensively, and he mentions that NVIDIA wasn't happy with many of the initial specifications of DX9, and wanted their own implemented. The person goes on to say that NVIDIA essentially tried to blackmail MS with Xbox chip allocations to try to get MS to change their minds on DX9. By the time the dust settled, DX9 was firmly entrenched, and NVIDIA already had the NV3x series of chips in advanced planning and engineering stages. In other words, NVIDIA followed their own ideal of DX9, but it turns out that much of that ideal was not implemented into DX9. So basically they had a part that did not match up well with the specification.

    Microsoft legally would not be able to lock NVIDIA out of the DX9 discussions, because this could be labeled as anti-competitive behavior (and MS is very sensitive to that word now).]

    I think it is important to have a public record, so if you have anything to add to what is already known, could you post it here?

    Thanks
     
  4. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    Good pay, I guess. Plus, for engineers, the possibility of access to absolutely ridiculous budgets - AFAIK, NVIDIA's emulation infrastructure, as well as some other things, is still beyond what ATI can offer its engineers.

    Brian Burke? Nah, I believe Dan Vivoli is most likely worse ;) Not that I ever had the "pleasure" of talking with someone like him :lol:


    ---

    Spam: I personally think The Inq's explanation for that makes a lot of sense, although it's hard to prove:
    http://www.theinquirer.net/?article=7781
    That last sentence, I put just because I thought it was a splendid example of how even in early 2003, nobody knew what was going on at nV.


    Uttar
     
  5. phoenix666

    Newcomer

    Joined:
    Nov 2, 2003
    Messages:
    2
    Likes Received:
    0
    Hello, this will be my first post here. While I normally prefer to just lurk and keep to myself, there are a few things I want to say as someone who is nothing more than a computer enthusiast. Oh, and sorry about the ranting ahead of time.

    First, I think Josh deserves a lot of credit. I know that if I wrote something with as many inaccuracies as his article apparently has, I'd be too embarrassed to ever post under the same handle again. I think it takes some balls not only to admit his mistakes, but then to ask for help fixing them from the very same place that was trashing his work.

    Second, on this talk about whether the NV34 is a "DX whatever" card: I think the answer is pretty simple. For a part to be a DX-whatever card, it must at minimum meet the DX-whatever specs and be capable of running the code with no alterations; any additional abilities are nice, but have no impact on whether it is or is not a DX-whatever part. The R3X0 meets these requirements, so it is a DX9 card. The NV34, however, does NOT meet these requirements and therefore is not a DX9 card. In fact, as far as I can tell, Nvidia does not have a DX9 card at all, since none of the NV3X line meets these requirements.

    Third, if the tables had been turned and the NV30 had been out first, and if Nvidia had not hyped it so much, no one would be bashing the 5600's performance, as it WOULD have been the fastest card out until the 9700 came along. Nvidia just made a few bad architectural decisions and had a $h|# load of bad luck. This has happened before and it will happen again. Anyone remember the Voodoo 5? No hardware T&L? What the hell were they thinking? The only difference is that by that point 3dfx was already seen as lacking, as Nvidia had been on top since the TNTs, and as I remember it, the only ones who thought the V5 was going to put 3dfx back on top were the same fanboys (in some cases literally) who today blindly insist that the NV3X is better than the R3X0 and choose to ignore the facts, because Nvidia could never mess up.

    As far as the future is concerned: assuming the R420 and NV40 are an equal progression from the current-gen chips, and since PS3.0 and FP32 are not predicted until DX10, does anyone really think the NV40 will be a substantial enough break from the NV30 to put it ahead of the R420, seeing as neither will be a DX10 card?

    Lastly, does anyone else find it ironic that the FX in GeForce FX supposedly comes from the fact that this is the first card to use 3dfx tech? Nvidia killed 3dfx, and now the GeForce FX is killing Nvidia. A few old 3dfxers have got to be laughing at this. :)
     
  6. stevem

    Regular

    Joined:
    Feb 11, 2002
    Messages:
    632
    Likes Received:
    3
    I agree that the market redresses these issues in the long run. The IHVs pimp their wares, but the bottom line is that OEMs become complicit in the generation cycle for continued revenue. Market forces then dictate the deals signed. OEMs are "happier" in this round of negotiations. Things are rarely B&W, though...
     
  7. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    IMHO nVidia have only really made one bad decision that has led to the current situation - they purchased the wrong company.

    It's no secret that ATi would be up a certain creek without a paddle if not for ArtX - ATi today IS ArtX in everything but name, for all intents and purposes.

    ArtX had its origins inside SGI, where most of nVidia's talent also originated. The potential threat should have been obvious.

    Unfortunately, it seems nVidia was blinded by its war with 3dfx (no real need there - 3dfx was doing a fine job of self-destructing all by itself, with no outside help required) and by the GigaPixel TBR IP held by 3dfx (nice to have, I guess, but I prefer IP useful today rather than potentially useful in the future - especially considering the price tag and the lack of impact TBR has had thus far).
     
  8. YeuEmMaiMai

    Regular

    Joined:
    Sep 11, 2002
    Messages:
    579
    Likes Received:
    4
    who wrote that thing?


    It takes the cake for ignorance. Nvidia put themselves in this situation and they and they alone will have to get out of it.
     
  9. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,079
    Likes Received:
    648
    Location:
    O Canada!
    It's been said to me that Dave Orton likes to occasionally rib David Kirk (they used to work with each other at SGI) when they are talking, about the fact that ATI currently has more ex-SGI employees than NVIDIA :!:
     
  10. YeuEmMaiMai

    Regular

    Joined:
    Sep 11, 2002
    Messages:
    579
    Likes Received:
    4
    I'll just state what andypski did:

    I believe that the point being highlighted was that according to the article nVidia was apparently taking 'great pains' to make sure CG ran well on competitors' cards. Clearly if, in order to run well, it required us to write the whole back-end of the compiler then that is hardly nVidia taking great pains - that seems to me to be them leaving the pains entirely up to us.

    Instead of writing back-ends for an unnecessary and divisive additional shading language we were busy concentrating our efforts on providing high-quality support for the two industry-standard high level shading languages.

    I don't see that you're making any relevant point here - perhaps instead of automatically repeating some tired, irrelevant rhetoric about ATI's lack of 'support' for CG you should instead read the thread more carefully.

     
  11. Sxotty

    Veteran

    Joined:
    Dec 11, 2002
    Messages:
    4,840
    Likes Received:
    303
    Location:
    PA USA
    What exactly is it that the NV35, for example, is missing to be a DX9 card? I realize their cards may run at 4 fps, but that has little to do with whether it can be classified as a DX9-compliant card (although it has to do with its utility as one).
     
  12. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,124
    Likes Received:
    1,655
    Location:
    Winfield, IN USA
    Re: DX History and Nvidia

    The info I heard was that nVidia was trying to get M$ to make DX9 FP32, and tried to pressure M$ by holding out Xbox chips on 'em... which M$ did NOT like and didn't play ball with.

    nVidia kept themselves out of those dx9 talks and tried to force CG onto the industry so they would have control over the standard, and lost bigtime. :)

    No, it's actually been/being argued that nVidia would have fared even worse if they'd gotten the NV30 out on time. They wouldn't have had all that lead time to discredit 3DMark03, and the benchmark would have absolutely killed their card.
     
  13. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    Hehe, good ole Orton ;)
    I'm wondering how much of the Rampage core team went to ATI, though. I'd estimate that if NVIDIA got 50 of them (fictitious number), ATI should have at least 20 of them, too...

    I'm always amused to see how it's the 3dfx influence (4x2+4x0 supposedly being an idea from some ex-3dfxers) that will give NVIDIA the Doom 3 superiority in the NV3x vs. R3xx battle.

    Too bad these same 3dfx employees seem to have assumed they were the only ones capable of delaying their products by more than 12 months :lol:

    If Doom 3 had been released in H2 2002 or H1 2003 as originally expected, the 5800 might have looked pretty good to the eye of "Mr Joe Consumer"...

    ---

    BTW, regarding SGI: is it just me, or did those guys (the SGI employees working at NVIDIA) most likely help a lot with the GF3? NVIDIA's strategic alliance with SGI makes that likely.


    Uttar
     
  14. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,432
    Likes Received:
    261
    Uttar quoted this above. http://www.theinquirer.net/?article=7781
    That Inquirer quote makes no sense to me. The Vole?? I assume they mean Microsoft, but I've never heard that term. Also, why would they need Intel and Nvidia to reveal patents? Patents are public documents. And GFFX was never dropped so what is the last sentence talking about. Did they mean delayed instead of dropped?
     
  15. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    The Vole is The Inquirer's nickname for Microsoft, similar to nicknames such as "Graphzilla" for NVIDIA or "Chipzilla" for Intel.

    Why would they need these patents?
    *cough* XBox 2 *cough*
    Didn't you hear? MS isn't simply using ATI IP; they're doing quite a bit of designing themselves, I believe.
    They already had a GPU patent dated 1998 as noted by The Inq in another article.

    If Microsoft had managed to get these patents, their job on XBox 2 would have been greatly simplified - of course, not as much as by doing an alliance with ATI.
    But the first time you design a chip, having an IP portfolio can help a lot, since it allows you to bypass a lot of "hey, we got a patent on this, and it doesn't matter that there's no efficient way to do it that hasn't been patented!" BS.

    I don't know if that stuff is true or not, I'd give it medium reliability - but it certainly makes more sense than most of the other ridiculous theories I've heard.


    Also, in that context, The Inq was talking about the NV30 when saying "GeForce FX" - considering only 100,000 cards were made, of which 50,000 were for the workstation market.
    I'd say it's relatively safe to say that 50,000 GeForce FX 5800s, compared to the production run of the Ti4600 for example, is a pretty small number. Which is why "cancelled" is accurate, IMO.


    Uttar
     
  16. BRiT

    BRiT (╯°□°)╯
    Moderator Legend Alpha Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    12,254
    Likes Received:
    8,446
    Location:
    Cleveland
    Yes; The Vole == MS. That's their pet name for them, and they always refer to them as such.

    I think they were referring to the 5800s. Technically it shipped in minuscule quantities, but for all intents and purposes it was dropped.
     
  17. cthellis42

    cthellis42 Hoopy Frood
    Legend

    Joined:
    Jun 15, 2003
    Messages:
    5,890
    Likes Received:
    33
    Location:
    Out of my gourd
    One of the Inquirer's schticks is nicknaming just about everything out there that's commonplace, tech-wise. To help reduce your confusion, you can keep this for handy reference. ;)
     
  18. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,432
    Likes Received:
    261
    As if we don't already have enough acronyms in the technology industry, we have a site that has to give nicknames to everything. :roll: At least acronyms can often be figured out without a dictionary.
     
  19. jiffylube1024

    Newcomer

    Joined:
    Nov 3, 2003
    Messages:
    4
    Likes Received:
    0
    Hi - first time poster, longtime lurker (the usual song and dance ;) ).

    ^ I find this fascinating. I'm curious: how much of the R3x0 design would you say ATI owes to ArtX? If I remember correctly, ArtX was the company that won the Nintendo GameCube contract, which ATI was also bidding for. Was the GameCube contract the only reason ATI bought ArtX, or was it deeper than that?

    Sorry if I'm steering this thread off topic. If it's any consolation, I have been along for all 10 pages of the ride, and it's been very interesting and informative! I've learned a lot about Cg and other Nvidia "projects" (i.e. arrogant and monopolistic decisions made in the last couple of years).
     
  20. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    It's not so much the design itself. When Dave Orton of ArtX joined ATI in the role of running the company, he changed the aims, attitudes and expectations of ATI. He turned the company around into something willing and able to jump from building average mass-market cards to building the best graphics cards in the world, more than a year ahead of their competitors at Nvidia.
     