State of 3D Editorial

Discussion in 'Graphics and Semiconductor Industry' started by PaulS, Oct 30, 2003.

  1. ElMoIsEviL

    Newcomer

    Joined:
    Nov 3, 2003
    Messages:
    21
    Likes Received:
    0
    Location:
    Ottawa, Canada
    First of all, wassup? I'm new here... been reading up on stuff from this website for quite some time now... I'm from a tiny community over at x-3Dfx.com. :p

    Anyways...

    I see all of you getting worked up over nothing. In case you haven't heard, nVidia has just released an XT line of video cards... yes, "XT". It is a lower-end FX5600 using ATi's new naming scheme.

    It doesn't take a genius to figure out that nVidia is trying to confuse customers into buying their crappy cards over ATi cards. If you haven't read the market share reports, they go a lil like this...

    In the Total graphics card market the situation looks like this:
    Intel: 35%
    nVidia: 25%
    ATi: 22%
    VIA: 9%
    SiS: 8%
    Matrox: 1%

    In the work environment the situation looks like this:
    nVidia: 62% (Down 2%)
    ATi: 32% (up 4%)

    In the Laptop environment:
    ATi: 71% (up 3%)
    nVidia: 21% (up 2%)

    In the Total DX9 market share:
    nVidia: 72% (Thanks to the 5200)
    ATi: 27% (9200 is not a DX9 card although it outperforms the 5200).

    High Performance DX9 cards 9500 and up; 5600 and up (my area):
    ATi: 68%
    nVidia: 32%

    nVidia, not used to holding ONLY 32% of the high-performance GPU market, have released the FX5600 XT, which has a core clock of only 230MHz and a memory clock of 400MHz (200MHz DDR).
    Speculation is (rather, it is quite evident) that nVidia want to confuse customers who are looking to buy 9600 XTs into buying 5600 XTs... Wow... just change the 5 for a 9 and you've got the same name.

    As for driver cheating, I've exposed quite a few cheats myself and have seen them first-hand (I own a GeForceFX 5600 Ultra by Leadtek).

    As some of you pointed out the Quake/Quack cheat from ATi: they lowered mip-map levels to achieve better performance in Quake III... they were caught, promised never to do it again... and have not cheated since.

    nVidia have never admitted to cheating and continue to do so. They even went as far as telling consumers that they had implemented a system to regulate what was a cheat and what was an optimisation, only to turn around and cheat again. Now I've lost all my trilinear filtering in my D3D apps with my FX. Is it just me, or isn't trilinear filtering a VERY old method of filtering which is by now not a bonus but a given... EVERY card must do trilinear... and should be able to do so.
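    For anyone who hasn't thought about it in a while: trilinear filtering is just a linear blend between the bilinear samples taken from two adjacent mip levels, which is why losing it on modern hardware is so odd. A minimal single-channel sketch in C (the helper names here are my own for illustration, not any driver's API):

```c
#include <assert.h>

/* Linear interpolation between a and b by t in [0, 1]. */
static float lerp(float a, float b, float t) { return a + (b - a) * t; }

/* Bilinear filter of a 2x2 texel neighborhood: s00..s11 are the four
 * texels, (fx, fy) the fractional position between them. */
static float bilinear(float s00, float s10, float s01, float s11,
                      float fx, float fy) {
    return lerp(lerp(s00, s10, fx), lerp(s01, s11, fx), fy);
}

/* Trilinear filter: blend the bilinear results from two adjacent mip
 * levels by the fractional LOD.  Dropping this blend (i.e. snapping to
 * the nearest mip level) is what produces visible mip transition bands. */
static float trilinear(float mip0_sample, float mip1_sample, float lod_frac) {
    return lerp(mip0_sample, mip1_sample, lod_frac);
}
```

    The whole technique is three lerps per channel, which is why it has been a hardware given since the late 90s.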

    I am angry. I used to always buy the latest products from both ATi and nVidia, only to find myself no longer willing to purchase the crap coming out of nVidia.

    This puzzles me, as it has changed me into sort of a fanboy, I guess... because my constant search for the truth has given me a certain sense of pride in owning an ATi 3D accelerator... (since I keep revealing cheat after cheat after cheat in nVidia's drivers).

    To be fair, I have tested ATi drivers consistently as well... and have yet to find anything fishy.

    To be fair to all consumers around the world, I believe it's time for nVidia to close its GPU doors for good... it will only get worse.

    XGI is coming out with what might be a VERY competitive product, which may force nVidia to the number 3 spot (much the same fate 3Dfx faced when the GeForce2 GTS and the Radeon came out back in 2000).

    If you don't agree with what I say, then you're probably one of those people who hug their nVidia plush doll each night before going to bed... since from where I stand, this is AS brutally honest as one can get...

    Will I ever buy another nVidia product? HIGHLY unlikely...
    Do I think nVidia has a chance to build another market-leading product? Hell yes... it's nVidia.
    Then why won't I buy another nVidia card?

    Because they lied, cheated, blackmailed, hurt, and betrayed our community's trust. They completely and totally deceived us enthusiasts and gamers, and they don't seem to want to stop.

    Peace.
    ElMo
     
  2. PaulS

    Regular

    Joined:
    May 12, 2003
    Messages:
    481
    Likes Received:
    1
    Location:
    UK
    They won't.
     
  3. ElMoIsEviL

    Newcomer

    Joined:
    Nov 3, 2003
    Messages:
    21
    Likes Received:
    0
    Location:
    Ottawa, Canada
    How can you be so sure?
     
  4. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
    I just want to say (going back a way) that while what nVidia have done may well be the "nature of business", our not standing for it is the "nature of consumerism". Sure, nVidia may do X, Y and Z to improve sales, look good and impress the shareholders, but why on earth do we need to stand for it? Are reviewers working for nVidia? No, they're supposed to be guiding the consumer, so they should look at it from the consumer's perspective and unleash hell on any practice that will not benefit us.

    Helping the consumer is the point of reviewers. Without that, they may as well not exist.

    Now let us never hear how "it's all right because nVidia are a business" again. If their business is screwing the consumer, then I as a consumer intend not to stand for it... in this case my money has gone to ATI.
     
  5. ElMoIsEviL

    Newcomer

    Joined:
    Nov 3, 2003
    Messages:
    21
    Likes Received:
    0
    Location:
    Ottawa, Canada
    I agree... and it is with that point in mind that I stand behind what I said.
     
  6. YeuEmMaiMai

    Regular

    Joined:
    Sep 11, 2002
    Messages:
    579
    Likes Received:
    4
    Wasn't nVidia the one entity that was pushing FP32 over FP24 because of its superior IQ, and didn't they also claim that they could do FP32 FASTER than ATi could do FP24?

    I never liked any of nVidia's business tactics, and I went from an S3 Savage 2K to a Radeon LE and have never looked back.
     
  7. jiffylube1024

    Newcomer

    Joined:
    Nov 3, 2003
    Messages:
    4
    Likes Received:
    0
    Ah, I never knew about Dave Orton coming in from ArtX and running the company. The results, however, are night and day. ATI before that era (i.e. before the Radeon 9700 Pro) was a company without much of a focused direction, IMO. A good market follower, but never a leader. From the Rage II cards to the Rage Pro to the first two Radeon generations, ATI seemed good at releasing acceptable cards, but never seemed capable of being a market leader.

    However, since the Radeon 9700 Pro, they have done a lot to prove they are capable of being a market leader. They have been a good student of Nvidia's climb to the top, if you will ;). And Nvidia has succumbed to its own hubris (for the time being). Hopefully both companies reload for a competitive '04!
     
  8. jiffylube1024

    Newcomer

    Joined:
    Nov 3, 2003
    Messages:
    4
    Likes Received:
    0
    I am very doubtful that XGI will be competitive with ATI and Nvidia when they release their Volari lineup. If XGI is serious about competing in the video card market, then I can see them possibly having a competitive part in a generation or two, but for now they're definitely on a prove-it-to-me basis.

    First of all, their cards are all numbers and paper specs right now. At least ATI and Nvidia have not only working silicon out there, but competitive parts for sale.

    What makes XGI so different from ATI in the original Radeon days and before — i.e. a half-step behind? XGI's dual-GPU Volari looks good on paper, but there are chinks in the armor. Their alleged 5500 3DMark 2k3 score is already behind ATI's 9800XT.

    Second, how can you be so sure XGI's drivers will be all that when they first come out? It takes a long time to get truly reliable, stable, compatible drivers for a series of video boards.

    Third, I think you are seriously overreacting if you believe Nvidia is dead in the water. Not only are they in a much *much* stronger financial position than 3dfx ever was, but they have a much more diverse market. They have an excellent position in the AMD motherboard sector as well.

    Moreover, we don't know how NV4x and R4x0 will perform, and it's all up in the air right now. Let's wait and see how NV4x does before we decide that XGI (of all companies!) will overtake them!
     
  9. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    I haven't seen much of interest in nVidia's whitepapers all year long--at least information that I judged to be sufficiently free of biased propaganda to the degree that I was tempted to seriously consider it. Too much of nVidia's "whitepaper" jargon seems little more than convenient marketing PR written to confuse issues which are, at their essence, pretty darn simple. Compared to R3x0, nV3x's shader hardware engine is lacking a lot.

    It matters little, actually, what software "optimizations" nVidia undertakes with nV3x, or where they undertake them; it's simply not enough to overcome the disparity between the R3x0 and nV3x architectures at the hardware level. To wit, this is the fact that nVidia wishes to obscure and confuse, if possible, and this is the intent of the "whitepapers" the company creates. There is no real intent to "educate" anybody--the intent here is to explain away their performance deficits to anyone who is willing to listen to any excuse they can manufacture. The only thing surprising to me about it this year has been the people more than willing to lend such empty excuses an authority and weight which grossly exceed any possible "educational" value they might ever have. I mean, I find it odd in the extreme that people trumpet nVidia's latest tactic--that of blaming the deficit on poor compiler optimization (never mind that it was entirely nVidia's fault it was "poor" to begin with)--and speak of the fact that nV3x shader engine performance is now only 200%-300% behind R3x0, when it was 300%-500% behind, as though this were some kind of validation of "what nVidia's been saying." Despite a solid year's worth of "optimizations" of all descriptions, nV3x still plays second string to R3x0. That is about the only solid fact that nVidia has "validated" for me this year.

    The fact is that unless nVidia starts publishing whitepapers which simply state that nV3x is not competitive with R3x0 from a purely architectural standpoint, nVidia simply has nothing to say that is worth reading, as what the company does manage to say is nothing but a "blame game" in which everybody is at fault--except nVidia. I think someone would have to be pretty thick to accept nVidia's whitepapers as being more than utterly self-serving and defensive.

    And that's why ATi has needed to say very little about its optimizations, compared to the reams and reams of virtually useless and apologetic info coming from nVidia with the intent to "clarify" that their performance deficit is due to software issues (which are moving targets and change constantly to fit the situation nVidia needs to "explain" away), instead of exposing that the root of nVidia's current performance deficits (and the deficits it has suffered all year) is the inferiority of its nV3x architecture when compared to R3x0. A factual "whitepaper" would explain why this is so...:)

    In short, is it really "helpful" to anyone to know that nVidia has an "answer" to explain nV3x shortcomings? Such answers, in that they provide no relief of any kind, cannot possibly be helpful, it seems to me. Rather, I think what is important for everyone to realize and accept is that so long as nV3x remains the "best" that nVidia can field in competition to R3x0, nothing is going to fundamentally change or shift, regardless of how many "whitepapers" nVidia publishes. Conversely, an increase in whitepapers from ATi will serve to broaden their lead not one little bit. What nVidia needs is a competitive architecture--not further whitepapers, IMO. I know that some willingly confuse the "world as it ought to be according to nVidia," with the world as it is, but fortunately, I am not one of them...:D
     
  10. hoom

    Veteran

    Joined:
    Sep 23, 2003
    Messages:
    2,908
    Likes Received:
    477
    Let's not forget that there is an NV3x path built into Doom3, with all the technical quality compromises which that entails:
    texture lookups where ATI does mathematically correct calculations (which Carmack has stated to be clearly better than texture lookups), built-in lower precision where ATI is doing FP24 (and again, Carmack has said the higher precision looks better, if only slightly), and probably others, though those two are bad enough.
    And that's without even considering what the drivers will do to it...

    I just refuse to buy the idea that GeForce FXes will really be beating ATI at Doom3.
    Maybe they'll get a few more fps, but Carmack himself has stated that the FX is doing less, so who cares if NV3x gets those few fps more?

    It's like if Valve cut down the effects load in the mixed mode of HL2 enough that the FX beat ATI in fps.
    Who cares, when we all know that the FX is doing less work and the ATI card renders it better?

    I guess the answer is: Joe Consumer who doesn't realise that the FX is doing less work at lower quality... :roll:


    Anyways, if only there had been threads like this (as in, this much known for sure, rather than pure conjecture plus little or no proof of the relative performance) way back when the FX line first came out and I was trying to figure out whether to buy ATI or NV...
    It would have saved me all those months of reading between the lines of reviews, public statements, etc. before I was sure that the R300 was an utterly fantastic chip and NV was in trouble.

    The best thing about the r300 is that the design of the pipeline is so elegant that ATI had enough room within their transistor budget to go brute force too.

    We've seen that the 4-pixel-pipeline 9600 Pro, even running at a lower clock speed than the 5900 Ultra, can beat that card in the real DX9, shader-heavy situations where the elegance of the pipeline design should be the main factor.
    This proves outright that the ATI pipeline is the more elegant of the two.
     
  11. stevem

    Regular

    Joined:
    Feb 11, 2002
    Messages:
    632
    Likes Received:
    3
    As has been mentioned many times, a key determinant of D3 performance will be accelerating shadow volumes. Nvidia have incorporated Carmack's zpass/zfail stenciled shadow volume techniques by including 2-sided stencil testing & depth clamping in HW.
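    For context, the z-fail ("Carmack's reverse") counting that two-sided stencil accelerates can be sketched in software roughly like this — a simplified, hypothetical model of the per-pixel logic, not actual driver or engine code:

```c
#include <assert.h>

/* One shadow-volume face fragment covering the pixel under test. */
typedef struct {
    float depth;        /* fragment depth (larger = farther) */
    int   front_facing; /* 1 = front face, 0 = back face */
} VolumeFrag;

/* Z-fail counting: increment the stencil for back faces that fail the
 * depth test, decrement it for front faces that fail.  A nonzero count
 * means the visible surface at this pixel lies inside a shadow volume.
 * Two-sided stencil hardware performs both the increment and decrement
 * passes in a single pass over the volume geometry, which is the
 * acceleration being discussed here. */
static int in_shadow_zfail(const VolumeFrag *frags, int n, float scene_depth) {
    int stencil = 0;
    for (int i = 0; i < n; ++i) {
        int depth_fail = frags[i].depth > scene_depth; /* behind the scene */
        if (depth_fail)
            stencil += frags[i].front_facing ? -1 : +1;
    }
    return stencil != 0;
}
```

    Without two-sided stencil, the same result needs two geometry passes (one culling front faces, one culling back faces), so the hardware feature roughly halves the vertex work for the shadow pass.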
     
  12. demalion

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,024
    Likes Received:
    1
    Location:
    CT
    I have trouble seeing the association between what you talk about and what Dave said, WaltC. It is demonstrable that there is a tangible result of the lack of specific information released by ATI and the relative abundance of specific information (however PR-centric and technically distorted) released by nVidia, and that is what Dave is talking about... your discussion of your personal take on whether such information is necessary for you doesn't say anything that changes that, or seem to succeed in making your opinion applicable to what Dave was addressing, AFAICS. He wasn't talking about your reaction to such information.

    On that note, stevem's comment on 2-sided stencil seems to underscore what Wavey is saying fairly well.
     
  13. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    Double post, sorry. See below.
     
  14. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    Helloooo, I never said NVIDIA hardware was better than ATI's.

    I said it looked better in Doom 3 to the eyes of Joe Consumer. That inclusion of "Joe Consumer" also meant I disregarded ALL IQ issues besides major ones, which simply do not exist in Doom 3 for NVIDIA hardware, except in fanatics' fantasies.
    Even Carmack said the difference wouldn't be visible unless you tried to find it - so if you want to quote your hero, at least quote him right ;)

    Without wanting to be harsh, although I'm sincerely tempted to be due to your nickname, please do not bash people's opinions just because it's "fun". Thank you.


    Uttar

    EDIT: At Doom3's time of release, I doubt the advantage will be anywhere near as big as NV30/R300 or NV35/R350. Which is why NVIDIA's bet hasn't paid off at all.

    P.S.: ULE ETA: 48 hours.
     
  15. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,123
    Likes Received:
    1,654
    Location:
    Winfield, IN USA
    Really? KEWL!!! :D
     
  16. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    ATI has the same in the R3x0 and RV3x0 parts.
     
  17. [maven]

    Regular

    Joined:
    Apr 3, 2003
    Messages:
    645
    Likes Received:
    16
    Location:
    DE
    Depth Clamping too? I was only aware of two-sided stencil...
     
  18. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,298
    Likes Received:
    137
    Location:
    On the path to wisdom
    Maybe stevem was thinking of the depth bounds test (UltraShadow) instead of depth clamp. Which is, btw, only supported by NV35+.
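    The depth bounds test is conceptually tiny: skip all stencil work for pixels whose already-stored depth cannot possibly fall within the range a given light affects. A hedged C sketch of the per-pixel check (function and parameter names are mine, for illustration only):

```c
#include <assert.h>

/* Depth bounds test as exposed by NV_depth_bounds_test ("UltraShadow"):
 * before any stencil update for a shadow-volume fragment, reject it if
 * the depth already stored in the framebuffer at that pixel lies outside
 * [zmin, zmax] -- the depth range the light can possibly reach.  Note it
 * tests the STORED depth, not the incoming fragment's depth. */
static int depth_bounds_pass(float stored_depth, float zmin, float zmax) {
    return stored_depth >= zmin && stored_depth <= zmax;
}
```

    The win is that entire screen regions provably outside a light's reach never touch the stencil buffer at all, which is why it only matters for stencil-shadow-heavy workloads like Doom3.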
     
  19. ElMoIsEviL

    Newcomer

    Joined:
    Nov 3, 2003
    Messages:
    21
    Likes Received:
    0
    Location:
    Ottawa, Canada
    Oh, I do not believe I am at all overreacting, as I never stated that nVidia were, as you say, "dead in the water".

    I am simply mixing a bit of philosophy, as well as laws of nature, into these current-day events.
    What I mean is, "they" say that history repeats itself. And by all means it is doing so, ever since NV3x was pitted against R3xx. The entire chain of events which led up to these two pieces of silicon, as well as the events we see today, can ALMOST all be matched up to events we saw between nVidia and 3Dfx.

    I could possibly lay out my analogy (careful with analogies, though).


    Back in the day, nVidia released a card which we all know as the GeForce 256. This was the first true GPU. Their main competitor, 3Dfx, was still selling their aging Voodoo3 lineup, which was completely outmatched by the GeForce.
    Soon after, 3Dfx announced the VSA-100 (Voodoo4 4500, Voodoo5 5500 and Voodoo5 6000). When they became publicly available, nVidia counterattacked with the release of the GeForce2 GTS... which again completely outmatched the VSA-100 and, at the time, 3Dfx's best offering, the Voodoo5 5500 (the 6000 was never publicly released).

    3Dfx soon after (a little under a year) went out of business because they did not have the financial backbone to sustain the company (among other business practices).

    History repeats itself today.

    ATi released the R300 (Radeon 9700 Pro), which completely outmatched nVidia's aging GeForce4 Ti lineup. When nVidia made available its counterattack in the form of the GeForceFX 5800 Ultra, ATi counterattacked with the Radeon 9800 Pro... the rest, well, right now, is history.

    nVidia won't go bankrupt, I never said that... but to assume that they will regain the position of undisputed power they once held is illogical.

    Logic suggests that they may become the market leader once more, wearing the performance crown, within the next 2 product cycles... and their next product might VERY well come close to ATi's upcoming offering, the R420.

    Add to this the hatred that has sprung up against nVidia as they continue to lie to and cheat their customers, and things do not bode well for the once-mighty king.

    Only time will tell... ironically, I said these VERY same words when things looked grim for 3Dfx.
     
  20. Althornin

    Althornin Senior Lurker
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,326
    Likes Received:
    5
    elmo, pls check pm's
     