nVidia slowing down...

Discussion in 'General 3D Technology' started by Joe DeFuria, Dec 17, 2002.

  1. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,420
    Likes Received:
    179
    Location:
    Chania
    Simon,

    Poison ivy? ;)

    Rev,

    If one were to sit down and categorize individuals and their preferences, the list would be long. Keep in mind that the vast majority doesn't care enough about things like that to even become vocal about them on message boards.

    As far as marketing tactics or PR are concerned, I never liked them from any company or IHV out there; as a consumer I can conduct my own research and decide which product, at a given point in time, suits my needs best.

    I used to have a preference for 3dfx up to the V3 in the good ole days too, yet I always tried to acknowledge when there was a better product out there, no matter the brand.

    I've always had a weakness for TBDRs, but sadly there's only one company left pursuing the approach, and it has released only budget products once in a blue moon. Were there at least mainstream or high-end tilers from two or more IHVs on shelves right now, I'd be crazy enough to try them all. My doctor says I'm a hopeless case :eek:

    PS:

    Himself,

    Nice post; I agree with most of it.
     
  2. demalion

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,024
    Likes Received:
    1
    Location:
    CT
    Similar to what I've said before (I think in response to the observation that monitor sizes don't have much further to go, and that there is therefore an effective cap on maximum resolutions), I just think we will skew away from feature creep toward pure performance and cost enhancement. At worst, the GPU market will shift to behavior more like the CPU market's, with only the performance of parts determining pricing tiers (unlike the CPU world, it already de-emphasizes clock speed in favor of actual performance).

    I think nVidia and ATI are more successful than they're credited with in encouraging developers to utilize features that will allow this type of progress, and that the obstacles between "DX 9" and "DX 8" feature utilization are much, much smaller than many proclaim (i.e., much smaller than the obstacles between "DX 8" and "DX 7"), and only shrinking over time (in a very short amount of time, I'd say...).

    I think the capabilities of the current and imminent "staple" engines (those to be licensed for future games) support this. The announced shader-utilizing games (such as the next batch of MMORPGs and, naturally, console ports/co-developments, which I think will have an increasing impact on PC game advancement) illustrate a likely successful effort to fuel it, and the tools currently being deployed (HLSLs) leave plenty of headroom to facilitate this type of transition fairly easily for at least the next 5 years. I'd even guess up to 10 years, before considering how new system bus/architecture and CPU adoption might affect the demand for new products.

    I think ATI's strategy, and success, with the 9500-and-above series cards has already demonstrated the feasibility of this shift. While "DX 10" will likely not matter much at all to the consumer compared to DX 9, and even more so DX 8, with the development tools successfully deployed the major benefits of that functionality will be offered almost transparently and instantaneously (i.e., the barrier to acceptance that many proclaim threatens a slowdown in advancement and new products will effectively no longer be there).

    I think nVidia will have no choice but to match ATI's commitment to this (i.e., rapidly shifting DX 9 support to low-end parts), and indeed has already indicated its intent, since this transition is taking place at a time when the underlying support for it is being offered hand in hand with performance improvements even for "DX 7" titles, for at least the next year or two.

    I think nVidia would prefer not to sprint down this path, as they've stumbled at the starting block, but I still think they are capable of a sprint, and with a competitor already holding a lead they'll have no choice. I think comments indicating they are "slowing down" are merely the start of "spinning" the perception of their stumble at the start into something intentional and purposeful, while they are still perceived as the market leader and trend-setter.

    Of course, reality has no obligation to agree with me, but I think all of the above are reasonable statements. :p
     
  3. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    1
    Location:
    Canada
    Reverend, I still use a GeForce 256 DDR. I have few problems with it currently, but when I first received the card there were frequent driver problems, and they continued for the better part of a year until nVidia finally fixed them. So in my case the situation is quite the opposite of what you have described.
     
  4. Sxotty

    Veteran

    Joined:
    Dec 11, 2002
    Messages:
    4,895
    Likes Received:
    344
    Location:
    PA USA
    High 5, Crusher!

    LOL, I have to say he is right about the VW Beetle and so many other things in our silly society.
     
  5. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,420
    Likes Received:
    179
    Location:
    Chania
    The parallel between computers and VW Beetles would only hold if the PC hardware consisted of rare or unique components; something like a collector's item, in a sense.

    Typical old hardware can be found either for free or dirt cheap, while the Beetle is quite expensive.
     
  6. Crusher

    Crusher Aptitudinal Constituent
    Regular

    Joined:
    Mar 16, 2002
    Messages:
    869
    Likes Received:
    19
    Usual old hardware doesn't consist of desknotes and flat screen monitors.
     
  7. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,420
    Likes Received:
    179
    Location:
    Chania
    You can also take the back seats out of a Beetle and put in a Porsche engine; after how many miles you'll get killed is irrelevant. :D

    I can see your point, yet the way I see it, people who upgrade only once in a blue moon don't hold off because their 100MHz CPU is an antique or a collector's item.
     
  8. megadrive0088

    Regular

    Joined:
    Jul 23, 2002
    Messages:
    700
    Likes Received:
    0
    This can mean so many different things, and there are so many possibilities, that it would be impossible to discuss them all here. It's obvious, though, that Nvidia has already slowed down. They've been slowing the introduction of new chip generations ever since the delay of NV20 in fall 2000 and the release of the GF2 Ultra. Nvidia has been using speed bumps (GF2 Ultra, GF3 Ti500) and tweaks (NV28) instead of releasing refreshes (NV25) six months after every new architecture; they've been spreading out their chips. Maybe this will only increase. It makes sense, though, with software not taking advantage of even the last 3 or 4 cycles of video cards.

    If indeed NV40 is not out until late 2004 or early 2005, then XBox 2 would use some NV4X variant.
     
  9. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    I think it's silly to overestimate them, frankly. The fact is that despite nVidia's business "acumen" the company still missed an entire product cycle--just as 3dfx did. nVidia is also faced with a strong competitor with a lot of clout and an excellent architecture--just as 3dfx was faced with in the form of nVidia. So I think there are plenty of meaningful comparisons to draw here. True, nVidia is delving into areas that 3dfx never came close to--like core-logic chipsets--and right now I'm really enjoying my nForce2 motherboard running my Radeon 9700P...;) But to be honest, nVidia's graphics core sustains everything else from a fiscal perspective. If nVidia loses a lot of ground in the graphics field, the rest won't matter.

    Here's the thing that's so often overlooked--nv30 is a *major* departure from nVidia's track over the last three to five years. Essentially, nVidia is where it is today because of the TNT architecture, which was refined for TNT2 and essentially doubled for GF1; all subsequent versions have rested on the same architecture, clocked up as far as the technical prowess of fabs like TSMC would permit (with differing versions gaining performance-enhancing features made possible by process improvements). That's why, after all these years, nVidia is still running a 128-bit bus--they depend on the RAM manufacturers to supply them with their bandwidth. Up to now this system has worked very well for them.
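
    To put the bus-width argument in rough numbers (a back-of-the-envelope sketch; the clocks below are approximate figures for an R300-class and an NV30-class board, not official specs):

        # Peak memory bandwidth ~= (bus width in bytes) x (effective transfer rate).
        # Clock figures are approximations for illustration only.
        def peak_bandwidth_gbs(bus_width_bits, mem_clock_mhz, ddr_factor=2):
            """Peak bandwidth in GB/s; DDR transfers data twice per clock."""
            bytes_per_transfer = bus_width_bits / 8
            transfers_per_sec = mem_clock_mhz * 1e6 * ddr_factor
            return bytes_per_transfer * transfers_per_sec / 1e9

        print(peak_bandwidth_gbs(256, 310))  # 256-bit bus, ~310MHz DDR: ~19.8 GB/s
        print(peak_bandwidth_gbs(128, 500))  # 128-bit bus, ~500MHz DDR: ~16.0 GB/s

    In other words, a 128-bit part can only keep pace by buying ever-faster RAM from the memory vendors, while a 256-bit bus gets comparable bandwidth from slower, commodity parts.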

    But starting now--post-R300 and DX9, pre-nv30 and OpenGL 2.0--all bets are off. Very little of that counts anymore in the sense that it counted when nVidia was facing down 3dfx. And of course ATI is much stronger competition than 3dfx turned out to be. nv30 is a new architecture, requiring new drivers and a whole new approach. ATI is a competitor which, unlike 3dfx, is willing and able to use the same general advances in RAM and manufacturing processes that nVidia has used. That advantage, a major one nVidia held over 3dfx, is gone for good.

    The truth about nv30 remains to be seen, but with R300 ATI has created a dynamite architecture, one that can be the foundation for years of advancements (just as nVidia did with TNT years ago). And we already know about R350 and R400. Has nVidia done the same with nv30? As of yet nobody knows.

    The simple fact is that because nv30 is so late in the game, even if it performs perfectly it may still be too late to make a significant impact--the kind of impact it would have made if nVidia had delivered it on time. Shades of 3dfx: that's exactly what would have happened had 3dfx shipped the V5 on time--nVidia would have been the one left behind. Timing is *critical* in this market, because competition waits for no man and no company. The product that would assure your success and supremacy today is often the product that will, if released six months late, condemn you to mediocrity and second fiddle.
     
  10. martrox

    martrox Old Fart
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,065
    Likes Received:
    16
    Location:
    Jacksonville, Florida USA
    Excellent post, WaltC.
     
  11. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,490
    Likes Received:
    262
    Location:
    Westeros
    Agreed, though perhaps a bit too one-sided for my tastes. Yes, Nvidia has missed a product cycle but, unlike 3dfx, their entire revenue stream isn't based on getting a new product into retail channels. Nvidia can afford a late NV30 and I think they'll maintain a larger market share over ATi throughout 2003. If, however, their follow-ups are also delayed, ATi will overtake them, IMO. I do agree with Walt that technologies (fab processes, memory) outside either company's control will level the competitive field more than we've seen in the past.
     
  12. CMKRNL

    Newcomer

    Joined:
    Jul 12, 2002
    Messages:
    91
    Likes Received:
    0

    WaltC wrote: "I think it's silly to overestimate them, frankly. The fact is that despite nVidia's business 'acumen' the company still missed an entire product cycle--just as 3dfx did. nVidia is also faced with a strong competitor with a lot of clout and an excellent architecture--just as 3dfx was faced with in the form of nVidia."


    Yes, but there are differences as to why the product cycles were missed. In the case of 3Dfx it was simply a lack of innovation and an inability to adapt quickly to where the industry was headed. Banshee was where they wanted to be, but it came out far too late, far too underpowered, and far too expensive for OEMs. After that they did whatever was necessary to try to stay afloat, but the situation was already spiralling downward around the time Voodoo3 was out.

    nVidia's missed cycle stems from a) NV2A commitments sidetracking them, b) a large redesign of the 3D core for NV30, and c) some issues with process migration. It wasn't for a lack of commitment or vision, but rather a lack of resources and some unfortunate events working against them. In the case of 3Dfx, they did not have a competitive product to follow up the missed product cycle. In the case of nVidia, they DO have a very competitive product in the form of NV30/31/31M, and even an integrated core in the form of NV34.

    Sure, they missed the boat with a consumer-level shader-capable part, but you know what? It didn't matter, because there was not a huge demand for one last year. It's only now and in the upcoming year that the average gamer is beginning to see the effects possible with shader-capable hardware and to factor that into purchase decisions.


    WaltC wrote: "If nVidia loses a lot of ground in the graphics field, the rest won't matter."


    When you say graphics field, I assume you mean discrete graphics chips. But you have to remember that mobile and integrated are both rapidly growing markets which nVidia has just barely tapped into. So far ATI has pretty much remained unchallenged on the mobile front, but if NV31M is competitive from a power consumption and price perspective, they stand a good chance of stealing some of ATI's business away.

    I absolutely agree that they do not have the breadth or diversity that ATI has, and this will hurt them in the long run, but in the short term this is not a major factor.


    WaltC wrote: "Essentially, nVidia is where it is today because of the TNT architecture, which was refined for TNT2 and essentially doubled for GF1; all subsequent versions have rested on the same architecture, clocked up as far as the technical prowess of fabs like TSMC would permit."


    That's not true. There have been a lot of major improvements in the core; it's not all down to process refinements. If you think the changes were minimal, then you could say the same thing about ATI. The truth is that in both cases there were many major internal design changes. Of course certain parts of the core are simply carried forward with minimal change, but the pipeline, memory controller, display controller, and other aspects of the design all underwent large changes.


    WaltC wrote: "The truth about nv30 remains to be seen, but with R300 ATI has created a dynamite architecture, one that can be the foundation for years of advancements (just as nVidia did with TNT years ago)."


    Actually, I doubt it. My bet is that this is simply a transitory stage for both companies and that the real architecture that will carry them forward for 3-4 years will be based on R400/NV40.


    WaltC wrote: "And we already know about R350 and R400. Has nVidia done the same with nv30? As of yet nobody knows."


    NV35 and NV40. ATi is definitely ahead in terms of schedule, but it's not like nVidia does not have a similar product roadmap lined up.


    WaltC wrote: "The simple fact is that because nv30 is so late in the game, even if it performs perfectly it may still be too late to make a significant impact--the kind of impact it would have made if nVidia had delivered it on time."


    I think people are overestimating the impact of the delay. It has definitely hurt them in terms of OEM contracts, some market share, and consumer mindshare--no question about that. However, the real sales we're concerned about now are RV350/R300 and NV31/NV30 sales for the upcoming year. nVidia has very aggressively taped out NV31, meaning that they will not be behind ATI in the all-important consumer and mobile lineups.

    As I said before, to a lesser company this delay could have meant a death spiral. But nVidia is large enough, with deep enough pockets and a good enough product roadmap, that they will continue to survive and will in all likelihood remain number one in 2003.
     
  13. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    Shouldn't that be:

    Competition is *critical* in this market and time waits for no man and no company.

    *hic*
     
  14. Brimstone

    Brimstone B3D Shockwave Rider
    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    1,835
    Likes Received:
    11
    I think Bit Boys would disagree with that logic. :D
     
  15. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    In the eyes of the average consumer I would agree with you, but I would have to say that a number of individuals privy to inside 3dfx information/happenings would argue (perhaps vehemently) with both your reasons (the second one, perhaps less so).
     
  16. CMKRNL

    Newcomer

    Joined:
    Jul 12, 2002
    Messages:
    91
    Likes Received:
    0

    Reverend wrote: "In the eyes of the average consumer I would agree with you, but I would have to say that a number of individuals privy to inside 3dfx information/happenings would argue (perhaps vehemently) with both your reasons (the second one, perhaps less so)."


    Well, I base my opinions on several long chats with an ex-3Dfx guy who was fairly high up on the SW side of things. We went into some depth on all the hacks in Voodoo5, what went into Rampage, etc. The lack of innovation was more of a complaint in the pre-Banshee days, but I think it's relevant because it seems to me that this was the pivotal point where 3Dfx started its eventual decline. As for the inability to adapt to where the industry was heading, I think this was abundantly clear from their lack of certain key features as well as the absence of any significant OEM business.
     
  17. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    3dfx had a solid hardware crew. They also had a lousy management team, as well as a lousy CEO. Voodoo5 (in its released form) was not supposed to be a product (in case you didn't already know).
     
  18. antlers

    Regular

    Joined:
    Aug 14, 2002
    Messages:
    457
    Likes Received:
    0
    Does this imply that the NV31 taped out at about the same time as the RV350 (which you previously stated taped out in November)?
     
  19. CMKRNL

    Newcomer

    Joined:
    Jul 12, 2002
    Messages:
    91
    Likes Received:
    0

    Reverend wrote: "Voodoo5 (in its released form) was not supposed to be a product (in case you didn't already know)."


    No, I didn't know that, but he couldn't stop bitching about what a hack the hardware was and what a pain in the ass it was to write the drivers for it because of that.


    antlers wrote: "Does this imply that the NV31 taped out at about the same time as the RV350 (which you previously stated taped out in November)?"


    Well, I posted in Nov, but I had said that it had taped out in the "past few weeks". RV350 taped out in late Oct, NV31 in mid-Nov, and both products are targeted for release around the same timeframe.
     
  20. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Ah... it probably is a bit one-sided, but I've never been much of a fence-sitter...;) And you can bet the guys employed by these companies are just as one-sided in their own ways. While I don't subscribe--at all--to the theory that acquiring 3dfx somehow "tainted" nVidia *chuckle*, I really do see a lot of similarities here with the 3dfx-nVidia face-off a few years ago. Only now it's nVidia making the apologies and being late.

    The point with respect to ATI that impressed me here was that the company did not allow itself to be dictated to by the fab and memory guys. Instead, they designed their chip to compete on a mature (that word's been overused) process and were able to make it fly without 0.13 micron or low-k dielectrics, and they didn't sit back and wait on DDR-II--they designed in a 256-bit bus so that no matter what happened on the memory side of the fence they would come out smelling like a rose. That's the point I wanted to make--that of the two, nVidia left far more of the process to third parties than ATI did. And the result is that ATI is shipping and nVidia is not. Right?...;)

    I hope this will encourage nVidia to change some of its policies in this regard, because clearly the strategies that worked against 3dfx won't work against ATI. But all of this is of course good for everyone, since it means a good flow of products to pick from, from both companies. Obviously, ATI is not the pushover 3dfx turned out to be (much to my chagrin at the time), so I guess we'll see in the coming months whether ATI has what it takes to stay ahead, and whether nVidia has what it takes to compete with a more competent 3D opponent than 3dfx was.
     