3dfx's curse?

Discussion in 'Architecture and Products' started by Yannis, Jan 27, 2003.

  1. Yannis

    Newcomer

    Joined:
    Feb 6, 2002
    Messages:
    28
    Likes Received:
    0
    Hi,

we all read the first GFFX reviews and I think most of us feel a bit disappointed. Don't you all "sense" that there is something wrong at nVidia? I'm not sure, but after buying 3dfx they acted as if there were no competition left.

    They really remind me of 3dfx and the Voodoo 5500.
    1) They are late with the GFFX, like 3dfx was with the V5500.
    2) Their card is a monster and, personally, I don't think it's a very clever design. A huge heatsink, a lot of noise, and 500MHz just to beat the R300? The Voodoo 5500 was a monster as well, with its two VSA-100 chips and its length.

    As someone else said, I couldn't believe I would ever see nVidia come in second again.
    Could it be the beginning of the end?
     
  2. Pete

    Pete Moderate Nuisance
    Moderator Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,945
    Likes Received:
    350
    Perhaps, as Carmack said, Xbox made nV late half a cycle. Or perhaps, as nV said, the .13um process delayed them. Either way, nV is too wealthy to die as quickly as 3dfx.

    I'm sure NV35 will be enough to place them back in the game at the right time. Heck, all they need is a 256-bit bus, which, judging by the number of card manufacturers that have one, is not that big a deal to implement.
     
  3. Nagorak

    Regular

    Joined:
    Jun 20, 2002
    Messages:
    854
    Likes Received:
    0
    If anything, Nvidia made the mistake of being too arrogant. They should have realized they'd had real competition ever since the original Radeon, but I don't think they really took that to heart even after the release of the R8500. If you look at the different product cycles -- 12 months for ATi vs. 18 months for Nvidia -- the R9700 was bound to happen.
    Radeon (6 months late), Radeon 8500 (on time), Radeon 9700 (6 months early). I'm not saying Nvidia is going to go out of business, but if they can't get their product cycle down to 12 months they are screwed (and refreshes don't count).
     
  4. Babel-17

    Veteran Regular

    Joined:
    Apr 24, 2002
    Messages:
    1,004
    Likes Received:
    245
    I think there were people even within ATI who were pleasantly surprised at how well the R9700 turned out. Combine that with nVidia having a bit of a nightmare with the new fabrication process, and I'm not that amazed to see Nvidia caught a bit flat-footed at the moment. These things happen, and the current scenario could be far, far worse imo.

    On a side note, I think ATI has yet to learn to go for the jugular. :) I see an opportunity here and now for a massive PR coup, simply by announcing that they're accepting pre-orders for R350-based cards at 400/400 core and memory, at a price of US$400. A tad sleazy, yes, but appropriate to the current rules of engagement. The time to strike is when your opponent is off balance and confused. Fwiw, I've owned two recent nVidia high-end cards and enjoyed them immensely. I currently have an ATI card, the R9700 Pro.

    I could be way off base of course but I do sense both ATI and nVidia are locked in a battle where release dates and performance numbers are more critical than ever and both sides are being both secretive and using a bit of misdirection. Interesting times indeed.
     
  5. Pixel Pop

    Newcomer

    Joined:
    Dec 19, 2002
    Messages:
    7
    Likes Received:
    0
    What curse?

    You have a card that, for all intents and purposes:

    1. Performs on a par with the market leader, the 9700 Pro (with pre-release drivers),
    2. Is way ahead, performance- and feature-wise, of nVidia's last product, the Ti4600, and
    3. Provides DX9 VS/PS 2.0+ technology on the desktop for the very first time.

    This is, what, the third DX9-class card released, and people want to throw stones at it until it dies.

    What for?

    Oh I know...

    We're just never satisfied...
     
  6. duncan36

    Newcomer

    Joined:
    Aug 14, 2002
    Messages:
    173
    Likes Received:
    0
    That may be a small part of it, but frankly saying that takes away from ATi's enormous achievements.

    Would you say that the Raiders lost the Super Bowl because they didn't 'try hard enough'? Obviously not; they got beaten by the better team, the team that showed more creativity, passion, and strength.

    ATi just wants it more and works harder IMO.
     
  7. Tahir2

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,978
    Likes Received:
    86
    Location:
    Earth
    They still have to recoup the money spent on the design of the Radeon 9700 Pro, and they also need to look at their inventory. There's no point in ATI shooting itself in the foot, and honestly these first few numbers only go to show that the R350 is not needed so soon.
    However, if ATI relaxes, then you can expect NVIDIA to pull the rug from under their feet before you can say "Dustbuster!"
     
  8. MuFu

    MuFu Chief Spastic Baboon
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,258
    Likes Received:
    51
    Location:
    Location, Location with Kirstie Allsopp
    Nobody has mentioned NV2A and the Xbox project. :?:

    That's the main reason they're in this mess. It's certainly not "the beginning of the end" by any means although it'll be uphill from here for a while yet.

    MuFu.
     
  9. MuFu

    MuFu Chief Spastic Baboon
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,258
    Likes Received:
    51
    Location:
    Location, Location with Kirstie Allsopp
    My apologies. :lol:
     
  10. Snap

    Newcomer

    Joined:
    Jul 27, 2002
    Messages:
    43
    Likes Received:
    0
    Is it that, or is it because their top-flight engineers cashed in their stock options at the high point, leaving them with the 3dfx engineers they could pick off and the ex-SGI guys?
     
  11. MuFu

    MuFu Chief Spastic Baboon
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,258
    Likes Received:
    51
    Location:
    Location, Location with Kirstie Allsopp
    LOL, no, it's the Xbox thing.

    If the worst comes to the worst and everybody qualified buggers off, then EE undergraduates save the day. :lol:

    MuFu.
     
  12. Glonk

    Regular

    Joined:
    May 26, 2002
    Messages:
    334
    Likes Received:
    0
    Location:
    Markham, Ontario
    Why on Earth would people honestly even think this is remotely related to the Xbox project?

    The Xbox project was hardly very big, and I'm pretty sure a different division (the nForce division) handled making the XGPU, rather than the engineers working on NV30...
     
  13. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    I agree. The GF3, GF4, and Xbox GPU are all so similar that I don't see how NVidia took so damn long to get NV30 out. If they had a parallel design team working on NV30 alongside these, NV30 must have been in development for nearly 3 years now. I was expecting the GF4 to be NV30 at one point.
     
  14. mboeller

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    922
    Likes Received:
    1
    Location:
    Germany
    IMHO:

    Maybe they have dug themselves a nice deep hole with the NV30.

    What can they do to improve the current situation?

    => IMHO, not much.

    They can improve the process and the layout, but with the NV30 they squeezed every MHz out of the standard 0.13µm process that they could.
    So with the NV35 they can only raise the clock a little; maybe 100MHz using the low-k process, but then they would have to live with the dustbuster until the NV40. If they abandon the dustbuster, they will have to start at 400MHz and will get to 500MHz(?) with the NV35. Not very convincing for an R400 competitor.
    They could also improve the layout, like ATi did with the R300, but there they would have to start from the beginning, because, as it seems now, Nvidia has no experience with hand-optimised chips; otherwise the NV30 wouldn't need the dustbuster. So an improved layout would help a lot, like it did with the R300, but how long will that take when they have no experience with it, and how much will it cost (A02...A0x) too?

    So the only improvement I see is a 256-bit interface. Hopefully the design of the NV35 is not finished yet, so they can incorporate (or have already incorporated) a 256-bit memory interface. Otherwise they will fall behind ATi.
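    The 256-bit argument above is ultimately just bandwidth arithmetic. A rough sketch, using the commonly cited launch specs of the two cards (the clocks and bus widths here are period figures assumed for illustration, not anything from this thread):

    ```python
    # Rough peak-memory-bandwidth arithmetic behind the 128-bit vs 256-bit debate.
    # Assumed launch specs (commonly cited, for illustration only):
    #   GeForce FX 5800 Ultra: 128-bit bus, 500 MHz DDR-II (1000 MHz effective)
    #   Radeon 9700 Pro:       256-bit bus, 310 MHz DDR   (620 MHz effective)

    def bandwidth_gb_s(bus_bits: int, effective_mhz: float) -> float:
        """Peak theoretical bandwidth in GB/s (1 GB = 10**9 bytes)."""
        bytes_per_transfer = bus_bits / 8
        return bytes_per_transfer * effective_mhz * 1e6 / 1e9

    gffx = bandwidth_gb_s(128, 1000)   # narrow bus, very high clock
    r9700 = bandwidth_gb_s(256, 620)   # wide bus, modest clock

    print(f"GFFX 5800 Ultra: {gffx:.2f} GB/s")   # 16.00 GB/s
    print(f"Radeon 9700 Pro: {r9700:.2f} GB/s")  # 19.84 GB/s
    ```

    The point the post is making falls out directly: even pushing DDR-II to 500MHz, a 128-bit NV30 trails a 256-bit R300 running much slower memory, which is why a 256-bit NV35 looks like the one improvement that doesn't depend on winning the clock-speed race.
    
    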
     
  15. Himself

    Regular

    Joined:
    Sep 29, 2002
    Messages:
    381
    Likes Received:
    2
    I don't think the Xbox has anything to do with it either; NVIDIA has the cash and resources to handle multiple projects at once without them impacting each other. I seem to recall some interviews with NVIDIA where this was asked and the answer was no. That's good enough for me.

    Frankly, the GFFX is a good card; it just should be clocked lower and not have the noisy fan. I'm guessing NVIDIA's marketing department caused the dustbuster thing to happen. NVIDIA has always been one to play within the bounds of what is practical, so this is a departure. I actually like the clock scaling, but it should be constant while in a game, or at least tweakable. I think the card's real performance will show itself in benchmarks that use shaders heavily. What we are seeing is early drivers and NVIDIA's philosophy of designing for next-gen features. In the past they have taken the initial performance hit to get popular support for features while still being faster than the other guy, only to come back with more practical performance in newer cards. This time they are not faster than the other guy in the available benchmarks, so people are seeing the first-gen effect of one of their new architectures.

    IMO, NVIDIA didn't expect anybody to go to 256 bits, and that is where they guessed wrong, hence leaving a lot of performance on the table. I wonder if that decision comes down to the crossbar controller being 4-way at 128 bits: it would have to either go 8-way or have each channel be twice as wide. It seems to me that either change would have to be carried all through the design for optimal effect. In other words, I don't think you just tack on a double-wide bus; you design around it.
     
  16. antlers

    Regular

    Joined:
    Aug 14, 2002
    Messages:
    457
    Likes Received:
    0
    If they could've got this same card out in June at 400/400 or so (which I suspect was the plan at some point), nobody would have given a damn about the 9700 when it came out in July (it would have been a little faster on some benchmarks, but using "old" technology and more limited shaders).

    It shows the difference 7 months can make -- a card that would've been a masterstroke solidifying NVidia's position at the top of the heap is now something they have to sell overclocked, and even so it is perceived as a disappointment, leaving them vulnerable to ATI.
     
  17. Clashman

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    672
    Likes Received:
    11
    Location:
    Minneapolis
    Dreamcast and Neon 250 were pretty similar, too.
     
  18. elchuppa

    Newcomer

    Joined:
    Jan 20, 2003
    Messages:
    56
    Likes Received:
    0
    I just don't want my computer to sound like a lawnmower. That's the primary reason I won't buy an FX. I would love to have full 128-bit FP precision all the way through the pipeline, but nVidia should call me when they have a product that achieves this in a consumer-friendly way.
     
  19. Tahir2

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,978
    Likes Received:
    86
    Location:
    Earth
    How would a GFFX at 400/400 be faster than an R300? :roll:
     
  20. dksuiko

    Newcomer

    Joined:
    Feb 6, 2002
    Messages:
    196
    Likes Received:
    0
    I wouldn't blame nVidia's current situation on a mistake they made, but rather on successful execution on ATI's part. I'm sure nVidia wasn't just sitting around without any sense that ATI was serious competition. I think they did all they could, but in the end ATI just did it better.
     
