Jen-Hsun Talks NV4x

Discussion in 'Beyond3D News' started by Dave Baumann, Mar 3, 2004.

  1. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    "Only slightly"? Compare the relative positions of ATI and nVidia now vs. a couple of years ago. See a problem with your wording? Maybe nVidia's only slightly behind ATI now, but nVidia was far ahead before. I'd call that a significant shift. "Only slightly"? Yeah, right.
     
  2. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,217
    Likes Received:
    1,737
    Location:
    Winfield, IN USA
    I'll believe positive facts about nVidia's upcoming hardware, or even good speculation... it's the nebulous and undefined "golly-jeepers, it'll be OOODLES better!" attitude that I have trouble accepting.

    Sorry, nothing personal. I just thought you might actually have something to contribute to the rumor pot a bit more specific than that if you were going to be making a statement like that. :(
     
  3. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    I might believe positive talk about Nvidia. But given the last year or two, I make it a habit of considering the source for any graphics-related info, and quite frankly a post from someone called "Guest" doesn't rank that high on the reliability index.

    What little info I have seen (from Valve) indicates that while Nvidia definitely has more cards in gamers' boxes than any other graphics chipset maker, ATI has much better representation in the high end than Nvidia. There are a third again as many people using 9800-series cards on Steam as there are using FX5600-series cards, and the FX5600 is the only GfFX other than the FX5200 used by more than one percent of the sample.

    12.85% of the sample use GfFX cards (over half being FX5200 users) while 16.24% use R3xx based cards. If you take the FX5200 out of the sample then Nvidia's share drops to 5.9%. Remember, the ATI sample is 9600 and higher and that's a much higher price bracket than the FX5200.
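    (A quick sanity check of that arithmetic, offered as a sketch only; the FX5200 figure below is inferred from the quoted totals rather than taken from the survey itself:)

```python
# Rough check of the Steam survey numbers quoted above.
# Assumption: the FX5200 share is derived from the quoted totals
# ("over half" of the GeForce FX figure), not read off the survey.
gffx_total = 12.85       # % of sample on GeForce FX cards
r3xx_total = 16.24       # % of sample on R3xx-based cards
gffx_minus_5200 = 5.9    # % quoted once the FX5200 is excluded

fx5200 = gffx_total - gffx_minus_5200
print(f"Implied FX5200 share: {fx5200:.2f}% of the sample")
print(f"FX5200 as a fraction of all GeForce FX: {fx5200 / gffx_total:.0%}")
print(f"R3xx vs non-FX5200 GeForce FX: {r3xx_total}% vs {gffx_minus_5200}%")
# -> 6.95% of the sample, roughly 54% of all GeForce FX cards,
#    consistent with "over half being FX5200 users".
```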
     
  4. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    It's from "16 lanes" comments like these, gleaned from various nVidia personnel, that I suspect the Inquirer picked up its "16 pixel pipeline" information about nV4x. Wouldn't surprise me at all to learn they'd confused "16 lanes" with "16 pixel pipes"....;) The Inquirer frequently misunderstands what it hears.

    I would also think that in terms of integrated 3d graphics the bi-directional nature of PCIe might well be almost immediately utilized to good effect as that's where the difference [integrated 3d] will be most profound anyway, when contrasted with AGP x8. "AGP x16", otoh, isn't a standard for anybody, so it will be interesting to see how nVidia intends to have that advertised "functionality" benefit its customers.

    As for nVidia in general over the past 18 months, I think they've seriously damaged their credibility within the enthusiast market segment--defined as that market segment willing to spend $300 and up on a 3d accelerator. Whether it's been their misrepresentation of their architecture's pixel pipeline organization, their willingness to sacrifice benchmarks and other software on the pyre of the nVidia PR altar, or their willingness to publicly trash any advanced hardware feature nV3x either doesn't support or else poorly supports (ps2.0, etc.), nVidia's done some serious damage to itself by its reliance on these negative PR tactics post-R300. The really bad thing for them is that because nV3x yields have been so poor, especially at the higher end, the boomerang effect has been trebled and ATi has just clobbered them in enthusiast-segment sales volume by as much as 9-1/8-2, depending on the numbers you look at.

    The question for me is one of whether or not nVidia can listen to the markets and design and produce products the market will find desirable, or whether nVidia is locked into a cycle of producing products convenient mainly for nVidia, products which it must defend and promote via the same kind of negative campaigns the company has relied on since R300.
     
  5. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,217
    Likes Received:
    1,737
    Location:
    Winfield, IN USA
    Is that where the line is drawn? I'm thinking that the enthusiast segment is more the type willing to spend $200+ on a viddy card.

    $300+ is for the rich & crazy type that I want to be, but $200 for a video card is still a lot to some of us and I think there are a whole lot more people gaming with vid cards in the $200+ range than the $300+ range.

    I agree with everything else you said; it's just where the dollar cut-off is for the enthusiast market that I'm questioning. I think the enthusiast is much more informed and value-conscious lately.
     
  6. Tim Murray

    Tim Murray the Windom Earle of mobile SOCs
    Veteran

    Joined:
    May 25, 2003
    Messages:
    3,278
    Likes Received:
    66
    Location:
    Mountain View, CA
    $200 and above is the enthusiast line. Always has been, always will be. ~$125 and below is the general user segment, and $125-200 is the mainstream market.

    As for the comment saying L'Inq confused lanes with pipelines, not true. It is 16x1, unless NVIDIA is far more manipulative than I previously imagined.
     
  7. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,217
    Likes Received:
    1,737
    Location:
    Winfield, IN USA
    Woo-hoo! The Baron backs me up! 8) ;)

    BTW-16x1, 32x0....I got that pretty much confirmed too.
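    (For anyone outside the shorthand: "AxB" means pixel pipelines x texture units per pipe, and "32x0" is generally read as a double-rate mode for pixels with no colour writes, e.g. Z/stencil-only passes. A minimal sketch of the implied fillrates, assuming a purely hypothetical 400 MHz core clock:)

```python
# Illustrative fillrate arithmetic for the "16x1, 32x0" shorthand above.
# "16x1": 16 pixel pipes, 1 texture unit (TMU) each. "32x0": 32 pixels
# per clock when no colour is written (Z/stencil-only rendering).
# The 400 MHz core clock is a hypothetical figure, not a leaked spec.
CLOCK_HZ = 400e6

colour_fill = 16 * CLOCK_HZ      # pixels/s with colour writes
texel_fill = 16 * 1 * CLOCK_HZ   # texels/s (one TMU per pipe)
z_only_fill = 32 * CLOCK_HZ      # pixels/s in Z/stencil-only mode

print(f"Colour fill: {colour_fill / 1e9:.1f} Gpixels/s")  # 6.4
print(f"Texel fill:  {texel_fill / 1e9:.1f} Gtexels/s")   # 6.4
print(f"Z-only fill: {z_only_fill / 1e9:.1f} Gpixels/s")  # 12.8
```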
     
  8. incurable

    Regular

    Joined:
    Apr 20, 2002
    Messages:
    547
    Likes Received:
    5
    Location:
    Germany
    Wait, there's a difference (besides changing a '0' to a '5' on the lid)? :shock:

    cu

    incurable
     
  9. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Yes, really, the number I was talking about is certainly not chiseled in granite...;) I saw some numbers a while back showing ATi with a 9-1 unit sales advantage over nVidia in the $300+ markets, but it could really be defined as "any number at which the facts make a difference in the purchase," and that could just as easily begin at the $200 level. I do think, though, that the more you get above $200 the less likely you are to be influenced primarily by marketing as opposed to other sources of information.
     
  10. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    Really :?
     
  11. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,217
    Likes Received:
    1,737
    Location:
    Winfield, IN USA
    Yup.

    True, when you're shelling out the more serious cash you'll generally do the homework to find out what you should/will be getting, while on the low end people just tend to look at the blurbs on the boxes. :roll: :(
     
  12. karlotta

    karlotta pifft
    Veteran

    Joined:
    Jun 7, 2003
    Messages:
    1,292
    Likes Received:
    10
    Location:
    oregon
    $200-300 is the wannabe enthusiast line, too scared to go into the REAL enthusiast segment - $400 and above.
     
  13. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,217
    Likes Received:
    1,737
    Location:
    Winfield, IN USA
    Snob! :p

    j/k! ;)
     
  14. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    Well, quite clearly, the benefit is that the PCIe bridge chip gets more bandwidth and less latency: 4 GB/s vs. 2 GB/s. Whether bus bandwidth translates into any game improvements is another story, and one PCIe must also overcome. Yeah yeah, HDTV editing, hypothetical UMA-like algorithms, yadda yadda.
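    (Where those figures come from, sketched from the published link rates; these are theoretical peaks, not measured throughput:)

```python
# Peak-bandwidth arithmetic behind the "4 GB/s vs 2 GB/s" comparison.
# PCIe 1.x: 2.5 GT/s per lane with 8b/10b encoding = 250 MB/s per lane,
# per direction. AGP 8x: 32-bit bus at an effective 533 MT/s, shared.
pcie_lane_bytes = 2.5e9 * (8 / 10) / 8   # bytes/s per lane, per direction
pcie_x16 = 16 * pcie_lane_bytes          # full x16 link, each direction
agp_8x = 533e6 * 4                       # 4-byte-wide bus, both directions share

print(f"PCIe x16: {pcie_x16 / 1e9:.1f} GB/s per direction")  # 4.0
print(f"AGP 8x:   {agp_8x / 1e9:.2f} GB/s total")            # 2.13
```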


    Not really. The enthusiast market pays for performance above all else, not brand or corporate cheerleading. I had a very low opinion of ATI until the R300; I changed my mind instantly afterwards. If the NV40 r0x0rs, all will be forgiven. The type of people who pound their fists on the table and rant and rave about NVidia are few and far between compared to the vast majority of consumers who don't sit around on 3D hardware message boards whining. Many of them won't even find out about stuff like the driver cheats unless some mass-market games magazine like CGM or PC Gamer prints it.

    If NVidia wanted to produce products "mainly for the convenience of nVidia", why didn't they make a chip that is cheap and easy to produce? This is just a silly comment.

    NVidia has two markets to listen to: end users, who demand performance on current games, and developers (ISVs), who demand features that can extend and differentiate their future (1-2 years out) products. Nvidia didn't produce the NV3x for convenience. They tried a different architecture and fab process, which didn't work on their first try. Hopefully the NV40 will be different, lessons learned, but none of this has anything to do with a silly notion like "produce chips mainly for their own desires". Both ATI and NVidia get input from ISVs on features and architecture.
     
  15. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Just as clearly, "AGP x16" is not a standard for anybody--not for Intel--so nVidia's all by its lonesome there. So what's the point? Why not just do a PCIe x16 implementation and bridge it down to AGP x8 for your AGP x8 customers instead? The answer seems to be that this approach is merely one that benefits nVidia economically in the short run. Eventually, I expect that nVidia will be marketing only PCIe which is bridged down to support AGP x8. I.e., who's going to be interested in "AGP x16" on a PCIe motherboard? Most who do early adoption of PCIe will gravitate towards PCIe x16 3d products instead.

    What do you mean, "not really"? I think the fact that nV30 was cancelled and nV35/38 have been clobbered (9-1 or 8-2, take your pick, in sales volume) in the enthusiast sector ($200-$300 & up) pretty much demonstrates how nVidia has derailed itself in that market. Additionally, image quality is about equal with performance as a major consideration in this market. The last 18 months have been a long story of nVidia attempting to gain frame-rate performance at the expense of IQ, and then to justify it. Clearly, there's more to the story than mere frame-rate performance, and always has been.

    I mean, you can delude yourself like nVidia and lump 5200 sales in with "DX9" if you want--but very few "enthusiasts" make that mistake. As well, I wouldn't call people whose sole source of info on 3d cards is "PC Gamer" "enthusiasts" in the first place (noobs, maybe, but eventually anyone who puts $200+ into 3d cards will learn that often the facts and the marketing do not agree).

    I hope that nVidia can straighten itself out with nV40--I sincerely do. But that remains to be seen. As a result of its dishonest approach to public relations over the last year and a half, nVidia will find that even if nV40 is "perfect" it will take time to undo all the damage they've done themselves with their nV3x marketing. As well, although you might have "instantly" understood what a good product R300 was, the greater portion of the market took about a year to reach the same conclusion, precisely because of ATi's prior negative inertia. By the time nV40 ships, nVidia will have been looking at upwards of two years' negative inertia of its own, and will not bounce back overnight even in the most ideal of cases. Mainly, though, what will count for nV40 isn't just nV40 itself, but how nV40 compares to R4xx. As with R3x00 vs. nV30, it's the comparison that makes the difference.

    The yield problems nVidia's had with nV30/5/8 were not intentional, of course. Everything else they've done with respect to marketing nV3x has been done solely for the convenience of nVidia--and it all started with misrepresenting nV30 as an 8x1 architecture. If that wasn't done strictly for the convenience of nVidia, I can't imagine why else it was done--ditto benchmark cheats, driver "optimization", and all the rest of it. Whom else besides nVidia do you imagine might have benefitted from such misinformation?

    nVidia has the same markets to listen to that ATi has. Of the two companies and their respective 3d products over the last 18 months, there's no doubt for me as to which company has done more listening, and which has done more talking, instead. I cannot think of a single soul who has benefitted from nVidia's marketing of nV3x in the last 18 months--aside from nVidia. Neither developers nor customers have benefitted, it seems to me. Of course, I really don't think their PR campaign has benefitted nVidia as much as it has hurt them in the enthusiast sector, and nV40 will certainly illustrate how much they've really been listening to the markets as opposed to their own internal voices...;)

    Here's another area besides the enthusiast sector where nVidia's approach has plainly hurt it: OEMs, both board and system. Prior to R300/nV30, the board OEMs were practically all exclusively nVidia--now there are so few nVidia-only board OEMs left that it's difficult to get an accurate count. Board OEMs are actually nVidia's only customers, and the degree to which they've picked up ATi is but a reflection of the non-success of nVidia's nV3x efforts across the board.

    And it's the system OEMs, like Dell, which have been instrumental in ATi clobbering nVidia in the $200-$300+ markets. If and when nVidia's been able to produce enough nV3x chips for higher-end 3d products, they've been too hot, too large, or too noisy in comparison with R3x0-based products to sway system OEMs to deploy them at a rate competitive with R3x0. Let's also not forget how nVidia bungled the xBox2 contract with M$ (when presumably M$ was able to get a good look at the future directions of both companies and found nVidia wanting.)

    When I say "convenient" for nVidia, I don't mean "easy," I mean self-serving. nVidia's going to have to start serving its markets again to succeed, and a very real part of that is acknowledging your competition. At least publicly, nVidia even lately seems very reluctant to admit ATi is taking market share from it generally, and in some market sectors (like the enthusiast sector) has steamrolled them flat...;) Being reluctant to admit what is so obvious to everyone else does not inspire me to confidence as to nVidia's future directions.
     
  16. Anonymous

    Veteran

    Joined:
    May 12, 1978
    Messages:
    3,263
    Likes Received:
    0
    No one ever sees AGP x16. It's a private protocol on the PCB between the GPU and the bridge chip. The GPU could speak Chinese to the bridge chip; it simply does not matter. NVidia isn't marketing AGP 16x as something your *motherboard* will understand. It's a private external protocol between GPU and bridge, because the bridge sits outside the GPU.

    Early adopters aren't going to buy "value" video cards either, but the point of shipping non-high-end cards with PCIe bridge chips is that not everyone who ends up on PCIe buys motherboards or early-adopts. There are people who get their PCs from system builders, and "value" PCs are going to start having PCIe motherboards in them, so we need "value" PCIe cards.


    The point is, enthusiasts, with the exception of fanbois, don't have long memories or loyalty. If you would read what you were responding to, you'd be better off. I said *IF* the NV40 is all fixed and a monster in performance and IQ, most enthusiasts won't care about the driver-cheat fiascos, the NV3x, or any of the other crimes you whine about NVidia committing. Most people will simply buy the best card, irrespective of how "evil" the corporation behind it is.

    The rest of your post is irrelevant because the NV30 is irrelevant. We're talking about NVidia turning around their position with enthusiasts with the NV40. You can keep reliving past battles if you want.

    Maybe it was done for the benefit of consumers. AMD labels its processors according to non-MHz ratings that appear "MHz-like". 3D card vendors used to regularly switch between promoting texel fillrate and pixel fillrate depending on whether they had 1 or 2 TMUs. If you leave out PS2.0 performance, the NV3x performs just as well as the R300 with 4 pipes vs 8 on most games that you can buy (e.g. DX8). The number of pipes is not what matters for the consumer; those are implementation details, as are most GPU specs. Performance is what matters. Simply saying "4 pipes" on the box does not tell the whole story. Maybe they should have said "Max # of registers before performance drop = 2", right? Total honesty?
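    (A sketch of that fillrate point, i.e. why quoting "pipes" alone can flatter either design; the 400 MHz clock is hypothetical:)

```python
# Why pipe count alone misleads: an 8x1 and a 4x2 design have identical
# texel fillrate but different pixel fillrates, so a vendor can quote
# whichever number looks better. Hypothetical 400 MHz clock throughout.
CLOCK_HZ = 400e6

def fillrates(pipes: int, tmus_per_pipe: int) -> tuple[float, float]:
    """Return (pixel fill, texel fill) in units per second."""
    return pipes * CLOCK_HZ, pipes * tmus_per_pipe * CLOCK_HZ

for pipes, tmus in [(8, 1), (4, 2)]:
    pix, tex = fillrates(pipes, tmus)
    print(f"{pipes}x{tmus}: {pix / 1e9:.1f} Gpix/s, {tex / 1e9:.1f} Gtex/s")
# -> 8x1: 3.2 Gpix/s, 3.2 Gtex/s
#    4x2: 1.6 Gpix/s, 3.2 Gtex/s
```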

    Anyway, you're beating a dead horse, which is NVidia marketing, which is supposed to be self-serving. Marketing and sales are based on convincing the right people to buy your product by positioning it in the best light possible. I'm afraid to tell you that Nvidia isn't the only company to selectively leave out information and use creative analogies or labeling.



    The point is, the NV30 was *designed* according to the markets NVidia has listened to. I don't give a rat's ass about the "marketing". The issue is, the HARDWARE DESIGN of the card was not "convenient for NVidia". We're having two conversations: I'm talking about the process by which NVidia ended up with the NV30, and you're talking about the way they marketed it.


    Why should they admit it? So they should put out a PR release?

    Santa Clara, CA (PR NewsWire) - NVidia's CEO announced today that they have fallen behind ATI Technologies in total marketshare.

    Whenever you ask anyone in a position of leadership, even if they are failing, they are going to paint you the most optimistic and rosy picture. Did ATI put out public confessions of inferiority when NVidia was eating their lunch?

    You're a hopeless case, too far gone to rescue.
     
  17. Florin

    Florin Merrily dodgy
    Veteran

    Joined:
    Aug 27, 2003
    Messages:
    1,644
    Likes Received:
    214
    Location:
    The colonies
    :lol: I don't mind a bit of common sense intermingled now and then, but let's get back to our regular "we'll never forgive nVidia because they're evil" theme now.
     
  18. PatrickL

    Veteran

    Joined:
    Mar 3, 2003
    Messages:
    1,315
    Likes Received:
    13
    It's not common sense, it is at best a wish. Before the NV30 fiasco I never bought an ATI card. I can tell you that my first PCIe card will be an ATI card. PR people who think users are mindless buyers make me sick.
     
  19. martrox

    martrox Old Fart
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,065
    Likes Received:
    16
    Location:
    Jacksonville, Florida USA
    Some people are! :wink:
     
  20. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    I'd agree - people who spend a lot of money have long memories. They remember who sold them a card for $400 that didn't perform as advertised. They remember when they buy a card and it gives them a real "Wow!" effect. That's where product/company loyalty comes from - and the same negative opinions for companies that you have bad experiences of.

    There are many, many people who wouldn't have touched an ATI card for all the tea in China, but were "forced" there by Nvidia's severe underperformance, late delivery, and corporate behaviour. It took a year for the mainstream to catch on, but the biggest hurdle has been overcome. Potential buyers who would have always gone for Nvidia first have had a taste of ATI.

    Now people are expecting to make choices between NV40 or R420 - how many people would have even had an ATI card on their list at all before the events of the last couple of years? Nvidia let ATI get a foot in the door for all those people who would never have considered a non-Nvidia product before. It wasn't enough for ATI to make a good product; they had to break through product loyalty, and Nvidia helped them do that by severely dropping the ball.

    People do remember things, and ATI have got a strong hold on the market where before they were almost nowhere. Look at all the broken Nvidia-exclusive deals at OEMs. This is all an achievement in itself.
     