Nvidia GT300 core: Speculation

Discussion in 'Architecture and Products' started by Shtal, Jul 20, 2008.

Thread Status:
Not open for further replies.
  1. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    I can - in a way. They force the competition to sell way below the anticipated margins for their high-end, which would be fine, since that is business. But at the same time, they're not making money themselves, because their bread-and-butter product (HD 4850) is not significantly faster than the competition's volume chip (G92), so they're in a price war at that level too, but without the significant cost advantage they probably have in the 200-300 Euro segment, i.e. HD 4890 vs. GTX 260-285.

    [strike]Proof: Their recently announced losses in this Q2/2009.

    This is not healthy competition but self-destruction - IMO.[/strike]
    Apparently, I should have read more closely. AMD has stated that their GPG business broke even. Not healthy, but nonetheless.
    "In the Graphics segment, revenue for the quarter was $222 million, down 18% sequentially and down 15% from the first quarter of 2008. Units and ASP were down quarter over quarter. ASPs were up year-over-year as a result of richer mix of the HD4000 family of products, and the Graphics segment broke even at the operating level."

    I am not too familiar with US price search engines - maybe someone can come up with the respective data.

    I don't think it depends on how I might look at it. Nobody in their right mind designs a performance chip from scratch which requires insane amounts of power (for that time, compared to other performance chips), which uses a double-digit-layer board, a 512-bit memory bus (from which, as it turned out, the product didn't even profit) and a GPU far north of 400 mm².


    It is, as the existence of the HD 2900 Pro and GT proves. If they had not had any parts to salvage, there would have been no reason to offer such a variety of parts (one merely downclocked (Pro) and two partly disabled (GT, 256-bit Pro)).

    Even the full-blown 2900 XT had its share of problems with the 8800 GTS, and its derivatives were far too strong to compete against the 8600 GTS - and at a rather different cost level to produce, I'd presume.
     
    #881 CarstenS, Apr 25, 2009
    Last edited by a moderator: Apr 25, 2009
  2. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Yea, but that wasn't the point I was trying to make.
    Obviously ATi went for a high-end part.
    That doesn't mean consumers see it as high-end.

    The exact same situation happened with the AMD Phenom. It was supposed to be the fastest quadcore on the market (some pretty funny videos on Youtube of AMD executives boasting Phenom's performance and superiority), and it had pretty insane power requirements... But its performance was appalling. Slower than the slowest quadcore that Intel offered.

    So yes, designed as high-end, but not performing as high-end. Consumers mainly look at price, and price is a result of performance. So they don't see 2900 or Phenom as high-end parts.
     
  3. compres

    Regular

    Joined:
    Jun 16, 2003
    Messages:
    553
    Likes Received:
    3
    Location:
    Germany
    Pardon my ignorance. Is AMD's graphics division taking a loss after the 4800 series release? A comparison to when they had the 3800 series would be nice to know.

    Somehow it seems to me that when AMD had less competitive products it was fine that they lost money, but now that they have a good product you claim they hurt the industry. I think the one they are hurting is their competitor, but I would like to see the facts behind your reasoning.

    Edit: Ahh CarstenS, sorry I missed that bit, did not refresh page in time.
     
    #883 compres, Apr 25, 2009
    Last edited by a moderator: Apr 25, 2009
  4. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    Please read on in the posting you've just quoted.
     
  5. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    That's a little different from what I was getting at. AMD's whole strategy was to gain market share - if anyone says anything else it really doesn't make sense. They have a chip that is cheaper to make than the competition's, and the pricing of their cards was set to undercut nV. But by doing so they forced nV's hand. If they didn't expect nV to follow suit, that was a major mistake; if they thought nV would follow suit, then there was no benefit to their price cuts. The only thing they were going for was market share at any cost. Did they get it? No, so they failed. And on the other side, they also hurt nV. That's why I'm saying the entire market.

    First, they weren't able to accomplish what they wanted, and they didn't get the money they should have; secondly, they hurt the other company. Thirdly, add in the economy - it's just a crapshoot now.
     
  6. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    I think AMD has bigger worries than ATi/NV right now.
    Intel is forcing AMD out of the server market now, and in a few months, Core i5 will push AMD out of the mainstream desktop market as well.
    Since AMD wasn't doing very well as it is, this is probably the end of the line for AMD, and therefore ATi.
     
  7. LordEC911

    Regular

    Joined:
    Nov 25, 2007
    Messages:
    877
    Likes Received:
    208
    Location:
    'Zona
    What, you are saying a 3.5% increase in market share for discrete GPUs is a fail?
    Edit - those are numbers from Q2 '08 to Q3 '08. I, for one, think they have had an even bigger impact on market share in Q1 '09.
     
    #887 LordEC911, Apr 25, 2009
    Last edited by a moderator: Apr 25, 2009
  8. Speccy

    Newcomer

    Joined:
    Oct 11, 2002
    Messages:
    86
    Likes Received:
    6
    Cheaper chip = badly undercutting. This is fantastic logic.

    Could it not just be that the pricing is "fair" given the size of the chip? Remember, G92 is about the same size, and NVIDIA initially launched it lower than RV770; it quickly fell even below the 4850's price. The logic of blaming AMD for damaging the market is astonishing, when the indications are that the pricing of the chips is fair, given what happened with G92 in the prior months.
     
  9. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    JPR's numbers don't agree with you - just check, they are available for last quarter and the quarter before that.

    http://forum.beyond3d.com/showpost.php?p=1286359&postcount=600


    Interesting, isn't it?

    No, in the end both companies got hurt. AMD didn't reach any significant goal - remember this: http://arstechnica.com/hardware/new...nificant-market-share-gain-by-end-of-2008.ars - did that happen? No. Look above.
     
    #889 Razor1, Apr 25, 2009
    Last edited by a moderator: Apr 25, 2009
  10. Speccy

    Newcomer

    Joined:
    Oct 11, 2002
    Messages:
    86
    Likes Received:
    6
    This asinine line of reasoning appears to go against your prior assertion. Given that we have prior history indicating that the prices of RV770 were fair market prices, it seems they didn't further "trash" pricing to gain market share.

    There's nothing to indicate that AMD's pricing was set too low. All we see is that GT200 was woefully under-competitive given its size.
     
  11. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    Really? Tell me what the last four generations' pricing was like. Instead of just talking, show me where your statistical data is coming from - or is that being asinine?

    I just showed you two links with quarterly numbers and AMD's position, and those were my two original points. Also, in a conference call or interview they specifically stated the percentage of desktop discrete they were going after for Q4.
     
  12. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    It seems that some make the assumption that ATI/AMD and nVidia would have gotten higher profits if they simply had priced their cards higher. The underlying assumption is that their market is captive, and that graphics ASICs will sell in the same (total) numbers regardless of what is offered to the market.
    While there is some truth to that, their sales the last couple of quarters also show that the assumption of a captive audience is largely false. Higher prices could have been productive for the respective companies if it didn't cause their sales to drop even further, and if OEMs didn't decide to move even more strongly towards IGPs for cost reasons, and if that lower market penetration didn't further weaken the position of graphics as a sales driver.
    The ASPs of desktops and notebooks are falling. Gaming, with the misguided(?) help of Microsoft is moving away from PCs.
    I don't think AMD or nVidia has a choice.
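    The argument above turns on price elasticity of demand: whether a price change raises or lowers revenue depends on how strongly unit sales respond. A minimal sketch of that trade-off, using a constant-elasticity demand model with entirely hypothetical prices and volumes (nothing here comes from real AMD/NVIDIA sales data):

    ```python
    # Constant-elasticity demand: units sold scale as (price/base_price)^(-elasticity).
    # All figures are illustrative assumptions, not real market data.

    def revenue(price, base_units, base_price, elasticity):
        """Revenue at a given price under constant-elasticity demand."""
        units = base_units * (price / base_price) ** (-elasticity)
        return price * units

    BASE_PRICE, BASE_UNITS = 199.0, 1000.0  # hypothetical launch price and volume

    # Price-sensitive market (elasticity > 1): a price cut *raises* revenue.
    print(revenue(149.0, BASE_UNITS, BASE_PRICE, elasticity=2.0)
          > revenue(199.0, BASE_UNITS, BASE_PRICE, elasticity=2.0))

    # Captive market (elasticity < 1): the same cut *lowers* revenue.
    print(revenue(149.0, BASE_UNITS, BASE_PRICE, elasticity=0.5)
          < revenue(199.0, BASE_UNITS, BASE_PRICE, elasticity=0.5))
    ```

    Whether the price war was productive or self-destructive hinges entirely on which regime the discrete-GPU market was in, which is the point of contention in this thread.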
     
  13. Speccy

    Newcomer

    Joined:
    Oct 11, 2002
    Messages:
    86
    Likes Received:
    6
    G92 was launched just 7 months before RV770 at an intended pricing of $199-$249 - a very similar segment to where RV770 was clearly targeted. The die size of G92, the best indicator of cost, is about 30% larger than RV770's.

    If you want to suggest that AMD's pricing of RV770 is dragging the market down, then you should look at NVIDIA and G92 prior to RV770.
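    Why die size works as a cost proxy: wafer cost is roughly fixed per run, so candidate dies per wafer fall almost linearly with die area, plus extra losses at the wafer edge. A rough sketch using the standard first-order dies-per-wafer approximation - the die areas below are assumed for illustration, not official figures:

    ```python
    import math

    def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
        """Approximate gross (pre-yield) dies per wafer.

        First-order estimate: wafer area divided by die area, minus a
        correction for partial dies lost around the wafer edge.
        """
        r = wafer_diameter_mm / 2.0
        wafer_area = math.pi * r ** 2
        edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
        return int(wafer_area / die_area_mm2 - edge_loss)

    # Illustrative areas: a ~256 mm^2 die vs. one ~30% larger (~333 mm^2).
    small = dies_per_wafer(256.0)
    large = dies_per_wafer(333.0)
    print(small, large)  # the smaller die gives noticeably more candidates per wafer
    ```

    More candidate dies per (fixed-cost) wafer means a lower cost per chip, before even counting the smaller die's better defect yield - which is why the thread keeps returning to die size when arguing about "fair" pricing.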
     
  14. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,723
    Likes Received:
    242
    I somewhat disagree.

    I believe GT300 will be a major architectural change, even if perhaps not an extremely drastic one.

    I expect GT300 to be more of a change from GT200 than NV40 -> NV47/G70, or G80 -> GT200.

    Perhaps somewhat less than NV25 -> NV30. I'd say going from GT200 to GT300 will be somewhat like going from NV47/G70 -> G80.

    GT300 is supposed to be Nvidia's first new architecture since G80 in 2006, and it'll be Nvidia's answer to Larrabee. GT300 is, finally, NV60, whereas everything we've had after G80/NV50 has been basically NV5x.
     
  15. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    You know you aren't making much sense here, because with G92, nV was able to maximize their margins (net and gross profits too) for two quarters straight (the two quarters prior to the economic downturn).
     
  16. Speccy

    Newcomer

    Joined:
    Oct 11, 2002
    Messages:
    86
    Likes Received:
    6
    What? The launch pricing was lower than what the RV770 models launched at, despite a significantly larger die size.
     
  17. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    What cards are you talking about - G92+ or G92? First you say 7 months before the launch of RV770, then you say the launch price was lower; that doesn't make much sense. The 9800 GTX was launched 7 months before, but then you have the 8800 GT, and both of these cards were still more expensive than the range you are stating. The G92+ came out at the price you stated, but not in the time frame you stated.
     
  18. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,455
    Likes Received:
    471
    The Pro and GT were only short-term products - clearance of stock before RV670 arrived. The majority of them were re-flashed HD 2900 XT boards with a 512-bit memory bus, although the original plan was to use a cheaper 256-bit PCB.

    Cost-down products aren't good indicators of yields - manufacturers often use full-fledged GPUs, because demand for these cheap products is higher than the number of defective parts. It's better for the manufacturer to sell some GPUs at lower prices than to leave the segment to the competitor.

    You can also notice that there are even HD 4830 boards whose SIMDs are not even disabled down to 640...
     
  19. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
    Hm, I think the 2900 GTs were all 256-bit. The VRM circuitry was also simplified, there was no Rage Theater ASIC, and so on - clicky!
     
  20. LordEC911

    Regular

    Joined:
    Nov 25, 2007
    Messages:
    877
    Likes Received:
    208
    Location:
    'Zona
    Ummm... I was talking about Nvidia vs AMD/ATi; I wasn't including Intel in those numbers - see "discrete."
    Obviously we get our numbers from different sources.
     
    #900 LordEC911, Apr 25, 2009
    Last edited by a moderator: Apr 25, 2009