The AMD Execution Thread [2007 - 2017]

Discussion in 'Graphics and Semiconductor Industry' started by overclocked_enthusiasm, May 28, 2007.

Thread Status:
Not open for further replies.
  1. Raqia

    Regular

    Joined:
    Oct 31, 2003
    Messages:
    508
    Likes Received:
    18
I'm liking Read as CEO so far. The SeaMicro acquisition and the move into ARM are the kinds of good risk/reward bets I like to see, instead of plowing money into diminishing returns by trying to eke out 10% more performance on high-end CPUs to compete with Intel, only to fail to execute in the end. He realized that a Piledriver refresh on the old socket is enough to address most of their addressable market while minimizing execution risk. I remember him pointing to the Bobcat APU as a wonderful, revenue-generating product at a time when AMD was in the doldrums after Bulldozer.

The Never Settle bundles have worked brilliantly. Xbitlabs recently ran an article declaring the end of benchmarks as a driver of GPU sales because of the value such deals add for the consumer, and I totally agree. AMD clawed back a lot of market share from Nvidia as a result. The thin margins on the console wins matter less than the potentially decade-long ecosystem AMD has laid down in the consumer space for its GPU products. Mantle is perfect for leveraging this, and it's something devs have been clamoring for. It might also indirectly solve their driver issues, since it's probably easier to write good Mantle drivers and leave the rest to the devs once that low-level path exists.
     
  2. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
Yeah, I agree. Read knows what he's doing. The previous CEOs never gave me the impression of a long-term plan.
     
  3. overclocked_enthusiasm

    Regular

    Joined:
    Apr 26, 2004
    Messages:
    424
    Likes Received:
    3
    Location:
    United States
    http://www.tomshardware.com/reviews/radeon-r9-290-driver-fix,3666.html

    "AMD is basically doing two things. First, it's fixing an issue that should have never made it past quality control and tightening up the variance between Radeon R9 290X cards in the wild. Simultaneously, it's pushing average clock rates of all cards higher using a faster fan speed setting that no board we've seen used previously."

"The bad news is that I really couldn't imagine buying an R9 290 equipped with AMD's reference cooler, particularly in light of today's update that adds even more fan speed and noise. The good news is that I now have higher hopes for third-party 290s. With Catalyst 13.11 Beta 9.2, our Sapphire Radeon R9 290 is just as fast as Asus' Radeon R9 290X, tested on the previous page. If we could just get our hands on more aftermarket cooling solutions, I'm pretty sure we could chip away at the most compelling reasons not to buy these boards today."

    Grrrr...this cooler issue is a joke. Self-inflicted wounds are not needed at this point. Those partner boards need to show up yesterday. What percentage of boards sold use reference coolers? 80? 90?
     
  4. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,515
    Likes Received:
    934
Yeah, I haven't seen a computer hardware company shoot itself in the foot like this in a long time; perhaps ever. It's truly baffling.

AMD built a $20,000 Ferrari but forgot to put in the windows.
     
  5. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
AMD could have and should have improved the cooler: bigger fan, better blades, quieter bearings, better materials, etc. Instead they reused the same old cooler they've had for years. That matters more than ever now that performance, throttling, and noise are closely linked to the chip's temperature.

It's a shame, given how otherwise great the 290s are, that a poor cooler has overshadowed the launch. Third-party coolers (there are already some aftermarket coolers available) have shown that they make a massive difference to heat, noise, and performance. Now the 290s are going to be stuck with a mostly undeserved reputation for problems that will simply disappear when good non-stock coolers reach the market.
     
  6. overclocked_enthusiasm

    Regular

    Joined:
    Apr 26, 2004
    Messages:
    424
    Likes Received:
    3
    Location:
    United States
If the holy grail truly is beating Nvidia at all costs, why not mitigate your biggest flaws (heat, noise) by slotting the cards into slightly higher price points with a better cooling solution? The wide range of graphics card pricing at the high end clearly gives you the cover to do so. It smacks of poor execution or a lack of forethought...either is troubling.

Any reference design where heat exceeds X, or fan noise exceeds Y, needs to be handled differently. These are warning signs of immature silicon, poor design, or simply pushing the bleeding edge to beat the competition, as with the 290. In any event, you have to mask these thermal and acoustic deficiencies with a better cooling solution so the story doesn't become "that hot, loud card" instead of "that fast, cool card". $50 per card seems like a bargain to me, and retail WILL pay for it. How many people even know to look for aftermarket coolers or partner boards? Think of the big percentage of neophytes who get their computers/cards from Dell or Best Buy...they get reference coolers and will form a poor opinion of AMD/Radeon.

Mind share requires an investment...it requires consistent execution...or the price is bad press and lower market share...pick your poison. Save on the front end, or pay on the back end.
     
  7. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
It wouldn't be a problem if cards with third-party coolers were available now. My understanding is that AMD won't let board partners ship non-stock coolers until some embargo date passes.

I'm guessing AMD wants consistent initial reviews from the stock coolers so that partners can differentiate themselves down the line, but it's having a negative effect, because the only cards generally available have underperforming stock coolers.

So it ends up being another face-palm moment from AMD, all the more frustrating because the 290 is such a good product, yet AMD manages to snatch defeat from the jaws of victory yet again.
     
  8. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,365
    Likes Received:
    3,955
    Location:
    Well within 3d
As far as forethought goes, AMD would have been aware of the consequences of using a cooler that drew PR flak when it was used on the 7970 GHz Edition.
The response then was to wait for third-party coolers and let the brand take a hit from the bad first impression.
The best-case reading is that AMD didn't take as bad a hit from that as it seemed, or that this is a bone it's throwing to its partner companies.

The PWM/RPM variance problem could be at least in part an execution problem that might have been caught with fuller validation of the mechanical behavior of a large enough test pool of cards.
It could also be an unnoted motor revision in the component inventory at some point, or some other production-run change, which could be missed.

I think there's a good argument that this could have been caught if the time were invested in characterizing enough of the final physical products for the experience they provide.
However, I haven't ruled out that the 290 team was given X amount of money by the bean counters, and that money for various reasons only extended to giving the cooler racing stripes.
     
  9. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,515
    Likes Received:
    934
    This thought occurred to me as well, but if so, it seems pretty dumb. AMD's brand took a serious hit from this, and what do they have to show for it? It would be a needless sacrifice.
     
  10. overclocked_enthusiasm

    Regular

    Joined:
    Apr 26, 2004
    Messages:
    424
    Likes Received:
    3
    Location:
    United States
AMD Developer Summit 2013: Day 1 Highlights

    http://img.new.livestream.com/event...4a49b-35f4-4300-ad4d-aaed822d6408_640x387.jpg

    http://img.new.livestream.com/event...89665-a2c4-445b-a877-10e49f7aa739_640x375.jpg

    http://img.new.livestream.com/event...e5304-51be-4e09-a901-2cf00728bed6_640x387.jpg

"Kaveri will integrate AMD's TrueAudio and Project Mantle technologies, allowing it to access "console level quality" and bringing console-esque optimisation techniques to the PC world"

    "John Taylor is back on stage for a 1920 x 1080 "competitive demo" of an i7-4770K + GT 630 vs a Kaveri A10 APU running Battlefield 4"

    "A10 allowing 33 fps @ 1920 x 1080 on medium graphics, Intel i7-4770K + Nvidia GT 630 only manages 13 fps. DICE has announced that a Mantle optimised version will be coming, promising only improvements from here on out"

    ""Smartphones to Super-Computers": AMD describes APUs as providing a single scalable solution where it uses the language developers are already familiar with, share data structures, access to all of virtual and physical memory and multi-core coherency."

    http://img.new.livestream.com/event...1adc9-7667-4dee-a299-0f4b8e83cd54_640x377.jpg

    http://img.new.livestream.com/event...94e2e-e2eb-4a3b-8b12-f286939b56d0_640x377.jpg
     
  11. overclocked_enthusiasm

    Regular

    Joined:
    Apr 26, 2004
    Messages:
    424
    Likes Received:
    3
    Location:
    United States
BTW, a $340 CPU (i7-4770K) with a $60 GPU (GT 630)?? Hardly a typical setup...
     
  12. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,073
    Likes Received:
    1,158
    Location:
    Treading Water
No, a more typical setup would be the onboard graphics. Would it fare much better?
     
  13. overclocked_enthusiasm

    Regular

    Joined:
    Apr 26, 2004
    Messages:
    424
    Likes Received:
    3
    Location:
    United States
More of an indictment of the outdated GT 630 and the weak onboard graphics in Haswell than a showcase of Kaveri's strengths. Calling out the competition's best CPU like that seems a bit disingenuous to me.

How about i7-4770K + GTX 780 vs Kaveri + 290X? It's like comparing oranges to watermelons, pitting an AMD APU against a high-end Intel CPU with weak onboard graphics. Not sure there is an apples-to-apples comparison to be had...is there?
     
  14. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,073
    Likes Received:
    1,158
    Location:
    Treading Water
The point is to show off the power of their APU; sticking a 290X in the box is showing a different product.
     
  15. overclocked_enthusiasm

    Regular

    Joined:
    Apr 26, 2004
    Messages:
    424
    Likes Received:
    3
    Location:
    United States
Is Kaveri going to play in the same sandbox as the i7-4770K? If not, I don't get what they're trying to prove. That the Kaveri GPU is stronger than onboard Haswell plus a $60 discrete card from 18 months ago? If anything, it muddles the picture. Picking a lower-end i5 and a stronger Nvidia GPU would have been more impressive to me...and less obvious.
     
  16. RecessionCone

    Regular Subscriber

    Joined:
    Feb 27, 2010
    Messages:
    501
    Likes Received:
    178
    Yes, but Kaveri will not compete against the GT 630...
     
  17. AlphaWolf

    AlphaWolf Specious Misanthrope
    Legend

    Joined:
    May 28, 2003
    Messages:
    9,073
    Likes Received:
    1,158
    Location:
    Treading Water
It will compete against a lot of low-end GPUs. As long as you can still buy a GT 630, it's one of them.
     
  18. RecessionCone

    Regular Subscriber

    Joined:
    Feb 27, 2010
    Messages:
    501
    Likes Received:
    178
    That's a strange argument. There are a lot of antiquated things you can still buy that Kaveri doesn't compete with.

    It would have been much more impressive to show it beating Iris and Iris Pro. Even though Iris and Iris Pro will likely be superseded by Broadwell parts soon after Kaveri hits the market.
     
  19. Raqia

    Regular

    Joined:
    Oct 31, 2003
    Messages:
    508
    Likes Received:
    18
  20. overclocked_enthusiasm

    Regular

    Joined:
    Apr 26, 2004
    Messages:
    424
    Likes Received:
    3
    Location:
    United States
So if the current high-end AMD APU is $150, will Kaveri be at the same price point? $250? If so, all the more reason to show it against something it will actually compete with, not the i7-4770K. I guess I'm confused about what AMD is implying here.

    1. "Kaveri's onboard GPU blows away the onboard GPU in Haswell even with a $60 discrete card. We'll throw in the i7 to obfuscate and to smear their flagship"

    2. "Kaveri with (insert discrete here) will crush i7 with (insert discrete here)"

    3. "Kaveri is an APU and Intel really doesn't have anything with the level of dedicated graphics support, so we stand alone in the space. We'll throw in the i7 to obfuscate and to smear their flagship."

I doubt it's 2...so I'll vote for the 1+3 combo platter. Again, they're comparing $400 worth of Intel/Nvidia hardware to a $250 (guess??) Kaveri APU instead of showing it against, say, an i3-3240 ($119) + GTX 550 Ti Boost ($139), which is also $258 combined.

For starters, the GTX 550 Ti Boost is about twice as fast as the GT 630. God forbid AMD do an above-board comparison on GPU power...UNLESS, of course, they're claiming the processing power of the Kaveri CPU is on par with or better than the i7-4770K?
     