AMD: Sea Islands R1100 (8*** series) Speculation/ Rumour Thread

Discussion in 'Architecture and Products' started by Shtal, Dec 31, 2011.

  1. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    It doesn't have to be that specific, if only because everybody should be low-power oriented by now. :wink: Video, fabrics, CPU architecture, caches, signal theory, overall SoC expertise, etc. I see tons of recruiter emails, old colleagues leaving, candidates being interviewed, people joining, including fresh college grads and not just from first-tier schools(!). Lots of people changing chairs, and chairs being added.
     
  2. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    What is the benefit/profit? :lol: Workstations with the performance of a phone or tablet? :lol:
    This is ridiculous. First of all, we would lose the beauty of high-power beasts; second, you would need several generations just to catch up to the old performance; and finally, there will always be people who need maximum performance at any cost.
    Low power = less work performed in any given period of time, or pure waste.
     
  3. McHuj

    Veteran Regular Subscriber

    Joined:
    Jul 1, 2005
    Messages:
    1,416
    Likes Received:
    534
    Location:
    Texas
    Qualcomm and Samsung are hiring a lot from AMD in Austin.
     
  4. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    I agree. I work for a very large corporation, and we do not let go of our elite people; the few who do decide to leave are replaced with very good young talent with fresh ideas.

    There has been a recession going on in North America for seven years (maybe more), and your comments about not having a budget reflect what a lot of companies are dealing with.
    When you sit back and think about what AMD attempted to do: take on Intel and win market share (which they did) while at the same time competing with Nvidia (sometimes beating them in performance, sometimes losing), it is pretty impressive. But I think it was too aggressive for them, taking on both giants at the same time.
    If AMD had good forecasting, they could have had a mobility chip and beaten Intel and Nvidia to market. Their current business model focusing on servers is sound, as all these handhelds do need to talk to something, but AMD does have expertise in CPU/GPU development and could have been a major player.
     
  5. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    He didn't mean that we should give up on 100W chips. He meant everybody should be focused on power efficiency.
     
  6. copperiodide

    Newcomer

    Joined:
    Oct 28, 2011
    Messages:
    1
    Likes Received:
    0
    Exactly, even at 100W it is: can I do more in 100W than the competition?
     
  7. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    If you don't use low power design techniques on a chip that has a 100W budget, your competitor will make the same chip with a 50W budget.

    Thanks for playing, though.
     
  8. Love_In_Rio

    Veteran

    Joined:
    Apr 21, 2004
    Messages:
    1,444
    Likes Received:
    108
    So, what is happening, guys? Will GCN be the last great desktop architecture from AMD? Do we blame Apple and the smartphone/tablet revolution? What about Nvidia? For us graphics freaks this all sounds a little depressing, above all because people with the latest iPhone have no clue about the GPU driving their little fashion device, while we GPU lovers have all tried to understand the transistors moving the polys in our devices since the Voodoo era... Will it all be lost, like tears in the rain!?
     
    #688 Love_In_Rio, Jan 2, 2013
    Last edited by a moderator: Jan 3, 2013
  9. boxleitnerb

    Regular

    Joined:
    Aug 27, 2004
    Messages:
    407
    Likes Received:
    0
    GPU architectures will continue to evolve. The only change will be that the product cycles will slow down and new processes won't be used right away.
     
  10. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,435
    Likes Received:
    263
    No.
     
  11. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    Let me rephrase.

     
  12. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    You know, with AMD doing so badly and everybody jumping ship, you really have to wonder what is so wrong with Nvidia that they can basically only manage a tie on performance while being months late.
     
  13. Homeles

    Newcomer

    Joined:
    May 25, 2012
    Messages:
    234
    Likes Received:
    0
    This is absurd.

    The one time Nvidia doesn't make a massive monolithic die of doom, they get jumped on for "only tying" AMD. Let me also point out that most of the "abandon ship" has been happening well after the release of the 7000 series.

    In reality, Nvidia's position has improved relative to AMD's. Power efficiency is much better, perf/mm^2 is better, their microstutter is drastically reduced...

    But no, because their top chip is smaller than AMD's, there's "something wrong" with them.

    In reality, the only thing that's wrong here is the argument you've just made.
     
  14. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    Where is power efficiency better? As usual, "enthusiasts" are stuck looking at the top cards and ignoring the rest of them. Care to show me how Nvidia is winning the efficiency stakes at the low end or midrange?

    So Kepler wins in perf/watt at the enthusiast end when AMD saddles their GPU with a crapload of compute and a memory bus + memory that is 50% bigger than it needs to be? Jeez, I'm so impressed!

    AMD could hardly be in a worse state, yet they are still ahead of Nvidia's "great" Kepler architecture where it matters. That should make you and rpg wonder just how godawful Nvidia is.
     
  15. snarfbot

    Regular Newcomer

    Joined:
    Apr 23, 2007
    Messages:
    540
    Likes Received:
    188
    They could've made a GK110 consumer card if they absolutely had to.

    On the other hand, I don't feel AMD's situation is as grave as it's made out to be; they have the next gen in the oven already, I'm sure. And it's not like critical engineers just walk out one day and everything they touched gets delayed while their replacements get up to speed.
     
  16. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    I think GK107 is great, and GK106 isn't terrible either.
    Of course AMD does well too. They both make efficient GPUs, and there's not much flame material unless you want to nitpick here and there.

    I think we can order the low-to-midrange products this way, and that the ordering holds for both power use and performance:
    GTX 650 < 7770 < GTX 650 Ti < 7850

    If one vendor sucked and the other didn't, we'd see a card that is both more power-hungry and slower than the competition, as has happened in the past.
    (I happen to think the GTX 650 and 650 Ti are appealingly power efficient :razz:)
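The pecking order being debated here is essentially a perf/watt argument. A minimal sketch of that comparison, where the card names are real but every performance index and wattage below is a made-up placeholder for illustration, not a measured benchmark:

```python
# Illustrative perf/watt comparison. All numbers are hypothetical placeholders
# chosen only to show the shape of the calculation, not real review data.
cards = {
    # name: (relative performance index, assumed typical board power in watts)
    "GTX 650":    (100, 65),
    "HD 7770":    (115, 80),
    "GTX 650 Ti": (150, 110),
    "HD 7850":    (185, 130),
}

def perf_per_watt(perf, watts):
    """Higher is better: units of performance delivered per watt drawn."""
    return perf / watts

# Rank the cards by efficiency rather than raw performance.
ranked = sorted(cards.items(), key=lambda kv: perf_per_watt(*kv[1]), reverse=True)
for name, (perf, watts) in ranked:
    print(f"{name}: {perf_per_watt(perf, watts):.2f} perf/W")
```

The point of the exercise is the one made upthread: a card can win on raw performance and still lose the efficiency ranking once board power enters the denominator.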
     
  17. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    Yes, I agree, but that's what I said. Nvidia basically ties while being months late, and while Nvidia fanboys are happily putting the boot in, they should remind themselves that the best their team can muster while AMD is imploding is a draw. I see no reason for anyone to be smug right now.
     
  18. Homeles

    Newcomer

    Joined:
    May 25, 2012
    Messages:
    234
    Likes Received:
    0
    Whatever helps you sleep at night. Meanwhile, in the real world, Nvidia's raking in the dough because AMD flopped terribly with OEM wins.

    You aren't thinking rationally. Your post is saturated with confirmation bias, and you have no business partaking in rational discourse until you can demonstrate an ability to critically think.

    Why are you unimpressed? Because you want to be. Because you want AMD to win, and because you want Nvidia to lose. So you come here and post a ridiculous argument with no factual backing in order to paint Nvidia as the loser.

    This isn't the Reddit hardware hivemind. This is Beyond3D. Your drivel does not belong here, and the majority of us around here are just going to roll our eyes at you.
     
    #698 Homeles, Jan 4, 2013
    Last edited by a moderator: Jan 4, 2013
  19. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    Yeah I remember running the numbers on it. Nvidia has made a profit of about $10m a quarter on average for the past 5 years. Great stuff, really raking in the dough. :lol:
     
  20. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    :lol: Taxi for "Homeles"
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.