AMD-ATI Conference Call Transcript [LIVE]

Discussion in 'Beyond3D News' started by Arun, Jul 24, 2006.

  1. Skrying

    Skrying S K R Y I N G
    Veteran

    Joined:
    Jul 8, 2005
    Messages:
    4,815
    Likes Received:
    61
It's certainly coming; it doesn't matter if people want to think it's not. It'll be in the low end first, though, and probably for a long time. That's where its advantages will come into play.
     
  2. Junkstyle

    Newcomer

    Joined:
    Sep 18, 2005
    Messages:
    158
    Likes Received:
    1
This is a sad day for 3D graphics. AMD bought ATI for many reasons, and top-performance 3D graphics cards were probably less than 20% of why they bought them. So research into cutting-edge 3D technology is going to get de-emphasized by AMD execs. This will slow down competition: if ATI eases up, Nvidia will ease up, and progress will slow.
     
  3. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    It may also move into the mid range on multi-socket systems if we get enough on-die or on-package memory to make the performance worthwhile. That'll be some years yet, though.
     
  4. Titanio

    Legend

    Joined:
    Dec 1, 2004
    Messages:
    5,670
    Likes Received:
    51
Thread-level parallelism? You may want to have a look at some of the more exotic CPUs out there now, and, more generally, at where things may be going by glancing at Intel's roadmap.

    Not necessarily. Couldn't you have two 'CPUs' where you now have one CPU and one GPU? Each with their own pool of memory and a high speed interconnect between them?
     
  5. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    I'd say GPUs are more than just "TLP". They're SIMT. That is, Single Instruction Multiple Threads. Or at least, pixel shaders are, and ATI's current vertex shaders. G70's Vertex Shaders, on the other hand, are much more similar to a "traditional" CPU SIMD architecture. While still being incredibly different because of the kind of instructions run on them, of course.

Fundamentally, you can't come within an order of magnitude of a GPU's efficiency unless your CPU is a stream processor with excellent latency hiding (-> a huge number of registers), and even then the result is nothing to write home about. Good luck doing that on x86, anyway. An interesting case is that of CELL's SPEs, for which it can be highly preferable to "simulate" having multiple threads, or even sometimes SIMT, for certain workloads. And it's definitely possible (although painful) thanks to the huge number of registers and the large Local Store.
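The register-count argument here can be sketched with a toy model (my own illustration, not anything from the thread or any real chip): each in-flight thread context occupies registers, so the number of memory requests you can keep outstanding, and hence how much latency you can hide, is bounded by register-file capacity.

```python
# Toy latency-hiding model (hypothetical numbers). One new independent
# load can be issued per cycle, but only if a free thread context exists;
# each context needs REGS_PER_THREAD registers for its live state.
MEM_LATENCY = 20      # cycles for a load to return
REGS_PER_THREAD = 4   # registers one thread's context occupies

def cycles_to_finish(num_loads, register_file_size):
    """Cycles to complete `num_loads` independent loads, with at most
    (register_file_size // REGS_PER_THREAD) contexts in flight at once."""
    max_in_flight = register_file_size // REGS_PER_THREAD
    in_flight = []  # completion cycle of each outstanding load
    cycle = issued = 0
    while issued < num_loads or in_flight:
        in_flight = [t for t in in_flight if t > cycle]  # retire done loads
        if issued < num_loads and len(in_flight) < max_in_flight:
            in_flight.append(cycle + MEM_LATENCY)        # issue a new load
            issued += 1
        cycle += 1
    return cycle

# A big register file hides almost all the latency; a small one
# serializes the loads behind the full memory latency each time.
print(cycles_to_finish(8, 80))  # -> 28  (plenty of contexts)
print(cycles_to_finish(8, 4))   # -> 161 (one context: fully serialized)
```

With 80 registers the 8 loads overlap and cost barely more than one memory round trip; with 4 registers every load waits for the previous one, which is the efficiency gap Arun is pointing at.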

    But anyway, I'm getting carried away - this is about AMD/ATI, not CPU-GPU differentiation. So just ignore me here and try (kinda, sorta) sticking to the subject ;)


    Uttar
     
  6. DemoCoder

    Veteran

    Joined:
    Feb 9, 2002
    Messages:
    4,733
    Likes Received:
    81
    Location:
    California
    I already mentioned chips like Niagara (or Vega), but Intel ain't going there for a long time, because most workloads aren't TLP friendly today with the exception of some server stuff.


And how are you going to connect these two separate memory pools to the die? A 256-bit memory bus is already straining chip packaging to the limit, and you want to add another 64-256-bit CPU memory pool as well?

As for SIMT, I haven't heard that terminology before. Presumably it's just a specialization of TLP wherein each thread has its own context of registers, etc., but they all share a single program counter?
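That reading of SIMT can be sketched in a few lines of illustrative Python (my own toy interpreter, not any vendor's real ISA): one shared program counter drives every lane, each lane keeps its own register file, and divergent branches are handled by masking lanes off rather than by separate control flow.

```python
# Toy SIMT interpreter: N lanes, per-lane registers, ONE program counter.
# A branch doesn't change the pc; it just disables lanes that fail the
# predicate until control flow re-converges at "endif".
def simt_execute(program, lane_inputs):
    """Run `program` (a list of op tuples) across all lanes in lockstep."""
    n = len(lane_inputs)
    regs = [{"r0": x, "r1": 0} for x in lane_inputs]  # per-lane registers
    active = [True] * n                               # execution mask
    pc = 0                                            # shared by all lanes
    while pc < len(program):
        op = program[pc]
        if op[0] == "if_lt":          # mask off lanes failing the predicate
            _, reg, imm = op
            active = [a and (r[reg] < imm) for a, r in zip(active, regs)]
        elif op[0] == "addi":         # only unmasked lanes commit results
            _, dst, src, imm = op
            for a, r in zip(active, regs):
                if a:
                    r[dst] = r[src] + imm
        elif op[0] == "endif":        # re-converge: all lanes active again
            active = [True] * n
        pc += 1                       # every lane advances together
    return [r["r1"] for r in regs]

# Lanes whose r0 < 3 take the branch; the rest sit out, masked.
result = simt_execute(
    [("if_lt", "r0", 3), ("addi", "r1", "r0", 100), ("endif",)],
    [0, 1, 2, 3, 4],
)
print(result)  # -> [100, 101, 102, 0, 0]
```

The masked lanes still "step through" the branch body with everyone else, which is exactly why divergence is cheap to build but expensive to run on this kind of machine.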
     
  7. IgnorancePersonified

    Regular

    Joined:
    Apr 12, 2004
    Messages:
    778
    Likes Received:
    18
    Location:
    Sunny Canberra
Sure ... if you listened to Intel prior to Voodoo, the CPU would do everything and cut your lunch. The exact opposite is true now, though more and more functions have gone from discrete units, to integrated on the mobo, to integrated in the south bridge. There's certainly a trend towards integration or accumulation of functions, but I find it hard to believe I'll be buying a CPU that does all the GPU work as well, that this merger was the cause of that, or that ATI's future options in this area are stuck at the integrated end. It's easier to believe I'll be able to buy a relatively high-end processor and select the GPU option that goes with it - whether that be a discrete gfx card or another socketed type of gfx product. Maybe Joe Sixpack and business will buy it, but they're basically buying the forerunner of that system right now. I can afford the extra cooling, power, case space etc. for a gaming rig. I'm not uber high-end - far from it ... and further from the best integrated PC now. I doubt that is going to change.

Unless there is a massive upheaval coming in the graphics world, as indicated by other prescient threads here, and this merger is the strategic maneuvering prior to that new age dawning - in which case it was inevitable anyway. My point being, I doubt this merger just pulled the trigger on discrete gfx.
     
    #127 IgnorancePersonified, Jul 25, 2006
    Last edited by a moderator: Jul 25, 2006
  8. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
Well, no, but GPUs integrated into CPUs seem a near certainty on the AMD side now, for their low-end products.
     
  9. IgnorancePersonified

    Regular

    Joined:
    Apr 12, 2004
    Messages:
    778
    Likes Received:
    18
    Location:
    Sunny Canberra
I like this. Just read it.
CPU-integrated GPU + south bridge = low end + business machines.
CPU + cHT-socket-style gfx + south bridge = sounds like the mid-range.

High-end gaming = a similar configuration to what we have today, as it most likely will be in the workstation market.
Nothing new there, just a summation.
     
  10. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Yup, that's about what I'm expecting, and I'd be willing to bet that nVidia is going to be fighting very hard for marketshare in the mid-range socket market, if that market comes to fruition.
     
  11. IgnorancePersonified

    Regular

    Joined:
    Apr 12, 2004
    Messages:
    778
    Likes Received:
    18
    Location:
    Sunny Canberra
Previously I thought the socket concept would not fly, or would be niche at best. Now, if the acquisition goes through, it's a certainty. I'm not clear as to whether the cHT socket describes a form factor for the physical socket implementation. If it is standard, or a standard number of "lanes" becomes the norm, then there's plenty of room for all current chipset players to supply chipset/gfx combos, for mobo makers to add variety and spice, and for end users to have the choice of mixing and matching components.
     
  12. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    DailyTech, specifically Kristopher Kubicki, was kind enough to clean up their piece when the misattribution was brought to their attention. So good on them. :smile:

    http://www.dailytech.com/article.aspx?newsid=3471
     
  13. Titanio

    Legend

    Joined:
    Dec 1, 2004
    Messages:
    5,670
    Likes Received:
    51
    Here's an interesting article with comments from Patrick Moorehead of AMD:

    The AMD-ATI Acquisition: Integration and Freedom for Customers, IHVs

    Some things that stuck out at me:

    Huh?

    Further..

    Also, just so you know who's steering this ship:

     
  14. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Are there any other equals when Jen-Hsun is in the room? :lol:

    Interesting stuff, tho, particularly the "information firewall".

It's nice they are making the effort, and it will probably preserve some of their relationships. It doesn't change the fact that some of those other players will be reducing their overall footprint in the AMD world. I wonder if this might be enough, in the not-so-long run, to knock Via out of chipsets completely.
     
  15. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
Wow, I'm infected with the Nvidia conspiracy bug. :lol: The first thing that popped into my mind after Titanio's post was that this was a massive Nvidia-architected conspiracy to remove their high-end competition in the GPU space while AMD gains ATi's expertise in chipsets/integrated. I find it very hard to believe that AMD considers Nvidia their equal, but it is interesting that they are openly expressing their loyalty to ATi's biggest competitor - their dance partner, so to speak.
     
  16. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
    Well, if I really wanted to be cynical I could note that whether they need NV all that much or not down the road, they certainly do still need them for the next several months to maybe a year! And this deal hasn't closed yet, either. How do you suppose that would help the relationship if AMD was doing the "In yo' face, NV!" dance all over the place and then the damn thing fell thru? :lol:

    Edit: Said another way, AMD has absolutely no self-interest right now in annoying anyone (other than Intel, of course!) more than the inherent logic and implications of the combination already does. They don't have to close interfaces or whisper in ears to get good things happening for them in the short to mid-term from this.
     
  17. Titanio

    Legend

    Joined:
    Dec 1, 2004
    Messages:
    5,670
    Likes Received:
    51
The full article has an explanation. Using the same valuation AMD applied to ATi (1.7x market cap), an offer of $12.5bn would have been required to acquire nVidia. AMD's market cap is about the same, so a simple acquisition would not have been on the cards - more like a merger - and AMD would not necessarily have been in the driving seat as a consequence.
     
  18. DudeMiester

    Regular

    Joined:
    Aug 10, 2004
    Messages:
    636
    Likes Received:
    10
    Location:
    San Francisco, CA
    Another point about this deal that I think is worth considering:

AMD as a CPU company has the majority of their experience invested in full-custom designs. GPUs, on the other hand, are a strong blend of pre-made library units and custom units. With CPUs becoming much larger and more modular (look at multi-core and K8L), ATI's skills may be very useful in helping AMD develop CPUs efficiently and keep pace with Intel.
     
  19. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
    Heh, good point.
     
  20. Bobbler

    Bobbler Shazbot!
    Veteran

    Joined:
    May 22, 2005
    Messages:
    1,827
    Likes Received:
    29
    Location:
    Minneapolis, MN
That could work the other way around as well. Over time, as the feature set of GPUs solidifies, it's possible we'll see the GPU market start to act like the CPU market -- sticking with an architecture for years rather than changing it drastically every 9-12 months. It seems to me they can't continue the drastic architecture changes forever (on both the software - DX/OGL - and hardware fronts).
     