[H]ardOCP (Kyle Bennett) thoughts on ATI/AMD, Intel, and Nvidia.

Discussion in 'Graphics and Semiconductor Industry' started by ChrisRay, Apr 30, 2007.

  1. Techno+

    Regular

    Joined:
    Sep 22, 2006
    Messages:
    284
    Likes Received:
    4
By "swallow" I don't just mean pull on-die. I mean pull on-die with the best performance/mm² ratio.
     
  2. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
    Maybe the chipset would take care of that ;)

I see the CPU becoming a mere peripheral like the chipset nowadays, and the "upgrades" shifting more toward add-in modules tailored for specific tasks, in which they'll be much more capable than a CPU. Not much of a difference compared to today, just on another scale.
     
  3. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,133
    Likes Received:
    454
    Location:
    en.gb.uk
    Well there's no real reason to demand or expect a homogeneous solution, is there?
     
  4. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
ELSA isn't really an Nvidia-only partner; they were never exclusive before 2000 anyway. They just made cards based on whatever was available: the original Gloria L/XL adapters were based on S3 ViRGE chips with varying amounts of memory.
     
    #64 neliz, May 2, 2007
    Last edited by a moderator: May 2, 2007
  5. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
The S3 ViRGE (the "Decelerator", as it was known back then :D) was around long before the term "GPU" was coined by Nvidia for the original GeForce 256 in late 1999, so I think my statement is still valid. ;)
     
  6. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
Well, nVidia's GPU term was made up to cover any device that assists the CPU in graphics processing, which definitely includes all those S3 chips that had some sort of hardware capability... just not in DX/Glide.
     
  7. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
"GPU" came up with the GF256 and its T&L unit; before that, the term wasn't in use.
     
  8. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
I think the coining of that marketing term has nothing to do with what we call GPUs today; even Nvidia itself changed its 1999 definition of GPU (T&L + 10 million polys) to the more common "any processor to which the CPU can offload graphics tasks".

Part of their nsist campaign, since not everyone cares about hardware T&L but likes "accelerated photo processing" better...
     
    #68 neliz, May 3, 2007
    Last edited by a moderator: May 3, 2007
  9. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
What's this?

http://www.digitimes.com/mobos/a20070507PD208.html


Hmm, a "Nehalem" chipset license (that would be strange, considering the object of the exchange was GPU tech), or (yes, I know this sounds thin, at best) an x86 license?
     
  10. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
A trade of technology, or are we talking about advanced integration of hardware or IPs?
     
  11. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    When I read that at Digitimes this morning, fairly or not, my first reaction was that Intel is famous for strong-arming partners into cross-licensing IP as the price for playing on their platform. Dunno that's what's going on here, as the sourcing and details are mighty thin, but it certainly would fit the long-term historical pattern.
     
  12. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
I don't see Intel sticking out a hand either. "Hello nVidia, want to make competing hardware?"

Aren't we just talking about SLI plus some low-level communication stuff here?
     
  13. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
Maybe. But if NV foresees the sun setting on a good chunk of their AMD chipset business (and I think it likely they do foresee that), then possibly something a little broader is afoot.
     
  14. Voltron

    Newcomer

    Joined:
    May 25, 2004
    Messages:
    192
    Likes Received:
    3
Of course, licensing agreements are secretive, but considering NVIDIA's previous refusal to enter the Intel chipset market on account of unfavorable licensing terms, there is reasonable speculation that the current chipset license does not involve royalty payments.

JHH has frequently pointed out that NVIDIA has a lot of IP too. So as mighty as Intel might be in the IP department, NVIDIA is hardly a slouch. And key patents are key patents, no matter how many total patents you have. I don't think it would be too much of a stretch to think that NVIDIA has long been building up IP leverage to play against Intel.
     
  15. Maintank

    Regular

    Joined:
    Apr 13, 2004
    Messages:
    463
    Likes Received:
    2
No surprise really. Nvidia has a choice: remain loyal to AMD and hope they don't get backstabbed by a sinking ship, or make a deal with the devil in the hope that they can beat said devil at its own game.

The Intel market is much bigger and more stable than AMD's.
     
  16. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,297
    Likes Received:
    464
It might be bigger and more stable, but Intel doesn't like sharing that market with anyone. BTW, despite JHH's rhetoric about how they are "the last one standing", "winner by default" and "exactly where they want to be", if Intel makes them a good offer (let's say $42 a share), Nvidia will vanish in a snap of the fingers.
     
  17. Techno+

    Regular

    Joined:
    Sep 22, 2006
    Messages:
    284
    Likes Received:
    4
It makes me really sad :cry: that the AMD/Nvidia relationship is no longer as good as it was before the ATI acquisition. They made really good partners and were meant for each other.
     
  18. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,141
    Likes Received:
    1,671
    Location:
    Winfield, IN USA
    Considering they bought nVidia's biggest competitor though, didn't you foresee just a few hiccups in their relationship? :|
     
  19. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,297
    Likes Received:
    464
I think one of the often-overlooked reasons why the AMD-Nvidia relationship deteriorated is the general decline of AMD's fortunes. Intel has always had a bigger slice of the pie, but until recently AMD had a multi-year grasp on the enthusiast community and was growing across the board. Intel, on the other hand, was knee-deep in the Netburst nightmare and was never keen on letting other companies onto their turf anyway. The Nvidia-AMD relationship was natural: given a choice between a company with a superior product and an eagerness to partner, and one with inferior CPUs and a crappy attitude, who wouldn't go with the former?

    Now, let's see what happened since:
    1) AMD bought ATI. They said all the right things about keeping things open and blah-blah, but now they are in direct competition with Nvidia across the board: everything Nvidia sells, AMD would rather have you buy from them.
    2) Nvidia broke into Intel market. You'd rather have 20% of 80% than 50% of 20%.
    3) AMD lost the technological leadership. The same people who were buying Nforce 2s for their Athlons are now shopping for Core2 systems.

    3 years ago, both companies needed each other. Today, neither does.
     
    Tahir2 likes this.
  20. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,418
    Likes Received:
    411
    Location:
    New York
And that can change in the blink of an eye. Nvidia is in a sticky position here: there is no guarantee that AMD won't rise again as the darling of the enthusiast community, and if they do, Nvidia will want to be there with competitive products supporting AMD CPUs.
     