[H]ardOCP (Kyle Bennett) thoughts on ATI/AMD, Intel, and Nvidia.

Discussion in 'Graphics and Semiconductor Industry' started by ChrisRay, Apr 30, 2007.

  1. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    #1 ChrisRay, Apr 30, 2007
    Last edited by a moderator: Apr 30, 2007
    digitalwanderer likes this.
  2. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,297
    Likes Received:
    464
    I wonder if Kyle has considered the possibility that the reason today's GPUs consume so much power has less to do with them being piece-of-crap designs by know-nothing neophytes and more to do with all the tasks they have to perform? Those FLOPS have to come from somewhere, you know. As such, the running theme that all that is needed to bring power consumption down to nothing is some magical insight only people with CPU-building experience can possess seems rather strange.
     
  3. Voltron

    Newcomer

    Joined:
    May 25, 2004
    Messages:
    192
    Likes Received:
    3
    As if CPU makers can magically make GPUs; besides, NVIDIA is increasingly investing in custom layout itself.

    AMD buys ATI so R700 is going to be amazing?

    If the underlying 8000 series architecture is more efficient, as does seem to be the case, one wonders how long it will take for ATI to bring out a new architecture. NVIDIA responded pretty quickly to the FX series with NV40, but underlying architectures seem to be sticking around longer these days, and one can't exactly say AMD's forte is bringing products to market quickly, at least relative to NVIDIA.
     
  4. MulciberXP

    Regular

    Joined:
    Oct 7, 2005
    Messages:
    331
    Likes Received:
    7
    ATI having more power-efficient designs? Has he forgotten why NVIDIA has made up so much ground in the laptop space over the past few years? And the rest of his assertions seem to assume AMD is going to somehow help them be more power efficient in the future, what, by giving them fab space? I doubt it.
     
  5. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,297
    Likes Received:
    464
    Never mind that the P4 was a glowing abomination or that 90nm Athlon X2s have sucked juice pretty badly; those CPU people know what they are doing. In fact, I am sure they can put out a 400-500 GFLOPS chip drawing 15W.
     
  6. Voltron

    Newcomer

    Joined:
    May 25, 2004
    Messages:
    192
    Likes Received:
    3
    Also, isn't CUDA in beta? What bizarre criticisms, especially since NVIDIA is clearly excited about the double-precision part coming this fall.

    Inflexible management? What does that mean?
     
  7. Killer-Kris

    Regular

    Joined:
    May 20, 2003
    Messages:
    540
    Likes Received:
    4
    That's a frustration that I've started to have with a particular author from another one of those big sites. They don't seem to realize that most CPU workloads have a hard time surpassing an IPC of 1.0, which on a chip like C2D means that 75% of the ALUs are idle. Now if they consider that the GPU's ALU idle time could be close to the reciprocal of the CPU's (or less), it should become painfully obvious why one can be made to sip power and not the other.
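    To put that utilization arithmetic in concrete terms, here is a rough back-of-envelope sketch; the 4-wide ALU issue width and the 90% GPU occupancy figure are illustrative assumptions, not numbers from the post.

```python
# Back-of-envelope ALU utilization, with illustrative numbers only.

def alu_utilization(sustained_ipc, alu_slots):
    """Fraction of ALU issue slots doing useful work per cycle."""
    return min(sustained_ipc / alu_slots, 1.0)

# CPU side: sustained IPC of ~1.0 on a core assumed to have 4 ALU issue slots.
cpu_util = alu_utilization(sustained_ipc=1.0, alu_slots=4)
print(f"CPU ALU utilization: {cpu_util:.0%} ({1 - cpu_util:.0%} idle)")

# GPU side: a shader-heavy workload assumed to keep ~90% of 128 scalar units busy.
gpu_util = alu_utilization(sustained_ipc=0.9 * 128, alu_slots=128)
print(f"GPU ALU utilization: {gpu_util:.0%} ({1 - gpu_util:.0%} idle)")
```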
     
  8. Arun

    Arun Unknown.
    Moderator Legend Veteran

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    299
    Location:
    UK
    That editorial is hilarious. Thanks for a good laugh. ;)
    Wait... It was a joke, right?
     
    digitalwanderer likes this.
  9. AlBran

    AlBran Ferro-Fibrous
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    20,725
    Likes Received:
    5,818
    Location:
    ಠ_ಠ
    Uh huh... I suppose he's a master engineer and knows all about efficiency. Gosh, what were all those ATI engineers thinking...

    Oh yea... these so-called larger system builders outfit their systems with only the top of the line components.... How could I forget. :runaway:

    I see Kyle is a Master of AutoMagics.



    I do agree that things look grim with AMD and their apparently lagging release schedule of *new* designs though. IMHO, AM2 was just pathetic, and the constant delays of the R6xx family are getting on my nerves. :(
     
  10. JoshMST

    Regular

    Joined:
    Sep 2, 2002
    Messages:
    465
    Likes Received:
    18
    Well, it is an editorial. I think quite a few of us have been guilty of producing one of those now and again. I don't know how much Kyle knows about the internal workings of these chips, or why they perform the way they do, but he certainly knows things from the end user's perspective. Just because he is taking that point of view for most of his arguments does not mean it is invalid... because this is supposed to be a customer-centric set of businesses.

    While I don't agree with a lot of his contentions in the article, he certainly has a right to put forward his point of view.

    Some years back I wrote an article called "Slowing Down the Process Migration" in which I talked about what I thought some trends were going to be. One of the biggest points of the article was that these companies are going to reach a point where die shrinks become harder and harder, and to keep die sizes at manageable levels they are going to have to get more functionality out of fewer transistors. NVIDIA's use of custom scalar units certainly underlines this point, as they have significantly increased performance without doubling that particular transistor count... and they do it by increasing the clock speed. While the promise of the Intrinsity technology never appeared to go anywhere, NVIDIA taking the tough choice to do custom portions of the GPU is certainly a far-reaching decision. Within two generations' time we will end up seeing more portions of the GPU going full custom, which will make for better performance and power consumption vs. a traditional standard-cell design. So instead of having 256 scalar units of standard cell, we have 128 units running at more than double the speed without much more power consumption. I certainly hope ATI/AMD can do something like this in short order...
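    To make that trade-off concrete, here is a minimal sketch of the throughput arithmetic, assuming one op per unit per clock; the 1.35 GHz figure roughly matches G80's shader clock, while the 0.675 GHz standard-cell clock is purely an assumption for illustration.

```python
# Peak shader throughput = units * ops_per_clock * clock. Illustrative figures.

def peak_gops(units, clock_ghz, ops_per_clock=1):
    """Peak operations per second, in Gops/s."""
    return units * ops_per_clock * clock_ghz

wide_standard_cell = peak_gops(units=256, clock_ghz=0.675)  # many slower standard-cell units
narrow_custom = peak_gops(units=128, clock_ghz=1.35)        # half the units, custom layout, 2x clock

print(f"256 units @ 0.675 GHz: {wide_standard_cell:.0f} Gops/s")
print(f"128 units @ 1.35 GHz:  {narrow_custom:.0f} Gops/s")  # same peak from half the units
```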
     
  11. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,270
    Likes Received:
    1,785
    Location:
    Winfield, IN USA
    Very insightful, very forward thinking. :yep2:

    I indeed did find Uttar's comments most illuminating. ;)
     
  12. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    Wanna have a really good laugh?
    Try to read some of the letters that Fudo's just published on his site.
    100% fat-filled comedy. :D
     
  13. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha Subscriber

    Joined:
    May 14, 2005
    Messages:
    1,373
    Likes Received:
    242
    Location:
    NY
    I will say this in favor of Kyle: I don't think Nvidia's days will get any easier. While I don't foresee them folding anytime soon, I do believe trouble is on the horizon for Nvidia.
     
  14. JoshMST

    Regular

    Joined:
    Sep 2, 2002
    Messages:
    465
    Likes Received:
    18
    I don't think things have ever been easy for NVIDIA; rather, they just made it look easy. Except for the hiccups that were NV1 and the GeForce FX, they have had a pretty amazing record. From what I understand, the climate there is one of "everybody works hard, people don't play games unless they are in Dev Rel". They have a solid roadmap, they have expanded into areas that have made them money, and they work incredibly hard. From the CEO to the lowest janitor, everybody there comes early and stays late. It's just the culture they tend to foster. Oh, and they are apparently one of the best companies to work for in Silicon Valley when it comes to pay, bonuses, incentives, and benefits.

    So, while I agree that things won't get easier for them, I think it has constantly been a struggle for them to get to where they are. ATI was a larger, older company with more engineers, but NV was able to catch up. Even with the tremendous boost ATI had with the R300 series, NV caught up again after a year and a half.

    Still, it will be interesting to see where NV goes when obviously there is a trend towards convergence of GPUs and CPUs. Don't they figure it will be about 10 to 15 years before we get truly lifelike 3D scenes? Curious what they will do when they seemingly hit the visual performance wall.
     
  15. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha Subscriber

    Joined:
    May 14, 2005
    Messages:
    1,373
    Likes Received:
    242
    Location:
    NY
    Which is why I am foreseeing problems for Nvidia.

    I think that one day there will not be any more video cards, sound cards, physics cards, etc. (but the Killer NIC card will live on forever! :lol: :roll: ), because everything will eventually be rendered/processed on several CPUs set up in a parallel configuration.

    I also believe this will happen sooner than most think; I have very high hopes for technologies like Fusion (obviously Fusion won't be targeted at the high end in the beginning, but I think the potential of Fusion will have a profound impact on the industry). That, I think (perhaps I am wrong), is the point Kyle was trying to get across. If the future of computers is technologies like Fusion (and whatever Intel's version is called), where does that leave Nvidia? Not an easy question to answer (especially if you're Nvidia).
     
  16. wishiknew

    Regular

    Joined:
    May 19, 2004
    Messages:
    332
    Likes Received:
    6
    Was Kyle in Tunisia, or left out in the cold?
     
  17. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    You mean left out in the sweltering desert :?: :wink:
     
  18. jimmyjames123

    Regular

    Joined:
    Apr 14, 2004
    Messages:
    810
    Likes Received:
    3
    I don't see why NVIDIA would have any more problems than other companies. They may have disadvantages in some areas, but other companies will have disadvantages in other areas. Developing higher and higher performance programmable graphics solutions is extremely tricky, and the hardware design is only half the battle. I think that the unique aspects of G80 in comparison to prior NV architectures are pretty good evidence that the company is willing and able to innovate and evolve over time to produce novel graphics solutions.

    It's just not wise to write off anyone at the moment. I remember that a few years ago, lots of people had written Nintendo off as it was competing against behemoths Sony and MSFT. Now look at how successful they have been simply by coming up with a very innovative and novel approach.
     
    #18 jimmyjames123, May 1, 2007
    Last edited by a moderator: May 1, 2007
  19. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    I think many people forget that nV has been the best for the longest period of time because they push the envelope with their products and make them sell much better than any other graphics card company, past or present. As a company they are very good too, and that is probably being modest.
     
  20. jimmyjames123

    Regular

    Joined:
    Apr 14, 2004
    Messages:
    810
    Likes Received:
    3
    Yeah, I agree. Look at their track record of big successes over the past few years: GeForce 2, GeForce 3, GeForce 4 Ti, 6800, 7800, 8800 (the only exception being NV30). It's no fluke that they keep having so many successful generations of cards; they have a Razor-sharp focus on advancing visual processing solutions (no pun intended :) ).
     
    Razor1 likes this.