The LAST R600 Rumours & Speculation Thread

Discussion in 'Architecture and Products' started by Geo, Jan 2, 2007.

Thread Status:
Not open for further replies.
  1. Robin B

    Newcomer

    Joined:
    Sep 19, 2005
    Messages:
    54
    Likes Received:
    0
    If I remember right, so did Matrox. :wink:
     
  2. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    I think we're reserving judgement on that. Great DX9 card.

    Jawed
     
  3. Acert93

    Acert93 Artist formerly known as Acert93
    Legend

    Joined:
    Dec 9, 2004
    Messages:
    7,782
    Likes Received:
    162
    Location:
    Seattle
    True. But does it even matter at this point? It has full DX10 compliance, and I don't think the market really digs much deeper than that in many cases. As a market approach it is hard to argue with the strategy of "great current-focused API/game performance, full feature support for the cutting-edge API/features", because I would be willing to bet that a substantial number of high-end GPU sales happen in the first 6 months after release. Thus any shortcomings 12 months out are too late to make a real difference. Take this approach, add a very strong midrange ($149-$299) offering year in and year out (contrast ATI's seemingly declining interest in that market) as well as strong dev-rel to encourage support of your brand, and it is hard to foresee how this strategy can fail. Especially since the market leader determines which features are relevant or not: if 75% of GPUs have the same strengths and weaknesses, a competitor pretty much needs to break the paradigm in terms of market attraction, or release products across the board that mop the floor with the competition.

    I gotta head out the door, but I wonder if, and how, ATI will be adjusting their market approach in terms of product penetration, because I think at this point tit-for-tat with NV isn't going to get it done -- even if they have a better DX10 part. I guess if Crysis or other cutting-edge, big-selling PC games that push GPU sales were substantially faster on ATI's top-end and midrange products they could take some market share back, but outside of that?
     
  4. overclocked_enthusiasm

    Regular

    Joined:
    Apr 26, 2004
    Messages:
    424
    Likes Received:
    3
    Location:
    United States
    Great post. That sums things up quite nicely and highlights the challenges involved with both feeding the market and being the "best".
     
  5. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    [Tedious reply deleted]

    Yes it is, isn't it?

    I honestly can't be bothered re-hashing ATI's useless marketing and release schedule. Nothing to be learnt there. It's up to AMD to execute now.

    Jawed
     
    Acert93 likes this.
  6. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    Nope, since we don't have more details on it. Common sense says including it wouldn't tip the scales either.
     
  7. Anon Lamer

    Newcomer

    Joined:
    May 13, 2003
    Messages:
    88
    Likes Received:
    1
    Location:
    The void between routers

    Thank you sir (in an Oliver Twist voice). You grasped the concept of "manufacturing advantage" :) Don't go long on Nvidia shares!

    BTW, w.r.t. the level505 stats on R600 - I wonder about this:
    "64 4-Way SIMD Unified Shaders, 128 Shader Operations/Cycle"
    Shouldn't it be 256 ops? Then the stated performance of "105,000,000,000 SOps (Shader Operations Per Second)" implies a clock speed of ~400 MHz for the shaders, which is not unreasonable for beta hardware.
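The arithmetic behind that back-of-the-envelope check can be sketched as follows. The 64-shader, 4-way-SIMD and 105 GSOps figures come straight from the quoted level505 leak; everything else is just division.

```python
# Implied R600 shader clock from the quoted level505 figures (leaked,
# unverified numbers -- this only checks the poster's arithmetic).
shaders = 64
simd_width = 4
ops_per_cycle = shaders * simd_width  # 256, not the 128 the leak states

stated_sops = 105_000_000_000  # "105,000,000,000 SOps" from the leak

# Clock speed in MHz implied by each ops-per-cycle figure.
clock_at_256 = stated_sops / ops_per_cycle / 1e6  # ~410 MHz
clock_at_128 = stated_sops / 128 / 1e6            # ~820 MHz

print(f"{clock_at_256:.0f} MHz at 256 ops/cycle")
print(f"{clock_at_128:.0f} MHz at 128 ops/cycle")
```

At 256 ops/cycle the leak implies roughly 410 MHz, consistent with the ~400 MHz "beta hardware" reading above; taking the stated 128 ops/cycle at face value would instead imply roughly 820 MHz.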
     
  8. turtle

    Regular

    Joined:
    Aug 20, 2005
    Messages:
    279
    Likes Received:
    8
    What I think Jawed is alluding to is the fact that it could very well suck in DX10 compared to ATi's competing products (The Inq seems to have taken this leap because of the extremely late Vista driver) while doing well in DX9... kind of like how the GeForce FX was decent enough in DX8 but struggled in DX9. While this doesn't matter now, it surely will matter if the mid-range cards coming out closer to DX10, and Nvidia's high-end refresh post-DX10, are built on the same architecture.
     
  9. TG01

    Newcomer

    Joined:
    Dec 18, 2006
    Messages:
    40
    Likes Received:
    1
    For the last half a year ATi has dominated precisely that market with the X1900XT 256 and the X1950 Pro.
     
  10. INKster

    Veteran

    Joined:
    Apr 30, 2006
    Messages:
    2,110
    Likes Received:
    30
    Location:
    Io, lava pit number 12
    It would be interesting to know for sure if X1950 Pro did in fact beat the 7600 GT, 7900 GT, 7950 GT, 7900 GTO and 7900 GS.
    Just saying so isn't much proof...
     
  11. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    That Steam survey pretty much negated that idea, though; it seems like the entire X1xxx line didn't do well at all.
     
  12. Skrying

    Skrying S K R Y I N G
    Veteran

    Joined:
    Jul 8, 2005
    Messages:
    4,815
    Likes Received:
    61
    The X1950 Pro has been on the market for a much shorter time than those cards.
     
  13. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    So does that mean Microsoft and ATI teamed up, that DX10 will be based on ATI hardware, and that Nvidia has to follow? Just like how the NV30's design did not match the MS DX9 code path, because Nvidia was thinking differently about how DX9 software would run on the NV30 architecture.
     
  14. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY

    1.17% for the X1950 (it seems like they bunched them all together) vs. 2.55% for the 7950. The 7950s were released after the X1950 series: the 7950 came in September and the X1950 Pro in November, yet the more expensive card is selling more.
     
  15. jamis

    Newcomer

    Joined:
    Oct 10, 2006
    Messages:
    127
    Likes Received:
    4
    LOL, you are already making excuses for it; for all we know it might turn out to be the better DX10 part. Anyway, as I understood it, the G80 was actually used as a reference design for DX10, so if it sucks at it you can hardly blame ATi. :lol:
     
    #255 jamis, Jan 7, 2007
    Last edited by a moderator: Jan 7, 2007
  16. SugarCoat

    Veteran

    Joined:
    Jul 17, 2005
    Messages:
    2,091
    Likes Received:
    52
    Location:
    State of Illusionism
    1. I really doubt it's going to suck, as some real work was put into it, and we have no reason to think otherwise.
    2. The driver isn't late. It's common sense: there isn't one benchmark or game out that can currently support DX10, so they're going to work on the driver as long as they have time to. What's the point of rushing it to release when nothing supports it? The OS isn't even out in retail shops yet.
    3. Despite the FX's downfall, and the recollection of that part that loves to come back to life every time a new architecture is on the verge of being released by either ATi or nVidia, we should remember that it's becoming more myth than anything. The biggest problem with the FX series was how damn late it was, which let the 9x00 series cards have basically free rein over the entire market. It wasn't the lackluster performance so much as the wickedly long delay that caused its failure. Even if it had been a good-performing DX9 part it still would have flopped, because it was over half a year late. The NV40 can be criticized in much the same way the FX series was, architecturally: a terrible drop in AF quality from previous-generation architectures, very bad performance loss with HQ IQ settings, its key feature over the R420 was in fact SM3.0's features and speed increase, which it totally failed at, it cannot run HDR without some serious settings reduction (normally resolution, among other things), nor can it do HDR+AA, which have basically become mandatory for a DX9.0c-compliant card in the consumer's mind. But it launched without a delay and sold well thanks to PR and very acceptable DX9 SM2.0 performance. Was that a failure? No.

    Launch timing and being first to market are extremely important, namely because you're the undisputed champion and the only choice, so of course a large portion of people are going to buy your products. X1800XT, superior to the 7800GTX? Absolutely. Highest marketshare? 7800GTX sales crushed it. This is where ATI has really started to drop the ball for some reason, with the R520 and now the R600, and personally it's pissing me off, because I'm in the market for a new card and don't enjoy waiting 6 additional months for a 15% performance gain. Nor do I enjoy the thought of buying a badly delayed product only to watch its price plummet by hundreds after a refresh that trumps its performance is blitzed out by either ATi or nVidia a short time later. The last serious problem is volume. Even if it's 10-15% faster, the cards are going to be at their introductory prices, while the competition, who has been manufacturing their parts for the last 6 months, is going to have parts at lower and inevitably far more acceptable prices. So I've waited 6 months for a graphics card that cost me $150 more than the competition's part and gave me 10% more performance, and then I get to watch its lead at the top of the pack fall to a new card announced two weeks later. What's that taste in my mouth? Oh, I know: it's bitterness toward ATi. If they keep this up they may as well just leave the high-end market completely and concentrate on mainstream, because inevitably they're just going to keep tarnishing their good name on the sales that actually do matter: integrated graphics and mainstream parts in the $100-$300 range.

    In short, I guess what I'm saying is that the only part that could be an FX nightmare all over again is always going to be the part that's last to the table. The G80 is not it.

    I know the R600, like the R520 before it, is probably suffering physical problems, either with yields at desired speeds or with the chip itself, so it doesn't seem right for me to blame ATi for 'making me wait', but you know what: if you can't fight in the ring, then you sit the hell out.
     
    #256 SugarCoat, Jan 7, 2007
    Last edited by a moderator: Jan 7, 2007
  17. icecold1983

    Banned

    Joined:
    Aug 4, 2006
    Messages:
    649
    Likes Received:
    4
    With all the talk in this thread, and Dave Orton saying the R600 will basically crush the competition, I'd be very disappointed if it was only 15% faster.
     
  18. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    And maybe in order to live up to that word "crush" it needs some time and effort to accomplish this miracle, and that's why it is delayed.
     
  19. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
    We all know Nvidia spends a lot more on marketing than ATI, and they're quite good at it. Most people trust the Nvidia name over ATI's, and that becomes obvious when you visit the majority of websites outside of this one. As a result those cards have sold quite a bit better. However, technologically the X1950 Pro and XT are better than all of those cards in features, speed and image quality, for around the same cost.

    Nvidia's bottom line is profit, which they have nailed down to a science.
     
    Acert93 likes this.
  20. Skrying

    Skrying S K R Y I N G
    Veteran

    Joined:
    Jul 8, 2005
    Messages:
    4,815
    Likes Received:
    61
    It really does make me wonder sometimes when a 7900GS outsells the X1950 Pro. The X1950 Pro is faster in nearly every case, has more features and better image quality, and the two come in within about $10 of each other. Add the fact that the 7900GS really does not run cooler or use less power than the X1950 Pro, and I generally just go with "Nvidia really does have kick-ass marketing".
     