AMD: R7xx Speculation

Discussion in 'Architecture and Products' started by Unknown Soldier, May 18, 2007.

Thread Status:
Not open for further replies.
  1. Sound_Card

    Regular

    Joined:
    Nov 24, 2006
    Messages:
    936
    Likes Received:
    4
    Location:
    San Antonio, TX
    I know you say ATi likes to increase the ALU:TEX ratio, but I really don't see a problem with a 3:1 ratio when the ALUs are being increased by 50%! What is funny, however, is that your drawing ties almost exactly to what chiphell and others have been reporting. I strongly believed in 96 ALUs and 24 TMUs, but I may just be convinced of 96 and 32.
     
    #941 Sound_Card, Mar 28, 2008
    Last edited by a moderator: Mar 28, 2008
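As an aside, the ratios being debated are quick to sanity-check: RV670 pairs 64 VLIW ALUs with 16 TMUs (4:1), so 96 ALUs would be the 50% increase mentioned, and 96:32 gives the 3:1 ratio. A minimal sketch — the RV770 configurations are the rumored figures from this thread, not confirmed specs:

```python
# Sanity-checking the ALU:TEX figures debated in the thread.
# 64 is RV670's VLIW ALU count (64 five-wide units = 320 SPs);
# the 96/24 and 96/32 RV770 configs are rumors, not confirmed specs.

def alu_tex_ratio(alus, tmus):
    return alus / tmus

rv670   = alu_tex_ratio(64, 16)   # 4.0 -> the current 4:1 ratio
rumor_a = alu_tex_ratio(96, 24)   # 4.0 -> keeps the ratio at 4:1
rumor_b = alu_tex_ratio(96, 32)   # 3.0 -> the 3:1 ratio discussed

alu_increase = (96 - 64) / 64     # 0.5 -> the "50%" ALU increase
print(rv670, rumor_a, rumor_b, alu_increase)
```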
  2. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    The prices in that chart don't lead one to believe it'll be true competition to Nvidia in the enthusiast class again.

    Although if it does end up being serious comp. (single chip vs. single chip) then I certainly like the price strategy.

    However, a $299 MSRP would suggest AMD doesn't feel a single GPU can compete with a single GPU from Nvidia. Unless they also feel Nvidia will have to use a dual-GPU card to compete for the enthusiast class.

    Assuming, of course, the info in that chart is somewhat accurate.

    Regards,
    SB
     
  3. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    As far as the table can be trusted, it seems that nothing has happened on the energy efficiency front. Not surprising, since the process is the same as the one currently used, but nVidia currently has a significant performance/power advantage over AMD in their 9600 vs 3870 products (under load), and it seems there have been no architectural enhancements to help the situation.

    I wonder when GPUs based on TSMC's 40nm process will be available, and in what segments. The energy savings made possible by this generation of lithography seem to have been spent on implementing DX10/GP features. Maybe the next generation will bring lower power draw along with some traditional performance increases.
     
  4. flopper

    Newcomer

    Joined:
    Nov 10, 2006
    Messages:
    150
    Likes Received:
    6
    so 2 months and I be owning a card that runs my 8800GT into the past.
    ;)
     
  5. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,426
    Likes Received:
    10,320
    I'd much prefer idle power efficiency over peak power efficiency. But then again, I spend the majority of my time on the computer not running games. I'm probably only gaming about 2-4 hours a day, generally. So lower idle power use means significantly higher energy savings.

    And on my HTPC, which is almost always on, idle power efficiency is quite significantly more important than peak load efficiency.

    Hopefully Nvidia will work on this more.

    In other words, while it'd be nice if ATI could increase their power efficiency under load, it's an absolute non-issue for me, as idle power efficiency is much, MUCH more important to me.

    It's the one thing that convinced me to get a 3870 about a month ago to replace the HD 2900 XT I had, even though the performance increase was negligible in most cases.

    Likewise, it's the primary reason I didn't get an 8800 GT or consider the 9600 series, even though I was tempted.

    I'd originally been planning on trying to tolerate the power use of the HD 2900 XT until the R(V)7xx series was out or the next-gen Nvidia cards were out.

    Regards,
    SB
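SB's idle-vs-load argument is easy to quantify with rough numbers. A minimal sketch, where all wattages are invented for illustration; only the roughly 3 gaming hours per day comes from the post:

```python
# Rough illustration of why idle draw can dominate a mostly-idle desktop.
# All wattages below are made-up assumptions, not measured figures.
GAMING_HOURS = 3              # "2-4 hours a day" from the post
IDLE_HOURS = 24 - GAMING_HOURS

def daily_wh(idle_w, load_w):
    """Daily GPU energy (watt-hours) for the usage pattern above."""
    return idle_w * IDLE_HOURS + load_w * GAMING_HOURS

poor_idle = daily_wh(idle_w=60, load_w=110)   # 1590 Wh/day
good_idle = daily_wh(idle_w=20, load_w=130)   # 810 Wh/day
print(poor_idle, good_idle)
```

Even with a 20W higher load draw, the hypothetical card with aggressive idle clocks uses roughly half the daily energy — which is SB's point.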
     
  6. The_Wolf_Who_Cried_Boy

    Newcomer

    Joined:
    Feb 18, 2005
    Messages:
    172
    Likes Received:
    9
    Location:
    Floating face down in the stagnant pond of life.
    So it's now in production is it?

    Skimming through this guy's posting history suggests he does know what's going on.
    Anyone here have something they'd like to share?
     
  7. Berek

    Regular

    Joined:
    Oct 17, 2004
    Messages:
    274
    Likes Received:
    4
    Location:
    Austin, TX
    What's the average/typical time frame from when a chip is taped out to seeing products (high-end) available to the market?
     
  8. LordEC911

    Regular

    Joined:
    Nov 25, 2007
    Messages:
    877
    Likes Received:
    208
    Location:
    'Zona
    Around 6-8 months if things go smoothly.
     
  9. Sound_Card

    Regular

    Joined:
    Nov 24, 2006
    Messages:
    936
    Likes Received:
    4
    Location:
    San Antonio, TX
    I'm not too sure, but I thought I read that RV770 taped out in January. So count forward from there.
     
  10. Berek

    Regular

    Joined:
    Oct 17, 2004
    Messages:
    274
    Likes Received:
    4
    Location:
    Austin, TX
    A June release timeframe is quite possible, then...
     
  11. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    Isn't it more like 3-4 months in case no respins are needed?
     
  12. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
    So what would the A12 revision of the RV670 mean for the R7xx? Does AMD intend to downgrade the RV670 into a 46xx series product, whilst the new chip becomes the 48xx series? Would it make sense to make the RV670 a little faster, to give it longer legs as a more value-orientated part?
     
  13. CarstenS

    Legend Subscriber

    Joined:
    May 31, 2002
    Messages:
    5,800
    Likes Received:
    3,920
    Location:
    Germany
    Maybe AMD was able to cut out a few more transistors? Even small reductions could help lower pricing, if a disadvantageous die size leaves almost-complete dies at the edges of the wafer.

    I sure hope there will be some longer life for HD3850-like products, so they can find their way into many back-to-school OEM PCs, so that in the long run game devs do not target 40-ALU cripples as their primary market for games.
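The die-size point can be illustrated with the standard first-order gross-dies-per-wafer approximation. A sketch, assuming a 300mm wafer and RV670's roughly 192mm² die; the 180mm² "trimmed" figure is purely hypothetical:

```python
import math

def gross_dies(die_area_mm2, wafer_mm=300):
    """First-order dies-per-wafer estimate: wafer area over die area,
    minus an edge-loss term. Ignores scribe lines, edge exclusion
    and defect yield."""
    d = wafer_mm
    return int(math.pi * d**2 / (4 * die_area_mm2)
               - math.pi * d / math.sqrt(2 * die_area_mm2))

print(gross_dies(192))   # RV670-sized die, ~192 mm^2
print(gross_dies(180))   # hypothetical slightly trimmed die
```

Shaving even a little area buys a couple dozen extra candidate dies per wafer, which is where the pricing headroom would come from.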
     
  14. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    I agree that AMD/ATI deserves credit for having done a good job on keeping idle power draw down. While both players could probably improve, nVidia has more to do by all accounts. HTPCs are a small part of the market, but laptops are the lion's share of the computer market, so bringing down idle power draw should be a top priority, all environmental and cost concerns aside.

    That said, in-use power draw is obviously important as well, since the total system needs to be designed to accommodate maximum power draw. Laptops have obvious problems here, but desktops are difficult as well, since the gfx-card manufacturers have no knowledge of what kind of system the card will be in, nor the environment the computer needs to operate in. This adds cost, complexity and, realistically, noise for all fan-based solutions. (The cooling system also adds a failure mode, increasing RMA costs.)
    All these drawbacks are passed on to the consumer.

    When even the midrange cards can draw over 100W in use, I know for a fact that quite a few of us who have both funds and an interest in 3D performance bow out of the race on power draw concerns alone. The DX10 generation paid a hefty price in power draw for its feature set. Adding insult to injury, that feature set is mostly useless unless you run Vista, which only a small minority of consumers do. I hope that power issues receive a higher priority for the next generation of cards, and that the benefits of moving to 40nm lithography are spent there rather than on, for instance, IEEE-compliant math, which is pretty much completely useless to the masses actually paying for and using the products.

    Tangible benefits, please.
     
  15. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    AMD did already show some unannounced videocards to certain people last week in Asia... products which will be released within a few months.
     
  16. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    When RV770 launches AMD will still need cheaper GPUs. The remaining RV7xx GPUs will prolly launch later (and if history is an indication, much later) so RV670 has a fairly long life ahead of it I expect...

    Jawed
     
  17. w0mbat

    Newcomer

    Joined:
    Nov 18, 2006
    Messages:
    234
    Likes Received:
    5
    My guess:

    Q2 2008:
    enthusiast: RV770 (XT & Pro)
    performance: RV670 Rev12
    mainstream: RV635
    value: RV620

    Q4 2008:
    enthusiast: RV770x2
    performance: RV770 (XT & Pro)
    mainstream: RV740
    mainstream: RV730 (?)
    value: RV710
     
    #957 w0mbat, Mar 30, 2008
    Last edited by a moderator: Mar 30, 2008
  18. flopper

    Newcomer

    Joined:
    Nov 10, 2006
    Messages:
    150
    Likes Received:
    6
    seems plausible.
     
  19. v_rr

    Newcomer

    Joined:
    Apr 30, 2007
    Messages:
    147
    Likes Received:
    0
    Seems good, but I think the HD 3870 X2 will still beat the RV770, because it's two GPUs versus one GPU.
     
  20. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    Based on rumored specs of RV770, it probably won't.
     