AMD: Volcanic Islands R1100/1200 (8***/9*** series) Speculation/ Rumour Thread

Discussion in 'Architecture and Products' started by Nemo, May 7, 2013.

Tags:
  1. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,604
    Likes Received:
    648
    Location:
    New York
    I tend to trust techpowerup for power consumption numbers. They're the only site I know of that measures card power directly. All other sites measure total system consumption.
     
  2. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
It's what Nvidia recommended to reviewers, and Anandtech just blindly eats it up. At least there are reviewers like Kyle [H] who have the guts to go against such tactics.
     
  3. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,515
    Likes Received:
    934
    Hardware.fr only measures the card's power.

    That said, system power consumption is not irrelevant. Sure, it sort of obfuscates the exact differences between the cards themselves, but (i) this is what you actually pay for, and (ii) if a certain card has an impact on CPU power consumption because of higher/lower driver load, that's good to know.
     
  4. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    Did they throw out Titan's compute numbers or are they still using them?

[IMG]

    You can't get those Titan numbers without flicking a switch in the drivers, which certainly isn't shipping spec. If it's the act of physically changing the switch that is the issue they can always just up the fan speed in CCC instead.
     
    #1984 jimbo75, Nov 8, 2013
    Last edited by a moderator: Nov 8, 2013
  5. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    Please, this has no bearing on power/noise targets. No one would buy a $1000 card to run DP GPGPU tasks and then waste it by not flipping a fucking checkbox, that's insane.

Do you want benches run with vsync on because that's the default, too? Even if a checkbox fixes that.
     
  6. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,529
    Likes Received:
    108
    How is that relevant to the topic at hand in any way? Are you fundamentally incapable of understanding that the tolerance for {A, N} Defence Forces is exhausted and that this bullshit will be stopped one way or the other? We do not care about your holy crusade for the honour of ATI, stop littering tech threads with this junk. If you want to help Wavey, go send a CV to AMD. Note that this holds for all other thinly veiled crusaders (of which there are a few, for both sides).
     
  7. jimbo75

    Veteran

    Joined:
    Jan 17, 2010
    Messages:
    1,211
    Likes Received:
    0
    http://anandtech.com/show/7481/the-amd-radeon-r9-290-review/17

    Based on this AT has decided that the 290X will be evaluated at 40% fan speed because that's what "out of the box" is.

    "out of the box" for Titan is single precision mode with boost and higher clocks. In order to get the DP numbers you need to flick a switch in the drivers, disabling boost and lowering clocks.

What's the fucking difference between flicking a switch and flicking a switch? What you call "junk" I say shows the clear difference in AT's train of thought when it comes to each company - for the past 8 months Anandtech has been showing DP results for Titan - which is not "out of the box" performance - yet now it's a convenient excuse to run the 290X's numbers at 40% because they can't move a fan speed slider in CCC?

Also, what happened to "having more information is good"? Is it really so hard to show both sets of the 290X's numbers? If anything a lot of people would be very interested in that - just to see how their favourite games run in "quiet mode". Edit - of course that's all we'll get now, no uber mode numbers.

    There is no excuse for this, none whatsoever. Not one other tech site is doing this, just Anandtech.
     
    #1987 jimbo75, Nov 8, 2013
    Last edited by a moderator: Nov 8, 2013
  8. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,529
    Likes Received:
    108
    Perhaps now we will be able to discuss things in a more interesting way, with less noise. I will take this opportunity to note that there are two other names on a list of candidates for early-outs, one wearing green underwear and one wearing red - I am sure they can figure themselves out, and that they will improve the SNR.
     
  9. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,864
    Likes Received:
    364
    Location:
    35.1415,-90.056
    Ok, so have I simply overlooked FCAT results with the 290 series in Crossfire, now that they're not using the CF bridge?

    Edit: Nevermind, I found it. Not a comprehensive review, but it looks like most DX11 titles are doing better. Skyrim is b0rked still though...
     
    #1989 Albuquerque, Nov 8, 2013
    Last edited by a moderator: Nov 8, 2013
  10. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,604
    Likes Received:
    648
    Location:
    New York
    Yeah hardware.fr does it too. Forgot about them.

System power consumption is indicative of the reviewer's system. It's not going to be very representative of whatever (PSU, CPU, RAM, HDD) you're running.

    It shows the impact of the GPU on the rest of the system but the implications are blurry. Is higher system consumption good because you're less GPU limited? Higher consumption is supposed to be bad :)
     
  11. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,864
    Likes Received:
    364
    Location:
    35.1415,-90.056
Certainly, a higher performing GPU may very well increase system power consumption because it allows the CPU (and other system components) to pull more of their own weight. An i7-4770K has a lot of idle time if it's feeding a GeForce 630 at 2560x1440 at high details, not so much if it's feeding a Titan.
     
  12. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,515
    Likes Received:
    934
    I would just look at performance and power consumption, without worrying about being GPU-limited or not.

    There may be an argument, however, that testing CPU load as a function of GPU model is useful to determine how big a CPU you need to purchase to properly feed your graphics card. But I'm not sure there would be much impact there (at a given performance level, that is).

    In other words, a 780 Ti will obviously require a bigger CPU than a GTX 640 to avoid CPU bottlenecks, but while picking a 780 Ti or a 290X may have some impact on CPU power draw, I don't think it's going to make you hit CPU bottlenecks on a decent quad core. This is just a guess, though.
     
  13. gkar1

    Regular

    Joined:
    Jul 20, 2002
    Messages:
    614
    Likes Received:
    7
Anandtech's numbers show the 290X in uber mode to be faster than the 780 Ti.

[IMG]

    Grabbed copies and will mirror them just in case
     
  14. Esrever

    Regular Newcomer

    Joined:
    Feb 6, 2013
    Messages:
    768
    Likes Received:
    532
    is this why they didn't publish them?
     
  15. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
    By a stunning geomean of 1%... (aside: across a very strange data-set of non-equally weighted games due to varying numbers of configurations, fairly useless fraps windowed minimum frame rates in the same data set, etc).

    There's honestly not much useful that comes out of aggregating the results for cards that are basically the same performance. Just choose based on what games you play. Or choose based on secondary factors (cost, noise, etc). Or if you don't care about any of that, just flip a coin!

    You're welcome, internet. Now can we move on? :p

Regarding boost/powertune/turbo, while it is definitely a can of worms, it's ultimately unavoidable. As these cards become increasingly power-limited, you enter the space where you can't turn the whole chip on at once. If you design your chip to run at the same clocks in FurMark as in a game, you're going to be leaving a lot of useful performance on the floor.

    This is no different than the situation on CPUs for the last couple years, particularly on ultra-mobile (15W, etc. and down). Game developers are going to have to start dealing with this on GPUs too and targeting power-efficient algorithms over filling all idle processing resources to get optimal results. And yeah, moving to Alaska and gaming outside might be required for "maximum" performance :p
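The "can't turn the whole chip on at once" point can be sketched as a toy model. This is my own simplification, not any vendor's actual boost algorithm, and the DVFS table, power budget, and capacitance constant are all made-up numbers: dynamic power scales roughly as activity × V² × f, so a controller that picks the highest clock fitting the power budget at the *current* activity level beats a fixed clock chosen for worst-case (FurMark-like) activity.

```python
# Toy model of power-limited boost (hypothetical numbers throughout).
import math

POWER_BUDGET_W = 250.0

# (clock_mhz, voltage) pairs for a hypothetical DVFS table; voltage rises with clock
DVFS = [(700, 0.90), (800, 0.95), (900, 1.00), (1000, 1.10)]

def est_power(activity, clock_mhz, volt, k=2.2e-7):
    # P ~ alpha * C * V^2 * f; 'activity' in [0,1] is the fraction of the
    # chip's switching capacitance toggling, k is an assumed lumped constant
    return activity * k * volt**2 * clock_mhz * 1e6

def boost_clock(activity):
    # Pick the highest DVFS state whose estimated power fits the budget
    best = DVFS[0][0]
    for clock, volt in DVFS:
        if est_power(activity, clock, volt) <= POWER_BUDGET_W:
            best = clock
    return best
```

Under these made-up constants, worst-case activity caps the chip at 900 MHz, while a typical game workload with lower activity can boost to 1000 MHz: that gap is the performance a fixed worst-case clock would leave on the floor.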
     
  16. iMacmatician

    Regular

    Joined:
    Jul 24, 2010
    Messages:
    787
    Likes Received:
    215
    I copied down the data in the above spreadsheet into my own spreadsheet and weighted the benchmarks so that these three conditions were satisfied:
    1. All benchmarks in a single game have a total weight of 1
    2. All benchmarks of the same resolution in a single game total to the same weight (in this case 0.5 since the only resolutions are 3840x2160 and 2560x1440)
    3. All benchmarks of the same resolution in a single game have the same weight value
    Then the weighted arithmetic means of the ratios are actually the opposite of the unweighted means: AMD/NVIDIA = 0.992, NVIDIA/AMD = 1.020; as well as the weighted geometric mean: AMD/NVIDIA = 0.986, NVIDIA/AMD = 1.014.

    And for what it's worth, comparing just the 3840x2160 benchmarks gives weighted arithmetic means of AMD/NVIDIA = 1.034 and NVIDIA/AMD = 0.977, and comparing just the 2560x1440 benchmarks gives weighted arithmetic means of AMD/NVIDIA = 0.951 and NVIDIA/AMD = 1.064 (the weighted geometric means are similar).

    (I can give a picture of my spreadsheet upon request.)
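The three weighting conditions above can be sketched in a few lines. The data here is made up for illustration (not the spreadsheet's actual numbers): each game gets total weight 1, split 0.5 per resolution, and split equally across the runs within a resolution.

```python
# Weighted arithmetic and geometric means per the scheme described above.
# 'results' holds hypothetical AMD/NVIDIA performance ratios, not real data.
import math

results = {
    "GameA": {"3840x2160": [1.03, 1.05], "2560x1440": [0.95]},
    "GameB": {"3840x2160": [0.98], "2560x1440": [0.96, 0.94, 0.97]},
}

def weighted_means(results):
    weights, ratios = [], []
    for game, by_res in results.items():
        for res, runs in by_res.items():
            # conditions 2 & 3: each resolution gets 0.5 per game,
            # shared equally among that resolution's benchmarks
            w = 0.5 / len(runs)
            for r in runs:
                weights.append(w)
                ratios.append(r)
    total = sum(weights)  # condition 1: sums to 1 per game
    arith = sum(w * r for w, r in zip(weights, ratios)) / total
    geo = math.exp(sum(w * math.log(r) for w, r in zip(weights, ratios)) / total)
    return arith, geo
```

Note the weighted geometric mean always comes out at or below the weighted arithmetic mean (AM-GM), which is one reason the two aggregates can land on opposite sides of 1.0 for near-parity cards.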
     
    #1996 iMacmatician, Nov 9, 2013
    Last edited by a moderator: Nov 9, 2013
  17. Andrew Lauritzen

    Moderator Veteran

    Joined:
    May 21, 2004
    Messages:
    2,526
    Likes Received:
    454
    Location:
    British Columbia, Canada
Cool, so basically still a wash. Good demonstration of how, when cards are neck and neck, these aggregate values are more a function of the choice of benchmarks than anything else. I imagine with such a small sample of games you could include or exclude a single game to swing the win to AMD or NVIDIA.
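That include/exclude sensitivity is easy to show with made-up per-game ratios (none of these are real benchmark numbers): with one modest outlier, dropping a single game flips which side of 1.0 the geomean lands on.

```python
# Geomean of hypothetical per-game performance ratios (card A / card B).
import math

def geomean(xs):
    # exp of the mean log is numerically safer than multiplying ratios
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

ratios = {"GameA": 1.04, "GameB": 0.97, "GameC": 1.02,
          "GameD": 0.90, "GameE": 1.05}

full = geomean(list(ratios.values()))                              # ~0.994: B "wins"
without_d = geomean([r for g, r in ratios.items() if g != "GameD"])  # ~1.020: A "wins"
```

With five games, one ~10% outlier is enough to decide the aggregate winner, which is why a 1% geomean gap across a small, unevenly weighted benchmark set says very little.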
     
  18. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    IOW, the win isn't statistically significant.
     
  19. sheepdogexpress

    Newcomer

    Joined:
    Mar 10, 2012
    Messages:
    86
    Likes Received:
    11
I would personally rather have the 6.4% win at 2560x1440 than the 3.4% win at 3840x2160.

2560x1440 makes up 1 percent of the monitors in the Steam hardware survey (and that doesn't include iMacs). I expect 4K to be maybe a thousandth of that, as it's 10 times the cost.

I don't think too many people are using Seiki 4K TVs as gaming monitors, due to their 30 Hz limitation plus general usability problems for daily computing.

Giving 4K equal weight to 2560x1440 is ridiculous due to its rarity and its impracticality on cost grounds. Heck, even 1920x1080 is more relevant once we turn up the settings and raise AA high enough.

Anandtech is doing AMD a huge favor by testing 4K and leaving out 1080p. The vast majority of people gaming on these cards will be using 1080p, considering it's the most common resolution.
     
    #1999 sheepdogexpress, Nov 9, 2013
    Last edited by a moderator: Nov 9, 2013
  20. kalelovil

    Regular

    Joined:
    Sep 8, 2011
    Messages:
    558
    Likes Received:
    95
    1. 4K is probably a decent analogue for multi-monitor gaming resolutions in terms of performance.

    2. The price of UHD monitors is decreasing.

3. There is likely a strong correlation between those who buy >=US$550 graphics cards and those who have multi-monitor or UHD gaming display setups.
     