AMD 300 Series reviews ...

Discussion in 'Architecture and Products' started by pharma, Jun 18, 2015.

  1. Kaarlisk

    Regular Newcomer Subscriber

    Joined:
    Mar 22, 2010
    Messages:
    293
    Likes Received:
    49
    Yes, there is value. According to AMD, even with the same cooler, performance will be higher because the 3xx cards will be better at turboing up. Also, the new drivers are better, which means that, compared to Nvidia's lineup, AMD has become more competitive and can demand higher prices.
    The owners of 2xx cards can consider themselves lucky, as they're getting a free performance boost from drivers, instead of having to buy a new card earlier.
     
  2. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,117
    Likes Received:
    2,860
    Location:
    Well within 3d
    If I recall correctly, there was a possible bottleneck past modest factors when the amount of domain shader wavefronts becomes large relative to the wavefronts needed at the vertex or hull stages. Jawed posited that DS wavefronts were being locked to the same CUs as the HS wavefronts that were feeding them input, and the Xbox One SDK mentions this scenario as well. After that point, even though the workload would strongly benefit from more DS wavefronts, the occupancy of the CUs puts a ceiling on the domain shader portion.
    The indicated alternative was an off-chip mode that streamed to memory, which could break the restriction that domain shaders had to run on the increasingly cramped CUs the originating hull and vertex shaders ran on, at the cost of bandwidth and latency.

    The lower factors in the graph may be reflective of the scenario where GCN's preferred on-chip method could still be used, or is most likely effective.
    I guess I'm not entirely sure that I would say that the driver changes just flipped the switch between on-chip and off, since there are gaps in performance at the lower factors that could be where bandwidth is affecting performance.
    If off-chip mode is on, then 16x might be the point at which the SDK said devs should benchmark that mode, since it might not help.

    It doesn't seem to be a purely memory or CU-based problem, since Tonga without the beta drivers degrades somewhat like Hawaii.
    That hump in the later part of the Hawaii curve might be where the GPU gets out of the range where the memory subsystem's latency or thrashing is a factor, either due to the amount of data or the GPU/driver giving up on its optimization efforts. Then its raw bandwidth and CU-count advantage over Tonga takes over, albeit at a markedly lower level.

    Perhaps Tonga at 5.15 has changed its algorithm. I'm drawn to the idea that an architectural change in Carrizo for preemption might mean that a similar feature could be in newer IP like Tonga. Rather than fiddle with buffering at high factors, maybe Tonga has synched the stages better, or can pause the stages feeding DS launch before they can spill or thrash.
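
    As a back-of-the-envelope illustration of the imbalance (my own simplification, not GCN's actual scheduling; the quadratic triangle count for uniform tessellation and the perfect wavefront packing are both idealized assumptions):

    ```python
    # Rough sketch: domain-shader work per hull-shader wavefront vs. tessellation
    # factor, assuming one triangle patch per HS invocation and uniform
    # tessellation of the triangle domain. This is an illustrative approximation,
    # not how AMD's hardware or driver actually schedules wavefronts.

    WAVEFRONT_SIZE = 64  # GCN wavefront width

    def ds_wavefronts_per_hs_wavefront(factor, patches_per_wavefront=WAVEFRONT_SIZE):
        """Approximate DS wavefronts spawned by one HS wavefront of triangle patches."""
        # Uniform tessellation of a triangle at integer factor N yields
        # (N+1)(N+2)/2 domain points per patch.
        verts_per_patch = (factor + 1) * (factor + 2) // 2
        total_ds_invocations = patches_per_wavefront * verts_per_patch
        return -(-total_ds_invocations // WAVEFRONT_SIZE)  # ceiling division

    for f in (1, 4, 16, 64):
        print(f, ds_wavefronts_per_hs_wavefront(f))
    # factor 1  ->     3 DS wavefronts per HS wavefront
    # factor 16 ->   153
    # factor 64 ->  2145
    ```

    Even at 16x, one HS wavefront feeds on the order of 150 DS wavefronts under these assumptions, far past what a single CU can hold in flight, which is why pinning DS launches to the originating CU would choke so quickly.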
     
    Alexko likes this.
  3. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,780
    Likes Received:
    4,431
    According to AMD, yes. It's just not thanks to AMD.
    My MSI R9 290X Gaming already has a 1050MHz core clock. It's a card from early 2014.
    As for being better at turboing up, we'll see what that difference is when the 5.15 drivers arrive for the 200 series.


    So now we should all feel so thankful towards AMD for launching updated non-beta drivers once every 6 months?

    Let's ask a different question:
    - How many iterative optimizations have the customers been denied over the last year so that AMD could show a substantial performance boost for their rebranded cards through the use of drivers alone?

    Funny that we've had a severe drought of driver updates for the last year or so. It seems to me that the chips are falling into place now.
     
  4. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,451
    Likes Received:
    570
    Location:
    WI, USA
    I think saving some driver improvements for launches has been a strategy forever with AMD and NV. NV likes to save them to rain on AMD's launch parade.
     
  5. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,780
    Likes Received:
    4,431
    True, but nVidia doesn't withhold non-beta drivers for half a year.
    AMD's latest WHQL driver is now 7.5 months old and counting.

    Regardless, Kaarlisk's suggestion that AMD graphics customers should consider themselves lucky because they're getting a driver update was rather demeaning from a customer's point of view.
    Like it or not, customers are entitled to driver updates. Fail at that and customers will just start buying from another IHV, simple as that.
     
    BRiT likes this.
  6. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,451
    Likes Received:
    570
    Location:
    WI, USA
    Indeed, I'm thinking many customers have been buying from another IHV for a while now, judging by the AMD market-share figures I've seen. It's been interesting to watch AMD switch from the occasionally disastrous troubles of the monthly Catalyst releases to now barely releasing non-beta drivers at all.

    Frankly I don't understand why they don't push a so-called beta through WHQL occasionally for the hell of it to shut people up. It's not like these betas are so bad they cause bluescreens every 30 mins and format your hard drive. They are quite solid.

    On the other hand maybe you've noticed that being Beta is cool and edgy now. They are probably playing off that a bit.
     
    #66 swaaye, Jun 21, 2015
    Last edited: Jun 21, 2015
  7. Putas

    Regular Newcomer

    Joined:
    Nov 7, 2004
    Messages:
    387
    Likes Received:
    55
    That would be silly; for most common folk, "beta" means something unfinished and not very reliable. If they want to distribute final drivers without certification, that's fine, but don't call them betas; that would be suicidal. Their own bloatware ignores beta releases by default.
     
  8. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    GPU-Z 0.8.3 recognizes neither Grenada with its proper data nor Fiji:

    [GPU-Z screenshots: Grenada and Fiji readings]

    Does TPU want to be the next one punished over negative content?
     
  9. Ryan Smith

    Regular

    Joined:
    Mar 26, 2010
    Messages:
    609
    Likes Received:
    1,036
    Location:
    PCIe x16_1
    GPU-Z is reading Hawaii just fine.

    As for Fiji, GPU-Z has always been so-so at reading unreleased GPUs. W1zzard knows what to look for in order to ID Fiji, but since it's an unreleased part, clearly his interpretation of the ROP registers is a bit off. Which is the risk inherent in something like GPU-Z; no doubt he'll put out a new version to fix it later this week.
     
    digitalwanderer likes this.
  10. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    Not really.

    At first glance, I see only three (3) errors.

    It is Grenada and not Hawaii.
    The release date is June 18, 2015, and not that date in 2013.
    DirectX version is DX12, not DX11.2 (see how in the Fiji shot you have DX12).

    Yes, Fiji is not released yet, but when (if) his review appears, how could he post shots with the wrong data again?

    I say "negative content" because he is obviously negative toward the product launch and tries to emphasize that it is "just a rebrand".
     
  11. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,414
    Likes Received:
    411
    Location:
    New York
    Besides the memory increase there's no real difference. Perf/clk and perf/w appear to be identical to Hawaii. Unless I missed something.
     
  12. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,451
    Likes Received:
    570
    Location:
    WI, USA

    GPU-Z works by reading some of the card's registers, as Ryan mentioned. In other words, the card is being recognized as Hawaii because the 390X identifies itself similarly to a 290X.

    But yeah, I imagine the program will be updated. There are other ASICs with various brandings that it recognizes by their unique codenames.
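
    A toy sketch of how that kind of table-driven identification can misreport a rebrand (the device IDs and revision values below are made up for illustration; real PCI IDs and GPU-Z's actual logic differ):

    ```python
    # Illustrative lookup-table GPU identification. A rebrand that reuses the
    # same silicon keeps the old PCI device ID and only bumps the revision,
    # so a tool with a stale table falls back to the original codename.
    # IDs here are hypothetical, not the real Hawaii/Grenada PCI IDs.

    GPU_TABLE = {
        # (device_id, revision) -> marketing name
        (0x67B0, 0x00): "Radeon R9 290X (Hawaii)",
        (0x67B0, 0x80): "Radeon R9 390X (Grenada)",  # same silicon, new revision
    }

    def identify(device_id, revision):
        """Return a marketing name for a (device_id, revision) pair."""
        exact = GPU_TABLE.get((device_id, revision))
        if exact:
            return exact
        # Fall back to matching on device ID alone when the exact revision is
        # unknown -- this is how a rebrand can show up under its old name
        # until the tool's table is updated.
        for (dev, _rev), name in GPU_TABLE.items():
            if dev == device_id:
                return name + " (best guess)"
        return "Unknown GPU"

    print(identify(0x67B0, 0x80))  # exact hit: reported as the 390X
    print(identify(0x67B0, 0x42))  # unknown revision: falls back to a guess
    ```

    The point of the sketch is only that the fallback path, not the register read itself, is where a rebranded or unreleased part gets mislabeled.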



    I agree. But it is clearly some kind of strategy to specifically call them "beta"... AMD previously called these "hotfix" drivers.
     
    #72 swaaye, Jun 21, 2015
    Last edited: Jun 21, 2015
    digitalwanderer likes this.
  13. ToTTenTranz

    Legend Veteran Subscriber

    Joined:
    Jul 7, 2008
    Messages:
    9,780
    Likes Received:
    4,431
    Should we get the Fury reviews in here too?

    Here are some first results:

    Single: http://www.digitalstorm.com/unlocked/amd-fury-x-performance-benchmarks-idnum360/
    Crossfire: http://www.digitalstorm.com/unlocked/amd-fury-x-crossfire-gaming-benchmarks-vs-sli-titan-x-idnum361/

    Some first thoughts:
    - Single-card performance at 4K generally sits between the 980 Ti and Titan X.
    - Power consumption is about 40W above a Titan X or 980 Ti.
    - Load temperatures are really low, below 60°C. Overclockers shouldn't hit a thermal wall.
     
  14. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    703
    Location:
    Guess...
    Looks okay. A very limited range of benchmarks but nice to see it beating the 980Ti at 4K at least - although not so great to see it losing to the Titan X.

    Personally I'm more concerned with 1080p (my monitor) and 1440p (similar to the Rift) performance, since while I'd never buy one of these ultra-high-end cards, I will be buying a next-gen card with equivalent or greater performance, and I still won't have moved to 4K at that point. I guess for now, though, the kind of people who would buy these ultra-high-end cards are probably 4K gamers, so those are the more relevant numbers at this price range.
     
  15. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,875
    Likes Received:
    1,581
    I think they created a FuryX review thread ...
     
  16. pharma

    Veteran Regular

    Joined:
    Mar 29, 2004
    Messages:
    2,875
    Likes Received:
    1,581
  17. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,171
    Location:
    La-la land
    Been running my new ASUS R9 390X DirectCU II for a couple of hours of Folding@Home, and it appears stable so far... :) It's giving an excellent score of 340k PPD on my old Nehalem-based PC (my Haswell rig died Friday and may take 3 weeks to get back, and the gaming itch just got me, so I just HAD to run out and get this thing, lol.)

    Like any air-cooled high-end card, this thing is pretty loud when running full tilt, and the chassis it's mounted in is less than ideal for fat GPUs, so with the front noise-damping door closed the GPU climbs to as much as 93C. Keep the door open to let the front fan ingest air more freely and it's only 86-87C... :p Taking the side panel off doesn't help much either; the card runs somewhat cooler that way, hitting roughly 90C in an open chassis.
     
    Lightman likes this.
  18. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,171
    Location:
    La-la land
    Been playing WoW for about two hours now. No crashes, glitches or anything. The game runs beautifully with everything set to ultra at 2560*1440. Quite low fan noise too despite the high rez; I guess it's because WoW isn't shader-heavy, just fillrate-heavy.

    Gonna be so awesome when I get my main rig back again. I'll transplant this card to it, move my 290X to this box, then run Folding on both of them for awesomesauce PPDs! :lol:
     
    Razor1 and pharma like this.
  19. DieH@rd

    Legend Veteran

    Joined:
    Sep 20, 2006
    Messages:
    6,076
    Likes Received:
    2,007
    Try lowering the GPU fan noise in WoW further by tweaking Frame Rate Target Control. With the Crimson driver you can now do that globally for all games, or individually in per-game profiles.
     
  20. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,171
    Location:
    La-la land
    Yes, I just installed Crimson (like, 15 mins ago) and haven't had time to take a look at it yet, but I don't think frame rate target control will do much for me in WoW at least, because I'm already running the game with vsync on plus triple buffering.

    Anyway, Crimson looks like a great initiative by AMD, I hope there will be many good things coming out of this.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.