Is 4GB enough for a high-end GPU in 2015?

Discussion in 'Architecture and Products' started by Albuquerque, Jun 9, 2015.

  1. snc

    snc Veteran

  2. DavidGraham

    DavidGraham Veteran

    Also Watch_Dogs:


    I bet Dying Light will suffer the same too.
     
  3. gamervivek

    gamervivek Regular

    Engineers optimizing for HBM hard at work. :yep2:

     
  4. pharma

    pharma Veteran

    Using the same PC setup as the "official" AMD Benchmark guide? ;-)
     
  5. flopper

    flopper Newcomer

You're saying they misled the customer, then...
     
  6. Infinisearch

    Infinisearch Veteran

One thing I've wondered about: with HBM2, DDR4, and DX12 multi-engine, would rendering with a lot of resource juggling speed up enough to make lower memory capacities (2 or 4GB) more feasible? If HBM/HBM2 gives enough bandwidth alongside DDR4, data transfers to the GPU would be faster. And perhaps with DX12 multi-engine and HBM/HBM2, rendering won't slow down during a simultaneous DMA transfer, because of the abundance of bandwidth.
     
  7. silent_guy

    silent_guy Veteran Subscriber

    I think the PCIe BW has been the main limiter for quite a while now.
     
  8. gamervivek

    gamervivek Regular

  9. Alessio1989

    Alessio1989 Regular

How much total memory (VRAM + shared system memory) does DXDIAG report for the Fury X?
     
  10. Infinisearch

    Infinisearch Veteran

You're right, though my musings were more abstract; I was just wondering if at some point we'll have enough bandwidth that resource juggling doesn't impact performance much. Besides, PCIe 4.0 and NVLink are coming in 2016, right?

But to be fair, 40% of 25.6 GB/s is only 10.24 GB/s (50% is 12.8 GB/s), and PCIe 3.0 can do 15.75 GB/s. (I figure the GPU will get at most 50% of system memory bandwidth.) And a lot of people still have vanilla DDR3-1600 RAM.
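For reference, the back-of-envelope numbers above can be reproduced with a short sketch. Assumptions (not stated in the post): dual-channel DDR3-1600 with a 64-bit (8-byte) bus per channel, and a PCIe 3.0 x16 link at 8 GT/s per lane with 128b/130b encoding.

```python
# Back-of-envelope check of the bandwidth figures above.
# Assumptions: dual-channel DDR3-1600, PCIe 3.0 x16, 128b/130b encoding.

def ddr_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    """Peak DRAM bandwidth in GB/s: transfers/s * channels * bytes per transfer."""
    return mt_per_s * channels * bus_bytes / 1000  # MT/s -> GB/s

def pcie3_x16_gbs(lanes=16, gt_per_s=8.0):
    """PCIe 3.0 peak: 8 GT/s per lane, 128b/130b encoding, 8 bits per byte."""
    return lanes * gt_per_s * (128 / 130) / 8

dram = ddr_bandwidth_gbs(1600)
print(f"DDR3-1600 dual channel: {dram:.2f} GB/s")   # 25.60 GB/s
print(f"40% share for the GPU:  {0.4 * dram:.2f} GB/s")  # 10.24 GB/s
print(f"50% share for the GPU:  {0.5 * dram:.2f} GB/s")  # 12.80 GB/s
print(f"PCIe 3.0 x16:           {pcie3_x16_gbs():.2f} GB/s")  # 15.75 GB/s
```

So even a 50% slice of vanilla DDR3-1600 is below what the PCIe link itself can move, which supports the point that the bus, not system RAM, is the ceiling.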
     
  11. silent_guy

    silent_guy Veteran Subscriber

    I don't think we'll ever see nvlink in a product that matters for gaming, which means: an Intel or AMD CPU. But even if it happened, as you point out, it'd still be a small fraction of the local DRAM.
     
  12. lanek

    lanek Veteran

Dying Light and Watch_Dogs have never worked well on AMD GPUs anyway. If you're a fan of those two games, don't buy an AMD GPU for them.

Dying Light, for example, was a stuttering mess on AMD GPUs at release. Not AMD's fault, but the game seems never to have been tested by the developer on an AMD GPU, let alone Watch_Dogs.

The next one you could try is Project CARS. It's funny to see TechReport using this game in the intro of their review, when AMD GPUs run at absolutely criminal framerates in it (a 780 Ti gets 200% more fps than a 290X in this game).
     
  13. DavidGraham

    DavidGraham Veteran

By your logic, users should never buy an AMD GPU, since most triple-A games are supported by NVIDIA: PCars, Dying Light, GTA V, Witcher 3, Batman, Call of Duty, Assassin's Creed, Watch_Dogs, Far Cry and so on, with even more down the road.
     
    Last edited: Jun 27, 2015
  14. lanek

    lanek Veteran

You can remove GTA V from the list, because it is not a TWIMTBP game, not a GameWorks title, and not an AMD Gaming Evolved one. It is maybe the most even-handed PC game toward every hardware brand in a long time. Far Cry 1 was an ATI title, Far Cry 2 an NVIDIA one, Far Cry 3 an AMD title, Far Cry 4 an NVIDIA one. But that was the only Ubisoft series that was somewhat free in this sense; every Ubisoft game is GameWorks now and every EA game is an AMD one, and those are the two biggest publishers worldwide.

When you get a GameWorks or TWIMTBP game where AMD GPUs perform at something like half their NVIDIA counterparts (when the 290X is now generally faster in games than a 780 Ti), like Project CARS, Watch_Dogs, or Assassin's Creed, you don't use it in a review...

Hey, you should have seen the irony behind my first post; I never said I was right about that.
It's absolutely not logical.

At the same time, what is funny is that every game you listed, outside GTA V (and I would add The Witcher 3, apart from some small problems that were quickly fixed; it also uses a rendering engine really similar to AMD's Forward+), has had among the worst launches of the last ten years on PC.

Every one of those games made PC gamers despair at launch. Watch_Dogs was a mess, Assassin's Creed was a mess, Batman: Arkham Knight is a mess, Far Cry 4 was a mess, Dying Light was a mess, Project CARS is a pure mess on AMD GPUs...

The good thing? They are not only a mess on AMD GPUs; they were often a complete mess on NVIDIA ones too.
     
    Last edited: Jun 28, 2015
  15. Esrever

    Esrever Regular

The 290X seems to be fine with 4GB in GTA V, from what people are posting.
     
  16. gamervivek

    gamervivek Regular

    http://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/11

    much credible, very non-biased, wow
     
  17. silent_guy

    silent_guy Veteran Subscriber

  18. Alexko

    Alexko Veteran Subscriber

Yes, but it's slightly different, because Hawx 2 used very high amounts of tessellation even where objects were far too distant for it to make any difference, and Ubisoft ignored AMD's suggestions to use adaptive tessellation. That is a rather obvious case of sabotaging the game to make GeForces look relatively better. The operative word being relatively, because Fermi took a hit too, just a lesser one than Evergreen.

    There is nothing to suggest that there was anything like that in Dirt Showdown, aside from a rendering technique that happens to run better on AMD hardware.
     
  19. gamervivek

    gamervivek Regular

    Interesting, but I didn't post the other side because that was already posted. Very different from this 'other side' however.

    Memory usage in different games here, curiously not included in the benchmarks.

    http://www.hardwareluxx.com/index.p...5798-reviewed-amd-r9-fury-x-4gb.html?start=19
     
  20. silent_guy

    silent_guy Veteran Subscriber

My point is: if TechReport were such shilling, biased, paid-off baddies, they wouldn't have excluded Hawx, whether or not excluding it was warranted.
     