Is 4GB enough for a high-end GPU in 2015?

Engineers optimizing for HBM hard at work. :yep2:

[Screenshot: GTA V at 3840x2160 with FRAPS FPS counter]
 
One thing I've wondered about is whether, with HBM2, DDR4 and DX12 multi-engine, rendering with a lot of resource juggling would speed up and make lower (2 or 4 GB) memory capacities more feasible. If HBM/HBM2 gives enough bandwidth alongside DDR4, it would speed up data transfers to the GPU. And perhaps with DX12 multi-engine and HBM/HBM2, rendering won't slow down when there is a simultaneous DMA transfer, because of the abundance of bandwidth.
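To make the "multi-engine" part concrete, here is a minimal D3D12 sketch (assuming Windows, linking d3d12.lib, and a default adapter with feature level 11_0; purely illustrative, not code from any shipping game): rendering commands go to a direct queue while uploads go to a separate copy queue, which maps to the GPU's copy/DMA engine and can overlap with rendering.

```cpp
// Minimal sketch of D3D12 multi-engine queues: one direct (graphics) queue
// plus one copy queue, so resource uploads can run on the copy/DMA engine
// while rendering continues. Assumes Windows and d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12-capable device available.");
        return 1;
    }

    // Graphics work is submitted to a DIRECT queue...
    D3D12_COMMAND_QUEUE_DESC directDesc = {};
    directDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> directQueue;
    device->CreateCommandQueue(&directDesc, IID_PPV_ARGS(&directQueue));

    // ...while uploads are submitted to a separate COPY queue, which maps to
    // the GPU's copy/DMA engine and can overlap with work on the direct queue.
    D3D12_COMMAND_QUEUE_DESC copyDesc = {};
    copyDesc.Type = D3D12_COMMAND_LIST_TYPE_COPY;
    ComPtr<ID3D12CommandQueue> copyQueue;
    device->CreateCommandQueue(&copyDesc, IID_PPV_ARGS(&copyQueue));

    std::puts("Created direct (graphics) queue and copy (DMA) queue.");
    return 0;
}
```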
 
If HBM/HBM2 gives enough bandwidth alongside DDR4, it would speed up data transfers to the GPU. And perhaps with DX12 multi-engine and HBM/HBM2, rendering won't slow down when there is a simultaneous DMA transfer, because of the abundance of bandwidth.
I think the PCIe BW has been the main limiter for quite a while now.
 
I think the PCIe BW has been the main limiter for quite a while now.
You're right, my musings were more abstract though; I was just wondering if we'll have enough bandwidth at some point that resource juggling doesn't impact performance much. Besides, PCIe 4.0 and NVLink are coming in 2016, right?

But to be fair, 40% of 25.6 GB/s is only 10.24 GB/s (50% is 12.8 GB/s), while PCIe 3.0 can do 15.75 GB/s. (I figure the GPU will get 50% of system memory bandwidth at most.) And a lot of people do have vanilla 1600 RAM.
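For reference, a quick back-of-the-envelope check of those numbers (assuming dual-channel DDR3-1600 and a PCIe 3.0 x16 link; theoretical peak figures only):

```cpp
// Back-of-the-envelope bandwidth check: dual-channel DDR3-1600 vs PCIe 3.0 x16.
#include <cstdio>

int main() {
    // Dual-channel DDR3-1600: 1600 MT/s * 8 bytes per transfer * 2 channels.
    const double system_bw = 1600e6 * 8.0 * 2.0 / 1e9;                    // 25.6 GB/s
    // PCIe 3.0 x16: 8 GT/s per lane, 128b/130b encoding, 16 lanes.
    const double pcie3_bw  = 8e9 * (128.0 / 130.0) / 8.0 * 16.0 / 1e9;    // ~15.75 GB/s

    std::printf("System RAM (DDR3-1600, dual channel): %.2f GB/s\n", system_bw);
    std::printf("  40%% share for the GPU:             %.2f GB/s\n", system_bw * 0.40);
    std::printf("  50%% share for the GPU:             %.2f GB/s\n", system_bw * 0.50);
    std::printf("PCIe 3.0 x16 (one direction):         %.2f GB/s\n", pcie3_bw);
    return 0;
}
```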
 
I don't think we'll ever see NVLink in a product that matters for gaming, which means an Intel or AMD CPU. But even if it happened, as you point out, it'd still be a small fraction of the local DRAM bandwidth.
 
Also Watch_Dogs:

I bet Dying Light will suffer the same too.

Dying Light and Watch_Dogs have never worked well on AMD GPUs anyway. If you are a fan of those two games, don't buy an AMD GPU for them.

Dying Light, for example, was a stuttering mess on AMD GPUs when it was released. Not AMD's fault, but the game was never tested by the developer on an AMD GPU, let alone Watch_Dogs.

The next one you could try is Project CARS. It's funny to see TechReport using this game in the intro of their review, when AMD GPUs run it at absolutely criminal frame rates (a 780 Ti gets 200% more fps than a 290X in this game).
 
By your logic users should never buy an AMD GPU, since most triple-A games are supported by NVIDIA: PCars, Dying Light, GTA V, Witcher 3, Batman, Call of Duty, Assassin's Creed, Watch_Dogs, Far Cry and so on, with even more down the road.
 
By your logic users should never buy an AMD GPU, since most triple-A games are supported by NVIDIA: PCars, Dying Light, GTA V, Witcher 3, Batman, Call of Duty, Assassin's Creed, Watch_Dogs, Far Cry and so on, with even more down the road.

You can remove GTA V from the list, because it is not a TWIMTBP game, not a GameWorks title, and not an AMD Gaming Evolved one. It is maybe the most balanced PC game in a long time in terms of respecting every hardware brand. Far Cry 1 was an ATI title, Far Cry 2 was an NVIDIA one, Far Cry 3 was an AMD title, Far Cry 4 is an NVIDIA one. But that was the only Ubisoft series with a bit of freedom in this sense; every Ubisoft game is GameWorks now and every EA game is an AMD one, and those are the two biggest studios worldwide.

When you get a GameWorks or TWIMTBP game where AMD GPU performance is something like half of its NVIDIA counterpart (while in general the 290X is now faster in games than a 780 Ti), like Project CARS, Watch_Dogs, or Assassin's Creed, you don't use that in a review...

Hey, you should have seen the irony behind my first post... I never said I was right about that.
It's absolutely not logical.

At the same time, what is funny is that every game you have listed, apart from GTA V, and I will say Witcher 3 too apart from some little problems that were quickly fixed (it also uses a rendering engine really similar to AMD's Forward+), has been among the worst PC launches of the last 10 years.

Every one of the games you are listing would have made any PC gamer want to commit suicide when it launched. Watch_Dogs was a mess, Assassin's Creed was a mess, Batman AK is a mess, Far Cry 4 was a mess, Dying Light was a mess, Project CARS is a pure mess on AMD GPUs...

The good thing? They are not only a mess on AMD GPUs; they were often a complete mess on NVIDIA ones too.
 
Dying Light and Watch_Dogs have never worked well on AMD GPUs anyway. If you are a fan of those two games, don't buy an AMD GPU for them.

Dying Light, for example, was a stuttering mess on AMD GPUs when it was released. Not AMD's fault, but the game was never tested by the developer on an AMD GPU, let alone Watch_Dogs.

The next one you could try is Project CARS. It's funny to see TechReport using this game in the intro of their review, when AMD GPUs run it at absolutely criminal frame rates (a 780 Ti gets 200% more fps than a 290X in this game).

Our overall performance numbers come from the geometric mean of the scores across five of the six games we tested. (We chose to exclude DiRT Showdown, since the results skewed the average pretty badly and since AMD worked very closely with the developers on the lighting path tested.)

http://techreport.com/review/23527/review-nvidia-geforce-gtx-660-graphics-card/11

much credible, very non-biased, wow
 
If you're going to post one side of the story, you should post the other side as well: http://techreport.com/review/20126/amd-radeon-hd-6950-and-6970-graphics-processors/16

Yes, but it's slightly different, because HAWX 2 used very high amounts of tessellation even in cases where objects were far too distant for it to make any difference, and Ubisoft ignored AMD's suggestions to use adaptive tessellation. This is a rather obvious case of sabotaging the game to make GeForces look relatively better. The operative word being relatively, because Fermi took a hit too, just a lesser one than Evergreen.

There is nothing to suggest that there was anything like that in Dirt Showdown, aside from a rendering technique that happens to run better on AMD hardware.
 
If you're going to post one side of the story, you should post the other side as well: http://techreport.com/review/20126/amd-radeon-hd-6950-and-6970-graphics-processors/16

Interesting, but I didn't post the other side because that had already been posted. Very different from this 'other side', however.

Memory usage in different games here, curiously not included in the benchmarks.

http://www.hardwareluxx.com/index.p...5798-reviewed-amd-r9-fury-x-4gb.html?start=19
 
My point is: if TechReport were such shilling biased paid-off baddies, they wouldn't have excluded Hawx, irrespective of whether excluding it was warranted or not.
 