AMD Mantle API [updating]

60 fps in BF4 MP is kind of low, and most frequent players say it feels sluggish, especially with DX: not only is the frame time distribution worse, but apparently the absolute lag is too. If you believe people who have tried Mantle, the same fps feels smoother with Mantle.
Don't start this "feels smoother" BS again... 120Hz monitor, fine. Everyone else v-synced flat 60 is perfect. BF4 already limits the maximum number of queued frames (in full-screen mode at least).

And I'll happily kick any of your asses @ 60 ;)

The only "extreme unrealistic case" I see in way too many reviews is those tests using overclocked 500-1000€ Intel CPUs, since less than 1 gamer in every 200 uses such a CPU.
That's true, but by that logic all of the Kaveri testing (or AMD CPU testing in general) is even less useful. If you're interested in folks with older CPUs upgrading to a new graphics card, you really have to test the older CPUs. You can't generalize CPU results to older architectures... PD is not going to be a good predictor for Nehalem performance, etc.

I do understand that graphics card reviewers want to avoid having their CPU as a bottleneck, but what they don't understand is that most of them end up alienating 99.5% of PC gamers by not giving them a realistic example of what to expect with that particular graphics card.
This is not a new problem. Reviewing 290X's at all is a similar issue... almost no one has them or will buy them, but for whatever reason people want to see the high end stuff reviewed, then generalize the result and go buy something midrange even if it's not better. People want to have simple things in their mind like "780Ti>290X therefore NVIDIA better therefore I'll buy a 750". It's stupid, but it's how people think. (<rant>AND ALL OF YOU GUYS DO IT WITH IGPUs!!! :p</rant>)

I'm all for testing older CPUs to see what sort of boost you can get there, but I don't think pairing them with $500+ GPUs makes any more sense for these clearly budget-conscious folks.

Careful there, or you'll get banned by Andrew… :LOL:
:D
 
More than 50% of people in the Steam survey have two or fewer CPU cores.
And a 290X?
The frame times with Kaveri already look kind of hinky to me, so widening the CPU/GPU gap promises a patchy experience.
Getting a less outsized GPU puts things right back to being GPU-bound, where freeing it from its DX shackles only gets you a teeny bit further.
Mantle could be great, but the GPUs don't live up to it.

Besides, just because my 2500K has the headroom to wade through the current API garbage doesn't mean that I wouldn't rather it was doing things that would make the games better.
As long as Mantle remains niche, devs will make their games conform to the majority platform.
 
Afaik Intel still controls Havok. http://www.havok.com/customer-projects/games

Does Havok still mainly use the CPU? If so, Intel has an incentive to promote its advancement (IMO). A little googling gives me a link that implies the GPU can handle Havok. http://www.gamephys.com/tag/havok-physics/

Maybe the option to use either is there. My point is that, if the GPU is loaded and the CPU is hardly taxed, it would be a win-win for all of us if we could just hit a switch and enable some more gaming goodness by taking better advantage of our CPUs.
 
v-synced flat 60 is perfect.
What? V-sync in a multiplayer game? The latency for me is atrocious. For me 100 fps is close to perfect - that's why I don't really care about MSAA in BF4.

People want to have simple things in their mind like "780Ti>290X therefore NVIDIA better therefore I'll buy a 750". It's stupid, but it's how people think.
Totally agree. Products like the Titan or 780 Ti create greater faith in a brand's lower-segment products.
 
And a 290X?
The frame times with Kaveri already look kind of hinky to me, so widening the CPU/GPU gap promises a patchy experience.
Getting a less outsized GPU puts things right back to being GPU-bound, where freeing it from its DX shackles only gets you a teeny bit further.
Mantle could be great, but the GPUs don't live up to it.

That's only if you come at it from the angle of maximum res, maximum quality. I think that's another example of the "enthusiast reviewer" distorted mindset.

If you're happy to reduce GPU load (reduce IQ, turn off features or reduce res) you can boost frame rates massively even with a shitty card, and in doing so play a good, fun multiplayer match very competitively. That's assuming you have the CPU power to take you there of course ... which Mantle makes more likely.

There's this bogus idea doing the rounds that everyone plays games at max res and max quality, and so a shit CPU doesn't matter if you have a mid/low end card because you'll have shit frame rate anyway ... because you'll be playing games at max res and max quality.

It's circular reasoning and it's wrong. Mantle isn't "wasted" even on a low end GPU because you can still get great frame rates from a mainstream CPU.

As long as Mantle remains niche, devs will make their games conform to the majority platform.

Which is why Mantle - or something like it (hopefully cross vendor) - needs to become common.
 
You're thinking too much as an enthusiast. I have a lot of friends with Phenom II X4 and X6 CPUs who primarily play Battlefield 4, and with Mantle they have been given a reasonable upgrade path for the graphics card. Same goes for all the folks with first-generation Core i5 and i7 CPUs, and Core i3s.
First-gen i5 or i7 sounds about right too. They are decent CPUs after all, as long as they are quad cores with decent clocks.

If we are going to cater to the average Joes, then no game would ever be tested at the highest visual quality and the highest resolution using any of the big-gun GPUs. We are an enthusiast community discussing technical matters that are only revealed by pushing tech to its limits, not by getting a hack-job CPU and whistling at how it behaves above its class level.

Next-gen games are around the corner, and all old CPUs are in for a bloody phasing-out as they are no longer fit for the job. Heck, I replaced my E8400 a year ago because it was utterly useless. Old quad-core CPUs are already feeling the heat.

I am not convinced the main community of BF4 is just a whole bunch of people with low-end hardware; it just doesn't make sense to me (and maybe Repi could chime in with some hardware charts of his own). BF has always been a game for enthusiastic gamers, and many people upgrade their PCs just to play BF comfortably. In fact, Battlefield and Crysis have always driven high-end hardware sales up with their visual fidelity.

More than 50% of people in the Steam survey have two or fewer CPU cores.
And you can be sure that is because all of them are using Steam for F2P and/or old games.
Give it another year and Battlefield won't even start on a dual core CPU.
Heck, even the official requirements for BF4 list a Core i5 or i7.
 
That's only if you come at it from the angle of maximum res, maximum quality. I think that's another example of the "enthusiast reviewer" distorted mindset.
That is not the realm for which Mantle provides significant benefit, so that was not under discussion.

If you're happy to reduce GPU load (reduce IQ, turn off features or reduce res) you can boost frame rates massively even with a shitty card, and in doing so play a good, fun multiplayer match very competitively.
That's assuming you have the CPU power to take you there of course ... which Mantle makes more likely.
If it's not at least as good as Kaveri, it doesn't look like there's as much breathing room.

There's this bogus idea doing the rounds that everyone plays games at max res and max quality, and so a shit CPU doesn't matter if you have a mid/low end card because you'll have shit frame rate anyway ... because you'll be playing games at max res and max quality.
Irrelevant to the debate at hand, since even per AMD that is not where Mantle makes a major difference.

It's circular reasoning and it's wrong. Mantle isn't "wasted" even on a low end GPU because you can still get great frame rates from a mainstream CPU.
The frame times are pretty janky on Kaveri when running with a 290X that is massively overprovisioned for the test load. If reliable, high performance is your thing, persisting with a CPU even weaker than that is asking for high headline FPS numbers on a glass-jaw gaming rig.
 
That's true, but by that logic all of the Kaveri testing (or AMD CPU testing in general) is even less useful. If you're interested in folks with older CPUs upgrading to a new graphics card, you really have to test the older CPUs. You can't generalize CPU results to older architectures... PD is not going to be a good predictor for Nehalem performance, etc.

Agreed, which is why I've been asking for Mantle (p)reviews with a broader range of CPUs. At the very least, more than one 150€ APU plus one 500€ CPU.

Kaveri results are interesting for checking the performance of its iGPU and trying to deduce the performance of a dual-module Piledriver @ ~4GHz.
Unfortunately, by not pairing that APU with quad-channel DDR3 or dual-channel GDDR5, AMD managed to turn Kaveri into quite a boring product, merely a shadow of what could have been.




This is not a new problem. Reviewing 290X's at all is a similar issue... almost no one has them or will buy them, but for whatever reason people want to see the high end stuff reviewed, then generalize the result and go buy something midrange even if it's not better. People want to have simple things in their mind like "780Ti>290X therefore NVIDIA better therefore I'll buy a 750". It's stupid, but it's how people think. (<rant>AND ALL OF YOU GUYS DO IT WITH IGPUs!!! :p</rant>).

But it's a snowball effect, IMO led by reviewers themselves.
I would give more attention to a review of an R7 260X paired with a Core i3 and an FX-6300 than to the typical "let's pair this 120€ graphics card with a 300€ i7-4770K on a 180€ motherboard and a 100€ water cooler, overclocked to 4.5GHz".

Reviewers should pair mid-range GPUs with mid-range CPUs, and they should explain why.


I'm all for testing older CPUs to see what sort of boost you can get there, but I don't think pairing them with $500+ GPUs makes any more sense for these clearly budget-conscious folks.

Well, pairing an R9 290X with a Celeron or A4-4000 is a bit stupid, but AFAICS there's very little gain in pairing it with the most expensive Core i7 over the cheapest Haswell Core i5, even in DirectX. Crossfire/SLI aside, of course.
 
What? V-sync in a multiplayer game? The latency for me is atrocious.
Sounds like there's a problem with your setup.

If you're happy to reduce GPU load (reduce IQ, turn off features or reduce res) you can boost frame rates massively even with a shitty card, and in doing so play a good, fun multiplayer match very competitively. That's assuming you have the CPU power to take you there of course ... which Mantle makes more likely.
Turning down settings also greatly reduces the load on the CPU in most games as it usually involves shorter view distances, lower LODs/proxies, fewer shadow cascades, etc.
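To illustrate the point, here's a toy C++ sketch (nothing engine-specific, just an assumption about how distance culling typically works): with a shorter view distance more objects get culled before any draw call is ever recorded, so the CPU-side submission cost drops along with the GPU cost.

    // Toy sketch: objects beyond the view distance never reach the API,
    // so lowering the view-distance setting also cuts per-frame CPU work.
    #include <cstdio>
    #include <vector>

    struct Object { float distance; };

    int drawCallsFor(const std::vector<Object>& scene, float viewDistance) {
        int drawCalls = 0;
        for (const auto& obj : scene) {
            if (obj.distance > viewDistance)
                continue;        // culled: no draw call, no driver/API cost
            ++drawCalls;         // each surviving object costs CPU time to submit
        }
        return drawCalls;
    }

    int main() {
        std::vector<Object> scene;
        for (int i = 0; i < 10000; ++i)
            scene.push_back({static_cast<float>(i % 1000)});
        std::printf("Ultra (1000m view): %d draws\n", drawCallsFor(scene, 1000.0f));
        std::printf("Low   (300m view):  %d draws\n", drawCallsFor(scene, 300.0f));
    }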

Again, if you want to make the "consumer" argument, you just have to show that some config with AMD GPU+Mantle is now cheaper than any other config for a given level of performance.

I'd definitely be curious if repi had any numbers on how many people run on roughly what presets (Low/Med/High/Ultra). Not sure if they collect that, but failing that it would be interesting to know if BF player hardware tends to be a lot different from the Steam numbers. I'm guessing it will be somewhat more skewed towards enthusiasts, but how much is the question. If I were to venture a guess, I'd still say the majority probably use <$200 GPUs even in BF.
 
I'd definitely be curious if repi had any numbers on how many people run on roughly what presets (Low/Med/High/Ultra). Not sure if they collect that, but failing that it would be interesting to know if BF player hardware tends to be a lot different from the Steam numbers. I'm guessing it will be somewhat more skewed towards enthusiasts, but how much is the question. If I were to venture a guess, I'd still say the majority probably use <$200 GPUs even in BF.

Don't have figures about Low/Med/High/Ultra usage (but that would be interesting!), but do have HW stats from Origin specifically for BF4 and it is definitely more high-end than the Steam stats, which makes sense.
 
Careful there, or you'll get banned by Andrew… :LOL:

Uh-oh... I meant around 6-ish percent higher performance (in whatever metric you prefer) :D

Which is why Mantle - or something like it (hopefully cross vendor) - needs to become common.
In that case, as soon as game developers have adopted it, we'll be standing where we were right before Mantle, albeit with better-looking games.
 
Don't have figures about Low/Med/High/Ultra usage (but that would be interesting!), but do have HW stats from Origin specifically for BF4 and it is definitely more high-end than the Steam stats, which makes sense.

Aren't GeForce Experience and AMD's Raptr collecting settings data in the cloud?
Might be a place to start.
 
Don't start this "feels smoother" BS again...
Those are widespread comments, not only from AMD followers but even from nV fans who have tried it.
120Hz monitor, fine. Everyone else v-synced flat 60 is perfect. BF4 already limits the maximum number of queued frames (in full-screen mode at least).
As said, the frame time distribution with Mantle is narrower than with DX; there is simply less variation between adjacent frames (just check a few reviews looking at that). And without the heavy driver threads processing all the draw calls, I would also suspect a slightly lower absolute latency with Mantle, even though BF4 puts a low limit on the queued frames with DX.
So, don't be too hasty by calling it BS. ;)
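For reference, this is the kind of number those reviews report. A minimal sketch (my own, assuming a per-frame capture in milliseconds from something like FRAPS or PresentMon) of how an average and a 99th-percentile frame time are computed; a narrow distribution just means the 99th percentile sits close to the average:

    // Minimal sketch: average and 99th-percentile frame time from a capture.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    void report(std::vector<double> frameTimesMs) {
        std::sort(frameTimesMs.begin(), frameTimesMs.end());
        double sum = 0.0;
        for (double t : frameTimesMs) sum += t;
        const double avg = sum / frameTimesMs.size();
        const double p99 =
            frameTimesMs[static_cast<size_t>(0.99 * (frameTimesMs.size() - 1))];
        std::printf("avg %.1f ms (%.0f fps), 99th percentile %.1f ms\n",
                    avg, 1000.0 / avg, p99);
    }

    int main() {
        // made-up sample data, not measurements
        report({15.2, 16.8, 16.1, 33.4, 15.9, 16.4, 17.0, 16.2});
    }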
 
Those are widespread comments, not only from AMD followers but even from nV fans who have tried it.
It's qualitative nonsense. If you're locked at 60Hz there is zero frame time variance - every frame is the same speed. If you limit the queue depth you have entirely predictable (and small) "latency". There's simple math for all of this and there's nothing magical that you can do on the CPU to change it. If you're locked at 60 you're fine, period.
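For what it's worth, both the mechanism and the math are trivial. A sketch under stated assumptions (this is the stock D3D11/DXGI call for capping the render-ahead queue, not anything BF4-specific, and it assumes a v-synced 60Hz display):

    // Cap the render-ahead queue via DXGI and show the resulting worst-case
    // queued latency at a locked 60Hz: queueDepth * 16.7ms, the same every frame.
    #include <d3d11.h>
    #include <cstdio>

    void CapQueueDepth(ID3D11Device* device, UINT queueDepth) {
        IDXGIDevice1* dxgiDevice = nullptr;
        if (SUCCEEDED(device->QueryInterface(__uuidof(IDXGIDevice1),
                                             reinterpret_cast<void**>(&dxgiDevice)))) {
            dxgiDevice->SetMaximumFrameLatency(queueDepth);  // e.g. 1-3 frames
            dxgiDevice->Release();
        }
        const double frameMs = 1000.0 / 60.0;  // v-synced, locked 60Hz
        std::printf("worst-case queued latency: ~%.1f ms\n", queueDepth * frameMs);
    }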

I'm obviously all for frame time distribution investigations à la TechReport (see my initial thread that argued why they are important and - humility aside - played a role in the wider community accepting them). I will note though that TR's own results showed that while variance on AMD was reduced vs. DX, NVIDIA's DX driver did as well or better.

Anyways, I'm not disputing anything about Mantle being good or not. I'm just saying the "100fps just feels so much smoother than locked 60 on my 60Hz monitor" is just a useless anecdote, and I have zero faith in the average user's ability to objectively determine such things to begin with.
 
I'm just saying the "100fps just feels so much smoother than locked 60 on my 60Hz monitor" is just a useless anecdote, and I have zero faith in the average user's ability to objectively determine such things to begin with.
If that's the case then the only solution would be that you test it by yourself. :LOL:
Btw, my first statement on this topic specifically mentioned monitors with 120 and 144Hz refresh rates.
 
I'm just saying the "100fps just feels so much smoother than locked 60 on my 60Hz monitor" is just a useless anecdote, and I have zero faith in the average user's ability to objectively determine such things to begin with.

You can choose to disregard these anecdotes, but I clearly see a difference between 60 and 100 fps even on a 60 Hz monitor. In CS, for example, playing at 60 fps is like playing with a handicap compared to 100. And after using a 120 Hz monitor I will never go back to 60; it feels like 30 Hz to me now.
 
You can choose to disregard these anecdotes, but I clearly see a difference between 60 and 100 fps even on a 60 Hz monitor. In CS, for example, playing at 60 fps is like playing with a handicap compared to 100. And after using a 120 Hz monitor I will never go back to 60; it feels like 30 Hz to me now.

Likely what you are seeing is higher average FPS (lower average frame times) basically resulting in higher minimum FPS (lower max frame times).

And even when that isn't the case, Mantle seems to affect minimum FPS (max frame times) greatly even if the average doesn't change significantly.
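Put as plain arithmetic (just the fps = 1000 / frame-time identity, with made-up numbers): "minimum FPS" and "maximum frame time" are the same worst frame seen from two directions.

    // fps and frame time are reciprocals: the worst (longest) frame in a run
    // is exactly what shows up as the "minimum FPS".
    #include <cstdio>

    int main() {
        const double avgFrameTimeMs = 14.0;   // made-up average frame time
        const double maxFrameTimeMs = 33.3;   // made-up worst single frame
        std::printf("average: %.0f fps, worst frame: %.0f fps\n",
                    1000.0 / avgFrameTimeMs, 1000.0 / maxFrameTimeMs);
    }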

Regards,
SB
 
Those without an AMD processor are GPU limited because DX and the developers using it work so well. It doesn't take much effort to become GPU limited, and with AMD's lack of progress on multiple fronts and contrived examples like Oxide, the effort to get to AMD's sweet spot becomes progressively unreasonable as time goes on.

The 290X's middling scaling with Mantle when using an Intel CPU, compared with itself or the 780 still on D3D, shows that even if the API gets out of the way, AMD's GPUs barely take you further.
AMD could present a stronger case for the API holding its wonderful GPUs back if they didn't stall out a mere 12 FPS higher than with said API.

If the API had less overhead, you would have the option to do more things on the CPU. That can certainly help GPU-limited scenarios.

My point is not about AMD's awesomeness, it's about enabling a more balanced workload distribution by removing artificial bottlenecks.
 