AMD Mantle API [updating]

I guess Frostbite is a big investment for EA with all of those subsidiaries using it. Must be seeing it as a competitive advantage and a potential way to save money during development.
 
potential way to save money during development.
Could be, if their recent behavior is anything to go by. Their E3 showings were very bad and uninteresting compared to last year, despite them having excellent games under their belt. With the exception of Dragon Age, they didn't show anything attractive for the rest of their lineup; even BF: Hardline was hardly presented in an interesting way, and the graphics were modest. Compared to Advanced Warfare, it was easily outmatched.

They say they didn't offer decent trailers because they wanted to embrace something like an "early access" model in their games, involving players and whatnot. But knowing EA, it sounds to me like a cheap way to slide out of launch problems (like the disastrous BF4 launch). In the Hardline beta, the same problems that plagued the BF4 launch still exist, so unless they work on fixing them soon, players will experience the BF4 launch all over again.

You can't just substitute decent showings that build hype for your upcoming games (and could propel them forward in market mindshare) with half-assed jobs slapped with the "early access" label, and then expect me to believe you are doing that on behalf of the players. All you are really doing is saving costs and reducing the visibility of your game.

IMO, there is a bit of chaos at EA. DICE is overwhelmed: fixing BF problems, building another two games, and also making plans for the next BF. BioWare is building three games too. EA put off NFS for the first time in a decade, and they also canned MOH and Dead Space. So either they are running out of cash or running out of good studios to build games for them, and they are cramming everything together.
 
God, talk about the drama :runaway:

People who cry about BF4 issues...

Do you not remember the launches of (I do, as I bought them all at launch):

BF2
BF2142
BF:BC2
BF3

You'd think it would be fourth time bitten, fifth time shy :LOL:. Or maybe it's the complexity of the mechanics and the scale of the gameplay that attracts people, and the side effect of those mechanics is bugs.
 
I was just reflecting on how many EA studios were previously using their own engines or, say, UE3. Now it looks like EA HQ decreed that their own Frostbite tech is to be pushed hard throughout the company. Or maybe it's just that all of their developers need a state-of-the-art engine and Frostbite is relatively free? Whatever.
 
I was just reflecting on how many EA studios were previously using their own engines or, say, UE3. Now it looks like EA HQ decreed that their own Frostbite tech is to be pushed hard throughout the company. Or maybe it's just that all of their developers need a state-of-the-art engine and Frostbite is relatively free? Whatever.

I think it's clear they're focusing on Frostbite, since they spun its development off into its own company/studio rather than just leaving it to DICE.
 
I was just reflecting on how many EA studios were previously using their own engines or, say, UE3. Now it looks like EA HQ decreed that their own Frostbite tech is to be pushed hard throughout the company. Or maybe it's just that all of their developers need a state-of-the-art engine and Frostbite is relatively free? Whatever.

When Mantle was first announced and DICE/Frostbite were talking about it, they were also stating that Frostbite was now the main engine for use throughout EA, and that implementing Mantle would help all their studios because of that. Frostbite is now a separate division from DICE, working solely on the engine for all of EA.
 
Those 1.6GHz numbers are fantastic, incredible. Is there any other source verifying the same results? Absolutely amazing for Mantle on average hardware.
 
Those 1.6GHz numbers are fantastic, incredible. Is there any other source verifying the same results? Absolutely amazing for Mantle on average hardware.
Theoretically nice (like I said, CPU overhead on Mantle/DX12 is legitimately far lower), but yet again this is not a particularly compelling competitive example. Even if someone bought a quite low-end CPU to pair with their $500+ video card, it still doesn't give AMD an edge, as NVIDIA runs >60 in all the tested cases as well. And the 720p/low cases are just a waste of time... once you're consistently over monitor refresh, you're done. If you're buying an expensive GPU to run at low settings so you can see "omg 200fps", you're an idiot.

The most useful numbers there are the 290X going from below 60 to comfortably above 60 (49 -> 93) on the theoretical 1.8GHz CPU in the 1080p case. So that's great if you already have a high-end AMD card and a fairly low-end CPU. But the NVIDIA card does just fine in DX11 there as well, so it's hardly a compelling purchase argument.

They really need to actually test this stuff on some more modest hardware and see if there's a sweet spot where this really allows you to save some cash on the CPU or similar. These underclocked/running on low with overpowered GPU situations are not realistic.
 
PCgameshardware.de has tested the performance in the Battlefield Hardline beta
http://www.pcgameshardware.de/Battl...Battlefield-Hardline-Beta-Benchmarks-1125079/

Take a look at the CPU results in 720p :D

I like the idea of Mantle. I really do. But most games are going to be engineered up front so that they don't depend on Mantle, though they might use it if available. This means FPS might be somewhat better, but overall performance is still going to depend on a GPU's CUs, etc.

Mantle will shine when people start comparing non-Mantle games with Mantle games and note the increase in the overall level of action made possible by Mantle, rather than simply FPS. Action doesn't scale the way resolution does, or the way texture detail does, or any of the other settings that exist to allow multiple tiers of cards to be used.

The trouble is that game companies would end up writing essentially two different games if they wanted to really exploit Mantle beyond the somewhat small increases in FPS, and they aren't going to do that.
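
To make the "use it if available" design concrete, here is a minimal sketch in C++ (entirely hypothetical type and function names; this is neither the real Mantle nor the real D3D11 API) of an engine picking a backend at startup while the game itself stays API-agnostic:

```cpp
#include <iostream>
#include <memory>

// Hypothetical renderer abstraction: gameplay code only talks to this
// interface, so it never depends on which API sits underneath.
struct IRenderer {
    virtual ~IRenderer() = default;
    virtual const char* name() const = 0;
    virtual void drawFrame() = 0;
};

struct D3D11Renderer : IRenderer {
    const char* name() const override { return "D3D11"; }
    void drawFrame() override { /* submit draws through the DX11 path */ }
};

struct MantleRenderer : IRenderer {
    const char* name() const override { return "Mantle"; }
    void drawFrame() override { /* build and submit command buffers */ }
};

// Stand-in for a real capability check (compatible driver + GCN GPU).
bool mantleSupported() { return false; }  // assumption: queried from driver

std::unique_ptr<IRenderer> createRenderer() {
    // Prefer Mantle when present, but always keep the DX11 fallback alive.
    // That fallback is exactly why the game can't be designed around
    // Mantle-only capabilities.
    if (mantleSupported())
        return std::make_unique<MantleRenderer>();
    return std::make_unique<D3D11Renderer>();
}

int main() {
    auto renderer = createRenderer();
    std::cout << "Using backend: " << renderer->name() << "\n";
    renderer->drawFrame();
}
```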

The same sort of thing happened in the old Amiga/Atari ST days. The Amiga had a primitive GPU with a coprocessor that could drive it (à la command buffers), more colors on screen, hardware scrolling, and hardware sprites.

And for years nearly all of these features were ignored by game designers. They simply wrote one game that could run on a basic 68000 system with a dumb frame buffer.
 
But the NVIDIA card does just fine in DX11 there as well, so it's hardly a compelling purchase argument.
@1.8GHz the difference between NV and AMD in DX11 is staggering!! NV has 72% more performance!

Also, the pattern is repeating once again, a GTX 770 is almost equal to the 290X in DX11 mode and Ultra/1080p!
 
@1.8GHz the difference between NV and AMD in DX11 is staggering!! NV has 72% more performance!

Also, the pattern is repeating once again, a GTX 770 is almost equal to the 290X in DX11 mode and Ultra/1080p!

@1.8GHz the difference between NV and AMD with Mantle is staggering!! AMD has 230% more performance!

See what I did there?


Also, the pattern is repeating once again, a GTX 770 is almost equal to the 290X in DX11 mode and Ultra/1080p



Now the interesting question: how many 770 users are running Windows 8.1?

They really need to actually test this stuff on some more modest hardware and see if there's a sweet spot where this really allows you to save some cash on the CPU or similar. These underclocked/running on low with overpowered GPU situations are not realistic.

If we assume that under Mantle we aren't CPU bound, then simple linear extrapolation is going to be pretty damn close (AMD has pretty consistent ALU/TMU ratios down the stack). 1.8GHz is nice when we consider typical laptop CPU performance. On my X1 Carbon (i5-4300U) in Diablo 3, with the help of ThrottleStop, I get the best system performance when my CPU is sitting @ 800MHz and I keep the GPU @ 1100MHz, so generally speaking, the more power to the GPU on the low end the better.
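
As a rough sketch of that linear extrapolation (my own toy model, assuming fully CPU-bound under DX11; it reuses the 49 fps @ 1.8GHz figure quoted earlier, while the 95 fps GPU ceiling is purely an assumption):

```cpp
#include <cstdio>

// If a game is CPU-bound, frame rate scales roughly linearly with CPU clock;
// once the CPU is fast enough (or Mantle removes the submission cost),
// the GPU limit takes over instead.
double estimateFps(double measuredFps, double measuredGhz,
                   double targetGhz, double gpuLimitFps) {
    double cpuBoundFps = measuredFps * (targetGhz / measuredGhz);
    return cpuBoundFps < gpuLimitFps ? cpuBoundFps : gpuLimitFps;
}

int main() {
    // Made-up sweep: 49 fps measured at 1.8 GHz, GPU tops out around 95 fps.
    for (double ghz : {1.8, 2.6, 3.4, 4.2})
        std::printf("%.1f GHz -> ~%.0f fps\n",
                    ghz, estimateFps(49.0, 1.8, ghz, 95.0));
}
```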

The thing I'm interested in is: what does Mantle bring that DX12 won't be able to? Will DX12 provide the same level of async compute as Mantle (and when will we see async compute in a Mantle product)? I can find a single line in a BF4 presentation saying 80% faster with their tile-based lighting system, but not yet in the BF4 code.

Given that the console GPUs already have access to the hardware to do low-level compute, is Mantle, and driving Mantle adoption, all about giving AMD an advantage in the DX12 timeframe, where complex compute that's already been developed for the consoles can be migrated more easily to the Mantle API than to DX12?
 
@1.8GHz the difference between NV and AMD with Mantle is staggering!! AMD has 230% more performance!
See what I did there?
Obviously you were looking at a different metric; stick to 1080p/4xAA and you will be in sync.

At this resolution, and with the CPU crippled to 1.8GHz:
@DX11 NV has a 73% advantage over AMD
@Mantle AMD has a 12% advantage over NV

Upgrading the CPU to 4.6GHz:
@DX11 NV has a 36% advantage over AMD
@Mantle NV has a 6% advantage over AMD, "again"

I am guessing that if we ran the CPU at stock (@3.9GHz) and compared the results to the overclocked state, we would see an increase in NV's advantage over AMD @DX11, while NV's advantage over AMD @Mantle decreased, to the point of equilibrium. At any rate, I don't consider this method of testing multiplayer games reliable at all; these were just interesting observations that were in line with previous ones. They are merely food for thought, useful maybe for studying GPU behavior under CPU overhead.
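
For anyone checking the arithmetic, those "X% advantage" figures are just relative frame rates; a trivial sketch with placeholder FPS values (not taken from the benchmark):

```cpp
#include <cstdio>

// Relative advantage of configuration A over configuration B, in percent.
double advantagePct(double fpsA, double fpsB) {
    return (fpsA / fpsB - 1.0) * 100.0;
}

int main() {
    // Placeholder numbers: 86 fps vs 50 fps is a ~72% advantage.
    std::printf("advantage: %.0f%%\n", advantagePct(86.0, 50.0));
}
```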

Now the interesting question: how many 770 users are running Windows 8.1?
Right now, Windows has nothing to do with it. That wasn't the case before the Mantle patch/DX11 driver, but now NV perf is almost equal on both Win 7 and 8.
 
@1.8GHz the difference between NV and AMD in DX11 is staggering!! NV has 72% more performance!
But it's an irrelevant gain in a contrived CPU limited situation that does nothing to improve the end user experience. *Nothing*.

Like I've said - several times - there are definitely large CPU overhead gains to be had from Mantle/DX12; I know first-hand. If there is any real disagreement on that, then people are just ignorant. But that is a different question from whether or not there's a compelling consumer argument at the moment. There may be, but these numbers don't show it, nor do the previously posted ones.

This is starting to remind me of one of the many threads on gamedev forums where people post things like "when I only do a clear I get 10,000 fps but when I draw a single triangle it drops to 1000 fps... that's a 10x drop! Imagine how slow it will be when I render a million triangles!". Yeah...
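
Converting that anecdote's own numbers to frame times shows why the reasoning is broken; a small sketch:

```cpp
#include <cstdio>

// FPS comparisons exaggerate cheap-frame cases; milliseconds per frame
// show the actual cost added by new work.
double msPerFrame(double fps) { return 1000.0 / fps; }

int main() {
    double clearOnly = msPerFrame(10000.0);  // 0.10 ms
    double oneTri    = msPerFrame(1000.0);   // 1.00 ms
    std::printf("clear only:   %.2f ms\n", clearOnly);
    std::printf("one triangle: %.2f ms\n", oneTri);
    // The triangle added ~0.9 ms of mostly fixed overhead; a million
    // triangles will not cost a million times that.
    std::printf("added cost:   %.2f ms\n", oneTri - clearOnly);
}
```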

I just want to see a real case where a plausible system running plausible settings would be better off going AMD because of Mantle. Is that too much to ask?
 
The only two cases where Mantle's relief of the API CPU bottleneck matters for current games, in my eyes, are laptops/portables and older systems where a GPU upgrade could be the best upgrade for the money in Mantle-enabled games. We've seen a few tests on old Phenom IIs quite a few months ago now; I'd personally like to see more. I know quite a few people with systems like this who have upgraded to modern video cards and find many games are an inconsistent stutter-fest. Of course they all want to upgrade the rest of their system, but as an example, if the one who went with a 7950 was able to play a handful of Mantle games just fine, while the one who went with a GTX 670 had stuttering in all games at high settings, then that is clearly going to win Mantle and AMD some favour. This sort of benefit is hard for AMD to market, though.

The laptop situation is easier to emulate, in the way the above benchmarks have done. This is the difference that AMD needs to market. Comparing to AMD under DX11 is misleading though, as they don't have a lot of motivation to optimise the DX11 path in Mantle games. Instead, compare to the competition with a mid-range laptop CPU. The difference at a resolution both can manage should be similar to the results seen above.
 
But it's an irrelevant gain in a contrived CPU limited situation that does nothing to improve the end user experience. *Nothing*.
Yeah, I totally agree with that. My angle here is that the difference in DX11 performance between AMD and NV in these Mantle games is so large now that it is in "utterly unacceptable" and "downright embarrassing" territory. AMD should ramp up its DX11 performance; I will be more than satisfied when that happens.
 
Yeah, I totally agree with that. My angle here is that the difference in DX11 performance between AMD and NV in these Mantle games is so large now that it is in "utterly unacceptable" and "downright embarrassing" territory. AMD should ramp up its DX11 performance; I will be more than satisfied when that happens.

Why would they devote man-hours to a DX11 path for a Mantle game? How exactly would it benefit them OR their customers?

One reason would certainly be older hardware that does not support Mantle, but we would have to look at tests using that older hardware to draw any conclusions as to whether it is being negatively impacted. That would show whether AMD is ignoring DX11 optimisation for GCN in Mantle games, or ignoring DX11 optimisation in Mantle games altogether.
 
Why would they devote man-hours to a DX11 path for a Mantle game? How exactly would it benefit them OR their customers?
Comparing to AMD under DX11 is misleading though, as they don't have a lot of motivation to optimise the DX11 path in Mantle games.
They should. Do you think the average Joes who buy AMD mid-range GPUs or have them in their laptops actually know what Mantle is, let alone think about activating it in the games that support it? These games run under DX11 by default; unless the user actually switches to Mantle, he will get the short end of the stick. Developers should at least detect the presence of Mantle-compatible GPUs and automatically make it the default setting, but they won't do that: they prefer defaulting to the most stable renderer to avoid problems, not to mention Mantle is usually added via a patch after the game launches.



Instead, compare to the competition with a mid-range laptop CPU. The difference at a resolution both can manage should be similar to the results seen above.
That wouldn't be enough. Mid-range CPUs in laptops are often coupled with mid-range GPUs; you need a high-end GPU with that processor to introduce the massive CPU overhead that Mantle would then mitigate. Choosing a mid-range CPU/GPU combo will not yield the best output for Mantle.

See this for example:
[image: star2_0.png]


Mantle mostly shows its gains in two ways: either couple a crippled CPU with a powerful GPU, or cripple a powerful CPU by choosing a low resolution or low quality settings.
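
A crude frame-time model makes this concrete (my own sketch, with illustrative numbers only): frame rate is capped by whichever of CPU submission or GPU rendering takes longer, so a CPU-side saving is only visible when the CPU is the cap.

```cpp
#include <algorithm>
#include <cstdio>

// Toy model: frame time ~= max(CPU submit time, GPU render time).
double fps(double cpuMs, double gpuMs) {
    return 1000.0 / std::max(cpuMs, gpuMs);
}

int main() {
    // Assumption for illustration: Mantle halves the CPU submit cost.
    double cpuDx11 = 20.0, cpuMantle = 10.0;

    // Case 1: crippled CPU + powerful GPU (GPU renders a frame in 8 ms).
    std::printf("weak CPU:  DX11 %.0f fps, Mantle %.0f fps\n",
                fps(cpuDx11, 8.0), fps(cpuMantle, 8.0));

    // Case 2: fast CPU at low resolution/settings (GPU nearly idle).
    std::printf("low res:   DX11 %.0f fps, Mantle %.0f fps\n",
                fps(cpuDx11 / 3.0, 2.0), fps(cpuMantle / 3.0, 2.0));

    // Case 3: balanced mid-range system -- GPU-bound either way, no gain.
    std::printf("balanced:  DX11 %.0f fps, Mantle %.0f fps\n",
                fps(cpuDx11 / 2.0, 16.0), fps(cpuMantle / 2.0, 16.0));
}
```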
 
No, when showing Mantle's CPU optimisations, that's the best way to demonstrate it. We have yet to see what Mantle's GPU optimisations can do, so making such a blanket statement has a pretty good chance of being erroneous (if you trust what DICE says).
 