Could be, if their recent behavior is anything to go by. Their E3 showing was very weak and uninteresting compared to last year, despite them having excellent games under their belt. With the exception of Dragon Age, they didn't show anything attractive in the rest of their lineup; even BF: Hardline was hardly presented in an interesting way, and the graphics were modest. Compared to Advanced Warfare, it was easily outmatched.

A potential way to save money during development.
I was just reflecting on how many EA studios were previously using their own engines or, say, UE3. Now it looks like EA HQ has decreed that their in-house Frostbite tech is to be pushed hard throughout the company. Or maybe it's just that all of their developers need a state-of-the-art engine and Frostbite is relatively free? Whatever.
Sorry, but what does that mean?

That they will "need" to use it...
Theoretically nice (like I said, CPU overhead on Mantle/DX12 is legitimately far lower), but yet again this is not a particularly compelling competitive example. Even if someone bought a quite low-end CPU to pair with their $500+ video card, it still doesn't give AMD an edge, as NVIDIA runs >60 fps in all the tested cases as well. And the 720p/low cases are just a waste of time... once you're consistently over monitor refresh, you're done. If you're buying an expensive GPU to run at low settings so you can see "omg 200fps", you're an idiot.

Those 1.6GHz numbers are fantastic, incredible. Any other source verification of the same results? Absolutely amazing for Mantle on average hardware.
PCgameshardware.de has tested the performance in the Battlefield Hardline beta:
http://www.pcgameshardware.de/Battl...Battlefield-Hardline-Beta-Benchmarks-1125079/
Take a look at the CPU results in 720p.
@1.8GHz the difference between NV and AMD in DX11 is staggering!! NV has 72% more performance!

But the NVIDIA card does just fine in DX11 there as well, so it's hardly a compelling purchase argument.
Also, the pattern is repeating once again: a GTX 770 is almost equal to the 290X in DX11 mode at Ultra/1080p!
They really need to test this stuff on some more modest hardware and see if there's a sweet spot where this actually lets you save some cash on the CPU or similar. These underclocked-CPU, low-settings, overpowered-GPU situations are not realistic.
Obviously you were looking at a different metric; stick to 1080p/4xAA and you will be in sync.

@1.8GHz the difference between NV and AMD with Mantle is staggering!! AMD has 230% more performance!
See what I did there?
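The baseline flip above is pure arithmetic: "X% more" depends entirely on which side you divide by, so the same two results can be spun as a huge win or a modest loss. A tiny sketch with made-up FPS numbers (purely illustrative, not the PCGH figures):

```python
def percent_more(a: float, b: float) -> float:
    """How much more performance a has than b, as a percentage of b."""
    return (a / b - 1) * 100

# Hypothetical FPS values chosen only to show the baseline asymmetry.
fast, slow = 86.0, 50.0
print(round(percent_more(fast, slow)))  # "72% more" with slow as the baseline
print(round(percent_more(slow, fast)))  # only "42% less" with fast as the baseline
```

The 72% and 230% quotes in the thread also come from different comparisons entirely (DX11 at 720p vs Mantle at 1080p/4xAA), which is the other half of the trick.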
Right now, Windows has nothing to do with it. That wasn't the case before the Mantle patch / the new DX11 driver, but now NV performance is almost equal on both Win 7 and Win 8.

Now the interesting question: how many 770 users are running Windows 8.1?
But it's an irrelevant gain in a contrived CPU-limited situation that does nothing to improve the end-user experience. *Nothing*.
Yeah, I totally agree with that. My angle here is that the gap in DX11 performance between AMD and NV in these Mantle games is now so large that it is in "utterly unacceptable" and "downright embarrassing" territory. AMD should ramp up its DX11 performance; I will be more than satisfied when that happens.
Why would they devote man-hours to a DX11 path for a Mantle game? How exactly would it benefit them OR their customers?
> Comparing to AMD under DX11 is deceiving though, as they don't have a lot of motivation to optimise the DX11 path in Mantle games.

They should. Do you think the average Joes who buy AMD mid-range GPUs, or have them in their laptops, actually know what Mantle is? Let alone think about activating it in the games that support it? These games run in DX11 by default; unless the user actually switches to Mantle, he gets the short end of the stick. Developers should at least detect the presence of a Mantle-compatible GPU and automatically make Mantle the default setting, but they won't do that: they prefer defaulting to the most stable renderer to avoid problems, not to mention Mantle is usually added via a patch after the game launches.
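The default-renderer logic being asked for here is simple to state. A minimal sketch, assuming a hypothetical launcher that can query the GPU vendor and family (the function name and family labels are made up for illustration; a real implementation would use something like DXGI adapter enumeration):

```python
# Hypothetical launcher logic: default to Mantle only when a
# Mantle-capable GPU (a GCN-based AMD part) is detected; otherwise
# fall back to the most stable renderer, DX11.
MANTLE_CAPABLE_VENDORS = {"AMD"}
GCN_FAMILIES = {"GCN 1.0", "GCN 1.1", "GCN 1.2"}  # illustrative labels

def pick_default_renderer(gpu_vendor: str, gpu_family: str) -> str:
    if gpu_vendor in MANTLE_CAPABLE_VENDORS and gpu_family in GCN_FAMILIES:
        return "Mantle"
    return "DX11"  # stable default for everything else

print(pick_default_renderer("AMD", "GCN 1.1"))    # Mantle
print(pick_default_renderer("NVIDIA", "Kepler"))  # DX11
```

As the post notes, shipping this is risky for a developer: Mantle often arrives in a post-launch patch, so the conservative DX11-by-default choice is understandable even if it shortchanges AMD owners who never open the settings menu.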
> Instead, compare to the competition with a mid-range laptop CPU. The difference at a resolution both can manage should be similar to the results seen above.

That wouldn't be enough. Mid-range CPUs in laptops are often coupled with mid-range GPUs; you need a high-end GPU with that processor to create the massive CPU overhead that Mantle would then mitigate. A mid-range CPU/GPU combo will not show Mantle at its best.