A fantasy game riding on the coattails of LOTR because they don't believe it can appeal otherwise, yet one that won't particularly benefit from the Middle-earth mythology and will completely ignore it for gameplay purposes.
So, another typical Middle-earth game then!
A few weeks ago, devs from Monolith talked about how the old-gen versions of the game will not get the advanced AI and procedural generation of "named" enemies that will be present on next-gen. I suppose the PC version will get everything the next-gen version offers, which may be the reason for the high CPU requirement.
I think we have come to a time when devs can't create games that max out next-gen potential using quad-core CPUs. Their workstations are most likely moving to 8-threaded machines.
It takes an i7 to match a 6-7 core Jaguar at 1.6-1.8GHz? Even an i3 should suffice.
Yes it can. First, the consoles have only 6 cores available to games (two are reserved for the OS). Second, an i3 could overpower Jaguar on clock speed alone, not to mention the advantages of better IPC and caches. Of course, the ideal situation would be a Core i5: 4 cores running at 3.2 GHz are brute force compared to Jaguar, and in that sense a Core i7 is untouchable. In fact, in the world of gaming, faster single-threaded performance always beats more threads, due to the inherent latency that comes with multi-threading.
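To make that concrete, here is a back-of-envelope comparison. The clocks and relative-IPC figures below are rough assumptions picked only for illustration, not benchmark results:

    // Rough throughput comparison (illustrative only; IPC ratios are assumed).
    #include <cstdio>

    struct Cpu {
        const char* name;
        int         cores;      // cores available to the game
        double      clock_ghz;  // sustained clock
        double      rel_ipc;    // per-clock throughput relative to Jaguar (assumed)
    };

    int main() {
        const Cpu cpus[] = {
            {"Jaguar (console, 6 game cores)", 6, 1.6, 1.0},
            {"Core i3 (2C/4T)",                2, 3.5, 2.0},
            {"Core i5 (4C)",                   4, 3.2, 2.0},
        };
        for (const Cpu& c : cpus) {
            double single = c.clock_ghz * c.rel_ipc;  // one core
            double total  = single * c.cores;         // assumes perfect scaling
            std::printf("%-34s single-core %.1f  aggregate %.1f\n",
                        c.name, single, total);
        }
        // The aggregate column assumes perfect scaling across cores, which real
        // game code never achieves -- so strong single-core numbers count for
        // even more than the totals suggest.
    }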
With Mantle and eventual DX12 adoption, low-end CPUs and modern quad-cores will get a chance to match and surpass the Jaguar cores. But until then, high sysreqs will be common.
Either way, if draw calls are the problem, DX12 or Mantle should certainly resolve the issues to the point where a fast i3 should be plenty to match the console CPUs.
Do you really think that an i3 can match the next-gen CPUs, when it is well known that DirectX severely blocks multithreading even for old-gen PC ports, draw calls are severely limited, and the large driver thread causes the entire CPU to stall?
Yes, an i3 performs better than the next-gen consoles in games even with the additional D3D overhead. A modern i3 is vastly superior to the console CPUs, especially in single-threaded performance, which is still very important even on the consoles with their thinner APIs.
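For what it's worth, here is a toy model of the CPU-side submission cost being argued about. The per-draw-call costs are assumed numbers chosen only to show why fewer calls (batching/instancing) or cheaper calls (Mantle/DX12-style thin APIs) matter; they are not measurements of any real driver:

    // Toy model of draw-call submission cost on the CPU (assumed numbers).
    #include <cstdio>

    int main() {
        const double frame_budget_ms  = 16.7;   // 60 fps frame budget
        const int    draw_calls       = 5000;   // a heavy scene
        const double cost_heavy_us    = 25.0;   // assumed cost per call, driver-heavy path
        const double cost_thin_us     = 3.0;    // assumed cost per call, thin API

        double heavy_ms = draw_calls * cost_heavy_us / 1000.0;
        double thin_ms  = draw_calls * cost_thin_us  / 1000.0;

        std::printf("Heavy-API submission: %.1f ms of a %.1f ms frame\n", heavy_ms, frame_budget_ms);
        std::printf("Thin-API submission:  %.1f ms of a %.1f ms frame\n", thin_ms,  frame_budget_ms);
        // On the heavy path, submission alone can eat most of the frame on a slow
        // core; a fast desktop core or a thinner API leaves room for game logic.
    }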
Middle-earth: Shadow of Mordor PC System Requirements:
Minimum:
OS: 64-bit: Vista, Win 7, Win 8
Processor: Intel Core i5-750, 2.67 GHz | AMD Phenom II X4 965, 3.4 GHz
Memory: 4 GB RAM
Graphics: NVIDIA GeForce GTX 560 | AMD Radeon HD 6950
DirectX: Version 11
Network: Broadband Internet connection
Hard Drive: 25 GB available space
Recommended:
OS: 64-bit: Win 7, Win 8
Processor: Intel Core i7-3770, 3.4 GHz | AMD FX-8350, 4.0 GHz
Memory: 8 GB RAM
Graphics: NVIDIA GeForce GTX 670 | AMD Radeon HD 7970
DirectX: Version 11
Network: Broadband Internet connection
Hard Drive: 40 GB available space
I can't wait to see the PC sysreqs for next-gen-only games [Arkham Knight, AC:U, Division, Witcher...]
It's always like this.
Well, consoles take full advantage of the Jaguar vector units; it's not like game code is full of "if/elses", and in this respect (the vector unit) Jaguar is quite a beast. This, along with the mentioned GPU API advantage, plus (and this is the most important part) programmers hand-tuning every bit of code down to the CPU registers (see Naughty Dog's notes about programming on the PS4), is what gives these tiny CPUs their advantage.

Vectorization or not, all the supposed advantages are hammered away by the clock speed, IPC, cache, and even memory bandwidth advantages of PC CPUs. It is not like developers don't optimize for PCs either; the optimization may not be 100% efficient, but 80-90% is more than enough given the more powerful PC CPUs. The API advantage is present, but it is very small.
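For anyone wondering what "taking advantage of the vector units" actually looks like, here is a minimal sketch using 128-bit SSE intrinsics (Jaguar's FP units are 128 bits wide). The alignment and size assumptions are only there to keep the example short:

    // One 128-bit instruction processes four floats at once instead of one.
    #include <xmmintrin.h>
    #include <cstdio>

    // Scalar: one multiply per iteration.
    void scale_scalar(float* out, const float* in, float k, int n) {
        for (int i = 0; i < n; ++i)
            out[i] = in[i] * k;
    }

    // Vectorized: four multiplies per iteration (assumes n is a multiple of 4
    // and the pointers are 16-byte aligned, to keep the sketch short).
    void scale_sse(float* out, const float* in, float k, int n) {
        __m128 vk = _mm_set1_ps(k);
        for (int i = 0; i < n; i += 4) {
            __m128 v = _mm_load_ps(in + i);
            _mm_store_ps(out + i, _mm_mul_ps(v, vk));
        }
    }

    int main() {
        alignas(16) float in[8]  = {1, 2, 3, 4, 5, 6, 7, 8};
        alignas(16) float out[8] = {};
        scale_sse(out, in, 2.0f, 8);
        for (float f : out) std::printf("%.0f ", f);  // prints: 2 4 6 8 10 12 14 16
        std::printf("\n");
    }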
Yes, it is a common practice nowadays; Thief and COD Ghosts are among the best examples of that. And the final game always ends up requiring and utilizing far less powerful hardware. Maybe it's just some kind of marketing deal with IHVs? You know, to motivate people to spend $$$ upgrading their systems in anticipation of the game?
"Thief and COD Ghosts are among the best examples of that." Best or worst?