And we can see that in action in Killzone 2/3 vs Shadow Fall comparisons.
You can see, in terms of the ragdoll physics and hit detection, that it's more impressive in Killzone on PS3 than on PS4.
Still... Killzone 4 is of course more demanding on the CPU in other regards than Killzone 2/3 were on the PS3...
Killzone 4??!! There's a Killzone 4?
The interviews with GG mentioned it wasn't given a number; Killzone 4 hasn't happened, or at least hasn't been released yet.
Killzone Shadow Fall (unlike Killzone 1, 2, and 3) is a mere launch title. Despite all those marketing claims of "easy to dev for", it's still a launch title, and launch titles usually serve one purpose: to be part of the launch lineup in the hope that gamers buy them, like them, and that they get sequels.
Assuming that either the PS4's or the Xbox One's Jaguar CPU cores are bad at A.I. is foolish.
On average, a proper game engine and game take two to three or so years to make. I hope you noticed how neither Halo 3 nor Killzone 2 was a launch title, and how long it took for them to materialize.
Also note that Killzone SF initially dropped features from previous games while adding new ones in both single-player and online MP... when a proper Killzone sequel is released we'll see a huge difference in A.I., features, and perhaps even graphics.
To add to the Cell BE discussion, there's the old Killzone 2 "42 minute" presentation/interview where they list how they used the SPUs. Note that there actually was a graphics upgrade in Killzone 3, and on top of that they added Sony's in-house MLAA, which runs on the SPUs and has since been used by other PS3-exclusive games. Offloading MLAA frees up the RSX to render more or higher polygon counts, or at least get closer to its reasonable limits.
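Just to illustrate the idea (a minimal conceptual sketch using ordinary C++ threads, not actual PS3/SPU code, no libspe or RSX calls, and all names below are made up for illustration): the win from SPU-based MLAA is pipelining, i.e. while the GPU renders frame N, the SPUs filter frame N-1, so the anti-aliasing costs the GPU almost nothing.

```cpp
// Conceptual sketch only: stand-in types and functions, just the pipelining pattern.
#include <future>
#include <vector>
#include <cstdio>

using Frame = std::vector<float>;              // stand-in for a rendered image buffer

Frame render_on_gpu(int n) {                   // pretend "RSX" work for frame n
    return Frame(1024, static_cast<float>(n));
}

Frame mlaa_on_spus(Frame f) {                  // pretend "SPU" post-process pass
    for (float& px : f) px *= 0.5f;            // stand-in for the edge/blend filter
    return f;
}

int main() {
    std::future<Frame> aa_in_flight;           // frame currently being filtered

    for (int frame = 0; frame < 4; ++frame) {
        Frame rendered = render_on_gpu(frame);            // GPU busy with frame N

        if (aa_in_flight.valid()) {
            Frame done = aa_in_flight.get();               // frame N-1 finished AA
            std::printf("presented AA'd frame, first px = %.1f\n", done[0]);
        }
        // kick AA for frame N off to worker threads while the GPU moves on
        aa_in_flight = std::async(std::launch::async, mlaa_on_spus, std::move(rendered));
    }
    if (aa_in_flight.valid()) aa_in_flight.get();          // drain the last frame
    return 0;
}
```

The AA work overlaps with rendering instead of eating into the GPU's frame budget, which is why the RSX gets that time back.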
Also keep in mind that with Nvidia's G80 and G92, overall GPU power doubled over the G70-based RSX and even Xenos.
AMD had a golden opportunity contract... they got it, and their solution, as weak as we perceive it, is still more than last gen... The gamble is that the PS4's 1.8 TF GPU, plus having more than 4 GB of RAM (8 GB), will obviously allow it to take over tasks the PS3 had to handle elsewhere, so we just have to wait and see until next year, just like the ramp-ups of the previous gen.
Thinking of these consoles, or last-gen consoles, in PC terms is just foolish PC-gamer marketing-hype mentality.
How long did it take for quad-core CPUs to become a minimum requirement?
PCs may have great power, but why didn't DX10 become a standard minimum requirement, and why did it take Crytek so many years to make CE3? Or their competitors to do the same?
I believe a marketing rep claimed on a late-night show that the PS4 or Xbox One was "ten times more powerful" than last gen, and that footage showed up in the video game movie...
Also, thinking Intel would get the contract is kinda foolish. Intel is so advanced and so far ahead because they supply CPUs not only for mainstream PCs and laptops but also for workstations and enterprise; Intel's main focus is to invest and produce. AMD just went a different route after the Athlon 64/Opteron era when they bought ATI, and frankly, from the budget and TDP perspective of the agreement, they thought they were right even if we don't believe it.
Console pricing, timing, and unreliable customers prevented another PS3-type upgrade.
At the end of the day it was probably the only real choice; AMD probably offered a great deal for CPU+GPU, and Jaguar makes more sense than the Bulldozer-based CPUs for size/power.
I just think it's a shame that they're stuck with it for a while. I mean, Jaguar was pretty immature at launch; the latest refresh (Beema) uses "turbo" for higher single-threaded performance and is more power efficient... people overclocking AM1 can hit 2.5GHz relatively easily on Kabini Jaguar cores (with a good motherboard), and Beema can boost to 2.4GHz.
Meanwhile the PS4 is stuck at a fixed 1.6GHz; if they could boost, say, a couple of cores to 2.4GHz, wouldn't that probably be a significant help?
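Rough back-of-envelope on that (all numbers below are assumptions, nothing Sony has published): a 2.4GHz boost over 1.6GHz is a 1.5x per-core gain, but the whole frame only speeds up by however much of it actually sits on the boosted cores, Amdahl's-law style.

```cpp
// Hypothetical boost scenario: how much of the frame would actually get faster?
#include <cstdio>

int main() {
    const double base_clock  = 1.6;   // GHz: the PS4's fixed Jaguar clock
    const double boost_clock = 2.4;   // GHz: hypothetical Beema-style boost
    const double per_core_gain = boost_clock / base_clock;   // = 1.5x on a boosted core

    // Assume some fraction of the frame is critical-path work that could be
    // pinned to the boosted cores; the rest doesn't benefit from the boost.
    for (double critical : {0.25, 0.50, 0.75}) {
        double frame_speedup = 1.0 / ((1.0 - critical) + critical / per_core_gain);
        std::printf("if %.0f%% of the frame is boosted -> ~%.2fx overall\n",
                    critical * 100.0, frame_speedup);
    }
    return 0;
}
```

So even in the optimistic case the frame gains noticeably less than the raw 1.5x clock bump, though for games bottlenecked on one or two heavy threads it could still matter.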
And I would guess a 20nm shrink could possibly achieve more... but it's the same situation as last gen, with MS having to simulate a slow FSB; it's a shame.
Jaguar at 1.6GHz is horrible for a gaming PC, but on the consoles they're going to get everything they can out of it and the GPU, and I'm sure the results will be amazing.
Just dreaming a little, but it would have been cool of Sony to add a die-shrunk Cell as a coprocessor to the PS4, for physics and backwards compatibility. Imagine that... or maybe it would be a nightmare for the devs, useless and expensive; I'm so clueless.
As much as clock speeds do affect PCs, AMD was putting out statements that "clock speeds don't matter" about two years ago, IIRC... not sure if that was just marketing or something else, but the difference between the custom consoles' 1.x GHz clocks and a 2.5GHz PC APU core isn't going to matter, because the consoles are closed boxes with custom coding, even when using older 3D engines.
PC gamers should just be grateful when a dev/publisher decides to make a PC version, where you most likely get higher settings but still have to have paid over $300 for a single graphics card to get decent performance... it's just not the same... sure, PCs have more power, but the cost is higher and the port has to get greenlit in the first place.
We've barely started this current gen... cool discussion, but still, just being realistic.
I would have preferred that all three had waited for 2013/2014 hardware solutions, but they made those decisions way back. The biggest problem is balancing TDP and wattage when choosing parts; otherwise it would have been interesting to see an evolved Cell BE 2 or 3 combined with a customized GTX 980 "RSX-2" as a 2014 PS4, and a comparable AMD solution for the Xbox. But the power draw might have required big, bulky consoles with a big $600 price tag... if only customers could be relied on to buy them, they might have made it... otherwise it's harsh...