PS4 & XBone emulated?

Clock for clock you could say a Haswell core is just over 2x faster than a Jaguar core, theoretically speaking.

No. Microarchitecture matters; clock is a useless reference across very different architectures.
An Ivy Bridge can easily average 4 µops per cycle in small loops, whereas Jaguar cannot, as it is limited to 2 macro-ops (and note that a macro-op differs from a micro-op).
Also, the number of pipeline stages in Jaguar is 'X', whereas Ivy Bridge should be around 17, I think (don't remember at the moment). That makes a difference on jumps, and even more when the µop cache kicks in in tight loops on Intel architectures.

...so, whatever unit of measure you choose, don't use clock to compare architectures...

Maybe clock × average IPC over a meaningful(?) test set is a better metric, if you prefer one.
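
As a rough illustration of that metric, here's a minimal sketch; the IPC figures below are placeholder assumptions, not measurements, and real average IPC depends heavily on the workload (which is the whole point):

```python
# Illustrative "clock x average IPC" comparison.
# The avg_ipc values are placeholder assumptions, not benchmark results.

def effective_perf(clock_ghz, avg_ipc):
    """Crude single-thread throughput estimate, in billions of instructions per second."""
    return clock_ghz * avg_ipc

jaguar = effective_perf(clock_ghz=1.6, avg_ipc=1.0)    # assumed IPC, for illustration only
haswell = effective_perf(clock_ghz=3.9, avg_ipc=1.6)   # assumed IPC, for illustration only

print(f"Jaguar  ~{jaguar:.2f} Ginstr/s")
print(f"Haswell ~{haswell:.2f} Ginstr/s")
print(f"Ratio   ~{haswell / jaguar:.2f}x")
```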
 
In terms of overall single-threaded performance, an i5 Haswell ($200) is probably something like 4x the PS4 CPU performance for current PC software, right?

Well let's see, that's 3.9GHz vs 1.6GHz, a 2.4375x difference, so that'd mean about 1.64x better perf/MHz as well, which sounds about right. I don't agree with sebbi that it's about 2x perf/MHz vs Ivy Bridge, unless that's including a strong benefit from HT.
 
I wonder what would have happened if the CPU was as beefed up as the GPU on next-gen. Would the consoles then cost $1000?

Nah, a quad core Haswell with no iGPU would be tiny and very low power at ~2.5-3GHz. The consoles would be maybe $100 more expensive I guess depending on how badly Intel rips them off :p.

Of course then you would have a totally separate CPU and GPU, although still sharing the same memory pool assuming Intel modified the IMC to run GDDR5 for PS4. I assume the pros would still vastly outweigh the cons.
 
Of course then you would have a totally separate CPU and GPU, although still sharing the same memory pool assuming Intel modified the IMC to run GDDR5 for PS4. I assume the pros would still vastly outweigh the cons.

You can't just have two separate memory controllers accessing the same pool of memory. You'd need the CPU to go through the GPU's IMC or vice-versa. I really doubt Intel would even entertain doing a highly custom Haswell for consoles; at best they'd get a unique bin/fused feature set like they got for the first Xbox.

An Intel solution not using their IGP would almost certainly involve two memory pools and the non-CPU chip would almost certainly be some custom design and not an off the shelf discrete GPU.

The most likely show stopper is the economic need to keep shrinking the chip. Doing the CPU + GPU separately is more expensive; that was probably a big motivation for having an APU. Who knows if Intel would even be willing to sell Haswells indefinitely, much less continually shrink the chip so they can sell the same level of functionality for less and less as years go by.
 
Oh yeah it's just a fantasy. There are many reasons why it would never happen. But we can dream :)
 
According to this: http://gamingbolt.com/project-cars-...splitting-renderer-across-threads-challenging

On being asked if there was a challenge in development due to different CPU threads and GPU compute units in the PS4, Tudor stated that, “It’s been challenging splitting the renderer further across threads in an even more fine-grained manner – even splitting already-small tasks into 2-3ms chunks. The single-core speed is quite slow compared to a high-end PC though so splitting across cores is essential.

“The bottlenecks are mainly in command list building – we now have this split-up of up to four cores in parallel. There are still some bottlenecks to work out with memory flushing to garlic, even after changing to LCUE, the memory copying is still significant.”
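
To make the quote a bit more concrete, here's a minimal, generic sketch of splitting command-list building across worker threads; this is not Project CARS code, and every name in it is made up for illustration:

```python
# Generic sketch: split a frame's draw calls into small chunks and record
# command lists for them in parallel. All names here are hypothetical.
from concurrent.futures import ThreadPoolExecutor

def record_commands(draw_calls):
    """Pretend to record one command list for a chunk of draw calls."""
    return [f"draw {d}" for d in draw_calls]

def build_command_lists(draw_calls, workers=4, chunk_size=64):
    # Fine-grained chunks so each worker gets small tasks
    # (the interview talks about splitting work into 2-3 ms chunks).
    chunks = [draw_calls[i:i + chunk_size] for i in range(0, len(draw_calls), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Record in parallel; map() preserves chunk order for submission.
        return list(pool.map(record_commands, chunks))

if __name__ == "__main__":
    frame = list(range(1000))            # stand-in for one frame's draw calls
    lists = build_command_lists(frame)   # "up to four cores in parallel"
    print(len(lists), "command lists built")
```

On a real console you'd obviously record into the graphics API's deferred contexts/command buffers rather than Python lists, but the shape of the problem (many small, ordered recording jobs spread over a few cores) is the same.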
 
Anybody could tell you that the single-core speed is a lot slower than what you'd get in a high-end PC; that doesn't exactly need a developer's confirmation.
 
Don't know if that's fair, but it sounds a bit like a port that copies over PC inefficiencies.
 
Some of the best looking games of this past year seem to be almost insensitive to CPU clock speed and will behave excellently as long as you have a quad-core.
Check out the CPU performance benchmarks from Tomb Raider, Metro Last Light and Battlefield 4, for example.

CPU power is a huge factor in Metro Last Light. Watch as the FPS plummet when using older 3+GHz quad-core CPUs. http://www.techspot.com/review/670-metro-last-light-performance/page6.html

I think this might be proof that most games that require very high clocks or an Intel CPU to play decently are just running too much code that was originally written for the 3.2GHz CPUs in the last gen.

One or two cherry picked examples out of dozens of games does not constitute proof.

My point is: those CPUs aren't weak. It's just that we've been running such poorly optimized code on our PC CPUs for so long that now we think a 3.5GHz+ CPU is needed to play recent games.

Yes... they are weak. The PS4's CPU is roughly as powerful as an Intel i3-2100. http://www.redgamingtech.com/amd-jaguar-ps4s-cpu-vs-pc-desktop/


It's probably not, since both Sony and Microsoft chose 1.6GHz Jaguars for the new consoles.

They chose 1.6ghz Jaguars because both companies wanted an APU, required low power draw, and didn't want to lose a ton of money on each console considering Sony is in the tank financially and Microsoft's Xbox division is barely scratching by. Those factors brought both companies to AMD, who provided each of them with slightly different versions of the most powerful APU they could put together.

But while it's powerful for an APU, last I checked dual-core Jaguars run against Intel Atom chips, and an 8-core Jaguar chip like the PS4 has is around as fast as an i3-3100.
 
CPU power is a huge factor in Metro Last Light. Watch as the FPS plummet when using older 3+GHz quad-core CPUs. http://www.techspot.com/review/670-metro-last-light-performance/page6.html

Interesting results, but without knowing how much of the CPU load is due to rendering, and therefore how much of it might be avoided on consoles thanks to their much lower-overhead APIs, it's difficult to draw any solid conclusions.

Further note that Phenoms perform much, much better than Athlons. In particular, the dual-core Phenom X2 3.50GHz performs about the same as the quad-core Athlon X4 3.00GHz. The latter is of course running at slightly lower clocks, but still. A quick look at the Athlon X3 3.10GHz and Athlon X2 3.30GHz confirms that the game is reasonably well multithreaded, since the former is a good bit faster in spite of its lower clocks.

All in all, this indicates that Metro Last Light is very heavily cache-dependent. This is further confirmed by the FX-4100's performance at 3.60GHz: it's much higher than the A10-5700, which has the double benefit of Piledriver cores (vs. Bulldozer) and 100MHz higher clock speed. Developers should have a relatively easier time optimizing memory access patterns on PS4, since every unit shares the same memory architecture and is therefore much more predictable. In fact, it's quite possible that the developers of MLL didn't even bother testing their game on L3-less Athlons. If you look at the results, you'll notice that every CPU with a large L3 cache actually performs quite well.

Meanwhile, the Xbone features fast, embedded SRAM which, if properly used, would be even better.
 
Metro is a GPU-limited game; it runs very well even on dual-core CPUs with Very High graphics (it did on my E8400). In fact, all games use just 3 cores at maximum, with the exception of Far Cry 3 and Crysis 3, and both got patched and became far less CPU-intensive! Most of the time, about 50-60% of my 3770 is just sitting there idling, and that is in every game, even BF4 MP on the biggest, busiest of maps. Games are hardly thread-heavy to this day.

Look and run pathetically? I don't know what you're reading; most games that DF analyzed look about the same (some with better textures or more post-processing) and run somewhere between the PS3 and XBox 360 versions. A few have particularly notable issues, but a couple of others run consistently better.

Where? There are fewer than 10 multi-platform games featured on the Wii U. How many of them got rated better than the X360 or PS3 on any metric? One? The rest just got blasted for bad performance and/or image quality.
 
In fact, all games use just 3 cores at maximum, with the exception of Far Cry 3 and Crysis 3, and both got patched and became far less CPU-intensive! Most of the time, about 50-60% of my 3770 is just sitting there idling!

Sounds like you're mixing up threads and cores right there. Windows Task Manager shows an HT CPU with all its physical cores fully loaded (4 threads in Linpack, for example) as only 50% loaded.
 
It's kind of a fallacy to equate the Cell and Xenon like they are the same beast in regards to being "more powerful than next-gen CPUs". Cell was an absolute monster of its time in floating point performance, and could do a heck of a lot more than what Xenon could do. The only issue was that it was of course inflexible with many tasks and needed constant optimizing for the SPUs because of Sony skimping out on the GPU. The Cell had to cover for an entirely different part of the console a lot of the time, hence being limited in its usage to begin with.

It should go without saying that the Jaguar cores in these new machines beat both Xenon and Cell in flexibility, and the Jaguar cores definitely beat Xenon in just about everything else.
 
Sounds like you're mixing up threads and cores right there. Windows Task Manager shows an HT CPU with all its physical cores fully loaded (4 threads in Linpack, for example) as only 50% loaded.
Don't think that is true; I have seen many applications that can max my CPU (like AIDA64), and I've also seen games that reach more than 80%, like Far Cry 3 and Crysis 3. However, both have been patched to death, and with every patch I see CPU utilization dropping; now it barely hovers above 50%.
 
Don't think that is true; I have seen many applications that can max my CPU (like AIDA64), and I've also seen games that reach more than 80%, like Far Cry 3 and Crysis 3. However, both have been patched to death, and with every patch I see CPU utilization dropping; now it barely hovers above 50%.

It's true. Run any single threaded application and you'll see 12-13% utilization (1/8 of an 8 thread CPU). The actual utilization is a whole core, or 25%. Expand that to 4 threads and you have 50% and 100% respectively, for a 4 core 8 thread CPU.
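
A quick sketch of that arithmetic, assuming a 4-core/8-thread CPU like the 3770 mentioned above (and assuming the scheduler puts each busy thread on its own physical core):

```python
# Task Manager-style utilization vs. physical-core utilization
# for a 4-core / 8-thread (HT) CPU such as an i7-3770.
physical_cores = 4
logical_threads = 8

for busy_threads in (1, 4, 8):
    reported = busy_threads / logical_threads * 100   # what Task Manager shows
    # Assumes each busy thread lands on its own physical core (best case).
    actual = min(busy_threads, physical_cores) / physical_cores * 100
    print(f"{busy_threads} busy thread(s): reported ~{reported:.0f}%, physical cores ~{actual:.0f}%")
```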
 
Where? There are fewer than 10 multi-platform games featured on the Wii U. How many of them got rated better than the X360 or PS3 on any metric? One? The rest just got blasted for bad performance and/or image quality.

They're still within parity or close to parity relative to the PS3 and 360 versions. A few are in between the 360 and PS3 versions, the rest are on par or close to par. A couple were better. The main reason these ports were blasted is that "within parity" is unacceptable for a new $300+ console. All of these ports were done by smaller teams with much smaller budgets. They most likely ported the 360 code path (given that the 360 and Wii U both have a tri-core PPC CPU) and got it within parity. If, for example, disabling AF did the trick, so be it. They still gotta add gamepad functionality and debug it.

Wii U's CPU is a low-clock (1.2 GHz) tri-core PPC 750 with a larger and slightly more modern cache, OoOE, no SMT, and poor (GameCube-derived) floating point performance. And overall the multiplats are still near parity with the 360 versions.

Still, I'm not so sure that core-for-core a Jaguar core is overall THAT much better than an Espresso core, but there are 8 (6 for games apparently... at least for now) compared to Wii U's 3. For the new consoles, the GPU side is like 8x-10x better than last gen and there is 8x the RAM, but I don't think there is anything close to an 8x increase in CPU power. 2-2.5x maybe?
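
For what it's worth, here's a back-of-the-envelope sketch of how that kind of ballpark might be computed; the per-clock scaling factor is purely an assumed guess, not a measured number:

```python
# Crude aggregate CPU comparison, last-gen Xenon vs. PS4 Jaguar cores.
# The per-clock advantage factor is an assumption for illustration only.
xenon_cores, xenon_clock = 3, 3.2      # Xenon: 3 cores @ 3.2 GHz
jaguar_cores, jaguar_clock = 6, 1.6    # Jaguar cores available to games @ 1.6 GHz

assumed_per_clock_advantage = 2.5      # guessed Jaguar vs. Xenon per-clock factor

ratio = (jaguar_cores * jaguar_clock * assumed_per_clock_advantage) / (xenon_cores * xenon_clock)
print(f"Aggregate ratio under these assumptions: ~{ratio:.1f}x")
```

Swap in a different per-clock factor and the number moves accordingly, which is why this stays a rough guess rather than a measurement.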
 
Where? There are fewer than 10 multi-platform games featured on the Wii U. How many of them got rated better than the X360 or PS3 on any metric? One? The rest just got blasted for bad performance and/or image quality.

You said they look and run pathetically compared to XBox 360 and PS3, but somehow I need to counter that by showing that they run better? For most of them the performance was very similar.
 
You said they come in between the PS3/X360, so they must be at least better than the PS3.

I took the liberty of searching through DF articles to put this matter to rest; here you go:

Assassin's Creed 4: matching image quality but bad performance
http://www.eurogamer.net/articles/digitalfoundry-assassins-creed-4-next-gen-face-off

Splinter Cell Blacklist: very bad performance and low image quality
http://www.eurogamer.net/articles/digitalfoundry-splinter-cell-blacklist-face-off

Call of Duty Black Ops 2: matching visuals but very bad performance
http://www.eurogamer.net/articles/digitalfoundry-black-ops-2-wii-u-face-off

Darksiders 2: bad performance and image quality
http://www.eurogamer.net/articles/digitalfoundry-darksiders-2-on-wii-u-face-off

Tekken Tag Tournament 2: worse fps and slightly worse visuals
http://www.eurogamer.net/articles/digitalfoundry-tekken-tag-tournament-2-on-wii-u-face-off

Assassin's Creed 3: slightly worse fps and visuals
http://www.eurogamer.net/articles/digitalfoundry-assassins-creed-3-wii-u-face-off

Batman Arkham Origins: very bad performance and image quality
http://www.eurogamer.net/articles/digitalfoundry-batman-arkham-origins-face-off

Call of Duty Ghosts: very bad performance and visuals
http://www.eurogamer.net/articles/digitalfoundry-call-of-duty-ghosts-face-off

Resident Evil Revelations: bad fps and matching visuals
http://www.eurogamer.net/articles/digitalfoundry-resident-evil-revelations-face-off

Need for Speed Most Wanted: better fps and image quality
http://www.eurogamer.net/articles/digitalfoundry-need-for-speed-most-wanted-wii-u-face-off

Trine 2: better fps and visuals
http://www.eurogamer.net/articles/digitalfoundry-trine-2-face-off

Mass Effect 3: matching visuals and fps
http://www.eurogamer.net/articles/digitalfoundry-mass-effect-3-wii-u-face-off

That's it for the Wii U: 12 cross-platform titles, only 3 rated as good as or better than old-gen, and the rest blasted for bad fps/visuals. Couple that with the fact that many developers are not even considering porting their games to the Wii U, while the existing ones are dropping their support (no Wii U NFS this year), and you can see just how desperate the situation for the Wii U is.
 
Your comments are largely either grossly exaggerated or totally wrong. Like where you say Splinter Cell has "bad image quality" on Wii U when the conclusion says it looks the best of the three. Most of the games that you say are "blasted" only have fairly minor performance differences, often constrained to some particular areas or effects.

You're really missing the point here. I said that Wii U was at a similar capability level to XBox 360 and PS3 to illustrate how the ostensibly much more powerful CPU setup in XBox One and PS4 shows that they're not "just parity" with last gen like you were saying. Your retort that Wii U actually looks and runs pathetically compared to the two is pretty far from the truth. Your segue into Wii U's current third party situation is totally irrelevant.
 
Splinter Cell lacked any form of HD texture resolution; that's why it has low image quality. The conclusion states it looked clearer because of its less than 10% higher output resolution, which is irrelevant when the rest of the game is muddy and washed out, as is stated in the conclusion by the way. Not to mention the horrible pop-in.

Performance usually drops to the low 20s with stuttering and pauses in most of these games on a regular basis. That's not minor; that's a reduction of no less than 25-33% in performance, perhaps more.

The problem here with the Wii U is its very weak CPU. The console is able to approximately match the render resolution of the X360/PS3 thanks to its moderately better GPU, but it completely falls flat when the action flares up and the screen becomes busy. It also pares back image quality to save CPU cycles whenever it can.

And I find the third-party situation completely relevant: if the console were up to the task, nobody would have complained. Clearly it is not.
 