PS4 & XBone emulated?

Discussion in 'Architecture and Products' started by alexsok, Dec 10, 2013.

  1. pMax

    Regular

    Joined:
    May 14, 2013
    Messages:
    327
    Likes Received:
    22
    Location:
    out of the games
    No. Microarchitecture matters; clock is a useless reference across very different architectures.
    An Ivy Bridge can easily average 4 macro-ops per cycle in small loops, whereas Jaguar cannot, as it is limited to 2 (and note that a macro-op differs from a micro-op).
    Also, the number of pipeline stages in Jaguar is 'X', whereas Ivy Bridge should have around 17 I think (don't remember at the moment). That makes a difference on jumps, and even more when the µop cache kicks in on tight loops in Intel architectures.

    ...so, whatever unit of measure you choose, don't use clock speed to compare architectures...

    Maybe clock × average IPC over a meaningful(?) test set is better, if you prefer a single number.
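
    To make that metric concrete, here is a minimal sketch of a clock × average-IPC comparison; the IPC figures are purely illustrative assumptions, not measured values for either core:

    ```python
    # Rough "clock x average IPC" comparison sketch.
    # The avg_ipc values below are illustrative assumptions, not measurements.
    def est_throughput(clock_ghz, avg_ipc):
        """Naive single-thread estimate: instructions retired per second."""
        return clock_ghz * 1e9 * avg_ipc

    ivy_bridge = est_throughput(clock_ghz=3.9, avg_ipc=2.0)  # assumed IPC
    jaguar     = est_throughput(clock_ghz=1.6, avg_ipc=1.2)  # assumed IPC

    print(f"Ivy Bridge vs Jaguar: ~{ivy_bridge / jaguar:.2f}x")
    ```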
     
  2. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    Well, let's see: that's 3.9GHz vs 1.6GHz, a 2.4375x difference, so that'd mean about 1.64x better perf/MHz as well, which sounds about right. I don't agree with sebbi that it's about 2x perf/MHz vs Ivy Bridge, unless that's including a strong benefit from HT.
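
    For reference, the arithmetic behind those two ratios; the ~4x overall single-thread gap is inferred from the figures quoted here, not something stated outright:

    ```python
    # Back-of-the-envelope check of the ratios quoted above.
    # The ~4x overall gap is an inferred/assumed figure, not a measurement.
    clock_ratio = 3.9 / 1.6             # ~2.4375x clock advantage
    overall_perf_ratio = 4.0            # assumed overall single-thread gap
    perf_per_mhz = overall_perf_ratio / clock_ratio
    print(f"perf/MHz advantage: ~{perf_per_mhz:.2f}x")  # ~1.64x
    ```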
     
  3. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
    Nah, a quad core Haswell with no iGPU would be tiny and very low power at ~2.5-3GHz. The consoles would be maybe $100 more expensive I guess depending on how badly Intel rips them off :razz:.

    Of course then you would have a totally separate CPU and GPU, although still sharing the same memory pool assuming Intel modified the IMC to run GDDR5 for PS4. I assume the pros would still vastly outweigh the cons.
     
  4. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    You can't just have two separate memory controllers accessing the same pool of memory. You'd need the CPU to go through the GPU's IMC or vice versa. I really doubt Intel would even entertain doing a highly custom Haswell for consoles; at best they'd get a unique bin/fused feature set like they got for the first XBox.

    An Intel solution not using their IGP would almost certainly involve two memory pools and the non-CPU chip would almost certainly be some custom design and not an off the shelf discrete GPU.

    The most likely showstopper is the economic need to keep shrinking the chip. Doing the CPU + GPU separately is more expensive; that was probably a big motivation for having an APU. Who knows if Intel would even be willing to sell Haswells indefinitely, much less continually shrink the chip so they can sell the same level of functionality for less and less as years go by.
     
  5. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
    Oh yeah it's just a fantasy. There are many reasons why it would never happen. But we can dream :)
     
  6. alexsok

    Regular

    Joined:
    Jul 12, 2002
    Messages:
    807
    Likes Received:
    2
    Location:
    Toronto, Canada
    According to this: http://gamingbolt.com/project-cars-...splitting-renderer-across-threads-challenging

     
  7. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    Anybody could tell you that the single core speed is a lot slower than what you'd get in a high end PC, that doesn't exactly need a developer's confirmation.
     
  8. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,762
    Likes Received:
    2,639
    Location:
    Maastricht, The Netherlands
    Don't know if that's fair, but it sounds a bit like a port that copies over PC inefficiencies.
     
  9. Andrew

    Newcomer

    Joined:
    Jul 26, 2002
    Messages:
    58
    Likes Received:
    5
    CPU power is a huge factor in Metro Last Light. Watch the FPS plummet when using older 3+GHz quad-core CPUs. http://www.techspot.com/review/670-metro-last-light-performance/page6.html

    One or two cherry picked examples out of dozens of games does not constitute proof.

    Yes... they are weak. The PS4's CPU is roughly as powerful as an Intel i3-2100. http://www.redgamingtech.com/amd-jaguar-ps4s-cpu-vs-pc-desktop/


    They chose 1.6GHz Jaguars because both companies wanted an APU, required low power draw, and didn't want to lose a ton of money on each console, considering Sony is in the tank financially and Microsoft's Xbox division is barely scraping by. Those factors brought both companies to AMD, who provided each of them with slightly different versions of the most powerful APU they could put together.

    But while it's powerful for an APU, last I checked dual-core Jaguars run against Intel Atom chips, and an 8-core Jaguar chip like the PS4's is around as fast as an i3-2100.
     
  10. Alexko

    Veteran Subscriber

    Joined:
    Aug 31, 2009
    Messages:
    4,541
    Likes Received:
    964
    Interesting results, but without knowing how much of the CPU load is due to rendering, and therefore how much of it might be avoided on consoles thanks to their much lower-overhead APIs, it's difficult to draw any solid conclusions.

    Further note that Phenoms perform much, much better than Athlons. In particular, the dual-core Phenom X2 3.50GHz performs about the same as the quad-core Athlon X4 3.00GHz. The latter is of course running at slightly lower clocks, but still. A quick look at the Athlon X3 3.10GHz and Athlon X2 3.30GHz confirms that the game is reasonably well multithreaded, since the former is a good bit faster in spite of its lower clocks.

    All in all, this indicates that Metro Last Light is very heavily cache-dependent. This is further confirmed by the FX-4100's performance at 3.60GHz: it's much higher than the A10-5700's, which has the double benefit of Piledriver cores (vs. Bulldozer) and 100MHz higher clock speed. Developers should have an easier time optimizing memory access patterns on the PS4, since every unit shares the same memory architecture, which is therefore much more predictable. In fact, it's quite possible that the developers of MLL didn't even bother testing their game on L3-less Athlons. If you look at the results, you'll notice that every CPU with a large L3 cache actually performs quite well.

    Meanwhile, the Xbone features fast, embedded SRAM which, if properly used, would be even better.
     
  11. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Metro is a GPU-limited game; it runs very well even on dual-core CPUs with Very High graphics (it did on my E8400). In fact, all games use just 3 cores at maximum, with the exception of Far Cry 3 and Crysis 3, and both got patched and became far less CPU intensive! Most of the time, about 50-60% of my 3770 is just sitting there idling, and that is in every game, even BF4 MP on the biggest, busiest of maps. Games are hardly thread-heavy to this day.

    Where? There are fewer than 10 multi-platform games featured on the Wii U. How many of them got rated better than the X360 or PS3 on any metric? One? The rest just got blasted for bad performance and/or image quality.
     
  12. STaR GaZeR

    Newcomer

    Joined:
    Dec 10, 2011
    Messages:
    16
    Likes Received:
    0
    Sounds like you're mixing up threads and cores right there. Windows Task Manager shows a fully loaded HT CPU (4 threads in Linpack, for example) as only 50% loaded.
     
  13. Inuhanyou

    Veteran

    Joined:
    Dec 23, 2012
    Messages:
    1,305
    Likes Received:
    480
    Location:
    New Jersey, USA
    It's kind of a fallacy to equate the Cell and Xenon as if they were the same beast in regards to being "more powerful than next-gen CPUs". Cell was an absolute monster of its time in floating point performance, and could do a heck of a lot more than Xenon could. The only issue was that it was, of course, inflexible for many tasks and needed constant optimization on the SPUs because Sony skimped on the GPU. Cell had to cover for an entirely different part of the console a lot of the time, hence it was limited in how it could be used to begin with.

    It should go without saying that the Jaguar cores in these new machines beat both Xenon and Cell in flexibility, and the Jaguar cores definitely beat Xenon in just about everything else.
     
  14. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    I don't think that is true. I have seen many applications that can max out my CPU (like AIDA64), and games that reach more than 80%, like Far Cry 3 and Crysis 3. However, both have been patched to death, and with every patch I see CPU utilization dropping; now it barely hovers above 50%.
     
  15. STaR GaZeR

    Newcomer

    Joined:
    Dec 10, 2011
    Messages:
    16
    Likes Received:
    0
    It's true. Run any single-threaded application and you'll see 12-13% utilization (1/8 of an 8-thread CPU). The actual utilization is a whole core, or 25%. Expand that to 4 threads and you have 50% and 100% respectively, for a 4-core, 8-thread CPU.
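
    A minimal sketch of that bookkeeping, assuming a 4-core/8-thread part and that each busy software thread fully occupies one physical core (a simplification):

    ```python
    # Why Task Manager under-reports load on an SMT CPU.
    # Assumes a 4-core/8-thread part; each busy software thread is assumed
    # to pin one physical core.
    PHYSICAL_CORES = 4
    LOGICAL_THREADS = 8

    for busy in (1, 4):
        reported = busy / LOGICAL_THREADS * 100                      # % of logical threads busy
        core_util = min(busy, PHYSICAL_CORES) / PHYSICAL_CORES * 100  # % of physical cores busy
        print(f"{busy} busy thread(s): Task Manager ~{reported:.0f}%, core utilization {core_util:.0f}%")
    ```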
     
  16. Melqart

    Newcomer

    Joined:
    Nov 12, 2013
    Messages:
    36
    Likes Received:
    0
    They're still within parity or close to parity relative to the PS3 and 360 versions. A few are in between the 360 and PS3 versions, the rest are on par or close to par. A couple were better. The main reason these ports were blasted is that "within parity" is unacceptable for a new $300+ console. All of these ports were done by smaller teams with much smaller budgets. They most likely ported the 360 code path (given that the 360 and Wii U both have a tri-core PPC CPU) and got it within parity. If, for example, disabling AF did the trick, so be it. They still have to add GamePad functionality and debug it.

    The Wii U's CPU is a low-clock (1.2 GHz) tri-core PPC 750 with a larger and slightly more modern cache, OoOE, no SMT, and poor (GameCube-derived) floating point performance. And overall the multiplats are still near parity with the 360 versions.

    Still, I'm not so sure that core-for-core a Jaguar core is overall THAT much better than an Espresso core, but there are 8 of them (6 for games apparently... at least for now) compared to the Wii U's 3. For the new consoles, the GPU side is like 8x-10x better than last gen and there is 8x the RAM, but I don't think there is anything close to an 8x increase in CPU power. 2-2.5x maybe? (A rough sketch of that estimate follows.)
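
    One rough way to put numbers on the aggregate Jaguar-vs-Espresso comparison above; the per-clock factor is a pure assumption, not a measured figure:

    ```python
    # Back-of-the-envelope aggregate CPU uplift, using the clocks and core
    # counts mentioned in this thread. The per-clock (IPC) factor is assumed.
    jaguar_cores, jaguar_ghz = 6, 1.6        # cores reportedly available to games
    espresso_cores, espresso_ghz = 3, 1.2
    assumed_per_clock_factor = 1.0           # assume roughly comparable per-clock throughput

    uplift = (jaguar_cores * jaguar_ghz * assumed_per_clock_factor) / (espresso_cores * espresso_ghz)
    print(f"Rough aggregate CPU uplift: ~{uplift:.1f}x")  # ~2.7x under these assumptions
    ```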
     
  17. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    You said they look and run pathetically compared to XBox 360 and PS3, but somehow I need to counter that by showing that they run better? For most of them the performance was very similar.
     
  18. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    You said it comes in between the PS3/X360, so they must be at least better than the PS3 versions.

    I took the liberty of searching through DF articles to put this matter to rest, here you go:

    Assassin's Creed 4: matching image quality but bad performance
    http://www.eurogamer.net/articles/digitalfoundry-assassins-creed-4-next-gen-face-off

    Splinter Cell Blacklist: very bad performance and low image quality
    http://www.eurogamer.net/articles/digitalfoundry-splinter-cell-blacklist-face-off

    Call of Duty Black Ops 2: matching visuals but very bad performance
    http://www.eurogamer.net/articles/digitalfoundry-black-ops-2-wii-u-face-off

    Darksiders 2: bad performance and image quality
    http://www.eurogamer.net/articles/digitalfoundry-darksiders-2-on-wii-u-face-off

    Tekken Tag Tournament 2: worse fps and slightly worse visuals
    http://www.eurogamer.net/articles/digitalfoundry-tekken-tag-tournament-2-on-wii-u-face-off

    Assassin's Creed 3: slightly worse fps and visuals
    http://www.eurogamer.net/articles/digitalfoundry-assassins-creed-3-wii-u-face-off

    Batman Arkham Origins: very bad performance and image quality
    http://www.eurogamer.net/articles/digitalfoundry-batman-arkham-origins-face-off

    Call of Duty Ghosts: very bad performance and visuals
    http://www.eurogamer.net/articles/digitalfoundry-call-of-duty-ghosts-face-off

    Resident Evil Revelations: bad fps and matching visuals
    http://www.eurogamer.net/articles/digitalfoundry-resident-evil-revelations-face-off

    Need for Speed Most Wanted: better fps and image quality
    http://www.eurogamer.net/articles/digitalfoundry-need-for-speed-most-wanted-wii-u-face-off

    Trine 2: better fps and visuals
    http://www.eurogamer.net/articles/digitalfoundry-trine-2-face-off

    Mass Effect 3: matching visuals and fps
    http://www.eurogamer.net/articles/digitalfoundry-mass-effect-3-wii-u-face-off

    That's it for the Wii U: 12 cross-platform titles, and only 3 are as good as or better than the old-gen versions; the rest are blasted for bad fps/visuals. Couple that with the fact that many developers are not even considering porting their games to the Wii U, while the existing ones are dropping their support (no Wii U NFS this year), and you can see just how desperate the situation for the Wii U is.
     
  19. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    Your comments are largely either grossly exaggerated or totally wrong. Like where you say Splinter Cell has "bad image quality" on Wii U when the conclusion says it looks the best of the three. Most of the games that you say are "blasted" only have fairly minor performance differences, often constrained to some particular areas or effects.

    You're really missing the point here. I said that Wii U was at a similar capability level to XBox 360 and PS3 to illustrate how the ostensibly much more powerful CPU setup in XBox One and PS4 shows that they're not "just parity" with last gen like you were saying. Your retort that Wii U actually looks and runs pathetically compared to the two is pretty far from the truth. Your segue into Wii U's current third party situation is totally irrelevant.
     
  20. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,976
    Likes Received:
    5,213
    Splinter Cell lacked any form of HD texture resolution; that's why it has low image quality. The conclusion states it looked clearer because of its less than 10% higher output resolution, which is irrelevant when the rest of the game is muddy and washed out, something also stated in the conclusion, by the way. Not to mention the horrible pop-in.

    Performance usually drops to the low 20s with stuttering and pauses in most of these games on a regular basis. That's not minor; relative to a 30fps target, that's a reduction of no less than 25 to 33% in performance, perhaps more.

    The problem with the Wii U is its very weak CPU. The console is able to approximately match the render resolution of the X360/PS3, due to its moderately better GPU, but it completely falls flat when the action flares up and the screen becomes busy. It also pares back on image quality to save CPU cycles whenever it can.

    And I find the third-party situation completely relevant: if the console were up to the task, nobody would have complained. Clearly it is not.
     