Digital Foundry Article Technical Discussion Archive [2015]

Discussion in 'Console Technology' started by DSoup, Jan 2, 2015.

Thread Status:
Not open for further replies.
  1. Cyan

    Cyan orange
    Legend Veteran

    Joined:
    Apr 24, 2007
    Messages:
    9,061
    Likes Received:
    2,670
  2. Lucid_Dreamer

    Veteran

    Joined:
    Mar 28, 2008
    Messages:
    1,210
    Likes Received:
    3
    If asking focused questions is your definition of "attitude", I guess you would be correct. You seem to want to stick with Lair, after I mentioned "all the Madden games" of that time (first 3 years?). I said "etc", but the truth is that it was a LOT of the games within the first 3 years of the PS3's lifespan. Gloss over that, if you wish. It does not keep these things from being true.

    PS4 remasters from the PS3 should push the CPU. When a game took full advantage of the Cell processor (especially a 30fps game), it should be quite difficult to complete the same work in half the time (a 30fps game has a ~33.3ms per-frame budget, while 60fps leaves only ~16.7ms). I believe this is quite common knowledge. To pull it off is an amazing feat. That's why even Naughty Dog had some trouble with it.

    It seems you are leaning heavily on the hardware not being up to snuff. At less than two years into this gen, that seems like a very flawed argument. Feel free to address any number of last-gen games (at around the same point in that generation) that had performance issues (there were MANY on PS3). Tell me, what makes them so different?

    You say you were just pondering, but it seemed/seems like a condemnation of the hardware.
     
  3. hesido

    Regular

    Joined:
    Mar 28, 2004
    Messages:
    553
    Likes Received:
    85
    The jump between the PS2 CPU and the PS3 CPU was massive, but the same is not true for PS3 -> PS4. Certainly the GPU jump is big, and Sony may have bet on GPU compute, but it doesn't change the fact that they went with a comparatively weaker CPU.

    It's obvious the PS3 pushed the envelope with the best available hardware at the time (Blu-ray, memory, Cell, minus the RSX), but the PS4 didn't; hence they are not losing money on the hardware (or very little) compared to the PS3.
     
  4. Allandor

    Regular Newcomer

    Joined:
    Oct 6, 2013
    Messages:
    386
    Likes Received:
    192
    It is just how you see the CPU of the PS3. The general-compute part of Cell was... well, decent. Cell was mainly used to support the GPU. If you look at the work that should be done by the CPU, that part has been greatly improved with the Jaguar CPU. But it just depends on how you look at it. E.g. branching is something Cell's SPEs are not good at, so you had to write your code with that in mind (see the sketch at the end of this post). The SPEs are highly specialised cores. If you have optimal code for them, they are great. If not... well, a more general-purpose CPU like Jaguar is just better at that.

    The really good thing for developers is that x86 is really well known. Cell is just too specialised a thing.
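    To make that concrete: below is a minimal sketch, in plain scalar C++, of the branch-free style the SPEs pushed you towards. It's an illustration rather than anything from the thread, and real SPU code would apply the same select idiom across whole SIMD vectors with intrinsics such as spu_sel.

```cpp
// Sketch of branch-avoidance in the style SPE code favoured. An SPE has
// no branch predictor, so a mispredicted branch stalls the pipeline; the
// idiom is to compute both results and pick one with a mask instead of
// branching. (Plain C++ here; real SPU code would use SIMD intrinsics.)
#include <cstdint>
#include <cstdio>
#include <initializer_list>

// Branchy version: fine on an out-of-order x86 core like Jaguar.
int32_t clamp_branchy(int32_t x, int32_t lo, int32_t hi) {
    if (x < lo) return lo;
    if (x > hi) return hi;
    return x;
}

// Branchless version: build all-ones/all-zeros masks from comparisons,
// then select, so the instruction stream is identical for every input.
int32_t clamp_branchless(int32_t x, int32_t lo, int32_t hi) {
    int32_t lt = -(int32_t)(x < lo);      // 0xFFFFFFFF if x < lo, else 0
    int32_t gt = -(int32_t)(x > hi);      // 0xFFFFFFFF if x > hi, else 0
    int32_t r  = (lt & lo) | (~lt & x);   // select lo or x
    return (gt & hi) | (~gt & r);         // select hi or r
}

int main() {
    for (int32_t v : {-5, 3, 99})
        std::printf("%d -> %d / %d\n", v,
                    clamp_branchy(v, 0, 10), clamp_branchless(v, 0, 10));
}
```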
     
  5. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,063
    Likes Received:
    1,660
    Location:
    Maastricht, The Netherlands
    I also think you need to see the system as a whole. The new CPU part of the PS4 is basically intended to just replace the PPU, and the GPU part to replace the SPEs plus the old GPU. That said, it is not yet clear that CUs are as easy or as efficient to program and integrate into your workflow as the SPEs were. I assume they will be, by virtue of CUs being present in a lot of desktop GPUs and therefore better known and more widely used than SPEs. But SPEs may be better at some things than CUs are, as they are perhaps closer to CPUs in terms of their independence, ability to run 'normal' C code, etc., and CUs could still be more complicated to integrate with CPU work?

    Certainly though I'm getting the impression that PS4 dedicated titles already do more with CUs than most PC games do, but I'd love to see some papers on them being used really well. We also have this topic, by the way, where I dropped in an article about Async Compute in Tomorrow's Children that I recently came across:

    https://forum.beyond3d.com/threads/asynchronous-compute-what-are-the-benefits.54891/page-15
     
    Shortbread likes this.
  6. Nesh

    Nesh Double Agent
    Legend

    Joined:
    Oct 2, 2005
    Messages:
    12,457
    Likes Received:
    2,760
    It's not your asking questions that is my definition of "attitude". You know very well what I mean.
    You tend to make too many assumptions about what you think I may be trying to convey, and you continuously ignore things I have said.
    Let me quote myself AGAIN (I am specifically pointing to the bolded areas):
    These two quotes should have made your above reply pointless, because it only matters under the assumption that they were never said.

    Regarding the PS3 versus what we have this gen, there is a HUGE FREAKIN' DIFFERENCE. That console was a pain in the ass to program for and squeeze performance out of. The GPU was weak; the Cell, although strong, was like solving a Rubik's cube (as Cerny put it); it had two memory pools and, in the first years, a much larger memory footprint reserved for the OS compared to what became available later. Developers were reporting struggles. We don't get many of those complaints this gen. The PS3 had a very disastrous start, and it showed in price, performance and sales. Once more, your PS3 example, much like Lair, is not a good one, because it's not a like-for-like comparison with either the XB1 or the PS4. We knew back then what was wrong with the PS3.

    If you take the 360, on the other hand, those "etc" examples you didn't name shrink even further. We thought of the ugly examples as efforts that didn't get the appropriate care, because at the same time there were plenty of visually more stunning games outperforming them. I remember people back then talking about some games (I think it was Madden?), and many agreed that, from a business perspective, the developer didn't need to struggle much with optimisation. The game would sell anyway.

    This gen the PS4 is the complete opposite of the PS3 in almost all areas. The whole focus was on making it as developer-friendly as possible.
     
    #786 Nesh, May 19, 2015
    Last edited: May 19, 2015
  7. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,829
    Likes Received:
    1,142
    Location:
    Guess...
    I imagine a lot of the stuff that was done on the SPEs doesn't even need CUs in a modern GPU, i.e. it is just functionality that is standard in a newer GPU and doesn't require any special compute intervention. One obvious example would be vertex shading, which I understand the SPEs helped out with a lot. With today's massive unified shader arrays, that's something that wouldn't require any special conversion from SPE code to GPGPU code (even though the work is still done in the CUs, of course).

    Considering that no PC API aside from Mantle even supports async compute at present, I'd say that's a no-brainer. Also, aside from Nvidia's latest Maxwell GPUs, neither Intel nor Nvidia actually supports async compute in their GPUs anyway, so developers have little motivation to add the capability even if the APIs did support it. Of course, it can also be argued that Nvidia and Intel would benefit less from async compute in the first place (or, put more negatively towards AMD: AMD requires async compute to maximise the use of its shader array, unlike Nvidia and Intel). That does mean there should be some spare performance left on the table with GCN-based GPUs on the PC, so it will be interesting to see how much of a boost GCN parts get relative to Kepler and Maxwell 1 in games that make heavy use of async compute.
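    To give that a concrete shape: here is a minimal sketch of the queue model async compute implies, written against D3D12 (still pre-release as of this thread; Mantle's model is similar in spirit). It is illustrative only, Windows-specific, and omits all actual rendering work.

```cpp
// Sketch: "async compute" at the API level is a second, compute-only
// queue whose work may overlap the graphics queue and soak up idle
// shader ALUs -- the GCN-friendly pattern discussed above.
// Windows-only; link against d3d12.lib.
#include <d3d12.h>
#include <wrl/client.h>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    // The usual graphics queue: draws, plus any inline compute.
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;
    ComPtr<ID3D12CommandQueue> gfxQueue;
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&gfxQueue));

    // A separate compute queue: work submitted here may run concurrently
    // with the graphics queue on hardware built for it (GCN's ACEs);
    // other hardware or drivers may simply serialise it.
    D3D12_COMMAND_QUEUE_DESC computeDesc = {};
    computeDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;
    ComPtr<ID3D12CommandQueue> computeQueue;
    device->CreateCommandQueue(&computeDesc, IID_PPV_ARGS(&computeQueue));

    // Cross-queue ordering is expressed with fences.
    ComPtr<ID3D12Fence> fence;
    device->CreateFence(0, D3D12_FENCE_FLAG_NONE, IID_PPV_ARGS(&fence));
    computeQueue->Signal(fence.Get(), 1);   // compute marks completion...
    gfxQueue->Wait(fence.Get(), 1);         // ...graphics waits on it.
    return 0;
}
```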
     
  8. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,738
    Likes Received:
    8,126
    Location:
    London, UK
    Digital Foundry: AMD reveals HBM: the future of graphics RAM technology

    Nvidia's Prototype [image]
    However, it's worth stressing that HBM is not a wholly proprietary technology. The concept of stacked memory modules is hardly unique thinking: AMD says it has been working on the tech for seven years, but the principles behind it are hardly a secret and Nvidia has already shown off a test vehicle showcasing its own version (pictured above). It's a part of its next-gen Pascal architecture that's due to arrive in 2016. But it's AMD that will be first to market with stacked memory and we can't wait to put it through its paces.
     
    Shortbread likes this.
  9. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,738
    Likes Received:
    8,126
    Location:
    London, UK
    Shortbread likes this.
  10. Shortbread

    Shortbread Island Hopper
    Veteran

    Joined:
    Jul 1, 2013
    Messages:
    4,100
    Likes Received:
    2,326
    I'm guessing the uncapped XB1 edition has more to do with allowing the game more legroom to breathe (perform). If capped, the XB1 edition might see terrible drops approaching the low 20s. But still, we're going to hear some PS4 users complain about not having the option of doing so...
     
  11. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    21,710
    Likes Received:
    7,349
    Location:
    ಠ_ಠ
    Why would capping the maximum affect the lower bound if they're already triple buffered/v-sync?
     
  12. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    3,493
    Likes Received:
    2,190
    Location:
    France
    Interesting question. I once read in a DF article (or post) that the uncapped Infamous games strangely had better lower bounds than the capped games: sometimes the capped games showed sub-30fps drops where the uncapped game stayed (slightly) above 30fps.

    It's easily seen in this video showing SS and First Light in the same area. The beginning is the uncapped game, then the capped game. You can see that the uncapped First Light never goes under 33fps (it dips to 33fps once; the rest is mostly ~40fps), but strangely, when capped in the same area, the game regularly dips to 28 or 29fps.

     
  13. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,738
    Likes Received:
    8,126
    Location:
    London, UK
    I thought this was generally chalked up to improvements in the engine between SS and FL?

     
  14. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    21,710
    Likes Received:
    7,349
    Location:
    ಠ_ಠ
    Frame pacing issue?

    It's comparing the same spots in both builds, just that the first half of the video is without the fps limiter, the latter half with it on. :p
     
  15. DSoup

    DSoup meh
    Legend Veteran Subscriber

    Joined:
    Nov 23, 2007
    Messages:
    12,738
    Likes Received:
    8,126
    Location:
    London, UK
    I don't have time to read things, I'm busy searching the internet for Spock pictures! :oops:
     
    TheAlSpark likes this.
  16. HTupolev

    Regular

    Joined:
    Dec 8, 2012
    Messages:
    936
    Likes Received:
    564
    There's nothing strange about it. Look at the frame time graph in the video; even when it's "consistently above 30fps" (as a fairly lengthy time-average of performance), you regularly have individual frames displayed for more than 33ms. Those seem to occur with similar frequency in the capped and uncapped videos.

    Sometimes the game just takes a while to process a frame.
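    A toy illustration of that point (the frame times are invented): a run can average well above 30fps while individual frames still blow the 33.3ms budget, and under a v-synced 30fps cap each such frame is held to the next 16.7ms boundary, i.e. displayed for 50ms, which reads as a 20fps dip.

```cpp
// Average fps vs. per-frame spikes, and what a v-synced 30fps cap does
// to the spikes. Frame times below are made up for illustration.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> frameMs = {25, 26, 24, 36, 25, 27, 34, 25};

    double total = 0;
    for (double t : frameMs) total += t;
    std::printf("uncapped average: %.1f fps\n",          // ~36.0 fps
                1000.0 * frameMs.size() / total);

    // Capped at 30fps on a 60Hz display: presentation snaps to
    // multiples of 16.7ms, with a minimum of two intervals (33.3ms).
    const double vsync = 1000.0 / 60.0;
    double cappedTotal = 0;
    for (double t : frameMs) {
        double intervals = std::max(2.0, std::ceil(t / vsync));
        cappedTotal += intervals * vsync;
        if (intervals > 2)  // a frame over 33.3ms becomes a 20fps frame
            std::printf("%.0fms frame displayed for %.1fms (%.0f fps)\n",
                        t, intervals * vsync, 1000.0 / (intervals * vsync));
    }
    std::printf("capped average: %.1f fps\n",            // ~26.7 fps
                1000.0 * frameMs.size() / cappedTotal);
}
```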
     
    Globalisateur likes this.
  17. Globalisateur

    Globalisateur Globby
    Veteran Regular Subscriber

    Joined:
    Nov 6, 2013
    Messages:
    3,493
    Likes Received:
    2,190
    Location:
    France
    You are definitely right. Now, if we look at the DF framerate video of the patched Witcher 3 on XB1 and focus only on the frame-time graph, during gameplay the game regularly has its frame time above 33ms, meaning it would similarly dip under 30fps if it were capped at 30fps... :roll:

    Fortunately for the XB1 game, the engine is uncapped. The PS4 game, on the other hand... :runaway:
     
    #797 Globalisateur, May 19, 2015
    Last edited: May 19, 2015
  18. TheAlSpark

    TheAlSpark Moderator
    Moderator Legend

    Joined:
    Feb 29, 2004
    Messages:
    21,710
    Likes Received:
    7,349
    Location:
    ಠ_ಠ
    http://www.neogaf.com/forum/showpost.php?p=164444919&postcount=2741

    hmm...

     
  19. function

    function None functional
    Legend Veteran

    Joined:
    Mar 27, 2003
    Messages:
    5,340
    Likes Received:
    2,797
    Location:
    Wrong thread
    They could have capped the game at 30 fps and kept shadow quality where it was, and also added a real time version of Space Invaders to play during CGI loading screens.

    Everyone wins. Except alien invaders.
     
  20. HTupolev

    Regular

    Joined:
    Dec 8, 2012
    Messages:
    936
    Likes Received:
    564
    Ehhh

    Judging by the comparison video, a decently-implemented cap would make things feel much more stable than they currently are. Some scenes would be rock solid and others would have a blip every couple of seconds, versus the total mess we're seeing right now.
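    For what "decently-implemented" tends to mean in practice, here's a hedged sketch of a 30fps limiter loop (simulate_and_render is a made-up stand-in for real engine work): schedule each deadline from the previous one so a single slow frame doesn't shift the phase of every later frame, and resynchronise rather than burst when badly behind.

```cpp
// Minimal 30fps frame limiter sketch. The key detail is pacing against
// fixed deadlines, not against "now", so good frames land rock solid
// and only genuinely slow frames produce a blip.
#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

// Made-up stand-in for a real engine's per-frame work (~20ms).
void simulate_and_render() {
    std::this_thread::sleep_for(std::chrono::milliseconds(20));
}

int main() {
    const auto frame = std::chrono::microseconds(33333);  // 30fps budget
    auto next = Clock::now() + frame;
    for (int i = 0; i < 10; ++i) {
        auto start = Clock::now();
        simulate_and_render();
        // Finished early: sleep out the rest of the 33.3ms slot so every
        // frame is displayed for the same duration.
        std::this_thread::sleep_until(next);
        std::printf("frame %d: %lld ms\n", i,
                    static_cast<long long>(
                        std::chrono::duration_cast<std::chrono::milliseconds>(
                            Clock::now() - start).count()));
        // Schedule from the previous deadline, not from now().
        next += frame;
        // Badly behind? Resynchronise instead of sprinting to catch up
        // with a burst of short frames.
        if (Clock::now() > next) next = Clock::now() + frame;
    }
}
```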
     