Mark Cerny "The Road to PS4"

Discussion in 'Console Technology' started by patsu, Jun 27, 2013.

  1. smb

    smb
    Newcomer

    Joined:
    Jul 2, 2013
    Messages:
    7
    Likes Received:
    5
    It should be pointed out that Mark Cerny was invited to talk about his life and work, and to receive an award, like Peter Molyneux before him; hence his presentation was unavoidably self-centered ("...I began my career..." etc.).
     
  2. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    Actually, 3dcgi said the following regarding the ACEs:
    I interpret this as meaning it was in the works anyway. Sony may have had some influence, so some details may look different than they would have otherwise, but in principle the extended compute features would have been there even without the console deals.
    I have to correct myself here. When I wrote this, I thought I had seen this before, but Temash/Kabini appear to be the first APU/GPU to support that volatile flag and the corresponding instructions. So it is not supported by GCN1.0, only by the updated "GCN1.1" version used for Temash/Kabini and apparently the PS4 (not sure about the XB1).
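    (As a concrete illustration of what such a volatile flag is for: the sketch below is hypothetical CUDA, not GCN machine code or anything from the PS4. CUDA's volatile qualifier expresses the same idea the flag exists to support: another agent may touch the location, so every access must really reach a coherent level of the memory hierarchy instead of being satisfied from a register or a stale cache line.)

    Code:
    #include <cstdio>
    #include <cuda_runtime.h>

    // Producer/consumer handshake between two thread blocks. The volatile
    // qualifier forces each access back to a coherent level of the memory
    // hierarchy instead of letting it linger in a register or stale line.
    __global__ void handshake(volatile int *flag, volatile int *data, int *out) {
        if (blockIdx.x == 0) {            // producer block
            *data = 42;                   // write the payload
            __threadfence();              // order the payload before the flag
            *flag = 1;                    // then raise the flag
        } else {                          // consumer block
            while (*flag == 0) { }        // volatile: really re-reads memory
            *out = *data;                 // payload is visible by now
        }
    }

    int main() {
        int *flag, *data, *out;
        cudaMalloc((void **)&flag, sizeof(int));
        cudaMalloc((void **)&data, sizeof(int));
        cudaMalloc((void **)&out, sizeof(int));
        cudaMemset(flag, 0, sizeof(int));
        cudaMemset(data, 0, sizeof(int));

        handshake<<<2, 1>>>(flag, data, out);

        int result = 0;
        cudaMemcpy(&result, out, sizeof(int), cudaMemcpyDeviceToHost);
        printf("consumer saw %d\n", result);  // expected: 42
        return 0;
    }

    Without the volatile qualifier (or an equivalent fence), the consumer loop could legally spin forever on a value cached from its first read.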
     
  3. archie4oz

    archie4oz ea_spouse is H4WT!
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    1,608
    Likes Received:
    30
    Location:
    53:4F:4E:59
    No, you're off by a generation. It was during the PS2; PS3/Cell was just a repeat of the PS2. Nor would I necessarily call it an overreach. It was a solid effort that led to Sony being the first to fab and ship an MPU on 90nm, and it was certainly plausible to continue. Executive management didn't see that as the direction they wanted to take the company, partially because they didn't know what markets they wanted to leverage their capability and capacity in, and partially because they didn't know how to be a foundry, nor had any desire to become one.

    There was no lack of skill, just a different philosophy of design.

    For IBM, Cell was simply an experiment: a collaboration on homogeneous computing with a couple of partners footing a large chunk of the bill. Not terribly different from Intel with Tera-Scale and Larrabee, only without the partners to share the costs; then again, Intel has the benefit of large volume and relatively high margins. For Toshiba, it was a relatively low-cost way of acquiring new IP for its semiconductor business (although they ended up being very good at marketing the product). For Sony, it was a continuation of what started with the Emotion Engine, but with theoretically less presumed risk (ultimately that didn't work out).

    The SPE isn't what you get from limited experience. It's a conscious design philosophy. Kutaragi was a believer in maximizing computational throughput with the fewest gates possible: the hardware engineer should provide as much computation as possible within a transistor budget, and it's the responsibility of the software engineer to exploit and utilize it. It's his hardware-engineer mentality (I kind of have it too). The problem is that ultimately you're not producing something that's meant to solve big computational problems, but to provide a platform for content factories.

    I don't think it was so much of maintaining an illusion as it was just simply a lack of interest in that industry.
     
  4. Love_In_Rio

    Veteran

    Joined:
    Apr 21, 2004
    Messages:
    1,467
    Likes Received:
    116
    Is Onion+ also present in Kabini?
     
  5. archie4oz

    archie4oz ea_spouse is H4WT!
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    1,608
    Likes Received:
    30
    Location:
    53:4F:4E:59
    Honestly, there are VERY FEW companies these days that develop and ship their own operating system (Western or not), particularly on the desktop. There's nothing special about Western countries; they're just at a different phase (post-manufacturing), where a lot of the technical industry is focused on IP instead of manufacturing prowess.

    I can't speak for other east Asian countries, but in Japan there's never been a large domestic demand for desktop CPUs, GPUs, or operating systems to facilitate enough growth to develop into a pillar industry that it could export. Most processor and software development has focused on embedded electronics and software (e.g. consumer electronics, appliances, automotive, industrial electronics), particularly in the mobile space. Also, for a long time there's been a nascent supercomputer industry as well (w/ Hitachi, NEC, and Fujitsu)...

    FWIW, Sony used to have its own BSD variant (Sony NEWS) in the '80s and '90s to power its line of m68k and MIPS UNIX workstations; the latter were the original development workstations for the PlayStation. Sony's Computer Science Labs also developed the RTOS that powered the AIBO and QRIO robots (it was also used on a few set-top boxes).

    Also, not only is the PS4 BSD-based, the PS3 is/was as well...
     
  6. Gipsel

    Veteran

    Joined:
    Jan 4, 2010
    Messages:
    1,620
    Likes Received:
    264
    Location:
    Hamburg, Germany
    In principle, the possibility to bypass the GPU caches (which is said to be the difference from Onion without the plus) exists in all GCN GPUs. It is basically controlled by additional bits in the instruction field of memory instructions. There are two different bits to set the caching behaviour: GLC (globally coherent) and SLC (system level coherent). A set GLC bit means the access bypasses/misses the L1 and reads from or writes through to the L2 cache (a write-through is actually always performed, but the respective lines don't persist in the L1 for the next wavefront if the bit is set). That is important if you want to ensure coherency between different CUs in the GPU. But the L2 cache still works as a write-back cache, so the CPU wouldn't be able to see the changes without a flush (the CPU can't probe the GPU caches; it only works the other way around). The SLC bit resolves that and really bypasses all GPU caches, reading/writing directly from/to system memory on an APU (with a discrete GPU it would be the VRAM).
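    (A rough cross-vendor analogy for the two scopes described above, in runnable form: CUDA has no GLC/SLC bits, but its fence intrinsics express the same two levels of visibility. The sketch below is hypothetical CUDA, not GCN or PS4 code; __threadfence() makes prior writes visible device-wide, i.e. at the GPU's own coherence point, roughly what GLC buys, while __threadfence_system() makes them visible to the CPU as well, roughly what SLC buys. Mapped pinned host memory is used here just to make the CPU a plausible observer.)

    Code:
    #include <cstdio>
    #include <cuda_runtime.h>

    // The GPU publishes a payload for the CPU. __threadfence_system() ensures
    // the payload is visible past all GPU caches before the flag is raised.
    __global__ void publish(volatile int *payload, volatile int *flag) {
        *payload = 123;            // write the data
        __threadfence_system();    // make it visible to the host
        *flag = 1;                 // then raise the flag
    }

    int main() {
        volatile int *payload, *flag;
        // Pinned, mapped (zero-copy) host memory: the GPU accesses it over
        // the bus, so no GPU-side cached copy is left for the CPU to miss.
        cudaHostAlloc((void **)&payload, sizeof(int), cudaHostAllocMapped);
        cudaHostAlloc((void **)&flag, sizeof(int), cudaHostAllocMapped);
        *payload = 0;
        *flag = 0;

        publish<<<1, 1>>>(payload, flag);  // relies on unified virtual addressing

        while (*flag == 0) { }             // CPU spins until the GPU store lands
        printf("payload = %d\n", *payload);
        cudaDeviceSynchronize();
        return 0;
    }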
     
    #106 Gipsel, Jul 2, 2013
    Last edited by a moderator: Jul 2, 2013
  7. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,166
    Likes Received:
    3,059
    Location:
    Well within 3d
    They invested heavily in a fab for Cell and all the Cell derivatives that never happened. They recouped some of the cost by selling it to Toshiba, and then later bought it back for camera sensors.
    The PS2 made money. Cell and its massive physical investment put Sony in a multibillion-dollar hole.

    As opposed to the other Sony in-house high-performance compute designs?

    It was a way to get someone to pay for the circuit techniques that went into POWER6, a design that was proof that IBM wasn't immune to architectural missteps.
    If Cell were homogeneous, Toshiba would have walked.

    Those were not given the same kind of major-project status as Cell. Intel didn't build a billion-dollar fab for either.

    There were multiple instances in the years leading up to that point that showed how outdated that view had become. Perhaps he felt this could be avoided, or perhaps he didn't weight them the same, because Sony had managed to skate by on what had become unacceptable while the PS2 dominated the market.

    The hundreds of millions (possibly a billion?) of dollars in cash poured into the 65nm Nagasaki fab is not what I would characterize as a lack of interest in using Cell to drive Sony to a leadership position in consumer electronics and beyond. The design was ludicrously over-engineered if the company was content to plug away at low-functionality devices.
     
    #107 3dilettante, Jul 2, 2013
    Last edited by a moderator: Jul 2, 2013
  8. tuna

    Veteran

    Joined:
    Mar 10, 2002
    Messages:
    3,203
    Likes Received:
    392
    That is really interesting, but unfortunately we will probably never know.

    Also, the PS4 OS is probably a direct sequel to the one in the Vita.
     
  9. tuna

    Veteran

    Joined:
    Mar 10, 2002
    Messages:
    3,203
    Likes Received:
    392
    It would be interesting to find out how much Sony/SCE spent on fabs. That has to be a lot of money down the drain.
     
  10. kyetech

    Regular

    Joined:
    Sep 10, 2004
    Messages:
    532
    Likes Received:
    0
    Welcome to the Forum :)
     
  11. archie4oz

    archie4oz ea_spouse is H4WT!
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    1,608
    Likes Received:
    30
    Location:
    53:4F:4E:59
    IIRC Nagasaki Fab 1 and Fab 2 were built from the ground up for the PS2 (1999 and 2001 respectively), and cumulatively I recall about $2-2.8 billion being spent. Of course that was just for GS (and EE+GS) fabrication (not including the initial batch of GS chips from Kokubu). The standalone EE was at Oita TS (as was the RSX).

    Cumulatively, I have no clue how much was spent on fabs. If you want to go back to the start, Kokubu was the first in 1973, followed by Oita in 1984, then Nagasaki in 1987 (acquired from Fairchild/Fujitsu before Sony turned it into the beast it is today), and finally Kumamoto in 2004 (I think). Start your calculator... :)
     
  12. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,614
    Likes Received:
    60
    Do you by any chance know if he's going to talk about the same thing in the Develop keynote? (i.e., time to triangle)
     