Yoshida confirms SCE working on new hardware

Discussion in 'Console Industry' started by Versatil, Jul 6, 2010.

Thread Status:
Not open for further replies.
  1. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    Lots of scientific and FP code. And yes, it can decode H.264 streams above 10 Mbit/s; PCs have been able to do that for a while. And if they are well designed they do it with less power.

    You mean like AI? Yeah, game consoles tend to have pretty crappy AI.

    It requires continuous programming effort even with the frameworks, well beyond what is required by a better-designed architecture.
     
  2. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    Um, Portal 2 will likely look better and play better on both 360 and PC. The only reason he said that is that Sony is supporting Steam on PS3, which is good and cool and needed because Sony is still so far behind in online, but it really doesn't have much to do with the game itself at release.

    It doesn't matter how familiar they are with it; it's not a familiarity issue. It is more complex and time consuming to program and get efficiency out of.
     
  3. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    Considering all engineering work around Cell has been dead for a while now...
     
  4. aaronspink

    Veteran

    Joined:
    Jun 20, 2003
    Messages:
    2,641
    Likes Received:
    64
    If the game is important, they can recompile the binary.


    360 DID NOT have BC. It had recompiled binaries.
    PS3 has sold more consoles without BC than with.
     
  5. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    Surely for these sorts of workloads, the solution is coupling SPUs with a decent conventional core or three, and you share the workloads between cores as is appropriate, no different to sharing work between CPU and GPU. The problem with Cell in this regard was the feeble PPE, but the architecture could accommodate a couple of worthwhile cores going forwards.

    No you can't (if by infinite FP machine you mean infinite FLOPS). By definition you'd need either an infinite number of transistors or an infinite clock, and an infinite power draw. Maybe you could design a useless processor that can churn out a million petaflops of pointless calculations to prove your point, but that's so far from infinite (infinitely far) that by comparison it's no processing performance at all.
     
  6. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    I'm sorry for your embarrassment. Please don't blush.

    ...if you hire the talent and spend the time and money making an exclusive game it'll look good or perhaps even great, and then you can easily market it to your fanbase as being technically impossible on the other guy's machine (because it lacks Cell or BluRay or eDram or Blast Processing or whatever else it is that the core gameplay and art style could easily survive without).

    I'm comparing the way "hardware advantages" can be used to claim game X is impossible on your competitors hardware. Take a wild guess as to why I threw Blast Processing in there.

    Maybe my comprehension isn't so great either, because I can't work out what you're getting at. Maybe all those NEEDLESSLY capitalised WORDS are CONFUSING ME.

    I'm trying to make the point that a faster CPU does you little good if it's underutilised because it's only used to run stuff targeted at less powerful hardware. I'm not sure how what you're saying counters that.

    You seem to find a lot of things embarrassing. No need to be so bashful son, it's only a forum.

    No, if I'm talking about how "hardware advantages" have been used to big-up exclusive titles to the general public (and vice versa) it makes no sense to mention Blast Processing alongside DirectX and XNA.

    Xbox 1 was expensive and late and multiplatform titles consistently failed to make the most of the system. Only a handful of exclusives did anything really special, but most of these were sales disappointments.

    Hooray! "Lazy devs!"

    Well, until 2004 there was only one Halo game so it'd be tough to call any standalone game a franchise.

    Halo grew because it was an outstanding game. Games with far more hype have failed to do what it did.
     
  7. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
    So adding a new target platform to your game by "re-compiling" it does not have any implications at all?

    What kind of legal issues did Sony go through to have the complete library of PS1 and PS2 running on the PS3?

    Maybe there are enough suckers to warrant the presence of BC?

    Still, the major complaint about Cell at introduction was that tasks had to be divided into manageable jobs for the SPUs. Now you have to use the same technique to get good performance out of many-core homogeneous CPUs. Seems like Cell was just ahead of its time in this regard, and now the development community has caught up.

    If your FP machine involves transistors I seriously doubt it has infinite FLOPS, but you are welcome to prove me wrong. Edit: Shifty beat me to this; I also think Mr Spink is making promises he can't keep. :)
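    To illustrate the job model described in that post, here is a minimal sketch (hypothetical names, plain C++ threads rather than any console SDK): a large array is split into fixed-size jobs that workers pull from a shared counter, which is the same decomposition SPU code required and is now the standard way to feed a many-core homogeneous CPU.

```cpp
// Sketch of SPU-style job decomposition on a many-core CPU.
// kJobSize and scale_in_jobs are illustrative names, not from any real engine.
#include <algorithm>
#include <atomic>
#include <cstddef>
#include <thread>
#include <vector>

constexpr std::size_t kJobSize = 4096;  // elements per "manageable" job

void scale_in_jobs(std::vector<float>& data, float factor, unsigned workers) {
    const std::size_t jobs = (data.size() + kJobSize - 1) / kJobSize;
    std::atomic<std::size_t> next{0};   // shared job counter

    auto worker = [&] {
        std::size_t j;
        while ((j = next.fetch_add(1)) < jobs) {          // grab the next job
            const std::size_t begin = j * kJobSize;
            const std::size_t end = std::min(begin + kJobSize, data.size());
            for (std::size_t i = begin; i < end; ++i)     // job-sized slice of work
                data[i] *= factor;
        }
    };

    std::vector<std::thread> pool;
    for (unsigned w = 0; w < workers; ++w) pool.emplace_back(worker);
    for (auto& t : pool) t.join();
}
```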
     
    #187 Crossbar, Jul 12, 2010
    Last edited by a moderator: Jul 12, 2010
  8. V3

    V3
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    3,304
    Likes Received:
    5
    Well it was going to be eDRAM, and remember Sony was investing lots of money into eDRAM, as well as 65nm and 45nm. The eDRAM amounted to nothing; they did get to 45nm though (eventually).

    It wasn't just MS; I am sure the Blu-ray group was pressuring Sony to release the PS3. The PS3 was released into a format war. Console launches always have an air of uncertainty about them; that's why backward compatibility helps ease that uncertainty, even if most people know they'll hardly use it, but a stable platform where your investment is "safe" is what people want at launch. At the end of it, even though the PS3 won the format war, the brand name was left bruised by it.

    Still, my point was that Cell inside the PS3 was a poor choice given their circumstances.

    Sony was funny: many devs had been crying over the difficulty of programming the Emotion Engine, and yet Sony asked the same two Toshiba engineers responsible for the Emotion Engine, who were adamant about their use of local store, to design the SPU. I think if IBM alone had gotten the contract, it would have been all PowerPC cores in there. As efficient as Cell is, going forward the architecture is a dead end.

    If Sony had managed to get 65nm ready and produced the Broadband Engine they envisioned, then the investment in Cell would have been worthwhile. As it is, it's just a waste of money. Worse, while they were busy with Cell and the pursuit of FLOPS, Nintendo stole their idea from under them and ran with it. Now PlayStation Move will forever be known as a Wiimote clone, despite SCE demoing the technology so long ago that people have forgotten about it.
     
  9. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,406
    Location:
    Wrong thread
    I dunno, this guy describes it as backwards compatibility:
    http://www.qbrundage.com/michaelb/resume/michael_brundage_resume.pdf
    http://www.qbrundage.com/michaelb/pubs/essays/xbox360.html

    I don't think he's just using the term loosely, or as another term for recompiling the original code. If you look at the things he's talking about doing, it just doesn't fit with simply running native 360 code.


     
    #189 function, Jul 12, 2010
    Last edited by a moderator: Jul 12, 2010
  10. obonicus

    Veteran

    Joined:
    May 1, 2008
    Messages:
    4,939
    Likes Received:
    0
    The 360 DID HAVE BC. The way it implemented it is irrelevant. And the PS3 only abandoned BC a few years into its lifecycle. In both cases they only forgot about BC's importance when the platform's library was robust enough. But that doesn't address the fallacy of 'it hasn't hurt them any'. You can't make that statement, not when in this generation and the last the most popular, market-leading consoles have had full BC.

    Also, are you kidding me? If Sony goes for a more traditional architecture next time around you really think that they'd get all these games tuned for Cell to work by 'recompiling the binary'?
     
  11. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,762
    Likes Received:
    2,639
    Location:
    Maastricht, The Netherlands
    The backward compatibility for the 360 is a little complicated, but I do seem to recall that it does mostly involve emulation. Didn't they basically release emulation code for each game individually, sometimes just changing a few parameters, but at other times recompiling whole parts of the game? (as I think they did with Halo 2?)

    If true, you both may be right. Would be really nice if you don't both argue as if everything you say is the absolute truth and you can't possibly be wrong. ;) I know it is hard work, from personal experience, but at least look like you're trying. ;)
     
  12. Crossbar

    Veteran

    Joined:
    Feb 8, 2006
    Messages:
    1,821
    Likes Received:
    12
    The 360 BC has already been covered in this thread, but here it is again. 1up.

    Also all PS3s have PS1 BC.

    edit: I also believe there is game-specific emulation code, as Arwin suggests. I don't think any games need completely new binaries; as stated above, the CPU part is not a major bottleneck.
     
    #192 Crossbar, Jul 12, 2010
    Last edited by a moderator: Jul 12, 2010
  13. NRP

    NRP
    Veteran

    Joined:
    Aug 26, 2004
    Messages:
    2,712
    Likes Received:
    293
    This is what I wonder about. If most PS3 games use a task-based programming model that "sees only a small slice of memory at a time", and that "small slice of memory" can fit in the caches of a more traditional CPU, then it seems like PS3 code could run just fine on a conventional multicore CPU. Then again, I am probably grossly oversimplifying things.
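    The portability NRP is describing can be sketched as follows (illustrative names only): a kernel that only ever sees a bounded slice copied into a small local buffer. On Cell the copies were DMA into local store; on a conventional CPU they are plain memcpy and the slice simply lives in L1/L2 cache, which is why code written in this style maps over cleanly.

```cpp
// SPU-style "small slice of memory" processing on a conventional CPU.
// kSlice, kernel, and process are hypothetical names for illustration.
#include <algorithm>
#include <cstddef>
#include <cstring>

constexpr std::size_t kSlice = 1024;  // bytes, well under any L1 cache size

// The kernel only ever touches its private slice, like an SPU job
// working out of local store.
static void kernel(float* slice, std::size_t n) {
    for (std::size_t i = 0; i < n; ++i)
        slice[i] = slice[i] * 0.5f + 1.0f;
}

void process(float* data, std::size_t count) {
    float local[kSlice / sizeof(float)];  // stand-in for SPU local store
    for (std::size_t done = 0; done < count; ) {
        std::size_t n = std::min(count - done, kSlice / sizeof(float));
        std::memcpy(local, data + done, n * sizeof(float));  // "DMA in"
        kernel(local, n);
        std::memcpy(data + done, local, n * sizeof(float));  // "DMA out"
        done += n;
    }
}
```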
     
  14. obonicus

    Veteran

    Joined:
    May 1, 2008
    Messages:
    4,939
    Likes Received:
    0
    My concern is about how much of the stuff is coded directly to the metal, as well as stuff designed around the PS3's weirdness. If coding to a traditional CPU, would you really want to do all those 'Cell as GPU-helper' shenanigans? The point is it's hardly just a recompile, at least when you're dealing with shifting oddball architectures. (Edit: and this is supposing that this traditional CPU would be enough of a leap forward to do the job of the SPUs without breaking a sweat.)
     
  15. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,762
    Likes Received:
    2,639
    Location:
    Maastricht, The Netherlands
    Current traditional CPUs can't keep up with some tasks the Cell's SPEs perform today, so don't expect that to be easily emulated in the future. If Sony is going to be serious about PS3 game emulation on the PS4, then they're going to either at the very least add a Cell processor as well as whatever other chip they're considering, or do it the way Microsoft did Xbox1 emulation on the 360 (of which, if you think about it, the HD re-release bundles of God of War and the recently announced Sly Cooper are good examples).

    On the other end of the scale are the PhyreEngine games and the Minis platform, which are abstracted platforms/libraries that can be coded against on a higher level and can probably be rewritten to use a current higher end GPU today (if I remember correctly, there was an intention to be able to code for PhyreEngine on the PC and develop games for both PC and PS3 using it).

    Microsoft has a similar setup for this with XNA - everything that runs on that will obviously run on the next Xbox as well.
     
  16. corduroygt

    Banned

    Joined:
    Nov 26, 2008
    Messages:
    1,390
    Likes Received:
    0
    BS... all the games would be far inferior since there would be no way to make up for RSX's weaknesses. The PS3 is bottlenecked by its GPU, not the Cell.

    Gaming code besides the graphics components is really not that complicated. The GOW3 main executable was around 5 MB, for example. A multiplayer game might be a little more, but no one is complaining about programming effort for Cell any more. Even PS Move shovelware developers are able to program the Cell, so it seems it's always you who's complaining about the programming effort while studios churn out PS3 games left and right without much problem. The problems, when they happen, are because of RSX, not Cell.
     
  17. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    More games, probably. But better games require much more than a good CPU and GPU (e.g., Nintendo can make great games with an inferior CPU and GPU). It depends on a whole slew of other factors. I think someone surveyed the earlier PS3 games; collectively they reviewed well.

    If the tech platform is easier to develop for, developers will have more time to work on the gameplay though. So they do have to address programmability moving forward.
     
  18. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    Just because no-one's voicing their complaints doesn't mean they're happy with Cell, nor that Cell is an ideal architecture. Developers may just see it as a necessary evil and soldier on (or maybe they all love it ;)). Also, more studios creating simple content doesn't mean they've all mastered Cell either. Sony makes PhyreEngine available to all developers, and there are various middleware options. Managing your memory use is a pain that developers would rather not have to worry about; no-one willingly takes on more work just for the sake of it! And for those developers who enjoy poking around with ultra-efficient memory management, a cache system enables that without forcing all developers to work at that low level.
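    The "opt-in" nature of low-level memory control on a cached architecture can be sketched like this (a hypothetical example using the GCC/Clang `__builtin_prefetch` builtin): the plain loop works for everyone, and a developer who wants Cell-style control over data movement can add prefetch hints without changing the programming model.

```cpp
// On a cached CPU, explicit data-movement control is optional.
// The loop is correct without the hint; the prefetch is an opt-in
// optimization, unlike Cell's mandatory local-store DMA.
#include <cstddef>

float sum(const float* data, std::size_t n) {
    float total = 0.0f;
    for (std::size_t i = 0; i < n; ++i) {
#if defined(__GNUC__)
        if (i + 64 < n)
            __builtin_prefetch(data + i + 64);  // optional hint, safe to omit
#endif
        total += data[i];
    }
    return total;
}
```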
     
  19. tuna

    Veteran

    Joined:
    Mar 10, 2002
    Messages:
    3,550
    Likes Received:
    590
    How do you recompile a binary?
     
  20. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,762
    Likes Received:
    2,639
    Location:
    Maastricht, The Netherlands
    It will always remain a matter of context as well though. While it may certainly be true that some coders will prefer to work with something like the 360 CPU instead or even better, a PC CPU, the reality is that in the current landscape of the 100 man team making a multi-platform game, there will be 5 programmers writing the code leading on PC or 360, and typically just one or maybe two doing the PS3 version of that code. This is why developers who code with both architectures in mind from day one will achieve better results - not just because it forces them to think better about strategies that work on both/all three systems, but because on average there are more programmers working on the PS3 version.

    And yes, I am also strongly convinced that the state of play has changed, not just because programmers have more hours on the Cell under their belt, but because programming PCs and particularly modern GPUs has become more and more like programming the Cell. You can see some of that in the comments that D.I.C.E. make about programming the Cell in their slides. The concepts in Cell that were new and ahead of the curve are now becoming much more mainstream.

    The main lesson, above anything else for Sony, should be that it is important to cater for two types of developers: those that like to code on the low level, and those that like to code on the high level. Particularly at the start of a new generation of hardware, you should be ready to give support for high level coding as there will be only a very small group ready to do the low level stuff. Microsoft got that right, Sony didn't (not for main development tools, ease of porting to PC, and not for SDK type support for network functions and the like), and it was a very important distinction, much more so than the choice for Cell per se. I don't consider it very likely that they will make the same mistake again (though it's just one mistake of several that Sony made, so it will be hard work to get them all right for next-gen). However, they will have some big catching up to do, and work hard if they want to skip some of Microsoft's mistakes and get to par or better - it is particularly in this area that I'm hoping the 'bring the developers in on the PS4 development early' is going to help out.
     