Questions about PS2

Discussion in 'Console Technology' started by Liandry, Apr 7, 2016.

  1. bunge

    Regular

    Joined:
    Nov 23, 2014
    Messages:
    725
    Likes Received:
    513
    Why didn't even Sony itself use these techniques? I don't really understand; it sounds like a fantastic opportunity. They could have extended the life of a machine which was essentially printing money by then.
     
    Liandry likes this.
  2. Squeak

    Veteran

    Joined:
    Jul 13, 2002
    Messages:
    1,262
    Likes Received:
    32
    Location:
    Denmark
    For the same reason no one else was using them. Sony's few in-house devs were, and are, part of the industry, with all the things that are good and bad about it.
    Take a look at this presentation from early 2003: http://lukasz.dk/mirror/research-scea/research/pdfs/GDC2003_Intro_Performance_Analyzer_18Mar03.pdf Sony's hardware and tool guys were really very slow and mealy-mouthed about guiding people towards the right paths.
    Also look at how bad the utilization of the EE to GS bus is, even on the "good" example.

    Using the IPU to decompress textures would of course not have been as straightforward as just storing textures in DXTC format. But if streaming textures and geometry off the (compared to RAM) incredibly slow disc worked, as it did in many games, then using the IPU to stream-decompress textures from memory could certainly have been done.
    It would have made a huge difference with natural-material (complex) or photographic textures, stretching the 32 MB far further than ordinary compression would have allowed.
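    A toy sketch of the idea in Python, with zlib standing in for the IPU's bitstream decoder (the real IPU decoded MPEG-style macroblocks, so the codec, names and sizes here are purely illustrative):

```python
import zlib

# Toy model of streaming texture decompression. zlib stands in for the
# IPU's decoder, so compression ratios here are only illustrative.

def make_texture(width, height):
    """A simple, highly regular 8-bit test texture."""
    return bytes((x + y) & 0xFF for y in range(height) for x in range(width))

def stream_decompress(bitstream, chunk=4096):
    """Decompress incrementally, the way a DMA-fed decoder would,
    yielding texture data piece by piece for upload to VRAM."""
    d = zlib.decompressobj()
    for i in range(0, len(bitstream), chunk):
        out = d.decompress(bitstream[i:i + chunk])
        if out:
            yield out
    tail = d.flush()
    if tail:
        yield tail

tex = make_texture(256, 256)              # 64 KB uncompressed
packed = zlib.compress(tex, 9)            # what would sit in main RAM
restored = b"".join(stream_decompress(packed))

assert restored == tex
print(f"in RAM: {len(packed)} bytes instead of {len(tex)} bytes "
      f"({len(packed) / len(tex):.1%})")
```

    The point of the incremental decoder is that the full-size texture only ever exists transiently, on its way to VRAM; only the compressed stream occupies main memory.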
     
    #382 Squeak, Dec 30, 2016
    Last edited: Dec 30, 2016
    Liandry and bunge like this.
  3. Liandry

    Regular

    Joined:
    Feb 26, 2011
    Messages:
    323
    Likes Received:
    44
    When multipass is used, does all the geometry have to be sent to the GS for each pass, or only part of it? As an example: if a room is rendered in a single pass, and in that room there is an object (a cube) that also has environment mapping, will all polygons be sent twice, or only the polygons needed for the cube?
     
  4. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    Only polygons drawn. Each pass is a separate render - triangles, lights, pixels.
     
    Liandry likes this.
  5. Liandry

    Regular

    Joined:
    Feb 26, 2011
    Messages:
    323
    Likes Received:
    44
    So it means that only the polygons for the cube will be sent twice? :-D

    Ok, next question. After the first pass is made and the back buffer written to eDRAM, does the GS read the whole back buffer from eDRAM for the second pass, or only the part of the back buffer covered by the cube that needs the second pass? (Yes, maybe my questions are hard to understand.) :-D
     
  6. Squeak

    Veteran

    Joined:
    Jul 13, 2002
    Messages:
    1,262
    Likes Received:
    32
    Location:
    Denmark
    Yes, only the polygons drawn will consume fillrate, and untextured polygons (useful for lighting, shadows and other special effects) will in theory render twice as fast. And only the part of the back buffer touched is the one rendered to. Or rather, the 32x32 pages rendered to, so it does load a bit more pixels into the render cache than are necessarily drawn to.

    PS2 really is very, very fast at multipass. All the talk you'll find about additional passes halving fillrate is kind of true, but really just FUD.
    The only thing that could kill it would be too-small polygons, because the rendering quad (the stamp of pixels that walks across the screen during rendering) would be partly wasted on them.
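    As a back-of-envelope check on that claim, here is a small calculation using the commonly quoted GS peak fillrate figures (1.2 Gpixels/s textured, 2.4 Gpixels/s untextured) and a typical 640x448 framebuffer; peak numbers, so real-world counts would be lower:

```python
# Back-of-envelope fillrate budget for the GS, using the commonly
# quoted peak figures. Peak numbers only; real workloads achieve less.

TEXTURED   = 1_200_000_000    # pixels/s, textured peak
UNTEXTURED = 2_400_000_000    # pixels/s, untextured peak

FPS    = 60
WIDTH  = 640
HEIGHT = 448                  # a typical PS2 framebuffer

pixels_per_frame = WIDTH * HEIGHT

textured_passes   = TEXTURED   // (pixels_per_frame * FPS)
untextured_passes = UNTEXTURED // (pixels_per_frame * FPS)

print(f"full-screen textured passes per frame:   ~{textured_passes}")
print(f"full-screen untextured passes per frame: ~{untextured_passes}")
# → ~69 textured and ~139 untextured full-screen passes per frame
```

    Even at a fraction of peak, that leaves room for many passes per frame, which is why "each pass halves your fillrate" is technically true but not the practical limit it sounds like.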
     
    Liandry and chris1515 like this.
  7. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,104
    Likes Received:
    16,896
    Location:
    Under my bridge
    Very true. The drop in shadow quality from PS2 to PS3 was so jarring! Makes one wonder how a PS3 game would perform if it strove for some PS2 quality targets, like shadow quality (pixel-perfect) and particle density.
     
    Liandry likes this.
  8. Squeak

    Veteran

    Joined:
    Jul 13, 2002
    Messages:
    1,262
    Likes Received:
    32
    Location:
    Denmark
    Kutaragi should have had his way, and something like the GSCube should have been the rendering engine in the PS3.
    They would have needed to take the necessary APIs and tools really effing seriously, but Sony was in a very good place after the PS2, so it should have been doable.
    It would have meant a much more scalable architecture than what we have now.
    Just add more APUs and eDRAM when new processes make it possible. Rendering would also have been very flexible, going far further down that road than where we are now.
     
    Liandry likes this.
  9. function

    function None functional
    Legend

    Joined:
    Mar 27, 2003
    Messages:
    5,854
    Likes Received:
    4,405
    Location:
    Wrong thread
    The fantasy train has to stop somewhere.

    Resubmitting geometry to a fast but simple rasteriser with limited blend modes, limited culling and CLUT-only texture compression was not the optimal way to utilise 3D hardware. GSCube was doubling down on a dead end of graphics hardware, which is why it was dropped. Stone cold. Dead. By people who knew what they were doing.

    They killed it.

    Graphics left the interesting but evolutionary dead end GS behind for a reason. Meanwhile, Nvidia and ATI went on to dominate the PC space and the TBDR went on to dominate the world of portable devices, reaching further than Nvidia or AMD ever managed.

    Despite having far more development resources poured into it than any other platform in its generation, PS2 ended up where it ended up. No other system had the luxury of almost unlimited resources poured into it over several generations of software.

    Every system in its generation ended up less well developed than the PS2, mostly because graphics pipelines and asset creation pipelines weren't ready yet.

    PS2 had a better shot at achieving its potential - for the most part - than any other system.
     
    vipa899 and milk like this.
  10. Squeak

    Veteran

    Joined:
    Jul 13, 2002
    Messages:
    1,262
    Likes Received:
    32
    Location:
    Denmark
    BS! PS2 was different. People on a deadline and under budget don't like different. That doesn't mean it was bad, though. Five years is nothing to learn something different, especially not when Microsoft lured developers with a repackaged PC.
    The PS2 might have had the biggest budget, but that doesn't say much in an industry where the common philosophy is to leave hardware to a few magic companies and let Moore's law work for you.

    The PS3 GS equivalent would have contained APUs, essentially a special extension of Cell, plus far more eDRAM for a larger buffer.
    Those two things would have made all the difference.
    Those APUs could have been used for anything, and when they were finished with that, they could be state-changed within the same frame to do something else.
    Not that the GS wasn't a good design, but anything done within a timeframe and a budget has to make some compromises.
    The GSCube was used, and it was very successful at what it set out to do, i.e. rendering high-res interactive previews of CG movies. It was used on a few films. It was never meant as a big seller. It was a bit like the original Pixar Image Computer: a showcase.
     
    chris1515 likes this.
  11. idsn6

    Regular

    Joined:
    Apr 14, 2006
    Messages:
    509
    Likes Received:
    163
    Naughty Dog was a Sony first party with a completely in-house technology stack built from the ground up specifically for the PS2 and refined by very good engineers for the length of the generation: engine, pipeline, compiler, programming language. You cannot get much more different from the industry at large than that.

    If even their best efforts fell short of the hardware's ideal, then maybe the fault did not lie with the developers.
     
    vipa899 likes this.
  12. chris1515

    Legend

    Joined:
    Jul 24, 2005
    Messages:
    7,157
    Likes Received:
    7,965
    Location:
    Barcelona Spain
    It was the first plan, before going with NVIDIA. A friend of mine who was working at Quantic Dream told me the two-Cell plan is an urban legend, but they did want to do a GS 2 on PS3, and changed late in development to the NVIDIA RSX.
     
  13. chris1515

    Legend

    Joined:
    Jul 24, 2005
    Messages:
    7,157
    Likes Received:
    7,965
    Location:
    Barcelona Spain
    He didn't like the idea technically, but he thinks maybe commercially it helped the PS3; some developers could have stopped supporting the PS3 with a GS 2...
     
  14. corysama

    Newcomer

    Joined:
    Jul 10, 2004
    Messages:
    190
    Likes Received:
    185
    It's really not as complicated as you are making it out to be. You can draw a triangle using 3 verts, a texture and a blend mode. If you try to, you can draw another triangle that happens to have the same vertex positions, but maybe other stuff that's different. That's all there is to multipass. There is no scene-level/full-frame-level/anything-complicated-level logic in the GS. Just triangles.
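    A tiny software model of that point (the rasteriser and all names here are illustrative Python, not GS code): the "second pass" is literally the same three vertices submitted again, with a different colour and blend function.

```python
# Minimal model of GS-style multipass: the rasteriser only knows
# triangles. A second pass is the same vertices submitted again
# with a different colour and blend mode; nothing scene-level exists.

def edge(ax, ay, bx, by, px, py):
    """Half-space edge function: sign tells which side of AB point P is on."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def draw_triangle(fb, w, verts, colour, blend):
    """Rasterise one triangle into framebuffer fb (width w)."""
    (x0, y0), (x1, y1), (x2, y2) = verts
    for y in range(min(y0, y1, y2), max(y0, y1, y2) + 1):
        for x in range(min(x0, x1, x2), max(x0, x1, x2) + 1):
            # inside test: point on the same side of all three edges
            if (edge(x0, y0, x1, y1, x, y) >= 0 and
                edge(x1, y1, x2, y2, x, y) >= 0 and
                edge(x2, y2, x0, y0, x, y) >= 0):
                fb[y * w + x] = blend(fb[y * w + x], colour)

W = H = 16
fb = [0] * (W * H)
tri = [(2, 2), (13, 2), (2, 13)]

# Pass 1: base colour, replace blend.
draw_triangle(fb, W, tri, 100, lambda dst, src: src)
# Pass 2: the SAME vertices resubmitted, additive blend (e.g. a light pass).
draw_triangle(fb, W, tri, 50, lambda dst, src: dst + src)

print(fb[3 * W + 3])    # pixel inside the triangle: received both passes
```

    Only pixels the triangle covers are touched in each pass; everything outside it costs nothing, which is the answer to the earlier question about what gets "sent twice".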
     
    milk, Liandry and chris1515 like this.
  15. bunge

    Regular

    Joined:
    Nov 23, 2014
    Messages:
    725
    Likes Received:
    513
    I don't know if it was an urban legend. The chip was specifically designed for multi-chip...designs. I have no inside knowledge though.
     
  16. chris1515

    Legend

    Joined:
    Jul 24, 2005
    Messages:
    7,157
    Likes Received:
    7,965
    Location:
    Barcelona Spain
    No, it is an urban legend... They needed a true GPU on the other side. The plan was to build the PS4 with a Cell processor too (a multi-chip design, maybe), or to use it in other fields, like they did building a supercomputer...
     
  17. bunge

    Regular

    Joined:
    Nov 23, 2014
    Messages:
    725
    Likes Received:
    513
    I have no idea if they could have used two Cells and a GPU. I'm guessing you're right that two Cells and no GPU was not considered, though.
     
  18. chris1515

    Legend

    Joined:
    Jul 24, 2005
    Messages:
    7,157
    Likes Received:
    7,965
    Location:
    Barcelona Spain
    In his opinion it was a more interesting idea than the PS3 with an Nvidia GPU, but it was too risky...
     
  19. MrSpiggott

    Newcomer

    Joined:
    Feb 26, 2005
    Messages:
    116
    Likes Received:
    37
    Location:
    UK
    Have the proposed specifications of GS2 ever leaked?
     
  20. Liandry

    Regular

    Joined:
    Feb 26, 2011
    Messages:
    323
    Likes Received:
    44
    Ok, thanks.
    Why then, when multipass is used, is the fillrate divided by two for each pass?
     