PlayStation Move technology thread

Discussion in 'Console Technology' started by messyman, Jun 19, 2010.

  1. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56
Sony has a single-lens 3D technology that they use in their stereo cameras, and they even have a full-body motion controller like Kinect, called ICU. Maybe they will use that for next gen.
     
  2. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
Kinect isn't dependent on the amount of light reflected, but on the distortions of a known pattern, which, although affected by different amounts of absorption, is a far more robust solution than a straight intensity evaluation. If sophisticated software could be developed to map and track areas of different intensity, e.g. calibrating to a standard pose, with fast enough refresh it could work, but it would likely be prone to contrast issues like EyeToy. This is where projected patterns and pulsed light come in, which are probably covered completely by patents.
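To illustrate the pattern-distortion point above: with a projector/camera pair, the horizontal shift (disparity) of a known projected dot encodes depth via triangulation, independent of how strongly the surface reflects. A toy sketch (my own illustration with assumed numbers, not Sony's or PrimeSense's actual algorithm):

```python
FOCAL_PX = 600.0    # camera focal length in pixels (assumed)
BASELINE_M = 0.075  # projector-to-camera baseline in metres (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    """Triangulated depth from the shift of a matched pattern dot: z = f * b / d."""
    if disparity_px <= 0:
        raise ValueError("pattern dot not matched")
    return FOCAL_PX * BASELINE_M / disparity_px

# A dot shifted 15 px from its reference position:
print(round(depth_from_disparity(15.0), 2))  # 3.0 metres
```

The key property is that depth comes from *where* the dot landed, not *how bright* it is, which is why absorption differences matter far less than they would for intensity-based schemes.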
     
  3. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
Yeah, Dr. Marks mentioned (in one of the interviews) that Sony filed for this patent while working on the PSEye. He discovered it accidentally when he saw his own video on a TV screen even though all the office lights were turned off. It turns out that the PSEye is sensitive enough to see him under low-light conditions (i.e., illuminated only by light from the TV screen).

    I presume you need to be quite close to the box, maybe like a laptop?
     
  4. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56




    They can track depth by size, like they do with the Move, and face tracking on the PlayStation Eye already works in the dark using only the light from the TV.

    Edit: I said Kinect will have the same problem because some materials don't reflect IR light, and if something is black the system will just see it as part of the background.
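A rough sketch of the "depth from size" idea: for a sphere of known radius, depth is proportional to focal length times real radius over apparent radius in pixels. The numbers below are illustrative assumptions, not actual PS Eye calibration values:

```python
FOCAL_PX = 800.0        # focal length in pixels (assumed)
BALL_RADIUS_M = 0.0225  # Move sphere radius, roughly 22.5 mm (assumed)

def depth_from_radius(radius_px: float) -> float:
    """Depth of a known-size sphere from its apparent pixel radius: z = f * R / r."""
    return FOCAL_PX * BALL_RADIUS_M / radius_px

print(round(depth_from_radius(9.0), 2))  # 2.0 m when the ball spans 9 px
```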
     
    #44 onQ, Jul 12, 2010
    Last edited by a moderator: Jul 12, 2010
  5. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    They're tracking the face there, which provides several points of contrast always present against a predictable background - eyes and mouth contrasting with skin tone. Full body tracking a la Kinect would require 'seeing' the torso and limbs. A pale top against a pale background throws up issues. EyeToy failed to detect my hand against a pale-blue wall in bright conditions, as there was not enough contrast to register the pixels as different within the threshold needed to accommodate noise.

    If PSEye were to read my hand as forwards, and I then move it to one side with similarly coloured pixels behind, how does it know that the pixels occupying that position are background and not the hand moved back? Given human physiology, it would be possible to determine what's happened, as clearly the hand can't move back 3 metres to where the wall is in one frame, but you'd need some funky code-voodoo to pull off a robust full-body tracking system achieved optically from a single camera. The best I know of so far is an 8-camera motion capture system. There's a YouTube vid of a 3-camera system on PS3 Linux, but no skeleton to see how effective it actually is.
     
  6. catisfit

    Regular

    Joined:
    Oct 27, 2007
    Messages:
    638
    Likes Received:
    0
    Okay, that makes sense. I posted some back-of-the-envelope calculations in the old thread stating I didn't think it was possible to get sub-millimetre depth accuracy without ultrasound, based on the camera resolution and the distance between camera and controller.

    I was wondering how they were doing it, as I didn't think it was possible. If it turns out they aren't, then it all makes sense!
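A back-of-envelope calculation in the same spirit: for size-based depth (z = f·R / r), the sensitivity of depth to a one-pixel error in the measured ball radius grows with the square of distance. All values below are assumptions for illustration:

```python
FOCAL_PX = 800.0        # focal length in pixels (assumed)
BALL_RADIUS_M = 0.0225  # sphere radius in metres (assumed)

def depth_error_per_pixel(z: float) -> float:
    """From z = f*R/r, a one-pixel radius error gives |dz/dr| = z**2 / (f*R)."""
    return z**2 / (FOCAL_PX * BALL_RADIUS_M)

print(round(depth_error_per_pixel(2.0), 3))  # ~0.222 m per pixel at 2 m
```

At 2 m, a single pixel of radius error corresponds to tens of centimetres of depth, so sub-millimetre accuracy from the camera alone would require extreme sub-pixel precision - consistent with the scepticism above.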
     
  7. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56
    This is why I said if they used an IR emitter and a lens cap that only lets IR in, it would only be seeing the IR light reflecting back from your body.

    And even without that, they can use dynamic background extraction so it can tell your hands from something in the background.

    I never said it would be perfect, I just asked if it could be done.

    Edit:

    See the part at about the 2-minute mark, when he was reaching his hands into the scene - I'm talking about doing that on a bigger scale with an IR emitter.
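A minimal sketch of the "dynamic background extraction" idea mentioned above: keep a running average of the scene and flag pixels that differ from it by more than a noise threshold. Real systems (EyeToy included) need far more robustness than this; the threshold and frames below are made up:

```python
def update_background(bg, frame, alpha=0.05):
    """Blend the new frame into the running background estimate."""
    return [(1 - alpha) * b + alpha * f for b, f in zip(bg, frame)]

def foreground_mask(bg, frame, threshold=20):
    """Mark pixels that differ from the background by more than the noise threshold."""
    return [abs(f - b) > threshold for b, f in zip(bg, frame)]

bg = [100.0, 100.0, 100.0, 100.0]
frame = [101.0, 180.0, 99.0, 30.0]   # a hand enters over pixels 1 and 3
print(foreground_mask(bg, frame))    # [False, True, False, True]
```

This also shows the failure mode Shifty described: a foreground pixel whose value sits within the noise threshold of the background is simply invisible to the mask.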
     
    #47 onQ, Jul 12, 2010
    Last edited by a moderator: Jul 12, 2010
  8. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    44,106
    Likes Received:
    16,898
    Location:
    Under my bridge
    Background removal is tricky, as Sony themselves said. There's a good example in Kung Fu Live (starts 2:30) that shows the blobbiness of background removal. Very similar to the point-cloud silhouettes of Kinect.

    Without a projected pattern, recreating Kinect on PSEye with a filter cap as you suggest won't work IMO. Light intensity alone won't cut it, which is why every company doing this has either multiple cameras or some projected pattern.
     
  9. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56
  10. Arwin

    Arwin Now Officially a Top 10 Poster
    Moderator Legend

    Joined:
    May 17, 2006
    Messages:
    18,762
    Likes Received:
    2,639
    Location:
    Maastricht, The Netherlands
    Thanks, great interview! It was a lot of work, but the information in that interview is so dense - one of the best I've seen so far - so I wrote it out (hopefully no one else has done that yet in English or I'll feel silly, it took me an hour ;) ).

     
  11. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56
    #51 onQ, Jul 15, 2010
    Last edited by a moderator: Jul 15, 2010
  12. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    Wow, you should leave the entire content on techradar so they can make some dough from the traffic.
     
  13. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56
    LOL ok
     
  14. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56
  15. onQ

    onQ
    Veteran

    Joined:
    Mar 4, 2010
    Messages:
    1,540
    Likes Received:
    56
  16. patsu

    Legend

    Joined:
    Jun 25, 2005
    Messages:
    27,709
    Likes Received:
    145
    I can't see any of the above images and video, onQ. Am on an iPad. YouTube videos should work.

    EDIT:
    Forgot the Chinese government blocked YouTube.
     
  17. Brad Grenz

    Brad Grenz Philosopher & Poet
    Veteran

    Joined:
    Mar 3, 2005
    Messages:
    2,531
    Likes Received:
    2
    Location:
    Oregon
    The RTS controls look really good. The idea of painting units instead of drag-selecting them is clever, and the speed with which you could zoom in and out seems ideal for the genre.
     
  18. dumbo11

    Regular

    Joined:
    Apr 21, 2010
    Messages:
    440
    Likes Received:
    7
    There are other interviews kicking around with Dr Marks and Anton that are interesting, although not necessarily very technical. (The hiphopgamer interview is funny even for me; nowgamer (I think) has a more financial/strategic, less graphically interesting one.)

    Looking at the GT racing wheel, does anyone else think that maybe those recent racers could be patched for Move support? (I think they were very high budget, and a partial re-release along with Sony marketing at a better time of the year might reduce their losses?)
     
  19. djskribbles

    Legend

    Joined:
    Jan 27, 2007
    Messages:
    5,257
    Likes Received:
    667
    For the wheel attachment, wouldn't the Eye lose track of the ball when the wheel is turned? If they put the tracking ball at the top instead of the side, at least you could turn a decent amount without it getting lost behind the base, but the way they have it now, wouldn't it lose track for left turns? Oh well... I love my DFGT.
     
    #59 djskribbles, Jul 23, 2010
    Last edited by a moderator: Jul 23, 2010
  20. dumbo11

    Regular

    Joined:
    Apr 21, 2010
    Messages:
    440
    Likes Received:
    7
    The Eye tracks the glowing ball and finds the linear position of the Move controller: "you moved up 8 pixels, left 7.2, and back 1 cm".

    For a steering wheel, you want the rotational position of the controller - provided by the gyroscope/magnetometer inside the Move.

    I.e., you could unplug the Eye and technically the Move would still function as a steering wheel to the same degree of accuracy.
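A sketch of why the camera isn't needed for steering: the gyroscope reports angular velocity, so the wheel angle is just that rate integrated over time (drift correction via the magnetometer/camera is omitted here, and the sample rate and readings are assumptions):

```python
DT = 1.0 / 60.0  # sensor update period, assuming 60 Hz

def integrate_yaw(gyro_samples_deg_s):
    """Accumulate wheel angle from successive angular-velocity readings."""
    angle = 0.0
    for omega in gyro_samples_deg_s:
        angle += omega * DT
    return angle

# Half a second of turning at a steady 90 deg/s:
print(round(integrate_yaw([90.0] * 30), 1))  # 45.0 degrees
```

In practice gyro integration drifts, which is where the magnetometer (and, when available, the camera) come in as an absolute reference.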
     