That's good enough for computer vision and robots, which is where depth sensing has really been developed, but for a game there's a load of noise and black spots. Trying to isolate all the content at a distance between 45% and 55% of the range is going to pick up a load of samples that shouldn't be present and miss a load of points that should be included.
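To put a number on it, here's a minimal Python/NumPy sketch (not any real PS4 camera API; the normalized depth map, its noise level, and the hole rate are all made-up assumptions) of what a naive 45-55% depth cut does to noisy data with failed-match holes:

```python
import numpy as np

def naive_depth_band(depth, near=0.45, far=0.55):
    """Keep pixels whose (normalized) depth falls between 45% and 55%."""
    # Failed matches (NaN "black spots") compare as False and silently drop
    # out, so real foreground pixels go missing; noisy pixels whose measured
    # depth drifts into the band sneak in instead.
    return (depth >= near) & (depth <= far)

# Synthetic example: a flat object at 0.5 depth, with noise and holes.
rng = np.random.default_rng(0)
depth = np.full((120, 160), 0.5, dtype=np.float32)
depth += rng.normal(0.0, 0.05, depth.shape)       # measurement noise
depth[rng.random(depth.shape) < 0.1] = np.nan     # failed stereo matches
mask = naive_depth_band(depth)
print("fraction of the object actually captured:", mask.mean())
```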
An odd thing would be if the auxiliary port's use of USB3 meant it also had internal support for USB3's host-to-host communication features. Since it probably feeds into hardware dedicated to the video stream, that level of flexibility doesn't seem likely to me.
We could debate how large the population of people willing to buy another PS4 or a PC for a local "cloud" compute resource would be relative to the population that is going to embrace a camera Sony isn't doing much to back.
If there were a gameplay element or a content market for stereoscopic video (say, if 3D TVs and monitors were actually all the rage), I could imagine something neat like using the camera to make little 3D videos, like a stop-motion LEGO video. Letting gamers make content and post it to those social media and video site accounts would probably open a massive can of worms, though.
That, at least, would be something where a more advanced, accurate, and reliable setup with a single color camera plus a depth-only camera would be at a disadvantage.
Maybe if a VR headset actually became an accessory, it would at least make a difference for some kind of AR game.
A setup where the camera's input can somehow feed back as-is to a pair of human eyes is the one case I can stretch my imagination to entertain as a point where it isn't seriously outclassed.
Filtering the results to get solid depth data would likely lead to noisy margins instead of clean cutouts.
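As a self-contained illustration of that (again just NumPy/SciPy toy data, not anything from the actual camera), morphological open/close passes can knock out the speckle, but they also chew at the silhouette's border, which is the noisy-margin problem:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
mask = np.zeros((120, 160), dtype=bool)
mask[30:90, 40:120] = True                         # the "real" object
mask &= rng.random(mask.shape) > 0.15              # holes from failed matches
mask |= rng.random(mask.shape) < 0.02              # stray speckle elsewhere

# Opening removes isolated speckle, closing fills small holes, but both
# passes nibble at and smear the object's outline, so the cutout's edge
# stays ragged rather than clean.
cleaned = ndimage.binary_closing(ndimage.binary_opening(mask, iterations=2),
                                 iterations=2)
print("raw coverage:", round(mask.mean(), 3),
      "cleaned coverage:", round(cleaned.mean(), 3))
```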
Perhaps the reason 3D hasn't been shown much is it's really not ready, but the dual-camera setup barely costs more and provides better 2D options, so that's why Sony went that route, and bunged in 3D because they could without too much trouble.
I think I'm missing your point. Are you saying that the tech can't be even better with the stereoscopic cameras and with a higher resolution? Are you saying that Sony (and others) have completely stopped work in these fields?

Just pointing out the obvious that these are from 2009.
Thank you, Patsu! I could not find it under the incorrect search phrases I used. However, I did find some other videos in the process (done with the PS3's PS Eye).
Face-tracking Library (w/ Augmented Reality)
http://www.youtube.com/watch?v=gOtPVof2K94&feature=youtu.be&noredirect=1
Voice Recognition Library
http://www.youtube.com/watch?v=5TaUsSy-f0Y&feature=youtu.be&noredirect=1
Edit: will add to OP
I think I'm missing your point. Are you saying that the tech can't be even better with the stereoscopic cameras and with a higher resolution? Are you saying that Sony (and others) have completely stopped work in these fields?
I am under the impression that most motion games track moving objects. They don't really care about the background.
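That's roughly how the simplest motion tracking works; a toy frame-differencing sketch in plain Python/NumPy (not Sony's actual tracking code, and the frames here are synthetic) shows how only the pixels that changed between frames matter:

```python
import numpy as np

def motion_mask(prev_gray, curr_gray, threshold=25):
    """Mark pixels that changed noticeably between two grayscale frames."""
    diff = np.abs(curr_gray.astype(np.int16) - prev_gray.astype(np.int16))
    return diff > threshold

def centroid(mask):
    """Rough position of whatever moved: the mean of the changed pixels."""
    ys, xs = np.nonzero(mask)
    return None if xs.size == 0 else (float(xs.mean()), float(ys.mean()))

# Synthetic frames: a bright square shifts right; the static background
# never enters into it because unchanged pixels difference away to zero.
prev = np.zeros((120, 160), dtype=np.uint8)
curr = prev.copy()
prev[50:70, 40:60] = 200
curr[50:70, 50:70] = 200
print("tracked position:", centroid(motion_mask(prev, curr)))
```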