During Satya Nadella’s Build keynote, he introduced the world to one such tool that may sound a little familiar: Project Kinect for Azure. I wanted to take a little more time to expand upon this project, what it means and the role it will play in enabling developers to apply AI over the real world in profound new ways.
What Satya described is a key advance in the evolution of the intelligent edge: the ability for devices to perceive the people, places and things around them. One of the things that makes Project Kinect for Azure unique and compelling is the combination of our category-defining depth sensor with our Azure AI services, which together will enable developers to make the intelligent edge more perceptive than ever before.
...
With HoloLens, we saw incredible results when we took some of the magic of Kinect and applied it in a mixed reality context. The current version of HoloLens uses the third generation of Kinect depth-sensing technology to place holograms in the real world. With HoloLens, we have a device that understands people and environments, takes input in the form of gaze, gestures and voice, and provides output in the form of 3D holograms and immersive spatial sound. With Project Kinect for Azure, the fourth generation of Kinect now integrates with our intelligent cloud and intelligent edge platform, extending that same innovation opportunity to our developer community.
Read more about the technical breakthroughs in the blog article at https://www.linkedin.com/pulse/introducing-project-kinect-azure-alex-kipman