google introduces project tango, a smartphone that creates 3D mapped environments
all images courtesy ATAP google
‘as we walk through our daily lives, we use visual cues to navigate and understand the world around us. we observe the size and shape of objects and rooms, and we learn their position and layout almost effortlessly over time. this awareness of space and motion is fundamental to the way we interact with our environment and each other. we are physical beings that live in a 3D world. yet, our mobile devices assume that the physical world ends at the boundaries of the screen.’ – johnny lee and the ATAP project tango team.
over the past year, google’s ATAP group has worked with universities, research labs, and industrial partners around the world to gather research advances in robotics and computer vision, concentrating that knowledge and technology into a unique mobile phone. known as ‘project tango’, the project aims to give mobile devices a human-scale understanding of space and motion.
the project tango platform is focused on exploring what might be possible in a mobile device; the current prototype is a 5-inch phone containing customized hardware and software designed to track the full 3D motion of the device while simultaneously creating a map of the user’s environment. the built-in sensors allow the phone to make over a quarter million three-dimensional measurements every second, updating its position and orientation in real time and combining that data into a single 3D map of the surrounding space.
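the fusion described above — transforming each depth measurement by the device’s tracked pose so that all measurements land in one shared map — can be sketched in a few lines. this is an illustrative toy (a 2D pose with translation and yaw, not tango’s actual 6-DoF pipeline, and none of these class names come from the tango SDK), but it shows why pose tracking and mapping have to run together:

```java
// illustrative sketch (not tango's actual API): fusing depth points,
// measured in the device's own frame, into a single world-frame map
// using the device's tracked pose, as the article describes.
import java.util.ArrayList;
import java.util.List;

public class PointCloudFusion {
    // simplified pose: 2D translation plus yaw; a real system tracks full 6-DoF
    static class Pose {
        final double x, y, yaw; // metres, metres, radians
        Pose(double x, double y, double yaw) { this.x = x; this.y = y; this.yaw = yaw; }
    }

    static final List<double[]> worldMap = new ArrayList<>();

    // rotate and translate one device-frame point into the world frame, then accumulate it
    static void integrate(Pose pose, double px, double py) {
        double wx = pose.x + px * Math.cos(pose.yaw) - py * Math.sin(pose.yaw);
        double wy = pose.y + px * Math.sin(pose.yaw) + py * Math.cos(pose.yaw);
        worldMap.add(new double[]{wx, wy});
    }

    public static void main(String[] args) {
        // the same wall point, seen from two different poses,
        // should land at the same world coordinate (2, 0)
        integrate(new Pose(0, 0, 0), 2.0, 0.0);            // facing the wall, 2 m away
        integrate(new Pose(1, 0, Math.PI / 2), 0.0, -1.0); // moved 1 m forward, turned left 90°
        System.out.printf("%.2f %.2f%n", worldMap.get(1)[0], worldMap.get(1)[1]);
    }
}
```

because every measurement is re-expressed in the world frame before being stored, the map stays consistent even as the phone moves — which is exactly why the prototype must update its own position and orientation in real time.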
‘project tango’ runs android and includes development APIs that provide position, orientation, and depth data to standard android applications written in java or C/C++, as well as to the unity game engine.
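on android, sensor data like this is typically delivered to applications through registered callbacks. the sketch below shows that pattern for the kind of data the article lists — position and orientation arriving as pose updates — using hypothetical interface and class names (these are illustrative and are not the actual tango SDK types):

```java
// hypothetical sketch of the callback pattern such an API implies;
// the names here are illustrative, not tango's actual API.
public class PoseListenerDemo {
    // one pose sample: position in metres plus orientation as a quaternion
    static class PoseData {
        final double[] translation;  // x, y, z
        final double[] rotation;     // quaternion x, y, z, w
        PoseData(double[] t, double[] r) { translation = t; rotation = r; }
    }

    // an application registers a listener and receives pose updates in real time
    interface OnPoseUpdateListener {
        void onPoseAvailable(PoseData pose);
    }

    public static void main(String[] args) {
        OnPoseUpdateListener listener = pose ->
            System.out.printf("position: %.1f %.1f %.1f%n",
                pose.translation[0], pose.translation[1], pose.translation[2]);

        // simulate the service delivering one update to the registered listener
        listener.onPoseAvailable(new PoseData(
            new double[]{0.5, 0.0, 1.2},
            new double[]{0, 0, 0, 1}));
    }
}
```

the same callback shape would carry depth data as per-frame point buffers; a game built in unity would consume the equivalent stream through the engine’s scripting layer rather than a raw java listener.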