Last but not least, Valve has started a royalty-free licensing program for using its Lighthouse technology in third-party products. Licensees need to pay $2,975 to attend a training course, but other than that there are no licensing fees or royalties for using the tech.
Valve provides a Lighthouse ‘Licensee Dev Kit’ to companies who apply to use the technology. It includes:
Dev Kit Contents
A modular reference tracked object suitable for attaching to prototype HMDs or other devices
Full complement of EVM circuit boards to enable rapid prototyping of your own tracked object
40 individual sensors for building your own tracked object
Accessories to enable custom prototypes
Software toolkit to assist with optimal sensor placement
Calibration tools for prototyping and manufacturing
Schematics and layouts for all electronic components
Mechanical designs for the reference tracked object and accessories
I remember when the Kinect was first released, and later when the first 3D AR demos leveraging Kinect Fusion style algorithms became public, thinking that this would be a very interesting thing to have on a mobile platform.
Now Google has made Project Tango public, promising to deliver exactly this.
Their prototype, a 5″ phone, is capable of tracking full 3D motion while simultaneously creating a 3D map of its environment. It runs Android, and APIs provide position, orientation, and depth data to standard Android applications written in Java and C/C++, as well as to the Unity game engine.
Algorithms and APIs are still in active development. You can apply to the developer program to receive one of the 200 prototype devices currently available. Google expects to distribute all of its available units by March 14th, 2014.
At re:publica 2013, the Berlin-based data designers from OpenDataCity created a WiFi tracking network with 100 access points that allowed them to visualize the movements of about 6,700 different electronic devices during the conference.
The application, called re:log, is a dynamic map of the conference venue that shows the approximate locations of the devices while they were connected to the local WiFi hotspots. An interactive timeline underneath allows exploring the changes over time, while a rectangular area can be drawn to highlight and follow a smaller set of dots.
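The underlying idea is simple: a device's approximate position is just the position of the access point it was last associated with. A minimal sketch of that lookup, with entirely hypothetical access-point coordinates and association-log data (the real re:log data format is not shown here):

```python
from datetime import datetime

# Hypothetical access-point coordinates on the venue floor plan (metres).
AP_POSITIONS = {
    "ap-01": (12.0, 4.5),
    "ap-02": (30.5, 4.5),
    "ap-03": (21.0, 18.0),
}

# Hypothetical association log: (timestamp, device id, access point).
LOG = [
    ("2013-05-06 10:00", "dev-a", "ap-01"),
    ("2013-05-06 10:05", "dev-a", "ap-03"),
    ("2013-05-06 10:00", "dev-b", "ap-02"),
]

def positions_at(log, when):
    """Return each device's approximate position at time `when`:
    the coordinates of the access point it most recently associated with."""
    when = datetime.strptime(when, "%Y-%m-%d %H:%M")
    latest = {}
    for ts_str, dev, ap in log:
        ts = datetime.strptime(ts_str, "%Y-%m-%d %H:%M")
        if ts <= when and (dev not in latest or ts >= latest[dev][0]):
            latest[dev] = (ts, AP_POSITIONS[ap])
    return {dev: pos for dev, (_, pos) in latest.items()}

print(positions_at(LOG, "2013-05-06 10:06"))
# dev-a is now placed at ap-03's coordinates, dev-b at ap-02's
```

Stepping such a query through the timeline slider produces exactly the kind of animated dot map re:log shows; the granularity is limited to one position per access point.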
The visualization is based on tracking the MAC addresses of the devices according to the WiFi hotspot they were connected to. This data, which can be downloaded, was fully anonymized, yet the authors mention their desire to let people look up their own MAC address in the future.
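Anonymizing such a dataset can be as simple as replacing each MAC address with a salted hash, which keeps movement traces intact while hiding the original address. A sketch of that approach (the salt value and truncation length are my own choices, not OpenDataCity's published method):

```python
import hashlib

def anonymize_mac(mac: str, salt: str) -> str:
    """Replace a MAC address with a salted, truncated SHA-256 digest.
    The same MAC always maps to the same pseudonym, so per-device
    movement traces survive, but the original address cannot be
    recovered without knowing the salt."""
    digest = hashlib.sha256((salt + mac.lower()).encode()).hexdigest()
    return digest[:12]

a = anonymize_mac("00:1A:2B:3C:4D:5E", salt="relog-2013")
b = anonymize_mac("00:1a:2b:3c:4d:5e", salt="relog-2013")
assert a == b  # case-insensitive: same device, same pseudonym
```

This also explains how a future "look up your own MAC" feature could work: whoever holds the salt can hash a submitted address and search for the resulting pseudonym, while the published dataset alone reveals nothing.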
I suspect the solution used is based on MagicMap, a free WiFi/Bluetooth tracking architecture developed at Humboldt University Berlin. Their wiki has some more information.
They improved on the Microsoft implementation with their algorithm, called KinFu Large Scale, which can scan across multiple volumes in one pass, allowing larger scenes to be captured in one go.
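The core trick behind such large-scale extensions is volume shifting: the TSDF volume stays a fixed size, but when the camera approaches its boundary, the volume is translated to follow it, the voxel slices that fall out are streamed into the global world model, and the freed voxels are reset for new data. A deliberately simplified 1D sketch of that bookkeeping (illustrative only, not PCL's actual implementation):

```python
def shift_volume(volume, shift, empty=0.0):
    """Shift a 1D 'TSDF volume' by `shift` voxels.
    Voxels that leave the active volume are extracted (in a real
    system they would be merged into the global world model);
    the freed voxels at the far end are reset to `empty`."""
    if shift <= 0:
        return volume, []
    streamed_out = volume[:shift]                 # leaves the active volume
    remaining = volume[shift:] + [empty] * shift  # freed space for new data
    return remaining, streamed_out

vol = [0.1, 0.2, 0.3, 0.4]
vol, out = shift_volume(vol, 2)
# out == [0.1, 0.2]; vol == [0.3, 0.4, 0.0, 0.0]
```

Repeating this shift as the camera moves lets a fixed-memory GPU volume sweep across an arbitrarily large scene, which is what makes scanning whole rooms in one pass feasible.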
The Point Cloud Library (PCL) is available as prebuilt binaries for Linux, Windows, and OS X, as well as in source code from their SVN repository. The code relies heavily on the NVIDIA CUDA development libraries for GPU optimization and requires a compatible GPU for best results. Information on how to set up your own build environment and the required dependencies is available on their site.