They improved on the Microsoft implementation with their algorithm, called KinFu Large, which can scan multiple volumes in one pass, allowing larger scenes to be captured in one go.
The Point Cloud Library (PCL) is available as prebuilt binaries for Linux, Windows, and OS X, as well as in source form; the preliminary source code is currently available in their SVN repository. The code relies heavily on the NVIDIA CUDA development libraries for GPU optimizations and will require a compatible GPU for best results. Information on how to set up your own build environment, along with the required dependencies, is available from their site.
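For orientation, a CUDA-enabled build from source might look something like the sketch below. The repository URL and CMake option names here are assumptions from memory and may differ; the PCL site has the authoritative instructions and dependency list.

```shell
# Hypothetical build sketch; assumes the CUDA toolkit and PCL's build
# prerequisites (CMake, Boost, Eigen, FLANN, etc.) are already installed.
svn checkout http://svn.pointclouds.org/pcl/trunk pcl
cd pcl && mkdir build && cd build
# Enable the GPU/CUDA modules (flag names may vary by PCL version).
cmake -DBUILD_CUDA=ON -DBUILD_GPU=ON ..
make -j4
```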
Besides the Kinect, the library supports several other sensors. Moving forward, the developers want to continue to refine and improve the system, and are hoping to improve upon the original algorithm in order to model larger-scale environments in the near future. The code is still in beta; a stable release is planned to coincide with the upcoming PCL 2.0 release.
I’m definitely looking forward to seeing what the Kinect community is going to do with this.
PrimeSense, the company providing the technology behind Microsoft’s Kinect, is working with Asus on the long-awaited 🙂 replacement for the TV remote. One result of this cooperation is a new Asus product, the “WAVI Xtion”. The technology behind it is the same as the Kinect’s. Unlike Microsoft, Asus is not aiming for the gaming market but for the living room in general, seeing the device as the door opener for bringing the PC into the living room.
Amongst other things, one important part of this development is to increase the resolution of the sensor to allow tracking of smaller structures, such as fingers.
A first shot at implementing more complex interaction with the Kinect, by the Institute for Creative Technologies at the University of Southern California.
They are developing the Flexible Action and Articulated Skeleton Toolkit (FAAST), which is middleware to facilitate integration of full-body control with games and VR applications. FAAST currently supports the PrimeSensor and the Microsoft Kinect using the OpenNI framework.
In this video, they show how FAAST can be used to control off-the-shelf video games such as World of Warcraft. Since these games would not normally support motion sensing devices, FAAST emulates keyboard input triggered by body posture and specific gestures. The controls can be dynamically configured for different applications and games.
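The core idea of emulating keyboard input from body posture can be sketched roughly as below. This is a hypothetical illustration of the concept, not FAAST’s actual configuration format or API; the gesture names, bindings, and the `emit_key` stand-in are all made up.

```python
# Sketch of posture-to-keyboard emulation, in the spirit of FAAST.
# All names here are illustrative; FAAST's real configuration differs.

# Map a recognized posture/gesture to the key it should trigger.
BINDINGS = {
    "lean_left": "a",        # e.g. strafe left in the game
    "lean_right": "d",       # strafe right
    "walk_in_place": "w",    # move forward
    "raise_right_arm": "1",  # trigger an ability
}

pressed = []  # record of emulated key presses

def emit_key(key):
    # Stand-in for a platform keyboard-injection call
    # (e.g. SendInput on Windows); here we just record the key.
    pressed.append(key)

def on_gesture(gesture):
    """Translate a recognized gesture into an emulated key press."""
    key = BINDINGS.get(gesture)
    if key is not None:
        emit_key(key)
    return key

# Example: the skeleton tracker reports the player leaning left,
# and the game receives an ordinary "a" key press.
on_gesture("lean_left")
```

Because the game only ever sees ordinary key events, no motion-sensing support is needed on the game's side, and the bindings table can be swapped out per application.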
There seems to be some progress on getting OpenNI running on the Mac, and PrimeSense has joined the effort. Currently they are working on getting the NITE middleware running; NITE provides basic hand and skeleton tracking on top of OpenNI.
PrimeSense, the company providing the sensor for Microsoft’s Kinect camera, together with Willow Garage and Side-kick, has released an official open-source driver for the 3D camera, for Windows and Linux (Ubuntu since version 10.10).