OpenNI on the Mac

There seems to be some progress on getting OpenNI running on the Mac, and PrimeSense has joined the effort. Currently they are working on porting the NITE middleware, which provides basic hand and skeleton tracking on top of OpenNI.

Data mining power tool

After acquiring Freebase, the public graph database, Google has also taken Freebase's data mining application Gridworks under its wing. Now named Google Refine, it is still a power tool for working with messy data: cleaning it up, transforming it from one format into another, extending and correcting it with web services, and linking it to databases like Freebase.
Version 2.0 has just been announced. While you're at it, also have a look at Freebase's powerful API, which provides access to an amazingly big community-built ontology. You can play with it in the Query Editor.
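Freebase is queried with MQL, a query-by-example language: you send a JSON template in which `null` or `[]` marks the fields you want filled in, and the rest act as constraints. A minimal sketch of what such a request body looks like (the artist name and the queried properties here are just illustrative examples):

```python
import json

# An MQL query is a JSON template: fields set to null or [] are
# filled in by the service, the remaining fields act as constraints.
query = [{
    "type": "/music/artist",   # constraint: only music artists
    "name": "The Beatles",     # constraint: exact name match
    "album": [],               # ask: list of this artist's albums
}]

# mqlread expects the query wrapped in an envelope object.
envelope = {"query": query}
payload = json.dumps(envelope)
print(payload)
```

The same template pasted into the Query Editor returns the filled-in result, which is a handy way to explore the schema interactively.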

Back to the original topic: there are some introduction videos on Google Refine online:

Precise indoor smartphone tracking

Shopkick uses the built-in microphone of a smartphone (currently iOS or Android based) to triangulate its position via a network of ultrasonic beacons. Unlike satellite-based systems such as GPS, this allows more precise location tracking, especially indoors where GPS reception is poor or unavailable. Shopkick did a first installation in a Best Buy in San Francisco as part of a POS application.
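Shopkick hasn't published its actual algorithm, but the basic idea behind beacon-based positioning can be sketched with textbook 2D trilateration: ultrasonic time of flight gives a distance to each beacon, and subtracting the circle equations pairwise yields a small linear system for the position. The beacon layout and flight times below are hypothetical:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature

def trilaterate(beacons, distances):
    """Solve for a 2D position from distances to three known beacons.

    Subtracting the circle equations pairwise turns the problem into
    a 2x2 linear system. Generic textbook method, not Shopkick's own.
    """
    (x1, y1), (x2, y2), (x3, y3) = beacons
    d1, d2, d3 = distances
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Distances derived from (hypothetical) ultrasonic flight times:
flight_times = [0.0145773, 0.0195575, 0.0235052]  # seconds
dists = [t * SPEED_OF_SOUND for t in flight_times]
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]  # beacon positions in m
print(trilaterate(beacons, dists))
```

With more than three beacons a least-squares fit over the same equations would smooth out measurement noise.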

See also this Technology Review article: …

Kinect – The Community Rules

Once again the open source community shows the industry how it's done. The demos and concepts that have been developed since the community cranked out a driver/library for the Kinect, just a few days after its release, are quite astounding. Good for Microsoft that they seem to have understood that they shouldn't block that movement. Soon they will start copying its ideas, I guess.

I try to follow the work of Oliver Kreylos, a German researcher at the University of California. He was one of the first to generate a mesh from the depth data provided by the Kinect and texture it with the RGB data from the second camera.
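The core step of such a reconstruction is back-projecting each depth pixel into a 3D point through the pinhole camera model; the points can then be meshed and textured by projecting them into the RGB camera. A minimal sketch of that back-projection, not Kreylos's code, with rough intrinsics that would really have to come from calibration:

```python
def depth_to_points(depth, width, height, fx, fy, cx, cy):
    """Back-project a depth image into 3D camera-space points.

    depth: row-major list of metric depth values, 0 meaning 'no reading'.
    fx, fy, cx, cy: pinhole intrinsics (focal lengths, principal point),
    which must come from calibration.
    """
    points = []
    for v in range(height):
        for u in range(width):
            z = depth[v * width + u]
            if z <= 0:  # the Kinect reports holes where its dots are lost
                continue
            x = (u - cx) * z / fx
            y = (v - cy) * z / fy
            points.append((x, y, z))
    return points

# Toy 2x2 depth image; real Kinect frames are 640x480, and fx = fy = 580
# with the principal point near the image center is a rough estimate.
pts = depth_to_points([1.0, 0.0, 2.0, 1.5], 2, 2, 580.0, 580.0, 1.0, 1.0)
print(pts)
```

Texturing then amounts to transforming each point into the RGB camera's frame and sampling the color at its projected pixel.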

In his latest work he uses two calibrated Kinects in combination, which allows an object to be scanned from two sides at the same time without rotating it. A lot of people thought that the Kinect's structured-light sensor wouldn't be able to deal with two overlapping dot patterns, but it seems to work. Have a look: