Interactive Projection Mapping

Projecting onto buildings has become a fairly common sight. I had never seen it combined with interactivity and data visualization, though.

Come to your Census [spinifexgroup.com] was developed by Spinifex for Vivid Sydney, a spectacular festival around the theme of light that took place about a month ago.

The installation consisted of three parts that visualized several datasets from the Australian Bureau of Statistics. The main interactive projection mapping showed data like Age & Gender, Country of Birth, Mode of Transport and Occupation for every postcode in Australia, and could be steered interactively from a multitouch table in front of it. A second projection mapping displayed a conventional infographic animation, while several interactive kiosks offered access to the ABS Spotlight website, an infographic storytelling site that presents census statistics from a personal perspective.
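The post doesn't say how the table and the projection talk to each other, but a common pattern for installations like this is to send selections as OSC messages over the network. A purely hypothetical sketch in C++ using liblo (the address pattern "/census/select", the port, the IP, and the dataset name are all invented for illustration, not Spinifex's actual protocol):

    // Hypothetical sketch: a multitouch table forwarding a visitor's
    // selection to the projection renderer via OSC, using liblo.
    // Address pattern, port and dataset names are assumptions.
    #include <lo/lo.h>
    #include <cstdio>

    int main() {
        // The machine driving the projection, listening for OSC on UDP 9000.
        lo_address projector = lo_address_new("192.168.0.10", "9000");

        // Visitor taps postcode 2000 (Sydney) and the "Mode of Transport" layer.
        if (lo_send(projector, "/census/select", "is", 2000, "transport") < 0) {
            std::fprintf(stderr, "OSC send failed: %s\n",
                         lo_address_errstr(projector));
            return 1;
        }

        lo_address_free(projector);
        return 0;
    }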

[via infosthetics.com]

The Leap: Gesture control like Kinect

A new USB device called The Leap, by Leap Motion, creates an 8-cubic-foot bubble of “interaction space”. It claims to detect your hand gestures down to an accuracy of 0.01 millimeters, about 200 times more accurate than “existing touch-free products and technologies,” such as your smartphone’s touchscreen… or Microsoft Kinect.

Wired’s Gadget Lab had a detailed first look at the device.

The Leap is available for pre-order right now and will ship sometime during the December-through-February time frame. Leap Motion will make an SDK and APIs available to developers, and plans to ship the first batches of the hardware to developers as well. An application to sign up as one of the first coders to work with the Leap is on the company’s site.
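Until the SDK actually ships, any code is guesswork; but a minimal polling loop for palm positions in C++ could look roughly like this (class and method names like Leap::Controller and palmPosition() follow Leap Motion's developer material; treat them as tentative):

    // Rough sketch: polling the Leap for hand positions.
    // Assumes the Leap C++ SDK header (Leap.h); names are tentative
    // until Leap Motion finalizes the developer release.
    #include <iostream>
    #include "Leap.h"

    int main() {
        Leap::Controller controller;  // connects to the Leap service

        while (true) {
            Leap::Frame frame = controller.frame();  // latest tracking frame
            Leap::HandList hands = frame.hands();

            for (int i = 0; i < hands.count(); ++i) {
                Leap::Vector pos = hands[i].palmPosition();  // in millimeters
                std::cout << "hand " << i << ": ("
                          << pos.x << ", " << pos.y << ", " << pos.z << ")\n";
            }
        }
        return 0;
    }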

KinectFusion – Open Source

UPDATE: There is a more recent post on this topic here: Open Source Kinect Fusion – Update

Developers of the open source Point Cloud Library have implemented the KinectFusion algorithm as published in the paper by Microsoft.

The preliminary source code is currently available in their SVN repository. The code relies heavily on the NVIDIA CUDA libraries for GPU optimization and requires a compatible GPU for best results.
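To give an idea of the scope: the tracker is driven one depth frame at a time, with the TSDF volume updated on the GPU. A rough sketch against the preliminary kinfu API in PCL's trunk (the API is still in flux, so exact names may change before a stable release):

    // Rough sketch of driving PCL's KinFu tracker with raw depth frames.
    // Based on the preliminary kinfu code in PCL's trunk; requires a
    // CUDA-capable GPU and PCL built with the gpu modules.
    #include <pcl/gpu/kinfu/kinfu.h>
    #include <pcl/gpu/containers/device_array.h>
    #include <vector>

    int main() {
        const int rows = 480, cols = 640;
        pcl::gpu::KinfuTracker kinfu(rows, cols);

        pcl::gpu::KinfuTracker::DepthMap depth_device;  // depth frame on the GPU

        // In a real application, each Kinect depth frame would be grabbed
        // here (e.g. via OpenNI) into a rows x cols array of 16-bit depths.
        std::vector<unsigned short> depth_host(rows * cols, 0);

        for (int frame = 0; frame < 100; ++frame) {
            // Upload the host depth frame and integrate it into the TSDF
            // volume; the tracker estimates the camera pose against the
            // running model.
            depth_device.upload(depth_host.data(),
                                cols * sizeof(unsigned short), rows, cols);
            bool tracked = kinfu(depth_device);
            if (!tracked) {
                // Tracking lost -- in practice you would reset the volume
                // or try to re-localize here.
            }
        }
        return 0;
    }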

Besides the Kinect, the library supports several other sensors. Moving forward, the developers want to keep refining and improving the system, and hope to extend the original algorithm to model larger-scale environments in the near future. The code is still beta; a stable release is planned to coincide with the upcoming PCL 2.0 release.

I’m definitely looking forward to what the Kinect community is going to do with that.

Kinect as a 3D Scanner – reloaded

Maybe you remember: some months ago Microsoft Research published information about KinectFusion, an application that incrementally digitizes a 3D scene with the Kinect. Until now they haven’t made it available to the public. Now some engineers have reproduced the functionality and released it for non-commercial use.

ReconstructMe works on top of the OpenNI framework and can therefore also use the Asus Xtion sensor. The application uses the GPU to generate the 3D data. If your GPU isn’t powerful enough, you can still process recorded data, just not in real time. The generated 3D scene can be exported in STL or OBJ format.
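ReconstructMe itself is closed, but the OpenNI layer it sits on is easy to poke at. A minimal sketch of grabbing depth frames through OpenNI 1.x, which is the input side ReconstructMe consumes from a Kinect or Xtion (the reconstruction itself stays in their code):

    // Minimal sketch: grabbing depth frames through OpenNI 1.x, the sensor
    // layer ReconstructMe builds on (works with Kinect and Asus Xtion alike).
    #include <XnCppWrapper.h>
    #include <cstdio>

    int main() {
        xn::Context context;
        if (context.Init() != XN_STATUS_OK) {
            std::fprintf(stderr, "OpenNI init failed\n");
            return 1;
        }

        xn::DepthGenerator depth;
        depth.Create(context);          // attaches to the first depth sensor
        context.StartGeneratingAll();

        for (int i = 0; i < 30; ++i) {  // grab 30 frames
            context.WaitOneUpdateAll(depth);

            xn::DepthMetaData md;
            depth.GetMetaData(md);
            // md(x, y) is the depth at pixel (x, y) in millimeters.
            std::printf("frame %d: %ux%u, center depth %u mm\n", i,
                        (unsigned)md.XRes(), (unsigned)md.YRes(),
                        (unsigned)md(md.XRes() / 2, md.YRes() / 2));
        }

        context.Release();
        return 0;
    }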
