News about Unlimited Detail, the famous voxel rendering engine by the Australian company Euclideon. Deemed vaporware by some, they have now released information about Geosphere, software for viewing large geospatial voxel data sets. The technology can now also be licensed in the form of an SDK.
Still, some of the criticism voiced by people like John Carmack, e.g. that animation and lighting can't be done with voxels, might be right after all, and may be the reason for the shift from gaming to geospatial visualization, where these are not requirements.
Still very impressive:
Interesting embedded software development blog: CNXSoft
The PTP daemon (PTPd) implements the Precision Time Protocol (PTP) as defined by the IEEE 1588 standard. PTP version 1 implements IEEE 1588-2002, and PTP version 2 implements IEEE 1588-2008. PTP was developed to provide very precise time coordination between LAN-connected computers. The source code is available on GitHub.
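The core of PTP's precision is a simple message exchange: the master sends a Sync message (timestamped t1 on send, t2 on receipt), and the slave sends a Delay_Req back (t3 on send, t4 on receipt). Assuming a symmetric network path, these four timestamps let the slave compute its clock offset and the one-way path delay. A minimal sketch of that arithmetic (the timestamps here are made-up example values, not from PTPd):

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """Compute slave clock offset and one-way path delay from the
    four PTP timestamps, assuming a symmetric network path.

    t1: master sends Sync        (master clock)
    t2: slave receives Sync      (slave clock)
    t3: slave sends Delay_Req    (slave clock)
    t4: master receives Delay_Req (master clock)
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Example: slave clock runs 5 units ahead, path delay is 2 units each way.
offset, delay = ptp_offset_delay(t1=100, t2=107, t3=200, t4=197)
print(offset, delay)  # 5.0 2.0 -> slave subtracts 5.0 to match the master
```

The asymmetry of real networks (different delay in each direction) is the main error source this math cannot remove, which is why hardware timestamping matters for the best results.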
They improved on the Microsoft implementation with their algorithm, called KinFu Large, as they are able to scan multiple volumes in one pass, allowing larger scenes to be captured in one go.
The Point Cloud Library (PCL) is available as prebuilt binaries for Linux, Windows and OS X, as well as in source code from their SVN repository. The code relies heavily on the NVIDIA CUDA development libraries for GPU optimizations and will require a compatible GPU for best results. Information on how to set up your own build environment and the required dependencies is available from their site.
The sources for a GPU-accelerated video player for the Raspberry Pi, developed by the XBMC Raspberry Pi project, are available on GitHub: huceke/omxplayer.
The V Motion Project is a visually powerful Kinect-based musical “instrument” that was developed by multiple artists for a marketing campaign.
On the technical side, they found a very creative, steampunk-like solution to the problem of multiple Kinects interfering with each other:
Matt Tizard found a white paper and video that explained an ingenious solution: wiggle the cameras. That’s it! Normally, the Kinect projects a pattern of infrared dots into space. An infrared sensor looks to see how this pattern has been distorted, and thus the shape of any objects in front of it. When you’ve got two cameras, they get confused when they see each other’s dots. If you wiggle one of the cameras, it sees its own dots as normal but the other camera’s dots are blurred streaks it can ignore. Paul built a little battery operated wiggling device from a model car kit, and then our Kinects were the best of friends.
A Music Hack Day event in Boston has yielded a funny little web app: The Infinite Jukebox creates an infinitely long and ever-changing version of uploaded tracks and visualizes the process.
It also shows how today's chart music is copy-and-pasted together in the studio, as it performs really well on such material, meaning you can't really hear the cuts. On older tracks, where larger portions of the song are recorded in one take, it doesn't work that well.
If you want to try it yourself: at the moment it only seems to work with Chrome and Safari, not with Firefox.
Once you’ve got your base video (or videos) in Popcorn Maker, adding elements to it is as simple as grabbing one of the “events” from the right hand side of the editor and dragging it onto either the video stage itself, or the timeline below. Once your event is in the timeline you can change the settings, resize it, move it around and otherwise tweak it to behave the way you’d like. And finally just click the share link and Popcorn Maker will give you either a link (or an embed code) you can paste anywhere on the web.
To get started remixing videos, head on over to the Popcorn Maker site. If you want to see some examples, check out new Popcorn Maker projects on Webmaker.org. For more background on how Popcorn Maker works, check out the Mozilla Hacks blog post.