At re:publica 2013, the Berlin-based data designers of OpenDataCity set up a WiFi tracking network of 100 access points that allowed them to visualize the movements of about 6,700 electronic devices during the conference.
The resulting application, called re:log, is a dynamic map of the conference venue that shows the approximate positions of the devices while they were connected to the local WiFi hotspots. An interactive timeline underneath lets you explore the changes over time, and a rectangular area can be drawn on the map to highlight and follow a smaller set of dots.
The visualization was based on tracking the MAC addresses of the devices, logging which WiFi hotspot each device was connected to. The underlying data, which can be downloaded, was fully anonymized, although the authors mention that they would like to let people look up their own MAC address in the future.
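The article doesn't say how the anonymization was done, but a deterministic salted hash would reconcile both goals: the published data contains no raw MAC addresses, yet a device owner could re-compute the hash of their own address and search for it. A minimal sketch, assuming salted SHA-256; the function name and salt value are hypothetical:

```python
import hashlib

def anonymize_mac(mac: str, salt: str = "republica-2013") -> str:
    """Deterministically hash a MAC address (hypothetical scheme).

    The salted digest hides the original address in the published
    data set, but the same MAC always maps to the same token, so
    owners could still look up their own device.
    """
    # Normalize notation so "00-1A-.." and "00:1a:.." hash identically.
    normalized = mac.lower().replace("-", ":")
    return hashlib.sha256((salt + normalized).encode()).hexdigest()[:16]

# Equivalent notations of one MAC yield the same anonymous token.
token_a = anonymize_mac("00:1A:2B:3C:4D:5E")
token_b = anonymize_mac("00-1a-2b-3c-4d-5e")
```

Note that with a publicly known salt this scheme is not irreversible in a strong sense: the MAC address space is small enough to brute-force, which is a well-known weakness of hash-based MAC anonymization.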
I suspect the setup is based on MagicMap, a free WiFi/Bluetooth tracking architecture developed at Humboldt University Berlin. Their wiki has some more information.
Xamarin, the company behind Mono, the .NET runtime for Linux, iOS, Mac OS and Android, has just announced that they have ported the Java part of Android to C# via machine translation. They claim some serious performance gains over Dalvik. For them this is an experiment they are not planning to focus on, but they will be using some of the technologies in Mono for Android. As part of the project they improved “Sharpen”, an automated Java-to-C# translator. Their version of Sharpen, along with the code of the Android port itself, is available on GitHub.
Reflection for Mac mirrors the screen of an iPad 2 or iPhone 4S on a Mac, e.g. for presentations. By default the image is scaled to 1280×720 px, but it is also possible to use the native device resolution. A detailed review of the application can be found here (in German).
It’s an augmented-reality, OCR-capable translation app, but that’s a poor description. A better one would be “magic.” Word Lens looks at any printed text through the iPhone’s camera, reads it, and translates between Spanish and English. That’s impressive enough on its own, and it does so in real time, but it also matches the color, font and perspective of the text and remaps the translation onto the image. It’s as if the world itself had been translated.