A VR HMD currently in development.
If you want to get started with coding for the Kinect and tend to use Microsoft's API instead of OpenNI, I highly recommend having a look at Microsoft's Channel 9 – Coding4Fun articles on the Kinect. There are lots of practical examples with source code and technical background info. For all the Webdudes out there, they even have examples for a Kinect WebSocket server: WebSocketing the Kinect with Kinection; Kinect, HTML5, WebSockets and Canvas.
Not all examples have been updated to the current Kinect for Windows SDK, which introduced a new interface for accessing the Kinect. So check the dependencies if you want to build one of them.
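The idea behind those WebSocket examples is simple: a server reads skeleton data from the Kinect and broadcasts it to browser clients, which draw it on a canvas. The sketch below shows the client-side half of that in Python; the joint names and JSON schema are invented for illustration, as the actual message format depends on the server implementation.

```python
import json

# Hypothetical message format -- the real schema depends on the server
# (e.g. the Coding4Fun Kinection examples use their own layout).
def parse_skeleton_frame(message):
    """Turn a JSON skeleton frame into a dict of joint name -> (x, y, z)."""
    frame = json.loads(message)
    return {joint["name"]: (joint["x"], joint["y"], joint["z"])
            for joint in frame["joints"]}

# Example frame as such a server might broadcast it to HTML5/canvas clients.
sample = ('{"joints": [{"name": "head", "x": 0.1, "y": 0.9, "z": 2.0},'
          '{"name": "handRight", "x": 0.4, "y": 0.5, "z": 1.8}]}')
joints = parse_skeleton_frame(sample)
print(joints["head"])  # -> (0.1, 0.9, 2.0); a client would draw this point
```

In a browser you would do the same parsing in the `onmessage` handler of a WebSocket and paint each joint onto the canvas every frame.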
Also worth a look is the Kinect for Windows blog.
Gaikai, another cloud gaming provider, has launched its service into beta. You can register on their page; after that you can play up to 20 PC and console games for free.
In Berlin the Computerspielemuseum has opened its doors. Housing one of the biggest gaming collections worldwide, it offers a retrospective on 50 years of gaming.
A first shot at implementing more complex interaction with the Kinect comes from the Institute for Creative Technologies at the University of Southern California.
They are developing the Flexible Action and Articulated Skeleton Toolkit (FAAST), which is middleware to facilitate integration of full-body control with games and VR applications. FAAST currently supports the PrimeSensor and the Microsoft Kinect using the OpenNI framework.
In this video, they show how FAAST can be used to control off-the-shelf video games such as World of Warcraft. Since these games would not normally support motion sensing devices, FAAST emulates keyboard input triggered by body posture and specific gestures. The controls can be dynamically configured for different applications and games.
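The core trick is straightforward: classify the tracked skeleton into a posture, then emit the key the game already understands. The sketch below illustrates that mapping; the joint layout, thresholds, and key bindings are made up for illustration and are not FAAST's actual configuration format.

```python
# A minimal sketch of the FAAST idea: map body postures to emulated key
# presses. Joints are given as name -> (x, y) in meters, as a skeleton
# tracker might report them; thresholds and bindings are invented here.

def posture_to_key(joints, lean_threshold=0.2):
    """Return the key a game should receive for the current posture."""
    head_x = joints["head"][0]
    spine_x = joints["spine"][0]
    if head_x - spine_x > lean_threshold:
        return "d"   # lean right -> strafe right
    if spine_x - head_x > lean_threshold:
        return "a"   # lean left -> strafe left
    if joints["handRight"][1] > joints["head"][1]:
        return "w"   # right hand raised above head -> move forward
    return None      # neutral posture -> no key press

# Player leaning to the right: head is offset from the spine on the x axis.
joints = {"head": (0.3, 0.9), "spine": (0.0, 0.5), "handRight": (0.2, 0.6)}
print(posture_to_key(joints))  # -> "d"
```

A real emulator would run this check every skeleton frame and inject the resulting key event at the OS level, which is why games that know nothing about motion sensing still respond.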