Unreal Engine 4 Demo

In an impressive real-time demo at the Game Developers Conference in Cologne, Epic technical artist Alan Willard showed off features of the upcoming Unreal Engine 4.


Coding for Kinect

If you want to get started with coding for the Kinect and tend to use Microsoft's API instead of OpenNI, I highly recommend having a look at Microsoft's Channel 9 – Coding4Fun articles on the Kinect. You'll find lots of practical examples with source code and technical background information there. For all the web devs out there, they even have examples for a Kinect WebSocket server: WebSocketing the Kinect with Kinection; Kinect, HTML5, WebSockets and Canvas.

Not all examples have been updated to the current Kinect for Windows SDK, which introduced a new interface for accessing the Kinect. So check the dependencies before you try to build one.
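To give a rough idea of what such a Kinect WebSocket server pushes to browser clients: a sketch of serializing one tracked skeleton frame to JSON. The joint names, the `encode_skeleton_frame` helper, and the message layout here are illustrative assumptions, not the actual wire format used by the Kinection or Channel 9 examples.

```python
import json

def encode_skeleton_frame(tracking_id, joints):
    """Serialize one tracked skeleton to a JSON string, suitable for
    pushing to HTML5/Canvas clients over a WebSocket connection.

    joints: dict mapping a joint name to an (x, y, z) position in meters,
    loosely modeled on the Kinect SDK's skeleton stream.
    """
    return json.dumps({
        "trackingId": tracking_id,
        "joints": [
            {"name": name, "x": x, "y": y, "z": z}
            for name, (x, y, z) in joints.items()
        ],
    })

# Example frame with two joints of a tracked user
frame = encode_skeleton_frame(1, {
    "head": (0.0, 0.9, 2.1),
    "hand_right": (0.4, 0.5, 1.8),
})
print(frame)
```

On the browser side, a Canvas client would parse each message with `JSON.parse` and draw the joint positions, which is essentially what the linked HTML5/Canvas example does.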

Also worth a look is the Kinect for Windows blog.


A Glimpse Of The New Unreal Engine

What will video games look like in the near future? Epic Games offered a glimpse at the future of graphics in the form of a real-time animation demo that showed off the eye-popping capabilities of the upgraded Unreal Engine 3.

[via wired.com]


Controlling WoW with a Kinect

A first shot at implementing more complex interaction with the Kinect by the Institute for Creative Technologies at the University of Southern California.

They are developing the Flexible Action and Articulated Skeleton Toolkit (FAAST), which is middleware to facilitate integration of full-body control with games and VR applications. FAAST currently supports the PrimeSensor and the Microsoft Kinect using the OpenNI framework.

In this video, they show how FAAST can be used to control off-the-shelf video games such as World of Warcraft. Since these games would not normally support motion sensing devices, FAAST emulates keyboard input triggered by body posture and specific gestures. The controls can be dynamically configured for different applications and games.
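The idea behind this kind of input emulation can be sketched in a few lines: classify a pose from joint positions, then look up the key to emulate in a configurable binding table. The pose names, thresholds, and key map below are made up for illustration; FAAST's actual gesture definitions and configuration format differ.

```python
def detect_pose(joints):
    """Classify a simple pose from joint positions (x, y in meters):
    'lean_left' or 'lean_right' when the head is offset sideways from
    the spine by more than a threshold, otherwise 'neutral'."""
    dx = joints["head"][0] - joints["spine"][0]
    if dx < -0.2:
        return "lean_left"
    if dx > 0.2:
        return "lean_right"
    return "neutral"

# Dynamically configurable mapping from pose to emulated game key,
# e.g. strafing left/right in World of Warcraft
KEY_BINDINGS = {"lean_left": "a", "lean_right": "d"}

def key_for_frame(joints):
    """Return the key to emulate for this skeleton frame, or None."""
    return KEY_BINDINGS.get(detect_pose(joints))

print(key_for_frame({"head": (-0.3, 1.6), "spine": (0.0, 1.0)}))  # -> a
```

In a real system, the returned key would be injected as a synthetic keyboard event so that the game, which knows nothing about motion sensing, simply sees ordinary key presses.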

FAAST is free software built on the OpenNI framework and is currently being prepared for an open-source release. You can download FAAST at http://people.ict.usc.edu/~suma/faast/
