I made my first OpenNI skeleton tracking application this weekend. This was an interesting challenge for a number of reasons. Here's a quick video of the app in use (sorry for the low quality - I'm still learning the new Camtasia app on the Mac):
I will detail the challenges in a follow-up post, as I fully intend to continue development and release the code for everyone to use. In a nutshell, though, the problems revolved around:
- The OpenNI libraries are C/C++ with a thin .NET wrapper, and the wrapper seems to throw a lot of exceptions
- The .NET examples that do exist are either console or WinForms apps, not WPF
- And honestly, the examples are not very useful
The video above doesn't show a lot of what I've put into the app. I'm creating a reusable WPF control that can be used in any application to display skeleton information, and I plan to build other controls as well, such as one for hand tracking. I'm also wrapping the OpenNI .NET library in an injectable service that exposes higher-level constructs, such as gesture recognition backed by various algorithms.
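To give a feel for the shape of that service, here is a rough sketch of the contract the WPF control might bind to. All of the names here (ISkeletonService, SkeletonFrame, and so on) are placeholders I'm using for illustration, not the final API.

```csharp
using System;

// Hypothetical data shapes the service would publish to the WPF control.
public class JointPosition
{
    public string JointName { get; set; }   // e.g. "Head", "LeftHand"
    public double X { get; set; }
    public double Y { get; set; }
    public double Z { get; set; }
    public float Confidence { get; set; }
}

public class SkeletonFrame
{
    public int UserId { get; set; }
    public JointPosition[] Joints { get; set; }
}

public class SkeletonFrameEventArgs : EventArgs
{
    public SkeletonFrame Frame { get; set; }
}

// The WPF control subscribes to this service instead of talking to the
// OpenNI .NET wrapper directly, which keeps the OpenNI exceptions and
// threading details out of the view layer.
public interface ISkeletonService
{
    event EventHandler<SkeletonFrameEventArgs> SkeletonUpdated;
    void Start();
    void Stop();
}
```

The point of the indirection is that the control only ever sees simple frame objects, so it can be dropped into any WPF application without pulling in the OpenNI plumbing.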
One of the next extensions I plan to add, after which I'll release the code, is a Dynamic Time Warping algorithm for gesture recognition. I dug up some code for this over the weekend (which I'll credit when I implement it), and it looks really promising for complicated gesture recognition.
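For anyone unfamiliar with the technique, here is a minimal sketch of classic DTW, assuming gestures are recorded as sequences of 3D joint positions. This is just the textbook algorithm for illustration; the code I found uses a more refined variant.

```csharp
using System;

public static class Dtw
{
    // Euclidean distance between two 3D samples (x, y, z).
    private static double Distance(double[] a, double[] b)
    {
        double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return Math.Sqrt(dx * dx + dy * dy + dz * dz);
    }

    // Classic O(n*m) DTW: cost[i, j] holds the cheapest alignment of the
    // first i samples of 'candidate' with the first j samples of 'template'.
    // A lower final cost means the two gestures are a closer match, even if
    // they were performed at different speeds.
    public static double Cost(double[][] candidate, double[][] template)
    {
        int n = candidate.Length, m = template.Length;
        var cost = new double[n + 1, m + 1];

        for (int i = 0; i <= n; i++)
            for (int j = 0; j <= m; j++)
                cost[i, j] = double.PositiveInfinity;
        cost[0, 0] = 0;

        for (int i = 1; i <= n; i++)
        {
            for (int j = 1; j <= m; j++)
            {
                double d = Distance(candidate[i - 1], template[j - 1]);
                cost[i, j] = d + Math.Min(cost[i - 1, j],        // insertion
                                 Math.Min(cost[i, j - 1],        // deletion
                                          cost[i - 1, j - 1]));  // match
            }
        }
        return cost[n, m];
    }
}
```

A recognizer built on this would keep a library of recorded gesture templates and report the template with the lowest cost, subject to some rejection threshold.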
As this evolves, I see building on top of the gesture recognition, in the aforementioned injectable service, a means of routing recognized gestures into the application so that it can respond to them. This would allow for quite a number of interesting interactions, such as swipes between screens and graphs, zooms and pans, grabbing and dragging screen elements, and many more I'm sure I've not even come up with yet.
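One way that routing could look is a simple map from gesture names to WPF commands, so views respond declaratively. Again, the names here (GestureRouter, GestureRecognizedEventArgs) are placeholders I'm sketching with, not a committed design.

```csharp
using System;
using System.Collections.Generic;
using System.Windows.Input;

public class GestureRecognizedEventArgs : EventArgs
{
    public string GestureName { get; set; }   // e.g. "SwipeLeft", "ZoomIn"
}

public class GestureRouter
{
    // Map gesture names to WPF commands the application exposes.
    private readonly Dictionary<string, ICommand> _bindings =
        new Dictionary<string, ICommand>();

    public void Bind(string gestureName, ICommand command)
    {
        _bindings[gestureName] = command;
    }

    // Called from the recognition service's event handler whenever a
    // gesture is detected; executes the bound command if one exists.
    public void Route(GestureRecognizedEventArgs e)
    {
        ICommand command;
        if (_bindings.TryGetValue(e.GestureName, out command) &&
            command.CanExecute(null))
        {
            command.Execute(null);
        }
    }
}
```

Because the routing goes through ICommand, the same gestures could drive view models that already respond to mouse or keyboard input, with no gesture-specific code in the views.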
I honestly feel that with this type of architecture, a lot of what is seen in Minority Report may be possible in today's applications. To see what I'm talking about, follow this link to my write-up on the NUX of Minority Report. I actually think much of this can be done without the "gloves" that he uses. Some gestures, such as hand rotation for fast forward, may not be possible, but for that I have also developed infrared "gloves" that can be monitored simultaneously with a WiiMote. I'll write that up soon too.