KinectCommander

Here we highlight the projects that members of the lab are working on.

KinectCommander – NUI hybrid with Visual Studio 2012/2013

Project Lead: Den Delimarsky

Project Members: George Swartzendruber


Looking at the evolution of integrated development environments (IDEs), it is easy to notice that as more functionality is added, it becomes less discoverable for both new and professional developers. Beyond that, some capabilities are used far more frequently than others, yet remain buried deep in the tree-based menu system. KinectCommander is built to simplify and speed up access to a wide variety of IDE capabilities by providing a direct communication layer via Natural User Interface (NUI) interactions.

At the foundation of the system is a single hardware component – the Microsoft Kinect sensor. With the help of the public Windows-based SDK, we are able to read and identify a variety of indicators and signals, such as facial movement, hand gestures, and voice commands. These are then processed by an intermediary layer developed at SERL and passed to Visual Studio as individual actions. For example, a “debug” voice command will launch the debugger for the project in its current state, and slight head movement can be used to scroll the view frame. To ensure that the system suits a variety of conditions and preferences, it is built on top of an extensible platform where developers can define custom commands and gestures to fit their individual needs, as sketched below.
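To illustrate the idea of the intermediary layer, here is a minimal, hypothetical sketch of how recognized voice commands or gesture identifiers could be routed to IDE actions. The names (CommandRouter, IdeAction, the trigger strings, and the printed command names) are illustrative assumptions, not the project's actual API or the Kinect SDK.

```cpp
// Hypothetical command-mapping layer: recognition results in, IDE actions out.
// This is a sketch under stated assumptions, not the project's implementation.
#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>
#include <utility>

// An IDE action is modeled as a callback that would forward a request
// (e.g. a Visual Studio command) to the IDE.
using IdeAction = std::function<void()>;

class CommandRouter {
public:
    // Bind a recognized voice command or gesture identifier to an action.
    // This is where developer-defined custom commands could be plugged in.
    void bind(const std::string& trigger, IdeAction action) {
        bindings_[trigger] = std::move(action);
    }

    // Dispatch a recognition result coming from the sensor layer.
    void dispatch(const std::string& trigger) const {
        auto it = bindings_.find(trigger);
        if (it != bindings_.end()) {
            it->second();
        } else {
            std::cout << "Unrecognized command: " << trigger << '\n';
        }
    }

private:
    std::unordered_map<std::string, IdeAction> bindings_;
};

int main() {
    CommandRouter router;

    // "debug" launches the debugger; "scroll down" nudges the view frame.
    router.bind("debug",       [] { std::cout << "Start the debugger\n"; });
    router.bind("scroll down", [] { std::cout << "Scroll the editor view down\n"; });

    // In the real system these triggers would come from Kinect speech
    // and skeleton/face tracking events rather than hard-coded strings.
    router.dispatch("debug");
    router.dispatch("scroll down");
    return 0;
}
```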

The core benefits of KinectCommander are:

  • Increased efficiency: focus on the code and streamline additional procedures, such as debugging or profiling.
  • Get things done in a more natural way: use your voice and gestures to control the flow of your program and easily find out what the IDE has to offer to accomplish your tasks.
  • Empower developers: a physical disability should not be a barrier for someone who wants to code. KinectCommander ensures that anyone is able to write code with minimal physical effort.

The project was started under an Undergraduate Research Grant at Wichita State University in August 2011.