GAIM Developer Documentation

The General Active Input Model (GAIM) framework provides utilities for developers of active/exercise games. This page is provided as a resource for GAIM developers.

Classifying Input For Active Games

Many active games are developed around specific hardware platforms, such as accelerometers, camera input, or pedal and steering input. Stach and Graham reviewed 114 active games in order to classify the input techniques found throughout these games, abstracted away from the input hardware. Analogous classifications have previously been carried out for traditional input techniques (mouse, keyboard) and for tabletop computing. The hardware-independent input techniques identified are power, pointing, stance, gesture, tap, and continuous control. These can be used in conjunction with traditional input (mouse, keyboard, gamepad, or joystick). Many existing hardware-dependent games could run on alternative hardware if they were built on a toolkit that abstracts the input technique from the hardware device being used. Such a toolkit would allow game developers to focus on game content and story rather than on low-level input peripherals, would make games more portable, and would allow people with different hardware to play together.

Levels of Abstraction

Input can be abstracted from specific devices in two stages. Device-specific data is first abstracted to a level at which it could originate from any of several devices in the same class. That data is then abstracted again to the level of input techniques, each of which can be realized by a variety of device classes.
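
For example, heart rate data from a Tunturi E6R (device layer) is exposed through the IHRMonitor interface (abstract device layer) and consumed by the TargetHRPower class to provide power input (input layer).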

Device Layer

At the device level, input data is collected from individual devices, implementing the interfaces described at the abstract device level. An example is the Tunturi E6R, which implements the IBike and IHRMonitor interfaces, providing properties such as power generated (in watts) and current heart rate. The device level is a thin layer above any available proprietary third-party or existing APIs for the supported devices.

Abstract Device Layer

The abstract device level refers to broad classes of input devices, i.e. devices with a substantial amount of shared functionality. These include exercise equipment (bikes, treadmills, etc.), accelerometers, cameras, exertion or resistance devices, pads, and mats. For example, Bike input can refer to many instances of exercise bicycle hardware, such as the Tunturi E6R or the FitXF PCGamerBike Mini. Camera input can represent input from standard webcams, Sony EyeToy cameras, or Project Natal input.

Input Layer

The input level refers to high-level, hardware-independent (abstract) input in active games. This level includes the aforementioned input types: gesture, stance, pointing, power, continuous control, and tap.

General Active Input Model (GAIM) Framework

The General Active Input Model (GAIM) framework incorporates the six aforementioned input techniques for use in active games, with an architecture that abstracts low-level device code and device class information, enabling game developers to design around forms of input rather than specific input devices. This framework will make games more portable, and allow developers to focus on game content and story rather than low-level input. It will also allow people with different devices to play together.

In the following sections, each of the input techniques supported by the framework is described in detail. Abstract and concrete devices are used by calling methods of GAIM's Device Manager utility, which relies upon a user-generated configuration text file indicating which devices are available and connected.

The GAIM framework is written in Visual C# / XNA Studio 3.0 (except where otherwise noted), allowing for easy integration with existing XNA-based games.

If you are a game developer intent on using GAIM, please refer to the Developing Games with GAIM wiki page.

Power / IPower

Power input refers to physical exertion produced by the player, typically measured in watts generated, as a function of cadence (when using an exercise bicycle, treadmill, rowing machine, or other exercise device), or as a function of heart rate. Power input can be mapped directly to game behaviour; in a racing game, for example, greater wattage generated or a higher heart rate results in faster speed.

The IPower interface supplies one property, Power. An abstract device which implements this interface may represent a variety of device classes, such as exercise bicycles, heart rate monitors, or other fitness/exertion devices. Currently, the device classes supported by the framework are exercise bicycles and heart rate monitors. Ideally, the sole property (Power : double) would be measured in watts generated. Ongoing research is being carried out to convert data from devices which do not directly report power generated; at present, some devices report power as a function of pedal cadence (for bicycles) or heart rate (for heart rate monitors).

// properties of IPower
double Power; // current power generated in watts
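
A minimal sketch of how this might be declared as a C# interface (the declaration below is an assumption; only the Power property is documented here):

// minimal sketch; assumes IPower is declared as a plain .NET interface
public interface IPower
{
    double Power { get; } // current power generated, in watts
}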

BikePower

A BikePower device is an input layer device representing all exercise bicycles with shared functionality (such as pedal cadence). BikePower implements the IPower interface, providing the one property Power, which, depending upon the device, is either measured directly in watts generated or computed as a function of pedal cadence and pedal tension. BikePower creates and calls into an IBike device at the abstract device layer.
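
The cadence-based fallback might look something like the following sketch; the conversion constant below is a hypothetical placeholder, not the framework's actual formula (which is the subject of ongoing research):

// illustrative sketch only; CADENCE_TO_WATTS is a hypothetical placeholder constant
public double Power
{
    get
    {
        if (bike.Power > 0)
            return bike.Power; // device reports watts directly
        // otherwise estimate power from cadence and tension
        return bike.PedalCadence * bike.PedalTension * CADENCE_TO_WATTS;
    }
}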

IBike

The IBike interface describes properties common to exercise bicycles, such as pedal power, pedal cadence, pedal tension, and pedal distance. The interface also defines properties for device status and type. While not all bicycles report the amount of power generated by the user, a goal of the current research is to determine an equivalent measure of power generated based on cadence and tension. In addition, some devices do not report tension; the PCGamerBike, for instance, allows for manual manipulation of tension, but not programmatic manipulation or observation.

//properties of IBike
int DeviceStatus;     // status of bike (1 if connected, 0 otherwise)
string DeviceType;    // name of device
double Speed;         // pedal speed in mph
double PedalCadence;  // pedal cadence in rpm
double Power;         // power generated in watts
double PedalDistance; // distance traveled
int PedalTension;     // current tension

PCGamerBike

The FitXF PCGamerBike Mini is a compact exercise bicycle - or rather, a pair of pedals - equipped with a USB connection and a text display; it can be pedaled easily by placing it in front of a desk chair. When the device is connected, the FitXF server software can read data from it.

The PCGamerBike class implements the IBike and IDirection interfaces. This class retrieves data from the FitXF server using an unmanaged FitXF .dll by way of a managed C++ layer. The FitXF server, which comes included with FitXF hardware, provides several utilities to the developer and the user. For the user, a configurable GUI can display bike information in real time as the user pedals, including pedal direction, cadence, and distance traveled. The GUI also allows for customized key mappings (for use in games), enabling the user to map forward pedaling to the forward keystroke and backward pedaling to the backward keystroke.

FitXF also provides a C++ .dll which gives the software developer methods that call into the FitXF server and retrieve device data. Substantially more data is available through the .dll than from the GUI, including device status functions, interface functions, and bike data retrieval functions. A managed C++ layer retrieves device data from this unmanaged C++ .dll and provides it to the PCGamerBike class. Not all functions available in the .dll are used; the functions used are those which satisfy the requirements of the IBike and IDirection interfaces.
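
As an illustration of thin interop over an unmanaged .dll, a comparable layer could also be sketched in C# via P/Invoke; note that GAIM itself uses a managed C++ layer instead, and the export name below is hypothetical, not part of the actual FitXF API:

using System.Runtime.InteropServices;

class FitXFNative
{
    // hypothetical export name, for illustration only
    [DllImport("FitXF.dll")]
    public static extern double fitxf_GetCadence();
}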

TunturiBike

The Tunturi E6R is a fully-featured exercise bicycle with a graphical display and a COM port connection, which transfers bike data to a connected computer with a small delay. It also features a wireless heart rate monitor signal receiver.

The TunturiBike class implements the IBike and IHRMonitor interfaces. The class retrieves data from the bike via a managed C++ layer over an existing unmanaged C++ API written for accessing bike data. The CAXInput library is part of the larger CAXCore API, written in C++ by former EQUIS researcher Will Roberts for the purpose of developing exercise video games with the Tunturi E6R.

The TunturiManaged project component of the GAIM framework uses a subset of source and header files from this API, namely those pertaining to bicycle I/O. This includes the low-level and high-level I/O functions described in CAXLogger.{h/cpp}, InputTunturi.{h/cpp}, InputSerial.{h/cpp}, and InputBicycle.{h/cpp}. InputBicycle contains high-level bicycle I/O functions such as getPowerOutput() and setTension(int t), and depends on the low-level functions defined in the other three files. A managed C++ layer retrieves device data from the unmanaged C++ CAXInput API and provides it to the TunturiBike class. Not all functions available in the API are used; the functions used are those which satisfy the requirements of the IBike and IHRMonitor interfaces.

IDirection

The IDirection interface defines properties of devices with directional control, such as the PCGamerBike and its ability to recognize in which direction the user is pedalling. The precise definition of the IDirection interface and how it is implemented is still largely undetermined, and will likely be established gradually as the GAIM framework is used to develop games with non-active game peripherals (such as gamepad controllers with directional joysticks) or active game controllers containing directional 3D accelerometers, such as the Wiimote.

TargetHRPower

TargetHRPower implements the IPower interface, providing one property, Power. A TargetHRPower device retrieves a user's heart rate by making use of an IHRMonitor device (e.g. an HRMI or TunturiBike; see below). Given the user's heart rate, along with static properties for the user's age and resting heart rate, power can be calculated from the difference between the user's heart rate and their target heart rate, based on the target heart rate formulas defined in the next section.

// properties of TargetHRPower
double HR; // user's current heart rate
double Power; // power/speed as a function of current HR, age, resting HR
static double MaxSpeed; // top speed allowed in the game
static int Age; // user's age in years
static double RestingHR; // user's resting HR in bpm

Target HR calculations

When a player is at rest, the scaled, normalized heart rate is reported as 0; when a player reaches his/her target heart rate, it is reported as 1. Target heart rate is calculated from a user's resting heart rate and maximum heart rate. Maximum heart rate is roughly equivalent to 207 - (0.7 * age). Reserve heart rate is maximum heart rate minus resting heart rate. Target heart rate is then calculated as the intensity (set to 70% by default, for the aerobic zone) multiplied by the reserve heart rate, plus the resting heart rate. The reported heart rate is then scaled and normalized between the resting and target heart rates. Power is calculated from this value, multiplied by the maximum possible speed allowed by the game (a static property). The target heart rate calculations as described by Stach et al. make use of a small nimbleness factor, which reflects the current cadence of the player. As this does not always apply when different devices are in use, nimbleness is ignored in the GAIM framework.
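
For example, a 23-year-old player with a resting heart rate of 60 bpm has a maximum heart rate of roughly 207 - (0.7 * 23) ≈ 190.9 bpm, giving a reserve of about 130.9 bpm; at the default 70% intensity, the target heart rate is 0.7 * 130.9 + 60 ≈ 151.6 bpm.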

  • Stach, T., Graham, T.C.N., Yim, J., and Rhodes, R. Heart rate control of exercise video games. Proceedings of Graphics Interface 2009.
double hr = 0;
double reserveHR = (207 - (0.7 * age)) - restingHR; // maxHR = 207 - (0.7 * age)
double targetHR = (intensity * reserveHR) + restingHR; // intensity defaults to 0.7 (aerobic zone)
if (hrMon.HR > restingHR)
{
   hr = (hrMon.HR - restingHR) / (targetHR - restingHR); // normalized HR
   hr = Math.Log10((10 - 1) * hr + 1); // scaled, normalized HR
}
return hr; // scaled, normalized HR (0 at rest, 1 at target heart rate)

IHRMonitor

The IHRMonitor interface defines one property, HR; classes which implement IHRMonitor represent devices able to report a user's heart rate in beats per minute (bpm).

// properties of IHRMonitor
double HR; // current HR (bpm)

HRMI

The HRMI class implements the IHRMonitor interface, polling an HRMI device and reporting heart rate data held in the device's buffer. The class uses low-level I/O in a separate thread to poll the device and check the buffer for current heart rate data. The last 15 heart rate values are collected (one sample every 150 ms, the approximate minimum polling time allowed by the device), then averaged for a smooth current heart rate value.

The HRMI itself is a small heart rate monitor interface manufactured by SparkFun. The board is embeddable and connects over USB, appearing to the PC as a virtual COM port. The device receives ECG signals from a Polar Electro heart rate monitor (HRM) transmitter and converts them into easy-to-use heart rate data. A blinking LED indicates that a signal is being received, blinking in rhythm with the user's heart rate. When the device is being polled for data, separate LED indicators flash at the polling rate, which is set to once every 150 ms.

Asynchronously polling the HRMI:

  • creating the port object
port = new SerialPort(portName, BAUD_RATE); // e.g. "COM3", 9600
  • initializing the object
port.DataReceived += new SerialDataReceivedEventHandler(SerialEvent);
OpenSerial();
queryThread = new Thread(new ThreadStart(SendQuery));
threadBegin = true;
queryThread.Start();
Thread.Sleep(50);
  • querying the device with SendQuery:
do
{
   QueryHRData(HR_REQ);
   Thread.Sleep(WAIT_PERIOD);
   if (validData)
      UpdateHR(HaveResponse());
} while (threadBegin);
  • sending the Query with QueryHRData:
if (numVals > MAX_REQ_HR)
   reqVals = MAX_REQ_HR;
else
   reqVals = numVals;
try
{
   // convert required values to necessary byte format
   int req1 = reqVals / 10 + 48;
   int req2 = reqVals % 10 + 48;
   // assemble message into array of bytes, terminating in CR
   byte[] newMsg = { (byte)'G', (byte)req1, (byte)req2, (byte)CR };
   // send message to the HR device
   port.Write(newMsg, 0, newMsg.Length);
}
catch (Exception)
{
   validData = false; // port write failed; no valid data this cycle
}
  • the port's SerialEvent:
int i = 0;
int argIndex = 0;
int charIndex = 0;
// initialize array for values held in buffer
for (i = 0; i < reqVals + 2; i++)
   rspArgArray[i] = 0;
// ensure data is available to read before processing
// read until CR encountered in buffer
do
{
   // read a byte from the buffer
   rspCharArray[charIndex] = port.ReadByte();
   // accumulate the digits of the current value; a space separates values
   if (rspCharArray[charIndex] != (byte)' ')
      rspArgArray[argIndex] = (rspArgArray[argIndex] * 10) + (rspCharArray[charIndex] - ((byte)'0'));
   else
      argIndex++;
   charIndex++;
} while (rspCharArray[charIndex - 1] != (byte)CR);
// set valid data flag
validData = true;
  • update the properties of the HRMI with valid data (UpdateHR):
if (haveResponse)
{
   double temp = 0;
   // get most current values from argument array
   curFlags = rspArgArray[0];          // current flags (mode: 0 | 1)
   curSampleCount = rspArgArray[1];    // current sample count (s)
   int numVals = 10;
   int i;
   
   // average last 10 incoming HR values
   // (the recency weighting factor below is currently disabled)
   for (i = 11; i > 1; i--)
      temp += rspArgArray[i]; // *((numVals - i) / numVals); // weighting factor (disabled)
   curHrVal = temp / numVals; // divide by number of HR values
}
  • properties of the HRMI:
double HR; // current HR (bpm)
static string PortName; // port to which the HRMI is connected

TunturiBike (HR Monitor)

The TunturiBike class also implements the IHRMonitor interface, in addition to the IBike interface (described above). Beyond its bicycle properties, the bike also reports the user's heart rate, thus satisfying the requirements of the IHRMonitor interface.

The Tunturi E6R can receive heart rate data from the Polar wireless heart rate monitor, worn around the user's chest. As with the bicycle's other data, there is a lag of a few seconds before the data is sent to the PC.

Point / IPoint

Pointing refers to directing attention to an on-screen entity by aiming a finger, hand, or hand-held peripheral. Cameras, such as the Sony EyeToy, and infrared receivers, such as the Wiimote, are traditionally used to handle this form of input.

The IPoint interface contains two properties relating to the on-screen position to which the user is pointing:

float XCoordinate;
float YCoordinate;

WiimotePoint

The WiimotePoint class implements the IPoint interface, reporting IR pointing using the Wiimote and an IR-emitting device. WiimotePoint makes use of the WiimoteLib library - developed as part of Microsoft's Coding4Fun initiative (License: Microsoft Public License (Ms-PL)).

Initialization:

WiimoteCollection wiimoteCollection = new WiimoteCollection();
wiimoteCollection.FindAllWiimotes();

Connecting each wiimote (i):

Wiimote wiimote = wiimoteCollection[i];
wiimote.WiimoteChanged += wm_WiimoteChanged;
wiimote.WiimoteExtensionChanged += wm_WiimoteExtensionChanged;
wiimote.SetReportType(InputReport.IRAccel, true);
// connect the wiimote
wiimote.Connect();

wm_WiimoteChanged event handler (update properties of IPoint):

WiimoteState ws = args.WiimoteState;
ws.IRState.Mode = IRMode.Extended;
// update IR midpoint info on wiimote changed event
xCoordinate = ws.IRState.IRSensors[0].Position.X;
yCoordinate = ws.IRState.IRSensors[0].Position.Y;

From an application using an IPoint device, with a config file containing a WiimotePoint device, one can asynchronously poll the device to determine the current pointing coordinates. Note that an IR transmitter placed immediately above or below the screen is required for pointing with the Wiimote.
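
A minimal polling sketch (assuming the config file lists a WiimotePoint device):

// minimal sketch: poll the current pointing position from a game loop
DeviceManager.generateList();
IPoint pointer = DeviceManager.getIPoint();
float x = pointer.XCoordinate; // current on-screen x position
float y = pointer.YCoordinate; // current on-screen y position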

CameraPoint

Pointing with a camera remains to be implemented. Finger or pointer recognition is required to establish the position on-screen being pointed to.

Stance / IStance

Stance input tracks a user's physical position (i.e. the placement of limbs) and centre of gravity. Hardware devices such as the Wii Balance Board or the Dance Dance Revolution mat accomplish this task. Stance input can be tracked using a range of physical devices, from mats and pads to cameras and accelerometers.

Properties of stance:

// the stance area may be a pressure-sensitive mat, a Wii Balance Board, or any other device capable of determining stance within the user's area
float CenterGravityX; // user's center of gravity x position over stance area
float CenterGravityY; // user's center of gravity y position over stance area
float Weight; // total weight in lb in stance area
float BottomLeft; // weight in lb (bottom left quadrant of stance area)
float BottomRight; // weight in lb (bottom right quadrant of stance area)
float TopLeft;  // weight in lb (top left quadrant of stance area)
float TopRight; // weight in lb (top right quadrant of stance area)
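
As a hypothetical usage sketch, a balance game might map the centre of gravity to steering:

// hypothetical usage sketch of IStance
DeviceManager.generateList();
IStance stance = DeviceManager.getIStance();
// shifting weight left/right moves the centre of gravity, steering the vehicle
float steering = stance.CenterGravityX;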

WiiStance

The WiiStance class implements the IStance interface, reporting stance information using the Wii Balance Board. WiiStance makes use of the WiimoteLib library - developed as part of Microsoft's Coding4Fun initiative (License: Microsoft Public License (Ms-PL)).

Initialization:

WiimoteCollection wiimoteCollection = new WiimoteCollection();
wiimoteCollection.FindAllWiimotes();

for each wiimote device (i):

Wiimote wiimote = wiimoteCollection[i];
// connect the wiimote
wiimote.Connect();
// if the device is a balance board, use it
if (wiimote.WiimoteState.ExtensionType != ExtensionType.BalanceBoard)
   wiimote.Disconnect(); // disconnect device if not a balance board

wm_WiiBalanceBoardChanged event (update properties of IStance):

// current state information
WiimoteState ws = args.WiimoteState;
// get information from balance board
centerGravityX = ws.BalanceBoardState.CenterOfGravity.X;
centerGravityY = ws.BalanceBoardState.CenterOfGravity.Y;
weightKg = ws.BalanceBoardState.WeightKg;
bottomLeftKg = ws.BalanceBoardState.SensorValuesKg.BottomLeft;
bottomRightKg = ws.BalanceBoardState.SensorValuesKg.BottomRight;
topLeftKg = ws.BalanceBoardState.SensorValuesKg.TopLeft;
topRightKg = ws.BalanceBoardState.SensorValuesKg.TopRight;

Gesture / IGesture

Gesture input tracks the movement of a user's limbs, head, or body. The location of the body and its orientation are irrelevant to gesture recognition. Gestures represent commands, not real-time control (and are thus not to be confused with continuous control input). Gesture recognition can be carried out with accelerometer-equipped peripherals (such as the Nintendo Wiimote), infrared-emitting peripherals (such as the forthcoming Sony motion controller), or camera input (such as the upcoming Microsoft Project Natal system). The common backbone of these gesture recognition systems is the use of machine learning algorithms and probability models, such as hidden Markov models, nearest-neighbour classifiers, neural networks, or support vector machines, to train on and recognize gestures.

The IGesture interface contains the following properties:

string GestureName: the name of the most recently detected gesture
double Confidence: the confidence of the most recently detected gesture (probability of the gesture occurring)
bool GestureIsNew: flag indicating whether a new gesture has been recorded since the last polling
DateTime GestureTime: the time of detection of the most recently detected gesture
string GestureJobName: the set of gestures currently being used in online classification

The IGesture interface also contains a function, RemoveAllGestureJobs, for removing all currently loaded gesture sets from online classification.

Useful gesture recognition references

The following references are especially useful for those requiring an introduction to the machine learning algorithms used in modern gesture recognition toolkits.

  • Russell, S. & Norvig, P. Artificial Intelligence: A Modern Approach, Second Edition. Chp. 15 - Probabilistic Reasoning Over Time. Pearson Education, USA, 2003. p. 537-551.
    • introduction to temporal probability models (time and uncertainty) - stationary processes and the Markov assumption, states and observations;
    • inference in temporal models - filtering, prediction, smoothing (backward-forward algorithm), most likely explanation;
    • hidden Markov models / HMMs
  • Rabiner, L. R. & Juang, B. H. An Introduction to Hidden Markov Models. IEEE ASSP Magazine, January 1986, p. 4-16.
    • elements of HMMs;
    • problems solved using an HMM;
      • how to compute the probability of an observation sequence based on the properties of a model?
      • given an observation sequence, what is the optimal (and meaningful!) hidden state sequence?
      • how to adjust and refine the model parameters?
  • Whitehead, A. & Fox, K. Device Agnostic 3D Gesture Recognition using Hidden Markov Models. In Proceedings of FuturePlay 2009, p. 29-30.
    • sensor-system with 3-axis accelerometer, positional, and gyroscopic data;
    • probability distributions generated using a training process;
    • discrete segmentation of the sensor's range, with granularity dependent on the precision of the sensors being used;
    • decomposition of 3D space into 27 subspaces (3 subspaces in each dimension), thus a 27-state HMM;
  • Oka, K., Sato, Y., & Koike, H. Real-Time Fingertip Tracking and Gesture Recognition. IEEE Computer Graphics and Applications, November / December 2002, p. 64-71.
    • fingertip tracking using Kalman filter;
    • use of HMM to recognize segmented symbol gestures, recognizing multiple fingertip trajectories;
  • Segen, J. and Kumar, S. Fast and Accurate 3D Gesture Recognition Interface. In Proceedings of the 14th ICPR. IEEE Computer Society, Washington, DC, August 1998.
    • use of 2 cameras and 3d pose estimation algorithms for gesture recognition of 3 gestures;
    • involves image segmentation, finite state classification based on # of peaks and valleys in gesture image;
  • Keskin, C., Erkan, A., & Akarun, L. Real time hand tracking and 3D gesture recognition for interactive interfaces using HMM. In Proceedings of the Joint International Conference ICANN/ICONIP 2003. Springer.
    • use of colored glove and webcam for gesture recognition system;
    • Kalman filter used for noise reduction, a neural network for gesture recognition;
    • main concerns include spatiotemporal variability and segmentation ambiguity of natural gestures;
    • use of an ergodic adaptive threshold model for recognizing gestures apart from unrelated non-gesture hand movements;
  • Lee, H. & Kim, J. H. An HMM-Based Threshold Model Approach for Gesture Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 21, No. 10, October 1999, p. 961 - 973.
    • another adaptive threshold model for gesture recognition which deals with infinite non-gesture patterns;
    • dynamic programming approach and use of Viterbi algorithm to determine complete gesture, ignoring embedded gestures and non-gestures;

Gesture recognition toolkit resources:

  • GT2K - Device Agnostic Java-based 3D gesture recognition toolkit / Georgia Tech Toolkit (Contextual Computing Group at Georgia Tech)
    • Tracy Westeyn, Helene Brashear, Amin Atrash and Thad Starner. Georgia Tech Gesture Toolkit: Supporting Experiments in Gesture Recognition. In the International Conference on Perceptive and Multimodal User Interfaces 2003.
      • GT2K provides tools for novices to automatically generate gesture recognition models, while still providing users with domain specific knowledge about the types of gestures generated;
      • supports cameras, accelerometers, microphones, laser range finders, motion capture devices (in theory) - anything that produces a set of feature vectors;
      • made obsolete by GART;
  • GART - Gesture and Recognition Toolkit (Java) (Contextual Computing Group at Georgia Tech)
    • Kent Lyons, Helene Brashear, Tracy Westeyn, Jung Soo Kim, & Thad Starner. GART: The Gesture and Activity Recognition Toolkit. In the 12th International Conference on Human-Computer Interaction. Beijing, China, July 2007.
      • supports gestures with mouse, Bluetooth accelerometers, and camera sensors;
      • requested SVN access to codebase (denied);
      • contains sensor, library, and machine learning components;
      • abstracts machine learning components from end user; uses HTK (HMM toolkit);
      • does not support continuous gesture recognition;
  • Bespoke 3DUI Framework (XNA toolkit with Gesture Recognition)
    • 3DUI includes stereoscopic rendering, 6DOF optical head tracking, Wiimote 3D motion controller support [WiiMotion Plus?], and an extendible 3D gesture recognizer;
    • a set of software components essential to supporting 3D interaction;
  • Wiigee - A Java-based gesture recognition library for the Wii remote (Benjamin Poppinga and Thomas Schlömer, University of Oldenburg)
    • uses HMMs, an arbitrary number of gestures possible;
    • recognition process: idle-state and directional equivalence filtering > k-means vector quantizer > L-R HMM > Bayes classifier;
    • reusable and extensible gesture recognition library based on an event-driven design pattern;
  • WiiGLE - Wii-Based Gesture Learning Environment (WiiGLE GUI + Visual C# WiiGLE Framework)
    • M. Rehm, N. Bee, and E. André: Wave Like an Egyptian - Accelerometer Based Gesture Recognition for Culture Specific Interactions. In: Proceedings of HCI 2008, Culture, Creativity, Interaction, 2008.
      • see below for detailed description;

WiiGesture

WiiGesture implements the IGesture interface for recognizing gestures with a Nintendo Wiimote controller. This implementation makes use of the WiiGLE library and gesture sets generated and trained with the WiiGLE GUI application. The asynchronous function GatherGestureData initializes and loads a WiiGLE gesture job, and starts the OnlineClassification_GestureVoteEvent listener, which in turn updates the IGesture properties whenever a gesture is detected.

WiiGLE

WiiGLE is an environment in which one may record gestures, as well as adjust the parameters of machine learning features and classifiers. The Wiimote may then be used for online gesture recognition based on accelerometer data. It contains an API that allows for the integration of custom features and classifiers. (License: GPL)

WiiGLE behaves as a client: when the user performs a gesture, WiiGLE sends the classification result over the network as an XML message. An external application acts as a server and listens for incoming UDP messages from WiiGLE. The application then parses gesture messages and executes the corresponding commands.
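
A minimal sketch of such a server loop (the port number is an assumption; consult the WiiGLE documentation for the actual port and message format):

using System.Net;
using System.Net.Sockets;
using System.Text;

// hedged sketch: listen for WiiGLE classification messages over UDP
UdpClient listener = new UdpClient(5555); // port number is an assumption
IPEndPoint remote = new IPEndPoint(IPAddress.Any, 0);
while (true)
{
   byte[] data = listener.Receive(ref remote); // blocks until a message arrives
   string xmlMessage = Encoding.UTF8.GetString(data);
   // parse the XML gesture message and execute the corresponding command
}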

The WiiGLE wiki contains a tutorial for creating gesture sets, training classifiers on the gesture sets and making use of specific gesture features, and setting up gesture jobs for online classification in an external application - all using the WiiGLE GUI application.

The contents of the WiiGLE/runtime folder are placed in the [Debug/Release] folder of the application using GAIM. When a gesture job is created with the WiiGLE GUI, it can be found in the Gesture Jobs folder with a default name. For convenience, we rename this file (e.g. tennisGestures) and place it in [parent debug folder]/GestureJobStorage. The WiiGesture class locates and loads the specified gesture job by name and makes use of the message parsing functions found in WiiGLE.dll for online classification at runtime.

Gesture sets created thus far include tennisGestures, golfGestures, greetingGestures, swordGestures, and householdGestures.

In WiiGesture.cs:

// loading a gesture set in WiiGesture.cs
GestureJob myJob = Repository.GestureJobs.ElementAt(0);
// train the job if necessary
if (!myJob.Trained)
   myJob.Train();
gestureJobName = myJob.Name;
OnlineClassification.Add(myJob);
// register callback method to receive gesture classification events
OnlineClassification.GestureVoteEvent += new EventHandler<GestureVoteEventArgs>(OnlineClassification_GestureVoteEvent);

WiiGLE classification allows for button-triggered gesture recognition or time-based recognition. Here we set the interval to 1000 ms with an idle time of 1 ms. Trigger recognition is the default behaviour, with the user pressing 'B' on the Wiimote to mark the beginning and end of the gesture. buttonTrigger is a static bool property of WiiGesture:

// specify the behaviour of gesture trigger
if (!buttonTrigger)                    
   WiiController.GestureTrigger = new TimeIntervalTrigger(1000,1);
// connect to the wiimote
WiiController.Connect();

In the OnlineClassification_GestureVoteEvent(object sender, GestureVoteEventArgs e) method:

// The GestureVote object encapsulates the gesture decisions of the classifier.
// If multiple gesture jobs were added to OnlineClassification, we would
// iterate over each gesture vote.
if (e.GestureVotes.ElementAt(0).Winner != "rest")
{
   gestureName = e.GestureVotes.ElementAt(0).Winner;
   confidence = e.GestureVotes.ElementAt(0).Decisions.Values.Max();
   gestureIsNew = true;
   gestureTime = DateTime.Now;
}
else
   gestureIsNew = false;   

From an application using an IGesture:

// setting up gesture job from our application
DeviceManager.generateList();
IGesture wGesture = DeviceManager.getIGesture();
wGesture.GestureJobName = "tennisGestures";
Thread gestureThread = new Thread(new ThreadStart(GatherGestureData));
gestureThread.Start();

In GatherGestureData:

// poll for the latest gesture; GestureIsNew indicates whether a new
// gesture has been detected since the last poll
if (wGesture.GestureIsNew)
{
   string currentGesture = wGesture.GestureName;
   double currentConfidence = wGesture.Confidence;
   DateTime currentGestureTime = wGesture.GestureTime;
   string currentGestureJob = wGesture.GestureJobName;
}

CameraGesture

The CameraGesture implementation of the IGesture interface is slated for future development. A gesture recognition toolkit analogous to that of WiiGLE for cameras is necessary for the rapid development of such an implementation.

Refer to the gesture recognition resources above (e.g. Bespoke 3DUI, GART), as they may prove useful in a camera implementation of the IGesture interface.

Tap / ITap

Tap input refers to making contact with an object (such as a pad or mat) or a location in the physical world, such as a wall, with location and intensity captured at the moment of impact. For example, a wall-mounted pad may be used to capture players' punches and kicks. In the popular game Dance Dance Revolution, players use their feet and hands to tap locations on the floor in time to music.

While the ITap interface may primarily support touch-pad devices, vision-based implementations of tap may also be plausible. This component of GAIM is slated for future development.

Continuous control / IContControl

Continuous control refers to a one-to-one mapping of body movement to an on-screen entity; in other words, body movement is slaved to the on-screen entity. Examples of continuous control input can be found in Microsoft's forthcoming Project Natal system, as well as in games like Body-driven Bomberman:

  • Laakso, S., and Laakso, M. Design of a body-driven multiplayer game system. In Computers in Entertainment, 4(4), 2006, 7.

Movement can be captured in two or three dimensions using cameras, accelerometers, laser range finders, or combinations thereof. The continuous control component of the GAIM framework is slated for future development.

Device Manager

A device is an active gaming peripheral supported by the GAIM framework. GAIM's DeviceManager includes a device struct which defines a device as having a name, a device type, and, optionally, a serial or virtual COM port to which it is connected.

The device manager class is called by an application to load GAIM abstract input devices from a text config file (in the application's [debug/release]/bin folder). Multiple devices of the same type (e.g. more than one IPower device) are currently ignored by the device manager; only the first device is considered.

The public static method generateList() is called from an application to read the config file and generate the list of currently connected devices.

Abstract device

The device manager contains static methods for locating and initializing instances of GAIM abstract input devices based on what is found in the config file:

public static ActiveGameEquipment.Power.IPower getIPower()
public static ActiveGameEquipment.Point.IPoint getIPoint()
public static ActiveGameEquipment.Stance.IStance getIStance()
public static ActiveGameEquipment.Gesture.IGesture getIGesture()

Under development:

public static ActiveGameEquipment.Tap.ITap getITap()
public static ActiveGameEquipment.ContControl.IContControl getIContControl()

Locating an I[Device] (for any [deviceType]):

while (i < deviceList.Count)
{
   if (deviceList[i].deviceType == "[deviceType1]")
   {
      // [deviceType1]device found
      deviceIndex = i;
      return [deviceType1].Instance;
   }
   else if (deviceList[i].deviceType == "[deviceType2]")
   {
      // [deviceType2]device found
      deviceIndex = i;
      return [deviceType2].Instance;
   }
   else if ...
   ...
   i++;
}
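
For instance, getIPower() follows this pattern with the IPower-capable device types; a sketch, assuming each device class exposes a singleton Instance property as above:

// sketch of the lookup pattern instantiated for IPower
while (i < deviceList.Count)
{
   if (deviceList[i].deviceType == "BikePower")
   {
      deviceIndex = i;
      return BikePower.Instance;
   }
   else if (deviceList[i].deviceType == "TargetHRPower")
   {
      deviceIndex = i;
      return TargetHRPower.Instance;
   }
   i++;
}
return null; // no IPower device listed in the config file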

Device configuration text file

The device manager requires that the config file listing connected devices be formatted as follows (* indicates optional item):

[type] [name] [port]*

Examples of this include:

BikePower PCGamerBike
BikePower TunturiBike COM1
TargetHRPower HRMI COM3
WiiStance WiiStance

The type of the device refers to the abstract device level of the GAIM framework. In some cases the name of the device and the device type are synonymous (for example, WiiStance is the only WiiStance device).
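
A minimal sketch of parsing one such config line into the device struct described above (the field names are assumptions):

// hedged sketch: parse one config line of the form "[type] [name] [port]*"
string[] tokens = line.Split(' ');
Device d = new Device();
d.deviceType = tokens[0];                        // e.g. "BikePower"
d.name = tokens[1];                              // e.g. "TunturiBike"
d.port = (tokens.Length > 2) ? tokens[2] : null; // optional COM port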

Using GAIM in a game

Using GAIM in a game results in more portable games, less code, and abstract input handling in place of detailed low-level device code. The steps involved in integrating GAIM into a game are:

  1. Editing a configuration text file based on what devices are available
  2. Initializing the game input devices using GAIM's device manager
  3. Calling properties of the abstract input (Power, Point, Stance, Gesture, Tap, Continuous Control)

If you are a game developer intent on using GAIM, please refer to the Developing Games with GAIM wiki page.

Console Tester

The GAIM Visual Studio Solution contains the ConsoleTester project, with a Windows Form application as the main .exe. This application allows for testing of GAIM components (IPower, IPoint, IStance, and IGesture). To use the tester:

  1. Connect devices (USB / Serial Port / Bluetooth);
  2. In GAIM/BikeConsole/bin/, make changes to config.txt based on what currently connected devices you are planning to use;
  3. If you are using an IGesture device, ensure one of the following:
    1. GAIM/BikeConsole/bin/Debug/GestureJobs has at least one job (IGesture will load the first gesture in GestureJobs unless instructed otherwise);
    2. Your desired GestureJob can be found in GAIM/BikeConsole/bin/GestureJobStorage;
  4. (Un)comment lines of device instantiation code in the TestForm's TestForm_Load function to test various devices.
    1. Specify the particular GestureJob for an IGesture device, if desired;

Example: XNA Racing Game

We have used GAIM in the XNA Racing Game starter kit. The game involves racing a car around a track. In our modified version of the racing game, car speed is controlled by the player's physical power, as measured either by the speed at which the player pedals a bicycle or by how closely the player matches his or her target heart rate.

35 lines of code taking input from the mouse/keyboard/game controller were removed, and 11 lines of code to process IPower input were inserted. No code changes are required to switch from one device to another; editing the config file is all that is required.

Initialization in RacingGameManager.cs:

// find and instantiate a compatible IPower
DeviceManager.generateList();
// initialize static properties in TargetHRPower
TargetHRPower.Age = 23;
TargetHRPower.MaxSpeed = CarPhysics.MaxPossibleSpeed;
TargetHRPower.RestingHR = 60;
iPow = DeviceManager.getIPower();

Calling properties of the abstract Power input in CarPhysics.cs:

speed = (float)RacingGameManager.IPow.Power;

A text configuration file (placed in the game's debug/bin folder) is used to specify which devices are available to the application, allowing the framework to determine which class to use to implement IPower (by means of the Device Manager). The text file uses a ranked order; for example, DeviceManager.getIPower() returns the first IPower device in the list.

For example:

BikePower PCGamerBike
TargetHRPower TunturiBike COM1
BikePower TunturiBike COM1
TargetHRPower HRMI COM3