LIAV Architecture

From EQUIS Lab Wiki

Workspace Architecture

The following is a conceptual-level Workspace Model schematic of a simple game architecture: conceptual.png

Graphics Engine (OGRE)

  • Provides platform-independent rendering of 3d geometry

Game Object

  • Actor component: contains the program's main loop
  • Encapsulates game states (game rules and behaviour)
  • Event-driven (called from the render thread):
    1. Handles user input (polls the input manager)
    2. Manages AI as a submodule
    3. Drives the graphics and sound engines
  • Manages an Avatar object
  • Computes player energy
  • Collision detection for all objects
  • Handles buildings that produce things

Working Design

See Data below for notes about our data design.

Implemented Game Design UML

This very high-level diagram shows which components own which others, and in which namespace each module can be found. On each frame, control passes to the game object: it calls its input manager to poll the various game devices in use, tells its various managers to update their sub-objects, and moves the avatar and camera inside the OGRE scene.

game-design-3.png

Implemented Data Abstraction UML

Note in the above diagram that all the backing data of the core model objects (Building, Avatar and Villager) is contained inside data structures in the data namespace. The structure of the data storage classes mirrors that of the model classes, as shown in the diagram below:

data-model-2.png

"World Objects", including Avatar

  • Position, direction, and scale
  • References an ogre mesh object
  • When the object or person is moved, the corresponding mesh object moves in the game world, too
  • Actual data owned by the object is hidden in an OGRE-independent data source (CAXWorldObjectData)

Buildings

Building Properties: xxx

(TODO: update this list to match the properties defined in buildingdata.)

Implemented Building Manager System UML

buildings.png

Automatic Building Placement

Buildings are currently placed in the game world automatically in two ways:

  • Random distribution, possibly controlled by an alpha map
  • Clustered distribution controlled by an alpha map

The first of these is used to plant trees onto the landscape in the concept application. There is a tree alpha map in the game's terrain subfolder of the media directory which specifies areas where trees are located; the actual placement of individual trees is handled by a pseudorandom number generator which takes this distribution information into account.
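The alpha-map-driven random placement might be sketched like this (a toy version: AlphaMap, placeTrees and the 8-bit density convention are our assumptions, not the game's actual code):

```cpp
#include <cstdlib>
#include <vector>

// Hypothetical sketch: place trees by rejection sampling against a
// greyscale alpha map (0 = no trees, 255 = dense forest).
struct AlphaMap {
    int width, height;
    std::vector<unsigned char> pixels;   // row-major greyscale
    unsigned char at(int x, int y) const { return pixels[y * width + x]; }
};

struct TreePos { float x, y; };

// Draw candidate positions until `count` trees pass the density test:
// a candidate survives with probability alpha/255, so denser regions
// of the map receive proportionally more trees. An attempt cap keeps
// the loop from spinning forever on an all-zero map.
std::vector<TreePos> placeTrees(const AlphaMap& map, int count, unsigned seed) {
    std::srand(seed);
    std::vector<TreePos> trees;
    int attempts = 0;
    while ((int)trees.size() < count && attempts++ < count * 1000) {
        int x = std::rand() % map.width;
        int y = std::rand() % map.height;
        if (std::rand() % 256 < map.at(x, y))   // denser alpha => more trees
            trees.push_back({ (float)x, (float)y });
    }
    return trees;
}
```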

Clustered object placement is used to put forest nodes into the forest in the game world in such a way that each forest node sits at the centre of a clump of trees. The locations of the forest nodes are not known in advance and must be computed at run-time. This computation uses the same tree distribution information (alpha map) as the tree-planting task described above. Forest nodes are placed in representative locations using the K-means clustering algorithm, a simple technique for finding representative points which approximate a data set.

Collision Detection

The game currently implements several different kinds of simple collision detection using bounding volumes. The simplest of these is collision detection against static objects (Buildings), which use oriented bounding box volumes. Several different kinds of entities in the game use this function (e.g., the user's avatar, villagers), which is prototyped as

CAXBuilding* CAXBuildingManager::moveToNotIntersectBuildings ( Vector3& pos, float radius = 0.0f )

This function will (destructively) move pos to a position in the game world which is not inside any of the building objects currently loaded. The optional radius parameter specifies the size of a cylindrical bounding volume to use for the colliding object. If the given position vector does intersect one of the currently loaded buildings, a pointer to the relevant building is returned by the function; if there is no collision, the function returns NULL. moveToNotIntersectBuildings() is used to detect collisions between the user's avatar and buildings, as well as between villagers and buildings.
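The push-out behaviour can be illustrated with a simplified 2D version (an axis-aligned box stands in for the game's oriented bounding boxes, and a circle of radius r for the cylindrical bounding volume; Box and moveOutOfBox are our names):

```cpp
#include <algorithm>

// If (x, y) with radius r overlaps the box, destructively push the
// position out through the nearest face and report the collision,
// mirroring the contract of moveToNotIntersectBuildings().
struct Box { float minX, minY, maxX, maxY; };

bool moveOutOfBox(float& x, float& y, float r, const Box& b) {
    // expand the box by r so the circle centre can be tested as a point
    float ex0 = b.minX - r, ey0 = b.minY - r;
    float ex1 = b.maxX + r, ey1 = b.maxY + r;
    if (x <= ex0 || x >= ex1 || y <= ey0 || y >= ey1)
        return false;                       // no intersection
    // penetration depth against each side of the expanded box
    float left = x - ex0, right = ex1 - x;
    float down = y - ey0, up = ey1 - y;
    float m = std::min(std::min(left, right), std::min(down, up));
    if (m == left)       x = ex0;           // push out through nearest face
    else if (m == right) x = ex1;
    else if (m == down)  y = ey0;
    else                 y = ey1;
    return true;
}
```

The real function does the same test against every loaded building and returns a pointer to the first one hit.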

Villagers also use two different kinds of collision detection. The first of these is equivalent to the building collision detection above:

CAXVillager* CAXVillagerManager::moveToNotIntersectVillagers ( Vector3& pos, float radius = 0.0f )

This function is used to detect collisions between two villagers. Villagers are mobile objects and also need to detect collisions with the user's avatar (implementing it this way, rather than vice versa, allows the user to push villagers around in the game world). The method for performing this kind of collision detection is:

void CAXVillagerManager::moveVillagersToNotIntersectAvatar ( const Vector3 &pos, float radius = 0.0f )

This method takes the position (and bounding volume radius) of the avatar and moves all the villagers in the game world so that they do not intersect it.


Game States

As the game gets more complex, we need a Markov-style state model to keep track of what the game is doing. See Managing Game States with OGRE on the OGRE wiki site.

Simple example diagram showing a start state and the ability to command villagers to cut down trees.

gamestates.png

In order to allow insertion of buildings into the game world, and generalised triggered actions, the following game state model will be implemented:

game-states.png

The states are briefly described below:

  • Start: initial start state of the game, possibly with a loading screen and main menu.
  • Standard: standard game state. The user can walk around in the game world; other entities are animated. The camera follows the user and generally shows the view in front of the user's avatar.
  • Action: same as Standard, but with the Action Button (1 on the radio joystick) enabled to allow the user to interact with some object in the game world. The game enters the Action state when a game trigger is fired; the only trigger currently implemented is the distance-based trigger, which fires when the user's avatar is within a certain radius of a particular building in the game world (e.g., forest nodes).
  • Insert Building: similar to Standard, but the user cannot walk around; instead, the joystick controls the placement of the building. Other buttons (maybe the gear-up/gear-down buttons?) can be used to rotate the house; there are also cancel and house-insert buttons. The camera hangs above the user's avatar, providing a bird's-eye view of the user's immediate surroundings. The Insert Building state is entered when the user presses the Insert Building button on the joystick (e.g., button 10).
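The distance-based trigger that moves the game into the Action state amounts to a simple radius test, sketched below (Vec3 and the function name are illustrative, not the game's actual code):

```cpp
// Fires when the avatar is within `radius` of a target building
// (e.g., a forest node). Squared distances avoid the sqrt call.
struct Vec3 { float x, y, z; };

bool distanceTriggerFired(const Vec3& avatar, const Vec3& building, float radius) {
    float dx = avatar.x - building.x;
    float dy = avatar.y - building.y;
    float dz = avatar.z - building.z;
    return dx * dx + dy * dy + dz * dz <= radius * radius;
}
```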

Implemented Game State UML

The core Game object in the game program is thus not a monolithic structure; most of the actual processing, and a fair amount of the state data, is contained in the game state classes:

game-states-design.png

Artificial Intelligence

  • AI module could be built as an Actor component (i.e., running in its own thread) if needed for multi-agent game play.

Villagers (A.I. Agents)

  • a villager is a person, much like the user's Avatar, that is controlled by the computer
  • job
  • current activity and corresponding model animation
  • where they live

villager-ai.png

Villager Manager (A.I. Unit)

  • manages Villager objects
  • villager pathfinding
  • goal list for villagers, can automatically reassign a villager
  • villager productivity, computed from:
    • villager cheerfulness, villager satiety and other global villager properties
  • villager needs: commercial, housing

Read a description of the tasks the villager manager must perform on the Doxygen Villager Manager page.

There are also some notes on the Doxygen AI Module page.

Control of the autonomous characters in this game is divided into three separate layers of control: action selection, pathfinding and steering. The low-level motor details of locomotion (i.e., turning wheels or moving legs) are abstracted out of this model. Action selection refers to choosing the appropriate kind of behaviour at any given moment, and is controlled in this program by a simple kind of rule-based logic system, described below. Pathfinding and steering allow a character to move through the game world; pathfinding computes the large-scale route through the landscape and steering controls the character's velocity between successive waypoints on the computed path.

Rule-Based Logic System

Exercises in Game AI Programming is a set of tutorials written by one of the FEAR SDK developers. Based on the Rule-Based System described on that page, I wrote a boolean-function-based control system driven by a set of rules loaded from XML at runtime. In an early version of this system that I was using for design, the rules looked like this:

IF at_tree AND NOT chop AND NOT drop_off_wood 
  THEN chop AND NOT move
IF at_tree AND chop AND chop_timeout 
  THEN NOT chop AND switch_targets AND reverse_path AND drop_off_wood AND move
IF drop_off_wood AND at_wood_drop_off 
  THEN NOT drop_off_wood AND drop_wood AND switch_targets AND reverse_path AND go_to_tree AND move
IF NOT chop AND NOT drop_off_wood 
  THEN go_to_tree AND move

The rules are applied in order; only the first applicable rule will be executed. In general, rules are interpreted in this way: if the left hand side is found to be true, make the right hand side true. Tokens (in lowercase above) may refer to either built-in functions, which are predicates (possibly with imperative side-effects), or symbols, which are simply stored as a true or false value. In the toy example above (which has not changed significantly in the currently implemented logic rules), at_tree, chop_timeout, switch_targets, reverse_path, at_wood_drop_off, drop_wood and go_to_tree are built-in functions; chop, move and drop_off_wood are symbols.

This input can be converted automatically to an XML-formatted document using a Python script. The XML file can then be loaded by the game program at runtime and interpreted every time a decision is required for one of the villagers. Using the sample file above as a simple testing example (most of the "built-in" functions simply always returned true for this demo), the interpreter gives the following debugging output. The set of true symbols in the symbol table is printed after each rule-set application.

Ruleset iteration 1:
     +SIDE EFFECT: GO TO TREE
     Symbol Table:  move
Ruleset iteration 2:
     Symbol Table:  chop
Ruleset iteration 3:
     Symbol Table:  chop
Ruleset iteration 4:
     Symbol Table:  chop
Ruleset iteration 5:
     Symbol Table:  drop_off_wood  move
Ruleset iteration 6:
     +SIDE EFFECT: GO TO TREE
     Symbol Table:  move
Ruleset iteration 7:
     Symbol Table:  chop
Ruleset iteration 8:
     Symbol Table:  chop
Ruleset iteration 9:
     Symbol Table:  chop
Ruleset iteration 10:
     Symbol Table:  drop_off_wood  move

This system can very easily be used to design behaviour of arbitrary complexity. The existing logic can be reconfigured, and functionality can be enabled or disabled without having to recompile the program. To add a new predicate or side-effect is a simple matter of writing the built-in function into the Villager Manager class and registering its name in the function table. Ongoing behaviour (e.g., animation or steering) can be controlled by checking the value of a special symbol name in the symbol table (here, "move" is used to engage or disable steering behaviour, and "chop" selects a chopping animation for the villager model).
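The first-applicable-rule evaluation described above might be sketched like this (a toy version in which every token is a plain symbol; the built-in predicate functions and their side effects are left out, and all type names are ours):

```cpp
#include <set>
#include <string>
#include <vector>

// A rule fires when every LHS term holds against the symbol table;
// firing makes every RHS term true (asserting or erasing symbols).
// Only the first applicable rule fires per iteration.
struct Term { std::string sym; bool negated; };
struct Rule { std::vector<Term> lhs, rhs; };
using SymbolTable = std::set<std::string>;

bool applyFirstRule(const std::vector<Rule>& rules, SymbolTable& st) {
    for (const Rule& r : rules) {
        bool ok = true;
        for (const Term& t : r.lhs)                 // LHS: every term must hold
            if ((st.count(t.sym) > 0) == t.negated) { ok = false; break; }
        if (!ok) continue;
        for (const Term& t : r.rhs)                 // RHS: make every term true
            if (t.negated) st.erase(t.sym); else st.insert(t.sym);
        return true;                                // first applicable rule only
    }
    return false;
}
```

In the real system, a token may instead name a built-in function, in which case the LHS test calls the predicate and the RHS assertion runs its imperative side effect.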

One extension that has already been designed into the system allows specifying villager behaviour on a very large scale using a Job type enumeration. Villagers may be assigned to one of the following set of currently defined job types:

  • Idle (symbol "job_idle")
  • Chopping wood (symbol "job_chop")

When a villager is assigned a job type, the corresponding job type symbol is entered into the villager's symbol table. This can then be used to select sets of rules to apply selectively, based on job type. The other symbols activated and destroyed by the rules are then used to select component tasks of this job. For instance, the example above makes plain the division of the wood-chopping job type into moving-to-tree, chopping, moving-to-wood-drop, and dropping subtasks. The idle job type has only a single subtask, which is to stand stationary with the idle animation.

Pathfinding

Moving autonomous villagers around the game world requires a pathfinding system. The A* heuristic-based search algorithm is widely used in gaming projects for its speed and its ability to integrate several different kinds of data into its pathfinding decisions. It also lends itself readily to finding partial solutions (e.g., pathfinding could have an imposed time limit of 200ms; any path which required more computation than this would only be partially computed, and the best partial solution found would be used).

Our implementation uses the A* algorithm to find full paths through the game world. The environment is approximated using a 2D bitmap image; building bounding volumes are drawn to this bitmap to indicate unreachable areas in the game world. The A* algorithm can then search this bitmap fairly rapidly and compute an optimal path.
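The grid search can be sketched as follows (a toy version: cells are free or blocked, movement is 4-connected, and the heuristic is Manhattan distance; this sketch returns only the path length, where the real implementation returns the path itself, and all names are ours):

```cpp
#include <cstdlib>
#include <functional>
#include <queue>
#include <tuple>
#include <vector>

// A* over an obstacle bitmap: grid[y][x] is 0 (free) or 1 (blocked).
// Returns the optimal path length in steps, or -1 if unreachable.
int aStarPathLength(const std::vector<std::vector<int>>& grid,
                    int sx, int sy, int gx, int gy) {
    int h = (int)grid.size(), w = (int)grid[0].size();
    auto heur = [&](int x, int y) { return std::abs(x - gx) + std::abs(y - gy); };
    std::vector<std::vector<int>> best(h, std::vector<int>(w, -1));
    using Node = std::tuple<int, int, int, int>;            // f = g + h, g, x, y
    std::priority_queue<Node, std::vector<Node>, std::greater<Node>> open;
    best[sy][sx] = 0;
    open.push({heur(sx, sy), 0, sx, sy});
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    while (!open.empty()) {
        auto [f, g, x, y] = open.top();
        open.pop();
        (void)f;
        if (x == gx && y == gy) return g;                   // goal reached
        if (g > best[y][x]) continue;                       // stale queue entry
        for (int i = 0; i < 4; ++i) {
            int nx = x + dx[i], ny = y + dy[i];
            if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
            if (grid[ny][nx]) continue;                     // blocked cell
            if (best[ny][nx] == -1 || g + 1 < best[ny][nx]) {
                best[ny][nx] = g + 1;
                open.push({heur(nx, ny) + g + 1, g + 1, nx, ny});
            }
        }
    }
    return -1;                                              // goal unreachable
}
```

In the game, building bounding volumes are rasterised into the bitmap before the search runs.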

Sample auto-generated 513 x 513 obstacle map:

obstacle.png

The black boxes are the drawn bounding volumes. This obstacle map is generated from our current game world: the forested region is the kidney shape on the left of the image (see the tree alpha map on the Heightmap Generation page). The handful of boxes on the right side of the image are 10 cabin models that were randomly sprinkled onto the terrain to provide a semblance of a village. Pathfinding from one side of the image to the other with no obstacles (the worst-case scenario) takes about 150ms on my P4 Prescott. This latency cannot be tied into the render loop without making gameplay jerky, so pathfinding in the game is asynchronous. Villagers register their path objects with a path manager, which updates the paths in a separate thread when computation load permits.

Steering

Steering is the frame-to-frame control of the villager's position, heading and velocity. At its simplest, steering involves moving towards the goal location while avoiding obstacles in the environment. Steering can easily be extended to give more complex group behaviours, such as fleeing, following, flocking, and moving in formation.
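The simplest "seek towards goal" behaviour can be sketched as follows (Vec2, the parameters and the clamping scheme are all illustrative assumptions):

```cpp
#include <cmath>

// Per-frame seek: aim the desired velocity straight at the goal at
// full speed, then apply an acceleration-limited correction toward it.
// Returns the villager's new velocity.
struct Vec2 { float x, y; };

Vec2 seek(const Vec2& pos, const Vec2& vel, const Vec2& goal,
          float maxSpeed, float maxAccel, float dt) {
    float dx = goal.x - pos.x, dy = goal.y - pos.y;
    float d = std::sqrt(dx * dx + dy * dy);
    if (d < 1e-6f) return {0, 0};                     // already at the goal
    Vec2 desired = { dx / d * maxSpeed, dy / d * maxSpeed };
    // steering force = clamped difference between desired and current velocity
    float ax = desired.x - vel.x, ay = desired.y - vel.y;
    float a = std::sqrt(ax * ax + ay * ay);
    if (a > maxAccel) { ax = ax / a * maxAccel; ay = ay / a * maxAccel; }
    return { vel.x + ax * dt, vel.y + ay * dt };
}
```

Called once per frame with the next waypoint from the computed path as the goal, this moves the villager between successive waypoints; obstacle avoidance and flocking would add further steering terms.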

Data Store

Game data storage falls into two types:

User Data

  • Store component, OGRE-independent data store
  • Instantaneous user data: information about moving things in the player's world
  • Can be local, or shared across a network (for multiplayer games)
  • Keeps track of:
    • dynamic data about positions of movable objects in the world
    • avatar data
      • player position
      • player heading
      • player speed
    • villagers
      • energy
      • food
      • productivity
    • game world zone

Game World Data

  • Store component, OGRE-independent data store
  • Game world and level information: information about static custom objects in the player's world, plus terrain and world data (which may change)
  • static information:
    • scene file and terrain information
    • models and animations
    • textures and billboards
    • (ambient) sound sources
  • dynamic information:
    • terrain deformation
    • dynamic data about the layout and composition of the village
      • buildings
      • building modifications
    • game world resources like wood, ore, etc.
    • user-constructed roads, bridges, stairs
  • access by OpenAL and OGRE for world rendering

Input Manager

Responsibilities:

  • Finding input devices
  • Attaching and configuring input devices
  • Reading control values from input devices
  • Mapping physical inputs to game controls:
    • Forward
    • Brake
    • Turn left/right
    • Change gears up/down
    • Camera pitch control
    • Screenshot
    • Program exit

The Simple DirectMedia Layer (SDL) library is a platform-independent open-source multimedia library which can be used to interface with a joystick. This API will also be used to query the keyboard and mouse.

See the Tunturi Protocol page for notes about developing the bicycle interface. See the CAX Dependencies page for software packages used so far in building our game.

Early versions of the input manager can load all their settings from a configuration file, like most of the modules in OGRE. See doxygen page for Ogre::ConfigFile.

Implemented Input Manager UML

inputmanager.png

Joystick Force-Feedback

One of the features we're interested in with the joystick is force-feedback-like effects. The PlayStation controller we're using can do rumble effects, but the SDL library doesn't support any kind of force-feedback functionality yet. This seems to be due mostly to a lack of good Linux drivers for force-feedback. (For those interested, there is a site describing one attempt to write a Linux force-feedback driver; it sounds like it has alpha support for our device and might be worth checking out.)

Bicycle Physics

There are over one billion bicycles in the world. Bicycling is the most energy-efficient mode of human transportation known, with a lower energy cost per unit distance than walking (up to five times more efficient).

Due to the high latency of the bicycle and the lack of support for directly controlling the tension, our game will not be able to simulate high-realism cycling physics. Instead, we are starting with a very simple implementation. Bike tension is controlled by setting the desired bike power output, in watts. This tends to make the bike harder to pedal at lower speeds and easier to pedal at higher speeds, which roughly mimics the effect of inertia. The target power output varies in relation to the sine of the slope under the user's avatar. Increasing the gear simply adds a constant value to the target power output, making the bike harder to pedal in higher gears.
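The tension rule above can be written down as a tiny formula; note that every constant here is invented for illustration, since the game's real values are not documented on this page:

```cpp
#include <cmath>

// Target bike power in watts: a flat-ground baseline, plus a term
// proportional to the sine of the terrain slope, plus a constant
// offset per gear. All three constants are assumptions.
float targetPowerWatts(float slopeRadians, int gear) {
    const float basePower    = 60.0f;   // flat-ground baseline (assumed)
    const float slopeScale   = 200.0f;  // watts per unit of sin(slope) (assumed)
    const float wattsPerGear = 15.0f;   // constant added per gear (assumed)
    float p = basePower + slopeScale * std::sin(slopeRadians) + wattsPerGear * gear;
    return p > 0.0f ? p : 0.0f;         // never request negative power
}
```

Uphill slopes raise the target power (harder pedalling); steep downhills clamp to zero rather than driving the tension negative.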

Sound Engine (OpenAL)

There is a new OpenAL-based OGRE sound manager described on the OGRE wiki site.

OpenAL is an open-source, platform-independent library for playing sound. Version 1.0 of the OpenAL specification documents the interface.

  • Provides platform-independent rendering of 3d sound
  • Should probably be designed as a singleton manager which handles its own resources
  • Functions that we're interested in having:
    • Easy configuration (load settings from another config file?)
    • Move sound sources around each frame
    • Play a sound (also pause, stop, rewind, loop)
    • High-level data structure for a sound source (not just bare OpenAL handles to buffer objects)
  • Things in our game world that can be sound sources:
    • People (avatars and villagers)
    • Buildings and other user-created structures
    • Localised environmental sounds (e.g., waterfall)
    • Ambient sound (i.e., travels with the avatar, so another kind of avatar sound)
  • Possible future feature: implement directional sound sources with DIRECTION and CONE parameters

"Sound Ideas":

  • ambient wind sound gain is proportional to altitude, so that mountaintops are windy and valleys are not
  • ambient forest sound in forest areas: gain is proportional to forest density

Implementation notes:

  • singleton design
  • my implementation uses the non-standard STL hash_map container to map sound names onto sound source ID values. If your compiler does not include this container (MSVC7 does), you can download it from the SGI STL site.
  • sound manager needs to handle the following additional properties:
    • Doppler effect
    • distance attenuation model
    • distance attenuation clamping?
    • distance culling
    • list of all sources so that sources can be deallocated by the sound manager on program exit (i.e., CAXSoundSource doesn't actually own the OpenAL sound source object; however, the CAXSoundSource constructor is the only way to create a new OpenAL sound source, and the CAXSoundSource destructor asks the SoundManager to kill the corresponding sound source).
    • sound context
    • sound device
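The name-to-source-ID mapping mentioned above might look like the sketch below. It uses std::unordered_map, the standardized descendant of the SGI hash_map the notes mention; the class and method names are ours, and plain unsigned stands in for the OpenAL source handle type:

```cpp
#include <string>
#include <unordered_map>

// Maps sound clip names (as read from the config file) onto
// OpenAL-style source ID values.
class SoundNameTable {
public:
    void add(const std::string& name, unsigned sourceId) {
        names_[name] = sourceId;
    }
    // Returns true and fills `id` if the clip name is known.
    bool lookup(const std::string& name, unsigned& id) const {
        auto it = names_.find(name);
        if (it == names_.end()) return false;
        id = it->second;
        return true;
    }
private:
    std::unordered_map<std::string, unsigned> names_;
};
```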

Sound Manager config file:

  • list of sound clips:
    • WAV filename (required)
    • sound clip name (required)
  • attenuation parameters
  • preferred device settings (optional) -- autodetect if not present
  • ambient and environmental sound sources defined

Planned Sound Engine UML

sound-manager.png