Heart Rate Power-Ups

From EQUIS Lab Wiki


Heart Rate Power-Ups is a project undertaken by Mallory Ketcheson and Luke Walker which examines the possibility of modifying commercial video games to function as effective exergames. It builds upon Mallory's prior project, which took three exergames developed in the EQUIS Lab and attempted to improve the quality of the exercise they elicit from players by introducing a novel technique: the heart rate power-up. A heart rate power-up, once activated, provides the player with some in-game benefit and includes a graphical effect that makes it appealing to players. The player is rewarded with the power-up when they reach their target heart rate, which is determined earlier in the experiment. The idea is that by enticing the player to reach their target heart rate with in-game rewards, the overall quality of the exercise they get from playing the game will be improved over playing the game without heart rate power-ups.

This project aims to expand upon the previous one by applying the same technique to commercial games. In short, for each game involved the major tasks that must be completed are as follows:

  1. Modify the control scheme for the game so that it plays as an exergame. In the simplest case, this can be accomplished by using the default control scheme for the game but disabling player movement and possibly other actions as well when the player isn't pedaling on the exercise bike.
  2. Provide the player with a power-up in the game when they reach their target heart rate. If they fall below the heart rate, the power-up should be removed.

Obviously to accomplish these tasks the game needs to have some sort of modding capabilities built in.

To aid in the development of such modifications a toolkit has been developed. This page describes the capabilities of the toolkit as well as the build process for it, among other things.


The Toolkit

The main services that the toolkit provides are the following:

  1. Ability to send "virtual" key presses to an application. The key presses that get sent are configurable, and they are sent depending on the player's current heart rate and how fast they're pedaling on the exercise bike. These input events can be used in a modded game to modify the default control scheme and optionally enable a heart rate power-up.
  2. Draw a heart rate overlay in a DirectX application. The overlay looks similar to the picture below. The size of the inner heart gives the player an idea of their current heart rate, and when it fills the outer heart completely it changes colour to indicate that they have reached their target heart rate.

Overlay.png

The toolkit is composed of three main projects: InputSimulator, OverlayManager, and HRPUManager. All of these projects are written in C#, and the implementation of each one is described individually in the subsections that follow. The build process for the toolkit and how to use it are described in later sections. If you don't care about the implementation of the toolkit then this section can be skimmed.

InputSimulator

The InputSimulator library (aka WindowsInput) is used to send "fake" key presses to an application. Essentially it is a high-level wrapper around the Win32 SendInput() function. The homepage for this project is https://inputsimulator.codeplex.com/.

There are a few things worth noting here in case someone needs to update this library at some point. First of all, as of this writing the version linked on the main page (i.e. from the "download" button) is out of date, so I'd recommend updating the library from Git instead, assuming that the main branch is stable. A related issue is that the examples listed on the website were written for the old version of the library. The new version has a different interface, so the examples as written will not work; I'd recommend looking at some of the code in HRPUManager to see how the new version is used. Lastly, but most importantly, the version of the library used in the toolkit (and committed to SVN) is actually from a fork written by another contributor. (See https://inputsimulator.codeplex.com/SourceControl/network/forks/joewhaley/inputsimulator for this fork.) The official version of the library only supports sending virtual key codes, which will work with most GUI applications but not video games. Any game written with DirectX likely uses the DirectInput library to detect hardware input, which uses a different set of key codes for each key press. The fork adds support for sending DirectInput key codes to an application.
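To give a flavour of the new interface, here is a minimal sketch using the official library's API. The fork's DirectInput scan-code methods are analogous, but their exact names are not shown here; check the HRPUManager source for how they're actually called.

```csharp
using WindowsInput;
using WindowsInput.Native;

class PedalInputExample
{
    static void Main()
    {
        // New-style interface: an InputSimulator instance exposes a
        // Keyboard property instead of the old static methods.
        var sim = new InputSimulator();

        // Hold W down while the player pedals, release when they stop.
        sim.Keyboard.KeyDown(VirtualKeyCode.VK_W);
        sim.Keyboard.KeyUp(VirtualKeyCode.VK_W);

        // Tap a key once (press and release in one call).
        sim.Keyboard.KeyPress(VirtualKeyCode.SPACE);
    }
}
```

These calls send virtual key codes; as noted above, a modded game will typically need the fork's DirectInput variants instead.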

One final thing worth mentioning is that the key presses sent by this library are in no way targeted - they simply get sent to whichever window currently has focus. So when you're using the toolkit to send fake key presses to another application, the window of the application that you're targeting needs to have focus for the key presses to be detected.
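One way to switch focus to the target window programmatically is the Win32 SetForegroundWindow() function. The sketch below is a generic illustration rather than the toolkit's actual code; "TESV" (Skyrim's executable name, used elsewhere on this page) is just an example.

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class FocusHelper
{
    // Win32 call that brings another process's window to the foreground.
    [DllImport("user32.dll")]
    static extern bool SetForegroundWindow(IntPtr hWnd);

    // Give focus to the target game before sending fake key presses.
    static void FocusTarget(string processName)
    {
        var processes = Process.GetProcessesByName(processName);
        if (processes.Length > 0)
            SetForegroundWindow(processes[0].MainWindowHandle);
    }

    static void Main()
    {
        FocusTarget("TESV"); // example target process name
    }
}
```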

OverlayManager

This library is what draws the heart rate overlay in the target application. The original name of the library was Direct3DHook, but we renamed it to make its role in the toolkit more obvious. The homepage for the project (if you want to call it that) can be found here. The first blog post on the project may also be of interest. Like InputSimulator the version linked for download is rather out of date so the library should be updated from GitHub instead. Also, the article mentions that you must download SlimDX to use the library, however this is no longer the case. The library now uses SharpDX instead (a C# wrapper around the DirectX libraries) and the necessary files are included in the packages folder.

The library accomplishes its task through a combination of DLL injection and function hooking. This involves two main tasks. The first is to inject code into the address space of the target application; such code can do things like draw an overlay. The second is to search the address space of the program for certain functions, such as EndScene(), and change the first few instructions of the function so that it jumps to our injected code. The location of the original function is saved, and as long as our injected code calls back into the original function before returning, the application will display as it normally does, apart from the addition of the overlay that we've drawn. It's worth noting that much of the hooking logic is actually provided by EasyHook, a related library that the same author has also worked on. The main contribution of the Direct3DHook library is that it does the hooking for specific Direct3D functions, thus providing an entry point where you can do things like draw an overlay. I'm still not entirely sure how all of this works, so I'll refer you to the author's blog if you really want to figure it out. And of course, reading the code can also be helpful.
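As a rough illustration of what hooking EndScene() with EasyHook looks like, here is a heavily simplified sketch. It assumes the function's address has already been recovered from the device's VTable (see the author's blog for that step), and the class structure is mine, not Direct3DHook's.

```csharp
using System;
using System.Runtime.InteropServices;
using EasyHook;

class EndSceneHookSketch
{
    // Delegate matching the signature of IDirect3DDevice9::EndScene.
    [UnmanagedFunctionPointer(CallingConvention.StdCall)]
    delegate int EndSceneDelegate(IntPtr device);

    LocalHook hook;
    EndSceneDelegate original;

    // endSceneAddress: assumed to come from the device's VTable.
    public void Install(IntPtr endSceneAddress)
    {
        // Keep a callable reference to the original function.
        original = (EndSceneDelegate)Marshal.GetDelegateForFunctionPointer(
            endSceneAddress, typeof(EndSceneDelegate));

        // Patch the function so it jumps into EndSceneHooked instead.
        hook = LocalHook.Create(endSceneAddress,
                                new EndSceneDelegate(EndSceneHooked),
                                this);
        // Exclusive ACL with {0}: hook every thread except the current one.
        hook.ThreadACL.SetExclusiveACL(new int[] { 0 });
    }

    int EndSceneHooked(IntPtr device)
    {
        // ... draw the overlay here ...

        // Fall through to the real EndScene so the frame is presented
        // normally; EasyHook prevents this call from being re-intercepted.
        return original(device);
    }
}
```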

Another one of the library's contributions is that it sets up inter-process communication (IPC) between itself and the target application. This is a bit misleading at first, so I should explain it in more detail. It would perhaps be more correct to say that the library sets up an IPC channel between itself and the code that it injects into the target application. To be more precise, before you hook into another application the library only exists within your host application. Once the hooking completes, part of the library code is injected into the target application. The exact division of the library code between the two processes is all still a bit mysterious to me so I won't go into any more detail than this. The most important part is that the IPC capabilities of the library allow us to transfer data, such as current heart rate, from the host application (HRPUManager) to the library code drawing the overlay in the target application (a video game). This enables the heart rate overlay to be updated as the player's heart rate fluctuates.
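EasyHook provides helpers for setting up such a channel. The sketch below shows the general pattern; the HeartRateInterface class is hypothetical, and Direct3DHook's actual IPC interface has different names and members.

```csharp
using System;
using System.Runtime.Remoting;
using EasyHook;

// Shared remoting object: the host holds the real instance, while the
// code injected into the game talks to it through a proxy.
public class HeartRateInterface : MarshalByRefObject
{
    public int CurrentHeartRate { get; set; }
}

class HostSetupSketch
{
    static void Main()
    {
        // In the host process: create a named IPC server channel.
        // Passing null lets EasyHook generate a unique channel name.
        string channelName = null;
        RemoteHooking.IpcCreateServer<HeartRateInterface>(
            ref channelName, WellKnownObjectMode.Singleton);

        // channelName is then handed to the injected library, which
        // connects via RemoteHooking.IpcConnectClient<HeartRateInterface>.
        Console.WriteLine("IPC channel: " + channelName);
    }
}
```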

It's worth noting that the original purpose of the library was for screen capture. I believe that the author intended this to be used for taking screenshots in DirectX applications. I suppose this functionality could also be used to record gameplay like Fraps does, although doing this efficiently can be a challenge. Regardless, for our purposes this functionality wasn't needed. However, if you did need to use the library for screen capturing then much of the code to do this exists already.

Limitations

Now that the behaviour of the library has been described, I'll briefly go over some of its limitations. The main limitation is that currently the heart rate overlay only works in DirectX 9 applications. Having said that, you shouldn't be deterred from trying to add support for newer versions of DirectX if your project requires it. The original author completed the hooking logic for DirectX 10 and 11 applications in the related DXHook classes, so much of the hard work has been done already. The main task that would need to be completed is figuring out how to draw the overlay in these versions of DirectX.

Another limitation is that the library only works with DirectX applications, meaning that OpenGL apps can't be hooked into. For the most part this probably won't be an issue, because in my experience most games that run on Windows use DirectX. If you really do need the library to work with OpenGL, however, then a good place to start would be to study the existing hooking code for DirectX. Perhaps the hardest part of doing the hooking correctly is determining the location of the functions that you want to hook into. The approach that the author has taken for this task is to analyze the Virtual Method Table (VTable) of a DirectX device. He describes this approach in a decent amount of detail in his first blog post. I imagine that this approach would work with OpenGL too, although I can't be sure. Unfortunately there isn't really a community built up around this project so getting help with the library can be difficult. Having said that, in my experience the author responds to questions asked on his blog fairly frequently so you may be able to get some assistance that way.

DirectX

Before ending this section I'll briefly go over something that confused me at first when working with this library. DirectX is not actually a single library but rather a collection of libraries for multimedia programming on Windows. The 3D graphics capabilities of DirectX are provided by the Direct3D library, and there are a variety of other libraries available, such as DirectInput for input tasks and DirectSound for playing sound files. Hence the original name of this library was "Direct3DHook" and not "DirectXHook". It's worth noting that SharpDX has wrappers around most (if not all) of the DirectX libraries, so if you needed access to one of these libraries in the toolkit then it could be obtained by downloading the corresponding SharpDX DLL.

HRPUManager

HRPUManager is what ties the toolkit together. It uses the InputSimulator and OverlayManager libraries to provide the services described at the beginning of this section. Usage of the program is described in the Using the Toolkit section.

The program functions as follows. It first reads in a configuration file provided by the user and validates it. If any of the configuration options are specified incorrectly then the program ends. It then searches the operating system for a running process that matches the name specified in the config file. If no such process is found then it begins waiting and polls for the list of processes every second. Once the target process has started, the program attempts to hook into its Direct3D functions using the OverlayManager library. After this, it then attempts to start FireAnt as a child process. Assuming that the hooking completes successfully and FireAnt can be started, the program then enters its main loop. The main loop consists of reading the output from FireAnt and parsing this output for the heart rate and cadence. Virtual key presses are then sent using the InputSimulator library depending on the heart rate and cadence values reported by FireAnt. It also updates the current heart rate in OverlayManager using the IPC capabilities that the library provides. The program continues in this fashion until the target process exits.
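The parsing step of the main loop might look something like the sketch below. Note that the "HR=142;CAD=65" line format is purely hypothetical, as are the class and method names; consult FireAnt's actual output format before relying on this.

```csharp
using System;
using System.Globalization;

// Each loop iteration reads one line from FireAnt's stdout, parses the
// heart rate and cadence out of it, then sends key presses and updates
// the overlay accordingly. Only the parsing step is sketched here.
public static class FireAntParser
{
    public static bool TryParse(string line, out int heartRate, out int cadence)
    {
        heartRate = 0;
        cadence = 0;
        foreach (var field in line.Split(';'))
        {
            var parts = field.Split('=');
            if (parts.Length != 2) continue;
            int value;
            if (!int.TryParse(parts[1], NumberStyles.Integer,
                              CultureInfo.InvariantCulture, out value))
                continue;
            if (parts[0] == "HR") heartRate = value;
            else if (parts[0] == "CAD") cadence = value;
        }
        // Treat a line with neither value as unparseable.
        return heartRate > 0 || cadence > 0;
    }
}
```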

It's worth noting that the rate at which the main loop iterates is implicitly throttled by FireAnt. As of this writing FireAnt reports its values 10 times a second, so the main loop of HRPUManager will also iterate roughly 10 times a second. If FireAnt is changed so that it reports its values at a different rate, then this will affect HRPUManager as well.

Build Process

The build process for the toolkit is fairly straightforward. You'll want to build InputSimulator and OverlayManager first, and then HRPUManager. HRPUManager needs to be built last since it relies on the DLLs produced by building the other two projects. This section isn't a step-by-step guide for building projects in Visual Studio as there are other resources that do this already. Rather, my goal with this section is to highlight a few of the intricacies involved with building the toolkit from source. If you've built projects in Visual Studio before then I honestly doubt that you'll run into any problems, so you may just want to skim this section unless you encounter build issues or are looking for more detail.

What you need

Before you can build the toolkit there are some supporting programs that are needed. The first is Visual Studio. Personally I used Visual Studio Community 2013. At first this version may seem like shareware because you can only use it for a trial period before you need to make an account. Making an account is free, however, and you can use your Queen's email address for this. If you decide to use a newer version of Visual Studio then the solution/project files committed to SVN may not work out of the box. In my experience though Visual Studio is usually pretty good about upgrading old versions of project files to work in the current version.

The other program that you'll need is some sort of SVN client. Instructions for using the EQUIS SVN repo can be found here. Check out:

This library and the associated games have been tested using Windows 8.

Debug vs. Release Builds

When building a project in Visual Studio you have the option of building it in a Debug or Release configuration. Generally speaking, a Debug build has more debugging information included whereas the Release build has most of this information left out. In addition, the build configuration that you choose determines where the output files from the build process are placed - either bin\Debug or bin\Release. For the purposes of building the toolkit it doesn't matter much which configuration you choose. Having said that, I'd recommend building everything in the Debug configuration because HRPUManager references the two other projects through their bin\Debug folder. If you do build the projects in Release configuration then you'll need to update the Capture and WindowsInput references in HRPUManager. (The build instructions for HRPUManager describe this in a bit more detail.)

InputSimulator

You may want to build InputSimulator first since it's the easiest to build of the three projects. Open the solution file in Visual Studio (located at InputSimulator\WindowsInput.sln) and try to build the solution (shortcut Ctrl+Shift+B). If you're using Visual Studio 2012 or newer then I don't expect you'll run into any problems.

The solution is composed of two projects: WindowsInput and WindowsInput.Tests. WindowsInput is the main project that we care about, and if it builds successfully then the file WindowsInput.dll should be produced in InputSimulator\WindowsInput\bin. Where you may run into problems is when the solution tries to build WindowsInput.Tests. This project relies on NuGet (a package manager for Visual Studio) the first time it's built; for those familiar with UNIX development, it's roughly the .NET equivalent of apt-get, although somewhat higher-level. If you don't have NuGet and you really don't want to install it, you can avoid building the Tests project by building only the WindowsInput project from the Solution Explorer pane. NuGet is a useful extension to have, though, so I'd recommend installing it anyway (or upgrading to a version of Visual Studio that includes it).

OverlayManager

Next you'll want to build OverlayManager. Open the solution file in Visual Studio (OverlayManager\OverlayManager.sln) and try to build it. If it builds successfully then Capture.dll will be produced in OverlayManager\Capture\bin. You may notice that building the project also produces various DLL and XML files related to SharpDX in the bin folder, however these have just been copied over from the packages folder. These same files, including Capture.dll, also exist within OverlayManager\bin. I'm not entirely sure why the author set up the solution to have two output folders. Regardless, it shouldn't matter which Capture.dll file you use since they're identical.

Like InputSimulator, this solution also includes a test project (TestScreenshot) which is helpful for testing changes to the library. This application is described in more detail in the Testing the Toolkit section.

It's worth noting that several binary files for EasyHook are committed to OverlayManager\bin. Normally it's considered bad practice to commit binary files to your repository, however these were included already in the original Git repo for the project. Perhaps it would be possible to set up the project so that these files are downloaded through NuGet instead, but I wanted to make as few changes to the library as possible so I didn't bother with this.

HRPUManager

With the other two projects built we can now finally build HRPUManager. Open HRPUManager\HRPUManager.sln in Visual Studio and attempt to build it. If InputSimulator and OverlayManager were built successfully and you haven't modified the directory structure from SVN in any way then it should build successfully.

Compared to the other two projects, HRPUManager doesn't have many source files, nor does it rely on any arcane Windows DLLs. The main thing that may give you trouble while building it is the pair of references to the DLLs for the other two projects. These references use relative file paths so, as I said before, the project should build successfully so long as you don't change the directory structure used in SVN. If you have changed the directory structure for some reason (or you built the projects in Release configuration), then you can fix the references by removing the Capture and WindowsInput references through the Solution Explorer and re-adding them so that they point to the correct location. To be precise, the references must point to the Capture.dll and WindowsInput.dll files produced in the previous build steps.

The only other thing worth noting about the build process for HRPUManager is that it has a post-build event that runs after the project itself is built. This event copies the EasyHook binaries from OverlayManager\bin into the binary folder for HRPUManager. (I tried adding these binaries as references instead, but Visual Studio complained that they were corrupted.) These binaries are needed to run the program, so if the post-build event fails (e.g. because you changed the directory structure) then you'll need to copy these files over manually to use the toolkit.

Using the Toolkit

The main executable for the toolkit is located at HRPUManager\HRPUManager\bin\Debug\HRPUManager.exe. (Or bin\Release if you built the project in Release configuration.) The program can be run either from the command line or through Visual Studio. You can also run it by simply double-clicking the executable in Windows Explorer.

The program takes only one command-line argument: the location (i.e. a filepath) of the configuration file to use. If you don't provide this argument, then by default the program looks for a file named "config.xml" in its working directory. The file is written in XML, and its format is described on the Config File Format page. A template file (named config.xml) is provided in HRPUManager\HRPUManager. You can copy it into the executable directory for the program to use it; however, some configuration options will need to be filled in before it will work. Being able to specify the location of the config file is useful when you have multiple config files that you want to use with the toolkit. For example, if you're using the toolkit with multiple games or different experimental conditions, then you can have a config file for each case and simply change the command-line argument when you run the toolkit.

Once HRPUManager has been started and the config file has been verified, it then waits for the target process to start. After the target process has started, the toolkit hooks into its Direct3D functions so that the heart rate overlay can be drawn. It also starts up FireAnt as a child process and sends virtual key presses to the target process to indicate activation of player movement and the power-up. The implementation of the toolkit is described in more detail in The Toolkit section. The main thing to note here is that the virtual key presses are not sent directly to the target process. Rather, they get sent to whatever window currently has focus. So usually you'll want your target application to have focus while you're using it with the toolkit. (The toolkit automatically switches focus to the target application after it hooks into its Direct3D functions.)

Testing the Toolkit

If you end up making changes to any of the projects used in the toolkit then there are a few testing applications that you may find useful for trying out your changes.

HRPUManager

For testing changes to HRPUManager I added the Tests class. Be forewarned however that I used this fairly early on in the development process so it may not be up to date with all of the changes that have been made since then. Its main use is as a starting point for you to write your own testing code. To run the tests, add code such as the following to the start of the Main function in the Program class:

Tests.TestHeartRateOverlay("TESV");
return;

The return statement is needed to prevent the usual code for HRPUManager from running. (Admittedly it would be better to set this up so that the tests can be run as a separate application, but I didn't bother with this since I didn't use this class very often.)

OverlayManager

The original Direct3DHook library came with a testing application named TestScreenshot. You may find this useful for testing changes to the heart rate overlay instead of using HRPUManager. TestScreenshot is already set up as the StartUp project for OverlayManager so to use it simply press F5 while working on the library.

The application opens a window with a large canvas for displaying screenshots ("captures") from the targeted DirectX application in addition to various options for controlling the library. It should be noted that none of these options are related to the heart rate overlay. The Draw Overlay checkbox, for example, only controls display of an FPS counter in the target application. However, I did make a few changes to the internal code for this application (primarily in Form1.cs) as I worked on the heart rate overlay to ensure that it still worked.

Testing changes to the heart rate overlay with TestScreenshot is easy. Simply start up TestScreenshot, then start up the application that you're targeting, and then switch back to TestScreenshot (e.g. with Alt+Tab) and enter the name of the executable for the target application. Once you click Inject it will switch focus back to your target application and the heart rate overlay should be displayed. In my experience the Autodetect option for the version of DirectX used by the target application works fairly well, however if the library has trouble detecting the DirectX version in use then you may want to specify the exact version with the radio buttons.

InputSimulator

There is also a testing project for the InputSimulator library, named WindowsInput.Tests. I haven't actually used this application since I didn't make any changes to the library, so I can't comment much on its usefulness. Like the Tests class in HRPUManager, it may be most useful as a starting point for you to write your own testing code in case you need to make drastic changes to InputSimulator.

List of Mods

This section provides links to pages describing the various game mods that have been developed with the toolkit. Such documentation should obviously instruct other people on how to use the mod that you've developed, but it can also be useful more generally as an example for how a mod developed to use the toolkit can function. If you develop a mod that uses the toolkit then you should consider documenting it here so that there's a record of its functionality.

Future Work

In this section I'll describe some of the features that could be added to the toolkit to make it more useful.

More Positioning Options for the Overlay

One of the current limitations of the heart rate overlay is that the position you define for it in the config file is always relative to the top-left corner of the screen. This is fine if you know the resolution of the target application ahead of time, but if you don't know the resolution and you aren't simply drawing the overlay in the top-left corner then specifying its position can be tricky. My idea for this feature was to change the configuration options so that if you specify the position with negative integers then the overlay would be drawn relative to the bottom-right corner instead. For example, a position of x=-10 and y=-10 would draw the overlay 10 pixels left and 10 pixels up from the bottom-right corner of the screen. Another possibility is to continue specifying the position with positive integers and instead add a new configuration option that controls which screen corner is used as the "anchor" for the position.

With regards to implementation, adding this feature would be fairly simple. Obviously you would change the overlay code so that it checks each screen coordinate to see if it's negative, in which case you would then convert the coordinate so that it's relative to the bottom-right corner instead of the top-left one. To do this you just need to subtract the coordinate from the dimensions of the screen (i.e. the resolution). The dimensions can be obtained from the SharpDX Viewport class. (See the Viewport property of a Direct3D9 Device for an example of how to get at this structure.)
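A minimal sketch of that conversion follows; the class and method names are mine, not from the toolkit.

```csharp
// Interpret negative overlay coordinates as relative to the
// bottom-right corner of the screen, as proposed above.
public static class OverlayPosition
{
    public static int Resolve(int coordinate, int screenDimension)
    {
        // x = -10 on a 1920-wide screen becomes 1910, i.e. 10 pixels in
        // from the right edge; non-negative values are left unchanged.
        return coordinate < 0 ? screenDimension + coordinate : coordinate;
    }
}
```

The same function is applied to x with the screen width and to y with the screen height.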

I suppose it's worth pointing out that in the current implementation it's perfectly valid to specify the position using negative integers. Doing so will cause the overlay to be drawn partially (or completely) offscreen, so I'm not entirely sure why you'd want to do this, but it is possible. If this feature were implemented as I suggested (negative position coordinates draw the overlay relative to the bottom-right corner), then it would still be possible to draw the overlay offscreen, although doing so wouldn't be as intuitive. For example, to draw the overlay 10 pixels offscreen past the top-left corner on a screen that's 1920 by 1080 pixels, you'd need to specify the position as x=-1930 and y=-1090, whereas in the current implementation this would be done with x=-10 and y=-10.

Add Support for More DirectX Versions

As I mentioned in the Limitations section for OverlayManager, currently the overlay only works with DirectX 9 applications. I'll describe here some of the implementation concerns for adding support for newer versions of DirectX.

The most important class for drawing the overlay in DirectX 9 is the Sprite class. Basically, you give a Sprite object a texture loaded from an image file and tell it where on the screen to draw it. Thankfully this class also exists in DirectX 10, so adding overlay support for that version shouldn't be too difficult; there are some differences from the DirectX 9 counterpart, but nothing too significant. Unfortunately the class was removed in DirectX 11, so implementing the overlay in that version may be challenging. I did some research on this earlier and was unable to find an easily reusable solution, but perhaps one exists. The solutions that I did find relied on shaders and also performed certain matrix transformations on the textures that the sprites were loaded onto. Needless to say, you may need to spend some time expanding your knowledge of DirectX to implement the overlay in this version.
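For reference, drawing with the DirectX 9 Sprite class through SharpDX looks roughly like the sketch below; this is an illustration in the spirit of OverlayManager's code rather than a copy of it, and the exact overload signatures vary between SharpDX versions.

```csharp
using SharpDX;
using SharpDX.Direct3D9;

class SpriteOverlaySketch
{
    Sprite sprite;
    Texture heartTexture;

    // Called once, with the hooked application's Direct3D 9 device.
    public void Load(Device device, string imagePath)
    {
        sprite = new Sprite(device);
        heartTexture = Texture.FromFile(device, imagePath);
    }

    // Called each frame, from inside the hooked EndScene.
    public void Draw(int x, int y)
    {
        sprite.Begin(SpriteFlags.AlphaBlend);
        // Draw the texture at (x, y); Color.White means "no tint".
        sprite.Draw(heartTexture, Color.White, null, null,
                    new Vector3(x, y, 0));
        sprite.End();
    }
}
```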

As I mentioned in an earlier section, the original author of the library already implemented the hooking behaviour for DirectX 10 and 11, so the main task to complete, should you choose to add overlay support for these versions, is to implement the drawing code. (I should mention that the code for DirectX 11 seemed incomplete in some areas, so you may need to tweak it somewhat to get everything working correctly.) As of this writing DirectX 12 has not been released yet, so if you're working on something cutting-edge and need support for that version of DirectX or an even newer one, then you'll need to write the hooking code yourself in addition to the drawing code. Since I didn't write any hooking code myself I can't be of much help there, unfortunately, but I'd suggest looking at the existing hooking code for prior versions of DirectX to see how it was done. The original author also describes the approach that he took for things like DLL injection on his blog, so that's another good resource to consult.

Screen Obscuring Effect for the Overlay

In the original design of our study there were four experimental conditions: the game played without any modifications; the game with pedaling enabled; the game with pedaling and heart rate power-ups enabled; and the game with pedaling enabled and a screen obscuring effect. The fourth condition was eventually cut due to concerns over how long it would take to run each participant through the study. (Keep in mind we were doing this for two games, not just one.) The basic idea behind this condition is that the screen would be darkened in some manner, and this effect would be reduced as the player's heart rate increases. When the player's heart rate reaches the target, the screen is displayed as it normally is. One way to picture this is as a tunnel vision effect, although that's not the only way to do it.

The screen obscuring condition attempts to shape the player's behaviour through negative reinforcement, whereas the power-up condition does so through positive reinforcement. In this section I'll describe some of the possible ways that this feature could be implemented. In case it isn't obvious already, the bulk of the work would be done in OverlayManager, although some small changes would be needed in HRPUManager as well, since you'd want a configuration option to enable or disable the effect. I considered two possible ways to implement this feature: an easy way and a hard way.

The easy way to implement the screen obscuring effect is to take the same approach used to draw the heart rate overlay. That is, you would darken the screen using a set of sprites (which are really just image files). The approach can be described as follows:

  • You would have a set of image files stored on disk, let's say five. Each image would be a different size, and when drawn on the screen would create a tunnel vision-like effect.
  • Using the target heart rate, you would calculate a series of heart rate intervals that correspond to each image file used to darken the screen. For example, a heart rate of 60-70 would correspond to the image that obscures the screen the most, 70-80 would correspond to the image that obscures the screen slightly less, and so on.
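The interval lookup in the second bullet could be as simple as the following sketch; the class name is mine, and the 60-beat baseline, 10-beat step, and five images are just the example numbers from above.

```csharp
using System;

// Map a heart rate onto one of N obscuring images.
// Image 0 obscures the screen the most; the last image the least.
public static class ObscureLevel
{
    public static int ImageIndex(int heartRate, int baseRate,
                                 int step, int imageCount)
    {
        if (heartRate < baseRate) return 0;
        int index = (heartRate - baseRate) / step;
        return Math.Min(index, imageCount - 1);
    }
}
```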

The main advantage of this approach is that it would be simple to implement, since the code would be fairly similar to the existing code used to draw the heart rate overlay. There are two main disadvantages. The first is that the animation of the darkening effect wouldn't be very smooth: whenever the player's heart rate crosses into a different interval, the image being drawn would change, causing the darkening effect to "jump" to the next level. Players may find this jarring at first. The other disadvantage is that the implementation wouldn't be very generic in terms of screen resolution, since the images used to obscure the screen would be of fixed size. Since 1920×1080 is the most common display resolution right now and most games support it, perhaps this isn't such a big issue. But if you wanted to use this feature in a game running at a different resolution, you'd need to add a set of sprites that fit that resolution and configure OverlayManager to use them instead; otherwise the obscuring effect wouldn't display properly.

[Note: After thinking about this more, I realized that it may be possible to mitigate both disadvantages by scaling the sprite used to obscure the screen. For the first disadvantage, instead of having multiple images for different heart rate intervals, you could use a single image and increase its size as the player's heart rate increases. So long as the sprite is repositioned so that it remains centered on the screen, the animation of the darkening effect would appear much smoother. This works because any part of the image that lies offscreen simply doesn't get rendered. The second disadvantage could be avoided by scaling the image to fit the resolution of the game, so you wouldn't need to provide a separate set of images for each resolution.]
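A sketch of that scaling idea, assuming a single vignette-style sprite kept centered on the screen. All names and the maximum scale factor are illustrative:

```python
def vignette_transform(heart_rate, resting_hr, target_hr,
                       screen_w, screen_h, sprite_w, sprite_h,
                       max_scale=3.0):
    """Compute the scale and top-left position of a single obscuring sprite.

    At resting_hr the sprite is drawn at native size (most obscuring); as the
    heart rate approaches target_hr it grows up to max_scale, pushing the dark
    border offscreen. Offscreen portions simply aren't rendered.
    Returns (scale, x, y) with the sprite centered on the screen.
    """
    # progress toward the target heart rate, clamped to [0, 1]
    t = (heart_rate - resting_hr) / float(target_hr - resting_hr)
    t = max(0.0, min(1.0, t))
    scale = 1.0 + t * (max_scale - 1.0)
    # reposition so the scaled sprite stays centered
    x = (screen_w - sprite_w * scale) / 2.0
    y = (screen_h - sprite_h * scale) / 2.0
    return scale, x, y
```

Because the scale varies continuously with heart rate, the darkening animates smoothly instead of jumping between discrete interval images.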

The hard way to implement this feature would likely require quite a bit more work than the easy way, but I believe it is the correct way to do it. This approach uses a fragment shader (called a pixel shader in DirectX) to draw the screen obscuring effect instead of sprites. For those not familiar with graphics programming, a fragment shader is a small program that runs for each pixel that will be rendered to the screen. The shader can modify attributes of that pixel, such as its colour and depth. In both DirectX and OpenGL, shaders are written in specialized languages with C-like syntax (HLSL and GLSL, respectively), and are usually executed on the graphics card for efficiency. The shader that I had in mind would function as follows:

  • First, you would determine how much of the screen should not be obscured by comparing the player's current heart rate to the target heart rate. This produces a radius defining a circle in the center of the screen that is not obscured at all.
  • In the shader, you would calculate the distance from the pixel to the center of the screen. If the distance is less than the radius calculated in the first step, you would simply output the pixel's colour unchanged. Otherwise, you would blend its colour with a transparent gray, causing that portion of the screen to become obscured somewhat.
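The per-pixel logic that the HLSL shader would implement can be sketched in Python. The gray colour, blend opacity, and function names here are illustrative assumptions:

```python
import math

def clear_radius(heart_rate, resting_hr, target_hr, max_radius):
    """Radius of the unobscured circle, growing linearly with heart rate."""
    t = (heart_rate - resting_hr) / float(target_hr - resting_hr)
    return max(0.0, min(1.0, t)) * max_radius

def obscure_pixel(color, px, py, cx, cy, radius,
                  gray=(0.5, 0.5, 0.5), alpha=0.6):
    """What the fragment shader would do for one pixel.

    Pixels within `radius` of the screen center (cx, cy) pass through
    unchanged; pixels outside it are alpha-blended with a transparent gray.
    Colours are (r, g, b) tuples in [0, 1].
    """
    if math.hypot(px - cx, py - cy) <= radius:
        return color
    return tuple((1 - alpha) * c + alpha * g for c, g in zip(color, gray))
```

In the real shader, both steps would run per fragment on the GPU, with the radius passed in as a constant updated each frame from the player's heart rate.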

Writing a shader to accomplish the above shouldn't actually be too hard. It may require some start-up time to become familiar with HLSL (the shading language used by DirectX), but once you know how to write a basic DirectX shader, writing one to obscure the screen in this manner should be fairly simple. The tricky part is figuring out how to set up the shader. Normally shaders are set up during the initialization stage of a graphics program, before the graphics pipeline starts running. However, since we're hooking into the DirectX functions of an already-running process, we don't have access to this initialization code. I'm not sure whether it's possible to set up a shader once the pipeline is running. Another issue is that the application you're hooking into likely has its own fragment shader already, assuming the program is nontrivial. I'm not entirely sure whether DirectX allows you to define multiple shaders of the same type, or whether they would conflict with each other in some way. So this approach may not be possible, given that we're "borrowing" the DirectX functions of another process. If it is possible, though, it should be preferred over the sprite-based approach, since it seems like the proper way to implement the obscuring effect and it also avoids the disadvantages mentioned above.

Other Considerations

I glossed over a few things in the discussion thus far that I'll go over now. The first is that even when the screen obscuring effect is displayed, you still want the heart rate overlay to be visible so that the player has some idea of their current heart rate. (I suppose you could argue that the amount of the screen that's obscured also gives them an idea of their heart rate, but the overlay likely does a better job of this.) This requirement could affect which approach is used, since an approach that obscures the overlay itself isn't really desirable. In the case of the sprite-based approach I doubt this will be an issue, because you can probably control how the sprites overlap by changing the order in which they're drawn. However, it may be a problem in the shader approach: if the shader runs after the overlay has been rendered, the overlay also becomes obscured, assuming it's positioned in an obscured area of the screen. Perhaps you could avoid this by telling the shader the exact area in which the overlay is drawn, but I don't know an easy way of doing this.

Another thing worth considering is what the screen obscuring effect should look like. Up to this point I've assumed that it would look like a tunnel vision effect. (See https://en.wikipedia.org/wiki/Tunnel_vision if you don't know what I mean.) You could implement the feature so that it looks like traditional tunnel vision, in which case you have a small visible area in the center of the screen surrounded by complete darkness. However, instead of surrounding the visible area with pure blackness, I think it would be better to blend the game's original display with a transparent gray. This allows the player to still have some idea of what's going on outside the visible area. With pure blackness, I think players would find the game too discouraging at a low heart rate, since they wouldn't be able to see very much.

Having said all that, the tunnel vision approach isn't the only way to draw the obscuring effect. Another possibility is to obscure the entire screen by blending it with a transparent gray. In this case, instead of shrinking the obscured area as the player's heart rate increases, you would reduce the opacity of the blended colour. As with the tunnel vision approach, I think a shader-based solution would still be preferred over a sprite-based one here.
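For this whole-screen variant, the only per-frame quantity to compute is the blend opacity. A minimal sketch, where the maximum opacity of 0.6 and the function name are illustrative assumptions:

```python
def overlay_opacity(heart_rate, resting_hr, target_hr, max_opacity=0.6):
    """Opacity of the full-screen gray blend.

    Most opaque at the resting heart rate, fading linearly to fully
    transparent once the target heart rate is reached.
    """
    t = (heart_rate - resting_hr) / float(target_hr - resting_hr)
    t = max(0.0, min(1.0, t))
    return max_opacity * (1.0 - t)
```

The same value could drive either a shader constant or the alpha of a single full-screen sprite.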

One final thing that I'll briefly mention here is the timing of the screen obscuring effect. The naive approach is to start drawing the effect as soon as the toolkit hooks into the target application. However, since it takes people a few minutes of exercise before their heart rate gets up to reasonable levels, it may be better to delay activation of the darkening effect so that players aren't presented with the negative condition immediately. The simplest way to do this would be to hard-code the delay into OverlayManager, but I think it would be better to expose it as an option in the config file so that the user of the toolkit has some control over it.
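A sketch of how that delayed activation might work, assuming a hypothetical delay value read from the config file (the class and option names are made up for illustration):

```python
import time

class ObscureTimer:
    """Tracks whether the obscuring effect should be drawn yet.

    delay_seconds would come from a new config-file option rather than
    being hard-coded into OverlayManager. The clock parameter exists so
    the logic can be tested without real waiting.
    """
    def __init__(self, delay_seconds, clock=time.monotonic):
        self._clock = clock
        self._start = clock()       # time at which the toolkit hooked in
        self._delay = delay_seconds

    def active(self):
        """True once the configured warm-up delay has elapsed."""
        return self._clock() - self._start >= self._delay
```

OverlayManager would consult `active()` each frame and skip drawing the obscuring effect until it returns true.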