MY ROLE: Solo UX Researcher

TIMEFRAME: January 2020 - June 2020

TOOLS USED: Published Heuristics

BIGGEST TAKEAWAY: This space is massive and truly needs more data and research dedicated to it!

Video Game Heuristics

Objective

1st: Discover and collate existing heuristics centered on video game user experience (VGUX).

2nd: Evaluate a variety of games to uncover how they excel, meet, or fall below UX standards.

3rd: Reflect on the heuristics and evolve them into a modern and holistic list.

 
 

Setup

This project was driven by my love of video games and user experience research. I wanted to understand what the intersection of my passions looked like and how I might get involved, so I asked: “How is the user experience for video games currently?”

 
 

First Objective: Discovery

In order to constrain the UXR direction, I focused on existing heuristics. This provided an opportunity to excel as a solo researcher and gain deeper experience with a single method. To do so, I performed an expansive literature review and collated the results into one list of heuristics. The process uncovered 25 heuristics covering 7 topics: Mechanics, Interface, Visual Clarity, Feedback & Rewarding, Error & Recovery, Assistance, and Customization.
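As an aside, a workflow like this (heuristics grouped by topic, then rated per game) can be tracked with a very simple data structure. The sketch below is purely illustrative, not part of the original study: the topic names come from the collated list above, but the sample heuristic, game, and rating values are placeholders.

```python
from collections import defaultdict

# The 7 topics uncovered in the literature review.
TOPICS = [
    "Mechanics", "Interface", "Visual Clarity", "Feedback & Rewarding",
    "Error & Recovery", "Assistance", "Customization",
]

# Heuristics grouped by topic (one placeholder entry shown;
# the real list has 25 heuristics across the 7 topics).
heuristics = {
    "Assistance": [
        "The Player does not need to read the manual or documentation to play",
    ],
}

# Ratings: game -> heuristic -> "exceeds" | "meets" | "below".
# The rating here is a made-up example, not a finding from the study.
ratings = defaultdict(dict)
ratings["Stardew Valley"][
    "The Player does not need to read the manual or documentation to play"
] = "exceeds"

def summarize(game):
    """Count how many heuristics a game exceeds, meets, or falls below."""
    counts = {"exceeds": 0, "meets": 0, "below": 0}
    for verdict in ratings[game].values():
        counts[verdict] += 1
    return counts

print(summarize("Stardew Valley"))  # {'exceeds': 1, 'meets': 0, 'below': 0}
```

Keeping evaluations in one structure like this makes it easy to compare games side by side and to spot which heuristics attract the most recommendations during iteration.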

View the unfiltered heuristic collection at the following link: Heuristic Sheet Link.

Pictured below is the final list of 25 heuristics:

When collating the rather dense list, it was intriguing to see which aspects actually related to the user experience versus the storytelling of the experience. Storytelling is more abstract, so it is more efficient and fruitful to gather feedback on it from users than from experts. The storytelling-focused heuristics were therefore deemed irrelevant to the heuristic development objective and cut from the list.

Now that I had a working list of heuristics, it was time to move on to the next objective: applying the heuristics, a.k.a. evaluation!

 

Second Objective: Evaluation

Games were selected to capture a wide variety of genres, to truly test whether the heuristics could hold up to any and all games. Additionally, to be cost-effective, I began with games already in my library that I had not played in at least a year. This put me in the valuable position of not remembering every single detail, yet remembering enough to work through edge cases or confusing user experience situations. The exception is the fourth game, Towns, which I had not played before.

Each game was given a deck for its evaluation:

Stardew Valley is a game I’ve had for a few years and tend to play on and off. It’s one of my most played games, though I hadn’t played it in over a year.

This was one of the primary reasons it was chosen as the first game: I was heavily familiar with its mechanics (and those of similar games), as well as the details it included. As a result, this first evaluation felt very thorough.

 

Papers, Please is one of the most stressful games I’ve ever played, and I enjoyed it precisely for how stressful it was.

I fully expected it to be deliberately overwhelming and thus break some of the heuristics. Its intentional tone and scenario are what drove me to analyze it. I was not disappointed, and in fact I appreciate the game even more now.

 

Overwatch is an intense multiplayer shooter. Seeking more variety in my analyses, I found Overwatch well suited to branching out in multiple ways.

I must note that I am not a very good first-person shooter player, but the game proves that players of any skill level can enjoy it in a myriad of ways.

 

Towns is an abandoned game, and one I had not played before. As such, it provided great insight for two reasons: it showed what it might be like to apply the heuristics to a game that is “in development” (rather than published, like the other games), and it demonstrated that an expert does not need to be entirely familiar with the analyzed game to use these heuristics to their full potential.

 

Third Objective: Iteration of Heuristics

Each game influenced the heuristics in some fashion; at the end of each evaluation are notes on how that game’s findings recommend iterating the heuristics. Though most recommendations were distinct or even overlapping, Stardew Valley and Overwatch competed on how to adjust the heuristic “The Player does not need to read the manual or documentation to play,” as each implemented documentation in different ways. With games that are more stable, like Stardew Valley, in the sense that they are not constantly adding new characters, and thus new powers for the Player to control and strategize with (as Overwatch does), documentation shouldn’t be constantly given to the Player. But when a game is in fact adding more and more powers and ways to play, gentle documentation should be provided through easy, non-disruptive, and seamless access. As such, the heuristic was iterated to “The Player does not need to pause the game to read the manual or documentation to play.”

The rest of the recommendations were collated and the heuristics were iterated, as denoted in bold below:

Reflection

I highly enjoyed applying these heuristics to various games, and I look forward to analyzing more. Playing each game for roughly an hour while recording it proved useful for later documentation. I did run into technical issues: one game would not even start past the main menu, so it was replaced with Towns. Another issue was that the later recordings lost their audio, so I adapted and used other players’ recordings on YouTube. In the future, a stronger computer better equipped to handle game recordings would be more useful and would yield more consistent recording quality.

From this point forward, I want to use the iterated heuristics, taking notes as I did with these four games. After another set of four, I will iterate the heuristics again. In the UX tradition, the iterations shouldn’t stop until there is enough saturation in overlapping comments that further iteration is no longer needed.