Better playtests for game developers [How To Be A Games User Researcher - Bonus Edition 🎮✨]
Five of the biggest problems game developers face when playtesting, and a tool to solve them.
I spent last year interviewing game developers about playtesting - how & why they run playtests, and what makes playtesting harder than it should be.
As user researchers, we're lucky enough to spend all of our time crafting reliable studies - but for most game developers it's just one small part of their role. And it's a huge time-sink.
Many of the problems they described were very familiar:
"It's hard to find real players to take part"
"Itāll just tell me problems I already know"
"I donāt have the tools I need to run a playtest"
"It's hard & take ages to deal with the messy data"
"Iām not sure Iām doing it right. Am I biasing my results?"
This uncertainty adds up to "we don't have time to playtest", and teams don't run playtests or UX research studies as often as they would like to (or should). Risky!
As user researchers, we know how to deal with all of these. But not every team can afford a dedicated UX professional. I've been interested in fixing this problem.
At Develop last year I gave a talk, Better Playtesting for Indie Developers, which explained some of the challenges developers face, and how to overcome them using UX and user research approaches. In this bonus issue I want to share some of those lessons.
Today is also the launch of The Playtest Kit, which helps solve these problems for game developers. Because you are a How To Be A Games User Researcher subscriber, I've included a discount code that'll get you 20% off if used this month. Find the code at the end of this bonus post.
Better playtesting for game developers
In my Develop talk, I explored some of the barriers to more frequent and efficient playtesting for game developers, and shared how we deal with them in the research community.
🔍 Find the right players
We know how important recruiting the right players is to running successful playtests. Without the right players, the problems + behaviour we're discovering are unlikely to be representative - it's not helpful to test the difficulty of a complex first-person shooter with someone who has never played games before. (As an aside, struggling with dual-stick controls was always the first clue that a participant had been misrecruited when I ran playtests at PlayStation!)
Unfortunately, participant recruitment takes time to do right. Many game teams find it hard to dedicate that time, so they test with friends, colleagues or their existing fan community instead. They then draw unreliable conclusions about "will players like my game" or "is the difficulty balanced correctly" from those playtests - and I heard a few horror stories about that too!
Recently I've been encouraging teams in this position to develop playtest participant panels, and to top them up regularly with fresh batches of playtesters who can be called upon when needed. It's a repeatable process that maximises the value of each individual recruitment effort and minimises the time needed to call on playtesters in the future. As long as the team is aware of the risks, this can be the right balance for small teams.
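If it helps to picture what drawing from a panel like this looks like in practice, here's a minimal sketch in Python of pulling a fresh batch of eligible playtesters. The fields and criteria (shooter experience, hours played per week, a cooldown since the last invite) are invented for illustration - your own screener questions would take their place.

```python
# A minimal sketch of pulling a fresh batch from a participant panel.
# The panel fields and thresholds are illustrative assumptions, not a
# prescribed screener - swap in your own criteria.
from datetime import date

panel = [
    {"email": "a@example.com", "plays_shooters": True,  "hours_per_week": 6,  "last_invited": date(2025, 1, 10)},
    {"email": "b@example.com", "plays_shooters": False, "hours_per_week": 12, "last_invited": None},
    {"email": "c@example.com", "plays_shooters": True,  "hours_per_week": 4,  "last_invited": None},
]

def eligible(person, today=date(2025, 5, 1), cooldown_days=90):
    """Match the study's target audience, and rest people who were invited recently."""
    rested = person["last_invited"] is None or (today - person["last_invited"]).days > cooldown_days
    return person["plays_shooters"] and person["hours_per_week"] >= 3 and rested

fresh_batch = [p["email"] for p in panel if eligible(p)]
print(fresh_batch)  # ['c@example.com']
```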
💡 Learn new things with your playtest
The teams I spoke to often worried about not discovering new things in their playtests. They believed they already knew what a playtest would tell them, and that it wouldn't be useful. This was often rooted in vague research objectives, unfocused tasks, and just "seeing what players think".
Researchers deal with this with a repeatable, structured process: identify what decisions have been made recently, rank them by risk, and use the riskiest assumptions to form research objectives. The research objective informs the method, rather than the other way round. And with a bit of help, teams can identify their riskiest assumptions and then pick the right method to tackle them.
This keeps playtests focused and relevant, and ultimately leads to a culture of running small, frequent tests that reduce the risk created by large, festering, untested assumptions.
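To make "rank them by risk" a little more concrete, here's one rough way a small team might score it. The 1-5 scales and the likelihood-times-impact heuristic below are illustrative assumptions for the example, not the specific process from the talk or the kit.

```python
# A rough sketch of ranking recent decisions by risk. The scoring scales
# and example decisions are made up; the idea is simply that the riskiest
# assumptions become the playtest's research objectives.
assumptions = [
    {"decision": "Tutorial teaches crafting via text pop-ups", "likely_wrong": 4, "cost_if_wrong": 5},
    {"decision": "Boss difficulty curve in chapter 2",         "likely_wrong": 3, "cost_if_wrong": 4},
    {"decision": "Players understand the map icons",           "likely_wrong": 2, "cost_if_wrong": 2},
]

for a in assumptions:
    # 1-5 scales; higher product = riskier assumption
    a["risk"] = a["likely_wrong"] * a["cost_if_wrong"]

# Turn the top-ranked assumptions into this playtest's research objectives.
for a in sorted(assumptions, key=lambda a: a["risk"], reverse=True)[:2]:
    print(f'Objective: check whether "{a["decision"]}" holds up with real players (risk {a["risk"]})')
```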
🛠️ Get the right tools for playtests
Playtesting creates a lot of admin work. For ethical and legal reasons, data from players has to be handled very carefully and in line with local laws. Many game teams are also invested in keeping their game secret, which means creating and managing NDAs. And tests can be technically complex - potentially requiring installing builds remotely, recording cameras + screens, and checking the technical state of the game.
This is hard when doing it for the first time. Templates, processes and the right tools reduce the risk of critical failure, and make playtesting faster and more efficient. Time invested in 'research-ops' pays dividends by making playtesting easier to run as a regular activity.
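As one small example of what that research-ops investment can look like, here's a sketch of a pre-session checklist a team might keep alongside their templates. The items are examples of the kind of admin involved, not an exhaustive or legally-reviewed list.

```python
# A bare-bones pre-session checklist runner. The items are illustrative
# examples of playtest admin, not legal or compliance advice.
PRE_SESSION_CHECKLIST = [
    "NDA signed and stored",
    "Consent for audio/video recording collected",
    "Correct build installed and launching on the test machine",
    "Screen + camera recording tested",
    "Save data reset to the intended starting state",
]

def confirm_ready(completed: set[str]) -> bool:
    """Print anything still outstanding; return True only if the session can start."""
    missing = [item for item in PRE_SESSION_CHECKLIST if item not in completed]
    for item in missing:
        print("NOT DONE:", item)
    return not missing

# Example: two items done, three still outstanding before the session can start.
confirm_ready({"NDA signed and stored", "Screen + camera recording tested"})
```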
⏳ Make the most of the time with players
Playtests take a lot of effort to plan and run. I was concerned to hear that teams were missing opportunities to make the most of that effort, choosing underperforming research methods such as "get players to give some comments on our Discord channel".
Playtests can generate lots of different types of data. Observational data from watching people play. Sentiment data from what players say, or the ratings they give. Behavioural data from recording what they do within the game.
Researchers understand the types of data a playtest generates, and how best to match them to research objectives. As mentioned above, "what we want to learn from the playtest" should inform "what we ask players to do" and ultimately "what data we collect from the playtest". Balancing appropriate mixed-method studies with the pragmatic decisions that game developers have to make has been a particularly interesting challenge!
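One lightweight way to keep that chain intact is to write the study plan down as data, so every activity is tied to an objective and the type of evidence it produces. The objectives and activities below are invented for illustration - the check at the end simply flags whether each objective is covered.

```python
# An illustrative study plan linking each activity to an objective and the
# type of data it yields. The content is made up for the example.
from collections import defaultdict

study_plan = [
    {"objective": "Can players learn crafting unaided?",
     "activity": "Observe the first 20 minutes of play, no hints",
     "data": "observational"},
    {"objective": "Can players learn crafting unaided?",
     "activity": "Log every opening of the crafting menu",
     "data": "behavioural"},
    {"objective": "Does the art style land with the target audience?",
     "activity": "Post-session rating + short interview",
     "data": "sentiment"},
]

# Sanity check: every objective should be covered by at least one activity,
# and ideally by more than one type of data.
coverage = defaultdict(set)
for item in study_plan:
    coverage[item["objective"]].add(item["data"])

for objective, data_types in coverage.items():
    print(objective, "->", sorted(data_types))
```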
💪 Deal with messy playtest data, draw reliable conclusions & take the right action
Playtests, and research studies in general, generate a lot of raw data. Interpreting that data takes practice, so that we don't end up with misguided confidence in our decisions (or get too distracted by players saying "I don't like this").
In my interviews, I heard that game developers were not confident about this. They worried about introducing bias, misinterpreting the data - or just getting bogged down in it for days, eating up valuable time.
In the Develop talk, I shared an approach for separating weak and strong signals in your data, and for treating opinion data and behavioural data differently in order to draw reliable conclusions. This is essential for reducing the risk of misinterpreting playtest data, and for keeping game development moving.
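I won't reproduce the full approach from the talk here, but as a toy illustration of the general idea: an issue you observed across several players is a stronger signal than a one-off comment. The thresholds and the observed-vs-reported weighting below are assumptions made for the example only.

```python
# A toy illustration of sorting playtest findings into weak and strong
# signals. The thresholds and categories are illustrative assumptions,
# not the specific approach from the Develop talk or The Playtest Kit.
issues = [
    {"issue": "Missed the dodge tutorial prompt", "kind": "observed", "participants_affected": 5},
    {"issue": "Said the music felt repetitive",   "kind": "reported", "participants_affected": 1},
    {"issue": "Got lost returning to the hub",    "kind": "observed", "participants_affected": 3},
]

TOTAL_PARTICIPANTS = 6

def signal_strength(issue):
    share = issue["participants_affected"] / TOTAL_PARTICIPANTS
    if issue["kind"] == "observed" and share >= 0.5:
        return "strong"    # most people hit it, and we watched it happen
    if share >= 0.5:
        return "moderate"  # widely reported opinion - worth probing further
    return "weak"          # one-off; note it, don't redesign around it

for issue in issues:
    print(signal_strength(issue), "-", issue["issue"])
```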
🎉🎉🎉 The Playtest Kit 🎉🎉🎉
This brings me to The Playtest Kit - my attempt to solve these playtest problems at scale.
I've made The Playtest Kit to solve all of these problems for game developers (and a whole bunch more!) - creating a structured, repeatable playtest process so teams can efficiently run high-quality playtests without having to start from scratch (or hire expensive consultants!).
It's designed for game developers who are not user researchers but need to run better playtests - making our knowledge and expertise accessible to teams without dedicated UX professionals. The toolkit speeds up effective playtesting, makes it possible to playtest more regularly, and ultimately improves the quality of games.
During May, The Playtest Kit is available at a launch discount price (the price will go up next month!). Because you are a HTBAGUR subscriber, I also have a discount code that'll get you an extra 20% off this month.
🚨 The code is 'HTBAGUR', and runs out when the price rises at the end of May. Until then - double discount! 🚨
Learn more, and pick up The Playtest Kit before the discount runs out, at playtestkit.com
(no need to enter the code with the button above!)
The Playtest Kit has had some great feedback from developers so far, including Size Five Games' Dan Marshall, Snap Finger Click's Jo Haslam, Paradox Interactive's Juney Dijkstra, and more (see all their thoughts on the Playtest Kit website). If you pick up the toolkit, I'd love to hear your impressions - do drop me an email.
Stay tuned for our regular How To Be A Games User Researcher issue, later this month.
Steve