Analytics, the wrong way and the right
Rocket Mail was the first game for which I tracked metrics, using Google Analytics. Adding Analytics support to your app is fairly straightforward, but using it well isn’t. I’ve learned a thing or two from Rocket Mail, so with Mystery Game No. 1, I’m taking a different approach, documented in this post.
Analytics was one of the last things I did before launching Rocket Mail. When I added it, I basically went through all the screens in the game, asking myself “hmm, what data is available for us to gather at this point?”. I thought I might as well gather it all, because it’s just a few lines of code, and who knows when this data might come in useful? (Don’t worry, everything is entirely anonymous, down to IP address anonymization.)
This left me with a pile of data that’s very hard to make sense of. Google Analytics is a complicated product, with a fair share of seemingly arbitrary limitations, and if you don’t upload your data in just the right way, the dashboards won’t tell you what you actually want to know – even if all the needed data is technically available. And even if you wanted to do it yourself, a full dump of the raw data does not seem to be possible.
For example, there is the difference between “screen views” and “events”. Think of a screen view as the equivalent of a page view in traditional web analytics; it can be a particular screen in your mobile app. Events are typically triggered by user actions, like button clicks, and may or may not lead to a different screen. So which things should you record as screen views, and which as events? If you want to know how long the user spent on a screen, you need a screen view. But if you want to assign categories, you need an event. And if an event of a particular category leads to a new screen, you may even need to send both. In Rocket Mail, I used events exclusively, so I cannot track how long a game takes on average. (Only on average. Averages suck, but percentiles are outside the capabilities of Google Analytics.)
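To see why averages alone can mislead, here is a quick sketch in plain Python. The durations are invented for illustration, but the effect is general: one marathon session drags the mean far away from what a typical game looks like.

```python
import statistics

# Hypothetical game durations in minutes; one long session skews the data.
durations = [3, 4, 5, 5, 6, 60]

mean = statistics.mean(durations)      # dragged up by the outlier
median = statistics.median(durations)  # what a "typical" game looks like
p90 = statistics.quantiles(durations, n=10)[8]  # 90th percentile

print(mean, median, p90)
```

Here the mean is more than double the median, which is exactly the kind of distinction percentiles would reveal and a single average hides.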
As another example, some operations are only available on some aspects of your data. In Rocket Mail, I wanted to track how many launches (cities) the player went through before hitting game over. I figured, if I send a start event at game start, and a launch event for every rocket launch, I can divide the two and get my answer. However, this suffers from two problems:
- The Analytics dashboard does not know how to divide two numbers.
- This data is skewed towards the low end because it also counts launches in games that were never completed.
With hindsight, it would have been better to send a game_over event at the end of the game, with a value containing the number of launches. The dashboard will average over the values of events. However, if I also want to record the score (and average it), again I’d need to send two separate events…
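To make the difference concrete, here is a sketch in plain Python with made-up launch counts: dividing total launch events by total start events is dragged down by abandoned games, while averaging a per-game value attached to a game_over event counts only games that actually ended.

```python
# Hypothetical per-game launch counts (made-up numbers for illustration).
completed_games = [5, 7, 6]  # games played through to game over
abandoned_games = [1, 2]     # games quit partway through

# Naive approach: divide total launch events by total start events.
total_launches = sum(completed_games) + sum(abandoned_games)
total_starts = len(completed_games) + len(abandoned_games)
naive_estimate = total_launches / total_starts  # skewed low by abandoned games

# Better: send the launch count as the value of a game_over event;
# the dashboard's average over event values then covers only finished games.
per_game_average = sum(completed_games) / len(completed_games)

print(naive_estimate, per_game_average)
```

With these numbers the naive estimate is 4.2 launches per game, while the per-game average over completed games is 6.0.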
Now this sounds like I’m ranting about Google Analytics. Maybe I am, a little, but it is a mature product with a lot of features, and entirely free, and I don’t know of any product that would do a better job. The problem is just that by trying to record everything, I was putting the cart before the horse. With so many ways to get it wrong, that didn’t lead to a useful outcome.
So in my next game (so far referred to as Mystery Game No. 1), I’m swapping cart and horse into their appropriate places. Rather than tracking all data haphazardly, I considered which questions I want the data to answer. I then added instrumentation just for that, and nothing else.
This is a mobile board game for 2-4 players, either local humans or computer (AI) players. Here are the questions I’m adding tracking for, and how:
What percentage of players start the game on day 0, 1, 7, 14, 21, and 28 since they installed it? This lets me know how long players keep playing the game (retention).
The Cohort Analysis page in the Analytics dashboard breaks down metrics by cohort, where a cohort is a group of people who installed (actually: first started) the app on a particular day. This feature is really nice, since it doesn’t just show app usage by cohort, but other metrics like session duration as well. Just record a start event at app startup (“resume” on Android).

What percentage of games are started with 2/3/4 players, human or AI, at which difficulty level? This lets me know how people are playing (by themselves or with friends), so I know where to focus my efforts.
For this, I record the start of the game as a view of the game screen, so that we get game duration for free. Custom dimensions are used to track the number of players, the type of players, and the difficulty level. Breakdown by custom dimensions is simple in the dashboard (although breaking down by more than one at a time may be harder), and percentages are available.

What percentage of games is played through to the end, as a function of the number of players, the type of players, and the difficulty level? This lets me know whether a single game lasts too long, and gives some clue about the cause.
For this, the end of the game is recorded as an event, which happens on the game screen. The average number of game_over events per screen view is the number we’re after; this would be 1 in the ideal case, lower in reality. I can’t find this number in the dashboard directly, but since both numerator and denominator are available, it’s not too hard to construct it by hand.

For games played to the end against the AI, what percentage is won by the human, as a function of the number of players and the difficulty level? This lets me know whether the AI is doing a good enough job.
With the game_over event, I associate an integer value: 1 if the human player won, 0 if they lost. The values are averaged in the dashboard, again sliceable by the custom dimensions. This gives a win fraction.

How many games are played (started) per session? This tells me whether people get bored after one game, or are frequently up for another match.
We’re already recording game start as a screen view. The Session Duration tab in the dashboard does not just break down sessions by duration, but also shows how many screens were viewed for each duration bucket. Unfortunately I haven’t found a way yet to reveal the number of finished games per session.
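As a sanity check on the by-hand construction, here is a short sketch with invented counts standing in for figures read off the dashboard, showing how the completion fraction and the win fraction fall out of the recorded data:

```python
# Invented example numbers, standing in for dashboard figures.
game_screen_views = 200  # games started (views of the "game" screen)
game_over_events = 140   # games played through to the end

# Completion fraction: average number of game_over events per screen view.
completion_fraction = game_over_events / game_screen_views

# Win fraction: game_over event values are 1 (human won) or 0 (human lost);
# the average over these values is the share of games the human won.
game_over_values = [1, 0, 1, 1, 0]
win_fraction = sum(game_over_values) / len(game_over_values)

print(completion_fraction, win_fraction)
```

With these numbers, 70% of started games reach game over, and the human wins 60% of the finished games.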
Altogether, these provide a much more useful and, above all, actionable set of metrics. It’s just a single case study, but I hope this post is of some use to others who want to add or improve their game’s analytics. Happy measuring!