We’ve recently finished a long project for a client.
The brief: “We want over a dozen mini-games that kids can easily play on their phones…”
Us: EZ
The brief: “... while introducing accessibility features so we are more inclusive…”
Us: Okaaay
The brief: “... for kids with disabilities, including blindness.”
Us: ...
Also us: Challenge accepted!
So after some internal discussions we rolled up our sleeves and did what we do best: solve Kuppin’ hard tech challenges. After all, we are tech-problem-solving junkies… and no, that has no correlation whatsoever with loving (*cough* being addicted to *cough*) Koffee.
We’ll break down how we dealt with the challenges, point out what NOT to do in similar situations, and share some learnings in retrospect.
Table of Kontents:
The general challenge of building accessible games
Keyboard-accessible
Giving the vision-impaired a helping hand voice
Game design with accessibility in mind
Changing our attitude towards accessibility
The general challenge of building accessible games
Most of you reading this will already know what Accessibility is, but for those of you lurking here trying to understand the bits and pieces, we’ll try to make it laic-cessible*
*[laic = layman = a person without professional or specialized knowledge in a particular subject.]
Web Accessibility means that websites, tools and technologies are designed and developed so that people with disabilities can use them: they can perceive, understand, navigate and interact with the web.
Web Accessibility covers all the disabilities that affect access to a website, including:
Auditory
Cognitive
Physical
Speech
Neurological
Visual
Now, with that in mind, we need to look at how the accessibility guidelines were actually designed for static content, which makes accessible games a slightly bigger challenge than one would tend to think:
They assume that the content of the website will be mainly text-based.
They assume that the content is mostly static - meaning that it won’t change too much over time or in reaction to user inputs.
The guidelines only allow the implementation of a few very specific user controls - all of them primarily designed to interact with… forms.
How did we overcome these challenges? Basically, by creating a giant, invisible html form on top of the game, and by firing lots of audio descriptions to inform the user about every relevant in-game event.
To be completely honest with you, and not catch ourselves just erratically flexing our biceps in front of the mirror, the truth is that there was quite a learning curve – both for us and our partner. With every new game we pushed the limits a bit further, and accessibility advanced with each game. Think of it as if each game were a dungeon with a boss at the end – after completing it, we leveled up, learned new spells and were more confident taking on higher-level dungeons.
Keyboard-accessible
As the mini-games were primarily for mobile, we started by making them keyboard-accessible. Sounds like a piece of cake, huh? Well… THE CAKE IS A LIE!
A seemingly easy-to-do task was, unironically, the hardest of all the accessibility features we implemented. For example, we had to create a virtual, keyboard-accessible html overlay on top of the graphics, handle the focus order and handle dynamic buttons – amongst many other things. Let us explain that in a bit more detail.
MOBILE GAMEPLAY IS TOO FANCY FOR A KEYBOARD
Since these games are meant to be played on mobile devices, the user inputs usually come from interactions with the screen - touching, swiping, tapping… But you cannot swipe on a keyboard. You cannot drag-and-drop, you cannot shake or rotate your device. So, any time we had a ‘fancy’ interaction, we came up with a fallback gameplay that would work with a keyboard.
For example, in order to clear up a canvas, mobile users can scratch the zone with swipe-like gestures on their screens, and for keyboard users we decided to let them repeatedly press on the spacebar.
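To make that concrete, here’s a minimal sketch of what such a keyboard fallback can look like - the names (scratchProgress, onScratchComplete) and the numbers are ours for illustration, not the project’s actual code:

```typescript
// Keyboard fallback for a swipe-to-scratch mechanic (illustrative sketch).
let scratchProgress = 0;          // 0 = canvas fully covered, 1 = fully cleared
const PROGRESS_PER_PRESS = 0.05;  // each spacebar press clears 5% of the canvas

function onScratchComplete(): void {
  console.log('Canvas cleared!'); // hand over to the next game sequence here
}

function clearSomeOfTheCanvas(amount: number): void {
  if (scratchProgress >= 1) return;                       // already done
  scratchProgress = Math.min(1, scratchProgress + amount);
  if (scratchProgress >= 1) onScratchComplete();
}

// Touch path: swipe gestures over the canvas clear it continuously.
function onSwipe(distancePx: number): void {
  clearSomeOfTheCanvas(distancePx / 2000);
}

// Keyboard path: repeatedly pressing the spacebar clears the same canvas.
window.addEventListener('keydown', (event) => {
  if (event.code === 'Space') {
    event.preventDefault(); // keep the spacebar from scrolling the page
    clearSomeOfTheCanvas(PROGRESS_PER_PRESS);
  }
});
```

Both input paths feed the same game logic, so the keyboard version stays a fully playable variant rather than a second-class mode.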
A CANVAS IS LIKE A DARK VOID OF NOTHING, FOR YOUR BROWSER
Web browsers are usually pretty smart, and they provide off-the-shelf accessibility features for your website if you code it the right way.
However, when you are using a game engine, you are rendering everything that happens each frame onto a canvas - a rasterized image, an array of pixels that makes sense to the human eye, but not to your browser.
In order to let the browser know what’s happening in games, we built tools to automatically sync up the game with an html grid - a layer of controls hidden to the user, but visible and understandable by the browser.
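As a rough illustration of that idea (the element id and helper names below are hypothetical, not the actual tooling we built), mirroring each interactive in-game item with a visually hidden but focusable HTML button could look something like this:

```typescript
// Illustrative sketch: mirror each interactive in-game item with a real HTML
// button that is visually hidden but still reachable by keyboard and screen
// readers. The names (GameItem, a11y-overlay, syncAccessibilityOverlay) are
// made up for this example.
interface GameItem {
  id: string;
  label: string;            // e.g. 'Red balloon'
  onActivate: () => void;   // same handler the touch input would trigger
}

const overlay = document.getElementById('a11y-overlay') as HTMLElement;

function syncAccessibilityOverlay(items: GameItem[]): void {
  overlay.innerHTML = '';   // rebuild the hidden control layer for this scene
  for (const item of items) {
    const button = document.createElement('button');
    button.id = `a11y-${item.id}`;
    button.textContent = item.label; // what the screen reader announces
    // Visually hidden, but still focusable and exposed to assistive tech.
    button.style.cssText =
      'position:absolute;width:1px;height:1px;overflow:hidden;clip:rect(0 0 0 0);';
    button.addEventListener('click', item.onActivate);
    overlay.appendChild(button);
  }
}
```

Keyboard users tab through these buttons in the order they are appended, which is also one simple way to control the focus order mentioned earlier.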
Platform-specific subtleties
On top of that, a final challenge awaits the accessible web developer: different phones behave differently. This is a classic of web dev, and it’s true for accessibility as well. For example on iOS, some keyboard buttons are reserved shortcuts - you shouldn’t rely on them to let the users interact with your game. In the same fashion, different browsers have different ways of handling the removal of interactive elements on the screen - we had to test them all, and come up with solutions that would work no matter the platform.
As you can see, it was rather a piece of slow sand. We were really hoping for cake; maybe next time!
Giving the vision-impaired a helping hand voice
Now, here one would think the opposite of the keyboard situation: at first glance this sounds way harder than simply “plugging in a keyboard”, but thankfully it was quite the opposite. The tools were already built for this by iOS (VoiceOver) and Android (TalkBack); we just needed to find a way to tie them effectively to the mini-games so they read out audio descriptions of what’s happening at the right time.
Not to get too technical, but through code, we can define an ARIA live-region - a part of the website whose content is meant to change over time. The browser will watch this region, and read out loud any content update made within it.
We defined a written description for every event of our games - for example, ‘Your spaceship is taking off’ - and we sent these messages to our ARIA live-region when the corresponding event happened in-game.
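Not the exact code from the project, but a minimal sketch of that live-region pattern (the element id and the `announce` helper are placeholder names) looks roughly like this:

```typescript
// Illustrative sketch of the live-region pattern. Assumes the page contains:
// <div id="game-announcer" aria-live="polite"></div>
const announcer = document.getElementById('game-announcer') as HTMLElement;

function announce(message: string): void {
  // Clearing the region first helps some screen readers re-announce a message
  // even if the exact same text is sent twice in a row.
  announcer.textContent = '';
  window.setTimeout(() => {
    announcer.textContent = message; // the screen reader reads the update aloud
  }, 50);
}

// Somewhere in the game code, when the corresponding event fires:
announce('Your spaceship is taking off');
```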
We also needed a way to properly describe the on-screen interactive elements. The tricky part is that the description of these elements could change depending on how the user interacts with them - for example, an item could be described as red, green, blue… depending on how the player painted it. It becomes even trickier if you can change both the color and the shape of the item.
There are basically two ways around it: you hardcode a description for every possible combination… or you try to be smart about it, which we did! We defined fragments of descriptions that could be automatically combined (while still making a valid sentence in any language) to cover all the cases.
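Here is a stripped-down sketch of that fragment approach - the types and labels are illustrative, and a real setup would presumably pull the fragments from the game’s translation files:

```typescript
// Illustrative sketch of the fragment approach: compose descriptions from
// independent parts instead of hardcoding every combination.
type Color = 'red' | 'green' | 'blue';
type Shape = 'circle' | 'square' | 'star';

interface PaintableItem {
  color: Color;
  shape: Shape;
}

// One small dictionary per fragment - easy to swap out per language.
const colorLabels: Record<Color, string> = { red: 'red', green: 'green', blue: 'blue' };
const shapeLabels: Record<Shape, string> = { circle: 'circle', square: 'square', star: 'star' };

function describe(item: PaintableItem): string {
  // 3 colors x 3 shapes = 9 combinations covered by only 6 fragments.
  return `A ${colorLabels[item.color]} ${shapeLabels[item.shape]}`;
}

console.log(describe({ color: 'blue', shape: 'star' })); // "A blue star"
```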
There was a lot of tedious testing to be done: using a screen reader to navigate your phone is so slow compared to touch that “slow motion” is an understatement. A big learning here was to primarily listen to real user feedback, rather than always going “by the book”. Trust us, it’s absolutely critical to test on real devices:
Most of the time, screen readers are not supported on emulators or remote phones.
NVDA is a great piece of software for testing a screen reader on desktop - but its speech rate can differ from Android’s TalkBack or iOS’ VoiceOver.
Older devices and OS versions sometimes handle accessibility in a slightly different way - we often have to provide fallbacks to account for older software.
Game design with accessibility in mind
It’s great to create a game and then add accessibility to it, but the real magic happens (in terms of accessibility) when you come up with gameplay and level design that are accessible from the start. It’s a delicate balance between the fun playfulness of the game and its accessibility.
A few key things we thought about during the game design:
If a sighted user can just tap with their finger to pick an item from a list, visually impaired people will have to use their keyboard to select the item - meaning they will have to tab through each item on the list until they get to the one they’re after. It takes significantly more time to do so! Keep that in mind when working on your game’s timings.
If you have too much visual stuff happening at the same time, the screen reader will have trouble describing all of it. Try to space out the events so the users are not overwhelmed by audio descriptions.
In general, time-based games are trickier to handle. If possible, use game mechanics that do not use any timer, and if you do, make sure to include the possibility to change the speed of the game in the settings.
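For instance, a speed setting for a timed mechanic could be as simple as the sketch below (the `speedMultiplier` and the timer structure are hypothetical, not the project’s actual code):

```typescript
// Illustrative sketch of a user-configurable game speed for timed mechanics.
let speedMultiplier = 1; // 1 = normal speed, 0.5 = half speed (twice the time)

function setGameSpeed(multiplier: number): void {
  // Clamp to a sensible range so the game never becomes unplayable.
  speedMultiplier = Math.max(0.25, Math.min(2, multiplier));
}

// Inside the game loop: scale every timer by the multiplier so slowing the
// game down stretches all deadlines consistently.
function tickTimers(deltaMs: number, timers: { remainingMs: number }[]): void {
  for (const timer of timers) {
    timer.remainingMs -= deltaMs * speedMultiplier;
  }
}

// Wired to a 'Game speed' option in the settings menu:
setGameSpeed(0.5); // give the player twice as much real time
```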
One of the main things we realized was that the more control we give to the player, the better - for example, letting players control the visuals and the rhythm of the game had a huge impact on accessibility.
Here’s an example: as much as you can, let the player decide when they’re done with one sequence and ready to move to the next one. For a visually impaired user, nothing is worse than having the whole context of the page change without any warning. Offer a ‘next’ button instead, so the player can get ready before they move on to the next stage.
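A bare-bones sketch of that idea (reusing the hypothetical overlay element from earlier; `loadNextStage` is a stand-in for whatever your engine does) could look like this:

```typescript
// Illustrative sketch: instead of auto-advancing on a timer, show a real
// 'Next' button and wait for the player. loadNextStage() and the overlay
// element are placeholders.
function loadNextStage(): void {
  console.log('Loading the next stage…');
}

function onSequenceFinished(): void {
  const next = document.createElement('button');
  next.textContent = 'Next'; // announced by the screen reader when focused
  next.addEventListener('click', () => {
    next.remove();           // clean up before the context changes
    loadNextStage();
  });
  document.getElementById('a11y-overlay')?.appendChild(next);
  next.focus();              // move keyboard focus onto the new control
}
```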
Changing our attitude towards accessibility
After this extensive project, our perception of and attitude towards accessibility improved. We appreciate accessibility way more now and have found a deep respect for it. Working on accessibility for your games doesn’t have to be a nuisance and can benefit everyone. It also helps to think about it as delivering the best experience possible for EVERYONE, rather than doing it out of fear of getting sued or shamed. It even provides you with new skills as a side effect! If you’re implementing keyboard accessibility, you’ll suddenly catch yourself navigating websites with your keyboard more often. After spending days on end battling with the screen reader, you empathize better with the vision-impaired and understand their struggles way more.
We also couldn’t help but think about how accessibility could benefit from the latest AI breakthroughs. Image labeling, speech-to-text, customized content - this is what accessibility is about, and AI is pretty good at all of this. Sounds like a pretty exciting application to explore!
Generally, it was a very enlightening experience for us, and we would recommend that anyone try implementing accessibility for their game, even on a smaller scale than we did!
Alright, my Koffee cup is empty now… need to get a refill. See ya in our next blog post!