Common sense tips for designing Augmented Reality experiences.
Augmented Reality (AR) is gaining more popularity and ubiquity as hardware and software become more accessible. That’s great! Now, how do you make sure the experiences you design are easy to use, memorable, and leave a lasting impression?
During my time working with the world-class experience design studio Helios Interactive (recently acquired by the fantastic provider of global brand experiences, Freeman — Hooray!), I had the opportunity to work with our team designing UX for a myriad of AR experiences deployed all over the globe, for brands like Cartoon Network, Medtronic, the Golden State Warriors, Sprint, and more.
Let me tell you a bit of what I’ve learned.
Environmental Design vs Interaction Design
Environmental design is the context in which people will be engaging with your application. Imagine looking through a ‘window’ into an enhanced (augmented) environment. Whatever context Users are currently in is the environment, whether they’re walking down a city street, attending a heavy metal concert, or driving down a country lane. Your device is the window into the AR world. Keep the User’s context in mind, as it will have some bearing on things like UI placement, color, and size.
Interaction design is how you interact with that context or environment. These interactions take place within the ‘window’ of your phone screen or the viewfinder of the headset, where the media and 3D objects exist. Remember that context will also drive the interactions Users can use to engage with your AR experience. In what situations are touch interactions possible? Are there contexts in which only voice commands would be safe (driving, for example)? Will Users have a long or a short time to interact?
There are two main types of objects in AR: 3D volumetric objects, which can interact with light and shadow, and animated media, such as images and videos. These media items are essentially the same as traditional 2D media, just rendered in the new context of AR.
How Users interact with the objects, media, and UI in your experience can be influenced by several factors. When considering how to interact with the environment, first consider what hardware the Users will be engaging with. Phone interactions are different from HoloLens interactions. Interactions also depend on the goal of your experience. Think about Snapchat vs Pokemon Go vs Furniture/Car Staging — each of these encourages the User to reach a different goal, and the interactions available for achieving that goal should be obvious and helpful toward that end.
Think about how people will hold their devices while using your app. Try to place frequently accessed UI in comfortable-to-reach areas — that includes placing important elements in the center of the interface. Think about how you use your device. What areas are comfortable to reach? Will Users use one or two hands to navigate your app? Think about different device sizes — the Pixel vs the Pixel XL might require different button placement. Read about and experiment with comfortable interaction zones! Where can your thumbs reach easily? Think about design trends: lots of mobile applications now put frequently used elements at the bottom of the UI.
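One way to reason about those interaction zones is to model the thumb as pivoting from a bottom corner of the screen. A minimal sketch (the radius values here are illustrative assumptions, not platform guidelines):

```python
import math

def reach_zone(x, y, thumb_reach=0.75, hand="right"):
    """Classify a UI point as 'easy', 'stretch', or 'hard' to reach
    for one-handed use. Coordinates are normalized (0..1) with the
    origin at the top-left. thumb_reach is a hypothetical radius,
    as a fraction of screen height, measured from the bottom corner
    where the thumb pivots."""
    pivot_x = 1.0 if hand == "right" else 0.0
    pivot_y = 1.0  # bottom edge of the screen
    dist = math.hypot(x - pivot_x, y - pivot_y)
    if dist <= thumb_reach * 0.6:
        return "easy"
    if dist <= thumb_reach:
        return "stretch"
    return "hard"

# A button near the bottom-right is easy for a right thumb...
print(reach_zone(0.9, 0.9))    # easy
# ...while the top-left corner is hard to reach one-handed.
print(reach_zone(0.05, 0.05))  # hard
```

Running something like this over your layout during design reviews makes "is this button reachable?" a quick, repeatable check rather than a guess.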
Visual and Audio Cues
Cues are simply clues that inform the User what elements of the UI are designed to be interacted with, and how to interact with them.
Use on-screen UI to encourage looking around and to point out off-screen elements. Adding hover states to buttons and highlighting interactable elements are effective ways of prompting your User to look around the space. If you don’t, they may miss great things… like giant squid. Give the User feedback that something is happening, and give them time to deselect or move on if they wish. More visual feedback is better!
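A common pattern for pointing out off-screen elements is an edge indicator: an arrow pinned to the screen border in the direction of the hidden object. A minimal sketch of the math, assuming normalized screen coordinates with the center at the origin:

```python
def edge_indicator(obj_x, obj_y):
    """Given an object's position in normalized screen space
    (center at (0, 0), visible area spanning -1..1 on both axes),
    return None when it is on screen, or the point on the screen
    edge where a 'look over here' arrow should be drawn."""
    if abs(obj_x) <= 1 and abs(obj_y) <= 1:
        return None  # visible: no cue needed
    # Shrink the direction vector until its larger component hits the edge.
    scale = max(abs(obj_x), abs(obj_y))
    return (obj_x / scale, obj_y / scale)

print(edge_indicator(0.3, -0.5))  # None: already on screen
print(edge_indicator(4.0, 1.0))   # (1.0, 0.25): arrow on the right edge
```

In a real engine you would project the object's world position into screen space first, but the clamping step is the same idea.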
When designing a Snapchat or Animoji-style application, use placement guidelines for user tracking. Even better, employing easy-to-discover gestures (such as opening your mouth or winking) for additional animations or effects adds to the ‘delight’ of the User when they discover them.
Like visual indicators, sound will help the User notice off-screen objects and incentivize them to view the full 360° space. That way they won’t miss any friendly bears. People are used to interacting with flat screens, so you need to actively encourage your User to look around the full environment. Otherwise, some of your hard work might go unnoticed!
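The simplest version of this audio cue is stereo panning: play the object's sound louder in the ear it is closer to, so the User naturally turns toward it. A sketch of that mapping, assuming compass-style yaw angles in degrees (spatial-audio middleware does this for you, but the core idea is this small):

```python
import math

def stereo_pan(camera_yaw_deg, object_bearing_deg):
    """Return a pan value in [-1, 1] (-1 = full left, +1 = full right)
    that nudges the listener to turn toward an off-screen sound source."""
    # Signed angle from the camera's forward direction to the object,
    # wrapped into (-180, 180].
    delta = (object_bearing_deg - camera_yaw_deg + 180) % 360 - 180
    # Map the angle to a left/right pan; sin gives full pan at +/-90 degrees.
    return math.sin(math.radians(delta))

print(round(stereo_pan(0, 90), 2))   # 1.0: object to the right
print(round(stereo_pan(0, -90), 2))  # -1.0: object to the left
print(round(stereo_pan(0, 0), 2))    # 0.0: straight ahead
```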
Color and Text
The science of Color Theory works the same in AR as in print, mobile, web, and the rest of your life. Use contextually and culturally appropriate colors. Green (usually) means Go. Red (usually) means No or Stop. Blue is calming and generally agreed to be the color of “technology” and “the future”. Complementary colors (opposites on the color wheel) have higher contrast when used next to each other. Don’t make people’s eyes bleed!
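"Opposite on the color wheel" is just a half-turn in hue. A quick sketch using Python's standard colorsys module, handy for generating a high-contrast accent from a brand color:

```python
import colorsys

def complement(r, g, b):
    """Return the complementary color (hue rotated 180 degrees on the
    color wheel) for an RGB color with components in 0..255."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    h = (h + 0.5) % 1.0  # half a turn around the wheel
    return tuple(round(c * 255) for c in colorsys.hsv_to_rgb(h, s, v))

# Pure red's complement is cyan: maximal hue contrast.
print(complement(255, 0, 0))  # (0, 255, 255)
```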
Make text visible and easy to read by making it large and making smart font choices. Sans serif fonts can be easier to read than serif fonts, depending on your context. Use no more text than you would on a traditional mobile interface; no one wants to read a novel. Light text on a dark background or dark text on a light background gives the best contrast for reading.
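You don't have to eyeball that contrast: the WCAG accessibility guidelines define a contrast ratio you can compute, and AA conformance asks for at least 4.5:1 for normal-size text. A sketch of the standard formula:

```python
def relative_luminance(r, g, b):
    """WCAG 2.x relative luminance for sRGB components in 0..255."""
    def channel(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg, bg):
    """Contrast ratio between two RGB colors, always >= 1."""
    l1, l2 = sorted((relative_luminance(*fg), relative_luminance(*bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black-on-white is the maximum possible contrast, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

Checking your text and background colors against the 4.5:1 bar is a cheap way to catch unreadable overlays before they ship.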
Lighting
Lighting can make the difference between immersing your Users and leaving them bored. Project shadows from your objects to make them feel more substantial and bring them to life. Where, then, is a good place to position your light source? The best is simply overhead, at the 12 o’clock position. Severely directional lighting will most likely be incongruous with the context your User is viewing your application in, whereas direct overhead lighting is most likely a given in your User’s environment.
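The geometry explains why the overhead default is safe. A sketch of projecting a point onto the ground plane along a directional light (engines do this in shaders, but the math is the same):

```python
def project_shadow(point, light_dir):
    """Project a 3D point onto the ground plane (y = 0) along a
    directional light. point and light_dir are (x, y, z) tuples;
    light_dir must have a downward (negative) y component."""
    px, py, pz = point
    lx, ly, lz = light_dir
    t = -py / ly  # distance along the light ray until y reaches 0
    return (px + t * lx, 0.0, pz + t * lz)

# With a straight-overhead light (the safe default), the shadow
# lands directly beneath the object, which rarely looks wrong.
print(project_shadow((2.0, 1.5, -3.0), (0.0, -1.0, 0.0)))  # (2.0, 0.0, -3.0)
# A strongly tilted light throws the shadow sideways, which can
# visibly clash with the real-world lighting around the User.
print(project_shadow((2.0, 1.5, -3.0), (1.0, -1.0, 0.0)))  # (3.5, 0.0, -3.0)
```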
Innovate in Small Doses
Knowing how and when to innovate in a field as new as AR is important. Easy ways to increase engagement are to make buttons look like buttons and to keep enough familiarity that you don’t scare away new users. You don’t want to discourage people or make them work hard to figure out your interface. Remember that, just as in 2D design, placing a high cognitive load on your User is bad! Known mobile gestures like pinch, swipe, and tap will help ground Users in their context so they feel comfortable exploring the full experience.
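Part of why those gestures feel familiar is that they are trivially distinguishable from each other, so recognition rarely misfires. A very simplified classifier sketch (the threshold is illustrative, not a platform value):

```python
import math

def classify_gesture(touches_start, touches_end, tap_threshold=10):
    """Classify the familiar mobile gestures from start/end touch
    positions. Each argument is a list of (x, y) points in pixels,
    one entry per finger."""
    if len(touches_start) == 2:
        # Pinch: did the distance between the two fingers shrink or grow?
        d0 = math.dist(touches_start[0], touches_start[1])
        d1 = math.dist(touches_end[0], touches_end[1])
        return "pinch-in" if d1 < d0 else "pinch-out"
    moved = math.dist(touches_start[0], touches_end[0])
    if moved < tap_threshold:
        return "tap"
    dx = touches_end[0][0] - touches_start[0][0]
    dy = touches_end[0][1] - touches_start[0][1]
    if abs(dx) > abs(dy):
        return "swipe-right" if dx > 0 else "swipe-left"
    return "swipe-down" if dy > 0 else "swipe-up"

print(classify_gesture([(100, 100)], [(102, 101)]))  # tap
print(classify_gesture([(100, 100)], [(300, 110)]))  # swipe-right
print(classify_gesture([(100, 100), (300, 100)],
                       [(150, 100), (250, 100)]))    # pinch-in
```

Real gesture recognizers also consider timing and velocity, but the point stands: these gestures are cheap to detect and already live in your Users' muscle memory.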
Finding What Works
Hopefully you know the importance of Usability Testing for 2D experiences. Well, it is just as important for AR! For one facial tracking application, we discovered that the tracking software we were using had issues tracking people with darker skin tones and people wearing glasses. The solution was to switch to different facial tracking software.
Another important note for all Usability Testing: tailor your Usability Survey to your participants. For a mobile AR game, I conducted many of my play-tests with children. This meant I had to write the questions in a way the children could understand and feel comfortable answering; questions had to be short and to the point, and using smaller words was key. Don’t ask leading questions, and don’t ask questions that can be answered with a Yes or No. Instead, ask open-ended questions — What, Why, How? You’ll get much more detailed responses about their personal gameplay experience.
Empathy allows you to preemptively solve problems that certain Users may encounter during your experience. By taking the time to envision their needs, you can make your experience more holistically accessible and useful for all of your Users. Furthermore, placing yourself in another’s shoes saves designers and developers time and saves your company money, because there will be fewer bug reports, unknown edge cases, and critical issues when things go live! In short: how would you feel going through this experience if you weren’t you?
I hope this article helped you think about all that goes into making a good AR experience, and I hope you learned something new and interesting.
Thanks for reading!