A VR Prototyping Workflow for the Real World
A real workflow for delivering actual VR wireframe prototypes to maximize client understanding
Virtual Reality is pretty much the wild west. There are no defined conventions or standards, and most people in the world (meaning your clients) are probably unfamiliar with it. How, then, as a User Experience Designer, do you present VR UX to clients in a quick, shareable way with minimal effort from the rest of your team? This article is a response to the many "VR prototyping" tools I have seen that do not actually solve the real-world problem: showing VR UX to clients using the same workflow as non-VR UX, or as close to it as today's toolset allows.
As UX Designers, we need to be able to design scenes, objects, and interactions in a 3D, 360-degree environment and then easily share them with clients. Yes, tools like Tiltbrush are fantastic, but can people really translate VR scribbles into fully-realized screens if they have never even put on a VR headset? And how consistent are your Tiltbrush (or Blocks or Dry Erase or Spark) elements? Build a workflow where your designs can be systemized and distilled to their simplest elements, so that they are easily recognizable and repeatable.
Things clients need for VR UX
Clients need a sense of the 360-degree VR world: basic scene layouts with simple 3D objects, the ability to move through the scenes as a user flow, and easy-to-read text descriptions of any elements, objects, or interactions that the UX cannot model fully. They also need to be able to walk through the experience flow themselves, seamlessly, and show it to others, without having to buy expensive hardware (or your team having to ship it to them) that then requires detailed technical support to set up properly. These barriers to entry may already be turning clients off from your experience. Familiarity is an important element in VR experiences. Knowing where to innovate and where to follow conventions (e.g. clearly labeled buttons, or standard interaction gestures) helps you as a UX designer avoid confusing your user or your client.
Remove as many barriers to entry as possible
Things UX Designers don’t need for VR UX
Fully-realized avatars, crude drawing tools, interactions requiring invisible people to move elements around, a myriad of color options. Prototyping in VR should work the same way as prototyping wireframes for your 2D experiences. Use the simplest forms possible to walk people through the overall flow, objects, media, and interactions in an experience, without overwhelming clients with flashy visuals or making them use a headset they are unfamiliar with. Until headsets are more ubiquitous, disseminate your VR UX through the interfaces with the lowest barrier to entry and the greatest user familiarity: web and mobile devices. Getting client sign-off for a VR experience should save everyone on your team time and effort, not force designers and developers into a social VR netherworld where you all draw crude stick figures. Rather than putting on a whole shadow play during usability testing, let users walk through the experience at their own pace, with descriptions and clarifications added as needed.
Use the simplest forms possible to walk people through the experience
Our Proof of Concept Workflow
Designing in 3D is important, as is using tools that are current industry standards. So we begin our process by designing each UX screen in Unity (with models created in Cinema 4D as needed). Unity is great for this because it allows you to position the perspective of the User as well as create a rudimentary 3D environment and other 3D objects in the scene. Of course, multiple tools can do this; we chose Unity because our developers can easily find or build the specific tools we need to make the workflow easier. Once all the screens have been designed and all elements placed, we use a plugin that exports the Unity scenes as 360-degree equirectangular images. Now, how to share them?
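The export plugin we used isn't the only way to get these images; Unity itself ships the pieces needed to capture a 360-degree still. As a rough illustration (not the actual plugin from our workflow, and assuming Unity 2017.3 or later for `RenderTexture.ConvertToEquirect`), an editor script along these lines renders the scene from the User's point of view into a cubemap and unwraps it into an equirectangular PNG; the class name and menu path are placeholders:

```csharp
// Illustrative sketch only — place in an Editor/ folder inside a Unity project.
using System.IO;
using UnityEngine;
using UnityEngine.Rendering;
using UnityEditor;

public static class EquirectExporter
{
    [MenuItem("Tools/Export 360 Screenshot")]
    public static void Export()
    {
        Camera cam = Camera.main; // position this camera where the User will stand

        // Cubemap target for the six-face capture, and a 2:1 equirect target.
        var cubemap = new RenderTexture(2048, 2048, 24)
        {
            dimension = TextureDimension.Cube
        };
        var equirect = new RenderTexture(4096, 2048, 24);

        // Render all six cube faces (63 = bitmask for every face),
        // then unwrap them into a single equirectangular image.
        cam.RenderToCubemap(cubemap, 63);
        cubemap.ConvertToEquirect(equirect, Camera.MonoOrStereoscopicEye.Mono);

        // Read the result back to the CPU and save it as a PNG.
        var tex = new Texture2D(equirect.width, equirect.height,
                                TextureFormat.RGB24, false);
        RenderTexture.active = equirect;
        tex.ReadPixels(new Rect(0, 0, equirect.width, equirect.height), 0, 0);
        RenderTexture.active = null;
        File.WriteAllBytes("Scene360.png", tex.EncodeToPNG());
    }
}
```

The resulting 2:1 PNG is the format 360 viewers like Vizor.io expect, so each Unity scene becomes one uploadable "screen."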
To share and annotate the 360 images, we use Vizor.io — which lets us add scene transitions and place text and image annotations (like InVision, Adobe XD, insert your favorite prototyping tool). Guess what? This satisfies all our requirements! This Vizor.io prototype can then be viewed via web on desktop or mobile devices, and allows clients to view the full 360 scene and walk through the flow naturally, with verbose textual and image annotations for clarity.
This proposed flow can be executed by one person (it took me a total of six hours, from creating elements and scenes in Unity to uploading to Vizor, linking scenes, and writing annotations), while remaining flexible enough that all stakeholders on your team can have their say, and elements can be updated simply and consistently to reflect changes. Unity also allows materials to be applied to objects, and with some customization you can show different hover and focus states on objects in your scenes.
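One way to sketch those hover and focus states inside Unity (an illustrative approach, not the exact customization from our proof of concept) is a small component that swaps materials on pointer events. It assumes the scene has an EventSystem, the camera has a PhysicsRaycaster (or a gaze-based VR input module), and the object has a Collider:

```csharp
// Illustrative sketch only — attach to any object with a Collider.
using UnityEngine;
using UnityEngine.EventSystems;

public class HoverState : MonoBehaviour, IPointerEnterHandler, IPointerExitHandler
{
    // Assumption: both materials are assigned in the Inspector.
    public Material normalMaterial;
    public Material hoverMaterial;

    Renderer rend;

    void Awake()
    {
        rend = GetComponent<Renderer>();
        rend.material = normalMaterial;
    }

    // Swap to the hover material when the pointer (or gaze) lands on the object...
    public void OnPointerEnter(PointerEventData eventData)
    {
        rend.material = hoverMaterial;
    }

    // ...and back when it leaves.
    public void OnPointerExit(PointerEventData eventData)
    {
        rend.material = normalMaterial;
    }
}
```

Capturing the scene once per state gives you separate 360 images, so the hover state can be shown in the shared prototype as its own annotated screen.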
Of course, this was just a proof of concept, and there is room for improvement. But, I believe this is a useful and extensible step forward for getting your VR experiences out into the world!