We are still in the infancy of Virtual Reality (VR), but it’s getting difficult to ignore its growing mainstream popularity. Video game companies have been utilizing VR for years, and brands and publishers are now finding new ways to broadcast news, stories and events through this new medium. Putting all the cool content aside, what really excites me about VR is how little progress has been made on interaction design within our virtual environments. User Interface (UI) design is one of the most important challenges in getting Virtual Reality to break into popular culture, and it’s a challenge that hasn’t been cracked just yet.
User Interface design within VR is an unusual and challenging problem. To find a solution, we need to artificially recreate what feels natural to us in our everyday lives. How do we recreate real-world instincts in interfaces? Does the UI look more like real-life gestures than buttons and drop-down menus? How does User Interface design evolve in this new virtual world? Best practices are not obvious yet. No standards are set, and we are still trying to figure out what works and feels natural to us. I get super excited about the challenge of finding a new user interface design that feels familiar to us yet is unique to the challenges that a virtual reality world presents.
The Early Years of UI in VR
So far, User Interface design in Virtual Reality has consisted of web-like designs and some joysticks. Most menu interfaces look like website buttons, arrows and drop-downs. It doesn’t feel natural, and it is often a clunky experience to interact within a virtual environment. Most of the big Virtual Reality headset manufacturers have introduced a joystick-type device to help us navigate our virtual worlds. Oculus recently came out with an improved, VR-specific controller for the Rift, the Oculus Touch. The HTC Vive comes with two hand controllers with advanced precision and haptic feedback. The Sony PlayStation VR system comes with a pair of PlayStation Move controllers. All these joystick-style devices make sense for video games and cool art applications, but they don’t feel very natural when watching a film, story or news content. We don’t walk around with joysticks in our hands when we interact with reality. There needs to be a better way to naturally mimic how we interact with our VR surroundings.
Remember smartphones before the iPhone? They featured styluses, clunky keyboards and desktop-style designs. It wasn’t until Apple introduced multi-touch that we realized we had been doing mobile computing wrong for all those early years. Multi-touch was the breakthrough UI for our mobile devices and introduced the next computational platform for the masses: gestural interface design. We went from typing in commands to touching and swiping screens. What is the next computational interface that makes sense in Virtual Reality?
Your Hands in VR
We underestimate how much we use our hands to interact with our real world. We point, hold, flip off, grab, touch, weigh, caress and feel — all with our hands and fingers. The hand is a complex thing to replicate, but some companies are starting to experiment with hand tracking to find more genuine ways to incorporate hand gestures in Virtual Reality.
Manus VR is making the first consumer Virtual Reality glove. It is a clever way to incorporate natural finger and hand gestures in a virtual world.
Leap Motion and Oculus Rift came out with a hack that enabled your Leap Motion device to track your hand while in your virtual world. Hand tracking without the gloves!
Microsoft recently released some of their VR hand-tracking prototypes. Using their Kinect technology, they can more precisely emulate a virtual sense of touch.
These great advances hint at a UI mechanism that could be useful and natural within Virtual Reality. However, Kinect and Leap Motion have taught us that humans don’t like holding their hands up in mid-air for long periods of time. Hand tracking alone can’t be the answer for the future of UI in VR. And that is where the promise of the next computational interface can help influence the future of Virtual Reality.
Your Voice in VR
All the big tech companies are betting on the grand vision that “conversation as a platform” will be the next big paradigm shift in how we interact with our computers, and I believe UI in VR will benefit greatly from this new conversational platform. Yes, natural language processing is a problem we have been trying to solve for quite some time, and it is still frustrating to use. But it is finally starting to become useful, open and (more importantly) reliable. My Amazon Echo works surprisingly well inside the home. Apple recently expanded Siri to be able to communicate with other apps and companies. Samsung recently purchased Viv, a voice assistant built by the original creators of Siri, to get into the voice assistant game. It is hard to ignore the direction we are headed: the big tech companies are all bolstering their efforts in natural language processing.
Combining natural language processing with hand-gesture interaction makes for a powerful and familiar way to interact with our surroundings. We are moving away from the need to physically see and touch, and toward a UI based more on voice and gestures, just as humans naturally interact with one another. Quick hand gestures and voice commands could make a very genuine combination in helping UI in VR feel right.
It’s exciting to be in the infancy of Virtual Reality because it gives us the opportunity to help define and invent the standards and best practices that will survive the test of time and become the norm in mainstream VR. I’m putting my money on the right balance of gestures and voice recognition becoming the natural way we interact with our virtual worlds. I may be wrong, but I can say with certainty that joysticks are not the final answer for UI in VR.