Wearable User Experience Design

by Erin Malone

I have a friend who recently tweeted that he had some extra Google Glass invites. I mentioned this to my team and we all got very excited at the idea, as we’ve been discussing wearable computing quite a bit here at Tangible UX lately.

From a design perspective, video goggles, wrist fitness computers, bra-mounted body calculators, and the like all pose exciting new design challenges. How will people drive while wearing their Google Glass (assuming robot cars don’t arrive before wearable computing hits the mainstream)? How can we bring more functionality to the wrist, beyond a simple pedometer? How does that futuristic bra give you valuable body-metrics data without being weird and annoying?

It’s no surprise that the success and proliferation of these new wearable gadgets depend on whether they have a good user experience at their core.

One thing to consider with wearable computing is that these devices need to be smart and well-designed enough to stay out of our way. The brilliance of their design comes from their usefulness as devices, with an interface that all but vanishes until the user needs something from them.

The heightened demand for better speech-to-text and voice-command technology in our smartphones and cars alike is a great example of how we can no longer be bothered with an obtrusive graphical user interface. For some applications, we so desperately want the GUI to disappear completely… but of course, there still needs to be some sort of interface. With other devices, such as Google Glass, we need a graphical interface, but we need it to be minimal, elegant, and completely relevant to whatever we’re doing while wearing the device.

Another thing to consider with wearable computing technology is application multitasking on these devices. Remember when Apple announced multitasking for iOS? Everyone sighed a collective “oh, thank goodness, I can finally multitask on my phone.” With wearable computing, multitasking seems likely to have the opposite effect. Generally speaking, when someone is doing something with a wearable device, that user is probably performing a primary action in meatspace: walking to grab a coffee, driving to work, cooking dinner, etc. While the device may be doing things in the background like tracking motion, playing music, or recording coordinates, the use cases for the interface may be singular: check email, take a photo, remember this thing, etc. An eye-goggle interface that displays too much heads-up information would render the device annoying and useless. Could you imagine someone walking down the street wearing Google Glass while trying to check a bank account, take a video, look up coordinates, ignore the augmented-reality advertisements, check email, and call a robot car all at the same time? There’s an intersection coming up; pay attention.

Yes, the interface should be unobtrusive, but it still must be designed well. This takes the “only what you need” mindset of mobile app UI design to the next level. The person is doing something, and the device should absolutely not interfere with it. The device and its interface should not disrupt the user’s primary tasks but assist with them, hopefully delighting the user along the way.

It’s our job as designers and technologists alike to make the right user experience decisions while designing these futuristic gadgets. At Tangible UX, we see a very exciting and bright future with these devices getting closer to reality and we’re excited to be part of the journey.

Huge thanks to my friend Dav, who was great about offering and coordinating the invite for me. I received the official invitation to the Google Glass program from Google this morning and went ahead and ordered it. Our Tangible Tangerine-Orange Google Glass Explorer Edition is on its way… stay tuned for adventures in Wearable Computing User Experience Design!