Archive for July 2013
Reading an article about how Google Glass could change the way journalists work got me thinking about how Glass could help us in the future with market research and usability testing.
What is Google Glass?
It’s like a pair of spectacle frames that lets you take photos, record video and audio, make calls, listen to music without headphones through its bone-conduction speakers, and use the internet and apps discreetly. It wirelessly connects to an Android phone, giving you access to a stack of clever things through the tiny screen mounted on the frame that only you, the wearer, can see. You control it with eye movements/gestures and voice commands. It’s really smart and should be available to everybody late in 2013.
Google Glass and usability testing.
So what could we do with this exciting piece of hardware and a few Glass apps?
Easy interview recording?
It could be a great way to discreetly record what’s happening during usability testing. Glass will video whatever you’re looking at, so it could record a research session without lots of tech and obtrusive cameras. Participants may be aware that you’re wearing Google Glass, but this may be less invasive than using webcams and additional software like Silverback or Morae to record the interviews.
Making notes during research?
Taking notes in any form of research can affect the participant’s behaviour. As you write, the participant becomes aware that what they are doing is of interest and may start to think in detail about why their behaviour is so interesting. A custom app using Glass’s eye gestures could instead place bookmarks in a recording. This could simply be a marker on a timeline to note something of interest, or markers to help us time task completion.
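As a rough sketch of the bookmarking idea – everything here is hypothetical, since no such Glass research app exists – an eye gesture could simply log a labelled offset into the session recording:

```python
import time

class SessionBookmarks:
    """Hypothetical bookmark log for a recorded research session."""

    def __init__(self):
        self.start = time.monotonic()   # when the recording began
        self.marks = []                 # (seconds into recording, label)

    def mark(self, label="interesting"):
        # In the imagined Glass app, an eye gesture would trigger this,
        # leaving the facilitator free to keep talking to the participant.
        offset = time.monotonic() - self.start
        self.marks.append((offset, label))
        return offset

session = SessionBookmarks()
session.mark("task 1 start")
session.mark("task 1 complete")
# Task completion time is just the gap between the two markers.
task_time = session.marks[1][0] - session.marks[0][0]
```

The point is that the markers need carry no content at all – a timestamp and a label are enough to jump straight to the interesting moments when reviewing the video later.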
Keeping an eye on time?
It’s a simple one – but sometimes it can be hard to sneak a look at your watch when interviewing a participant, and not all research facilities have a clock in view. A quick glance at Google Glass could show you the time, or a more advanced research Glass app might use colours and countdowns to tell you how much time is left in the interview.
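Such a countdown could be as simple as mapping time remaining to a traffic-light colour – a sketch, with made-up thresholds, of what an app like that might display:

```python
def interview_status(elapsed_min, total_min=60):
    """Map time remaining in an interview to a traffic-light colour.

    The 25%/10% thresholds are invented for illustration.
    """
    remaining = total_min - elapsed_min
    if remaining > total_min * 0.25:      # plenty of time left
        colour = "green"
    elif remaining > total_min * 0.10:    # start wrapping up
        colour = "amber"
    else:                                 # nearly (or actually) out of time
        colour = "red"
    return colour, remaining

# e.g. fifty minutes into a one-hour session:
colour, remaining = interview_status(50)   # -> ('amber', 10)
```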
Reading a digital research script?
Having the interview script in Google Glass may allow for a more relaxed environment in in-depth interviews, where the facilitator could leave the printed script behind, just glance up at Glass, and engage with the participant in a less formal/scripted manner.
Discreetly receive questions and comments from the observation room?
Other members of the team observing the research could send messages and questions to the facilitator, either by audio or as a message appearing on Glass.
Potential problems with Glass for research?
- Apparently, battery life isn’t great – so we may need a powered version, hot-swappable batteries, or just a big bag of charged-up Glass units!
- They’re not that discreet, and although I think people will slowly become more aware of Google Glass after launch, they may still unnerve research participants with their sci-fi look.
- Your eye gestures may put participants off if you’re using Glass during a face-to-face interview. Some have suggested that eye gestures will become accepted if this type of computer interaction takes off, so maybe we’ll think nothing of talking to somebody as they glance up to read their Glass.
What do you think?
I’m sure there are more ways we can use Google Glass in usability and market research. How else could researchers use it?