
Guide to Remote Usability Testing

July 18th, 2006 by Nate Bolt

As more usability practitioners start conducting remote usability testing, there seems to be a demand for some tips and guidelines around this technique. New screen-sharing tools like [Breeze][2], [Co-Pilot][3], and [GoToMeeting][4], and remote usability tools like [Ethnio][5] and [UserVue][6], make it easier to conduct moderated remote usability testing. Dealing with video and audio recordings keeps getting simpler as well. But observing people remotely presents a unique set of obstacles, so this is a guide to what we’ve learned from conducting 149 remote studies with 1,213 participants over the last seven years. We can’t get that time back, but hopefully some of what we’ve picked up will be helpful.

###Moderated Remote What?
Basically, there are two kinds of remote usability – moderated and automated.

- __Moderated__ is one person watching another person use a computer, viewing their screen movements with a screen-sharing tool, and talking with them over the phone. The moderator watches and listens to where the participant runs into difficulty while interacting with the application or site, and records both the conversation and the participant’s screen. It’s pretty much the same as in-person testing, but minus the user’s facial expressions. Also known as __facilitated__ remote research.
- __Automated__ means hundreds or thousands of participants report their own behavior through a browser window or frame that has an open text field and survey questions. As the user navigates through a web site, they enter their feedback and answer page-specific questions in the browser frame. It’s more than a survey, because it’s still behavioral, but there is nobody watching and talking with the users. Some people call this unattended remote usability. The data is usually automatically distilled into a report with aggregate verbatim answers, click data, or exercise results. Some of the best new automated remote tools are [MindCanvas][7], [UserZoom][8], [ClickTale][9], [RelevantView][10], and [KDA Revelations][11].

You can and should mix these methods, but this article is going to stick to moderated. If you’re just getting started with remote usability, there are articles [here][12], [here][13], [here][14], and [here][15] that can give you more background.

###Recruiting Doesn’t Have to Suck
In order to get started, you’ll need some users, and there are basically three options for how you can recruit for remote testing:

- __Traditional__ includes all the ways you would recruit for an in-person study - email campaigns, customer lists, recruiting agencies, craigslist ads, bus stop flyers, whatever.
- __Online scheduled__ is where you use an online survey or recruiting screener of some kind to schedule participants for sessions in advance.
- __Live Recruiting__ means you use an online screener to intercept people in the middle of their real-life tasks, and watch them live as they complete those tasks, in their native task environment. No scheduling in advance. This method is so effective that I’m going to dedicate the entire next section to it. Here we go.

###Native Task Environment: A Really Big Deal
The ability to intercept someone live as they begin a task, have them quickly share their screen and comments over the phone, and then watch them continue to use their computer the way they were about to anyway, is by far the greatest benefit remote testing offers. The criteria by which participants make click decisions are totally different when they have a temporal, emotional, and logistical attachment to completing a task. Throw in their native physical and computing environments, and you are looking at the most accurate form of behavioral usability research possible. Well, that might be a stretch, but you see what I’m getting at. In order to catch people in their native task environment, you’ll have to do some form of live online recruiting.

###Live Online Recruiting
This method assumes two things – you are usability testing something that has a web site associated with it, and you can convince the I.T. or Web director to let you temporarily place some code on that page. Not always possible, of course. This code can be a static link that goes to your Zoomerang survey, or in the case of Ethnio, one line of JavaScript that triggers a DHTML layer recruiting invite. Here are the details of this method:

- __Got webernet?__ You’ll need editorial access to a web site or web application related to the tool you are testing. Why do you need this? Because that’s where you’re going to place the invite to your live recruiting form. You can recruit live for software by asking visitors to your organization’s web site whether they are going to be using, say, QuickBooks for anything that day, and then asking them to describe the task they will be performing. You’ll get real insight just from that alone, but then you can also call them before they’ve opened up QuickBooks for the day.
- __We all hate pop-ups__, but this is the best way to do live recruiting. Inside Ethnio we use a DHTML layer that pop-up blockers can’t block, but you can also insert a link or a feature box onto a single page (a rough sketch of what such an invite might look like follows this list).
- __Call Them__. Use your phone, not email, to make first contact with participants. You lose the chance to have people set up screen sharing in advance, but they have to be able to start right away if you want to capture true live behavior.
- __Six live recruits per hour__ is the minimum response rate we suggest for live recruiting. In our experience, this means you need at least six people to choose from each hour in order to never have to schedule anybody in advance for your study again. If the site or web app you’re testing doesn’t have that kind of traffic, you could sit around and wait forever, which generally isn’t much fun. With six per hour, you can conduct one session per hour and easily fit in 5-6 sessions per day. Samurai moderators can do 8-10 per day. Grab a second moderator and you can do 20 users in a day.
- __Use any old survey tool__ to hold your recruiting screener questions. Just be prepared to do a whole lot of exporting to Excel to get the verbatim responses. You could also build your own live recruiting tool in Ruby on Rails. That would be mad Web 2.0. There is a certain [live recruiting product][16] out there as well.
- __It’s Tricky to Call Fast__. Reaching live recruiting respondents within a few seconds of their response takes some doing. It means constantly monitoring the responses as they come in, but when you get someone on the phone within a few seconds, it yields awesome data.
- __Extremely Narrow Recruiting Quotas__ don’t really work with live recruiting. This means that as you narrow down your potential target audience you won’t be able to easily get live users. If you need to recruit a married database administrator from the east coast who enjoys marmoset photography and live folk music, you’re going to need to schedule them in advance.
- __But I want new users__. A common misconception is that you cannot recruit brand-new users with an online invite on an existing web site. We’ve found that calling within 10 seconds of someone arriving at the site and filling out the screener gives you nearly the same results as getting users from an agency or craigslist ad who swear they have never used your web site before. Just add a question to your screener that asks: “How many times have you visited the site before?” You can also do a mix of “new user” email recruits and live recruits and see for yourself.
- __Lady Luck__. In reality, live recruiting takes a lot of luck to catch people before they have actually started using your site. But even calling within a few minutes can yield good results, and you can absolutely mix and match recruiting methods. A common setup for us is 8 users recruited live and 8 users recruited by an agency.
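
To make the “one line of JavaScript that triggers a DHTML layer” idea concrete, here is a rough sketch of what such a recruiting invite might look like. This is not Ethnio’s actual code, and the screener URL is just a placeholder for whatever survey tool you use; it only shows the general shape of the approach.

```javascript
// Hypothetical recruiting invite sketch (not Ethnio's actual script).
// Dropped into a page with a single <script> tag, it shows a simple layer
// asking the visitor to fill out your screener.
window.onload = function () {
  var invite = document.createElement('div');
  invite.style.position = 'fixed';
  invite.style.top = '20px';
  invite.style.right = '20px';
  invite.style.padding = '12px';
  invite.style.background = '#ffffcc';
  invite.style.border = '1px solid #999999';
  invite.style.zIndex = '10000';
  invite.innerHTML =
    'Help us improve this site: ' +
    // Placeholder URL -- point this at your Zoomerang/SurveyMonkey/etc. screener.
    '<a href="http://example.com/screener" target="_blank">answer a few quick questions</a> ' +
    'and we may call you in the next few minutes. ' +
    '<a href="#" id="invite-close">No thanks</a>';
  document.body.appendChild(invite);

  // Let the visitor dismiss the layer.
  document.getElementById('invite-close').onclick = function () {
    document.body.removeChild(invite);
    return false;
  };
};
```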

###Observing
This is where you watch someone else’s desktop behavior using a screen-sharing tool. You just have to pick the screen-sharing tool that’s best for you. No matter what tool you use, some people won’t be able to successfully share their screen. We see only about an 85% success rate just getting someone to install a screen-sharing tool; participants often have firewalls, no admin installation privileges, and so on. Here is a list of my favorite tools, and you can find updates and a more detailed list at the [remote usability wiki][17]:

- __[Breeze][2]__ (It’s in Flash. It’s slick. Very reliable. Kinda pricey.)
- __[UserVue][6]__ (Coming soon from TechSmith, pricing not yet determined)
- __[GoToMeeting][4]__ (Least expensive. Speed is much better than WebEx. Also, if you are using the “Reverse Morae + GoToMeeting” method to conduct remote sessions, you can use the features and flags inside Morae to log behavior, which is cool.)
- __[Co-Pilot][3]__ (Based on open source VNC, which rocks.)
- __[WebEx][18]__ (Slow, and they once billed us $3,000 for two sessions. I hope they fixed that glitch.)
- __[Ethnio][5]__ (Early release, hard to get an account, and only tool with integrated audio recording and recruiting as of July, ‘06.)

###Tips for Remote Moderating
Once you’ve chosen a screen-sharing tool, you’ll have some questions about how to contact participants and interact with them over the phone. Do you prefer taking notes by hand? You can still do that with remote testing, but you might consider giving the old evil machine a try for note-taking. Coming up with a system for tagging quotes, behavior, and video time code as it happens will make your life so much easier (a small sketch of one such system follows the list below). Here is a list of things to keep in mind when you call the participant:

- __Practice Muting__. Come up with a phrase you can use frequently to put the participant on hold. Practice saying, “Please hang on a sec while I adjust something on my end.” You can say this to your participant for ANY reason, so that you can put them on mute. If a fire breaks out at your desk (it happens), “Can you hang on while I adjust something on my end?” works great. If an observer starts laughing hysterically and calling your participant a moron, just go “Can you hold on again one more time while I adjust something again?”, press mute, then slap the observer with a reverse open palm.
- __Success Rate__. Only 80-85% of the people you call will be able to participate (and sometimes less). Some will have insane firewalls that don’t even work with the workhorse WebEx. Some will have kids that trip over Ethernet cables. Many will be whispering to you from their cubicle so quietly you can’t hear a d*mn thing they’re saying. Just plan for this in the same way you recruit extra lab users to accommodate for flakes.
- __Scary Willingness__. Almost everyone you call will be willing to install any crazy screen-sharing tool you ask them to. This is scary, and I don’t know why people are so willing. If someone doesn’t want to, thank them and move on. Users recruited live are plentiful.
- __Outside Observers__. The absolute most fun way to do remote testing is in a big conference room on-site with whoever is paying for your study (client, your company, etc.). Use a projector and speakers hooked up to your phone tap, so that all the engineers, designers, and stakeholders can come in to watch live. If the moderator uses a headset, people can have low-level conversations in the same room and the directional mic won’t pick them up. They can also tap the moderator on the shoulder or send an IM if they want follow-up questions asked or something changed about the moderating. You can also set up remote observers with most of the tools out there by having them join the meeting. Make sure observers are invisible by hiding the participant list, and make sure observers are muted if they call in to a teleconference to listen to the session. Your participant may guess there is more than one person listening, but there is no need to make it obvious by having outside observers giggle or sneeze when the user does something silly. I mean interesting. Something interesting. Post these three ground rules on the door:
  - No loud laughing or giggling.
  - Speak at whisper level.
  - No yelling out what you think the participant should click on.
- __Study Design is Up to You__. You can use the live recruiting method but have participants do all pre-determined tasks, or even measure time-on-task. You can mix online scheduled with live recruiting, or divert some users to a prototype, just as you could in person.
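
As mentioned above, a tagging system for quotes, behavior, and video time code pays off quickly. Here is a minimal sketch of one such system (hypothetical, not a tool named in this article): a tiny Node.js script the moderator runs when the session starts, typing short tagged notes that get saved with the elapsed session time so they can be matched to the recording’s time code afterwards.

```javascript
// Hypothetical note tagger sketch: logs each note with elapsed session time.
var fs = require('fs');
var readline = require('readline');

var sessionStart = Date.now();
var logFile = 'session-notes.txt'; // placeholder output file

// Format elapsed milliseconds as HH:MM:SS so notes line up with the video time code.
function timecode(ms) {
  var totalSeconds = Math.floor(ms / 1000);
  var h = Math.floor(totalSeconds / 3600);
  var m = Math.floor((totalSeconds % 3600) / 60);
  var s = totalSeconds % 60;
  return [h, m, s].map(function (n) { return (n < 10 ? '0' : '') + n; }).join(':');
}

var rl = readline.createInterface({ input: process.stdin, output: process.stdout });
rl.setPrompt('note> ');
rl.prompt();

// Type things like: quote: "I have no idea what this button does"
// or: behavior: gave up on search, used the nav instead
rl.on('line', function (line) {
  var note = line.trim();
  if (note) {
    fs.appendFileSync(logFile, '[' + timecode(Date.now() - sessionStart) + '] ' + note + '\n');
  }
  rl.prompt();
});
```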

###Recording
Ahh, recording. You’d probably like to have the video and audio from your sessions afterwards so you can view, edit, and share them with your team. The easiest way would be to use a screen-sharing tool that has built-in audio and video recording, but until the Astoria Project Beta is complete or Ethnio is fully launched, you’ll have to use a combination of methods:

- __Camtasia + Phone Patch__. You can use a screen recording tool like Camtasia, combined with this [phone tap from JK Audio][19], to record to AVI. You plug the tap in between the handset and your PBX or analog phone, and run its output into the microphone input on the computer you’re using to test. The JK tap is inexpensive and the best-quality phone audio tap on the market. We’ve tried the ones from RadioShack, HelloDirect, and WebEx, and they all pick up interference or send low-quality signals.
- __Teleconferencing__. Another method is to use a teleconferencing service that records the phone call for you – Breeze allegedly does this, although we’ve never gotten it to work. You still have to use a screen recording tool such as Camtasia.
- __Speakerphone Rigged__. Yet another method is to position a microphone next to the speaker on your phone, or speakerphone. This is officially jury-rigged. It can work, but it’s susceptible to all sorts of problems. Mic getting knocked over. Noises in the room. Plus, nobody wants to listen to you over a speakerphone because you just don’t sound like yourself.
- __Reverse GoToMeeting + Morae__ is a tricky but popular method. You give the participant control of a computer on your end (through GoToMeeting or WebEx, etc.) that has Morae installed, then use Morae to record all of the participant’s behavior and somehow get the audio in. The main problem is that the participant is surfing super slowly, because they are remote-controlling the Morae computer. This will be unnecessary with the Astoria Project release, I’m sure.

###International Love
Many people get into remote usability strictly for the ability to test international participants. Here are the most important things you need to know:

- Plan for about __double__ the time that a domestic remote usability session would take.
- Use a native-language moderator if you can, or use the [AT&T Language Line service][20] to get a native-language operator on the phone doing real-time translation between you and your user. Be prepared to put subtitles in your highlight video using a video editing program for the coolest-sounding highlight clips ever.
- You can also recruit English-speaking participants in any country, as long as they verify they have been a resident for more than, say, 10 years.
- Realize that your cultural biases may have a huge, and difficult to predict, impact on the results of your international remote research. Just like in-person, but easier to forget about since remote testing is just so much easier than flying to Beirut all the time.

###Incentives
We use Amazon gift certificates because they only require an email address to fulfill. You can use any incentive you like, but if it’s a check or an American Express gift card, you might have to spend a considerable amount of time with every person verifying their address and name spelling. _I’m sorry, could you please spell Moheekuwaka Drive for me?_

###Make it happen.
If you’d like to get started with remote testing, there are three easy steps: first, choose a recruiting method; second, decide on a screen-sharing tool; and third, figure out whether audio and video recording is important to you and decide on a method for that. Then you just have to give it a whirl, ideally with a trial study, or you can just use it for an actual project. If you’re already doing some remote usability, don’t forget to experiment with different study designs and new tools. In 1999, we did 5% of our studies remotely at Bolt | Peters using Timbuktu or pcAnywhere. By 2005, almost 95% of our studies were conducted remotely. Now we hardly ever leave the computer screen, which is everyone’s goal, right?

[1]:http://www.boltpeters.com
[2]:http://adobe.com/products/breeze
[3]:http://copilot.com
[4]:http://www.gotomeeting.com
[5]:http://ethnio.com
[6]:http://www.techsmith.com/uservue.asp
[7]:http://www.themindcanvas.com
[8]:http://www.userzoom.com
[9]:http://www.clicktale.com
[10]:http://www.relevantview.com
[11]:http://www.kdaresearch.com/services/revelation.php
[12]:http://www.boxesandarrows.com/view/remote_online_usability_testing_why_how_and_when_to_use_it
[13]:http://www.webpronews.com/expertarticles/expertarticles/wpn-62-20060403RemoteUsabilityTesting.html
[14]:http://www-128.ibm.com/developerworks/web/library/wa-rmusts1/
[15]:http://www.stcsig.org/usability/newsletter/9901-remote-tools.html
[16]:http://www.ethnio.com/products/recruiting/
[17]:http://remoteusability.com
[18]:http://www.webex.com
[19]:http://www.jkaudio.com/quicktap.htm
[20]:http://www.languageline.com/

_Nate Bolt is co-founder and CEO of [Bolt | Peters][1] where he works on remote usability and ethnography. Having overseen hundreds of moderated remote usability studies for clients like Oracle, Time Warner, Princess Cruises, and Hallmark, he led the creation of the first moderated remote usability application, Ethnio. Nate speaks regularly about remote research and created a degree titled “Digital Technology and Society,” at the University of California, San Diego, which focused on the intersection of technology and mass population usage. He also studied at the Sorbonne in Paris, where he was jailed briefly for playing drums in public without a license._

11 Responses to “Guide to Remote Usability Testing”
bjordan wrote:

Interesting development in usability testing. However, it reminds me that any usability testing will stifle innovation…

Jessica Enders wrote:

Thanks for the interesting overview. I have to wonder, though, whether there might be privacy legislation implications for the approach. Here in Australia we have to make sure the participant is informed about what is happening with their “data” (i.e. what they are saying and doing), who will have access to it etc, which means coming clean about observers. This would apply regardless of whether it is remote or not. And more than just legislative requirements, I wonder if the ethics of this approach should be considered also.

nate wrote:

Great point. I think usability can certainly stifle innovation sometimes. Like if you test a design concept too early, it gets trashed and is discarded forever, when it could have been the bomb if it were more fleshed out. But certainly not all the time.

David wrote:

Awesome summary….best I’ve seen so far.

I’m a user of RelevantView and recommend it highly….that’s how I came across this blog post, by searching for that name.

Mark DiSciullo wrote:

Usability testing should not be seen as “stifling innovation”. It serves to inform design, not dictate it. As designers we all need to realize the user is not the designer. Usability testing activities serve as a great tool for allowing us to understand how well (or not) our target audience will readily embrace and adopt our designs. As designers it’s our job to interpret the data we gather and create the proper solution for our clients. In my experience, and especially dealing with complex business applications, I’ve discovered and designed more innovative solutions through user testing and analysis than if my designers and I had only worked in a vacuum.

BTW…I’m happy to report that I’ve been seeing remote user testing becoming much more accepted as a valid form of gathering both qualitative and quantitative user data. Our clients, our test participants, and my team have all been very happy with our experiences performing user testing in this fashion. Great article, Nate!

steven liu wrote:

We are using Morae, the TechSmith product; it’s very useful and powerful.

nate wrote:

hey steven, you have to use something in conjunction with Morae to conduct remote testing, like TechSmith’s UserVue, or some other screen-sharing and audio recording tool.

Cheryl Marks wrote:

I use WebEx (corporate contract) with Morae. To ameliorate the latency issues, I’ve started having the participant log onto the site directly. It seems to be much better than having the application open and passing control to them via WebEx.

Lynn wrote:

We would love to do remote testing of our web applications but would prefer to host it on our server because we manage confidential data. Any ideas for “observing” software that we can host ourselves, or a service that allows us to store the data? Do the services all encrypt the transmission?

helmiF wrote:

Hi Nate,
here’s another one you should add to your list for completeness, www.Loop11.com.
Recently launched in private beta.

Rob Edwards wrote:

Bit of a shameless plug, but we’re currently building a remote usability testing app - http://testled.com. Zero install, and it can test on any site.

Currently in beta but looking to launch soon.

Thanks,
Rob



