
Usertesting.com Review by Rolf Molich

[Guest author: Rolf Molich owns and manages DialogDesign, a small Danish usability consultancy that he founded in 1993. Rolf conceived and coordinated the Comparative Usability Evaluation studies CUE-1 through CUE-8 in which almost 100 professional usability teams tested or reviewed the same applications.]

Usertesting.com – A usable and useful recruiting service

Usertesting.com offers unattended, 15-minute usability test sessions at reasonable prices with participants from the US, Canada and the United Kingdom. I have some experience with their service and I am pleased with it, but be aware that what they really excel at is recruiting users from a broad range of demographics to spend time on your site, not usability analysis.

Usertesting.com home page

“Use it and your site will get better”

On their home page, Usertesting.com quotes Evan Williams, Twitter co-founder: “Use it and your site will get better”. I respectfully disagree. Usertesting.com is a great recruiting service, but a skilled usability professional is still required to watch the videos and extract useful usability recommendations. The opinions from the participants are mostly worthless; it is participants’ behavior, not their opinions, that matters.

I used Usertesting.com in a recent comparative usability measurement study of Budget.com. Usertesting.com ran about 30 sessions for me with about 20 users. Since my five car rental tasks on Budget.com took about 25 minutes to complete, I had to split the tasks into two sessions and ask their customer service to have the same test participants do two sessions with different tasks. This worked like a charm, but it took me some extra time to manage and to write instructions for two sessions.

$19, $29, $39, and counting?

At the time of my study in April 2009, Usertesting.com was charging just $19 per test participant. Shortly after, they raised the price to $29 per test participant. Now they have announced yet another price increase, to $39 per test participant. At $19 per participant, their price/performance ratio was amazing. At $29 it was good. At $39 I am not so sure, considering that you only get 15 minutes of test time. At 4 × $39 = $156 per hour, their service is less competitive with traditional recruiting. Despite the price increases, they still pay their participants $10 per session.

High quality

Amazing turnaround time. In late December 2009 I used Usertesting.com to run three test sessions of the Enterprise.com car rental service. The sessions were completed within one hour of my submitting the request. Because of the time difference between the US and Denmark, where I live, I sometimes submit requests at 4 am East Coast time, but even these requests are honored almost instantly.

Their test participants are great, so Usertesting.com’s screening process seems to work. One or two participants out of 25 were rather talkative and offered 10 minutes of worthless personal opinions about the website in addition to their helpful task solutions. Fortunately, this happened rarely, and I simply filtered out the opinions and looked at actual behavior.

Video and sound from their recordings are good. I never had a problem understanding what a participant was saying or doing. See the sample screen from a recording below. In contrast, I have had professional usability consultancies who, without blushing, sent me usability test recordings where the video was blurred or the sound was a loud hum.

Screen showing usertesting.com video replay

Their customer service is excellent. I got a fast and sensible response each time I asked them a question or submitted a comment.

Their customer service told me that their agreements with participants allow clients to use the recordings however they want, including public presentation of the videos. This seems ethically defensible because you can’t see participants’ faces and you mostly don’t get their names.

A few limitations and caveats

For unattended test sessions, where no moderator is available to correct misunderstandings on the spot, you may want to pilot-test your instructions to test participants carefully. You must provide your own pre-session interview questions and the all-important debriefing questions.

Usertesting.com provides only rudimentary demographic information about participants, as shown in the screenshot below. If you want to know approximately where participants live, what their profession is, and whether they have used the site or similar sites before, you must use precious test time to ask for it in your test instructions.

Rudimentary participant demographics on usertesting.com

Usertesting.com’s setup allows the participant to interrupt the recording at any time. This may be OK for qualitative testing, but it spoiled a few of my task time measurements. Also, in a few cases test participants seemingly left the session for about 5 minutes without explanation: no sound, and nothing moved on the video. In one case it seemed to me that a test participant had rehearsed the test questions before clicking the recording button and starting the real session. I must add, though, that whenever I expressed even the slightest dissatisfaction with a participant, their customer service immediately refunded what I had paid. Of course, moderated sessions aren’t perfect either, and most often they don’t come with a money-back guarantee.

Some of my colleagues have argued that Usertesting.com offers “professional” test participants. It is true that on two or three occasions test participants inadvertently showed me their test session dashboard, which revealed that they had done about 10 test sessions recently. Personally, I don’t mind this. I have yet to see a participant who was so experienced in user testing that it prevented me from getting the data I wanted.

All in all: Recommended

Usertesting.com has a great concept and implements it with care. Before their recent price increase, I would even have said “Highly recommended”.

Resources

www.usertesting.com

The comparative usability study of Budget.com for which I used usertesting.com (CUE-8).


So you wanna be a UX researcher?

It’s May! Time for graduations and job hunts, and while B|P isn’t hiring right this minute (we have an awesome team and are busy refining our practices), we will be again before too long. It seemed like a good time for a post on what I look for when hiring user experience researchers, so here goes, in reverse order of importance:

#4: Education. If you have less than 3 or 4 years of experience, I’m looking for a degree in a related field, and that really can be anything from cognitive science to anthropology to HCI to…well, surprise me, I can be convinced. If it’s from a top school in our field, that’s cool, but it’s no big deal if it’s not. A Master’s is nice for showing a commitment to the field, but doesn’t tell me much about you as a practicing researcher. And whether you do or don’t have a related degree, you’ll definitely grab my attention with interesting people-oriented research projects during your education. (If you have more than 3 or 4 years of experience, I don’t much care where you went to school or what you studied—I’ll be evaluating your professional record exclusively.)


Bill Buxton’s Bad Ass CHI 2008 Keynote on Being Human in a Digital Age

Planet CHI

Okay, this photo has nothing to do with Bill Buxton’s keynote, except that it’s a photo planet I made at CHI this year in Florence. The closing plenary talk was the most inspiring talk I heard at CHI. I wasn’t planning on taking notes, but as soon as he said he’d thrown out the talk he originally planned to give, I got out my laptop and started typing. If I’ve missed any key points, please let me know! Here goes:


Bolt | Peters Goes Back to College!

Last week, Nate and I had the honor of speaking with a bunch of Cognitive Science students down at our alma mater, UCSD. Almost every year we send Nate and others down to talk to a Cognitive Engineering class about how much our HCI degrees mean to us in our professional lives (shocker: it’s a lot). It’s always a lot of fun; we get to meet a bunch of great people and chat it up with our mentors Dr. Jim Hollan and Dr. Ed Hutchins of the Distributed Cognition and Human Computer Interaction lab. We were joined this year by Kathy Seyama from Qualcomm's Usability Group, Ed Langstroth from the Volkswagen Electronics Research Laboratory, and Rod Ebrahimi from Do The Right Thing. Stay classy, San Diego.