Thanks to everyone who came out for drinks and UX research discussions at User Research Friday a few months ago here at B|P! Along with KDA Research and Uzanto, we put on this little conference and had a wild turnout, with talks from folks including Ravit Lichtenberg from HP, Wendy Castleman from Intuit, Indi Young, Lane Becker / Todd Wilkens from Adaptive Path, and many others. We’ve got some videos and notes from the sessions up on userresearchfriday.com, where you can sign up to hear about the next one. We’ll probably make advanced UX research the focus of the next event, since many fine citizens let us know they were already familiar with the broad concepts we discussed. So bring your smarty pants next time, because it’s gonna get complicated. And crunk.
Ethnio Triple Threat
A big thanks to those who used the Ethnio alpha and beta over the last year and participated in our Ethnio usability testing! We have more Ethnio news coming soon, and B|P is proud to announce three products as part of the Ethnio remote usability family:
- Live Recruiting – Recruit participants for usability studies and market research directly from your web site or web application. Recruiting this way ensures your participants are actual site visitors who care about completing a task on the site or application (see the sketch after this list).
- Screen Sharing – Ethnio Screen Sharing allows you to quickly have your participant share their desktop so you can watch them interact with your web site or application while they are in their native task environment (home, office…tree house) during the study.
- Recording Pro – In addition to recruiting and screen sharing, Recording Pro has built-in conference calling and Flash video recording of the participant’s screen and audio from the session, and it allows up to 10 observers to join the session as well. Observers call in to an 800 number, can view the screen, and are automatically muted. We think it’s pretty cool.
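For the technically curious, here’s a minimal sketch of how a live recruiting intercept like this can work on a site. The screener form, sampling rate, and recruiting endpoint below are invented for illustration; this is the general idea, not Ethnio’s actual implementation.

```typescript
// Minimal sketch of a live recruiting intercept (hypothetical, not Ethnio's code):
// show a short screener to a small random slice of visitors and forward
// qualified responses to a recruiting service in real time.

const SAMPLE_RATE = 0.05; // intercept roughly 1 in 20 visitors

interface ScreenerResponse {
  jobTitle: string;
  companyRevenue: string;
  phone: string;
}

function maybeShowScreener(onSubmit: (r: ScreenerResponse) => void): void {
  if (Math.random() > SAMPLE_RATE) return; // most visitors never see the screener

  // A real intercept would render a small overlay form; this sketch assumes a
  // form with id "recruit-screener" already exists (hidden) on the page.
  const form = document.getElementById("recruit-screener") as HTMLFormElement | null;
  if (!form) return;
  form.hidden = false;
  form.addEventListener("submit", (event) => {
    event.preventDefault();
    const data = new FormData(form);
    onSubmit({
      jobTitle: String(data.get("jobTitle") ?? ""),
      companyRevenue: String(data.get("companyRevenue") ?? ""),
      phone: String(data.get("phone") ?? ""),
    });
  });
}

// Send the response to a (made-up) recruiting endpoint so a moderator can
// call the visitor back while they are still mid-task on the site.
maybeShowScreener((response) => {
  void fetch("https://example.com/api/recruits", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(response),
  });
});
```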
New Team Members
To support the development of Ethnio, and the continued demand for remote usability studies, some new folks have joined the team since our last update. Mike Towber is the fourth member here at B|P to graduate from UC San Diego’s Distributed Cognition HCI program. Also joining the team is Lead Software Architect Julian Wixson, who brings 15 years of professional programming experience to the development of Ethnio. Kate Nartker now plays the role of Office Superstar, handles all our accounting, and keeps the office and all of us in line. You can see the whole team here.
B|P is Hiring!
We’re pleased to announce that we’re hiring for two positions: a User Experience Research Specialist and a User Research Intern! Since so many people are hiring during these Boom #2 days, you are probably sick of seeing these requests, and we can only hope that your frustration with the frequency of large-scale technology industry job market fluctuations will not prevent you from sending us someone who might be a good fit!
The Conference Tour
Whew! From Vancouver to Montreal to Broomfield, CO, we enjoyed meeting lots of you on our recent conference tour. You can get flickr’d out with photos of folks hanging at our booth at CHI or UPA here. Thanks to all those who stopped by and kicked it. It was great to hear about your recent remote research and your experience with Ethnio.
Public Pricing. Sweet.
B|P has made our remote usability study pricing public, with an interactive spreadsheet. You just enter how many users you might like in a study, choose which deliverables you’d like, and it will give you a price range. Consulting firms hate to make their pricing public, but we think it makes things just a little easier. To see pricing, visit our remote usability testing page and click on B|P Cost Estimator on the right.
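If you’re curious about the shape of the math behind an estimator like this, here’s a rough sketch. Every dollar figure below is a placeholder we made up for illustration, not B|P’s actual pricing; use the estimator on the site for real numbers.

```typescript
// Toy cost estimator: participant count plus chosen deliverables in,
// price range out. All dollar figures are hypothetical placeholders.

type Deliverable = "highlightsVideo" | "findingsReport" | "presentation";

const PER_PARTICIPANT = { low: 700, high: 1000 }; // hypothetical per-user cost
const DELIVERABLE_RATES: Record<Deliverable, { low: number; high: number }> = {
  highlightsVideo: { low: 1500, high: 2500 },
  findingsReport: { low: 2000, high: 4000 },
  presentation: { low: 1000, high: 2000 },
};

function estimate(participants: number, deliverables: Deliverable[]) {
  let low = participants * PER_PARTICIPANT.low;
  let high = participants * PER_PARTICIPANT.high;
  for (const d of deliverables) {
    low += DELIVERABLE_RATES[d].low;
    high += DELIVERABLE_RATES[d].high;
  }
  return { low, high };
}

// e.g. a 10-user study with a findings report and highlights video
console.log(estimate(10, ["findingsReport", "highlightsVideo"]));
// -> { low: 10500, high: 16500 }
```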
Guide to Remote Usability
Our favorite usability comic site, OK/Cancel, is featuring a guide to remote usability testing written by Nate. It will definitely not make sense to anyone who is not involved with usability in some fashion. It will make even less sense if you have not had your soul sucked away by the computer industry and are leading a happy, healthy life outside of the interwebs. More power to you, friend.
For all you multi-channel retailers, this month’s Multichannel Merchant has an article, ‘It’s The User, Stupid’ on how usability can “quickly gel differing opinions,” from your business stakeholders, designers, and developers. They quote Nate like crazy, which is always dangerous.
Playing Games and Cursing
Come say hello to Bolt | Peters and the get fresh crew at Adaptive Path’s User Experience Week this year, August 14-17, in Washington DC, where Nate will be presenting ‘Playing Games and Cursing: The Truth About Remote User Research’ with Rashmi Sinha. Nate and Rashmi will be discussing Ethnio and MindCanvas, Rashmi’s saucy Game-like Elicitation Methods (GEMs) research web service. Together they’ll show how interface research that is typically done in a lab can be done remotely, and how playing games and exhibiting more natural behavior helps people curse at software and web sites less.
Remote Usability Site
A blog we created with explanations of the different remote usability methods and links to the latest tools.
Dolby asked B|P to conduct usability research as part of an effort to completely redesign the Dolby website. Based on the findings from the first round of research B|P conducted, Dolby was able to improve key areas of the site in the redesign, including site navigation, content structure, and screen flow. The second phase, completed in November, measured the usability improvements of the redesigned site compared to the previous one.
Techinsurance and B|P share the belief that improved user experience leads directly to the success of online products in the marketplace. This mutual understanding was so strong that both parties committed to an agreement structured around payment as a percentage of improved conversion rates for contractedge.com, a legal software web site. Techinsurance implemented the B|P usability recommendations for contractedge.com, and as a result saw a conversion rate increase of 103%!
Find out how you can structure a similar relationship with B|P.
B|P Expands Its Staff
To meet growing customer demand, we made three new hires in 2004. Senior Information Architect Benjamin Lerch brings to B|P eight years of user interface research and design experience; Information Architect Brian Enright is the third graduate of UC San Diego’s Distributed Cognition HCI program to join B|P; and Jocelyn Wine is our new office manager, handling the ever-increasing load of administrative tasks. Thank you, Jocelyn!
Our New SOMA Digs
We’ve moved our satellite office! It’s now at 854 Folsom Street in San Francisco — right down the block from Yerba Buena Gardens and MOMA. We now have four permanent remote usability testing stations here, which can be used concurrently to test up to 20 users per day, as well as big leather couches and a full bar. Come visit! Of course, we still have our main office at One Market Street in San Francisco where we conduct our lab-based usability test sessions.
Happy Holidays!
Bolt | Peters would like to wish you happy holidays and a wonderful new year. We are wrapping up the year here with a new office, three new team members, and the Alpha release of our new remote usability testing software! We would love to hear what you’ve been up to, so drop us a line and let us know.
Want to conduct your own remote usability study? We walk you through every step of the research, from recruiting to observation.
Remote Testing versus Lab Testing
Purpose
This is a case study comparing remote and lab-based usability testing. Bolt | Peters conducted two parallel usability studies on the corporate web site of a Fortune 1000 software company in January 2002. Both studies used identical test plans, but one was executed in a traditional usability laboratory and the other was conducted remotely, using an online screen-sharing tool to observe behavior.
You can also find more information on remote usability tools at the Remote Usability Wiki.
Summary
Our comparison of methods showed key differences in time, recruiting, and resource requirements, as well as in the ability to test geographically distributed user audiences. There appeared to be no significant differences in the quality or quantity of usability findings between the remote and in-lab approaches. The detailed comparison below breaks down the key differences we found between the two testing methods.
Detailed Comparison of Methods
The sections below break down the process for each of the recruiting, testing, and analysis phases. For each phase, the lab study details are followed by the remote study details.
Lab Recruiting
Recruiting for the lab-based study was outsourced to a professional recruiting agency. Ten users were recruited, screened, and scheduled by G Focus Groups in San Francisco, including two extra recruits in case of no-shows. The total time required to recruit eight users through the agency was 12 days. Agency-assisted recruiting successfully provided seven test subjects for the lab study; the eighth recruit did not properly meet the testing criteria.
Remote Recruiting
Recruiting for the remote usability study was conducted using a “live” online pop-up on the software company’s corporate website. The recruiting pop-up, hosted by B|P, used the same screening questions G Focus Groups used to recruit users for the lab study. Users in both studies were selected based on detailed criteria such as job title and annual company revenue. Respondents to the online screener who met the study’s qualifications were contacted in real time by B|P moderators. The online recruiting method took one day and recruited eight users total, from California, Utah, New York, and Oregon. Normally the live screener requires four days of lead time to set up, but in this case it had already been built for a previous project, so no setup was necessary.
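As a rough illustration of the real-time qualification step described above, here is a small sketch. The job titles, revenue threshold, and field names are invented for the example; the actual criteria belonged to the study’s screener.

```typescript
// Hypothetical qualification check for live screener respondents.
// Only respondents matching the study criteria are surfaced for a callback.

interface Respondent {
  name: string;
  phone: string;
  jobTitle: string;
  annualRevenueUSD: number;
}

const QUALIFYING_TITLES = ["IT Manager", "CIO", "Systems Administrator"]; // made-up criteria
const MIN_ANNUAL_REVENUE = 10_000_000; // made-up threshold

function qualifies(r: Respondent): boolean {
  return QUALIFYING_TITLES.includes(r.jobTitle) && r.annualRevenueUSD >= MIN_ANNUAL_REVENUE;
}

// Filter incoming screener responses down to the ones worth a live callback.
function screen(responses: Respondent[]): Respondent[] {
  return responses.filter(qualifies);
}
```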
Lab Environment
The lab study was conducted in the software company’s in-house usability lab in Pleasanton, CA. The recruits traveled to the lab to participate in the study and used a Windows PC. In addition to capturing users’ audio and screen movements, the lab setup also recorded users’ facial expressions; the facial-expression video track did not yield additional usability findings.
Remote Environment
The remote usability study was moderated using B|P’s portable usability lab set up at the software company’s headquarters in Pleasanton, California. The live recruits participated from their native environments and logged on to an online meeting that allowed the moderators to view their screens. The users’ audio and screen movements were captured to be made into a highlights video.
Lab Findings
The in-lab study uncovered issues of similar quality and value to the client as the remote study. The laboratory method uncovered 98 key findings, slightly fewer than the remote study. The difference is too small to be statistically significant, but it is consistent with a trend we have seen in remote testing: more findings seem to surface than in lab testing.
Remote Findings
The remote study uncovered usability issues of high value to the client, and the number of key usability findings was slightly higher than in the in-lab study. The difference is statistically negligible, but we believe remote testing may tend to yield a somewhat higher number of findings.
Conclusion: Choosing a Testing Method for Your Project
Although both lab and remote methods delivered similar results, we found the following key differences:
- Geographic reach
- Time, cost and logistics
- User environment
- Perceived value of physical laboratories within your organization
When to use Remote Testing
Remote usability testing delivers cost and time saving advantages with the benefit of testing globally distributed user audiences in their native environments. The remote testing approach is ideal for:
- E-commerce web sites
- Large, informational web sites
- Web Applications
- Intranets
When to use Lab Testing
The in-lab usability testing method requires more resources and is limited to testing users on location. There are, however, usability research projects that benefit from the physical proximity of researcher and participant:
- Highly secure client/server applications
- Handheld apps or other products with a significant hardware component (these require observation of physical interactions)
- Test sessions lasting three hours or longer
- Projects where the client wants to physically observe the users
Further Reading
1. Comparative Study of Synchronous Remote and Traditional In-Lab Usability Evaluation Methods, master’s thesis, Virginia Tech, March 2004
http://scholar.lib.vt.edu/theses/available/etd-05192004-122252/unrestricted/Thesis_Prakaash_Selvaraj.pdf
2. An Empirical Comparison of Lab and Remote Usability, Human Interface Design Dept., Fidelity Investments
http://home.comcast.net/~tomtullis/publications/RemoteVsLab.pdf
3. Two-part article from the IBM User Experience Team on remote usability testing
http://www-106.ibm.com/developerworks/web/library/wa-rmusts1/
http://www-106.ibm.com/developerworks/web/library/wa-rmusts2.html