By the first definition, we could probably classify Jakob Nielsen, Don Norman, Jared Spool and numerous others as gurus in our industry. Interestingly, like the second definition of guru, the gurus of usability seem to agree about as often as leaders of opposing religious factions - or perhaps west coast and east coast gangs.
As practitioners of design and usability, we look to these experts for guidance. Not because we can’t make up our own minds, but because, presumably, they are investing money and effort in researching their claims, and we would like to benefit from that research. But who are we to believe if they regularly contradict each other, and why is there such a discrepancy?
To continue our urban theme, “It’s All About the Benjamins.” Gurus do not publish their research online because it gives them warm fuzzy feelings. UIE and NN/g are both consultancies. They make money by holding workshops, tutorials and consulting directly for corporations. While I only refer to UIE and NN/g here, this motivation holds for any consultancy. So a consultancy publishes papers to build credibility and, most importantly, to differentiate itself. Companies need to differentiate themselves from one another to gain clientele.
More than differentiate, consulting firms must stand out. Our industry doesn’t exactly offer the types of celebrities that could pull off a Janet Jackson Super Bowl publicity stunt - let’s pause on that for a second while I recover from some visual imagery I could’ve done without. Nevertheless, controversy is what gets attention. Nielsen exerts influence by quoting numbers and statistics. UIE does so by debunking common (mis)conceptions, some of them touted as foundations of our practice.
Of course, there are the less sensational reasons as well. People disagree with each other: that’s a fact of life. No, we can’t just all be friends. Further, people do make mistakes - even gurus.
So we know some of the reasons why they disagree with each other, but that still doesn’t aid the budding usability specialist. The young apprentice, rising from the chaos of academia, approaches industry seeking enlightenment and guidance. One errant article could lead the poor, gullible learner down a treacherous path of deceit! Who do we believe? How do we tell who’s right, short of performing the studies ourselves?
The answer is that we can’t tell who’s right because we’re not given the information to make that call. Nielsen espouses numbers seemingly from somewhere in the nether region. Seriously, have a look at this quote from Alertbox:
> it takes 39 hours to usability test a website the first time you try. This time estimate includes …
Really? 39 hours? That’s an estimate? Why not 40 hours? Where in the world did you get 39? If my website is a single page versus a full-scale e-commerce site like Amazon, does it still take 39 hours? He then goes on to talk about his discount usability of five users yielding 80% of usability problems. I don’t necessarily disagree with discount usability, but the numbers seem arbitrary to me.
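To be fair, the five-user claim isn’t entirely from the nether regions: it traces back to the Nielsen & Landauer model, in which the share of problems found by n test users is 1 − (1 − λ)^n, where λ is the fraction of all problems a single user uncovers (about 31% in their data). A quick sketch of that arithmetic - the 31% figure is theirs, and whether it applies to your site is exactly the open question:

```python
# Nielsen & Landauer's discovery model: the share of usability
# problems found by n test users, assuming each user independently
# uncovers a fraction lam of all problems.
def problems_found(n, lam=0.31):
    return 1 - (1 - lam) ** n

for n in (1, 3, 5, 15):
    print(f"{n:2d} users -> {problems_found(n):.1%} of problems found")
```

With λ = 0.31, five users find roughly 84% of the problems, which is where the “5 users, ~80%” rule comes from. The rule is only as good as that 31%.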
On the other hand, UIE isn’t much better. In debunking Nielsen’s discount technique, Spool estimates ninety users are necessary for full coverage.
> When we tested the site with 18 users, we identified 247 total obstacles-to-purchase. Contrary to our expectations, we saw new usability problems throughout the testing sessions. In fact, we saw more than five new obstacles for each user we tested.
> Equally important, we found many serious problems for the first time with some of our later users. What was even more surprising to us was that repeat usability problems did not increase as testing progressed.
What types of problems? How severe are the new problems? What constitutes “serious” to UIE? How were the tests designed?
The article goes on to suggest that the difference may be due to the increasing complexity of web pages compared with the classic desktop software on which Nielsen based his claims. I find such statements bizarre. I’ve worked on fairly complex web applications, and none of them compare to desktop applications in complexity and feature set. Further, in my personal experience, even after three users there is a great deal of overlap in the problems found.
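There is one way both camps could be “right”: in the standard 1 − (1 − λ)^n discovery model, the number of users needed for a given coverage is n = ln(1 − coverage) / ln(1 − λ), and that figure is brutally sensitive to λ. A hypothetical sketch - the two λ values below are illustrative assumptions on my part, not measured data from either consultancy:

```python
import math

# Invert the 1 - (1 - lam)**n discovery model: how many users are
# needed to find a given fraction of all problems, if each user
# independently finds a fraction lam of them?
def users_needed(coverage, lam):
    return math.ceil(math.log(1 - coverage) / math.log(1 - lam))

# Hypothetical scenarios; lam values are illustrative only.
print(users_needed(0.80, lam=0.31))  # each user finds 31% -> 5 users
print(users_needed(0.80, lam=0.02))  # each user finds 2%  -> 80 users
```

If each user surfaces 31% of the problems, five users suffice; if a sprawling e-commerce site drops that to 2%, you need around eighty, which is in the neighborhood of Spool’s ninety. Without the underlying data, we can’t tell whose λ is realistic - which is precisely the complaint.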
Gray and Salzman, in their now somewhat infamous paper, “Damaged Merchandise” (1998), warned against using erroneous experimental data as a basis for advice to practitioners. Six years later, have things improved? We don’t really know. Thousands of people visit UIE, Alertbox and countless other sites for advice on their work. Yet none of these columns offer sufficient data to show exactly how their results were obtained. Under academic circumstances, such claims would never be acceptable.
Unfortunately, those who need the advice most are the ones least able to recognize its validity. Offering editorials on methodology isn’t wrong (we do it all the time here). But when offering advice based on research, the underlying findings should be presented to substantiate the claims. Unlike PowerPoint presentations, the web does not need to be fed to us in bite-sized snippets.
Nothin’ But a UCD Thang Pt. 1
The Ok/Cancel gang have published a very interesting perspective on the highly-visible disagreements between usability gurus such as Jakob Nielsen (NN/g) and Jared Spool (UIE). Be amused by the cartoon, and then read the commentary. To quote: By the fi…
Donald Norman has a new book out called Emotional Design. He has a few chapters available for free on his website. I think I might read his classic book The Design of Everyday Things first. There’s an interesting article from…
With respect to the ‘fight’ over load times, and their relevance to users abandoning the task/page.
(as if one personal comment can do anything to sway the argument).
If I am going to a page that is both important to me and guaranteed relevant to my current task, I am very willing to wait an extended period for that page to load. Examples: Amazon.com or w3c.org. I have experienced high load times at both sites, but have waited, knowing that the information I sought would be there.
However, if I am semi-blindly opening links after a Google search, I am much more demanding. If a page does not begin to load with relevant information within a few moments, I hit the back button and try another. This is the Internet, after all; I can pretty much guarantee that isn’t the only site with the information I need.
It sounds like, for the UIE test, users were told to visit sites they normally would. These were sites with which they were familiar, and which therefore had guaranteed value (and were worth the wait). In the Nielsen tests (I have read Web Usability by Nielsen), users were given specific tasks on an unfamiliar website. As such, they were looking at pages of unknown value to them, and it seems only reasonable that users would abandon their tasks more often.
I’m glad to see (and also hear) your contribution to the struggle for evidence-based debate in HCI. Excellent cartoon this week! Thank you. The best I can offer in return is a short love story: http://www.flashboy.org/statuscode.html
Is it a deliberate omission that you can’t comment on features? I presume the rap was recorded on both sides of the Atlantic - respect. When can we expect the OK/Cancel speed metal concept album, or an EP of OK/Cancel folk protest songs? Bonus marks if this includes a ukulele (http://www.ukuleleorchestra.com)
Because the features section includes articles that are less time-sensitive, we initially decided not to enable commenting on them. Comments have now been re-enabled with a brand new look.
And yes, the rap was indeed recorded across the pond. Lots of soundbites back and forth.
A Hard Look at Usability Experts
At last someone has taken the ‘usability gurus’ down a peg or two. OK/Cancel has a great article - Nothin’ But a UCD Thang - which seeks to dispel some of the myths around the pronouncements of usability honchos like…
OK/Cancel is a comic strip collaboration co-written and co-illustrated by Kevin Cheng and Tom Chi. Our subject matter focuses on interfaces, good and bad, and the people behind the industry of building interfaces - usability specialists, interaction designers, human-computer interaction (HCI) experts, industrial designers, etc.