Thanks to Steven Villereal for sharing this assessment of LibGuides from the University of Chicago Library.
As we revisit our own guides in light of implementing v2, I was struck by a number of findings that align with other usability research we’ve done here on unrelated topics.
Here are some things that stood out to me, and I invite you to share your own perspectives:
- Users are highly task-oriented. You may have learned this from countless usability studies, but users come to our web pages to get things done. As uncovered in this report, some users were so task-focused that they even wanted guides to relate back to specific courses. We should keep in mind the concrete tasks that draw users to our guides as we curate them.
- Users love tutorials. I’m not sure how universal this finding is, or to what extent users would actually use the videos, but this report echoes what I’ve heard in interviews with undergraduates – that they would like short video tutorials on basic library and resource use. It’s a reminder that we need to consider ways to communicate information aside from text.
- Users home in on search boxes. In a number of tests I’ve done, the search box figures prominently in user feedback. Users love search boxes, but they often have the wrong idea about what they do. In the case of our guides, the default search is set to search Virgo. I’m betting many of our users would find that surprising.
- Users hate long lists but want descriptive text. I’d summarize this finding as: users want just enough, but not too much. This study found, “…they [users] wanted some kind of descriptive text for each link. They wanted to understand what each resource was useful for, but they did not want to click on links to the databases themselves to experiment and discover that on their own.” This is completely in line with usability principles. Clicks are like currency: users are reluctant to spend them unless they have a pretty good idea of what they’ll be getting.
- Users want human connections! This University of Chicago Library study noted, “Our group also found substantial evidence that users want to see librarians as real people. Pictures of Library buildings featured on LibGuides were consistently viewed negatively by participants while pictures of librarians themselves, contact information, and “Chat with a Librarian” were consistently viewed positively. The personal connection appeared important to the patrons we surveyed.” This sentiment aligns with personal interviews and usability studies I’ve performed, and I’m often surprised by how often it pops up. Users want individual names and faces to help orient them to the library. Regardless of technology’s ubiquity, the human element is still a vital ingredient in our virtual presence.
What findings caught your attention?
I look forward to doing some research on our own guides in the near future, and I’ll be sure to loop you in on that process.
“Well, if they can’t find it, users can just do a search, right?”
I’ve heard this comment enough in meetings that it’s worth reviewing why a site search should not be the solution to bad information architecture.
The Nielsen Norman Group recently published a fantastic article noting 5 reasons why we shouldn’t rely on search as a primary means for users to find content. Here are some takeaways from the piece:
- Searching requires that users have a good mental model of the domain in which they’re searching, including knowledge of which attributes are most relevant to their search. This expectation isn’t realistic. Academic library websites, for example, tend to organize information by buildings and locations, disciplines and subject specialties, and services. Users who don’t approach our site with that mindset would be at a loss when conducting a site search.
- Searching demands that users rely on their memory. Users have to remember specific keywords and how to structure the best search. That’s a lot of mental work, and therefore not user-friendly.
- Search takes more time and effort (it’s costly!). Think about how many times you’ve had to retype keywords on your mobile phone because you touched the wrong key or autocorrect intervened, and you can understand how searching demands more work from users than browsing.
- Site search usually works poorly.
- Users are bad at searching. Despite librarians’ best intentions and aspirations, poor search habits are ubiquitous. As stated in the article, “We keep seeing this over and over again: people have no understanding of what makes a good search query on a website. Their search mental model is corrupted by the big search engines and they expect search to work in the same way on every site.”
Yes, search is useful, particularly in cases where users have a good sense of what they’re looking for, but it is by no means an “out” for developing and maintaining a well-organized site that helps orient users to what we offer.
- User Experience Project ID: UX-132 Ask a Librarian
- Purpose: To gather user perspectives to inform a webpage redesign.
- Stakeholders: Library public services, All users
- Test dates: 4/11/14 – 4/18/14
- Test participants: 6 students (4 undergraduate students, 2 graduate students)
- Methodology: Conducted 1-to-1 structured interviews to gather feedback on prior experience with help sites and impressions of 5 library help pages (including U.Va. Library’s); Asked students to sketch their ideal help page.
- Project status: Project manager and UI designer presentation delivered 4/22/14. Project managers will develop content to inform UI work. UI designer will execute design based on user feedback. Work should be completed during the summer and live site will be tested in the fall.
Project files: https://virginia.box.com/s/k4lb08p03b3hh3xw67ez
- Final protocol
- Comparison site screenshots
- User test notes
- User test sketches
- Final presentation
Consider attending this outstanding conference organized by some fantastic people!
Register or learn more about edUi.
Why register for edUi?
- Get to know your web brethren from higher-ed, libraries, and museums.
- Intimately sized conference (approx. 250 people).
- Practical sessions focused on skill building.
- World-class presenters.
- Learn tips and tricks for crafting great user interfaces and user experiences.
- It’s affordable! Early bird registration is just $500.
- One workshop is included with your registration.
Time.com published an interesting piece about what the author terms “The Attention Web.” I’d summarize The Attention Web as a new philosophy about how content producers and advertisers seek to engage audiences. Rather than measure users’ interest solely by what they click on, this new philosophy favors garnering their sustained interest, and is powered by sophisticated means of data collection and analysis.
As the author Tony Haile demonstrates, this new approach to data reveals common misconceptions about how users interact with web content. The graphic above demonstrates one such myth, that users don’t read below the fold. According to Haile, “66% of attention on a normal media page is spent below the fold.”
Other counter-intuitive findings include:
- Myth #1: We read what we’ve clicked on – “In fact, a stunning 55% spent fewer than 15 seconds actively on a page.”
- Myth #2: The more we share the more we read – “We looked at 10,000 socially-shared articles and found that there is no relationship whatsoever between the amount a piece of content is shared and the amount of attention an average reader will give that content.”
While the article is addressed primarily to advertisers, it reveals some applicable findings for general content development and website design.
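The shares-versus-attention claim is, at bottom, a correlation finding. As a minimal sketch of how one might test for such a relationship in their own analytics — using fabricated numbers, not Haile’s data, and hypothetical metric names — here is a Pearson correlation between share counts and engaged reading time:

```python
# Illustrative sketch only: all article metrics below are fabricated,
# and this is not the methodology from the TIME piece.
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Hypothetical per-article metrics: (social shares, median engaged seconds)
articles = [
    (1200, 12), (40, 95), (870, 22), (15, 140),
    (3300, 9), (210, 60), (95, 33), (1800, 48),
]

shares = [a[0] for a in articles]
engaged = [a[1] for a in articles]

r = pearson_r(shares, engaged)
print(f"correlation between shares and engaged time: r = {r:.2f}")
```

A value of r near zero across a large sample would support the article’s point: sharing a piece and actually reading it are different behaviors, so share counts alone are a poor proxy for attention.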
Source: What You Think You Know About the Web Is Wrong | TIME.com
Recently, I consulted with a colleague about conducting a focus group with users who could potentially benefit from a new collection. Given the type of collection it is, it also merits its own physical space in the library. The research issue at hand is how users would anticipate using this new collection, and how the space should be designed to support that kind of use.
One of my first thoughts was, is a focus group the best assessment tool for this question? My conclusion, ultimately, was yes. But why?
The research question is on the ‘fuzzy front end’ of service development, where there are many more unknowns than knowns about what users would like. A focus group can be a great tool for ferreting out ideas and letting them naturally build off one another, so that we can begin to understand the thoughts and feelings we should consider in developing this new offering. In this scenario, I’m not concerned about group members influencing one another, but rather, hopeful that the group dynamic will spur participants to generate many ideas and draw out underlying needs.
While there are many good reasons for employing focus groups, I rarely choose them for UX work. Most of my work concerns how individuals interact with systems in their natural environments. For that kind of research, I choose usability tests and personal interviews, so that I can understand how people behave without the interference of others.
I found that the presentation below has some good slides depicting how to best use focus groups and when to avoid them. (See slides 13-17 in particular.)
If you’re considering conducting a focus group, I also recommend consulting the document, “Guidelines for Conducting a Focus Group” from Duke University’s Office of Assessment. Slide 19 in the presentation also has a list of good moderator resources.
Source: BellaVia Research
- User Experience Project ID: UX-111 WSDS Evaluation
- Purpose: To determine user preference with respect to Primo, EDS and Summon products
- Stakeholders: Library public services, Virgo users
- Test dates: U.Va. Library staff – 2/26/14 to 3/31/14; U.Va. Faculty, Graduate Students and Undergraduate Students – 3/11/14 to 3/31/14
- Test participants: All interested Library staff and U.Va. faculty and students (cross-disciplines). Faculty recruited via subject librarians and students recruited via Library and ITS solicitations.
- Methodology: Test instances of EDS and Summon were established in a development environment. Testers were asked to choose research topics relevant to them and conduct searches in existing Virgo (Primo) and the test products. Results were gathered in SurveyMonkey.
- Details: Searchdev was put behind EZProxy to facilitate access off Grounds. Users could complete the test at their convenience and submit the survey prior to the deadline.
- Project status: Testing closed 3/31/14. Staff presentation delivered 4/9/14.
Project files: https://virginia.box.com/s/n733sstrbnz30tt8ls4l
- Library Staff Test ‘n’ Lunch results (3)
- Protocol for U.Va. faculty/student test
- User test results (Excel)
- User test results (PDF)
Staff presentation: https://virginia.box.com/s/l1iqqefe5n0q5tte5i1f