Research Results: Usability testing on new design for DH@UVa site

UX-4102, User Testing for DH@UVa

Objectives

  • Assess understanding and clarity of DH certificate requirements
  • Assess understanding and clarity of DH certificate application process
  • Assess clarity of what counts as an elective course and MoU
  • Assess clarity and utility of menu headings, filters and nodes, About DH@UVa
  • Assess understanding of Activate Your Profile
  • Assess clarity of who/how to contact for help
  • Explore initial likes/dislikes related to site organization and design

Stakeholders: DH@UVa site owners

Dev site: https://digitalhumanities.dev.uvaits.virginia.edu/

Testing date: November 26 – November 30, 2018

Participants: 5 UVA graduate students from English (2), Art, Anthropology, and Slavic departments

Methodology: usability testing with one participant, one facilitator, and one note taker in a quiet space. Participants were solicited from the DH@UVa mailing list. They used a laptop and were asked to complete tasks on a newly revised website to determine the usability of the site. The facilitator and note taker listened for comments and opinions, and noted behaviors.

Strong Findings

  • None of the 5 participants had trouble finding and enumerating the steps to apply for the DH certificate. Most navigated by the DH Certificate link in the menu. One participant was briefly confused by the More about the graduate certificate in digital humanities section above the DH certificate requirements. This participant also commented, “I think there is a mismatch between what I would expect from a digital certificate and the application process, which is so traditional. Why do I need words, why do I need a statement, when I’m applying to do something visual? What about art students?”
    • Recommendations: Remove the More about the graduate certificate in digital humanities section to focus attention on the enumerated steps. Alternative: work this section into About DH@UVa with visible links to the requirements and application process. Reduce text blocks with diagrams or other images that can convey information in a non-textual way.
  • All participants could find and convey the DH certificate requirements as well as how to get an elective course approved, and each demonstrated a basic understanding of the MoU.
    • Recommendation: None
  • 4/5 expressed the need to clarify the relationship between DH@UVa and IATH, Scholars Lab, and SHANTI in order to clarify the purpose of the DH@UVa website. One participant was still unclear at the end of the test: “I guess I’m also not 100% sure what DH@UVa is as compared to the Scholars Lab as compared to individual projects. Again, why am I going to this website?” From another participant: “I am really interested all digital stuff, but I am not sure what each center or office does…they all overlap, also Data Services. I don’t know who to contact. I have no idea how they’re different from DH.” This participant didn’t want to “have to read everything and figure out the differences.” A third participant: “What’s the difference between the three [pointing to the links in footer]? Are they sponsors? Needs to be made more clear.” This same participant wanted to see Makerspaces included. The one participant who did not express confusion about the centers or the purpose of the website found it to be “clear, organized, and inviting. Judging from what I see here I’d want to do this myself because it seems like a very organized program.”
    • Recommendations: Add new intro text, Welcome to DH@UVa or similar, above the fold on the home page stating clear purpose of the website and concise descriptions of how related centers intersect with DH. Retest to determine if there is an improvement or if further changes are needed.
  • 4/5 found the DH@UVa Network area at least somewhat unclear, describing it as “awkward,” “scattered,” “crowded,” and saying it “feels like 50 things are competing for my attention.” Some expressed that there isn’t enough information to provide context, for instance in the People filter: “I’m wondering why these people are here. There should be something about who they are, not just their names and picture.” One participant wanted more filters to narrow down results further by subject and geographic location. 3/5 wanted the nodes pages to have a more obvious organization, calling the current order “random” and indicating that “alphabetical is what I expected to find.” Participants wanted a different structure on the nodes page (“Not tiles, but the full title, followed by the first few lines of text”) and one commented: “This is good, but it needs a description to explain what we are looking at.” 3/5 expressed a need for a robust and visible search function on this page.
    • Recommendations: Add context by including tooltips for each node button. Add a description to each card. Reformat the nodes pages in a grid with full titles and abstracts to facilitate scanning and evaluating content without having to click on every Learn More button. Consider duplicating the search box near the heading of the page for optimum visibility.
  • None of the participants had trouble figuring out where to go for help, but all had some familiarity with DH@UVa. Most knew Rennie and mentioned her by name. A related note: all 5 participants expected to find who administers DH@UVa on the About DH@UVa page under People (see #2 under Protocol below). 2 participants expressed that they expected to find the primary DH@UVa contact at the top of the People page, and the other 3 felt they had completed the task when they clicked on the People link and saw the headings for the Executive Board and Steering Committee. Only 1 participant eventually scrolled to the bottom of the page and located the DH@UVA Curriculum Development Team.
    • Recommendations: Best practice would be to have a Contact link on the menu for optimum visibility. Move DH@UVA Curriculum Development Team section to the top of the People page.

Other Findings

  • 3/5 participants expressed some uncertainty about what to expect when clicking on the Activate Your Profile button. Answers ranged from “to save things I’m interested in” to “receive information, subscribe to a newsletter, make your profile available to others to network with.” Best practice suggests that a single sentence explaining why you’d want to activate your profile would provide helpful context before clicking on the button.
    • Recommendation: Add a brief description below Activate Your Profile to outline the benefits of creating a DH profile.
  • Resources is currently a bit of a catch-all: “To me it’s a miscellaneous list.” DH Organizations and Projects could easily be its own menu heading. 2/5 participants wanted to see organizations and projects from outside UVA. A listing of upcoming events is commonly found on a website’s top-level page, so having a link to Events Calendar in the menu is unnecessary. One participant recommended: “Make sure this [Resources] page functions as a toolkit to people who end up doing the certificate. Sort of a gateway to skills you’d use in the certificate program.”
    • Recommendation: Remove Events Calendar from menu. Refocus Resources by pulling out DH Organizations and Projects and adding the heading to the menu. Consider creating a page for new DH scholars and put Jumpstart your DH project and Links for the new DH scholar there, and link to Apply for the DH Certificate.

Protocol

1 [Start: https://digitalhumanities.dev.uvaits.virginia.edu/]
This is a site under development.
What is your initial impression of this web page?
  • Assess initial impressions
  • Assess clarity and utility of menu headings
2 (task) You are unfamiliar with Digital Humanities and want to know who administers it at UVA. Where do you start?
  • Assess clarity of About DH@UVa
3 [Scroll down and point to 1.) DH@UVA Network and 2.) View all nodes button]
What happens here?
  • Assess understanding of filters and nodes
4 [Point to Activate your profile button]
What would you expect to happen when you click on this button?
When would you click on it?
  • Assess understanding of Activate your profile button
  • Assess understanding of a profile
  • Does Log in help to define Activate your profile?
5 (task) Find out about the DH projects that others have done.
  • Assess location of Projects under Resources
6 Describe, in your own words, the requirements of the DH certificate.
  • Assess understanding and clarity of the DH certificate requirements
7 (task) You think that a particular course in your home department of History might count toward a DH certificate. How can you determine if it does?
  • Assess clarity of what counts as an elective course
  • Assess ability to navigate to course information
8 Describe, in your own words, the steps to receive permission for a non-DH course to count toward the DH certificate.
  • Assess clarity of the first steps taken to get a non-DH course to count as an elective
9 You’ve received verbal approval for an elective course from your instructor and the DH administrator. What’s your next step?
  • Assess ability to navigate to Memorandum of Understanding (MoU)
  • Assess clarity of the MoU requirement
  • Probe for understanding of MoU
10 Describe, in your own words, the steps to apply for the DH Certificate.
  • Assess clarity of the steps to take to sign up for the DH Certificate
  • Ensure that the numbered steps are included in the process before clicking on the application link
11 (task) You have a question about the application. How do you get help?
  • Assess clarity of who/how to contact if further information
    is needed
12 What are three adjectives you’d use to describe the process to apply for a DH certificate?
  • Gain insight into process to learn about and apply for DH certificate
13 What is missing on this website?
What is confusing on this website?
  • Assess understanding and utility of page
  • Gain insights into what is missing or confusing
14 How would you make this website easier to use?
  • Final comments

Research Results: Improve Content on Graduate Student Services page

UX-2985, Improve Content on Graduate Student Services page

Objectives

  • Evaluate category headings for clarity, accuracy, completeness
  • Confirm understanding of breadcrumb navigation
  • Assess if CTA (call to action) areas work as expected
  • Explore likes/dislikes about descriptions, icons, boxes, design, wording
  • Gain insights with version preferences

Stakeholders: UVA graduate students

Testing date: 9/26/18-9/28/18

Participants: 4 UVA graduate students (East Asian Studies; Anthropology; Classics; Public Policy)

Methodology: usability testing with one participant and one facilitator in a quiet space. Participants were representative users. They used a laptop and were asked to complete tasks on a new website to determine the usability of the site. The facilitator listened for comments and opinions, and noted behaviors. Two versions of prototypes were shown to assess preferences.

FINDINGS

  • Without a description the participants had some trouble correctly and fully defining what they would find under Publications: “Publications is the only one I’m unsure about. Does that mean I’m looking for publications?” Two participants expected “resources to help me publish my own works in various places” and “links for how to get published”. Even after reading the description and seeing the options under Publications, one commented: “I would expect to see more about how to research different publications. I’d expect it to be LESS dissertation-focused. So that strikes me as odd.” Another commented after reading the description: “That helps me understand that I’m the one doing the publishing.”
    • Recommendation: Change category to Publishing, which is more suggestive of the activity of creating scholarly content; and retest.
  • Two participants struggled to correctly interpret Instruction until they read the description and saw the options there. The other two participants defined the category as offering “helping with teaching” and “networks of graduate teaching assistants,” and both mentioned that they expected to find a link to the Center for Teaching Excellence (not a Library service).
    • Recommendations: Add a link to the UVA Center for Teaching Excellence under Instruction. Move Scholars Lab and RMC links to Consultation. Additionally, reorder categories so that Consultation and Instruction are next to each other.
  • All four participants preferred the prototype with descriptions (version 2):
    • “The second layout is more intuitive.”
    • “I feel like the emphasis on helping comes through more on [version 2]. You’ve got specifically ‘these resources can help’ under Publication, under Instruction you’ve got ‘the library can help.’ That sort of language really sets up the library as a space you can come when you need help.”
    • “I like the graphics! And I like the discrete division. It’s visually appealing. And I like the descriptions underneath (the icons)… What is really helpful is it provides an explanation, and I don’t have to do the guesswork…I really do like these explanations. While the first version is graphically simplistic, and I do like simplicity, I also think that some of these [icons] are ambiguous, and the other version saves me time of knowing where to go and what to click on.”
    • Words used to describe version 2 include “clear,” “clean,” “descriptive,” “effective,” and “helpful.”
    • Recommendation: Use version 2 with descriptions.
  • Two participants would use a link to the subject guides if one were provided on this page.
  • Although shown a mostly non-working prototype, participants seemed to correctly interpret the CTA links.
    • Recommendation: None
  • All four correctly interpreted how to use the breadcrumbs.
    • Recommendation: None
  • When asked to project what it might be like to use version 2 on tablet or mobile, two participants said they were unlikely to use the website on a device. Another participant commented, “I have a lot more tolerance for scrolling when it’s a smaller screen. I know that they can’t put [in] all the information.”
    • Recommendation: None at this time.

screen capture of top of first prototype web page for graduate student services

Version 1 prototype

screen capture of top of second prototype web page for graduate student services

Version 2 prototype

Protocol

1 [Show version 1]
What is your initial impression of this top portion of the web page?
  • Evaluate initial impression
  • Evaluate clarity and utility of category headings
  • Listen for comments about icons, How Can We Help, descriptions…
2 [Point to breadcrumb navigation]
What are these? What happens here?
  • Assess understanding of breadcrumb navigation
3 What would you expect to find under Request?
(if they repeat the word ‘request’ without further definition:)
Describe in your own words.
  • Evaluate clarity of Request category
  • Probe for terms other than ‘request’
4 What would you expect to find under Consultation?
(if they repeat the word ‘consultation’ without further definition:)
Describe in your own words.
  • Evaluate clarity of Consultation category
  • Probe for terms other than ‘consultation’
5 What would you expect to find under Publication?
(if they repeat the word ‘publication’ without further definition:)
Describe in your own words.
  • Evaluate clarity of Publication category
  • Probe for terms other than ‘publication’
6 What would you expect to find under Instruction?
(if they repeat the word ‘instruction’ without further definition:)
Describe in your own words.
  • Evaluate clarity of Instruction category
  • Probe for terms other than ‘instruction’
7 What would you expect to find under Spaces?
(if they repeat the word ‘spaces’ without further definition:)
Describe in your own words.
  • Evaluate clarity of Spaces category
  • Probe for terms other than ‘spaces’
8 Are these categories useful to you?
Is there anything missing that you haven’t already mentioned to me?
  • Assess if the category headings are complete and accurate
  • Probe for other heading terminology
9 (task) You want to set up a time for your undergraduate students to learn about doing research in Special Collections. Where would you start?
  • Does tester use Instruction heading to complete the task, or do they browse/scan the page
  • Do call-to-action features (CTAs) work as expected
10 (task) You require a book for your research that the UVA Library doesn’t have. Where would you start?
  • Does tester use Request heading to complete the task, or do they browse/scan the page
  • Do CTAs work as expected
  • Familiar with ILL? If not, do they find the service under Request?
11 (task) You have questions about how to submit your thesis. Where would you start?
  • Does tester use Publication heading to complete the task, or do they browse/scan the page
  • Do CTAs work as expected
12 Let’s look at another version of the same page.
[Show version 2]

What is your impression of version 2? Feel free to click back and forth.
  • Explore likes/dislikes about descriptions, icons, boxes, design, wording
  • Gain insights into preferences
13 What are three adjectives you’d use to describe this page?
  • Assess usability and opinions
14 Which version do you prefer, and why? (if not already indicated)
What is missing?
What is confusing?
  • Assess understanding and utility of page
  • Gain insights into preferences
15 How would you make this page easier to use?
  • Final comments

Research Results: Library staff directory redesign

UX-3822, Redesign staff profile page template

Purpose: Assess usability and clarity of redesigned staff directory with profile pages

Stakeholders: users and staff

Testing date: 8/31/18

Participants: 5 UVA undergraduates

Methodology: “Guerrilla” testing, in which UX staff, some in a Tyrannosaurus rex costume, invited students entering Clemons Library to spend 10 minutes of their time to improve a library web page. Coffee and danishes were offered to all passers-by.

FINDINGS

  • Participants didn’t initially know what or who Library Subject Specialists were, but were able to grasp the concept (“for doing research, to see someone to talk to”) after spending a few moments looking at the page.
    • Structure of the Librarian Subject Specialist page is clear and effective. All could grasp that the organization was by school, then by department.
screenshot of Librarian Subject Specialist view

fig. 1, Librarian Subject Specialist view

  • However, the concept of a “library department” was confusing to participants. When shown the Department tab, participants said the page was “unexpected” and that it wouldn’t be useful to them.
    • Recommendation: Consider tooltips or another method to give additional description and context for the Librarian Subject Specialist and Department tabs.
  • The Profile pages were easily grasped and appreciated but not carefully read. Comments included:
    • “Ask Me About” should be higher up because it’s most important
    • Gender pronouns are “inclusive”
    • “What’s ORCID iD?”
    • Page is “useful” and “helpful”
    • Recommendation: Put bulleted information at the top of the right column to improve ability to skim
    • Recommendation: Limit Bio and other text blocks to reduce visual clutter and increase white space

      screenshot of a library staff profile page

      fig. 2, profile page

  • Participant suggestions to make the staff directory easier to use echo the recommendation above to give additional description and context with tooltips:
    • “Need a description for Department tab”
    • “Too many tabs, too many options”
    • “Need to understand when to use Subject Specialist or Department”

Protocol

1 [Go to Staff Directory “Show All” view]
What can you do here?
  • Assess understanding of how and why to use the library staff directory
2 What do you think you’ll see if you click on Image View?
[show Image View]
What are your impressions of this page?
  • Assess understanding of Image View
  • Gather impressions
3 What do you think you’ll see if you click on Librarian Subject Specialist?
[show Librarian Subject Specialist]
What can you tell me about how this page is organized?
Would you use it?
  • Assess understanding of Librarian Subject Specialist
  • Gather impressions
4 What do you think you’ll see if you click on Department?
[show Department]
What can you tell me about how this page is organized?
Would you use it?
  • Assess understanding of Department
  • Gather impressions
5 [Go to Profile page]
What are your impressions of this page?
What’s useful?
What’s missing?
  • Gather impressions of Profile page
6 How can we make this staff directory easier to use?
  • Evaluate overall ease in using the staff directory

Research Results: Testing Robertson Media Center and equipment checkout webpages

UX-3833, Guerrilla test Robertson Media Center pages and equipment request process

Purpose: Assess usability and clarity of new RMC pages and equipment checkout pages

Stakeholders: users and staff

Testing date: 8/31/18

Participants: 5 UVA students

Methodology: “Guerrilla” testing, in which UX staff, some in a Tyrannosaurus rex costume, invited students entering Clemons Library to spend 10 minutes of their time to improve a library web page. Coffee and danishes were offered to all passers-by.

RMC site: https://www.library.virginia.edu/rmc

LibCal RMC equipment site:  https://cal.lib.virginia.edu/equipment?lid=241

Project phase one deliverables were to create a landing page and second-level pages. Based on the content, design, and architecture, users should be able to understand the following:

  • Who the space is for (students, faculty, community members)
  • What services and equipment/spaces are available to whom and how to obtain them
  • This is a place for making, learning, and teaching

The protocol was developed to elicit responses to assess if the goals were met, and to test the response to wording for equipment check out that does not require reserving in advance.

STRONG FINDINGS

  • After perusing the content and images of the landing page, participants were asked what the space is for, and to describe the space using three adjectives. Their answers reflect a space for making and creating, equipment to use and check out, and help when needed.
    • Four out of five testers described the space as “high-tech” or “technological.” The specific terms “creative,” “useful,” “3-D,” “VR,” “studio,” “check-out,” and “advanced media” suggest that the participants see a space for using and exploring new technologies.
    • Participants used the terms “knowledgeable,” “helpful,” and “supportive,” and phrases such as “people to help,” “assists students,” “help you with stuff,” “classroom,” “tutorial,” “tech support,” and “equipment troubleshooting,” all of which convey help, learning, and teaching.
    • One participant offered that “faces and names [under heading, ‘Who We Are’] makes people more approachable.”
    • Other terms reflect favorably on the environment: “modern,” “variety,” “expansive,” “fresh,” “eclectic.”
  • Four out of five participants correctly identified UVA students and faculty as the audience focus for the RMC. One participant thought RMC might be limited to “STEM or media studies majors.” Two participants also mentioned access by “the public” and “anybody outside UVA” but that they wouldn’t be primary users.
  • Participants had no discernible problems correctly interpreting the terms “No Reservations” or “First come, first served.” Similarly, the RMC equipment page heading, “Walk-up Equipment,” seems to provide enough context to be correctly understood. However, several participants also assumed they’d be able to use equipment without training, that there would be no lengthy checkout procedure, or that there would be no need to sign out equipment at all.
    • Recommendations: Avoid idioms and colloquialisms that may confuse readers, particularly non-native English speakers. For clarity, use the plain-language phrasing “No reservations required” or “No reservations needed” consistently throughout the RMC and equipment request sites. Because “Walk-up Equipment” is easily misinterpreted, change it to “No reservations equipment” or similar. Also review the categories under that heading (for instance, video equipment and cameras must be reserved, so they belong under the heading “Reservable equipment”). Replace the “Browse the equipment collection” button text with something more actionable, such as “Check out equipment” or “Use equipment.”
  • After reviewing the RMC landing and second-level pages, participants largely correctly identified who the space is for and what goes on there. Participants were then shown the Equipment page and asked: “If you wanted to use a camera, what would you do next?” Three participants initially thought they would have to go to the RMC desk to check out a camera, and one participant wondered if he needed to use the equipment in-house. Four out of five had to be directed to click on “Browse the Equipment Collection” to make a reservation online. Three saw the text “Training & Reservation Required,” and another thought they should search online for a tutorial. All participants saw the links for “Availability” and “More details,” but two participants were still unclear what to do next. Most participants clicked on “More details” but didn’t read any of the text. They all saw the calendar and figured out the process of selecting a date by clicking on a green box and logging in via NetBadge. One participant expressed confusion about the difference between red and grey boxes on the calendar, and the use of the term “padding.”
    • Recommendations for all LibCal equipment items, based on the above results: Hide “Availability,” since reservations can be made under “More details.” Change “More details” to a button labeled “Use this equipment” so users see it as actionable. Review content and messaging about requirements and training on item pages to reduce the amount of text. Highlight the Instructions tab (which no test participant clicked on). Describe equipment specifications separately from the training and checkout process. Use short, declarative bulleted lists rather than prose to improve clarity and make the text more skimmable.

PROTOCOL

1 [Have tester examine the RMC page at https://www.library.virginia.edu/rmc]
Based on the contents of this page, what is the Robertson Media Center?
  • What is this space for?
2 Who can use it? Why do you think that?
  • Who is this space for?
3 What are three adjectives you would use to describe this space?
  • What does the page evoke (friendly, techie, diverse, inclusive, bright…)?
4 [Direct attention to /equipment and to “Walk-up Equipment”]
What does this mean to you?
Probe: How would you describe it in your own words? (on-demand, walk-up, something else?)
  • Does walk-up or on-demand resonate? If not, what does resonate?
5 If you wanted to use a camera, what would you do next?
  • Assess understanding of how to get training and how to reserve
  • Do they see/read “Training & Res Req”?
  • Do they go to Equipment?
  • Do they go to Browse the equipment collection?
  • Do they view More Details or Availability?
6 [Go to Location = Brown or Music]
Based on the contents of this page, what can you do here?
[Focus on Dry Erase Markers or Microphone and descriptions]
What can you do here?
How would you describe it in your own words?
  • Do they see “No Reservations” or “First Come…”?
  • Do they understand it?
  • Do they click on Availability?
  • Are they confused?
7 How would you rate your ease in using this page, where 1 is very difficult and 5 is very easy?
  • Evaluate overall ease in using the LibCal equipment page

 

Research Results: Special Collections website information architecture testing

UX-3834, Analyze Special Collections Tree Test Results

Purpose: Assess usability and clarity of proposed Special Collections website reorganization

Stakeholders: Special Collections users and staff

Testing date: 8/27/18

Participants: 36 UVA undergraduates

Full results

https://www.optimalworkshop.com/treejack/LibUX/specialcollectionsweb/shared-results/2436i3ot63hw1m8v4nh6yxd0365e2oy7

Methodology

Enrolled UVA undergraduates were solicited by email from our queue of volunteers to take a 10-minute test, for which they received a $10 deposit to their UVA debit card. An online tree test tool by Optimal Workshop was employed to help determine the efficacy of the proposed information structure. Based on previous testing, we created a simple hierarchy of 4 top categories, each of which had between two and six subcategories (see fig. 2, Tree Structure). Undergraduate testers were asked to complete ten tasks designed to assess how closely the information structure matched their mental models (see Full Test Results, Analysis tab). The tree test results indicate how many testers found the correct path on their first try, how many found it on their second or third try, and how many failed to find the correct path, as well as how fast each tester completed the task. For an overview of how to analyze tree test results, visit Atlassian’s Tree Testing for Websites.
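The two headline metrics in the findings below, overall success rate and directness, are simple proportions over all participant-task attempts. A minimal sketch of the arithmetic, assuming a hypothetical per-attempt record format (Optimal Workshop computes and reports these figures itself):

```python
# Sketch: computing tree-test success rate and directness from raw attempts.
# The record format here is hypothetical; only the arithmetic is illustrated.

def tree_test_metrics(results):
    """results: one dict per participant-task attempt, with keys
    'success' (reached the correct node) and 'direct' (no backtracking)."""
    n = len(results)
    success_rate = sum(r["success"] for r in results) / n
    directness = sum(r["direct"] for r in results) / n
    return success_rate, directness

# Toy example: 10 attempts, 7 successful, 5 of them direct.
attempts = (
    [{"success": True, "direct": True}] * 5
    + [{"success": True, "direct": False}] * 2
    + [{"success": False, "direct": False}] * 3
)
print(tree_test_metrics(attempts))  # (0.7, 0.5)
```

In practice these proportions are reported per task as well as overall; a low directness with a reasonable success rate (as in the 70% / 48% results below) suggests participants eventually find the right path but only after backtracking.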

Summary of preliminary findings

  • Overall success rate was 70%. Even without visual cues, navigation, and menus, participants are finding the correct answer at least two-thirds of the time, suggesting the tree is effective but may need some tweaking.
  • Overall directness (going directly to the correct area) was 48%. Some participants are struggling to find the right path but get there eventually.
  • Participants usually identified what they would find under the About and Donate categories. The Destination tab on the full results site shows that large numbers of participants (the green boxes) correctly selected these categories, and few incorrectly selected these categories (the orange, red, and white boxes).
  • The Donate link is readily found and understood. 31 participants found the correct policy under Donate, and the average time to complete the task (5.52 seconds) was half the average time taken to complete the other 9 tasks.
  • Almost half of the participants selected Class visits and instruction to complete a task about finding a map, suggesting that we should duplicate some information about planning a visit in this category.
  • The Destination tab shows us that Online reference request was incorrectly selected 16 times (although it is not necessarily a bad thing that students opt to ask for online help when they don’t know an answer).
  • Participants were able to correlate the term “artifacts on display” with Collections and Exhibitions (task 5).
  • Participants were able to correlate the term “give” with Donate (task 6).
  • Participants found the digital camera policy under Using Special Collections/Usage Policies and correctly interpreted that this policy would probably cover taking photos with a phone.

Deeper dive: task #7: “Can you confirm that there is a substantial number of William Faulkner papers in Special Collections?”

pie chart of task 7 results

fig. 1, pie chart of task #7 results

Roughly half of the participants found the correct category, which is several levels down in the hierarchy. However, first-click data shows that an additional 34% clicked on Collections and Exhibitions but didn’t investigate Featured Collections, where they would have found the Faulkner papers. Most wound up in either Manuscript Collections or UVA Archives, so were circling in the right neighborhood but didn’t ring the right doorbell.

Recommendations

  • Class visits and instruction seemed to resonate with undergraduates, so consider building up this popular area with duplicate links from Plan a visit.
  • Online reference request is used to answer a wide spectrum of questions, so give this a high profile (possibly as a sidebar).
  • About and Donate work well as top-level categories.
  • Using Special Collections is a broad category that may need further refinement through discussions with Special Collections staff and/or further user research.
  • Collections and Exhibitions may not resonate with undergraduates but may work for others with more familiarity with research collections. Featured Collections should be rethought to identify a current goal and purpose, which will inform how and where to present it, and may inform Collections and Exhibitions.

Next steps

The UX team will factor in these recommendations to rework the information architecture.

Tree test structure being tested

fig. 2, tree structure

 

Research Results: Library home page testing and surveys

User Experience Project ID: UX-1922, Library homepage development
Purpose: Assess usability and clarity of proposed website designs
Stakeholders: UVA Library staff and users

Test dates

  • 2/22/17 – 2/24/17 ; 9/14/17 – 9/19/17 ; 11/15/17 – 11/21/17 (usability testing with UVA undergraduate and graduate students)
  • 4/5/17 – 4/14/17 (online surveys of UVA students, staff, and website visitors)

Methodologies: Usability tests (3) and online surveys (3) to present iterations of Library home page designs to test usability and elicit feedback. Efforts were focused on search box location, carousel location and navigation, Hours page, Research page, mobile and tablet navigation, and link naming.

Summary of findings

  • The Hours page is easily found, and testers liked its new prominence
  • Mobile version of Hours is easily navigated and was praised as “well-formatted”
  • Search visibility is improved; searching is a primary or secondary task for website visitors
  • The “Ask a Librarian” brand is firmly established and the “Ask a subject specialist” link seems to be understood as another place to get help
  • Iterations of new design were usually viewed as more attractive than the previous versions
  • Mobile hamburger menu is more easily found when labeled “MENU”
  • The Research page redesign was described as clean, efficient, logical, and organized
  • Thumbnail images under the carousel image helped testers navigate content

Project status: Newly designed website theme rolled out 3/7/2018

Final reports (UVA only): https://virginia.box.com/v/2017WebResearch

Research Results: Website Information Architecture testing

User Experience Project ID: UX-2846, Analyze Tree-Test Results

Purpose: Assess usability and clarity of proposed website top level reorganization
Stakeholders: UVA Library staff and users
Testing dates: 11/13/17 – 11/14/17
Participants: 29 UVA undergraduates (paid)

Methodology
Enrolled undergraduates were solicited via a banner on the Library web page to take a 15-minute test. An online tree-test tool from Optimal Workshop was employed to help determine the efficacy of a proposed information structure based on previous testing results. We created a simple hierarchy of 5 top categories for the Library website:

  • About UVA Libraries
  • Using the Library
  • Help and requests
  • Research and instruction
  • Advanced technology

Each category had between three and nine subcategories. Undergraduate testers were asked to complete ten tasks and were told it was possible to complete these tasks on the Library website. The tree-test results indicate how many testers found the correct path on their first try, how many found it on their second or third try, and how many failed to find the correct path at all.
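
The scoring described above can be sketched as a simple classification of each tester's click path. This is a hypothetical illustration only; the actual scoring is done by Optimal Workshop's tool, and the category paths shown are examples.

```python
# Hypothetical sketch of tree-test scoring: classify each tester's
# sequence of path attempts as a direct success (first try), an
# indirect success (a later try), or a failure. Not Optimal
# Workshop's actual code; names and paths are illustrative.
def score(attempts, correct_path):
    if not attempts:
        return "fail"
    if attempts[0] == correct_path:
        return "direct"
    return "indirect" if correct_path in attempts else "fail"

correct = ("Using the Library", "Equipment")
print(score([correct], correct))                          # direct
print(score([("Help and requests",), correct], correct))  # indirect
print(score([("Help and requests",)], correct))           # fail
```

Aggregating these three labels across all testers yields the per-task percentages reported in the findings below.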

Summary of preliminary findings

  • Undergraduates correctly identified categories about spaces (libraries, study areas, room reservations)
  • Undergraduates correctly identified categories about searching for books and articles
  • The category Advanced Technology was not well understood
    • The task to find specialized software could be completed only by selecting the “Advanced technology” category, yet only 38% of testers selected that category immediately, and 41% of testers ultimately selected one of the other four categories
  • Color printers and scanners (equipment) were not easily found under Using the Library
    • 38% of testers found it immediately, an additional 28% found it eventually, and 34% did not find it at all

Deeper dive: task #8, “Find course materials on reserve for your class”

Only 12 testers (41%) navigated correctly to “Search course reserves” or “Search Virgo” to complete the task. Of the 17 who did not navigate correctly, 11 looked for the answer under the “Help and requests” category (which may show a willingness to ask a librarian for help when they can’t find something on the website). Of those 17 failures, 10 selected “Research and instruction” (one of the correct categories) at some point but still did not successfully complete the task. Across all 29 testers, this task took an average of 17.65 seconds, considerably slower (by at least 8 seconds) than the other tasks. Previous testing has indicated that some students are unfamiliar with the concept of course materials being held on reserve in a library. The high failure rate, coupled with the extended time it took to finish this task, further suggests confusion with this concept.
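
As a quick sanity check, the task #8 figures follow from the raw counts stated above. A minimal Python sketch (the function name is illustrative, not part of any tool's export):

```python
# Hypothetical recomputation of the task #8 figures from the counts
# reported above: 12 of 29 testers navigated correctly.
def task_stats(successes, total):
    # Rounded percent success and the raw failure count
    return {"success %": round(100 * successes / total),
            "failures": total - successes}

print(task_stats(12, 29))  # → {'success %': 41, 'failures': 17}
```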

Project status
The testing information gathered here continues to inform website organization projects. Some categories of information should be surfaced higher in the hierarchy so undergraduates can find them, but other user groups (graduate students and faculty) also need to be tested.

Testing results: https://www.optimalworkshop.com/treejack/LibUX/uvalib2/shared-results/8xa6v522406acu7g2y3l3n0gu7x3nnf8

Research Results: Graduate Students and Their Needs

User Experience Project ID: UX-2368, Facilitate two focus groups with graduate students

Purpose: Investigate sources of dissatisfaction among graduate students
Stakeholders: Academic Engagement, Administration & Planning, Collections, Access & Discovery and other Library areas
Focus group dates: 11/29/17, 11/30/17
Participants: 11 graduate students (including one session with five international graduate students)

Methodology
Low ratings by a small subset of graduate students in the 2016 user survey were a cause for concern, but root causes were elusive. Two focus groups were offered to assist with identifying issues. One focus group was limited to international students to learn of their specific needs. Most students were recruited through the usual avenues (a web banner on the Library home page; contact cards left at print stations). Others had participated in the 2016 user survey and agreed to future participation in user research, and several were invited based on past feedback they’d given. UX research staff created two focus group protocols probing problems and resolutions, confusion or difficulties, preferred modes of communication, and how the Library can better contribute to graduate students’ work and overall positive experience. The international students were also asked about specific challenges they face.

Major findings

  • Communication and connectivity are important elements to a positive graduate student experience. Graduate students want us to market to them early and often, and want communications and workshops that are geared toward them.
  • Graduate students want document delivery service and they want us to be proactive, try harder, and care more about getting the paper and electronic materials they need.
  • Library systems and policies are confusing and frustrating, especially to incoming students. Many navigate with the help of their peers. They want DIY short online video tutorials for online tools that can be found and viewed at point-of-need, but like in-person assistance with subject specialists for in-depth research.
  • Graduate students want space for work and storage in libraries. Physical comforts impact willingness to use and stay in library spaces.
  • Students with mobility issues can be negatively impacted by collection decisions that split call number runs between multiple libraries.
  • International students matriculate with few expectations for library services beyond checking out books and comfortable study space; there were no notable differences in library experience compared with their U.S. peers.

Recommendations

  • Create a communication plan, to include physical materials, for incoming and other graduate students to convey services, policies, and contacts.
  • Create workshops to engage graduate students in library services and research resources.
  • Create short online video tutorials for services and processes.
  • Consider providing scanning and electronic delivery of print journal articles to all graduate students.
  • Explore possibilities to expand dedicated graduate student space in all libraries.
  • Increase efforts to convey information about critical systems, services, and policies to graduate students, particularly incoming students. Include basic information about the online catalog, Interlibrary Loan, policies (borrowing, recall, course reserve), off-campus access to online research materials, printing, obtaining a carrel, reserving a classroom, and stacks navigation.
  • Increase in-person communication with graduate students about subject liaison program.
  • Review tight stacks areas that may provide poor accessibility.
  • Consider revising collections policies to keep all materials in the same subject together to enhance accessibility for students with mobility issues.
  • Do further research to gather more complete data on the following:
    • assess how the Library should communicate with graduate students;
    • gain a deeper understanding of the specific needs of international students;
    • determine graduate student priorities for document delivery, paging from other libraries, and other delivery services;
    • assess what current graduate students wish they’d known as incoming students.

Final Report and protocols (UVA only): https://virginia.box.com/v/gradfocusgroups

Research Results: What Undergraduate Students Want from a Library Makerspace in Clemons

User Experience Project ID: UX-1905, Focus group of undergraduate students to get feedback on the future Library Makerspace

Purpose: Assess general themes and issues relating to creating a creative space in the Library

Stakeholders: Library Services and Spaces

Test dates: 4/7/17; 4/14/17; 4/18/17

Motion capture suit

Participants without Makerspace experience

  • 4th year, History and Religious Studies
  • 4th year, Nursing
  • 3rd year, Urban & Environmental Planning
  • 4th year, Economics

Participants with Makerspace experience

  • 3rd year, Computer Science
  • 3rd year, Drama
  • 4th year, Neuroscience & Biology
  • 2nd year, Philosophy & Middle East Studies and History
  • 3rd year, Computer Science & Studio Art
  • 1st year, Computing Engineering
  • 4th year, Computer Science

Methodology

Lulzbot 3D printer

Three focus groups were held. One focus group had four undergraduates with no Makerspace experience. Two focus groups had seven undergraduates with Makerspace experience. Inexperienced participants were asked to give impressions of the terms “build,” “make,” and “create,” while experienced Makers were asked to give impressions of the term “Makerspace” and asked about past Makerspace experiences. All participants were asked what a Library creative space should help students do or accomplish; what makes for a good experience in a creative space; and the importance of whether or not such a space is student-run. All students also gave input on equipment, training, and design of the Clemons space.

Summary of findings

  • All groups expressed the need for a space that is welcoming to all students, and the Library is uniquely positioned in students’ minds as a neutral space for exploration.
  • All student groups felt that first impressions of the space are important, and that a modern look and an open floorplan are important elements for the design of a Library creative space.
  • Perceptions of Makerspaces and of highly technical equipment are informed more by students’ field of study than by their year of study. Students familiar with Makerspaces tend to be in STEM fields but span all class years, suggesting Maker culture is well integrated into those curricula.
  • Students who have not used a Makerspace find them intimidating and unwelcoming, and they perceive that they lack the necessary skills to use Makerspaces. They are more likely to be from the Humanities and Social Sciences.
  • Too much high-end technology is “a turn-off” to inexperienced students, and they want a creative space with “simple stuff.”

    Sewing machine

  • There was no strongly expressed opinion by major or class year that the Clemons space should be student-run.
  • Recommendations include:
    • Plan a range of outreach strategies for all UVA students
    • Design a well-lit space with comfortable, modern furniture with a variety in seating, tables, and desks to support group and individual work
    • Host inclusive design sessions with students to give them planning input
    • Consider having an ongoing steering committee that includes students to create policy and address operational questions
    • Consider means to monitor service quality and respond to issues as they arise
    • Plan a range of outreach strategies for students who may be interested in creating and making with non-technology items
    • Ensure that signage and supplemental information clearly demonstrate the equipment available, how to use it, and potential uses, to reduce psychological barriers to using the tools
    • Arrange space so that both quiet exploration and collaborative ideation sessions can be supported
    • Design service points to be integrated into the space and easy to approach
    • Implement a variety of instruction programs to serve multiple learning styles
    • Specify (in print and on web) what equipment is available for use and for check-out
    • Partner with existing student clubs and organizations (technology, gaming, media…) to promote Makerspace services and instruction

Project status: Clemons Makerspace is expected to open in advance of the Fall 2017 semester.

Project Files (UVA only):  https://virginia.box.com/v/UX1905

Research Results: Remote testing on home page prototype

User Experience Project ID UX-1687: Create, Launch, and Evaluate WhatUsersDo.com Online Test

  • Purpose: Assess new home page design by non-UVA users
  • Stakeholders: UX team
  • Test date: 12/7/16
  • Test participants: 3 online testers in the United States, age 18+, each using a desktop or laptop computer
  • Methodology: I received three credits to run a remote test with WhatUsersDo.com. I wrote the test (see project files) and typed it into the online form. I was able to designate what device I wanted testers to use (desktop, smartphone, tablet), plus choose the testers’ country (UK, US, France, Germany, Netherlands), age range, and socioeconomic status. Testers were asked to view https://uvalib.github.io/prototype/ and use it to complete six tasks and rate their ease at completing each task. Testers were also asked for their opinions about the site. The testers used screencast software that recorded their voice and keystrokes, and they were told to think out loud as they worked through the test. Links to the resulting .mp4 files and a summary of the test were made available for download.

    Prototype screen (partial)

Findings

  • 3/3 testers had trouble locating subject specialist help from a Virgo screen. After a minute of clicking and scrolling, 2/3 testers found the answer.
  • From the Library Hours page 2/3 testers clicked on Collections when asked to look for a book, possibly because there is no search box on the Hours page (see fig. 1). One tester used the search box at the top of the Collections page to search Virgo, and the second tester used a Virgo link under the heading, “Ways to Explore” on the Collections page (see fig. 2). The third tester used his browser back button until he found a search box on a library page. All testers were able to successfully complete the task.
Figure 1 Library Hours page

Figure 2 Library Collections page

  • 3/3 testers had some difficulty finding a library lab with reservable space, although each was eventually successful (in different ways). The first tester went to Reserve a Room under Spaces and Equipment, then clicked on Group Study Rooms (a link to the booking software). The second tester also selected Reserve a Room (but via the Services page). She scanned for the term “lab” but went back to the Services page when she didn’t find it. She then clicked on Digital Labs, Study Space Info, and again on Digital Labs before finding text about reserving rooms within the Scholars Lab. The third tester also went into Reserve a Room under Spaces and Equipment, then backed out to the main page and selected Explore Libraries and Labs, where he found text about reservable rooms in the Library Data Commons@Curry. It is reassuring that each tester found the answer, and it supports the need to maintain some reciprocal links and different navigational paths.
  • All testers began new tasks without navigating back to the main page, which reinforces the need for good navigation and design from every page.
  • One tester confessed to being a retired librarian, so was quite familiar with terminology like “online catalog” and “journal finder”. Not my intended audience!

Project files: WhatUsersDo_Report