UX-4614, Perform Usability Testing on Special Collections Request System (Aeon)
Purpose: Assess usability and clarity of request process and dashboard
Stakeholders: Special Collections Library users and staff
Testing dates: 4/24/19-4/29/19
Participants: 1 non-UVA user (from Monticello); 1 UVA staff; 2 UVA graduate students; 1 UVA undergraduate student
Methodology: Participants were recruited via postcards in the Special Collections Reading Room, direct contact with researchers, an online web banner, and solicitation of our existing research queue. A targeted recruitment web form was created: https://virginia.libsurveys.com/ImproveSpecialCollections. Each participant was required to have experience with either online or in-person Special Collections services and collections. A queue of UVA and non-UVA researchers was developed and solicited. Two very similar protocols were developed for UVA (Netbadge) and non-UVA users to account for minor differences in registration and logging in.
Summary of Findings
- No participants had any significant problem with either registration procedure.
- All participants sometimes struggled to complete tasks, but most felt confident of their ability to navigate and complete tasks after having gone through the process once. On a scale of 1-5, where 1 is very easy and 5 is very difficult, the average rating of ease in completing tasks was 2.
- One participant, who rated her ease a 3, had more trouble navigating and more confusion when completing tasks.
- A few buttons and messages should be changed for clarity and to avoid confusion: “Anytime that the language can be particularly precise about what a button means is more helpful.” When editing a request, 2 participants expressed concern that they might be duplicating requests rather than simply modifying them. They wanted to see a button that said “Modify Request” rather than “Submit Request”.
- Participants needed more cues as to where they were in the process to avoid confusion and mistakes (“I thought I ordered it, but apparently I didn’t”). More visual cues would also reduce memory load: “I just did it but it’s already vanished from my brain how it was that I did that.” Some participants expressed that they sometimes felt “caught in a loop”: “’You currently have 2 active requests from your available limit of 5.’ So why then can I not seem to request them? Here, you’ve two active requests, but then it’s saying, to submit your request, please select the request, indicate the date and click Submit Information…I seem to be in a circle here.”
- Participants wanted confirmation of what happens next in the request process.
- Participants were unclear about the differences between the various Request options on the left menu. Some questioned why Reproduction was separate from Requests since they were all things that could be requested.
- All participants were confused by the location and function of the search bar. None understood that the search was for requests (some thought it searched Virgo).
- All participants understood what fields were required in the forms, and when asked, most could identify the “red star” or “red asterisk” as meaning that the field was required. Error messages were clear when a mistake was made in filling out the registration form.
- The “Save for Later” function was understood (eventually) by 4 participants, but 2 struggled at first to understand what they were saving and why they wouldn’t want to immediately submit their requests.
- Participants were confused or annoyed by the two modal screens when requesting from Virgo: “This page is unnecessary. Because I just did that. I had the same information on the– this screen is unnecessary.” A similar complaint about what seemed like unnecessary steps when requesting from Archives at UVA: “I definitely felt like when I was looking at Poe records [in Archives at UVA] trying to figure out how to request it I kind of got into a loop there, clicking on Request, then it would take me back to the record, then back to the request thing.”
- Final comments include:
- “Pretty straightforward…pretty intuitive.”
- “It’s just not a great user interface. It’s an easy process. It’s pretty straightforward. It’s just a lot of form-filling.”
- “[Archives at UVA] was a little bit confusing, wasn’t sure how to get where I needed to go, lots of text in a form that was a little bit hard to decipher because it was in big paragraphs… but was pretty easy to eventually figure out and navigate.”
- No button or link should open in a new tab or window. This is a usability best practice that allows visually-impaired researchers to remain oriented and able to navigate backwards. Read more about the experience for visually-impaired and keyboard-only researchers at https://webaim.org/techniques/hypertext/hypertext_links#new_window and https://www.nngroup.com/articles/keyboard-accessibility/. At minimum, request buttons that open in a new tab or window should have a tooltip (and corresponding hidden labels for screenreaders) that indicates that action. At this writing Virgo has a tooltip on the actionable request button but Archives at UVA does not.
- Logoff and Main Menu buttons should be clickable everywhere, not just on the text. This is a usability best practice.
- Add a visual cue on every entry in the Outstanding Requests table that the line is clickable. All text in the row should be a visible hyperlink and the cursor should turn to a pointing finger on hover. (Alternatively, put an Edit button alongside each entry.) This is a usability best practice.
- The Outstanding Requests table should communicate effectively to researchers. The headings “Status” and “Order Status” aren’t significantly different to the researcher and should be changed to something more meaningful. “Awaiting User Review” should be changed to “Unfinished Request” or similar to emphasize to the researcher that they need to take action. Entries should be distinguishable from each other and therefore need more data displayed. One participant suggested “a timestamp or details about the request (in-person or digital). That would be helpful.”
- The “Clone to Copy” button was not understood as the place to clone a record for a duplication request. 4 participants eventually found the button through trial and error (“Edit? No. Cancel? No. Clone? No. Clone to Copy? Okay.”) and were able to complete the task. I recommend changing the button name to something more meaningful to researchers, but I don’t have a good suggestion. Would either “Request a PDF/JPG” or “Request a Duplication” work in this context?
- When coming into the system from Virgo, the request form is prepopulated, but participants still expressed confusion about where they were in the request process: “The title is misleading. If I’m actually in the process of making a request, then remind me I’m in the process.” Rather than “New Book and Printed Material Request” or “New Manuscript/Archives Request,” use the same text as when coming into the system from Archives at UVA, as edited in this screenshot:
This change will help orient the researcher that they are in the middle of the process and will prompt them to do the next step to complete the request or save for later.
- After selecting a date and submitting a request, this screen appears as in this screenshot:
There is a conflict between “Your request is almost complete” and “You do not currently have any requests in review,” and the messages in the blue bars were not consistently seen. I recommend replacing everything in the gray box with a heading, “Your request is complete” and text below that indicates next steps: “Your materials will be available to you on [scheduled date] in the Special Collections Reading Room. Please request them at the desk. This information will also be emailed to you.” Include a link to hours and a map to parking and building location. If possible, send all this information in an email to the researcher and include the TN number and other identifying information. Multiple participants voiced concern about what happens next in the request process, and it is best practice to clearly communicate success and next steps when forms are filled out.
- Add placeholder text in the search bar, “Search Your Requests” and add a corresponding hidden label for screenreaders.
- Add tooltips (and corresponding hidden labels for screenreaders) to all links on the left menu to communicate and clarify purpose of each (“As a first-time user I don’t see a difference between some of these [links], and I would just mash the button to figure out what I want.”)
- Add links to Hours and Planning a Visit in the left menu. Only 1 participant looked for Hours in the FAQ but commented, “I can’t believe I had to do all of that for hours.” The other 4 participants went to the Library home page or to Google to find the answer.
- After changes are made and the system has been live for a few months, we should run usability tests again, which will give us a chance to fine-tune language and confirm fixes. I also need to finish reviewing the Request System from an accessibility perspective to confirm full access for all researchers.
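Several of the accessibility recommendations above combine on a single element: a request button that must open in a new tab or window needs a visible tooltip plus an equivalent hidden label so screen-reader users get the same warning. The sketch below is illustrative only — the names and shape are assumptions for this report, not Aeon’s, Virgo’s, or Archives at UVA’s actual markup or API:

```typescript
// Illustrative sketch only -- not the actual markup of Aeon, Virgo, or
// Archives at UVA. For a link forced to open in a new tab, pair a visible
// tooltip (title) with a matching label announced by screen readers
// (aria-label), so all researchers get the same context-change warning.
interface NewTabLinkAttrs {
  href: string;
  target: "_blank";
  rel: "noopener noreferrer"; // recommended companion to target="_blank"
  title: string;              // visible tooltip on hover
  "aria-label": string;       // hidden label for screen readers
}

function annotateNewTabLink(href: string, visibleText: string): NewTabLinkAttrs {
  const warning = `${visibleText} (opens in a new window)`;
  return {
    href,
    target: "_blank",
    rel: "noopener noreferrer",
    title: warning,
    "aria-label": warning,
  };
}
```

Keeping the tooltip text and the hidden label identical is the simplest way to guarantee sighted and screen-reader researchers receive the same warning.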
Protocol for UVA (Netbadge) users
||There is a signed copy of the book, Tough Guys Don’t Dance, in Special Collections. How would you look at it?
How would you describe the process this far in your own words?
If you were to click on the Request button, what do you think would happen next?
Is there anything unclear about this process so far?
- Evaluate clarity of multiple request buttons
- Evaluate effectiveness of dynamic shading
- Listen for how much of the first and second request screens are read
- Evaluate clarity of instructions modals:
- do they understand where the book will be delivered;
- do they acknowledge that they’ll choose a pick up date next;
- do they understand they’re being taken into another system
- Assess clarity of request process in participant’s own words
||You need to read an article in the journal, Epoch, but it could be in 2001 or 2002 so you’ll need to see both. What do you do next?
- Evaluate clarity of multiple request buttons
- Evaluate visibility of “Please select ONE item to request” and tooltip, “Select only a single item to continue”.
- Evaluate UVA new user registration process
||Complete the request so you can see the journal later today.
How do you know you’ve got all the data you need?
How do you know you’ve successfully submitted the request?
- Assess clarity of submission process from Aeon dashboard
- Determine if * is understood to represent required fields
- Evaluate clarity of date picker
||How can you check to see today’s hours for Special Collections?
- Assess visibility and clarity of FAQ
||You’ve decided to wait until tomorrow to see the journal. Can you make that change?
- Evaluate clarity of selecting request (TN#?) to edit request
- Note if they instead use Outstanding Requests or other option on left menu
||Is it possible to get a copy of the Epoch article instead?
- Evaluate clarity of steps to have a copy made
- Evaluate understanding of file types
||Take a few moments to look at the dashboard. What are your impressions of this page?
What are some of the specific things you can do on this page?
- Assess general impressions
- Assess clarity of different request functions
||Now we’re going to go into a new system for searching for unique Special Collections materials including unpublished manuscripts, university records, visual materials such as films and photographs, audio recordings, digital material, and more. This time I’m going to have you go through the request process as a non-UVA researcher, which has a different login procedure.
Please give me your first impressions of the page at https://archives.lib.virginia.edu/repositories/3/top_containers/371
- Evaluate Archives at UVA record for clarity and completeness
- Evaluate process for non-UVA researchers
||You’d like to take notes on an item in the Poe papers collection. If you are a non-UVA researcher, what do you do next?
- Evaluate visibility of request button and clarity of request button text
- Evaluate visibility of First Time Users link
- Evaluate First Time Users registration process
||Go back to Virgo and begin the request process for Tough Guys Don’t Dance as a non-UVA researcher. Once you are in the Special Collections Request System, can you save your work without submitting?
- Evaluate clarity of Save for later option.
- Discover improvements to recommend.
- Evaluate clarity of Awaiting User Review
- Evaluate clarity of Unsubmitted vs. Outstanding Requests
||Do you have the option to submit more than one request at once?
||Is there anything confusing about this request process?
Is there anything missing from this page?
What would make this request process easier?
- Last comments about request process and Aeon dashboard
||On a scale of 1 to 5, where 1 is very easy and 5 is very hard, how would you rate your ease in completing these tasks?
- Gauge overall experience with request process
UX-4276, User research on proposed IA structure
Purpose: Assess usability and clarity of proposed public website reorganization
Stakeholders: Library users and staff
Testing dates: 4/3/19-4/4/19
Participants: 15 graduate and 27 undergraduate students (all UVA)
Methodology: Enrolled UVA graduate and undergraduate students were solicited by email from our research queue to take a ten-minute test, for which they received a $10 deposit to their Cavalier Advantage card. An online tree test tool by Optimal Workshop was employed to help determine the efficacy of the proposed information structure. Based on previous testing and analytics, we created a simple hierarchy of four top categories, each of which had between two and five subcategories (see Tree Structure). Participants were asked to complete 12 tasks designed to assess how closely the information structure matched their mental models (see Full Test Results, Analysis tab). The tree test results indicate how many participants found the correct path on their first try, how many found it on their second or third try, and how many failed to find the correct path, as well as how fast each participant completed the task. For an overview of how to analyze tree test results, visit Atlassian’s Tree Testing for Websites.
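The headline tree-test numbers reported below — success (ended at the expected answer), directness (got there without backtracking), and average completion time — are simple proportions and means over per-task results. A minimal sketch, assuming each recorded result notes correctness, backtracking, and time (the field names are hypothetical, not Optimal Workshop’s export format):

```typescript
// Minimal sketch of the headline tree-test metrics. The TaskResult shape
// is a hypothetical stand-in, not Optimal Workshop's actual export format.
interface TaskResult {
  correct: boolean;     // participant ended at the expected answer
  backtracked: boolean; // participant moved back up the tree at least once
  seconds: number;      // time taken to complete the task
}

// Share of results that ended at the expected answer ("success").
function successRate(results: TaskResult[]): number {
  return results.filter(r => r.correct).length / results.length;
}

// Share of answers chosen without backtracking ("directness").
function directnessRate(results: TaskResult[]): number {
  return results.filter(r => !r.backtracked).length / results.length;
}

// Mean completion time, reported per task in the findings.
function averageSeconds(results: TaskResult[]): number {
  return results.reduce((sum, r) => sum + r.seconds, 0) / results.length;
}
```

Directness is always computed over all answers, not just correct ones, which is why it can sit below the success rate (as in the 79% vs. 77% figures below).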
Summary of Findings
- Across all 12 tasks, 79% of participants ended up at the expected answer and 77% of answers were chosen without backtracking. Even without visual cues, navigation, and menus, participants are finding the correct answer four out of five times.
- Seven out of 12 tasks had a high rate of success (between 81% and 95% of participants got to the correct answer), and completing each task took an average of 5.29 seconds.
- All 15 graduate students ended up at the expected answer on 5 tasks.
- Half of the participants first clicked on ABOUT and most immediately found and correctly selected Plan a Visit. Another quarter of the participants looked first under USING THE LIBRARY but all found their way to ABOUT/Plan a Visit (task 3).
- Participants were able to correlate the terms “library study areas” and “reserve a study space” with Library Spaces (tasks 1 and 7).
- Participants were able to correlate the phrase “find a book” with Search, Borrow, Request (task 4).
- Participants seemed to have no trouble identifying “Special Collections” as a library and locating it under Libraries & Collections (task 8), although the term “collections” in the task wording may have cued the answer.
- Five out of 15 graduate students incorrectly selected TEACHING & PUBLICATION SUPPORT/Teaching Support to look for course reserves (task 9).
- Five participants incorrectly selected Accessibility Services as the location for getting help with an off-grounds access problem. 26 participants went directly to GETTING HELP/Ask a Librarian (task 10).
- Five tasks from the previous tree test in 2017 were largely duplicated in this latest test. In each case, the results improved. See 2017 test results.
Deeper dive: task #2: “A librarian visited your Economics class last week to talk about doing research. Where can you find his name?”
Participants looked in all four categories for the answer, and only eight got there. 20 participants selected ABOUT/Staff Directory, which is logical if a name is known (and probably indicates a poorly worded task). We intend to link the subject specialist listing with the staff directory, and usability testing will determine whether this helps get students to subject experts. The remaining 8 participants selected GETTING HELP/Ask a Librarian, and it is not necessarily a bad thing that students opt to ask for online help when they don’t know an answer.
- The overall high rate of success indicates a strong IA that aligns with the mental model of students.
- Plan a Visit should remain under ABOUT.
- Library Spaces and Search, Borrow, Request seem to work well under USING THE LIBRARY. Usability testing with a wireframe or prototype will help confirm.
- Whether or not Special Collections can be found under Libraries & Collections is inconclusive and would benefit from further testing.
- Teaching Support resonated with some participants as a logical location for course reserves, so consider adding a search for course reserves there.
- Add further description to Accessibility Services to define it more clearly.
- The plan to link the subject specialist directory with the staff directory should make searching for people easier.
Tree structure as tested
Tasks and answers
||Where would you find library study areas?
(USING THE LIBRARY –> Library Spaces)
||A librarian visited your Economics class last week to talk about doing research. Where can you find his name?
(GETTING HELP –> Subject Specialists)
||Your parents are visiting and want to see the Berlin wall installation outside Alderman. Where can they find directions to nearby parking?
(ABOUT –> Plan a Visit)
||Find a copy of the book, Invisible Man by Ralph Ellison.
(USING THE LIBRARY –> Search, Borrow, Request)
||Which libraries have color printers?
(USING THE LIBRARY –> Equipment & Technology)
||Where would you find submission instructions for an open source journal published at UVA?
(TEACHING & PUBLICATION SUPPORT –> Writing & Publication Resources)
||Reserve a study room.
(USING THE LIBRARY –> Library Spaces)
||Find out why Special Collections is named for Albert and Shirley Small.
(ABOUT –> Libraries & Collections)
||Find course materials on reserve for one of your classes.
(USING THE LIBRARY –> Search, Borrow, Request)
||You’re searching in a journal article database and get an error message instead of the article you need. What do you do next?
(GETTING HELP –> Ask a Librarian)
||What is the email address for the guy in charge of all the libraries?
(ABOUT –> Staff Directory)
||Ask the Library to purchase a book.
(USING THE LIBRARY –> Search, Borrow, Request)
UX-3833, Guerrilla test Robertson Media Center pages and equipment request process
Purpose: Assess usability and clarity of new RMc pages and equipment checkout pages
Stakeholders: users and staff
Testing date: 8/31/18
Participants: 5 UVA students
Methodology: “Guerrilla” testing, in which UX staff, some in a Tyrannosaurus rex costume, invited students entering Clemons Library to spend 10 minutes of their time improving a library web page. Coffee and danishes were offered to all passers-by.
Project phase one deliverables were to create a landing page and second-level pages. Based on the content, design, and architecture, users are to be able to understand the following:
- Who the space is for (students, faculty, community members)
- What services and equipment/spaces are available to whom and how to obtain them
- This is a place for making, learning, and teaching
The protocol was developed to elicit responses that would assess whether these goals were met, and to test reactions to wording for equipment checkout that does not require reserving in advance.
- After perusing the content and images of the landing page, participants were asked what the space is for, and to describe the space using three adjectives. Their answers reflect a space for making and creating, equipment to use and check out, and help when needed.
- Four out of five testers described the space as “high-tech” or “technological.” The specific terms “creative,” “useful,” “3-D,” “VR,” “studio,” “check-out,” and “advanced media” suggest that the participants see a space for using and exploring new technologies.
- Participants used the terms “knowledgeable,” “helpful,” and “supportive,” and phrases such as “people to help,” “assists students,” “help you with stuff,” “classroom,” “tutorial,” “tech support,” and “equipment troubleshooting,” all of which convey help, learning, and teaching.
- One participant offered that “faces and names [under heading, ‘Who We Are’] makes people more approachable.”
- Other terms reflect favorably on the environment: “modern,” “variety,” “expansive,” “fresh,” “eclectic.”
- Four out of five participants correctly identified UVA students and faculty as the audience focus for the RMC. One participant thought RMC might be limited to “STEM or media studies majors.” Two participants also mentioned access by “the public” and “anybody outside UVA” but that they wouldn’t be primary users.
- Participants had no discernible problems interpreting the terms “No Reservations” or “First come, first served,” and the RMC equipment page heading, “Walk-up Equipment,” seems to provide enough context to be correctly understood. However, several participants also assumed they’d be able to use equipment without training, that there would be no lengthy checkout procedure, or that there would be no need to sign out equipment at all. We recommend avoiding idioms and colloquialisms that may lead to confusion and misunderstanding, particularly for non-native English speakers. To ensure clarity, use plain-language phrasing, either “No reservations required” or “No reservations needed,” consistently throughout the RMC and equipment request sites. Additionally, because “Walk-up Equipment” is easily misinterpreted, change it to “No reservations equipment” or similar. Also review the categories under that heading (for instance, video equipment and cameras must be reserved, so they belong under the heading “Reservable Equipment”). Instead of the “Browse the equipment collection” button, use more actionable text such as “Check out equipment” or “Use Equipment.”
- After having reviewed the RMC landing and second-level pages, participants largely correctly identified who the space is for and what goes on there. Participants were then shown the Equipment page and asked: “If you wanted to use a camera, what would you do next?” Three participants initially thought they would have to go to the RMC desk to check out a camera, and one participant wondered if he needed to use equipment in-house. Four out of five had to be directed to click on “Browse the Equipment Collection” to make a reservation online. Three saw the text “Training & Reservation Required,” and another thought they should search online for a tutorial. All participants saw the links for “Availability” and “More details,” but two participants were still unclear what to do next. Most participants clicked on “More details” but didn’t read any of the text. They all saw the calendar and figured out the process of selecting a date by clicking on a green box and logging in via Netbadge. One participant expressed confusion about the difference between red and grey boxes on the calendar, and about the use of the term “padding.”
- Recommendations for all LibCal equipment items, based on the above results: Hide “Availability,” since reservations can be made under “More details.” Change “More details” to a button labeled “Use this equipment,” which users will recognize as actionable. Review content and messaging about requirements and training on item pages to reduce the amount of text. Highlight the Instructions tab (which no test participant clicked on). Describe equipment specifications separately from the training and checkout process. Use short, declarative bulleted lists rather than prose to improve clarity and make text more skimmable.
||[Have tester examine the RMC page at https://www.library.virginia.edu/rmc]
Based on the contents of this page, what is the Robertson Media Center?
|What is this space for?
||Who can use it? Why do you think that?
||Who is this space for?
||What are three adjectives you would use to describe this space?
||What does the page evoke (friendly, techie, diverse, inclusive, bright…)?
||[Direct attention to /equipment and to “Walk-up Equipment”]
What does this mean to you?
Probe: How would you describe it in your own words?
(on-demand, walk-up, something else?)
|Does walk-up or on-demand resonate? If not, what does resonate?
||If you wanted to use a camera, what would you do next?
||Assess understanding of how to get training and how to reserve
Do they see/read “Training & Res Req”?
Go to Equipment
Go to Browse the equipment collection
View More Details or Availability
||[Go to Location = Brown or Music]
Based on the contents of this page, what can you do here?
[Focus on Dry Erase Markers or Microphone and descriptions]
What can you do here?
How would you describe it in your own words?
|Do they see “No Reservations” or Do they see “First Come…”
Do they understand it
Do they click on Availability
Are they confused?
||How would you rate your ease in using this page, where 1 is very difficult and 5 is very easy?
||Evaluate overall ease in using the LibCal equipment page.
UX-3834, Analyze Special Collections Tree Test Results
Purpose: Assess usability and clarity of proposed Special Collections website reorganization
Stakeholders: Special Collections users and staff
Testing date: 8/27/18
Participants: 36 UVA undergraduates
Methodology: Enrolled UVA undergraduates were solicited by email from our queue of volunteers to take a 10-minute test, for which they received a $10 deposit to their UVA debit card. An online tree test tool by Optimal Workshop was employed to help determine the efficacy of the proposed information structure. Based on previous testing, we created a simple hierarchy of 4 top categories, each of which had between two and six subcategories (see fig. 2, Tree Structure). Undergraduate testers were asked to complete ten tasks designed to assess how closely the information structure matched their mental models (see Full Test Results, Analysis tab). The tree test results indicate how many testers found the correct path on their first try, how many found it on their second or third try, and how many failed to find the correct path, as well as how fast each tester completed the task. For an overview of how to analyze tree test results, visit Atlassian’s Tree Testing for Websites.
Summary of preliminary findings
- Overall success rate was 70%. Even without visual cues, navigation, and menus, participants are finding the correct answer at least two-thirds of the time, suggesting the tree is effective but may need some tweaking.
- Overall directness (going directly to the correct area) was 48%. Some participants are struggling to find the right path but get there eventually.
- Participants usually identified what they would find under the About and Donate categories. The Destination tab on the full results site shows that large numbers of participants (the green boxes) correctly selected these categories, and few incorrectly selected these categories (the orange, red, and white boxes).
- The Donate link is readily found and understood. 31 participants found the correct policy under Donate, and the average time to complete the task (5.52 seconds) was half the average time taken to complete the other 9 tasks.
- Almost half of the participants selected Class visits and instruction to complete a task about finding a map, suggesting that we should duplicate some information about planning a visit in this category.
- The Destination tab shows us that Online reference request was incorrectly selected 16 times (although it is not necessarily a bad thing that students opt to ask for online help when they don’t know an answer).
- Participants were able to correlate the term “artifacts on display” with Collections and Exhibitions (task 5).
- Participants were able to correlate the term “give” with Donate (task 6).
- Participants found the digital camera policy under Using Special Collections/Usage Policies and correctly interpreted that this policy would probably cover taking photos with a phone.
Deeper dive: task #7: “Can you confirm that there is a substantial number of William Faulkner papers in Special Collections?”
fig. 1, pie chart of task #7 results
Roughly half of the participants found the correct category, which is several levels down in the hierarchy. However, first-click data shows that an additional 34% clicked on Collections and Exhibitions but didn’t investigate Featured Collections, where they would have found the Faulkner papers. Most wound up in either Manuscript Collections or UVA Archives, so were circling in the right neighborhood but didn’t ring the right doorbell.
- Class visits and instruction seemed to resonate with undergraduates, so consider building up this popular area with duplicate links from Plan a visit.
- Online reference request is used to answer a wide spectrum of questions, so give this a high profile (possibly as a sidebar).
- About and Donate work well as top-level categories.
- Using Special Collections is a broad category that may need further refinement through discussions with Special Collections staff and/or further user research.
- Collections and Exhibitions may not resonate with undergraduates but may work for others with more familiarity with research collections. Featured Collections should be rethought to identify a current goal and purpose, which will inform how and where to present it, and may inform Collections and Exhibitions.
The UX team will factor in these recommendations to rework the information architecture.
fig. 2, tree structure
User Experience Project ID: UX-1922, Library homepage development
Purpose: Assess usability and clarity of proposed website designs
Stakeholders: UVA Library staff and users
Testing dates:
- 2/22/17 – 2/24/17 ; 9/14/17 – 9/19/17 ; 11/15/17 – 11/21/17 (usability testing with UVA undergraduate and graduate students)
- 4/5/17 – 4/14/17 (online surveys of UVA students, staff, and website visitors)
Methodologies: Usability tests (3) and online surveys (3) to present iterations of Library home page designs to test usability and elicit feedback. Efforts were focused on search box location, carousel location and navigation, Hours page, Research page, mobile and tablet navigation, and link naming.
Summary of findings
- Hours are easily found and testers liked its new prominence
- Mobile version of Hours is easily navigated and was praised as “well-formatted”
- Improved visibility of search, which is a primary or secondary task for website visitors
- The “Ask a Librarian” brand is firmly established and the “Ask a subject specialist” link seems to be understood as another place to get help
- Iterations of new design were usually viewed as more attractive than the previous versions
- Mobile hamburger menu is more easily found when labeled “MENU”
- The Research page redesign was described as clean, efficient, logical, and organized
- Thumbnail images under the carousel image helped testers navigate content
Project status: Newly designed website theme rolled out 3/7/2018
Final reports (UVA only): https://virginia.box.com/v/2017WebResearch
User Experience Project ID: UX-2846, Analyze Tree-Test Results
Purpose: Assess usability and clarity of proposed website top level reorganization
Stakeholders: UVA Library staff and users
Testing dates: 11/13/17 – 11/14/17
Participants: 29 UVA undergraduates (paid)
Methodology: Enrolled undergraduates were solicited via a banner on the Library web page to take a 15-minute test. An online tree-test tool by Optimal Workshop was employed to help determine the efficacy of a proposed information structure based on previous testing results. We created a simple hierarchy of 5 top categories for the Library website:
- About UVA Libraries
- Using the Library
- Help and requests
- Research and instruction
- Advanced technology
Each category had between three and nine subcategories. Undergraduate testers were asked to complete ten tasks and were told it was possible to complete each task on the Library website. The tree-test results indicate how many testers found the correct path on their first try, how many found it on their second or third try, and how many failed to find the correct path.
Summary of preliminary findings
- Undergraduates correctly identified categories about spaces (libraries, study areas, room reservations)
- Undergraduates correctly identified categories about searching for books and articles
- The category “Advanced technology” was not well understood
- The task to find specialized software could only be successfully completed by selecting the “Advanced technology” category, but only 38% of testers selected the category immediately, and 41% of testers ultimately selected one of the other four categories
- Color printers and scanners (equipment) were not easily found under Using the Library
- 38% of testers found it immediately, an additional 28% found it eventually, and 34% did not find it at all
Deeper dive: task #8, “Find course materials on reserve for your class”
Only 12 testers (41%) navigated correctly to “Search course reserves” or “Search Virgo” to complete the task. Of the 17 who did not navigate correctly, 11 looked for the answer under the “Help and requests” category (which may show a willingness to ask a librarian for help when they can’t find something on the website). Of those 17 failures, 10 selected “Research and instruction” (one of the correct categories) at some point, but still did not successfully complete the task. Across all 29 testers, this task took an average of 17.65 seconds, considerably slower (by at least 8 seconds) than the other tasks. Previous testing has indicated that some students are unfamiliar with the concept of course materials being held on reserve in a library. The high failure rate, coupled with the extended time this task took, further suggests confusion with this concept.
The testing information gathered here continues to inform website organization projects. Some categories of information should be surfaced so that undergraduates can find them, but other user groups (graduate students and faculty) also need to be tested.
Testing results: https://www.optimalworkshop.com/treejack/LibUX/uvalib2/shared-results/8xa6v522406acu7g2y3l3n0gu7x3nnf8
User Experience Project ID: UX-2368, Facilitate two focus groups with graduate students
Purpose: Investigate sources of dissatisfaction among graduate students
Stakeholders: Academic Engagement, Administration & Planning, Collections, Access & Discovery, and other Library areas
Focus group dates: 11/29/17, 11/30/17
Participants: 11 graduate students (including one session with five international graduate students)
Methodology: Low ratings by a small subset of graduate students in the 2016 user survey were a cause for concern, but root causes were elusive. Two focus groups were held to help identify issues. One focus group was limited to international students to learn of their specific needs. Most students were recruited through the usual avenues (web banner on the Library home page; contact cards left at print stations). Others had participated in the 2016 user survey and agreed to future participation in user research, and several were invited based on past feedback they had given. UX research staff created two focus group protocols probing problems and their resolutions, confusion or difficulties, preferred types of communication, and how the Library can better contribute to graduate student work and to their overall positive experience. The international students were also asked about the specific challenges they face.
Summary of findings
- Communication and connectivity are important elements of a positive graduate student experience. Graduate students want us to market to them early and often, and want communications and workshops geared toward them.
- Graduate students want document delivery service and they want us to be proactive, try harder, and care more about getting the paper and electronic materials they need.
- Library systems and policies are confusing and frustrating, especially to incoming students, many of whom navigate with the help of their peers. They want short DIY online video tutorials for online tools that can be found and viewed at point of need, but prefer in-person assistance from subject specialists for in-depth research.
- Graduate students want space for work and storage in libraries. Physical comforts impact willingness to use and stay in library spaces.
- Students with mobility issues can be negatively impacted by collection decisions that split call number runs between multiple libraries.
- International students matriculate with few expectations for library services beyond checking out books and comfortable study space; there were no notable differences in library experience between them and their peers from the U.S.
Recommendations
- Create a communication plan, including physical materials, for incoming and other graduate students to convey services, policies, and contacts.
- Create workshops to engage graduate students in library services and research resources.
- Create short online video tutorials for services and processes.
- Consider providing scanning and electronic delivery of print journal articles to all graduate students.
- Explore possibilities to expand dedicated graduate student space in all libraries.
- Increase efforts to convey information about critical systems, services, and policies to graduate students, particularly incoming students. Include basic information about the online catalog, Interlibrary Loan, policies (borrowing, recalls, course reserves), off-campus access to online research materials, printing, obtaining a carrel, reserving a classroom, and stacks navigation.
- Increase in-person communication with graduate students about subject liaison program.
- Review tight stacks areas that may provide poor accessibility.
- Consider revising collections policies to keep all materials in the same subject together to enhance accessibility for students with mobility issues.
- Do further research to gather more complete data on the following:
- assess how the Library should communicate with graduate students;
- gain a deeper understanding of the specific needs of international students;
- determine graduate student priorities for document delivery, paging from other libraries, and other delivery services;
- assess what current graduate students wish they’d known as incoming students.
Final Report and protocols (UVA only): https://virginia.box.com/v/gradfocusgroups