One Search Bar To Rule Them All

  • Purpose: Understand use of and expectations about search from the UVA Library homepage
  • Stakeholders: UVA Library homepage users and staff
  • Survey dates: April 19-25 (students) and June 29-30 (faculty)
  • Participants: 33 UVA undergraduate students, 19 graduate students, and 9 UVA faculty
  • Methodology: Participants were recruited from our existing research queue and through emails forwarded to academic departments. Students were compensated with a $10 Amazon card for completing the survey. The online Qualtrics surveys presented examples of site search bars and search results using screenshots from Harvard and Stanford libraries. Participants were given tasks to determine clarity of designs and were asked for opinions about the preferred display of search results. See Survey for a full copy of the survey questions.

Summary of Recommendations

  • Keep the Virgo search bar prominently displayed on the Library home page. Faculty and graduate students in particular use it frequently.
  • The magnifying glass icon that represents site search isn’t sufficiently identifiable on its own. If used, the icon needs a label.
  • There is some evidence that patrons are not expecting separate search bars for the catalog and library site search, and also evidence that patrons prefer to have a comprehensive page of search results. This strongly suggests that we should work toward having a combined search for Virgo and the Library website, at minimum. Preferably we should also include LibGuides and other major content sites within the Library domain.
  • The search results page should be categorized in a familiar way (“catalog,” “articles,” “library information,” “LibGuides”) and should have a clean, clear, and modern design. Breadcrumbs will help patrons navigate this complex search results page.

A Closer Look: Survey Question #4

“Here is a screenshot of another library homepage.  Click or tap once on the area where you would perform a search for information about this library’s food and drink policies.”

Crimson banner with white search bar at center

Harvard home page screenshot

Only 10 of 52 students and 3 of 7 faculty correctly identified the magnifying glass icon as the place to find site search. Almost half of the students (22 of 52) misidentified the Hollis catalog search bar as site search. Participants may have interpreted “perform a search” as synonymous with clicking on links. Student comments indicate that many chose the Hollis search bar because of its size and central location. Students may have expected the main search bar to provide this information, or they may not have been expecting a second search bar. “Hollis” may not be identifiable as an online catalog, leading to incorrect use. Or there may be an expectation that a single search bar will search everything Library-related.

When asked to explain why they chose the area they did:

  • “Food and drink policies wouldn’t be in the catalog, it would be on the library site”
  • “That’s where I would normally search in other databases”
  • “It is a search box”
  • “Very in my face; I don’t have to search for the search box”
  • “It is at the center of the page”
  • “It looks like a regular search bar”
  • “It may be easiest just to ask someone”
  • “No good choices”
  • “Unsure where to search”

A Closer Look: Survey Question #9

“Which option do you prefer for the display of search results?”

One column with headings Library, Tool, Service, Collection

Option 1 [Harvard] search results screenshot

Option 1 [Harvard]

Two columns with headings Catalog, Articles, Guides, Library Website

Option 2 [Stanford] search results screenshot

Option 2 [Stanford]

Answer                 Undergrads   Grads   Faculty   TOTAL
Option 1 [Harvard]         19          3       2        24
Option 2 [Stanford]        13         13       2        28
No preference               1          1       0         2

Combined totals are roughly split between Options 1 and 2, but the majority of graduate students (13/17) preferred Option 2 (Stanford). Faculty were evenly split between Option 1 and Option 2. Even among students who reported using UVA Library site search at least once a month, preference between Option 1 and Option 2 was evenly split.

The comments give us a good idea of what we should take away from each design. The Harvard design (Option 1) was thought to be sleek, clean, and easy to read and navigate. The Stanford design (Option 2) received high marks for breaking down search results into familiar categories like articles, catalog, and library information. The Stanford design was considered organized and complete in its presentation of results, but it was also criticized for being “outdated” and “cluttered.”

“Do you have any further comments about the display of search results?”

Sampling of comments from faculty:


  • “I like how in [Option 2] it clearly distinguishes between logistics & research.”
  • “Option 2 is categorized in a slightly more clear way but the design looks hideous and outdated + confused about catalog vs. articles options. Option 1 would be better if it had clearer categories.”

Sampling of comments from graduate students:

  • “Option 1 is easier to read, but Option 2 provides the most information for more specific searches.”
  • “I like how [Option 2 ] breaks down the information into categories like articles and library website.”

Sampling of comments from undergraduate students:

  • “I like the subcategorization [of Option 2]. Helps me quickly eliminate places I need to look.”
  • “I think the descriptiveness of option 2 is great but option 1 layout is cleaner.”


Faculty and students were given surveys with only slight differences. This is the survey as presented to students. Some questions were removed from the final analysis due to problematic wording.

Please help the Library evaluate our website.

The Library is reviewing the search bar function on our web pages and we would like your help. You will be asked to view several screenshots from library pages at other institutions and answer questions about them. This survey should take no more than 10 minutes of your time. You will need a desktop, a laptop, or a tablet in landscape view to complete this survey.

If you have any questions about this survey please contact Melinda Baumann, User Experience Librarian, at

Thank you!

Q1. Which best describes your relationship with the University of Virginia?

  • UVA Faculty or Staff
  • UVA Undergraduate Student
  • UVA Graduate Student
  • UVA Alum
  • Other

Please tell us about your experiences and expectations when searching for information on the Library’s web page.

Q2. How many times a month do you start a Virgo (Library catalog) search from the Library home page?

blue header with logo and search bar

UVA Library header screenshot

____ 0 ____ 1 – 5  ____ More than 5

Q3. How many times a month do you search the Library website using the search bar shown here?

white search box with heading, Search Our Site

Library site search screenshot

____ 0 ____ 1 – 5  ____ More than 5

Here is a screenshot of another library homepage.

Q4. Click or tap once on the area where you would perform a search for information about this library’s food and drink policies.

Crimson banner with white search bar at center

Harvard home page screenshot

What prompted you to choose this area?


The remaining survey questions relate to the display of search results.

Here is a screenshot of another library’s search results page.

Q5. Click or tap once on the area of this screenshot where you might expect to find information about that library’s food and drink policies.

One column with headings Library, Tool, Service, Collection

Search results screenshot from another library

Please rate your confidence that you might find information on food and drink policies here, where 1 is not confident, 3 is neutral, and 5 is very confident.

5 (Very confident)   4   3 (Neutral)   2   1 (Not confident)

Q6. Click or tap once in the area where you might search the library catalog to find resources about different food and drink policies. [Question removed from final analysis]

One column with headings Library, Tool, Service, Collection

Search results screenshot from another library

Please rate your reaction to these search results, where 1 is confusing or frustrating, 3 is neutral, and 5 is clear and easy to read. [Question removed from final analysis]

5 (Clear and easy to read)   4   3 (Neutral)   2   1 (Confusing or frustrating)

Do you have other comments about this search results page? [Question removed from final analysis]


Here is a screenshot of another library’s search results page.

Q7. Click or tap once in the area of this screenshot where you might expect to find information about that library’s food and drink policies.

Two columns with headings Catalog, Articles, Guides, Library Website

Search results screenshot from another library

Please rate your confidence that you might find information on food and drink policies here, where 1 is not confident, 3 is neutral, and 5 is very confident.

5 (Very confident)   4   3 (Neutral)   2   1 (Not confident)

Q8. Click or tap once in the area where you might search the library catalog to find resources about different food and drink policies. [Question removed from final analysis]

Two columns with headings Catalog, Articles, Guides, Library Website

Search results screenshot from another library

Please rate your reaction to these search results, where 1 is confusing or frustrating, 3 is neutral, and 5 is clear and easy to read. [Question removed from final analysis]

5 (Clear and easy to read)   4   3 (Neutral)   2   1 (Confusing or frustrating)

Do you have other comments about this search results page? [Question removed from final analysis]


Q9. Which option do you prefer for the display of search results?

One column with headings Library, Tool, Service, Collection

Option 1 [Harvard] search results screenshot

____ Option 1 [Harvard]

Two columns with headings Catalog, Articles, Guides, Library Website

Option 2 [Stanford] search results screenshot

____ Option 2 [Stanford]

____ No preference

Do you have other comments about this search results page?


Thank you for helping to improve the Library web site!


Research Results: Library homepage survey of students and faculty

Web page with blue header and Happy New Year image, Today's Hours, and more links.

Screenshot of UVA Library homepage, 1/25/21

  • Purpose:  Update research on use of and expectations about UVA Library homepage
  • Stakeholders:  UVA Library homepage users and staff
  • Survey dates:  March 15-17 (students) and April 5-19 (faculty)
  • Participants:  58 UVA undergraduate and graduate students and 21 UVA faculty
  • Methodology: Participants were recruited from our existing research queue, homepage banners, and emails to Library liaisons that were forwarded to academic departments. Two online Qualtrics surveys were written based on data gathered four years ago, when the homepage was newly designed, and on current research questions. Students were remunerated with a $10 gift card.

Highlights of Findings

Survey participants were asked about the three links or areas that were most important to them (Q1). Faculty selected Virgo or Advanced Search Virgo as their first choice by a more than 2:1 margin, and 85% of faculty chose Virgo as one of their top three links. LEO/ILL and Research (tied at 40%), Databases and Services (tied at 35%), and Hours (25%) rounded out faculty’s top link choices.

bar chart with tan bars and yellow lines

Final tally of most important links (without regard to whether the choice was first, second, or third)

Students also selected Virgo as their first choice by a more than 2:1 margin. Other top link choices were Research (18%), Hours (13%), Spaces (11%), and Databases and Subject Guides (tied at 5%). Hours was more important to faculty than to students, which may be an anomaly caused by the closing of most libraries during the COVID pandemic; analytics from March and April 2021 show that the Hours page is still a top performer (second only to the Research page).
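The “final tally” chart below counts each link once per respondent, regardless of whether it was the first, second, or third choice. A minimal sketch of that tally; the responses and link names here are hypothetical illustrations, not the survey data:

```python
from collections import Counter

# Hypothetical Q1 responses: each respondent names a 1st, 2nd, and 3rd link
responses = [
    ("Virgo", "Hours", "Research"),
    ("Virgo", "Research", "Databases"),
    ("Research", "Virgo", "LEO/ILL"),
]

# Tally without regard to rank, as in the bar chart of most important links
tally = Counter(link for picks in responses for link in picks)

# Share of respondents naming each link anywhere in their top three
share = {link: n / len(responses) for link, n in tally.items()}

print(tally.most_common(3))
```

Percentages like “85% of faculty chose Virgo as one of their top three links” fall out of the `share` dictionary directly.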

Faculty described the Library homepage as “informative,” “organized,” and “comprehensive” (Q2). Third terms provided by faculty were more likely to be negative (“busy,” “cluttered,” “overwhelming”). Students used descriptors such as “clean,” “informative,” “organized,” and “helpful.”

Prominent words are clean, informative, easy, organized, boxy, busy

Wordcloud combining faculty and student top descriptive terms
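The word cloud is driven by a frequency count over the free-text descriptors from Q2. A minimal sketch of that count, assuming three descriptors per respondent and case-folding before counting; all responses here are hypothetical:

```python
from collections import Counter

# Hypothetical Q2 descriptors: three per respondent, faculty and students combined
descriptors = [
    ["Informative", "organized", "busy"],
    ["clean", "Informative", "helpful"],
    ["organized", "clean", "cluttered"],
]

# Case-fold so "Informative" and "informative" count as one term
freq = Counter(word.lower() for terms in descriptors for word in terms)

print(freq.most_common(3))
```

A word-cloud tool then sizes each term proportionally to its count in `freq`.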

Survey participants were asked to rank the importance of nine types of information available on the About the Library page (Q5). Both students and faculty gave Library Collections top importance (86% of faculty and 77% of students ranked it 1, 2, or 3). Opinions then diverged: students chose Directions and Maps next, followed by Plan a Visit, Mission and Policies, and Staff Directories, while faculty ranked Staff Directories more highly than Directions and Maps, followed by Mission and Policies and Plan a Visit. Both faculty and students ranked Tours, Library Assessment and Statistics, Jobs, and Ways to Financially Support the Library as less important to them.
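The “ranked it 1, 2, or 3” figures above are the share of respondents who placed an item in their top three. A minimal sketch with hypothetical rankings (only three of the nine About-page items shown):

```python
# Hypothetical Q5 rankings: each respondent ranks items 1 (most) to 9 (least important)
rankings = [
    {"Library collections": 1, "Directions and maps": 2, "Plan a visit": 5},
    {"Library collections": 3, "Directions and maps": 6, "Plan a visit": 2},
    {"Library collections": 4, "Directions and maps": 1, "Plan a visit": 3},
]

def top3_share(item, rankings):
    """Share of respondents who ranked `item` 1, 2, or 3."""
    hits = sum(1 for r in rankings if r.get(item, 9) <= 3)
    return hits / len(rankings)

print(f"{top3_share('Library collections', rankings):.0%}")
```

The same function, run per item, yields the importance ordering discussed above.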

Survey participants were asked to evaluate the Cornell University Library Hours page (Q8). About a third of the students liked the green and red colors representing Open and Closed: “The color coding makes it easier to find libraries that are open and closed when quickly scrolling.” Almost as many students liked the contact information (email, phone numbers). Seven students liked that services such as making a research appointment and picking up materials were listed: “I like that it provides links immediately for further steps.” Five students appreciated the thumbnail photos of libraries, and one commented that they might provide a necessary visual cue: “I like how it shows pictures for some of the libraries. If possible, that would be great because we don’t always call the libraries by their proper name so it can be confusing.” Faculty had similar responses: They approved of the color-coding, contact information, and links to services. Neither group mentioned the Libraries map button in the upper right corner.

Summary of Recommendations


  • Virgo search bar, ILL/LEO, Research, and Hours should all be prominent on the new homepage design.
  • Terms used to describe the homepage, such as “informative” and “organized,” reflect the UVA Library brand. Students’ first descriptor for the homepage was most often “clean,” which, while not a negative term, may not be what we envision. Refresh this data with more testing after we’ve determined what terms we’d like to represent the Library brand.
  • Faculty and students expect to find Collections on the About page; we’ve already put it there in the new navigation. The remaining top areas will either be on the About the Library page or a subpage. Plan a Visit ranked in the middle with faculty and students but should prove to be more important to other types of users, meriting its place of prominence in the new About navigation.
  • When viewing another institution’s Hours page, the color-coded Open and Closed buttons were favored by students and faculty, as were the contact information and the links to primary services to make appointments and retrieve materials. Some respondents cautioned about adding more information to our already-complex Hours page. Until we take on a complete redesign of the Hours page we should heed the latter advice.


Faculty and students were given surveys that differed slightly in question order. Faculty were not asked two questions about Hours as marked below. This is the survey as presented to students.

Please help the Library evaluate our website!

The Library is refreshing the look of its webpages and we would like your help. You will be asked to review several current pages of the Library website and a screenshot from a library website at another institution, and answer questions about them. This survey should take no more than 10 minutes of your time.

If you have any questions about this survey please contact Melinda Baumann, User Experience Librarian, at

Thank you!

Which best describes your relationship with the University of Virginia?

  • UVA Faculty or Staff
  • UVA Undergraduate Student
  • UVA Graduate Student
  • UVA Alum
  • Community Member or Other

Please open the Library homepage ( in a web browser, and then answer the following questions.

Q1. As you think about using the Library homepage, what are the three links or areas on this page that are most important to you?

1st link or area ________________________________________________

2nd link or area ________________________________________________

3rd link or area ________________________________________________

Q2. What are three words or phrases you would use to describe this homepage?

1st descriptive word or phrase _______________________________________________

2nd descriptive word or phrase ______________________________________________

3rd descriptive word or phrase _______________________________________________

Please tell us more about your expectations when using the Library homepage.

Q3. If you were to click or tap on the About link in the Library website header, what types of information would you expect to find on the resulting page?

blue header with About circled in red


Q4. Now please open the Library About page ( in a web browser.

Is this what you expected to find? Check the box that best describes your answer.

5 (Very much what I expected to find)   4   3 (Neutral)   2   1 (Not at all what I expected to find)

Q5. Here are the types of information about the Library that you might find on an About page. Please rank them in order of importance to you, with 1 being most important and 9 being least important.

______ Information to plan a visit

______ Directions and maps

______ Tours

______ Library collections

______ Library mission and policies

______ Library assessment and statistics

______ Jobs

______ Staff directories

______ Ways to financially support the Library

Q6. (Students only) If you were to click or tap on the Hours link in the Library website header, what types of information would you expect to find on the resulting page?

blue header with Hours circled in red


Q7. (Students only) Go to the Library Hours page ( in another browser tab.

Is this what you expected to find? Check the box that best describes your answer.

5 (Very much what I expected to find)   4   3 (Neutral)   2   1 (Not at all what I expected to find)

Q8. Please compare the UVA Library Hours page to the screenshot from another library hours web page, below. Are there any features on this hours page that you like?

Library hours listed with closed button in red or open button in green, plus photo of library and other links

Please tell us about your experiences and expectations when searching for information on the Library’s web page.

Q9. If you were to use the search bar in the Library website header above, what type of search results would you expect to see? This question refers to

  • Results from Virgo, the Library’s online catalog
  • Results from the Library website (site search)
  • Results from both Virgo and site search
  • None of these results

Thank you for helping to improve the Library homepage!








Research Results: Learning Resources and Mental Models

OBJECTIVE: Identify and assess mental models for Learning Resources web page via card sort research

Stakeholders: UVA students; Teaching & Learning staff
Testing dates: 10/7-10/9/20
Participants: 8 graduate and 24 undergraduate students

Methodology: Enrolled UVA graduate and undergraduate students from our research queue were solicited by email to take a fifteen-minute online test, for which they received a $15 VISA card. We used a card sort tool by Optimal Workshop to map how students think about organizing online resources. T&L staff created a list of 40 learning resources (or “cards”). The 32 participants were asked to sort the cards into groups that made sense to them (see Full Test Results, Analysis tab). The data were then analyzed to determine patterns in mental models to inform the organization of the Learning Resources web page. For more information see the Nielsen Norman Group’s overview of the card sort methodology.
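Grouping-agreement figures like those reported below (cards grouped together 84% to 100% of the time) can be derived from pairwise co-occurrence counts: for each pair of cards, how many participants placed both in the same group. A minimal sketch of that computation, not the Optimal Workshop implementation; the sorts here are toy data using a few real card names:

```python
from collections import Counter
from itertools import combinations

def co_occurrence(sorts):
    """Count, for each pair of cards, how many participants placed
    both cards in the same group. Each sort is a list of groups,
    and each group is a set of card names."""
    pair_counts = Counter()
    for groups in sorts:              # one participant's sort
        for group in groups:
            for a, b in combinations(sorted(group), 2):
                pair_counts[(a, b)] += 1
    return pair_counts

# Toy data: two participants sorting four cards
sorts = [
    [{"Searching Virgo", "Boolean Search Tips"},
     {"iMovie Essentials Training Playlist", "Podcasting in Audacity"}],
    [{"Searching Virgo", "Boolean Search Tips", "Podcasting in Audacity"},
     {"iMovie Essentials Training Playlist"}],
]

counts = co_occurrence(sorts)
# Agreement = fraction of participants who grouped a given pair together
agreement = counts[("Boolean Search Tips", "Searching Virgo")] / len(sorts)
print(f"{agreement:.0%}")
```

Sorting each group's names before pairing keeps the pair keys canonical, so the same two cards always map to the same counter entry.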

Summary of findings

  • There was considerable agreement on the grouping of cards relating to multimedia, technology, and training. Nine cards were grouped together from 84% to 100% of the time, indicating that participants saw a strong relationship between these cards.
  • A second grouping occurred with cards that related to conducting research, research methods, and definitions of scholarly terms.
  • A third grouping related to discrete tasks one would accomplish in the Library and things a new Library user might find helpful to know.
  • Cards that related to getting help were more often put into other topic-specific categories; there were few mental models that included a generic “Help” grouping.
  • Although the term “Virgo” was used on four cards there was little attempt to create a specific “Virgo” grouping (more likely to be a librarian’s mental model of categorizing by specific research tools). These and other cards relating to acquiring books and journal articles were distributed in categories about doing research and about accomplishing Library-related tasks.
  • After completing the card sort participants were asked to give a name to the web page the resources might appear on. Although suggestions were provided to steer people away from the terms “learning” and “resources,” the majority of participants (21/32) still used at least one of those terms, and of those, 11 used both terms.

Recommendations


  • The learning resources can be grouped into three task-based categories that align with students’ mental models.
  • Each category should have subcategories for Getting Started; Help; and other subcategories that will allow students to drill down to more specific tasks and resources.
  • Further usability testing on these categories may result in some fine-tuning of these terms. At this writing, the following categories and subcategories are recommended:
    • Doing Research
      • Getting Started
      • Help
      • Citations
      • Searching for Books and Articles
    • Training and Multimedia
      • Getting Started
      • Help
      • Tutorials
    • Using the Library
      • Getting Started
      • Help
      • Accessing Materials

40 Learning Resources cards

  • 3D Printing Studio Training-MakerBot Playlist
  • Accessing a Book Through the Hathi Trust Emergency Temporary Access Service
  • Audacity Basic Training Playlist
  • Booking a Room at a UVA Library
  • Boolean Search Tips
  • Canon C200 Camera Training Playlist
  • DaVinci Resolve 16: Basic Training Playlist
  • DaVinci Resolve 16: Beginner’s Guide
  • Deeper dive into databases
  • Finding a Book in the UVA Library
  • Getting Help
  • Getting Started with Virgo, the Library’s catalog
  • How Do I Find What I Need? – searching UVA resources
  • How Do I Start My Research?
  • How Do I Use Information Correctly?
  • How Is Information Created? – about peer review and scholarship
  • How to Make a Request in Virgo
  • How to Make Video Tutorials Playlist
  • iMovie Essentials Training Playlist
  • Information Ethics & Citations
  • Interlibrary Loan – when UVA Library doesn’t own it
  • Introduction to databases
  • Navigating the Library Website
  • Peer Review in 3 Minutes
  • Placing an Interlibrary Loan Request
  • Podcasting in Audacity
  • Recalling a Book
  • Scholarly vs. Popular Sources Checklist
  • Searching Virgo
  • Thinking Tool
  • Using Academic Search Complete for articles
  • Using UVA Library Subject Guides
  • Virgo Help
  • Walking Tour of Grounds
  • What is an Academic Library?
  • What is Authority? – how to determine an authoritative source
  • What Type of Source Do I Need?
  • Who is a Scholar and How Do I Become One? – foundations of scholarship
  • Word Press Training
  • Zotero Walk-Through – citation manager

Renovations, Remote Meetings, and the Promise of an Accessible Workplace

Alderman Library at the University of Virginia is starting its first major renovation since new stacks were added onto the back of the building in the 1960s. The renovation requires that the building be closed until 2023, which right now seems like a very long time. But already I’m noting a silver lining for the Library staff, and it all has to do with inclusivity during staff meetings.

Most of the Alderman staff moved in late 2019 to outposts on Old Ivy Road and up 29N to the University Research Park; a few others are cozied into other libraries (Clemons, Brown, Harrison Small) or embedded in Kerchof and Zehmer Hall. We had our first Library-wide staff meeting, an event that occurs monthly, on January 14, 2020 in the Harrison/Small auditorium. Maybe 75 of our staff of 225 were present. Many others attended remotely through Zoom, a streaming video conferencing platform. Remote attendees participated individually from their desks or collectively from conference rooms. More than passive observers, the remote participants were able to submit questions and comments via an integrated Zoom chat tool.

To facilitate a smooth inaugural remote staff meeting there were a few rules: always use a microphone, whether presenting from the podium or asking a question from the audience. No one was allowed to say “I have a loud voice and don’t need a mic.” Remote participants were asked to mute their microphones so Zoom wouldn’t pick up any distracting throat-clearing or ringing phones. Presenters were asked to repeat questions into the podium microphone (until it was confirmed that the audience microphones were, in fact, broadcasting audio to the remote posts). One person in the auditorium watched the chat screen and relayed technical issues and questions from afar. All were asked to be clear to whom they were addressing questions, since there would not always be a visual cue such as making eye contact with the person at the podium.

The meeting went pretty well for a first effort, in part because a little awareness and patience go a long way. The people in the auditorium understood that they really needed to use the microphones so the remote people could hear everything. To be sure every comment was heard, things were repeated, sometimes more than once, but that ensured that everyone could fully participate whether they were in Harrison/Small or miles away. That was the goal: to be sure everyone was included.

The silver lining I see is this: in taking this step we actually made staff meetings a little more accessible. We just made life a little easier not only for staff who are hard of hearing, but also for anyone with an ear infection or sitting near a noisy air handling unit. These are conditions, whether permanent or temporary or situational, that might affect one’s ability to hear, and we are all susceptible.

That’s the thing about accessibility: we all need, or will need, this consideration at one time or another. There is no “normal” condition, only the human condition, which means constant growth and change. A broken arm may hinder you only until it heals, or you may need help opening a door only because you are holding a bag of groceries. No matter the reason, if you can hit that panel with your elbow to automatically open a door, the room is more accessible to all. And when we’ve improved accessibility for all, we are more inclusive.

There is a new UVA initiative called Inclusive Excellence that promotes the “active, intentional, ongoing process to build community well-being and belonging.” When we take time to assure that others can hear everything that is said at a staff meeting, we are being more inclusive. The Library is now one step closer to having an accessible and inclusive workplace for all.

Read more about accessibility

Research Results: Focus Group with Graduate Students

Objectives


  • Gather feedback from graduate student population on overhauled online catalog tool in development
  • Explore overall impressions of the design, utility, and clarity of the new interface
  • Obtain high-level guidance on how to approach future design/development
  • Expose areas of design/development consideration we may not have explored or that may elaborate on user stories
  • Identify potential usability issues and/or ideas we should explore in upcoming testing
  • Surface and discuss design/branding questions and ideas

Method


  • Focus group with 6 UVA graduate students on October 25, 2019
  • Set ground rules:
    • We want you to do the talking
    • No right/wrong
    • All answers are confidential (but we are recording and transcribing for a limited number of project participants)
  • What we’re interested in: Reactions to this work-in-progress
    • Look-and-feel
    • Organization of info
    • Functionality
    • Presentation/understandability of results
    • Ideas for improvement: Generating ideas and figuring out mental models. Answers the question: “How would that solve a problem for you?”
    • Paper for sketching, jotting notes, plus white board and markers provided

online catalog tool in development

Findings


  • Provide more information sooner (full screen display of records; provide more details on records)
    • Want to get to info in fewer clicks; don’t add unnecessary steps
    • Prefer list display of search results to see more at once
  • Groupings, “see all”, availability, filters/facets/values all caused initial confusion but were learnable to grads
  • Very much liked that it was possible to search multiple sources at once and find things they weren’t looking for
    • Also want WorldCat, special collections, digital humanities, Google Scholar pools
  • Grads would use customization if available to set default search options (sources, filters) and display views
    • Really like bookmarks that extend over sessions (unlike current catalog’s ‘starring’ option)
  • Important to be able to limit by date, language, availability before searching
  • Subject headings should be hyperlinks to aid research
  • Missed having facets on basic search
  • Comment: “This looks like a blog”
  • Biggest problem with VIRGO currently is the search algorithm that produces results that are not always relevant (books and articles)

Research Results: Guerrilla Testing with Learning Resources

By Dave Griles

Purpose and Method

The Learning Resources webpage is new and in production. It serves as a gateway to research and teaching resources for students and faculty, so our purpose was to confirm that the design, terminology, and navigation made sense. Students, primarily undergraduate, were recruited at Clemons Library on October 1, 2019 and asked to review the website. Each user session lasted from four to ten minutes.

Summary of Results

Overall, the page performed well for the four testers, with most tasks readily accomplished without additional prompting. However, testers did seem to expect to find citation engines and databases under the Research Basics category. Tester suggestions for improving the page centered on more specific category descriptions and more information in the sidebar.

Questions: Learning Resources

  1. What is this page for? What can you do here?
    • Testers identified that this was a place for searching for library resources.
    • Testers characterized the page varyingly as “FAQish” to an “abstract tool for research planning.”
    • One noted the instructional content.
  2. How would you find out about the ACRL Sandbox? [If they search:] How can you find it another way?
    • Testers primarily used A-Z list.
    • Testers found search box when asked about alternative methods to find target.
    • One tester attempted search from category view, which resulted in no apparent results.
    • Researchers noted page default of category view was inconsistently applied, sometimes leading the testers to start with A-Z list.
  3. What does the down-arrow mean?
    • While all testers identified that the down arrow would provide more information, their interpretations varied:
      • “all the links”
      • “more specific resources”
      • “more about items”
      • “drop down menu”
  4. What would you expect to find under Research Basics?
    • Two testers expected to find databases, with one indicating “Citation Machine”
    • One tester expected easier resources, ones with a daily need to access
    • One tester did not know what to expect, identifying it as a very broad category
  5. What would you expect to find under Beyond the Basics?
    • Testers provided a wide array of responses
      • “More about research processes”
      • “More researching methods and other resources”
      • “More in-depth and using resources”
      • “Tools for research planning”
  6. What would you expect to find under Using the UVA Library?
    • Testers provided a wide array of responses, with two focusing on using the library
      • “How to use the UVA Library”
      • “Using the Library, rooms in the Library, about the Library”
    • Two testers focused on finding resources:
      • Find books and articles
      • Virgo and room reservations
  7. What would you expect to find under Teaching Resources?
    • Two testers responded that this area was for faculty or faculty/staff use and about how to teach.
    • Third tester responded “resources for class”
    • This question was not presented to the fourth tester
  8. Which of these categories are useful to you? Is there anything missing?
    • Research Basics: 2
    • Beyond the Basics: 0
    • Using the UVA Library: 3
    • Teaching Resources: 1
  9. How would you make this page easier to use?
    • Categories are vague; need to click deeper to know what is in them
    • Do not know what you can find on the page. Would like info in a sidebar.
    • Confused as to why recalling a book is in Research Basics


Recommendations

  • Confirm search box works in either category or A-Z views
  • Page should by default open in category view
  • Description of Research Basics category should emphasize that these are videos and tutorials about using different types of resources, not the resources themselves
  • Consider reordering the content with Using the UVA Library as first category

Research Results: Intranet Testing with Library Staff

UX-4517, Gather feedback from advisory group and staff about proposed Confluence Information Architecture

Purpose: Assess usability and clarity of the staff website migrated from WordPress to Confluence (wiki)

Stakeholders: Library staff
Testing dates: 6/14/19
Participants: 6 Library staff

Methodology: Library staff were recruited to spend 10 minutes reviewing a new interface and arrangement of intranet content and were asked to perform tasks designed to assess usability and clarity of key areas.

Objectives were to assess first impressions and to obtain feedback about what is missing or confusing with the new organization of content, and to determine what improvements are needed. Major categories of information were informed by analytics and other data. Tasks were to find organizational information; contact Library Staff Council; update one’s own directory information; receive reimbursement for work-related travel; and subscribe to a library listserv.

screenshot of new interface

Intranet as tested with library staff on 6/14/19


Findings and Recommendations

  • Top of the page needs something visually interesting.
  • The term “workflow,” used in the first task, confused some people, which may have reduced successful completion.
  • Add an alpha list of departments with a “More…” link to complete list.
  • “Our Organization” was either overlooked or not fully understood. Clarify with more links or teaser info about what’s in this area.
  • “Space Shortcuts” should be relabeled as “Forms,” but the wiki software won’t allow it. Need to make it clearer that these are all forms.
  • Add Library Staff Council link on main page for higher visibility.
  • “Listserv” is not a universally understood term, so add names of listservs for identification: “Library listservs: Libtalk, Lib-reftalk, Libstaff…”
  • The purpose and function of the left menu caused some confusion but proved to be learnable.
  • Add a “Communication” category to include info about signing up for listservs and linking to committee and project meeting minutes to answer the question: “What is this group/project up to?”
  • With the creation of a “Communication” category the “Working at the Library” category could more closely map to Human Resources and professional development-related items to narrow focus.
  • Highlight the search bar to make more visually prominent.
  • Review all Quick Links and group similar links together.
  • Link to Confluence Help under Get Help.


Protocol

What are your first impressions of this page?
You need to look up a detail about a workflow posted in Scholarly Resources & Content Strategy. Where would you start?
You’ve just moved to a new office. Where can you put in a request to update your directory listing?
You want to sign up for some work-related training. How much will the Library pay for it?
You’d like to bring up an issue to Library Staff Council. How would you do so?
You’ve realized you’re missing too much good information by not being subscribed to [listserv]. How do you sign on?
What’s missing? What’s confusing?
What would you suggest to improve this page?

Research Results: Special Collections Request System (Aeon)

UX-4614, Perform Usability Testing on Special Collections Request System (Aeon)

Purpose: Assess usability and clarity of request process and dashboard
Stakeholders: Special Collections Library users and staff
Testing dates: 4/24/19-4/29/19
Participants: 1 non-UVA user (from Monticello); 1 UVA staff; 2 UVA graduate students; 1 UVA undergraduate student

Methodology: Participants were recruited from postcards in the Special Collections Reading Room, direct contact with researchers, online web banner, and by solicitation of our existing research queue. A targeted recruitment web form was created: Each participant was to have experience with either online or in-person Special Collections services and collections. A queue of UVA and non-UVA researchers was developed and solicited. Two very similar protocols were developed for UVA (Netbadge) and non-UVA users to account for minor differences in registration and logging in.

Summary of Findings

  • No participants had any significant problem with either registration procedure.
  • All participants sometimes struggled to complete tasks, but most felt confident of their ability to navigate and complete tasks after having gone through the process once. On a scale of 1-5, where 1 was very easy and 5 was very difficult, the average rating of ease in completing tasks was 2.
  • One participant, who rated her ease a 3, had more trouble navigating and more confusion when completing tasks.
  • A few buttons and messages should be changed for clarity and to avoid confusion: “Anytime that the language can be particularly precise about what a button means is more helpful.” When editing a request, 2 participants expressed concern that they might be duplicating requests rather than simply modifying them. They wanted to see a button that said “Modify Request” rather than “Submit Request”.
  • Participants needed more cues as to where they were in the process to avoid confusion and mistakes (“I thought I ordered it, but apparently I didn’t”). More visual cues would also reduce memory load: “I just did it but it’s already vanished from my brain how it was that I did that.” Some participants expressed that they sometimes felt “caught in a loop”: “’You currently have 2 active requests from your available limit of 5.’ So why then can I not seem to request them? Here, you’ve two active requests, but then it’s saying, to submit your request, please select the request, indicate the date and click Submit Information…I seem to be in a circle here.”
  • Participants wanted confirmation of what happens next in the request process.
  • Participants were unclear about the differences between the various Request options on the left menu. Some questioned why Reproduction was separate from Requests since they were all things that could be requested.
  • All participants were confused by the location and function of the search bar. None understood that it searched their requests (some thought it searched Virgo).
  • All participants understood what fields were required in the forms, and when asked, most could identify the “red star” or “red asterisk” as meaning that the field was required. Error messages were clear when a mistake was made in filling out the registration form.
  • The “Save for Later” function was understood (eventually) by 4 participants, but 2 struggled at first to understand what they were saving and then why they wouldn’t want to immediately submit their requests.
  • Participants were confused or annoyed by the two modal screens when requesting from Virgo: “This page is unnecessary. Because I just did that. I had the same information on the– this screen is unnecessary.” A similar complaint about what seemed like unnecessary steps when requesting from Archives at UVA: “I definitely felt like when I was looking at Poe records [in Archives at UVA] trying to figure out how to request it I kind of got into a loop there, clicking on Request, then it would take me back to the record, then back to the request thing.”
  • Final comments include:
    • “Pretty straightforward…pretty intuitive.”
    • “It’s just not a great user interface. It’s an easy process. It’s pretty straightforward. It’s just a lot of form-filling.”
    • “[Archives at UVA] was a little bit confusing, wasn’t sure how to get where I needed to go, lots of text in a form that was a little bit hard to decipher because it was in big paragraphs… but was pretty easy to eventually figure out and navigate.”


Recommendations

  • No button or link should open in a new tab or window. This is a usability best practice that allows visually impaired researchers to remain oriented and able to navigate backwards. Read more about the experience for visually impaired and keyboard-only researchers. At minimum, request buttons that open in a new tab or window should have a tooltip (and corresponding hidden labels for screen readers) that indicates that action. At this writing Virgo has a tooltip on the actionable request button but Archives at UVA does not.
  • Logoff and Main Menu buttons should be clickable everywhere, not just on the text. This is a usability best practice.
  • Add a visual cue on every entry in the Outstanding Requests table that the line is clickable. All text in the row should be a visible hyperlink and the cursor should turn to a pointing finger on hover. (Alternatively, put an Edit button alongside each entry.) This is a usability best practice.
  • The Outstanding Requests table should communicate effectively to researchers. Headings “Status” and “Order Status” aren’t significantly different to the researcher and should be changed to something more meaningful. “Awaiting User Review” should be changed to “Unfinished Request” or similar to emphasize to the researcher that they need to take action. Entries should be distinguishable from each other and therefore need more data displayed. One participant suggested “a timestamp or details about the request (in-person or digital). That would be helpful.”
  • The “Clone to Copy” button was not understood as the place to clone a record for a duplication request. 4 participants eventually found the button through trial and error (“Edit? No. Cancel? No. Clone? No. Clone to Copy? Okay.”) and were able to complete the task. I recommend changing the button name to something more meaningful to researchers, but I don’t have a good suggestion. Would either “Request a PDF/JPG” or “Request a Duplication” work in this context?
  • When coming into the system from Virgo the request form is pre-populated but participants still expressed confusion about where they were in the request process: “The title is misleading. If I’m actually in the process of making a request, then remind me I’m in the process.” Rather than “New Book and Printed Material Request” or “New Manuscript/Archives Request” use the same text as when coming into the system from Archives at UVA, as edited in this screenshot:

screenshot with corrections made to improve clarity

This change will help orient the researcher that they are in the middle of the process and will prompt them to do the next step to complete the request or save for later.

  • After selecting a date and submitting a request, this screen appears as in this screenshot:

screenshot displaying conflicting language

There is a conflict between “Your request is almost complete” and “You do not currently have any requests in review,” and the messages in the blue bars were not consistently seen. I recommend replacing everything in the gray box with a heading, “Your request is complete” and text below that indicates next steps: “Your materials will be available to you on [scheduled date] in the Special Collections Reading Room. Please request them at the desk. This information will also be emailed to you.” Include a link to hours and a map to parking and building location. If possible, send all this information in an email to the researcher and include the TN number and other identifying information. Multiple participants voiced concern about what happens next in the request process, and it is best practice to clearly communicate success and next steps when forms are filled out.

  • Add placeholder text in the search bar, “Search Your Requests” and add a corresponding hidden label for screen readers.
  • Add tooltips (and corresponding hidden labels for screen readers) to all links on the left menu to communicate and clarify purpose of each (“As a first-time user I don’t see a difference between some of these [links], and I would just mash the button to figure out what I want.”)
  • Add links to Hours and Planning a Visit in the left menu. Only 1 participant looked for Hours in the FAQ but commented, “I can’t believe I had to do all of that for hours.” The other 4 participants went to the Library home page or to Google to find the answer.
  • After changes are made and the system has been live for a few months we should run usability tests again, which will give us a chance to fine-tune language and confirm fixes. I also need to finish looking at the Request System from an accessibility perspective to confirm full access to all researchers.
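Several of the recommendations above pair visible text with a hidden label for screen readers (the search-bar placeholder, the tooltips on request buttons). A minimal sketch of those two markup patterns follows; the ids, class names, and URL are illustrative assumptions, not Aeon’s actual markup:

```typescript
// Sketch only: the "visible text + hidden label" patterns recommended
// above. All ids, class names, and URLs are hypothetical.

// Pattern 1: a search bar whose placeholder is mirrored by a
// visually-hidden <label>, so the control stays named for screen readers.
function searchBarMarkup(): string {
  return [
    '<label for="request-search" class="visually-hidden">Search Your Requests</label>',
    '<input id="request-search" type="search" placeholder="Search Your Requests">',
  ].join("\n");
}

// Pattern 2: a request link that must open a new tab announces that
// action both visually (title tooltip) and to screen readers
// (equivalent hidden text inside the link's accessible name).
function newTabRequestLink(): string {
  return (
    '<a href="/request" target="_blank" title="Opens in a new tab">' +
    'Request<span class="visually-hidden"> (opens in a new tab)</span></a>'
  );
}
```

The `visually-hidden` class stands in for the common CSS technique that removes text from the visual layout while leaving it in the accessibility tree.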

Protocol for UVA (Netbadge) users



1 There is a signed copy of the book, Tough Guys Don’t Dance, in Special Collections. How would you look at it?

How would you describe the process this far in your own words?
If you were to click on the Request button, what do you think would happen next?
Is there anything unclear about this process so far?

  • Evaluate clarity of multiple request buttons
  • Evaluate effectiveness of dynamic shading
  • Listen for how much of the first and second request screens are read
  • Evaluate clarity of instruction modals:
    • do they understand where the book will be delivered;
    • do they acknowledge that they’ll choose a pick-up date next;
    • do they understand they’re being taken into another system
  • Assess clarity of request process in participant’s own words
2 You need to read an article in the journal, Epoch, but it could be in 2001 or 2002 so you’ll need to see both. What do you do next?
  • Evaluate clarity of multiple request buttons
  • Evaluate visibility of “Please select ONE item to request” and tooltip, “Select only a single item to continue”.
  • Evaluate UVA new user registration process
3 Complete the request so you can see the journal later today.

How do you know you’ve got all the data you need?
How do you know you’ve successfully submitted the request?

  • Assess clarity of submission process from Aeon dashboard
  • Determine if * is understood to represent required fields
  • Evaluate clarity of date picker
4 How can you check to see today’s hours for Special Collections?
  • Assess visibility and clarity of FAQ
5 You’ve decided to wait until tomorrow to see the journal. Can you make that change?
  • Evaluate clarity of selecting request (TN#?) to edit request
  • Note if they instead use Outstanding Requests or other option on left menu
6 Is it possible to get a copy of the Epoch article instead?
  • Evaluate clarity of steps to have a copy made
  • Evaluate understanding of file types
7 Take a few moments to look at the dashboard. What are your impressions of this page?
What are some of the specific things you can do on this page?
  • Assess general impressions
  • Assess clarity of different request functions
8 Now we’re going to go into a new system for searching for unique Special Collections materials including unpublished manuscripts, university records, visual materials such as films and photographs, audio recordings, digital material, and more. This time I’m going to have you go through the request process as a non-UVA researcher, which has a different login procedure.

Please give me your first impressions of the page at

  • Evaluate Archives at UVA record for clarity and completeness
  • Evaluate process for non-UVA researchers
9 You’d like to take notes on an item in the Poe papers collection. If you are a non-UVA researcher, what do you do next?
  • Evaluate visibility of request button and clarity of request button text
  • Evaluate visibility of First Time Users link
  • Evaluate First Time Users registration process
10 Go back to Virgo and begin the request process for Tough Guys Don’t Dance as a non-UVA researcher. Once you are in the Special Collections Request System, can you save your work without submitting?
  • Evaluate clarity of Save for later option.
  • Discover improvements to recommend.
  • Evaluate clarity of Awaiting User Review
  • Evaluate clarity of Unsubmitted vs. Outstanding Requests
11 Do you have the option to submit more than one request at once?
12 Is there anything confusing about this request process?
Is there anything missing from this page?
What would make this request process easier?
  • Last comments about request process and Aeon dashboard
13 On a scale of 1 to 5, where 1 is very easy and 5 is very hard, how would you rate your ease in completing these tasks?
  • Gauge overall experience with request process

Research Results: Library Web Information Architecture

UX-4276, User research on proposed IA structure

Purpose: Assess usability and clarity of proposed public website reorganization
Stakeholders: Library users and staff
Testing dates:
Participants: 15 graduate and 27 undergraduate students (all UVA)

Full results:

Methodology: Enrolled UVA graduate and undergraduate students were solicited by email from our research queue to take a ten-minute test, for which they received a $10 deposit to their Cavalier Advantage card. An online tree test tool by Optimal Workshop was employed to help determine the efficacy of the proposed information structure. Based on previous testing and analytics, we created a simple hierarchy of four top categories, each of which had between two and five subcategories (see Tree Structure). Participants were asked to complete 12 tasks designed to assess how closely the information structure matched their mental models (see Full Test Results, Analysis tab). The tree test results indicate how many participants found the correct path on their first try, how many found it on their second or third try, and how many failed to find the correct path, as well as how quickly participants completed each task. For an overview of how to analyze tree test results, visit Atlassian’s Tree Testing for Websites.
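The tree-test metrics described in the methodology can be sketched as a small computation. This is a hypothetical illustration of the scoring, not Optimal Workshop’s actual algorithm; it approximates “backtracking” as any revisited node in a participant’s click path, and the data model is an assumption:

```typescript
// One participant's attempt at one task: the nodes clicked, whether the
// final selection was the expected answer, and time taken.
// (Hypothetical data model for illustration.)
interface Attempt {
  path: string[];
  correct: boolean;
  seconds: number;
}

// "Direct success": the correct answer reached without backtracking,
// approximated here as a path that never revisits a node.
function directSuccessRate(attempts: Attempt[]): number {
  const direct = attempts.filter(
    (a) => a.correct && new Set(a.path).size === a.path.length
  ).length;
  return direct / attempts.length;
}

// Average completion time across attempts, for per-task speed figures.
function averageSeconds(attempts: Attempt[]): number {
  const total = attempts.reduce((sum, a) => sum + a.seconds, 0);
  return total / attempts.length;
}
```

Run per task across all participants, these two functions yield the success-rate and average-time figures reported in the findings below.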

Summary of Findings

  • Across all 12 tasks, 79% of participants ended up at the expected answer and 77% of answers were chosen without backtracking. Even without visual cues, navigation, and menus, participants found the correct answer four out of five times.
  • Seven of the 12 tasks had a high rate of success (between 81% and 95% of participants reached the correct answer), and completing each of those tasks took an average of 5.29 seconds.
  • All 15 graduate students ended up at the expected answer on 5 tasks.
  • Half of the participants first clicked on ABOUT and most immediately found and correctly selected Plan a Visit. Another quarter of the participants looked first under USING THE LIBRARY but all found their way to ABOUT/Plan a Visit (task 3).
  • Participants were able to correlate the terms “library study areas” and “reserve a study space” with Library Spaces (tasks 1 and 7).
  • Participants were able to correlate the phrase “find a book” with Search, Borrow, Request (task 4).
  • Participants seemed to have no trouble identifying “Special Collections” as a library and locating it under Libraries & Collections (task 8), although the use of the term “collections” in the task may have cued the answer.
  • Five out of 15 graduate students incorrectly selected TEACHING & PUBLICATION SUPPORT/Teaching Support to look for course reserves (task 9).
  • Five participants incorrectly selected Accessibility Services as the location for getting help with an off-grounds access problem. 26 participants went directly to GETTING HELP/Ask a Librarian (task 10).
  • Five tasks from the previous tree test in 2017 were largely duplicated in this latest test. In each case, the results improved. See 2017 test results.

Deeper dive: task #2: “A librarian visited your Economics class last week to talk about doing research. Where can you find his name?”

pie chart showing percentages of task success and failure and time taken to complete

Participants looked in all four categories for the answer, and only eight got there. 20 participants selected ABOUT/Staff Directory, which is logical if a name is known (and probably indicates a poorly-worded task). We intend to link the subject specialist listing with the staff directory and usability testing will determine if this helps in getting students to subject experts. The remaining 8 participants selected GETTING HELP/Ask a Librarian, and it is not necessarily a bad thing that students opt to ask for online help when they don’t know an answer.


Recommendations

  • The overall high rate of success indicates a strong IA that aligns with the mental model of students.
  • Plan a Visit should remain under ABOUT.
  • Library Spaces and Search, Borrow, Request seem to work well under USING THE LIBRARY. Usability testing with a wireframe or prototype will help confirm.
  • Whether or not Special Collections can be found under Libraries & Collections is inconclusive and would benefit from further testing.
  • Teaching Support resonated with some participants as a logical location for course reserves, so consider adding a search for course reserves there.
  • Add further description to Accessibility Services to define it more clearly.
  • The plan to link the subject specialist directory with the staff directory will make searching for people easier.

Tree structure as tested

Hierarchical structure of proposed website represented as a multi-level list of headings

Tasks and answers

1 Where would you find library study areas?
(USING THE LIBRARY –> Library Spaces)
2 A librarian visited your Economics class last week to talk about doing research. Where can you find his name?
(GETTING HELP –> Subject Specialists)
3 Your parents are visiting and want to see the Berlin wall installation outside Alderman. Where can they find directions to nearby parking?
(ABOUT –> Plan a Visit)
4 Find a copy of the book, Invisible Man by Ralph Ellison.
(USING THE LIBRARY –> Search, Borrow, Request)
5 Which libraries have color printers?
(USING THE LIBRARY –> Equipment & Technology)
6 Where would you find submission instructions for an open source journal published at UVA?
(TEACHING & PUBLICATION SUPPORT –> Writing & Publication Resources)
7 Reserve a study room.
(USING THE LIBRARY –> Library Spaces)
8 Find out why Special Collections is named for Albert and Shirley Small.
(ABOUT –> Libraries & Collections)
9 Find course materials on reserve for one of your classes.
(USING THE LIBRARY –> Search, Borrow, Request)
10 You’re searching in a journal article database and get an error message instead of the article you need. What do you do next?
(GETTING HELP –> Ask a Librarian)
11 What is the email address for the guy in charge of all the libraries?
(ABOUT –> Staff Directory)
12 Ask the Library to purchase a book.
(USING THE LIBRARY –> Search, Borrow, Request)

Research Results: Usability testing on new design for DH@UVa site

UX-4102, User Testing for DH@UVa


Purpose:

  • Assess understanding and clarity of DH certificate requirements
  • Assess understanding and clarity of DH certificate application process
  • Assess clarity of what counts as an elective course and MoU
  • Assess clarity and utility of menu headings, filters and nodes, About DH@UVa
  • Assess understanding of Activate Your Profile
  • Assess clarity of who/how to contact for help
  • Explore initial likes/dislikes related to site organization and design

Stakeholders: DH@UVa site owners

Dev site:

Testing dates: November 26–30, 2018

Participants: 5 UVA graduate students from English (2), Art, Anthropology, and Slavic departments

Methodology: Usability testing with one participant, one facilitator, and one note taker in a quiet space. Participants were solicited from the DH@UVa mailing list. They used a laptop and were asked to complete tasks on a newly-revised website to determine the usability of the site. The facilitator and note taker listened for comments and opinions, and noted behaviors.

Strong Findings

  • None of the 5 participants had trouble finding and enumerating the steps to apply for the DH certificate. Most navigated by the DH Certificate link in the menu. One participant was briefly confused by the More about the graduate certificate in digital humanities section above the DH certificate requirements. This participant also commented, “I think there is a mismatch between what I would expect from a digital certificate and the application process, which is so traditional. Why do I need words, why do I need a statement, when I’m applying to do something visual? What about art students?”
    • Recommendations: Remove More about the graduate certificate in digital humanities section to focus attention on enumerated steps. Alternative: work this section into About DH@UVa with visible links to requirements and application process. Reduce text blocks with diagrams or other images that can convey information in a non-textual way.
  • All participants could find and convey the DH certificate requirements as well as how to get an elective course approved; and each demonstrated basic understanding of the MoU.
    • Recommendation: None
  • 4/5 expressed the need to explain the relationship between DH@UVa and IATH, Scholars Lab, and SHANTI in order to clarify the purpose of the DH@UVa website. One participant was still unclear at the end of the test: “I guess I’m also not 100% sure what DH@UVa is as compared to the Scholars Lab as compared to individual projects. Again, why am I going to this website?” From another participant: “I am really interested all digital stuff, but I am not sure what each center or office does…they all overlap, also Data Services. I don’t know who to contact. I have no idea how they’re different from DH.” This participant didn’t want to “have to read everything and figure out the differences.” A third participant: “What’s the difference between the three [pointing to the links in footer]? Are they sponsors? Needs to be made more clear.” This same participant wanted to see Makerspaces included. The one participant who did not express confusion about the centers or the purpose of the website found it to be “clear, organized, and inviting. Judging from what I see here I’d want to do this myself because it seems like a very organized program.”
    • Recommendations: Add new intro text, Welcome to DH@UVa or similar, above the fold on the home page stating clear purpose of the website and concise descriptions of how related centers intersect with DH. Retest to determine if there is an improvement or if further changes are needed.
  • 4/5 found the DH@UVa Network area at least somewhat unclear, saying it appeared “awkward,” “scattered,” “crowded,” and “feels like 50 things are competing for my attention.” Some expressed that there isn’t enough information to provide context, for instance in the People filter: “I’m wondering why these people are here. There should be something about who they are, not just their names and picture.” One participant wanted more filters to narrow down results further by subject and geographic location. 3/5 wanted the nodes pages to have a more obvious organization, calling the current order “random” and indicating that “alphabetical is what I expected to find.” Participants wanted a different structure on the nodes page (“Not tiles, but the full title, followed by the first few lines of text”) and one commented: “This is good, but it needs a description to explain what we are looking at.” 3/5 expressed a need for a robust and visible search function on this page.
    • Recommendations: Add context by including tooltips for each node button. Add a description to each card. Reformat nodes pages in a grid with full titles and abstracts to facilitate scanning and evaluating content without having to click on every Learn More. Consider duplicating the search box near the heading of the page for optimum visibility.
  • All participants had no trouble figuring out where to go for help, but all had some familiarity with DH@UVa. Most knew Rennie and mentioned her by name. A related note: all 5 participants expected to find who administers DH@UVa on the About DH@UVa page under People (see #2 under Protocol below). 2 participants expressed that they expected to find the primary DH@UVa contact at the top of the People page, and the other 3 felt they had completed the task when they clicked on the People link and saw the headings for the Executive Board and Steering Committee. Only 1 participant eventually scrolled to the bottom of the page and located the DH@UVA Curriculum Development Team.
    • Recommendations: Best practice would be to have a Contact link on the menu for optimum visibility. Move DH@UVA Curriculum Development Team section to the top of the People page.

Other Findings

  • 3/5 participants expressed some uncertainty about what to expect when clicking on Activate Your Profile. Answers ranged from “to save things I’m interested in” to “receive information, subscribe to a newsletter, make your profile available to others to network with.” Best practice tells us that it would help provide context to have one sentence explaining why you’d want to activate your profile before clicking on the button.
    • Recommendation: Add a brief description below Activate Your Profile to outline the benefits of creating a DH profile.
  • Resources is currently a bit of a catch-all: “To me it’s a miscellaneous list.” DH Organizations and Projects could easily be its own menu heading. 2/5 participants wanted to see organizations and projects from outside UVA. A listing of upcoming events is commonly found on a website’s top-level page, so having a link to Events Calendar in the menu is unnecessary. One participant recommended: “Make sure this [Resources] page functions as a toolkit to people who end up doing the certificate. Sort of a gateway to skills you’d use in the certificate program.”
    • Recommendation: Remove Events Calendar from menu. Refocus Resources by pulling out DH Organizations and Projects and adding the heading to the menu. Consider creating a page for new DH scholars and put Jumpstart your DH project and Links for the new DH scholar there, and link to Apply for the DH Certificate.


Protocol

1 [Start:]
This is a site under development.
What is your initial impression of this web page?
  • Assess initial impressions
  • Assess clarity and utility of menu headings
2 (task) You are unfamiliar with Digital Humanities and want to know who administers it at UVA. Where do you start?
  • Assess clarity of About DH@UVa
3 [Scroll down and point to 1.) DH@UVA Network and 2.) View all nodes button]
What happens here?
  • Assess understanding of filters and nodes
4 [Point to Activate your profile button]
What would you expect to happen when you click on this button?
When would you click on it?
  • Assess understanding of Activate your profile button
  • Assess understanding of a profile
  • Determine whether Log in helps to define Activate your profile
5 (task) Find out about the DH projects that others have done.
  • Assess location of Projects under Resources
6 Describe, in your own words, the requirements of the DH certificate.
  • Assess understanding and clarity of the DH certificate requirements
7 (task) You think that a particular course in your home department of History might count toward a DH certificate. How can you determine if it does?
  • Assess clarity of what counts as an elective course
  • Assess ability to navigate to course information
8 Describe, in your own words, the steps to receive permission for a non-DH course to count toward the DH certificate.
  • Assess clarity of the first steps taken to get a non-DH course to count as an elective
9 You’ve received verbal approval for an elective course from your instructor and the DH administrator. What’s your next step?
  • Assess ability to navigate to Memorandum of Understanding (MoU)
  • Assess clarity of the MoU requirement
  • Probe for understanding of MoU
10 Describe, in your own words, the steps to apply for the DH Certificate.
  • Assess clarity of the steps to take to sign up for the DH Certificate
  • Assess whether participants read the numbered steps in the process before clicking on the application link
11 (task) You have a question about the application. How do you get help?
  • Assess clarity of whom to contact, and how, if further information is needed
12 What are three adjectives you’d use to describe the process to apply for a DH certificate?
  • Gain insight into process to learn about and apply for DH certificate
13 What is missing on this website?
What is confusing on this website?
  • Assess understanding and utility of page
  • Gain insights into what is missing or confusing
14 How would you make this website easier to use?
  • Final comments