
ARTICLE

Improving the Student Search Experience in LibGuides


Clayton Crenshaw and Melissa Johnson

ABSTRACT
This study is an in-depth look at the use of the LibGuides search function by college students. We
sought to better understand the mental models with which they approach these searches and to
improve their user experience by reducing the percentage of searches with no results. We used two
research methods: usability testing, which involved 15 students in two rounds, and analysis of search
terms and search sessions logged during three different weeks. Interface changes were made after
the first round of usability testing and our analysis of the first week of search data. Additional
changes were made after the second round of usability testing and analysis of the second week of
search data.
The usability tests highlighted a mismatch between the LibGuides search behavior and the
expectations of student users. Results from both rounds of testing were very similar. The search
analysis showed that the level of no-result searches was slightly lower after the interface changes,
with most of the improvement seen in Databases A-Z searches. Within the failed searches, we saw a
reduction in the use of topic keywords but no improvement in the other causes we studied. The most
significant change we observed was a drop in the level of search activity.
This research provides insights that are specific to the LibGuides platform—about the underlying
expectations that students bring to it, how they search it, and the reasons why their searches do and
do not produce results. We also identify possible system improvements for both academic libraries
and Springshare that could contribute to an improved search experience for student users.
IMPROVING THE STUDENT SEARCH EXPERIENCE IN LIBGUIDES
Southern Methodist University is a private institution in Dallas, Texas, that serves approximately
5,000 graduate students and 7,000 undergraduate students annually. Since 2008, the libraries of
SMU have used Springshare’s LibGuides platform to publish online research guides, of which there
were 387 when our research data were gathered. We also make use of LibGuides’ Databases A-Z
page, a directory of the more than 700 licensed databases that the libraries provide.
A 2022 review of search term logs revealed a good deal of misunderstanding among users about
how the LibGuides search function works and how it should be used. We could see that many of
the terms entered would not produce useful results and, in some cases, no results at all. In
response, we started researching how our students thought about the LibGuides search boxes and
what changes we might make to increase their success in finding relevant databases and guides.
We explored two research questions:

• What is the mental model that students have for the search function provided on LibGuides pages?
• Can we, through interface modifications, reduce the percentage of failed searches?

About the Authors
Clayton Crenshaw ([email protected]) (corresponding author) is Discovery and Systems Development Librarian, Southern Methodist University. Melissa Johnson ([email protected]) is Instructional Design and Educational Technologies Librarian, Southern Methodist University. © 2024. Submitted: 16 October 2023. Accepted for publication: 16 June 2024. Published: 16 December 2024.
Information Technology and Libraries | December 2024 | https://doi.org/10.5860/ital.v43i4.16971

LITERATURE REVIEW
Within the academic library literature, we looked for research that included the LibGuides search
function. More broadly, we explored existing evidence about the mental models that users bring to
library search tools and the typical search behaviors and preferences of college and university
students.
Mental Models
According to Holman, a “mental model is an internal cognitive representation of a tool or system
that helps one master it. It is the user’s mental image of a system and its capabilities that he
employs to understand how to operate it.”1 An interesting finding from Holman’s research is that
the mental model used by some students does not include strict matching of keywords. Instead,
they understand keywords as a “concept rather than as a literal string of letters.”2 Lewis and
Contrino described students entering natural language questions into a search box and expecting
the system to understand them and respond accordingly.3

A mere seven years into the Google era, Griffiths and Brophy determined that commercial search
engines were the dominant model for students seeking information. “It seems that students’ use of
resources is now very colored by their experience with search engines, which in turn may lead to
expectations that may not be realistic for different types of services.”4 They concluded that
“students’ use of search engines now influences their perception and expectations of other
electronic resources.”5

It is clear from more recent literature that, for most students, Google serves as the single, well-known mental model that is applied to all their online searching.6 This translates to simple, unrestricted keyword searches.7

Zhang studied how mental models for searching are developed, finding that users draw upon their
experience with similar systems. Building on this experience, they iteratively modify elements,
assimilate new ones, and phase out others. Over time, their search strategy becomes better
aligned with the new system.8 That does not seem a likely scenario in academic libraries, however,
when we consider the overwhelming influence of Google and the limited exposure to library
research tools that many students have.
Beyond the mental model of how a search works, it is also important to have an accurate model of
what is being searched. Unfortunately, this is something that students are unlikely to consider.
Katie Sherwin of the often-cited Nielsen Norman Group wrote,

Users expect search to include the entire site. People expect to be able to enter a term in a
search field and get relevant results from anywhere on your site. To most people, anything
on the website is part of a single entity, and search should include all of it. Users’
understanding of the “entire” site depends on their mental model of the website or
organization.9

Asher and Duke, as well as Georgas, found that study participants lacked basic understanding of how
information is organized.10 Willson and Given found that many searchers do not distinguish
between different sources of online information and have difficulty understanding what types of
resources they are accessing.11 All of this can contribute to problems in the LibGuides
environment, where the resources searched are typically limited to guides and databases. Sherwin
cautioned that misunderstanding the limitations of such a search can lead users to devalue the site
and leave it in frustration.12
Search Behaviors of Students
The Ethnographic Research in Illinois Academic Libraries (ERIAL) Project used ethnographic
methods to study student research behaviors, among other things, in five academic libraries. The
researchers found that most of the student participants “exhibited significant difficulties that
ranged across nearly every aspect of the search process.”13 Another important finding was that
students tend to give up quickly and will change search terms, database, or research topic if they
do not get the results they expected.14 Other researchers have found that when reviewing results,
students usually do not go beyond the first page.15

Among the issues observed by Holman were incorrect use of punctuation, frequent misspellings,
repeating failed searches, reuse of the same search terms and syntax across different systems, and
the inability to recognize why searches did not get the desired results. She also saw students
hastily searching and evaluating results, which can exacerbate the other problems.16 Along with
Willson and Given, Holman observed that today’s students depend on technology to provide
correct spelling and often do not recognize their own spelling errors.17 This can lead to the
erroneous assumption that the library cannot fulfill their information needs.
When studying student searching of Google and a library search tool, Georgas reported frequent
use of natural language, failure to try synonyms and related terms, and little evaluation or revision
of search terms. Date limits were infrequently used.18 Despite all these indications of weak skills,
students tend to overestimate their abilities as searchers.19
LibGuides Research
Since LibGuides, Springshare's content management system for library guides, debuted in 2007,
scores of articles have been written about it. We found only a small number, however, that
discuss the search functionality included in the system. Several others speak to the lack of a
student mental model for LibGuides as a whole.

Though academic libraries serve up research guides by the hundreds, studies by Lierman et al.,
Conrad and Stevens, and Castro Gessner et al. found that students did not have a clear
understanding of their form or function.20 In the latter study, participants were not able to
distinguish them from other online, library-provided resources such as databases and articles.21
Similarly, Conerton and Goldenstein reported that some students were completely unfamiliar with
the term "guides," while others equated it with library databases.22 Quintel also found
unfamiliarity with library guides, with nearly two-thirds of participants having never used one.23

Regarding the Springshare-provided search boxes in LibGuides, researchers have long described
misunderstanding and inappropriate use. Tawatao et al. saw that they were consistently used
incorrectly in usability tests and removed them from their LibGuides pages.24 While testing a
prototype primary website built using LibGuides CMS, Vargas Ochoa reported that students used
the LibGuides search as a “search-all search box.”25 The same behavior was observed by
Azadbakht et al., with some participants trying repeatedly to search for book, journal, and article
titles.26 Sonsteby and DeJonghe reported that after the LibGuides search box was moved from the
header to the bottom of the pages, students in their study requested a search box and expected it

to perform like the library’s discovery tool. The researchers observed that the participants’ desire
to search was strong, but they did not understand what was being searched.27

Some students in the Conerton and Goldenstein study expected the LibGuides search box on guide
pages would search within the guide, while others expected to get articles in response. The
researchers also observed that the most relevant results were not at the top of the list. Because of
its weak performance and the confusion it generated, they initially removed the search from all
pages except the LibGuides homepage, but restored it when the option became available to search
within an individual guide.28 In contrast, Lierman et al. found that participants were successful in
using the search box on their LibGuides homepage to find content within the guides.29

Conrad and Stevens conducted usability testing of two versions of LibGuides. Version 1 guides
used the built-in Springshare search function in the header area; Version 2 guides replaced it with
a search of the library’s discovery system. Almost all the participants used a search box in their
attempts to find requested information, either alone or in combination with browsing. Like other
researchers, they found that students often used the LibGuides search boxes that were designed to
find databases and guides to look for articles and books. Another participant tried searching for
guides using the search on the Databases A-Z page, which returns results for databases only. Their
tests also revealed the inflexibility of the LibGuides search, as in the example of a course code
search that failed because of a missing space between course prefix and number. Because the
discovery system search aligned better with user expectations and provided an experience
consistent with other library webpages, their library chose to use it in the header of all guides
moving forward. They also noted, however, that “many of their searches navigated students away
from the very guides that were designed to help them.”30
METHODS
We conducted a mixed-methods study with two components: usability testing of the LibGuides
search function and analysis of search term logs. Because it included human subjects, the project
was submitted to the SMU Institutional Review Board and approved as Exempt research under US
Department of Health and Human Services regulations (45 CFR 46). The University’s Office of
Research reviewed and approved the recruitment materials, consent form, and testing protocol.
Usability Testing
The usability testing participants were recruited from all SMU undergraduate and graduate
students. Digital signs, social media posts, flyers, and tabletop tent cards were all used in our
recruitment efforts. In addition, email messages were sent to students who had participated in a
previous research project and had given permission to be contacted about future ones. An Amazon
gift card was offered as an incentive to complete a research session. The students who
participated included both undergraduate and graduate students, together representing all of
SMU’s schools and colleges. Many were returning students, but some were in their first semester.

The testing was conducted in two rounds: in April and July 2022 (Round 1), before interface
changes were made, and from September to November 2022 (Round 2), after the first set of
interface changes were made. There were seven participants in the first round and eight in the
second. We conducted and recorded the sessions remotely on Zoom with a single facilitator. Both
of us reviewed the recordings later before discussing them.
Participants were asked questions about their prior use of the search function and what they
expected it to find. They were asked to conduct a search using terms of their choosing, then

describe their reaction to the results and assign them a numerical rating based on a five-point
scale. We observed the participants using the LibGuides search box on both the homepage for our
Research Guides and the Databases A-Z page. The test script is found in the appendix.
Search Term Analysis
For the search term analysis portion of the study, we analyzed and compared more than 1,200
LibGuides search terms used during three one-week periods: April 3–9, 2022; October 16–22,
2022; and February 12–18, 2023. We intentionally selected weeks during the fall and spring
semesters when students would be on campus and actively engaged in their courses. This helped
to ensure that a broad cross-section of our student population was represented.
Initially, we worked with the Search Tracking report provided through the LibGuides CMS
administrative interface. This led us to see that our analysis would benefit from knowing which
searches in the logs were entered by the same searcher. Subsequently, we requested and received
reports from Springshare that included the timestamp and session ID number for each search
term. This contributed greatly to our understanding of the search behaviors, and we are grateful
to the Springshare staff for providing the custom reports. They contained no personal identifiers
or IP addresses.

Before any coding of data was performed, the search logs went through a two-step process to
remove duplicate terms:

• The Location column in the reports indicated what was searched: the Databases A-Z list
(indicated as “AZ” in the report) or the other pages in LibGuides (indicated as “System”).
Searches launched from the Databases A-Z list search only that source, so the search terms
appeared once in the report. Searches launched from other pages search all enabled
sources—in our case, Research Guides pages and the Databases A-Z list. Those search
terms appeared twice in the Searches report—once as “System” and again as “AZ.”
Technically, both sources are being searched at the same time. However, we wanted to
analyze these terms only once, so the “AZ” terms generated by Research Guides searches
were removed from the reports.
• Searchers often use the same term more than once in a given session, so we also removed
duplicate search terms with the same session ID.

The search function is not case-sensitive, so we disregarded differences in case. Within each
report, we saw many examples of the same search term used in separate sessions. Some of these
were probably entered by the same searcher but, with no way to verify that, we left them in the
datasets for analysis. The session ID, then, served as a proxy for an individual user. Table 1
contains the number and types of searches in each dataset after duplicates were removed.

Table 1. Numbers and types of searches analyzed by week


Week sampled | Databases A-Z | Research Guides | Total
Week 1—April 2022 | 335 (72%) | 130 (28%) | 465
Week 2—October 2022 | 258 (66%) | 132 (34%) | 390
Week 3—February 2023 | 300 (71%) | 122 (29%) | 422
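As a rough illustration of the deduplication steps described above, the following Python sketch shows how such a report could be processed. The column names ("Location," "Session ID," and "Search Term") are assumptions about the layout of the custom Springshare export, not its documented schema.

```python
# Sketch of the two-step deduplication described above; column names are assumed.
import csv

def deduplicate(report_path: str) -> list[dict]:
    with open(report_path, newline="", encoding="utf-8") as f:
        rows = list(csv.DictReader(f))

    # Step 1: a search launched from a guide page is logged twice, once as
    # "System" and once as "AZ"; drop the duplicate "AZ" rows for those searches.
    system_keys = {
        (r["Session ID"], r["Search Term"].strip().lower())
        for r in rows
        if r["Location"] == "System"
    }
    rows = [
        r
        for r in rows
        if not (
            r["Location"] == "AZ"
            and (r["Session ID"], r["Search Term"].strip().lower()) in system_keys
        )
    ]

    # Step 2: drop repeats of the same term within the same session, ignoring
    # case, since the LibGuides search function is not case-sensitive.
    seen: set[tuple[str, str, str]] = set()
    deduped = []
    for r in rows:
        key = (r["Session ID"], r["Location"], r["Search Term"].strip().lower())
        if key not in seen:
            seen.add(key)
            deduped.append(r)
    return deduped
```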

After deduplicating the search terms, we replicated the searches and studied the results. We
recorded whether they produced results or not, as well as reasons for their success or failure.

Search terms that produced any results were counted as successful. This part of the study did not
try to assess the quality of results or their usefulness to the searcher.

We divided the search terms between us for initial coding. We then reviewed each other’s work
before meeting to discuss questions raised and to resolve differences.
For Databases A-Z searches with results, we recorded matches between one or more words in the
search string and words in the database name, description, and subject fields. We recorded the
same matches for Research Guides searches, as well as matches with course-related guides and
general guides.
For failed searches—defined for the purposes of this study as those that produced no results—we
tracked three types of problems that emerged from the data: misspellings, citations, and topic
keywords. Only one of these was recorded for a given search term. We did not track other causes
of failure, such as searches for resources our libraries do not provide. In addition to the obvious
typos, misspellings included variations on database names, such as “world cat” for the WorldCat
database. Counted as citations were the search strings that included the title of an article or a
major work such as a book or a film. We counted search strings as using topic keywords when, in
our best judgment, the words could plausibly be used in that way.
After completing Round 1 of usability testing and analyzing the first week’s data, we made several
interface changes in coordination with the SMU Libraries LibApps Team. These were designed to
help users better understand at a glance the purpose of the pages and how to use the search and
filter functions. They were completed in August 2022.
Databases A-Z Changes, August 2022
• On the Databases A-Z page, the existing description beneath the page title was “find the
best library databases for your research.” This description was revised to better
communicate available functionality: “Find the best database for your research by name,
subject, content type, or description.”
• At the top of the page, the background box was expanded to include the alphabetical anchor
links. A heading, “Filters,” and an icon were added to indicate that all the components—
alphabetical links, drop-down menus, and search box—serve that purpose. The “Clear
Filters” button was moved up to just below the filters rather than below the count of
databases found.
• In the Filters area, the text at the top of the Database Types menu was updated from the
default, “Any Database Type,” to “All Content Types.”
• The search box placeholder text was changed from “Search for Databases” to “Search This
Page.”
Research Guides Changes, August 2022
• On the Research Guides homepage, we added a new description beneath the title to explain
the function of guides: “Find selected resources to help with research.”
• Since some students in our tests were expecting to find specific information resources like
articles or books, we replaced the small “Search Library Resources” link with a large button
in the sidebar.
• The placeholder text in the search box was changed from “Search all guides” to “Search
within guides only” to better indicate the limitations of the search. The color was changed
to a darker shade of gray for better legibility.

• The search button text on guide pages was changed to “GO” to match the button on the
Databases A-Z page. High-contrast colors were used to make both buttons more prominent.
After Round 2 of usability testing and analyzing the second week of search data, we felt there was
still more we could do to improve the interfaces. Additional changes were made in January 2023.
Databases A-Z Changes, January 2023
• Point-of-need instruction was added in the form of search tips labeled “Search by” and
“Don’t search by.” They briefly communicate what types of searches will and will not be
successful. These were placed in the sidebar just beneath the search box.
• A button linked to the Database FAQ list was added in the same area.

Figures 1 and 2 show the Databases A-Z page before and after the interface changes.

Figure 1. Databases A-Z page before interface changes

Figure 2. Databases A-Z page after two sets of interface changes

Research Guides Changes, January 2023


• The placeholder text in the Research Guides search box was updated again to “Search for
guides & databases.” Since usability testing participants were often surprised by the search
results, we wanted to signal clearly what they should expect to see.
• The search tips were also added to the results page for Research Guides searches below the
lists of results. In many cases, the tips would not be visible without scrolling, so a message
was inserted between the search box at the top and the boxes containing search results:
“Search tips are available below. To find specific items like books or articles, explore the
guides and databases or use Library Search.”
• Headings for the two search result boxes were changed from “Research Help” and “A-Z
Database List” to simply “Research Guides” and “Databases.”

Figure 3 highlights the changes to the Research Guides search results page.

Figure 3. Research Guides search results page after additions and changes

Finally, we analyzed search terms from the third week, February 12–18, 2023, and compared
them with the previous weeks.
RESULTS OF USABILITY TESTING
Many of the participants in our usability tests had used the tested LibGuides pages before. Out of a
total of 15, only four told us that they had not visited them. Yet many encountered challenges
when using the search boxes. We tested a diverse group of undergraduate and graduate students,
and it quickly became obvious that there was a wide range of understanding of the pages and how
the search functioned. While most participants struggled to get useful results, two demonstrated
comprehension of the system and skill in their searches.
Research Guides Pages
On the Research Guides homepage, we first asked participants how they could use the search box
and what they would expect to find. In Round 1, several participants did anticipate that the results
would be guides and databases and that they would have to search or browse within them to find
specific information resources. However, the majority expected results that were more akin to
those in Google Scholar or the SMU Libraries discovery system. They mentioned specific content
types such as books, newspaper articles, papers, documents, and photos.
Half of the participants in Round 2 expected research guides or databases in the search results, but
this may have had more to do with their prior experience than the changes made to the interface
after Round 1. Some mentioned their previous use of research guides for specific courses. The
other half of this group had expectations that the search included journals, research articles, case
studies, topic overviews, and course descriptions. One student told us very clearly: “It’s like
Google, that’s what I’m expecting.”

Participants were then asked to think about one of their current courses and to run a search that
would be appropriate for a research assignment in that course. We observed whether their

searches produced results or not. In Round 1, four did, and three did not. Among those who got
results, three confirmed that the results matched their expectations. The other explained that a list
of books, papers, or journals was expected. In Round 2, five searches produced results, and three
did not. Within the successful group, one set matched expectations, and another was deemed
“better than I expected.” The other three described some degree of mismatch.
After reviewing their search results, participants were asked to give them a numerical
rating from 1–5, with 1 being "not useful at all" and 5 being "very useful." In Round 1, a single
student gave the results a 5 rating; the rest gave them a rating between 1 and 3. Ratings in Round
2 were similar: one 5, one 4, and six in the 1–3 range. In both rounds, the average rating was 2.5.
Databases A-Z Page
Our questions and procedures for testing the Databases A-Z page were very similar to those used
on the Research Guides homepage. We started by asking participants which of the filtering options
they typically used. When the participant was unfamiliar with the page, we asked which filter
looked most useful. Their clear preference was the Subject filter—six out of seven in Round 1 and
six out of eight in Round 2. One participant preferred the alphabetical links, one the search box,
and another did not use the filters at all, instead scrolling down the page to locate a favorite
database. When responding to this question, some participants may not have considered the
alphabetical links to be filters. From Hotjar analytics data we know that, collectively, they are used
more than the filters with drop-down menus or the search box.

Next, we again explored the participants’ mental models for search results. In both rounds, most of
the participants were expecting to see databases. A third of them, however, expected item-level
results as in the Research Guides search. We also learned that the word “databases,” though
ubiquitous in our library communications, is not understood by all students. Asked about the
search box and what they would expect it to find, one participant explained, “It’s not very specific
at all. It just says, ‘Databases,’ so I have no idea what that could be talking about.” Another made a
similar comment in response to search results.

We also heard expectations from the students that went well beyond a simple search of metadata.
Like Holman, we mostly saw searching for concepts, not necessarily specific words, even on the
Databases A-Z page.31 One participant in our study expected metasearch functionality: launching
simultaneous searches within all the relevant databases to determine which ones contained a
specific topic. Another expected to see databases “as well as something specific within the
database, like an article.”
In both rounds of the Databases A-Z searches, we had one participant who did an extra search. In
Round 1, we saw four searches that returned results and four that did not. In Round 2, only one of
the nine searches attempted returned any results. Four out of the five participants who got results
said they were in line with their expectations. Three of those searches were for names of
databases. Two searches that produced zero results also matched expectations. In those cases, the
participants observed that their search terms were too specific. What follows are a few examples
of how this search fell short of student expectations.

• “I expected to see ATLA, since religious ethics is part of the journals that are in there.”
• “Carbon-13 isn’t very non-popular. It’s an important thing in research, so I would expect to
see it.”
• “I hoped that what I searched would be in the tags for the journals.”

• “What does surprise me is that it didn’t give me a recommended search option instead.”
• “I was pretty sure the Baroque era was broad enough to find some sort of database
information on it.”
For searches on the Databases A-Z page, six participants gave their results a 1 rating. Three gave
them a 2, one a 3, two a 4, and three a 5. For Round 1, the average rating was 3.1, and for Round 2,
it was 2.1. While most participants found this search problematic, a couple of experienced and
savvy researchers successfully used the search box in combination with another filter to get
precise results.

To conclude the usability tests, we asked for suggestions on improving the search function on the
two pages tested. The students gave us a number of thoughtful suggestions, some of which we
used in our subsequent interface changes. Their ideas included the following:

• introductory text for Research Guides to explain their purpose and function
• explanation of how the search box works and what it searches
• more sophisticated searching that would incorporate phrase searching and include close
matches in the results
• sub-categories of subjects assigned to databases and guides
• search suggestions, similar to “Related searches” in Google
• more detailed database descriptions
RESULTS OF SEARCH TERM ANALYSIS
We analyzed the two types of searches tracked by the LibGuides system: those conducted on the
Databases A-Z page and those launched from other pages. The three time periods studied
consisted of a week prior to the interface changes, a week after the first set of changes and before
the second set, and a week following the second set.
It should be noted that, in the LibGuides System Settings, we have configured the “System Search”
option as opposed to “Guide Search.” Searches launched from our Research Guide pages search
across all published guides, as well as the Databases A-Z page. Searches launched from the
Databases A-Z page are limited to the metadata describing our database collection. These are the
only search sources we have enabled, though it is possible to include search results from other
LibApps products and external systems.
For both types of searches combined, we found that searches with no results were 45% of the
total in Week 1, 31% in Week 2, and 36% in Week 3. Figure 4 shows the breakdown by week for
Databases A-Z searches and Research Guide searches.

Figure 4. Percentages of searches with no results
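The breakdown shown in Figure 4 can be reproduced from the deduplicated log with a short calculation. In this sketch, the week label and the no-results flag are hypothetical annotations added during our replication of the searches; they are not fields in the Springshare report.

```python
# Sketch of the no-result percentage calculation by week and search type.
# "week" and "no_results" are hypothetical annotations added during coding.
from collections import Counter

def failure_rates(searches: list[dict]) -> dict[tuple[str, str], float]:
    """Return {(week, location): percentage of searches with no results}."""
    totals: Counter = Counter()
    failures: Counter = Counter()
    for s in searches:
        key = (s["week"], s["Location"])  # e.g., ("Week 1", "AZ")
        totals[key] += 1
        if s["no_results"]:
            failures[key] += 1
    return {k: round(100 * failures[k] / totals[k], 1) for k in totals}
```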

As we replicated the searches from the search term reports, we recorded matches with the
metadata and reasons for failure. Table 2 shows the results of searches on the Databases A-Z page
in percentages. In almost all cases, the productive search terms matched one or more words in a
database name. They matched the broad subject categories assigned to the databases only
7%–11% of the time. Matches between the search terms and the database descriptions were more
variable. We observed rates of 52%, 37%, and 61%. In addition to matches within the Description
field, we found and included matches within the optional More Info field and even within database
URLs.

Table 2. Search term matches and causes of failure in Databases A-Z searches

Sample period | Match: database name | Match: database description | Match: database subject | Failure: misspelling | Failure: citation | Failure: topic keywords
Week 1 | 92% | 52% | 9% | 14% | 7% | 71%
Week 2 | 99% | 37% | 7% | 9% | 5% | 61%
Week 3 | 93% | 61% | 11% | 16% | 8% | 49%

In the search terms that produced no results, we found that misspelling was a factor in 9%–16%
of cases. However, the reports also showed that some searchers realized their errors and
immediately corrected them.

Citations used as search terms were the most obvious type of problem we observed in the
Databases A-Z searches, but not a common one. These represented 5%–8% of the no-result group.
Some were the titles of books or films, while others were full bibliographic citations, most likely
copied from another page and pasted into the search box.
By far, the biggest reason that Databases A-Z searches produced no results was the use of topic
keywords. For Week 1, they accounted for 71% of the no-result terms. That figure declined to 61%
for Week 2 and 49% for Week 3.
The search term matches in Research Guides searches are detailed in Table 3. Predictably, the
productive search terms used on these pages most often matched words in a guide. Course
Guides—those with a course code in the title—were matched in 58% of searches with results in
Week 1. For Week 2, it was 80%, and for Week 3, 76%. General Research Guides were matched at
an even higher rate: 86% in Week 1, 87% in Week 2, and 79% in Week 3.

Table 3. Search term matches in Research Guides searches (percentage of search terms)

Sample period | Course guide | Other guide | Database name | Database description | Database subject
Week 1 | 58% | 86% | 28% | 9% | 6%
Week 2 | 80% | 87% | 38% | 25% | 18%
Week 3 | 76% | 79% | 28% | 24% | 11%

Results from this search also included databases, so we also recorded matches with the Databases
A-Z list. Words in database names were matched by 28%–38% of the search terms, words in
database descriptions by 9%–25%, and database subject words by 6%–18%.
The recorded reasons for Research Guide searches failing to produce results are detailed in Table
4. Here, misspelling was a more significant problem than we observed with searches on the
Databases A-Z page, ranging from 18% to 24% of the terms. Citations also accounted for a larger
share, 13%–15%. Topic keywords were again the most frequent problem, 60% in Week 1.
However, this dropped to 40% in Week 2, followed by 46% in Week 3.
Table 4. Causes of Research Guides search failure (percentage of search terms with no results)

Sample period | Misspelling | Citation | Topic keywords
Week 1 | 20% | 13% | 60%
Week 2 | 18% | 15% | 40%
Week 3 | 24% | 14% | 46%

In addition to search term matches and causes for failure, we made a few more observations about
patterns within a search session and the number of words used.

• In both types of searches, we saw many examples of iterative searching—adding and subtracting words or characters, such as a plural "s," before running a new search. When citation searches failed, some users responded by removing all but one or two words and
searching again. As noted above, repeating a search term within the same session was fairly
common.
• Search term word counts were calculated separately for the two types of searches.
Combining data from the Databases A-Z page for all three weeks, search terms ranged from
one to 22 words. The median number of words used was one due to the many uses of
database names. (We did not consider the mean to be useful here, as it was skewed by the
entry of complete citations.) Word counts for all the Research Guides searches ranged from
one to 25 words. The median was again one, though close to two. Together, these
calculations tell us that, while some use far too many words, most LibGuides searches
consist of just one or two.
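As a minimal sketch of the word-count summary described in the last point, computed separately for each search type, the following uses the median rather than the mean because pasted citations skew the average upward.

```python
# Sketch of the word-count summary for a list of deduplicated search terms.
from statistics import median

def word_count_summary(terms: list[str]) -> dict[str, float]:
    counts = [len(t.split()) for t in terms]
    return {"min": min(counts), "max": max(counts), "median": median(counts)}

# For the Databases A-Z sample described above, the reported values were
# a range of 1 to 22 words with a median of 1.
```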
After analyzing the search terms used in LibGuides, we also looked at changes in search activity. In
the 12 months following the interface changes, the number of searches launched on the Databases
A-Z page decreased by 27%. The monthly total was higher in March than the previous year, and
for May, it was about the same. Totals were lower for all other months. These are shown in
Figure 5.
Figure 5. Search activity comparison for Databases A-Z page

Similarly, searches launched on other LibGuides pages were down 35% year over year. Monthly
totals were higher in March and May, but lower for the other months. These are compared in
Figure 6.

Figure 6. Search activity comparison for Research Guide pages

These decreases stand in contrast to the relatively stable use of the LibGuides system during the
same period. While monthly session statistics varied as much as 23% year over year, total
sessions for the year rose 3%.
DISCUSSION
The usability testing revealed considerable variation in the understanding of and familiarity with
LibGuides searches. Asher and Duke and other researchers have found that Google and other
search engines provide the mental model most students apply when searching library resources.32
Our research confirms that this is true for the LibGuides system. Many of our participants
expected results to include item-level information resources, not another finding tool to be
browsed or searched.

Most visitors to our LibGuides pages start their user journey with Google or another search
engine. Google Analytics reports show that search engines accounted for 69% of the traffic to our
LibGuides pages over the past year. When visitors use a search engine just seconds before starting
their LibGuides search, in most cases, it is to be expected that they would continue to search in the
same way. Referrals from other websites represented only 11% of the LibGuides traffic, though we
provide access from other LibApps products, the SMU Libraries homepage, our discovery system,
and the Canvas learning management system. The remaining 20% of visits are considered “direct”
and include such actions as typing the “friendly” URL into the address bar, selecting a bookmark,
or linking from a document.
The results of the first round of usability testing were used to inform the initial interface changes
we made in August 2022. In the second round of testing, we did see that the changes had some
impact—for example, some participants saw and read the new placeholder text in the search box
out loud. However, the results of Round 2 were not substantially different from those in Round 1.
In participants’ interactions with the search boxes, we often saw a disconnect between their
search terms and the placeholder text. On the Databases A-Z page, only two participants entered
database names. One used an article title, another a newspaper title, and the rest entered topic

keywords despite the placeholder text of “Search for Databases” (Round 1) or “Search This Page”
(Round 2). The search terms used were similar on the Research Guides homepage, where the
placeholder text was “Search All Guides” (Round 1) or “Search within guides only” (Round 2). One
participant used “citing” to find links to a citation help guide. One entered an author name, one an
article title, and the rest entered topic keywords. While the placeholder text was intended to
communicate the limitations of the searches, many participants still expected them to search
across all library collections and provide item-level results. These test results clearly showed us
that disrupting the ingrained search behaviors of college students is very challenging.
Students who are new to the academic library environment may not have a good understanding of
terms like “databases” and “research guides.” That makes it even more difficult for them to
comprehend how a LibGuides search box differs from those they have used elsewhere. The more
experienced researchers, even if they had a good understanding of what the search results would
contain, expected more from the searches than they delivered. Some imagined they were
searching the controlled vocabulary of databases or were disappointed that the results page did
not provide links to preformulated searches for them to try.
The participants found the limited nature of the LibGuides searches surprising, disorienting, and
somewhat frustrating. To improve the user experience and reduce the number of failed searches,
we must consider changes that academic libraries can make to LibGuides, as well as others that
Springshare might make to its product.
Since these search boxes do not function as most users expect, it is important to communicate
contextual differences to the extent possible. We did this by adding descriptive text, changing
labels, and changing the placeholder text in search boxes. After not seeing much change in the
search logs following the first set of interface changes, we added search tips at the point of need to
directly communicate what constitutes productive and unproductive search terms.

The search term analysis revealed that the use of topic keywords is the most prominent reason for
failure. Together with entering citations, this behavior corresponds with the expectations we
observed in usability testing. After making changes to the search interface designed to discourage
these types of searches, we saw a reduction in the use of topic keywords. The level of citation
searching was unchanged.
Returning to our second research question, could we reduce the percentage of searches that
produce no results? Maybe, but not in a dramatic or conclusive way. Most of the improvement was
seen in the Databases A-Z searches. We expected to see further reductions in Week 3 following the
second set of interface changes but instead saw a slight increase in the Databases A-Z figure and
no change in the one for Research Guides. The results were likely influenced by factors that could
not be measured, so our one-week snapshots did not provide definitive results. Analysis over a
more extended period would be needed to determine whether the decreases in no-result searches
are consistent and lasting.
The most important outcome of our interface changes, it seems, is not the changes in search terms
or results but rather the searches that were not done at all. No additional usability testing was
done after the final interface changes, so we have no data that explains why search activity
decreased substantially. However, it is what we would expect to see if students were heeding the
added search tips and responding in one of two ways: searching more efficiently or avoiding the
search function entirely.

While some academic libraries have chosen to remove the native search function from their
LibGuides instance, we did not see that as a viable option for SMU Libraries. It is an important
means of navigating to the many course and topical guides that are not linked from the main
landing page. Our Databases A-Z page is the primary way our users access those critical resources,
and it provides information, like brief descriptions and “Best Bets” tags, that supports the
selection process and is not available in our discovery system.

At Cal Poly Pomona, where Conrad and Stevens conducted their study, librarians replaced the
LibGuides search with a search box for their discovery system because that better aligned with
students’ expectations.33 We see that as problematic because it takes searchers to another
environment where guides and databases can be difficult to find among many other resource
types. Once they navigate away from LibGuides, they are unlikely to return. If they did, it would be
the result of an inefficient, circular workflow. We think it best for the search function to keep users
within LibGuides, even though it challenges their mental model of what a search box is and does.
Besides the interface changes implemented, we saw several improvements that could be made
behind the scenes. Many search failures arise from the imprecise entry of database names. It is
common for databases to be branded with concatenated words like PitchBook or ScienceDirect.
Since we offer hundreds of databases, it is understandable that users may not know or remember
which names include spaces between words and which do not. Centuries take different forms, too,
with some spelled out and others using numerals in the database name. An acronym or a
shortened form of a database name may be commonly used by instructors and their students but
fails to produce results in the LibGuides search.

The Databases A-Z list provides an Alternate Name field where variants can be saved and
recognized by the LibGuides search. Figure 7 shows a database record to which alternate names
have been added.

Figure 7. Database record with alternate names

Up to this point, SMU Libraries had added alternate names on a case-by-case basis as needs came
to our attention anecdotally. In the research data, we could see these names in use and producing
results. However, we also saw a greater need for alternate names and, as a result of our research,
added them systematically throughout the database records. Wherever words or syllables (as in
PubMed) are concatenated, we created an alternate name with a space between them. Where
database names contain a space but could reasonably be concatenated (Capital IQ), we
accommodated that too. We also added alternate names for any titles that include centuries.
Through continued monitoring of the search term reports, we can identify short forms of database
names (IBIS for IBISWorld) and acronyms (EEBO for Early English Books Online) that are being
searched and add alternate names for them as well.
These additional access points will result in fewer dead ends and less frustration for our users, so
we consider them worth the investment of time and effort. Ideally, the LibGuides search would
include a concatenated version automatically when two words are entered and a version with a
space when a capital letter occurs in the middle of a name. Partial names, such as Lexis or Nexis
(both of which appeared in our search logs), should produce results rather than requiring the full name LexisNexis.
these enhancements were made to the search software at a system level, then the need for
libraries to create and maintain alternate names would be greatly reduced.
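As one way to operationalize this locally, candidate alternate names could be generated programmatically and reviewed by staff before being added to the Databases A-Z records. The sketch below is only an illustration of that idea, not Springshare functionality: it splits concatenated names at internal capitals and concatenates multi-word names, leaving century variants and acronyms for manual handling.

```python
# Sketch: generate candidate alternate names for a database record.
# These are suggestions for staff review, not an existing LibGuides feature.
import re

def candidate_alternate_names(name: str) -> set[str]:
    candidates = set()
    # Insert a space before an internal capital, e.g., "PitchBook" -> "Pitch Book".
    spaced = re.sub(r"(?<=[a-z])(?=[A-Z])", " ", name)
    if spaced != name:
        candidates.add(spaced)
    # Remove spaces from multi-word names, e.g., "Capital IQ" -> "CapitalIQ".
    if " " in name:
        candidates.add(name.replace(" ", ""))
    return candidates

# candidate_alternate_names("ScienceDirect") -> {"Science Direct"}
# candidate_alternate_names("Capital IQ")    -> {"CapitalIQ"}
```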

As we listened to the suggestions of usability testing participants and reflected on our search
experiences elsewhere, we thought of several other ways to make the LibGuides search more
flexible and to accommodate differences in search behavior and understanding.

• Search suggestions (autocomplete) could be used to reveal database names and other
words found in LibGuides as the user types in the search box. This would help guide them
to appropriate terms and correct spellings. Based on their research, Ward et al.
recommended including this functionality in all library search interfaces.34 Kate Moran of
the Nielsen Norman Group wrote, "In recent years, search suggestions have become an
expected sign of a well-designed search feature. . . . Suggested terms that return zero
results, or irrelevant results, are worse than unhelpful—they sidetrack users and are
downright irritating.”35 To make a positive impact in LibGuides, suggestions should be
drawn from the institution’s instance of LibGuides rather than from all instances.
• Similarly, “Did you mean . . . ?” suggestions would help users get back on track after a
search with no results. This could address spelling errors and show users related terms
used in the system. Holman advised: “Database developers who design algorithms that
make allowances for spelling errors will facilitate student search success.”36
• Automatic stemming or lemmatization would expand results to include different endings
for a search term. For example, the term “market” could yield “market,” “markets,”
“marketer,” and “marketing” in the list of results. However, the results of these techniques
vary widely for English-language texts, and stemming reduces precision.37 Therefore, it
might be best to employ them only when a search string would otherwise produce few or
no results.
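To illustrate the "Did you mean . . . ?" idea from the list above, the sketch below uses Python's standard-library difflib to suggest close matches, and only when a search has returned nothing. The vocabulary and similarity threshold are assumptions; a production version would have to live in the search backend, which libraries do not control.

```python
# Sketch of a "Did you mean ...?" fallback for zero-result searches.
# The vocabulary is a hypothetical list of database names, guide titles,
# and subject terms drawn from the local LibGuides instance.
import difflib

def did_you_mean(query: str, vocabulary: list[str], limit: int = 3) -> list[str]:
    """Suggest close vocabulary matches for a search that returned no results."""
    lookup = {v.lower(): v for v in vocabulary}
    matches = difflib.get_close_matches(
        query.strip().lower(), lookup.keys(), n=limit, cutoff=0.6
    )
    return [lookup[m] for m in matches]

# did_you_mean("world cat", ["WorldCat", "PubMed", "JSTOR"]) -> ["WorldCat"]
```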
LIMITATIONS
The positive changes in search behavior that we observed in our analysis may have been due in
part to factors other than the interface changes, such as the experience level of the users
represented and the number who were repeating their searches in separate sessions. While we
are confident that most of the analyzed searches were conducted by students, the data were

anonymous and may have included searches conducted by faculty members, university staff, or
unaffiliated visitors. Ongoing maintenance of our LibGuides pages may have caused some
differences in results when the searches were replicated weeks or months later.
CONCLUSIONS
For academic librarians, usability testing is a valuable way of seeing library websites through the
eyes of our student users. After being immersed in our online environment for years, we tend to
forget what it is like to approach a website like LibGuides for the first time. This research has
reminded us that we cannot assume familiarity with library terminology and systems. Rather, we
must meet each student where they are and provide the individual support needed to understand
and utilize the library’s resources.
The current generation of college students has never known a world without Google. Searching it
and other websites has been a daily part of their lives, and they bring those years of experience to
LibGuides. It is no fault of the students that they expect the search boxes to perform like those
they used previously. With our unified indexes and Google-like discovery tools, academic libraries
have reinforced that mindset. Therefore, when a website search has a different goal and a more
limited focus, it is incumbent upon our system designers and administrators to communicate that
to users. At the same time, it should conform as much as possible to website norms and user
expectations. Through clarity and conformity, we can lower barriers, expand mental models, and
decrease the percentage of failed searches.

Our research showed us that interface modifications can have a positive impact on student search
behaviors. But that is only a partial solution to the problems we identified. By complementing that
strategy with smarter system design and administration, we can make the search function more
flexible, robust, and intuitive, thereby improving the user experience and reducing the number of
roadblocks students encounter. We are hopeful that Springshare will take advantage of available
technologies to improve the performance of the LibGuides search. At SMU Libraries, we have
implemented alternate names for databases systematically and will continue to explore other
ways to improve the search function of our LibGuides pages. Options could include providing
search access on selected pages only, reducing the scope of the Research Guides search to
individual guides, adding our discovery system to the search sources in LibGuides, substituting a
Google Programmable Search Engine, or providing a browse navigation option for guides that are
currently dependent on search. All are possible pathways for further research.

APPENDIX: USABILITY TESTING SCRIPT

Moderator = instructions for the moderator; Script = instructions given to the participant.

Moderator: Greet the participant and thank them for participating.
Script: [Participant's name], as you saw in the e-mail, this should take 10 minutes or less, and I will be recording the computer screen and your voice. Do you have any questions about our research project?

Script: I want to emphasize that we are testing web pages, not you, and there are no right or wrong answers to the questions I will ask.

Script: If you don't want to share your video, just click on "Stop Video" at the bottom of the Zoom window. That should also hide your name and profile picture.

Script: Give me just a moment to start the recording, then we will begin the research activities.

Moderator: Start the recording and announce the following: "We are now recording, and this is participant [unique number]."
Script: Today I will ask you questions about two library web pages and ask you to do a couple of searches.

Moderator: Put this link into the chat: https://guides.smu.edu/home
Script: Please open the chat window, follow the link that I'm putting there, and then share your screen with me.

Script: Thank you. Have you visited this page before?

Script: Please look at the search box at the top of the page. How do you think you could use that, and what would you expect to find?

Script: Let's imagine that you have been given a research assignment in one of your current classes. Try using the search box now to find something useful for that assignment.

Script: Please describe what you see in the search results. Are they what you expected to see or not, and why?

Script: On a scale of 1 to 5, with 1 being not useful at all and 5 being very useful, how would you rate these search results?

Script: Thank you. You can stop sharing your screen for a moment, and I will put another link into the chat for you.

Moderator: Put this link into the chat: https://guides.smu.edu/az.php
Script: Now please follow that link and share your screen again.

Script: Have you visited this page before?

Script: Please take a look at the options in the box labeled "Filters."

Script: IF PREVIOUSLY VISITED: Which of those do you typically use?
OTHERWISE: Which one looks most useful to you?

Script: If you were using the search box here, what would you expect to find?

Script: Thinking about the same course and research assignment as before, try a search here and see what you get.

Script: Please describe what you see in the search results. Are they what you expected to see or not, and why?

Script: On a scale of 1 to 5, with 1 being not useful at all and 5 being very useful, how would you rate these search results?

Script: Do you have any suggestions for improving how the search function works on these pages?

Script: Those are all my questions, and I want to thank you very much for your time today. You're helping us to improve our web pages for all SMU students and faculty members.

Script: To show our appreciation, we will send a $15 Amazon gift card to you online. You will need to complete a form for that, so watch for an email message from [staff member]. She will provide instructions and send you the gift card. Any final questions?

Moderator: Wish the participant a good day and sign off.

Moderator: Save a copy of the recording to your computer, then upload it to the designated folder in Box and verify that it will play there.

ENDNOTES
1 Lucy Holman, “Millennial Students’ Mental Models of Search: Implications for Academic
Librarians and Database Developers,” The Journal of Academic Librarianship 37, no. 1 (2011):
20, https://doi.org/10.1016/j.acalib.2010.10.003.
2 Holman, “Millennial Students’ Mental Models,” 22.
3 Cynthia Lewis and Jacline Contrino, “Making the Invisible Visible: Personas and Mental Models of
Distance Education Library Users,” Journal of Library & Information Services in Distance
Learning 10, no. 1–2 (2016): 15–29, https://doi.org/10.1080/1533290X.2016.1218813.
4 Jillian R. Griffiths and Peter Brophy, “Student Searching Behavior and the Web: Use of Academic
Resources and Google,” Library Trends 53, no. 4 (2005): 547.
5 Griffiths and Brophy, “Student Searching Behavior,” 550.
6 Andrew D. Asher and Lynda M. Duke, “Searching for Answers: Student Research Behavior at
Illinois Wesleyan University,” in College Libraries and Student Culture: What We Now Know, ed.
Lynda M. Duke and Andrew D. Asher (Chicago: American Library Association, 2012): 71–85;
Alec Sonsteby and Jennifer DeJonghe, “Usability Testing, User-Centered Design, and LibGuides
Subject Guides: A Case Study,” Journal of Web Librarianship 7, no. 1 (2013): 83–94,
https://doi.org/10.1080/19322909.2013.747366; Elena Azadbakht et al., “Everyone’s Invited:
A Website Usability Study Involving Multiple Library Stakeholders,” Information Technology
and Libraries 36, no. 4 (2017): 34–45, https://doi.org/10.6017/ital.v36i4.9959.
7 Asher and Duke, “Searching for Answers.”


8 Yan Zhang, “The Development of Users’ Mental Models of MedlinePlus in Information Searching,” Library & Information Science Research 35, no. 2 (2013): 159–70, https://doi.org/10.1016/j.lisr.2012.11.004.
9 Katie Sherwin, “Scoped Search: Dangerous, But Sometimes Useful,” Nielsen Norman Group,
January 18, 2015, https://www.nngroup.com/articles/scoped-search/.
10 Asher and Duke, “Searching for Answers”; Helen Georgas, “Google vs. the Library (Part II):
Student Search Patterns and Behaviors when using Google and a Federated Search
Tool,” portal: Libraries and the Academy 14, no. 4 (2014): 503–32,
https://doi.org/10.1353/pla.2014.0034.
11 Rebekah Willson and Lisa M. Given, “Student Search Behaviour in an Online Public Access
Catalogue: An Examination of ‘Searching Mental Models’ and ‘Searcher Self-
Concept,’” Information Research: An International Electronic Journal 19, no. 3
(2014), https://informationr.net/ir/19-3/paper640.html.
12 Sherwin, “Scoped Search.”
13 Asher and Duke, “Searching for Answers,” 73.
14 Asher and Duke, “Searching for Answers.”
15 Georgas, “Google vs. the Library”; Kate Lawrence, “Today’s College Students: Skimmers,
Scanners and Efficiency-Seekers,” Information Services & Use 35, no. 1–2 (2015): 89–93,
https://doi.org/10.3233/ISU-150765.
16 Holman, “Millennial Students’ Mental Models.”
17 Willson and Given, “Student Search Behaviour”; Holman, “Millennial Students’ Mental Models.”
18 Georgas, “Google vs. the Library.”
19 Holman, “Millennial Students’ Mental Models”; Georgas, “Google vs. the Library”; Willson and
Given, “Student Search Behaviour.”
20 Ashley Lierman et al., “Testing for Transition: Evaluating the Usability of Research Guides
Around a Platform Migration,” Information Technology and Libraries 38, no. 4 (2019): 76–97,
https://doi.org/10.6017/ital.v38i4.11169; Suzanna Conrad and Christy Stevens, “Am I On the
Library Website?: A LibGuides Usability Study,” Information Technology and Libraries 38, no. 3
(2019): 49–81, https://doi.org/10.6017/ital.v38i3.10977; Gabriela Castro Gessner et al., “Are
You Reaching Your Audience?,” Reference Services Review 43, no. 3 (2015): 491–508,
https://doi.org/10.1108/RSR-02-2015-0010.
21 Castro Gessner et al., “Are You Reaching Your Audience?”
22 Kate Conerton and Cheryl Goldenstein, “Making LibGuides Work: Student Interviews and
Usability Tests,” Internet Reference Services Quarterly 22, no. 1 (2017): 43–54,
https://doi.org/10.1080/10875301.2017.1290002.


23 Denise FitzGerald Quintel, “LibGuides and Usability: What Our Users Want,” Computers in
Libraries 36, no. 1 (2016): 4–8.
24 Christine Tawatao et al., “LibGuides Usability Testing: Customizing a Product to Work for Your
Users,” University Libraries, University of Washington, October 2010,
https://digital.lib.washington.edu:443/researchworks/handle/1773/17101.
25 Isabel Vargas Ochoa, “Navigation Design and Library Terminology,” Information Technology and
Libraries 39, no. 4 (2020): 5, https://doi.org/10.6017/ital.v39i4.12123.
26 Azadbakht et al., “Everyone’s Invited.”
27 Sonsteby and DeJonghe, “Usability Testing, User-Centered Design, and LibGuides.”
28 Conerton and Goldenstein, “Making LibGuides Work.”
29 Lierman et al., “Testing for Transition.”
30 Conrad and Stevens, “Am I on the Library Website?,” 71.
31 Holman, “Millennial Students’ Mental Models.”
32 Asher and Duke, “Searching for Answers.”
33 Conrad and Stevens, “Am I on the Library Website?”
34 David Ward et al., “Autocomplete as Research Tool: A Study on Providing Search
Suggestions,” Information Technology and Libraries 31, no. 4 (2012): 6–19,
https://doi.org/10.6017/ital.v31i4.1930.
35 Kate Moran, “Site Search Suggestions,” Nielsen Norman Group, May 20, 2018,
https://www.nngroup.com/articles/site-search-suggestions/.
36 Holman, “Millennial Students’ Mental Models,” 25.
37 Christopher D. Manning et al., Introduction to Information Retrieval (Cambridge, UK: Cambridge
University Press, 2008).

