More than a Decade Later: Library Web Usability Practices at ARL Academic Libraries in 2007 and 2020

This study compares library web usability practices in 2007 and 2020 at academic libraries that are institutional members of the Association of Research Libraries. The authors performed chi-square and t-tests to determine whether there were differences in establishing policies/standards/guidelines (PSGs), conducting usability tests, and providing resources between samples of libraries from both years. There was no statistically significant difference in the number of libraries with and without PSGs between the two samples. In 2020, the level of perceived importance of usability testing significantly decreased, and the resources needed for web usability initiatives doubled. The authors suggest that academic and research libraries foster a culture of web usability to actualize and optimize usability endeavors.

Introduction

In this digital age, the World Wide Web is the dominant medium for accessing information. As such, it is essential for web developers to make web-based information systems usable across various platforms. Usability scholars such as Nielsen, Norman, and Shneiderman have provided principles for usability best practices. 1 Additionally, the International Organization for Standardization (ISO) and the U.S. Department of Health and Human Services (HHS) have published standards and guidelines to help web developers create information systems with superior usability. 2

Researchers in information system success modeling indicate that the quality of information, systems, and services is positively associated with intention to use and user satisfaction, which leads to the continued use of a system. 3 Continued use of such a quality information system can, in turn, yield higher returns on investment.

Academic libraries have put tremendous effort and funding into providing electronic resources and services via their library web portals. Hong et al. 4 revealed that perceived ease of use and usefulness can influence users’ acceptance and use of digital libraries. If libraries do not take these usability characteristics into account, they risk underutilization of their resources. 5

As electronic resources grow exponentially, academic libraries must develop web portals with quality usability to prompt continued use of these resources, thus making libraries’ investment cost-effective. To accomplish this goal, a sound infrastructure is indispensable, which includes employing web usability experts, establishing and implementing institutional usability policies/standards/guidelines (PSGs), and providing necessary resources. In 2007, Chen, Germain, and Yang explored the ways that academic members of the Association of Research Libraries (ARL) met these infrastructure objectives. 6 In this study, the authors have attempted to identify whether web usability infrastructure and efforts devoted to web usability testing have increased at these libraries over the last decade.

Problem Statement

In the library and information science literature, research on web usability usually addresses a specific aspect, such as case studies of usability testing, 7 discussions on web accessibility policies, 8 or web team development. 9 Instead of focusing on a particular area, Popp in 2001 examined several aspects of web usability practices at ARL member libraries, such as testing, obtaining web assessment training, and supporting professional development. 10

As there was a void in the literature investigating holistic web usability, in 2007 Chen et al. expanded the scope of Popp’s study by incorporating PSGs and resources into their research. 11 They observed that of the eighty-four participating libraries, only twenty-five had web usability PSGs, even though the perceived importance of usability testing was high. Additionally, 85 percent of the libraries had tested their websites. Nevertheless, due to a lack of infrastructure and buy-in, there was minimal iterative testing of the various components of the library web portal. Furthermore, there were just twenty libraries with dedicated, full-time usability staff. Based on these research outcomes, Chen et al. advocated for education and organizational support for usability initiatives. 12

It has been over a decade since Chen et al.’s initial study. 13 There are still few systematic studies of organizational web usability infrastructure. Therefore, the authors conducted a comparative study to determine whether ARL academic libraries have demonstrated a stronger commitment to their usability initiatives since then. Through this research, we aim to determine the extent of progress in establishing PSGs, conducting usability testing, and providing resources at these libraries.

The issues and degrees of progress identified in these results will help advance web usability enterprises in the information science and higher education communities.

Literature Review

In 1988, Norman advocated for the importance of usability by promoting simple design focused on the successful interaction between an object and its user. 14 Based on system engineering principles, Nielsen proposed five measurable usability attributes: easy to learn, efficient to use, easy to remember, low error rate, and overall user satisfaction. 15 ISO defined usability as the “[e]xtent to which a product can be employed by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use.” 16 Palmer extended ISO’s goal-oriented perspective by highlighting a system’s information architecture. 17

As web technologies emerged, Brophy and Craven regarded web usability as “the experience the user has when reading and interacting with a website.” 18 The authors of this study took a holistic approach to the subject by introducing a working, multifaceted definition that addressed the gaps in the ISO definition concerning content, cognitive capacity, affect, and interactivity. 19 In 2018, ISO took a more inclusive stance in redefining usability and expanded its scope to include products and services. 20

Nielsen, Rosenfeld and Morville, and Shneiderman indicated that websites built for optimal usability during the development cycle enable users to interact with the systems more easily and derive greater satisfaction from them. 21 Several studies revealed that websites with high levels of usability engender user satisfaction and that users will hence revisit these sites. 22 Likewise, in e-learning, a quality interface and useful content facilitate coherent teaching and learning, which in turn increases acceptance and satisfaction. 23 Because academic libraries rely heavily on web technology to provide access to resources and services, it is essential that the design of their online systems reflects users’ mental models and usability best practices.

Library professionals have adopted usability principles when developing their online portals. For example, they have conducted usability tests across platforms, including the library’s main pages, 24 lower-level pages, 25 OPACs, 26 and discovery systems 27 to ensure quality control. With the widespread use of mobile devices, libraries have also conducted usability testing on their mobile library websites. 28

Academic libraries apply various web usability testing methods. Card-sorting is an option for the preliminary stage of development, since it takes users’ mental models into account when designing intuitive information architectures. 29 The think-aloud protocol allows users to articulate their thought processes while navigating web resources. 30 Paper or online prototyping is a cost-effective method for constructing the initial layout of a website, as it is easy to make design modifications in the early stages. 31

Sometimes, usability testing is conducted by experts in this area. Cognitive walkthrough, a process whereby experts emulate a novice navigating a system, yields information on its learnability and the ease of identifying the most straightforward path to accomplish a specific task. 32 Similarly, heuristic evaluation involves expert inspection of a system based on a set of established standards or guidelines. 33 Task analysis examines whether a system’s design aligns with the sequence of activities necessary to complete a specific task. 34

As usability testing technology advances, some usability practitioners augment traditional methods with additional tools, such as log analysis 35 and eye tracking. 36 Researchers also conduct focus groups or surveys to solicit feedback from users. 37
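To make the log-analysis idea concrete, the following is a minimal, hypothetical Python sketch, not a tool used in the studies cited here. It assumes a standard Apache/Nginx combined access log, and the file name access.log is a placeholder; counting the most-requested paths gives a rough proxy for where users actually go on a library site.

```python
# Minimal sketch of usability-oriented log analysis (hypothetical example).
# Assumes an Apache/Nginx "combined" access log; access.log is a placeholder.
import re
from collections import Counter

# Extract the requested path from the quoted request field of each log line
LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"')

def top_pages(log_path: str, n: int = 10) -> list[tuple[str, int]]:
    """Count the most-requested paths as a rough proxy for navigation hot spots."""
    counts: Counter[str] = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_LINE.search(line)
            if m:
                counts[m.group("path")] += 1
    return counts.most_common(n)

if __name__ == "__main__":
    for path, hits in top_pages("access.log"):
        print(f"{hits:6d}  {path}")
```

Unlike think-aloud sessions, such counts reveal aggregate behavior (for example, a heavily used lower-level page that testing overlooked) but not the user’s intent, which is why practitioners treat log analysis as a complement to, not a replacement for, the traditional methods above.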

Usability testing is an ongoing, indispensable process throughout the system development life cycle. 38 Iterative testing enables web designers to detect flaws and make improvements. 39 These usability initiatives require considerable personnel, time, technical expertise, funding, and other resources throughout the various phases of the process. 40 Teams can provide valuable support, but members with limited expertise in these areas may hinder a team from working at its full potential. 41 However, Nichols et al. noted that while some team members may not have a high level of usability training, they still bring important knowledge about users to the process. 42 Lacking staff expertise, some organizations opt to hire outside consultants to conduct usability testing. 43 Cervone’s model posited that whether usability training is knowledge-based or skill-based, it should be an organization-wide endeavor. 44 These diverse views “move usability towards an institutional value.” 45

Usability PSGs provide uniformity for quality information system design. After exploring web policies available on selected academic libraries’ websites, Lingle and Delozier provided a list of elements for library website policies. These elements include mission statement, target audience, scope and content, selection criteria, web administration, training, URL creation, types of platforms used, security levels, backup plan, and design. Their list mainly focuses on the collection, technical, and procedural aspects of policies, not usability per se. 46 ISO issued sets of usability guidelines and specifications for facilitating user-centered design. 47 The HHS publication Research-Based Web Design & Usability Guidelines provides institutions with a blueprint for establishing local policies for usability best practices. 48 Additionally, Nielsen’s seminal ten heuristics serve as general principles for creating intuitive web user interfaces. 49 Finally, for a system to be usable, it must first be accessible. The Web Accessibility Initiative at the World Wide Web Consortium emphasizes prioritizing web accessibility for persons with disabilities. 50 Common elements for web usability PSGs derived from these authoritative usability guidelines include identifying goals, understanding user requirements, meeting users’ expectations, considering user interface issues, providing useful content, structuring content for easy navigation, using plain language, allowing user control and flexibility, preventing errors, avoiding information overload, addressing accessibility, and measuring outcomes of use (e.g., effectiveness, efficiency, satisfaction, and user experience).

Although library professionals have applied these guidelines and standards toward general evaluation of their websites, there is little discussion in the library and information science literature specifically related to web usability policies. 51

The rapid evolution of web technologies has made offering online learning and information services (including seeking and disseminating information) more common since Chen et al. explored web usability practices at ARL academic libraries in 2007. 52 The transition from in-person to virtual environments further highlights the importance of web usability. Quality library web usability facilitates seamless interaction for teaching, learning, and research, thus providing a better user experience for patrons of academic libraries. Achieving optimal web usability requires a sound infrastructure and continuous effort. A comparative study of these usability aspects will shed light on the progress made and the challenges encountered by ARL academic libraries. The results can help library professionals, including library administrators, reflect on their library web usability practices. Additionally, the insights derived from this research can serve as informed strategies for advancing web usability enterprises in the information science and higher education communities to enhance user satisfaction.

Methods

In 2007, Chen et al. selected ARL academic libraries for exploring web usability practices because they were regarded as the top research libraries in North America. 53 As the authors of this study intended to determine whether ARL academic libraries have demonstrated a stronger commitment to web usability initiatives in the past decade, surveying the current state of web usability practices at these institutions had to take place first. To achieve this goal, the authors adapted Chen et al.’s 54 survey questionnaire. They added a “Library student worker” option to the question on testing population and an “Eye tracking” option to the question on usability testing methods, as well as new questions on testing a mobile version of library websites, the availability and utilization of usability labs, and how existing usability PSGs and practices have influenced user experience. Furthermore, the authors added the phrase “in the past ten years” to the question on usability testing efforts to replicate the timeframe of the former study, which transpired approximately ten years after initial web usability testing initiatives occurred at academic libraries. These additions and modifications to the original survey questionnaire were meant to account for new methods and emerging web technologies, such as the increased use of eye-tracking systems and mobile devices. In a forthcoming article, the authors provide a comprehensive report on the current state of web usability practices at ARL academic libraries. 55 For the comparative analyses, only responses to questions common to both the 2007 and 2020 surveys were considered (see appendix). Thus, the authors did not anticipate that the changes made to the survey would affect the comparability of the results between the current and former studies.

The rationale behind adapting Chen et al.’s 56 survey instrument was fourfold: the target population was the same; the scope of the investigation and the survey instrument should remain the same to make the comparison meaningful; the questionnaire consisted of quantitative and qualitative elements, providing a more complete view of the issues under examination; and the survey questions had been tested through two pilot studies to ensure validity and reliability.

The quantitative questions included multiple-choice and Likert-scale items focusing on usability PSGs, usability testing, and resources. The open-ended questions, pertaining to challenges encountered in the implementation of usability PSGs, web usability practices, and future plans for usability initiatives, allowed the authors to collect qualitative data that could not be captured via quantitatively oriented queries.

The authors followed the same approach to identifying appropriate survey recipients. We visited the ARL academic libraries’ website directories in September 2019 and identified position titles or departments with responsibility for usability initiatives. The authors then contacted potential individuals to determine whether they were the appropriate survey recipients; if they were not, we requested a referral.

Upon receiving institutional review board (IRB) approval from the university, we sent the survey questionnaire via SurveyMonkey to the 105 ARL academic libraries at the end of October 2019. We followed up with emails and phone calls to increase the response rate. Due to the COVID-19 pandemic, there were delays in response submissions. Because the survey solicited information on usability practices over the past ten years, however, the responses were unlikely to be affected by the interruption caused by the pandemic. In Chen et al.’s 2007 study, eighty-four institutions participated in the survey. 57 In the 2020 study, by the close of the survey in mid-May 2020, ninety-one institutions had responded, yielding an 87 percent response rate and a strong representation of ARL academic libraries. 58
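For transparency, the stated response rate follows directly from the counts reported above:

```latex
\frac{91}{105} \approx 0.867 \approx 87\%
```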

The authors exported the data from SurveyMonkey to Excel for quantitative analyses. We performed chi-square tests of independence and t-tests to determine whether there were differences in terms of PSG establishment, usability testing, and resource availability between the 2007 and 2020 samples. Additionally, we downloaded responses to the open-ended questions and coded them using the themes that had emerged in the 2007 study, which applied the grounded theory method. Discrepancies in coding were resolved through discussion.
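As a concrete illustration of these tests, the following Python sketch uses SciPy rather than Excel (the authors’ actual tool). The 2007 row reflects the twenty-five of eighty-four libraries with web usability PSGs reported by Chen et al.; the 2020 row and the Likert ratings are hypothetical placeholders, not the study’s data.

```python
# Minimal sketch of the comparative tests described above (SciPy).
# The 2020 counts and the Likert ratings are hypothetical placeholders.
from scipy import stats

# 2x2 contingency table: rows = survey year, columns = with / without PSGs
observed = [
    [25, 59],  # 2007: 25 of 84 libraries had web usability PSGs (Chen et al.)
    [30, 61],  # 2020: hypothetical split of the 91 respondents
]
chi2, p, dof, expected = stats.chi2_contingency(observed, correction=False)
print(f"chi-square = {chi2:.3f}, df = {dof}, p = {p:.3f}")

# Independent-samples t-test on, e.g., 5-point Likert ratings of the
# perceived importance of usability testing (hypothetical values)
ratings_2007 = [5, 5, 4, 5, 4, 5, 4]
ratings_2020 = [4, 3, 4, 3, 4, 3, 4]
t, p_t = stats.ttest_ind(ratings_2007, ratings_2020)
print(f"t = {t:.3f}, p = {p_t:.3f}")
```

The chi-square test compares categorical proportions (e.g., having vs. not having PSGs) across the two independent samples, while the t-test compares mean scale ratings; together they cover the two data types the survey produced.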

Findings

Development and Implementation of Website PSGs and Web Usability PSGs

The authors mainly used the chi-square (χ²) test of independence and the independent-samples t-test to analyze and compare the data from 2007 and 2020. Table 1 shows that the numbers of libraries with or without library website PSGs remained similar (χ² = 0.074, df = 1, p > 0.1). However, there was a significant difference (33 percent in 2020 vs. 30 percent in 2007) in the number of libraries with web usability PSGs (χ² = 8.219, df = 1, p < 0.05). Likewise, there was a notable increase (41 percent in 2020 vs. 36 percent in 2007) in the number of universities with web usability PSGs (χ² = 34.181, df = 1, p < 0.001).
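For reference, the statistics reported here follow the standard chi-square test of independence, where each of these year-by-status tables has (2−1)(2−1) = 1 degree of freedom:

```latex
\chi^2 \;=\; \sum_{i=1}^{r}\sum_{j=1}^{c}\frac{(O_{ij}-E_{ij})^2}{E_{ij}},
\qquad
E_{ij} \;=\; \frac{R_i \, C_j}{N},
\qquad
df \;=\; (r-1)(c-1)
```

Here O and E are the observed and expected cell counts, R and C the row and column totals, and N the grand total of responding institutions.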

Table 1. Libraries/Universities with/without PSGs