Reflections on Tolliver et al. (2005) and Swanson & Green (2011).
Usability testing is arguably more important in today’s information environment than ever before. With so many different ways for people to find and access information, assumptions about how they use library websites, mobile devices, and social media are no longer safe. Just as librarians are no longer the gatekeepers of information, so too should they become knowledgeable in assessing how information is discovered and used.
Tolliver et al. (2005) emphasize the importance of rigorous usability testing with the help of a professional usability consultant. It seems appropriate to invest in such a resource when making a significant technological leap (e.g., a new online catalog) or redesigning your main online platform (i.e., the library website). What can sometimes be lost in the process, however, is the need to continuously monitor and update these procedures. Too often, once a library website has been revamped, its usability is never revisited, even as technology changes.
A parallel lesson holds when libraries implement social media. Too often libraries sign up for institutional accounts without really knowing how or why they are implementing these tools. For academic libraries, usability testing is particularly important because their core user demographic is significantly tech savvy. For other libraries, a strategic plan is also necessary: will this tool be used for community engagement, to push out information, or to help improve a collection (e.g., through crowdsourcing)? And in any of these cases, how much will it cost? Even if the application itself is free, librarians must commit time to initial set-up, moderating comments, and/or reformatting content. The Library of Congress has a particularly sophisticated set of social media guidelines, and hopefully, as with its other best practices, it will post its social media strategic planning guidance as well. A recent presentation on social media strategic management was also part of the SLA (Special Libraries Association) annual conference.
Usability testing has proven helpful in other library environments as well. I recently had the opportunity to hear how Federal Reserve Board librarians constantly survey their users. Specific to usability testing, they have run focus groups to determine what facets are needed for a regular newsletter database, user testing to roll out a library mobile site, and stakeholder interviews with heavy library website users before making changes to the library website. Regardless of what each method is called, these efforts have resulted in a highly knowledgeable library staff that provides simple and complex library services well-crafted to their users’ information needs.
In a similar way, Swanson and Green (2011) bring to light what many librarians already knew. While Google isn’t going anywhere, librarians need to be able to give their users guidance on how to assess their web search engine results against subscription databases. From a researcher’s perspective, there are two main differences between searching databases and searching the web: (1) what you are searching (search outcomes) and (2) how you are searching (search process).
Commercial databases invest in indexing their collections to make finding relevant results much easier, while web search engines rely on algorithms that can yield very low precision. Controlled vocabulary, found within many databases, further assists with more precise search results. Google, by contrast, remains at the keyword level, which boosts recall but limits precision. More notably, while many results may be found in a web search, some results may still go unfound because of the lack of taxonomies underpinning web results. This contrasts with the extensive use of taxonomies in online databases and library catalogs, which can grow and expand a researcher’s search (Mann, 2005).
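The precision/recall trade-off described above can be made concrete with a toy sketch. All of the document sets and names below are hypothetical, invented purely for illustration; real search evaluation uses large judged test collections, not four-item sets.

```python
# Toy illustration of precision vs. recall (entirely hypothetical data).
# "relevant" is the set of documents a researcher actually needs;
# each "engine" returns a set of result IDs.

relevant = {"d1", "d2", "d3", "d4", "d5"}

# Keyword-only web search: many hits, high recall, low precision.
web_results = {"d1", "d2", "d3", "d4", "x1", "x2", "x3", "x4", "x5", "x6"}

# Controlled-vocabulary database search: fewer hits, mostly relevant.
db_results = {"d1", "d2", "d3"}

def precision(results, relevant):
    """Fraction of retrieved documents that are relevant."""
    return len(results & relevant) / len(results)

def recall(results, relevant):
    """Fraction of relevant documents that were retrieved."""
    return len(results & relevant) / len(relevant)

print(f"web: precision={precision(web_results, relevant):.2f}, "
      f"recall={recall(web_results, relevant):.2f}")
print(f"db:  precision={precision(db_results, relevant):.2f}, "
      f"recall={recall(db_results, relevant):.2f}")
```

In this made-up example the web search retrieves more of the relevant documents (higher recall) but buries them in noise (lower precision), while the database search returns only relevant items but misses some, mirroring the keyword-versus-taxonomy distinction in the paragraph above.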
In turn, the possibilities for searching in online databases are far more diverse than those of the “one box” web search engine. While Hock (2010) discusses advanced features in web search engines, these are often buried or unmaintained; they are not the primary features or functions of searching on the web. Commercial databases, on the other hand, offer Boolean operators and truncation, along with filters, to both focus and expand one’s initial search results. Consider not only author and publication-year filters, but also affiliated-organization filters and related-keyword options.
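To show what Boolean AND and truncation actually do, here is a minimal sketch over a handful of invented titles. It is a simplification on my part, not how any commercial database is implemented: real systems match against full indexed records, and this only matches whole words in titles.

```python
# Minimal sketch of Boolean AND plus truncation (hypothetical titles).

titles = [
    "Library usability testing methods",
    "Usability of academic library websites",
    "Testing search interfaces",
    "Social media in public libraries",
]

def matches(term, word):
    """Trailing-* truncation: 'librar*' matches 'library', 'libraries'."""
    if term.endswith("*"):
        return word.startswith(term[:-1])
    return word == term

def search(terms, docs):
    """Boolean AND: every term must match some word in the title."""
    hits = []
    for doc in docs:
        words = doc.lower().split()
        if all(any(matches(t, w) for w in words) for t in terms):
            hits.append(doc)
    return hits

print(search(["librar*", "usability"], titles))
# matches the first two titles only
```

The truncated term `librar*` also matches the fourth title, but the AND with `usability` filters it out, which is exactly the focus-then-expand behavior the paragraph describes.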
Perhaps capturing the web’s search capabilities through projects like Google Scholar provides a pathway toward improved recall within subscription databases. However, the search principles behind subscription databases remain necessary for the precision that is so highly valued by academic institutions.
Hock, R. (2010). The extreme searcher’s internet handbook: a guide for the serious searcher (3rd ed.). Medford, NJ: CyberAge Books.
Mann, T. (2005, August 15). Will Google’s keyword searching eliminate the need for LC cataloging and classification? Retrieved from http://www.guild2910.org/searching.htm
Swanson, T., & Green, J. (2011). Why we are not Google: lessons from a library web site usability study. Journal of Academic Librarianship, 37, 222-229.
Tolliver et al. (2005). Website redesign and testing with a usability consultant: lessons learned. OCLC Systems & Services, 21, 156-166.
This blog post fulfills an assignment for a library school course and includes readings related to information systems.