The key insight was that users expected the simple iDiscover search function to automatically return results as sophisticated and relevant as those from other, more powerful search platforms. This led to frustration when, for example, a search for a journal title returned a number of articles and other results before the link to the journal holdings and online access. At this point, when asked what they would do next, many of our participants said they would switch to another search tool.
Some of the problems stemmed from a mismatch between the results returned and the user's perception of the tool as a catalog:
“Book reviews above books just don’t make sense!” (Archaeology PhD student)
“When looking for a book, you’ll end up with a random science article.” (English undergraduate student)
“If you search for a title that only has a few words in it, even if you type it in correctly, other less relevant titles will come up first.” (Education MEd student)
When asked what mattered most to them in a platform for searching information resources, a large number of our participants used the words ‘relevance’ and ‘relevant’. This was directly linked to a desire for seamless, efficient searches that yielded appropriate and useful results, without the need to use pre- or post-search options to limit or refine them. Participants were often frustrated at the lack of perceived relevancy in the initial results list after using the main iDiscover search function.