User Behaviour in Resource Discovery: Initial Findings
Hanna presented some interesting findings, based on field studies with 18 students and researchers from LSE, Cranfield and Middlesex University:
- unless directed to academic sources such as EBSCO, Emerald, etc., students were most likely to resort to tools they were more familiar with, such as Google, where their expectation was simply to find an answer as quickly as possible, without necessarily verifying its correctness.
- in addition to commonly used search tools such as Google and Google Scholar, students and researchers have been using tools such as YouTube and personal networks, as well as social networking software, to ask people for help in finding known or unknown links. The use of social networking software is new, although it should not be surprising, as it in some ways parallels our own professional behaviour when we ask colleagues for suggestions and leads. The physical library was the source of last resort.
- there seems to be a difference in the strategies used by students from different backgrounds. Their information literacy skills do not appear to correlate with their digital literacy skills (i.e. their ability to use technology and gadgets such as iPhones).
- we need to determine if there are common strategies (e.g. how do users determine if the site they are using is a high scholarly quality site?) for the different user groups (e.g. same university, or same country if international students, or if part-time students).
- although it may not be part of this project, it would still be interesting to profile how the different institutions teach information literacy, and to assess the uptake of such courses and their effects on students’ and researchers’ information search skills.
- most current systems provide a basic “quick” search and an “advanced” search. Our findings so far suggest that novices and students tend to avoid the “advanced” search, assuming that it is really for the advanced researcher and that it would be beyond their skill level to use.
- in addition, those who used advanced search often used ‘safe’ strategies, i.e. they inserted specific information that was already available, e.g. a known author, dates or parts of a title. Such a search is unlikely to reveal unanticipated associations.
- users new to such search tools often type the complete title or a full sentence from their assigned exercise into the keyword field.
Further analysis will be carried out to determine how users reason when narrowing down their searches during the query formulation stage to identify specific candidate documents, and then broadening out again to other relevant documents.
- this line of discussion led us to at least three categories of ‘results’ that a user will / should be interested in: (i) co-borrowing, (ii) co-citations, and (iii) context based tags.
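The first two of these categories can be derived from simple co-occurrence counts. As an illustrative sketch only (the loan-record structure and function name are hypothetical, not part of any system discussed here), co-borrowing counts might be computed like this:

```python
from collections import Counter
from itertools import combinations

def co_borrow_counts(loans: dict[str, set[str]]) -> Counter:
    """Count how often each pair of items was borrowed by the same user."""
    pairs = Counter()
    for items in loans.values():
        # sorted() gives each unordered pair a canonical (a, b) key
        for a, b in combinations(sorted(items), 2):
            pairs[(a, b)] += 1
    return pairs

# Toy loan records: user -> set of borrowed items
loans = {"u1": {"A", "B", "C"}, "u2": {"A", "B"}, "u3": {"B", "C"}}
pairs = co_borrow_counts(loans)
print(pairs[("A", "B")])  # 2 (borrowed together by u1 and u2)
```

Co-citation counts would follow the same pattern, with citing papers in place of users.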
- this also had implications for the kind of system architecture that would enable this to occur. While we discussed the notion of “fusion”, it is more likely that we will need a database architecture that enables “mapping” and “connecting” rather than “fusing”.
- the lack of a ‘spell checker’ or ‘did you mean …’ feature in library search systems has quite significant consequences. Users may type in an incorrectly spelt term assuming it is correct, and the system may respond with, for example, ‘no books available’ on the (incorrectly spelt) topic, re-directing or diverting the student’s or researcher’s search path.
- session time-outs are a problem: when the system times out, it often loses all trace of the search activity.
Some thoughts on searching:
- what is a ‘powerful’ search term?
- is there a taxonomy of good / bad search strategies?
- what makes a good query or search? what are its attributes?
- useful insight: we should perhaps be asking “What does a better query look like?”, and how this can be presented to assist novices in improving the way they formulate their queries.
- while others have used “stop words” and other indexing techniques such as TF-IDF (term frequency, inverse document frequency), the question remains how to make these meaningful from a user’s perspective.
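For reference, the standard TF-IDF weighting over tokenised documents can be sketched as follows (the toy corpus is invented for illustration, and real systems vary in their normalisation and smoothing choices):

```python
import math
from collections import Counter

def tf_idf(docs: list[list[str]]) -> list[dict[str, float]]:
    """Compute TF-IDF weights per document from tokenised documents."""
    n = len(docs)
    df = Counter()                    # document frequency per term
    for doc in docs:
        df.update(set(doc))           # count each term once per document
    weights = []
    for doc in docs:
        tf = Counter(doc)
        # weight = (term count / doc length) * log(N / doc frequency)
        weights.append({t: (c / len(doc)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return weights

docs = [["library", "search", "query"],
        ["query", "formulation", "search"],
        ["library", "catalogue"]]
w = tf_idf(docs)
```

Terms that appear in fewer documents (such as “catalogue” above) get higher weights, which is exactly the distinction that is hard to surface meaningfully to a novice user.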
- formulating a query requires certain specific knowledge, e.g. structure of the domain, some basic language or knowledge of the domain, and features and functions of the tools available to construct the query.
- providing a trace or discovery path (“oh, this is where I’ve been”) is useful in helping the user ‘see’ where they’ve been or should have been in their search for information.
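A minimal sketch of what such a discovery-path trace might record (the class and method names are hypothetical): each query together with its hit count, replayable as a breadcrumb trail.

```python
from dataclasses import dataclass, field

@dataclass
class SearchTrail:
    """Record each query and its result count so a user can 'see' the path."""
    steps: list[tuple[str, int]] = field(default_factory=list)

    def record(self, query: str, hits: int) -> None:
        self.steps.append((query, hits))

    def show(self) -> str:
        """Render the trail as a breadcrumb string."""
        return " -> ".join(f"{q} ({n} hits)" for q, n in self.steps)

trail = SearchTrail()
trail.record("climate", 1200)
trail.record("climate policy uk", 35)
print(trail.show())  # climate (1200 hits) -> climate policy uk (35 hits)
```

Showing the hit count at each step also makes visible the narrowing and broadening behaviour noted earlier.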
- we need to articulate the assumptions behind user searches, as what seems obvious to a software developer may confuse the user or seem counter-intuitive to them.