Information Behavior Studies

[NOTE: Updated on 6-15-2009 to link to Dr. Makri’s recent lecture at UIUC, and on 5-17-2009 to describe in more detail the subjects of Dr. Makri’s studies. Thanks to Stephanie Davidson. –legalinformatics]

Several valuable studies of law-related information behavior have recently appeared. Here is a summary of them:

Dr. Yolanda P. Jones’s Ph.D. dissertation, “Just the Facts Ma’am?” A Contextual Approach to the Legal Information Use Environment (2008), applies Solomon’s “Discovering Information in Context” framework (see Paul Solomon, Discovering Information in Context, 36 Annual Review of Information Science and Technology 229 (2002)) and Vygotsky’s Activity Theory (see L.S. Vygotsky & M. Cole, Mind in Society: The Development of Higher Psychological Processes (1978)) in a qualitative study of the information behavior of law students working in a legal clinic.

Among the key findings are that students working in a clinic often engaged in collaborative research, with positive results; that the students relied on informal legal information resources, particularly people (such as classmates and experts), as well as formal legal resources; and that organizational memory (including written records of case histories and the memories of former student clinic participants) played a central role in the clinic students’ information behavior. Based on these findings, Dr. Jones offers recommendations for legal information system design, including that CALR systems should furnish spaces for online collaboration and should enable user tagging of resources. Finally, Dr. Jones recommends several avenues for further research on information behavior, particularly research into collaborative information retrieval.
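
These recommendations are design-level rather than implementation-level, but a rough sketch may help make them concrete. The following Python fragment is entirely hypothetical (the class and field names are the editor’s, not Dr. Jones’s and not any existing CALR system’s); it illustrates one way a shared clinic workspace with user tagging could be modeled, with tags and notes persisting as a form of organizational memory:

    # Hypothetical sketch of a CALR collaboration/tagging data model.
    # Names and fields are illustrative, not drawn from any existing system.
    from dataclasses import dataclass, field
    from typing import Dict, List, Set

    @dataclass
    class Resource:
        """A document in the research system (case, statute, memo, etc.)."""
        resource_id: str
        title: str
        citation: str

    @dataclass
    class TaggedResource:
        """A resource plus the user-applied tags and notes attached to it."""
        resource: Resource
        tags: Set[str] = field(default_factory=set)
        notes: List[str] = field(default_factory=list)

    @dataclass
    class ClinicWorkspace:
        """A shared space where clinic students collaborate on a matter.
        Tags and notes accumulate over time, serving as organizational
        memory for later student participants."""
        matter_name: str
        members: Set[str] = field(default_factory=set)
        items: Dict[str, TaggedResource] = field(default_factory=dict)

        def add_tag(self, user: str, resource: Resource, tag: str) -> None:
            item = self.items.setdefault(resource.resource_id,
                                         TaggedResource(resource))
            item.tags.add(tag)
            self.members.add(user)

        def find_by_tag(self, tag: str) -> List[Resource]:
            return [i.resource for i in self.items.values() if tag in i.tags]

    # Example: a later clinic student retrieves everything earlier students
    # tagged "landlord-tenant" for the same matter.
    workspace = ClinicWorkspace(matter_name="Smith v. Acme Property")
    workspace.add_tag("student_a",
                      Resource("r1", "Javins v. First Nat'l Realty",
                               "428 F.2d 1071 (D.C. Cir. 1970)"),
                      "landlord-tenant")
    print([r.title for r in workspace.find_by_tag("landlord-tenant")])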

Stephann Makri’s Ph.D. dissertation, A Study of Lawyers’ Information Behaviour Leading to the Development of Two Methods for Evaluating Electronic Resources (2008), used the behavioural framework of D. Ellis, A Behavioural Approach to Information Retrieval System Design, 45 Journal of Documentation 171 (1989), as the basis for a study of the information behavior of law students, lawyers pursuing graduate degrees, law school faculty or academic staff (described as “research staff (some of whom were also involved in teaching)”), and practicing lawyers.

Among the key findings:

  • all the subjects studied devote time and effort to updating information (e.g., by using citators), and practicing lawyers, but not academics, frequently track the history of information (e.g., for statutes, by examining legislative history and post-enactment amendments; and for cases, by examining cases that have cited a particular case);
  • both updating and “history tracking” are behaviors apparently distinctive to the discipline of law, and not previously recognized in the information science literature;
  • subjects make frequent use of secondary legal resources, and often search Google or Google Scholar, when beginning research in an unfamiliar area of law;
  • all subjects know basic Boolean searching, but few know advanced Boolean connectors (a brief illustrative sketch appears after this list);
  • subjects mainly search at the document level, and frequently browse through a document;
  • subjects typically keep a manual record of documents found and of search history, but make little use of automated “research trails” on CALR services;
  • subjects frequently highlight or “tag” relevant content within a document; and
  • subjects frequently seek and use one or more portions of a document, rather than the whole document.
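
For readers unfamiliar with the terminology in the Boolean finding above, the following sketch (in Python; the helper functions are hypothetical, and connector syntax varies across CALR services) contrasts a basic AND query with the kind of proximity-connector query that few subjects used:

    # Illustrative only: connector syntax differs across CALR services, and
    # these helpers are hypothetical, not any service's actual query API.

    def _quote(term: str) -> str:
        """Wrap multi-word terms in quotation marks (a habit many subjects
        reportedly picked up from Google)."""
        return f'"{term}"' if " " in term else term

    def basic_query(*terms: str) -> str:
        """The kind of query most subjects could formulate: terms joined by AND."""
        return " AND ".join(_quote(t) for t in terms)

    def proximity_query(term_a: str, term_b: str, within: str = "p") -> str:
        """An 'advanced' connector query of the kind few subjects used: require
        the terms to appear in the same paragraph (/p), same sentence (/s), or
        within n words (/n), in Westlaw-style Terms & Connectors notation."""
        return f"{_quote(term_a)} /{within} {_quote(term_b)}"

    print(basic_query("negligence", "res ipsa loquitur"))
    # negligence AND "res ipsa loquitur"

    print(proximity_query("negligence", "res ipsa loquitur", within="p"))
    # negligence /p "res ipsa loquitur"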

Based on these findings, Dr. Makri proposes two new methods, one focusing on system functions, the other on usability, for evaluating electronic legal resources, and demonstrates their use on the LexisNexis Butterworths CALR system. Dr. Makri concludes by identifying several types of research questions on legal information behavior that could be pursued using his methods.

Three other works by Dr. Makri may be of particular interest. The research reported in Stephann Makri, A Study of Legal Information Seeking Behaviour to Inform the Design of Electronic Legal Research Tools, 1 Proceedings of the International Workshop on Digital Libraries in the Context of Users’ Broader Activities (2006), was limited to law students and lawyers enrolled in LL.M. or Ph.D. programs. The key findings:

  • subjects “found it difficult to find the information that they were looking for when using” CALR services, principally because of imperfect or false “knowledge about the coverage of the [content of the] system . . . and also how to formulate the correct search terms for a specific system”;
  • subjects “do not delve beyond the basics of [CALR] systems and were often unwilling to go to training classes on how to use [CALR systems] despite being aware that these classes had been available to them”;
  • “[m]ost students were [] aware that it was necessary to use different search terms when searching using Google compared with when using [CALR services] – due to differences in the type and scope of information that [each] . . . [was] designed to . . . find”;
  • “none of the students used any Boolean connectors or advanced search syntax when searching [CALR services], with the exception of enclosing phrases in quotation marks, which they may well have learnt initially from searching Google”;
  • “Law students’ lack of knowledge of the similarities and differences between individual [CALR systems] might well play a part in law students’ incorrect assumptions about the way that individual systems work,” suggesting “the need for students to gain an understanding of the similarities and differences between [CALR systems] in order to appreciate the situations in which different electronic resources might be useful”.

The author concludes “that [CALR services] should also support users in forming a mental model of the systems that they use to find information; information that can then be used to support users’ models of the work domain and of legal information seeking in general.”

A later article, Stephann Makri, Studying Academic Lawyers’ Information Seeking to Inform the Design of Digital Law Libraries, forthcoming in the IEEE Computer Society Bulletin of the Technical Committee on Digital Libraries, covered law students, lawyers who were graduate students, and several law faculty or academic staff (including “one Senior Research Fellow, two Lecturers, two Senior Lecturers and one Professor of Commercial Law”). The findings were similar to those of the “Study of Legal Information Seeking Behaviour” article. In addition, the author found that subjects had imperfect or incorrect knowledge of three types respecting CALR services:

  • “awareness knowledge (which resources exist to help locate certain materials),
  • “access knowledge (whether they have access to certain materials and, if they do, how they might go about doing so) and
  • “usage knowledge (how to use the electronic resource).”

Respecting awareness knowledge, some subjects did not know that Westlaw enabled subject searching by Key Number. Respecting access knowledge, some subjects incorrectly believed that certain content was omitted from their version of Westlaw because Westlaw purportedly provided a “UK version” having different content from the “US version,” when in fact the same Westlaw content was available in the UK and the US. Respecting usage knowledge, Dr. Makri found that subjects had imperfect or false knowledge of the coverage of the services (this “often included not knowing where within the system to go in order to find certain types of material or to perform a certain type of search”), of the content and structure of documents within the services, and of the authority of the services and their contents.

In Stephann Makri et al., ‘I’ll Just Google It!’: Should Lawyers’ Perceptions of Google Inform the Design of Electronic Legal Resources? forthcoming in Proceedings of the Web Information-Seeking and Interaction (WISI) Workshop (SIGIR) (2007), a study of law students, lawyers pursuing graduate degrees, and law faculty or academic staff (described as “staff, lecturers and a Professor of Law”), the authors identified the following factors as reasons for law students’ and lawyers’ selecting Google as a legal research tool: “quality of results, degree of flexibility and control offered, simplicity and approachability, familiarity and speed/time-saving benefits.” The authors also found “that members of all groups of [subjects] in our study spoke of Google in a positive light (and none spoke of it in a negative light).”

“By contrast, [w]hen referring to [CALR services], [subjects] were just as negative as they were positive. Many [subjects], particularly taught students, spoke of frustration concerning knowing where in the system to go in order to find a particular type of legal document. Lawyers also mentioned [or] demonstrated . . . that they sometimes found it difficult to know” how to formulate “a search that is restricted to a particular segmented field.”

The authors then discuss design implications of their findings. The authors urge user-interface designers to consider all of the factors they identified, plus users’ needs, and the type of information the resource is designed to find. The authors suggest that a single search box might be appropriate for broad searches for material intended to familiarize the user with an unknown subject, but that a multi-field interface would be optimal for known-item searching for structured documents.
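
To make that contrast concrete, here is a minimal sketch (in Python; the field names and query representation are hypothetical illustrations, not any vendor’s actual API) of how the two interface styles might map onto queries:

    # Hypothetical sketch contrasting the two interface styles discussed above.
    # Field names and the query representation are illustrative only.
    from typing import Dict

    def single_box_query(text: str) -> Dict[str, str]:
        """Broad, Google-style search: one free-text string, ranked retrieval."""
        return {"q": text}

    def known_item_query(party: str = "", citation: str = "",
                         court: str = "", year: str = "") -> Dict[str, str]:
        """Multi-field, known-item search: each value restricted to one
        document field, suited to structured documents such as case reports."""
        fields = {"party": party, "citation": citation,
                  "court": court, "year": year}
        return {k: v for k, v in fields.items() if v}

    # Familiarizing oneself with an unknown subject: a single broad query.
    print(single_box_query("implied warranty of habitability landlord tenant"))

    # Known-item search for a structured document: a field-restricted query.
    print(known_item_query(party="Javins", court="D.C. Cir.", year="1970"))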

Dr. Makri’s June 9, 2009 lecture at the University of Illinois at Urbana-Champaign, discussing his recent legal information behavior research, is linked here.

Judith Lihosit’s new article, Research in the Wild: CALR and the Role of Informal Apprenticeship in Attorney Training, 101 Law Library Journal 157 (2009), reports results of a qualitative study of the legal research behavior of fifteen practicing lawyers. Among her key findings:

  • Lawyers generally perform legal research as recommended in traditional guides:
    • when researching an unfamiliar area of law, they start with secondary legal resources;
    • when searching for case law, they use secondary legal resources or annotated codes;
    • after finding one or more relevant cases, to find additional cases they use citators, key numbers or keywords linked to headnotes; and
    • they “would use Westlaw’s Custom Digest or LexisNexis’s headnote topics or ‘More Like This Headnote’ as tools to scan through the available cases.”
  • In addition, the lawyers “used online searching for case law” primarily “to find support for their already-formulated arguments”;
  • as lawyers’ “experience level increases[,] . . . their use of [formal] secondary sources is usually supplemented with, or even over time replaced by, consultation with in-house document repositories or more experienced attorneys who are part of their informal networks”; and
  • “many attorneys tend to practice only in specific subject areas[.] . . . [This] allows attorneys to develop expertise, and thus simplifies the research process as they gain experience.”

The author concludes that “[a]ttorneys are still learning to do research and to practice in the ‘traditional’ manner, by using whatever tools are available to them, and, more importantly, by receiving on-the-job training and by being able to tap into the knowledge of their more experienced colleagues. In short, they are still being trained through the present-day manifestation of the long-standing apprenticeship system.”

To find other recent legal informatics scholarship, see the Preprints, Articles, Indexes, Dissertations, Conferences, and Monographs sections of our sister Website, Legal Information Systems & Legal Informatics Resources.


One Response to Information Behavior Studies

  1. stephanie says:

    I’ve been interested in a closely-related area for the last year, and was thrilled to see Dr. Makri’s work.

    I’m still digesting his dissertation findings, but have a couple preliminary thoughts:
    * In the current legal information market, what is the effect of a positive or negative usability evaluation on behavior? Do users try another service, supplement with Google, or just stop short of a result as complete as it should be?

    * Should legal research instructors teach from the standpoint of training students to expand their existing habits, or should we train them in the gold standard? Dr. Makri’s work confirms that users don’t generally make full use of the functionality in a CALR system; I suggest that we need more work on why that is, and how users manage to achieve satisfaction without cranking the power to full.

    * This seems to be a fascinating data set for understanding the information-seeking behavior of the (UK) legal profession with respect to doctrinal work, but what of academics? Dr. Makri’s title might suggest that the study explores the information behavior of academic law faculty, but only one law professor was included (and four lecturers). Would these findings describe the information-seeking behavior of American legal scholars, 20-30% of whom (depending on whose stats you read) have master’s degrees or Ph.D.s in an academic discipline?
