
Citation: Christopher R. Trudeau, The Public Speaks: An Empirical Study of Legal Communication, 14 Scribes J. Leg. Writing 121 (2011-2012)


The Public Speaks: An Empirical Study of Legal Communication

Christopher R. Trudeau

Most attorneys agree that writers need to tailor their writing to a particular audience.¹ This just makes sense. So it's hardly a stretch to argue that in writing for a client, an attorney should use plain language - language that's so clear and effective that "the audience has the best possible chance of readily . . . understanding it."² But there is little empirical data on whether clients actually prefer plain language. Rather, as proof, books and articles rely mostly on anecdotal evidence. In fact, in my classes I have offered anecdotal evidence of my own experiences with clients. I've said things like "Don't use conflagration when you could just use fire. Don't say disseminate if you could say give or send out. Imagine if I told one of my English-as-a-second-language clients to 'disseminate this letter to your family.' They might ask, 'What do you want me to do to my family?'"

* Supported by a scholarship grant from LexisNexis to the Legal Writing Institute and the Association of Legal Writing Directors. Reviewed for research methods by Deans Ann Wood and Laura LeDuc at Thomas Cooley Law School.
1. See generally Wayne Schiess, Preparing Legal Documents Nonlawyers Can Read and Understand (ABA 2008); Nancy L. Schultz & Louis J. Sirico, Jr., Legal Writing and Other Lawyering Skills 161-68 (5th ed., Aspen 2010); Joseph M. Williams & Gregory G. Colomb, Client Communications: Delivering a Clear Message, 12 Persps. 127 (Winter 2004).
2. Annetta Cheek, Defining Plain Language, Clarity No. 64, at 9 (Nov. 2010).

2011-2012 121

2011-2012 The Public Speaks: An Empirical Study

made substantial progress: "In countries such as Australia, New Zealand, and Canada, legal practitioners and parliamentary drafters now feel no compunction in boasting about the 'plainness' of their drafting."⁷ Then in 2010, the United States Congress passed the Plain Writing Act of 2010⁸ - a major victory for plain language. Despite this progress, the empirical research supporting plain language in the law is incomplete,⁹ especially when compared to other social sciences. About half the existing research focuses on the cost savings and time savings from using plain language.¹⁰ The other half largely focuses on pleasing one type of legal reader - judges and lawyers themselves.¹¹ To be sure, this research has contributed critical ammunition by showing the huge economic benefits of plain language, together with the overwhelming preference for it within the profession (at least when it comes to reading). Significantly, one research study showed that over 82% of judges surveyed preferred plain language.¹² But what about studies addressing the other main type of legal readers - clients, prospective clients, or, more generally, the public? There have been five to date, all providing some kind of empirical

7. Butt, supra n. 3, at 20.
8. Plain Writing Act of 2010, H.R. 946, 111th Cong. (Oct. 13, 2010).
9. See Karen Schriver & Frances Gordon, Grounding Plain Language in Research, Clarity No. 64, at 33 (Nov. 2010) (noting the need for further empirical research to support plain language).
10. See generally Joseph Kimble, Writing for Dollars, Writing to Please: The Case for Plain Language in Business, Government, and Law 106-33 (Carolina Academic Press 2012).
11. Id. at 135-42, 153.
12. Kimble, supra n. 4, at 13; see also Sean Flammer, Persuading Judges: An Empirical Analysis of Writing Style, Persuasion, and the Use of Plain Language, 16 Legal Writing 183 (2010) (showing a 66% preference in another study).


The Scribes Journal of Legal Writing 2011-

support for using plain language in legal communication with lay readers. The first is Mark Adler's study Bamboozling the Public.¹³ Adler sent a legal letter to a number of clients and asked them questions about their comprehension and impressions. Among other things, Adler confirmed that most of them did not fully understand what they were reading.¹⁴ Second, in the mid-1990s, an Australian advertising agency surveyed "focus groups" of corporate executives. Through this qualitative study, the advertising agency confirmed that clients preferred documents in plain language.¹⁵ Third, in 1997-1998, the Law Society of England and Wales studied 44 clients of 21 different solicitors.¹⁶ Through interviews, the Law Society found that clients value having a solicitor who will listen and who explains concepts in a manner that the client can understand.¹⁷ Fourth, in 1993, the Plain Language Institute in British Columbia conducted a study called Critical Opinions: The Public's View of Lawyers' Documents.¹⁸ It surveyed residents of British Columbia

" See Mark Adler, Bamboozling the Public, 9 Scribes J. Legal Writing 167 (2003-2004). 1 Id. at 185. 1 See Michele M. Aspray, Plain Language for Lawyers 57-58 (4th ed., Fed'n Press 2010) (discussing the study). 16 Clark D. Cunningham, What Do Clients Want from Their Lawyers? (Ga. St. U. Legal Studies Research Paper Series, No. 2010-04) (2009) (available at http:// ssrn/abstract=1505616) (citing and explaining the study conducted by Hilary Sommerland & David Wall, Legally Aided Clients and Their Solicitors: Qualitative Perspectives on Quality and Legal Aid, The Law Society, Research Study No. 34, at 2-6 (2000)). 17 Id. 1 Plain Lang. Inst. Rep., Critical Opinions: The Public's View of Lawyers'Documents, vol. 11 (1993).


Compiling the Sample

Initially, I intended to focus only on current or past clients, but it soon became clear that this was not practical. I contacted a number of Michigan law firms, explained my study, and asked them to send the survey to their clients. From the outset, each firm was understandably hesitant to provide its clients' contact information. While this information may not be privileged in all instances, it's good professional judgment for firms to err on the side of caution.²² So I decided to use an online survey instead of the traditional printed survey that I had originally planned. Although this would force me to survey only those clients who had provided e-mail addresses, it would allow me to sidestep the firms' concerns about releasing client information. I could forward the survey's link to the firms, who would then e-mail the link to their clients, who in turn could respond anonymously to the online survey. (The online survey system I used, SurveyMonkey, allows researchers to make responses completely anonymous by not retaining IP addresses.) A secondary benefit was that I could reduce the clients' concern that their current attorney would discover how they responded to the survey. In the end, these benefits would greatly outweigh the sampling bias of having to exclude clients without e-mail addresses. Then I again contacted many Michigan law firms, explained the confidentiality protections I had made, and asked them to forward the survey's link to their clients. This time around, some firms were more than willing to help. But most were still hesitant. Although the stated reasons varied, they ranged from not wanting to bother clients with "trivial matters" to a managing partner's telling me that

22. See generally R. Weddle, Disclosure of Name, Identity, Address, Occupation, or Business of Client as Violation of Attorney-Client Privilege, 16 A.L.R. 1047 (1967).


he was afraid of the responses his firm's clients might provide. I explained that there would be no way for me to know where these responses were coming from, but alas, the firm still declined. At this point, I had four firms willing to send the survey to their clients. To protect client identity, I agreed not to disclose the firm names. These firms focus primarily on four different legal areas: civil defense, civil litigation, estate planning and real-estate transactions, and family law. I thought this range might provide a mix of clients so that I could further subcategorize the results. Next, I decided to expand my sample population to include the general public - even those who may not have used attorneys in the recent past. I did this for a few reasons: it would increase my sample size (I was aiming for 300 or more);²³ I could reach people who had used attorneys at some point in their past, as well as those who are potential clients in the future; and I could further compare and analyze the data among the two groups - clients and nonclients. To obtain these additional responses, I used a nonprobability sampling method called "snowball sampling."²⁴ With snowball sampling, "[t]he researcher begins with those members of the population to whom the researcher has access and then asks each participant to help the researcher . . . contact . . . other members of the population . . . . The sample builds, or 'snowballs,' as more and more participants are discovered."²⁵ For this study, I employed a modified version of snowball sampling by sending the survey to my own e-mail contact list and then asking those contacts to forward the survey's link to their contact lists. To ease the senders' burden, I

23. See Lawless et al., supra n. 21, at 149 (stating that "[i]n general, larger samples will result in more precise estimates of population characteristics").
24. Id. at 148-49 (discussing snowball sampling).
25. Id.
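The snowball-sampling procedure described above can be sketched as a short simulation: start from the researcher's own contact list, then let each newly reached person forward the survey to some of their contacts, wave by wave. This is only an illustration of the general technique, not the author's actual process; the contact graph, forwarding probability, and function names are all hypothetical.

```python
import random

def snowball_sample(seeds, contacts_of, waves=3, forward_prob=0.5, rng=None):
    """Simulate snowball sampling: begin with the seed contacts,
    then let each newly reached person forward the survey link to
    some of their own contacts, wave after wave."""
    rng = rng or random.Random(42)
    reached = set(seeds)
    frontier = list(seeds)
    for _ in range(waves):
        next_frontier = []
        for person in frontier:
            for contact in contacts_of(person):
                # Each new contact forwards/responds only some of the time.
                if contact not in reached and rng.random() < forward_prob:
                    reached.add(contact)
                    next_frontier.append(contact)
        frontier = next_frontier
    return reached

# Hypothetical contact graph: 50 people, each knowing three others.
graph = {i: [(i * 3 + k) % 50 for k in range(1, 4)] for i in range(50)}
sample = snowball_sample([0, 1], graph.__getitem__)
```

The sample grows with each wave, which is how the method reaches people outside the researcher's immediate network; as the text notes, though, it remains a nonprobability sample.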


communication between attorneys and their clients. This survey should take no more than 15 minutes of your time.

Protecting Your Privacy

This survey is being conducted by Professor Christopher Trudeau, an associate professor at Thomas M. Cooley Law School. Your privacy is important to Professor Trudeau, and it will be protected at all times. Specifically, after completing this survey, your responses will be sent only to Professor Trudeau, and he will receive no identifying information, such as your name, your contact information, or any details about your legal experiences (other than what you provide). Plus, your responses will be combined with other responses and will never be linked to you directly. If you would like to contact Professor Trudeau, you may call him at (517) 371-5140 x 2603 or e-mail him at trudeauc@cooley.

Notice that there was no mention of plain language, clarity, or legalese. My ultimate goal was truthfully stated - "to help attorneys better understand what clients prefer when communicating with their attorneys" - regardless of the results. After the introduction, I grouped all the questions into four sections: (1) experience with attorneys; (2) preferences for attorney-client communications; (3) choice-of-language questions; and (4) demographic questions.

Section One: Experience with Attorneys

I wanted to separate the responses into two groups - clients and nonclients - so the first question asked whether the respondent had used an attorney at any time in the past five years. To me, this time frame helped to ensure that the person's experience with an attorney was relatively fresh in mind. But I also asked whether the respondent had ever used any attorney without recalling how long ago, and I included that subset in the client group.


One benefit of using an online survey was that I was able to add "skip logic" to certain questions so that respondents received only follow-up questions that were relevant to their own experiences.²⁷ For example, if a respondent had not used an attorney, then the respondent would not see questions 2 through 4 asking about past use of attorneys. But if a respondent had used an attorney in the past five years, I then asked follow-up questions about how many times and for what types of legal matters. All these questions were designed to help further categorize the responses given to later substantive questions.
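The skip-logic behavior described above amounts to a simple branching rule: later questions appear only when an earlier answer makes them relevant. Here is a minimal sketch; the question numbers follow the text, but the function and its flag are hypothetical stand-ins for the survey tool's feature.

```python
def visible_questions(used_attorney_recently: bool) -> list:
    """Return the question numbers a respondent sees in section one.

    Mirrors the skip logic described in the text: questions 2-4
    (how many times, what types of legal matters, and so on) appear
    only if the respondent has used an attorney in the past five years.
    """
    shown = [1]  # Q1: used an attorney in the past five years?
    if used_attorney_recently:
        shown += [2, 3, 4]  # relevant follow-ups only
    return shown

# A respondent who has never used an attorney skips straight past Q2-Q4.
```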

Section Two: Preferences in Communications from an Attorney

The second section consisted of eight questions, each designed to gather useful information on matters for which there is limited or no empirical data. Specifically, the questions measured:

  • the respondent's preference for oral or written communication;

  • the respondent's preference for electronic or printed written communication;

  • the respondent's reaction when an attorney uses Latin words or complicated legal words in written documents;

  • the care a respondent usually takes when reading a legal document;

  • the importance a respondent attaches to understanding an attorney;

27. See SurveyMonkey, What Is Skip Logic? help.surveymonkey.com/app/answers/detail/a_id/39/kw/question%20logic (last accessed May 9, 2012).


is, the online surveying program would randomly select which version to list first.²⁸

I tested four things: (1) active voice versus passive voice (four questions); (2) strong verbs versus nominalizations (two questions); (3) plain words versus complex words (four questions); and (4) explaining a legal term versus not explaining it (one question). Of course, there are many more aspects of plain language.²⁹ In my view, though, the four I tested are hallmarks. And frankly, others are harder to test. By limiting the survey to those four, I was also able to include multiple questions on the first three, helping me analyze the consistency of the results. Crafting these questions required care. The plain-language version had to have the same meaning as the traditional version. And I had to vary the complexity of the traditional passages - I did not want every one to be so complex that the respondent was forced to pick the plain-language passage. So I weighed the number of errors to include in each question. (I realize that choosing between active and passive voice is not a matter of "error," but I'll use that word for simplicity.) In 5 of the 11 choice-of-language questions, I tested only one error. That is, the only difference between the versions was a word choice or the type of voice used in the passage. I call these single-variable questions: they were questions 14, 19, 20, 21, and 22. In those questions, I could tell for sure the reason for the respondent's choice.
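The randomized presentation can be sketched like this: for each choice-of-language question, flip which version is listed first so that ordering bias washes out across respondents. The function below is a hypothetical stand-in for the survey software's randomization feature; the example pair is the passive/nominalization sentence discussed in the text.

```python
import random

def present_versions(plain: str, traditional: str, rng=None):
    """Randomly order the two versions of a choice-of-language
    question, as the online survey tool did for each question."""
    rng = rng or random
    options = [plain, traditional]
    rng.shuffle(options)  # in-place shuffle decides which is listed first
    return options

# The passive + nominalization pair from the text (question 16):
options = present_versions(
    "The Board of Directors decided to review the file.",
    "A decision was made by the Board of Directors to review the file.",
)
```

Because each respondent gets an independently shuffled order, any tendency to pick whichever version appears first affects both versions equally in aggregate.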

28. See SurveyMonkey, How Do I Make Answer Choices Flip, Appear Randomly or Alphabetically? help.surveymonkey.com/app/answers/detail/a_id/104 (last accessed May 9, 2012) (discussing how the user can randomize answer choices).
29. See, e.g., Joseph Kimble, The Elements of Plain Language, in Writing for Dollars, Writing to Please, supra n. 10, at 5 (setting out 42 guidelines).


For the other 6 questions, I tested more than one error. I call these multivariable questions: they were questions 15, 16, 17, 18, 23, and 24. For example, in question 16, I included a passive sentence that also contained a nominalization - "A decision was made by the Board of Directors to review the file." I did this to test my hypothesis that the more errors in the traditional version, the greater the likelihood of choosing the plain-language version. As you'll see in the results, this hypothesis proved to be true.

Section Four: Demographic Questions

I ended with demographic questions designed to help categorize the responses. I had learned that "[q]uestions like demographics or personal information are usually best to introduce towards the end of the survey. This way, respondents are likely to have already developed confidence in the survey's objective."³⁰ In this section, I asked respondents to give their age, level of education, and income. To me, these were potentially critical factors that could influence how respondents answered. For example, a person with only a high-school diploma might prefer different things from a person with a doctoral degree. After months of researching, preparing the survey, identifying the sample, and validating the survey, I was ready to administer it. On March 10, 2011, I officially released it by sending the link to the law firms and my e-mail contact lists. I accepted responses for about a month, and they began amassing rather quickly. Within a week, there were 100, and within three weeks, there were 350. When I closed the survey on April 13, 2011, there were 376 - 76 more than my goal.
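For context on the 376 responses, the conventional margin of error for a sample proportion is MoE = z * sqrt(p(1 - p)/n). This calculation is my addition, not the author's, and it assumes simple random sampling, which a snowball sample does not strictly satisfy, so treat the figure as a rough bound only.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """95% margin of error for a sample proportion,
    assuming simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

# Worst case (p = 0.5) with the study's 376 responses:
moe = margin_of_error(0.5, 376)  # roughly 0.05, i.e., about +/- 5 points
```

In other words, at this sample size a reported percentage like 41% carries an uncertainty of several points either way, which is worth keeping in mind when reading the results that follow.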

30. SurveyMonkey, supra n. 26, at 14 (citing G. Iarossi, The Power of Survey Design: A User's Guide for Managing Surveys, Interpreting Results, and Influencing Respondents (World Bank 2006)).


does include a higher percentage in the 30-39 and 50-59 age ranges than the population as a whole.³² For educational level, the breakdown was as follows:

  • 116 respondents (32%) had less than a bachelor's degree (an associate's degree, some college, or a high-school diploma);

  • 105 (29%) had a bachelor's degree;

  • 80 (22%) had a master's or doctoral degree; and

  • 61 (17%) had a law degree.

Admittedly, the sample includes far more respondents with advanced degrees than the population as a whole. But that was a benefit here because it allowed me to more accurately measure whether respondents with advanced degrees had different preferences from everyone else. The results may surprise you.

The Results

Preferred Forms of Communication

Early in the survey, I wanted to gauge the respondents' preferences for oral or written communication. So in question 5, I asked this: "How do you prefer for an attorney to provide most client

32. Id.
33. See U.S. Census Bureau, Educational Attainment in the United States: 2003, at 3, table A (June 2004) (available at census.gov/prod/2004pubs/p20-550) (specifying the percentage of the U.S. population with a bachelor's degree or more).


communications?" Respondents were then given five choices: e-mail, phone, mail, face-to-face, and "other." Overall, 58% preferred oral communication to written, either face-to-face or by phone. About 35% favored e-mail, and only 4% preferred traditional mail. That result is not too surprising, and two of the qualitative responses to the "other" category help explain it. One person stated that "either phone or face-to-face - vocal is important - communication by definition is two-way." And another said that "the most effective communication type is face-to-face, then other two-way verbal communication modes. Written is by far [second] place, especially when trying to clarify or explain issues." Well put. Still, written communication is an essential part of the attorney-client relationship. So in question 6, I asked this: "How do you prefer that attorneys send letters and documents to their clients?" For this question, I did not want to dissuade nonclients from responding, so I wrote it to ask for a general preference. Respondents had five choices: as a hard copy in the mail, electronically through e-mail, as a fax, both as a hard copy and as an electronic copy, and "other." Overall, 43% preferred to receive both a hard copy and an electronic copy of a letter or document; 33% preferred to receive only an electronic copy; and 21% preferred only a traditional mailed copy. (Nonclients were a little more likely than clients to prefer both hard and electronic copies.) What's telling is that 76% of all respondents preferred to receive either an electronic copy alone or an electronic copy along with a hard copy. The days of sending documents by mail alone should be behind us - only 21% preferred that.


to do so. And the numbers are even worse for clients: 79% have received such a document in their lifetime. Third, in question 13, I again wanted to follow up on this line of questioning: "If you read an attorney's letter or legal document and you did not understand a term, would you look up that term?" Overall, 32% said they would "always" look up a term they didn't understand; 26% "often"; 25% "sometimes"; 13% "rarely"; and 4% "never." And these results were consistent among clients and nonclients. So while a majority (58%) would at least "often" look up a term, 17% would "rarely" or "never" do so. That means about 1 in 6 people wouldn't understand the term - odds that I sure wouldn't want to take. Fourth, I wanted respondents' reaction to receiving a document that uses complicated terms or Latin words. So back in question 8, I asked: "How does it make you feel when an attorney uses Latin words or complicated legal words in written documents?" The positioning of this question was critical. I didn't want it to follow the questions that I discussed above because the answers might be biased by previous answers. So I put it before question 10, about the importance of understanding an attorney. As you might expect, 41% said they get "annoyed" when they read complicated terms or Latin words; another 19% are "bothered a little"; 30% said that such terms have "no influence" on them; and - get this - only 0.5% (2 respondents) said they're "impressed." That's not much support for the long-held notion that using complicated terms and Latin words impresses people. In total, 60% were at least bothered by complicated terms or Latin words. And to tilt the scales even more, about half the "other" responses indicated a preference for simple terms or at least an explanation of any complicated term. Client respondents were even more likely to be put off: 68% were "annoyed" or "bothered a little" by complicated terms or


Latin words. Moreover, these results were similar across educational levels:

                       Overall   Less than    Bachelor's   Master's &   Juris
                                 bachelor's                doctoral     Doctor
  Annoyed                41%        44%          42%          40%        34%
  Bothered a little      19%        24%          14%          19%        15%
  No influence           30%        23%          31%          30%        44%
  Impressed               0%         2%           0%           0%         0%
  Other                  10%         9%          12%          11%         7%

True, the percentage of "annoyed" respondents dropped slightly as the educational level increased. But the drop was insignificant except for respondents with a law degree. Note, too, that zero respondents with a bachelor's degree or an advanced degree were impressed, and these are the people who are more likely to understand complicated terms. The lesson: don't use complicated terms or Latin words. You'll impress half a percent of the people, and you'll annoy around 40% of them. Finally, I wanted to learn whether a respondent's frustration over a complicated document had ever caused the respondent to stop reading. Surely, this is the worst-case scenario for an attorney: no one benefits when a client is frustrated and doesn't understand the message. So in question 12, I asked this: "Have you ever felt so frustrated when reading an attorney's letter or a legal document that you stopped reading it before it ended?" This was a follow-up to question 11, which asked whether the respondent had ever received a document that was difficult to understand. Respondents were given three choices: "yes," "no," and "I cannot recall." I included a text box that allowed respondents who selected "yes" to explain why they were frustrated. About 38% said they had stopped reading a document out of frustration, 16% could not recall, and 47% had not stopped reading. But the results are skewed a bit because nonclients who never

