In the study 'A Sociological Lens on Linguistic Diversity: Implications for Writing Inclusive Multiple-Choice Assessments', researchers Jennifer Lightfoot, Katherine Lyon, Daniel Riccardi, Nathan Roberson, Mark Lam, and Simon Lolliot explored how multiple-choice assessments can shape linguistically diverse students' ability to demonstrate disciplinary knowledge.
In this Q&A with Language Sciences, the research team explained how multiple-choice questions can act as barriers to demonstrating knowledge, discussed the potential for negative cognitive burden, and described how instructors can make multiple-choice assessments more linguistically accessible for students who speak English as an additional language.
1. What are some linguistic features used in multiple-choice questions that can act as barriers to the demonstration of students’ knowledge?
Our research has shown that the language used in multiple-choice questions (MCQs) can be a barrier for students when it comes to demonstrating their knowledge. The language used in MCQs often follows the norms of academic writing, including abstract concepts densely packed into noun groups (a process called nominalization), uncommon vocabulary, and content that may require specific cultural knowledge. Our study of 700 first-year sociology and psychology students revealed that when these three linguistic features were unpacked, students were, on average, 6 percent more likely to answer MCQs correctly, indicating that unpacking complex linguistic features in MCQs increases students’ ability to accurately demonstrate their knowledge.
2. With regard to linguistic terminology, what is the difference between “packed” and “unpacked” questions?
Like many forms of communication, academic texts contain linguistic codes (Bernstein 1971). We refer to these codes as “packed” and “unpacked”. Unpacked language makes meanings and norms explicit and is typically used when engaging readers who may not share a common orientation to or understanding of the text. Packed language, on the other hand, assumes that readers have a common understanding of the text and makes communication more efficient by conveying more information in a more condensed form. Academic discourse, particularly in written form, operates primarily through packed codes.
To unpack the MCQs, we employed three linguistic interventions: reducing noun group complexity (nominalization), substituting infrequent vocabulary, and making cultural references more explicit.
Linguistic Intervention #1: Reducing Noun Group Complexity
Nominalization refers to the organization of noun groups for the purpose of abstract representation and communication of ideas. A noun group is a group of words that describe an entity. Noun groups tend to be more densely packed in written rather than spoken language, particularly in academic writing (Biber, Gray, and Poonpon 2011; Halliday 1987; McCabe and Gallagher 2008). This use of nominalization can be problematic when assessing students’ content knowledge as these packed noun groups make meaning less evident and accessible, impeding reading comprehension (Biber and Gray 2010; Fang, Schleppegrell, and Cox 2006; Halliday and Martin 1993; Staples et al. 2016; Thompson 2014; Ventola 1996).
Consider the following comparison between a packed and an unpacked MCQ stem:
Packed MCQ stem: The form of emotion management characterized by the commodification of people’s deep acting inducing a sense of alienation is...
Unpacked MCQ stem: There are different forms of emotion management. In one form, people’s deep acting is commodified. This deep acting causes people to feel alienated. What is the name of this form of emotion management?
The packed MCQ stem consists of one highly dense noun group (a nominalized idea), whereas, in the unpacked MCQ stem, we modified and expanded the noun group into three distinct and more accessible statements.
Linguistic Intervention #2: Substituting Infrequent Vocabulary
We also modified MCQs to include more common vocabulary. Hu and Nation (2000) suggest that a reader needs to know roughly 98% of the words in a text in order to comprehend it. Since students will struggle to comprehend material that contains a large number of unfamiliar terms, we replaced uncommon words with more frequently used synonyms, based on word frequencies in the Corpus of Contemporary American English. This is intended to make the questions easier to understand for a greater number of students.
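For instructors who want to screen their own question banks for infrequent vocabulary, the idea can be roughly automated. The sketch below is only an illustration, not the procedure used in the study: it assumes access to a list of common words (the COMMON_WORDS set here is a tiny hypothetical placeholder, not actual COCA data) and simply flags stem words that fall outside that list for manual review.

```python
# Sketch: flag words in an MCQ stem that are absent from a common-word list,
# so they can be reviewed and, if they are not disciplinary terms taught in
# the course, replaced with more frequent synonyms.
# COMMON_WORDS is a tiny illustrative placeholder, not actual COCA data; in
# practice you would load the top few thousand entries of a published
# frequency list.

import re

COMMON_WORDS = {
    "the", "of", "by", "a", "is", "to", "and", "in", "that", "this",
    "form", "people", "deep", "sense", "emotion", "management", "acting",
}


def flag_uncommon_words(stem: str, common_words: set[str]) -> list[str]:
    """Return the words in the stem that do not appear on the common-word list."""
    tokens = re.findall(r"[a-z']+", stem.lower())
    words = {t.removesuffix("'s") for t in tokens}  # treat possessives as their base word
    return sorted(w for w in words if w not in common_words)


stem = ("The form of emotion management characterized by the commodification "
        "of people's deep acting inducing a sense of alienation is...")
print(flag_uncommon_words(stem, COMMON_WORDS))
# -> ['alienation', 'characterized', 'commodification', 'inducing']
```

Flagged words are only candidates for substitution: disciplinary terms such as "alienation" or "commodification" that students are expected to know should be kept, while incidental words such as "characterized" or "inducing" might be replaced with more common synonyms.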
Linguistic Intervention #3: Making Cultural References Explicit
Instructors often use cultural references to explain course concepts in an effort to engage students; for international students, however, these references can be an additional obstacle to decode and interpret (Lee 1997). Furthermore, these references often come in the form of unfamiliar, context-dependent words (Hsu and Yang 2013). This confusion can lead international students to believe their language skills are lacking, rather than recognizing that the confusion stems from a lack of contextual knowledge (Andrade 2006), which may prevent them from asking for clarification about cultural references. To make the MCQs more accessible, we removed references to specific cultural knowledge that was not pertinent to the content.
3. Can you discuss the potential for negative cognitive burden for English as an additional language (EAL) students, compared to English as a native language (ENL) students, when being tested with multiple-choice questions?
Multiple-choice questions (MCQs) are widely used in large introductory courses. Yet access to the norms of academic discourse embedded in MCQs differs between groups of first-year students. During their initial year at university, students are socialized into academic discourse norms alongside course content. These norms are part of the institutionalized cultural symbols that maintain and reproduce social and cultural exclusion (Lamont and Lareau 1988:156). MCQs are an example of what Bourdieu (1984, 1991) refers to as cultural capital: different groups of students bring different capacities to decode MCQs. Some students arrive at university with language that is more attuned to academic discourse based on their previous experiences, particularly the ways language was used in their home (Lamont and Lareau 1988; Lareau 2011), whereas others have to adapt to a new set of linguistic codes. As a result, students (and their parents) are not positioned in the same way in relation to academic norms, potentially impeding their “ability to conform to institutionalized expectations” (Lareau and Weininger 2003:588).
While our results suggest that unpacking increases all students’ ability to demonstrate their knowledge by answering more MCQs correctly, they also indicate that international EAL students are more likely to benefit from reduced MCQ complexity. These differences in student performance are supported by previous research suggesting that language use in assessments that is above the proficiency level of learners can pose a cognitive burden and lead to lower student scores, specifically as a result of the linguistic complexity of test items (Abedi et al. 2004; Abedi and Lord 2001; Parkes and Zimmaro 2016). At the same time, this research suggests that reducing the linguistic complexity of test items improves student performance, helping to decrease the performance gap between EAL and non-EAL students.
With international enrolment rising across institutions of higher education in English-speaking countries, there is a great need to support multilingual learners. UBC is no exception: international students constitute 27 percent of the student body (University of British Columbia 2021). Many instructors assume that MCQs validly measure content mastery; however, that validity may be compromised by complex academic language that is difficult for students to comprehend, particularly first-year EAL students. Since these students are unlikely to have had extensive experience navigating the norms of academic discourse, revised and clarified MCQs can improve EAL learners’ understanding of test questions without compromising disciplinary content.
4. What are some ways that instructors can make multiple-choice assessments more linguistically accessible for EAL students?
Instructors tend already to be adept at helping students unpack and understand difficult material taught in class (Matruglio, Maton, and Martin 2013). They often explain abstract concepts in more accessible language and provide concrete examples that are easier to apply, which facilitates comprehension and learning of course concepts.
Given this natural attention to using more unpacked forms of language within the classroom, we suggest that unpacking the complex language used in MCQs (through the three linguistic interventions mentioned above) can make the meaning of a question more accessible to all students, so they can better understand the question and thus better demonstrate their content knowledge. To reduce cognitive burden in MCQ exams, we echo Parkes and Zimmaro’s (2016) call for language use in exams to be appropriate for the academic level of the students, since an exam would be an unfair assessment of content knowledge if its reading level were beyond the students’ proficiency.
Here are a few strategies you might consider:
- Consider the vocabulary words used in the stem of the question. Try to identify complex words that have many syllables. If these are technical terms that you expect students to know based on what they’ve learned in your course, then they should not be changed. However, if these complex words have not been explicitly taught to students in your course, try to replace these words with more familiar synonyms.
- Run your questions through a readability calculator. This will give you a sense of the general difficulty of the language and help identify specific words that may be more challenging for students. Many free readability calculators are available online; for a sense of what they compute, see the sketch after this list.
- Try to avoid writing questions that begin with long, complex noun groups, as this creates long sentences, which can be more difficult to understand than shorter ones. Compare another packed question stem below, with all of the information packed into one sentence, to the unpacked version:
- Packed: The enzyme that synthesizes short sequences of RNA during DNA replication and that is necessary for DNA synthesis to begin is…
- Unpacked: During DNA replication, a short segment of RNA must form. This RNA segment is the beginning of DNA synthesis. What enzyme creates these RNA segments?
- Finally, think critically about any analogies you might use. While these may help some students understand a question, consider whether any of them depend on specific cultural knowledge (e.g., sports team analogies, films, or books).
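To make the readability-calculator suggestion concrete, here is a minimal sketch of the kind of measure those tools compute: the Flesch-Kincaid grade-level formula, paired with a rough vowel-group syllable heuristic. The function names and the heuristic are our own illustration; dedicated readability tools handle punctuation, abbreviations, and syllable edge cases far more robustly.

```python
# Sketch of what a readability calculator computes: the Flesch-Kincaid
# grade-level formula, using a rough vowel-group heuristic for syllables.

import re


def count_syllables(word: str) -> int:
    """Approximate syllables as the number of vowel groups (minimum one)."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))


def flesch_kincaid_grade(text: str) -> float:
    """Estimate the U.S. school grade level needed to read the text."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)


packed = ("The form of emotion management characterized by the commodification "
          "of people's deep acting inducing a sense of alienation is...")
unpacked = ("There are different forms of emotion management. In one form, "
            "people's deep acting is commodified. This deep acting causes "
            "people to feel alienated. What is the name of this form of "
            "emotion management?")

print(f"Packed stem:   grade {flesch_kincaid_grade(packed):.1f}")
print(f"Unpacked stem: grade {flesch_kincaid_grade(unpacked):.1f}")
```

Run on the packed and unpacked sociology stems from earlier in this piece, the single long packed sentence scores several grade levels higher than the unpacked version, which is exactly the kind of gap the interventions above are meant to close.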
Click here to read the full study.
Written by Kelsea Franzke