College Composition Weekly: Summaries of research for college writing professionals

Read, Comment On, and Share News of the Latest from the Rhetoric and Composition Journals


Abba et al. Students’ Metaknowledge about Writing. J of Writing Res., 2018. Posted 09/28/2018.

Abba, Katherine A., Shuai (Steven) Zhang, and R. Malatesha Joshi. “Community College Writers’ Metaknowledge of Effective Writing.” Journal of Writing Research 10.1 (2018): 85-105. Web. 19 Sept. 2018.

Katherine A. Abba, Shuai (Steven) Zhang, and R. Malatesha Joshi report on a study of students’ metaknowledge about effective writing. They recruited 249 community-college students taking courses in Child Development and Teacher Education at an institution in the southwestern U.S. (89).

All students provided data for the first research question, “What is community-college students’ metaknowledge regarding effective writing?” The researchers used data only from students whose first language was English for their second and third research questions, which investigated “common patterns of metaknowledge” and whether classifying students’ responses into different groups would reveal correlations between the focus of the metaknowledge and the quality of the students’ writing. The authors state that limiting analysis to this subgroup would eliminate the confounding effect of language interference (89).

Abba et al. define metaknowledge as “awareness of one’s cognitive processes, such as prioritizing and executing tasks” (86), and review extensive research dating to the 1970s on how this concept has been articulated and developed. They state that the literature supports the conclusion that “college students’ metacognitive knowledge, particularly substantive procedures, as well as their beliefs about writing, have distinctly impacted their writing” (88).

The authors argue that their study is one of few to focus on community college students; further, it addresses the impact of metaknowledge on the quality of student writing samples via the “Coh-Metrix” analysis tool (89).

Students participating in the study were provided with writing prompts at the start of the semester during an in-class, one-hour session. In addition to completing the samples, students filled out a short biographical survey and responded to two open-ended questions:

What do effective writers do when they write?

Suppose you were the teacher of this class today and a student asked you “What is effective writing?” What would you tell that student about effective writing? (90)

Student responses were coded in terms of “idea units which are specific unique ideas within each student’s response” (90). The authors give examples of how units were recognized and selected. Abba et al. divided the data into “Procedural Knowledge,” or “the knowledge necessary to carry out the procedure or process of writing,” and “Declarative Knowledge,” or statements about “the characteristics of effective writing” (89). Within the categories, responses were coded as addressing “substantive procedures” having to do with the process itself and “production procedures,” relating to the “form of writing,” e.g., spelling and grammar (89).

Analysis for the first research question regarding general knowledge in the full cohort revealed that most responses about Procedural Knowledge addressed “substantive” rather than “production” issues (98). Students’ Procedural Knowledge focused on “Writing/Drafting,” with “Goal Setting/Planning” in second place (93, 98). Frequencies indicated that while revision was “somewhat important,” it was not as central to students’ knowledge as indicated in scholarship on the writing process, such as that of John Hayes and Linda Flower and that of M. Scardamalia and C. Bereiter (96).

Analysis of Declarative Knowledge for the full-cohort question showed that students saw “Clarity and Focus” and “Audience” as important characteristics of effective writing (98). Grammar and Spelling, the “production” features, figured more prominently here than they had in Procedural Knowledge. The authors posit that students were drawing on their awareness of the importance of a polished finished product for grading (98). Overall, data for the first research question matched that of previous scholarship on students’ metaknowledge of effective writing, which shows some concern with the finished product and a possibly “insufficient” focus on revision (98).

To address the second and third questions, about “common patterns” in student knowledge and the impact of a particular focus of knowledge on writing performance, students whose first language was English were divided into three “classes” in both Procedural and Declarative Knowledge based on their responses. Classes in Procedural Knowledge were a “Writing/Drafting oriented group,” a “Purpose-oriented group,” and the largest, a “Plan and Review oriented group” (99). Responses regarding Declarative Knowledge resulted in a “Plan and Review” group, a “Time and Clarity oriented group,” and the largest, an “Audience oriented group.” One hundred twenty-three of the 146 students in the cohort belonged to this group. The authors note the importance of attention to audience in the scholarship and the assertion that this focus typifies “older, more experienced writers” (99).

The final question about the impact of metaknowledge on writing quality was addressed through the Coh-Metrix “online automated writing evaluation tool” that assessed variables such as “referential cohesion, lexical diversity, syntactic complexity and pattern density” (100). In addition, Abba et al. used a method designed by A. Bolck, M. A. Croon, and J. A. Hagenaars (“BCH”) to investigate relationships between class membership and writing features (96).

These analyses revealed “no relationship . . . between their patterns [of] knowledge and the chosen Coh-Metrix variables commonly associated with effective writing” (100). The “BCH” analysis revealed only two significant associations among the 15 variables examined (96).

The authors propose that their findings did not align with prior research suggesting the importance of metacognitive knowledge because their methodology did not use human raters and did not factor in students’ beliefs about writing or ask why they responded as they did. Moreover, the authors state that the open-ended questions allowed more varied responses than “pre-established inventor[ies]” would have (100). They maintain that their methods “controlled the measurement errors” better than often-used regression studies (100).

Abba et al. recommend more research with more varied cohorts and collection of interview data that could shed more light on students’ reasons for their responses (100-101). Such data, they indicate, will allow conclusions about how students’ beliefs about writing, such as “whether an ability can be improved,” affect the results (101). Instructors, in their view, can more explicitly address awareness of strategies and effective practices and can use discussion of metaknowledge to correct “misconceptions or misuse of metacognitive strategies” (101):

The challenge for instructors is to ascertain whether students’ metaknowledge about effective writing is accurate and support students as they transfer effective writing metaknowledge to their written work. (101)