College Composition Weekly: Summaries of research for college writing professionals

Read, Comment On, and Share News of the Latest from the Rhetoric and Composition Journals


Bunch, George C. “Metagenres” as an Analytical Tool at Two-Year Colleges. TETYC, Dec. 2019. Posted 02/24/2020.

Bunch, George C. “Preparing the ‘New Mainstream’ for College and Careers: Academic and Professional Metagenres in Community Colleges.” Teaching English in the Two-Year College 47.2 (2019): 168-94. Print.

George C. Bunch, describing himself as a “relative ‘outsider’” who has been studying English learners and the “policies and practices” affecting their experiences as they enter and move on from community colleges (190n1), writes about the need for frameworks that can guide curricular choices for the “New Mainstream,” the students with diverse backgrounds and varied educational preparation who populate community colleges (169). He suggests attention to “metagenres,” a concept advanced by Michael Carter (171) as an “analytical tool” that can provide insights into the practices that will most benefit these students (170).

Bunch contextualizes his exploration of metagenres by reporting pressure, some from policymakers, to move community-college students more quickly through layers of developmental and English-as-a-second-language (ESL) coursework. Such acceleration, Bunch suggests, is meant to allow students to move faster into college-level or disciplinary coursework leading to transfer to four-year colleges or to career paths (168).

Bunch reports a study of ten California community colleges he and his team published in 2011. The study revealed contrasting orientations in approaches to developmental writing students. One endorses a skill-based curriculum in which students acquire “the basics” to function as “building blocks” for later, more advanced coursework (172). The other promotes curriculum leading to “academic pathways” that encourage “opportunities for language and literacy development and support in the context of students’ actual progression toward academic and professional goals” (172). Bunch contends that in neither case did his team find adequate discussions of “the language and literacy demands of academic work beyond ESL, developmental English, and college-level composition courses” (173; emphasis original).

Bunch writes that scholarship on the role of writing instruction as students prepare for specific professional goals follows two divergent trends. One approach assumes that literacy instruction should promote a universal set of “generalist” competencies and that writing teachers’ “professional qualifications and experience” make them best qualified to teach these practices (173). Bunch points to the “Framework for Success in Postsecondary Writing” developed by the Council of Writing Program Administrators, the National Council of Teachers of English, and the National Writing Project, as well as work by Kathleen Blake Yancey, as exemplifying this approach (173-74).

At the same time, he notes, the later “WPA Outcomes Statement” illustrates a focus on the specific rhetorical demands of the disciplines students are likely to take up beyond English, asking, he writes, for “guidance” from disciplinary faculty and hoping for “share[d] responsibility” across campuses as students negotiate more targeted coursework (174). Bunch expresses concern, however, that faculty in the disciplines have “rarely reflected on those [literacy practices] explicitly” and tend to assume that students should master language use prior to entering their fields (174).

Bunch suggests that the concept of metagenres can supply analysis that affords a “grain size” between “macro approaches” that posit a single set of criteria for all writing regardless of its purpose and audience, and a “micro-level” approach that attempts to parse the complex nuances of the many different career options community-college students might pursue (175).

To establish the concept, Carter examined student outcomes at his four-year institution. Defining metagenres as “ways of doing and writing by which individual linguistic acts on the microlevel constitute social formations on the macrolevel” (qtd. in Bunch 176), Carter grouped the courses he studied under four headings:

  • Problem-Solving, most apparent in fields like economics, animal science, business management, and math
  • Empirical Inquiry, which he located in natural and social sciences
  • Research from Sources, visible in the humanities, for example, history
  • Performance, notably in the fine arts but also in writing coursework (176)

Bunch notes that in some cases, the expected definitional boundaries required negotiation: e.g., psychology, though possibly an empirical discipline, fit more closely under problem-solving in the particular program Carter analyzed (176-77).

Bunch offers potential applications at the levels of ESL/developmental/composition coursework, “[w]riting across and within the disciplines,” “[c]ollege-level coursework in other disciplines,” and “[i]nstitution-wide reform” (177-79). For example, writing students might use the metagenre concept to examine and classify the writing they do in their other courses (178), or faculty might open conversations about how students might be able to experience discipline-specific work even while developing their language skills (179). Institutions might reconsider what Thomas Bailey et al. call the “cafeteria model” of course selection and move toward “guided pathways” that define coherent learning goals tied to students’ actual intentions (179).

Bunch and his group considered coursework in nine programs at a “small community college in the San Francisco Bay Area” that is designated a Hispanic-Serving Institution (180). In selecting programs, he looked for a range across both traditional academic areas and career-oriented paths, as well as for coursework that frequently enrolled minority, underprepared, or minority-language students (180-81). Primary data came from course descriptions at both the class and program levels, but Bunch also drew on conversations with members of the college community (180).

He writes that “the notion of metagenres” was “useful for comparing and contrasting the ‘ways of doing’ associated with academic and professional programs” (181). He writes that history, fashion design, and earth science (meteorology and geology) could be classified as “research from sources,” “performance,” and “empirical inquiry,” respectively (182-83). Other courses were more complex in their assignments and outcomes, with allied health exhibiting both problem-solving and empirical inquiry and early childhood education combining performance and problem-solving (183-86).

Bunch states that applying the metagenre concept is limited by the quality of information available as well as the likelihood that it cannot subsume all subdisciplines, and suggests more research, including classroom observation as well as examination of actual student writing (186). He cites other examinations of genre as a means of situating student learning, acknowledging the danger of too narrow a focus on particular genres at the expense of attention to the practices of “individuals who use them” (187). However, in his view, the broader analytical potential of the metagenre frame encourages both conversations among faculty who may not have considered the nuances of their particular literacy demands and attention to writing as part of students’ progression into specific academic and career paths rather than as an isolated early activity (174). He posits that, rather than trying to detail the demands of any given genre as students enter the college environment, institutions might focus on helping students understand and apply the “concept of metagenre” as a way of making sense of the rhetorical situations they might enter (189; emphasis original).

Ultimately, in his view, the concept can aid in

providing more specific guidance than afforded by the kinds of general academic literacy competencies often assigned to the composition profession, yet remaining broader than a focus on the individual oral and written genres of every conceivable subdiscipline and subfield. (189)



Anderst et al. Accelerated Learning at a Community College. TETYC, Sept. 2016. Posted 10/21/2016.

Anderst, Leah, Jennifer Maloy, and Jed Shahar. “Assessing the Accelerated Learning Program Model for Linguistically Diverse Developmental Writing Students.” Teaching English in the Two-Year College 44.1 (2016): 11-31. Web. 07 Oct. 2016.

Leah Anderst, Jennifer Maloy, and Jed Shahar report on the Accelerated Learning Program (ALP) implemented at Queensborough Community College (QCC), a part of the City University of New York system (CUNY) (11) in spring and fall semesters, 2014 (14).

In the ALP model followed at QCC, students who had “placed into remediation” simultaneously took both an “upper-level developmental writing class” and the “credit-bearing first-year writing course” in the two-course first-year curriculum (11). Both courses were taught by the same instructor, who could develop specific curriculum that incorporated program elements designed to encourage the students to see the links between the classes (13).

The authors discuss two “unique” components of their model. First, QCC students are required to take a high-stakes, timed writing test, the CUNY Assessment Test for Writing (CATW), both for placement and to “exit remediation”; passing the test earns students a passing grade for their developmental course (15). Second, the ALP at Queensborough integrated English language learners (ELLs) with native English speakers (14).

Anderst et al. note research showing that in most institutions, English-as-a-second-language (ESL) instruction usually occurs in programs other than English or writing (14). The authors state that as the proportion of second-language learners increases in higher education, “the structure of writing programs often remains static” (15). Research by Shawna Shapiro, they note, indicates that ELL students benefit from “a non-remedial model” (qtd. in Anderst et al. 15), validating the inclusion of ELL students in the ALP at Queensborough.

Anderst et al. review research on the efficacy of ALP. Crediting Peter Adams with originating the ALP concept in 2007 (11), the authors cite Adams’s findings that such programs have had “widespread success” (12), notably in raising “passing rate[s] of basic writing students,” improving retention, and accelerating progress through the first-year curriculum (12). Other research supports the claim that ALP students are more successful in first- and second-semester credit-bearing writing courses than developmental students not involved in such programs, although data on retention are mixed (12).

The authors note research on the drawbacks of high-stakes tests like the required exit-exam at QCC (15-16) but argue that strong student scores on this “non-instructor-based measurement” (26) provided legitimacy for their claims that students benefit from ALPs (16).

The study compared students in the ALP with developmental students not enrolled in the program. English-language learners in the program were compared both with native speakers in the program and with similar ELL students in specialized ESL courses. Students in the ALP classes were compared with the general cohort of students in the credit-bearing course, English 101. Comparisons were based on exit-exam scores and grades (17). Pass rates for the exam were calculated before and after “follow-up workshops” for any developmental student who did not pass the exam on the first attempt (17).

Measured by pass and withdrawal rates, Anderst et al. report, ALP students outperformed students in the regular basic writing course both before and after the workshops, with ELL students in particular succeeding after the follow-up workshops (17-18). They report a fall-semester pass rate of 84.62% for ELL students enrolled in the ALP after the workshop, compared to a pass rate of 43.4% for ELL students not participating in the program (19).

With regard to grades in English 101, the researchers found that for ALP students, the proportion of As was lower than for the course population as a whole (19). However, this difference disappeared “when the ALP cohort’s grades were compared to the non-ALP cohort’s grades with English 101 instructors who taught ALP courses” (19). Anderst et al. argue that comparing grades given to different cohorts by the same instructors is “a clearer measure” of student outcomes (19).

The study also included an online survey students took in the second iteration of the study in fall 2014, once at six weeks and again at fourteen weeks. Responses of students in the college’s “upper-level developmental writing course designed for ESL students” were compared to those of students in the ALP, including ELL students in this cohort (22).

The survey asked about “fit”—whether the course was right for the student—and satisfaction with the developmental course, as well as its value as preparation for the credit-bearing course (22). At six weeks, responses from ALP students to these questions were positive. However, in the later survey, agreement on overall sense of “fit” and the value of the developmental course dropped for the ALP cohort. For students taking the regular ESL course, however, these rates of agreement increased, often by large amounts (23).

Anderst et al. explain these results by positing that at the end of the semester, ALP students, who were concurrently taking English 101, had come to see themselves as “college material” rather than as remedial learners and no longer felt that the developmental course was appropriate for their ability level (25). Students in one class taught by one of the researchers believed that they were “doing just as well, if not better in English 101 as their peers who were not also in the developmental course” (25). The authors consider this shift in ALP students’ perceptions of themselves as capable writers an important argument for ALP and for including ELL students in the program (25).

Anderst et al. note that in some cases their sample was too small for results to reach statistical significance, although the final numbers did permit such evaluation (18). They also note that the ALP students for whom high-school GPAs were available had higher GPAs than the “non-ALP” students (20). The ALP cohort included only students “who had only one remedial need in either reading or writing”; students who placed into developmental levels in both areas found the ALP work “too intensive” (28n1).

The authors recommend soliciting more open-ended responses than they received in order to account more fully for the decrease in satisfaction in the second survey (26). They conclude that “they could view this as a success” because it indicated the shift in students’ views of themselves:

This may be particularly significant for ELLs within ALP because it positions them both institutionally and psychologically as college writers rather than isolating them within an ESL track. (26)