College Composition Weekly: Summaries of research for college writing professionals

Read, Comment On, and Share News of the Latest from the Rhetoric and Composition Journals



Estrem et al. “Reclaiming Writing Placement.” WPA, Fall 2018. Posted 12/10/2018.

Estrem, Heidi, Dawn Shepherd, and Samantha Sturman. “Reclaiming Writing Placement.” Journal of the Council of Writing Program Administrators 42.1 (2018): 56-71. Print.

Heidi Estrem, Dawn Shepherd, and Samantha Sturman urge writing program administrators (WPAs) to deal with long-standing issues surrounding the placement of students into first-year writing courses by exploiting “fissures” (60) created by recent reform movements.

The authors note ongoing efforts by WPAs to move away from using single or even multiple test scores to determine which courses and how much “remediation” will best serve students (61). They particularly highlight “directed self-placement” (DSP), first proposed by Dan Royer and Roger Gilles in a 1998 article in College Composition and Communication (56). Despite efforts at individual institutions to build on DSP by using multiple measures, holistic as well as numerical, the authors write that “for most college students at most colleges and universities, test-based placement has continued” (57).

Estrem et al. locate this pressure to use test scores in the efforts of groups like Complete College America (CCA) and non-profits like the Bill and Melinda Gates Foundation, which “emphasize efficiency, reduced time to degree, and lower costs for students” (58). The authors contrast this “focus on degree attainment” with the field’s concern about “how to best capture and describe student learning” (61).

Despite these different goals, Estrem et al. recognize the problems caused by requiring students to take non-credit-bearing courses that do not address their actual learning needs (59). They urge cooperation, even if it is “uneasy,” with reform groups in order to advance improvements in the kinds of courses available to entering students (58). In their view, the impetus to reduce “remedial” coursework opens the door to advocacy for the kinds of changes writing professionals have long seen as serious solutions. Their article recounts one such effort in Idaho, where a mandate ending remediation as it is usually defined became the occasion to replace it with a more effective placement model (60).

The authors note that CCA calls for several “game changers” in student progress to degree. Among these are the use of more “corequisite” courses, in which students can earn credit for supplemental work, and “multiple measures” (59, 61). Estrem et al. find that calls for these game changers open the door for writing professionals to introduce innovative courses and options, using evidence that they succeed in improving student performance and retention, and to redefine “multiple measures” to include evidence such as portfolio submissions (60-61).

Moreover, Estrem et al. find three ways in which WPAs can respond to specific calls from reform movements in ways that enhance student success. First, they can move to create new placement processes that enable students to pass their first-year courses more consistently, thus responding to concerns about costs to students (62); second, they can provide data on increased retention, which speaks to time to degree; and finally, they can recognize a current “vacuum” in the “placement test market” (62-63). They note that ACT’s Compass is no longer on the market; with fewer choices, institutions may be open to new models. The authors contend that these pressures were not as exigent when directed self-placement was first promoted. The existence of such new contexts, they argue, provides important and possibly short-lived opportunities (63).

The authors note the growing movement to provide college courses to students while they are in high school (62). Despite the existence of this model for lowering the cost and time to degree, Estrem et al. argue that the first-year experience is central to student success in college regardless of students’ level when they enter, and that placing students accurately during this first college exposure can have long-lasting effects (63).

Acknowledging that individual institutions must develop tools that work in their specific contexts, Estrem et al. present “The Write Class,” their new placement tool. The Write Class is “a web application that uses an algorithm to match students with a course based on the information they provide” (64). Students are asked a set of questions, beginning with demographics. A “second phase,” similar to that in Royer and Gilles’s original model, asks for “reflection” on students’ reading and writing habits and attitudes, encouraging, among other results, student “meta-awareness” about their own literacy practices (65).

The third phase provides extensive information about the three credit-bearing courses available to entering students: the regular first-year course in which most students enroll; a version of this course with an additional workshop hour with the instructor in a small group setting; or a second-semester research-based course (64). The authors note that the courses are given generic names, such as “Course A,” to encourage students to choose based on the actual course materials and their self-analysis rather than a desire to get into or dodge specific courses (65).

Finally, students are asked to take into account “the context of their upcoming semester,” including the demands they expect from family and jobs (65). With these data, the program advises students on a “primary and secondary placement,” for some including the option to bypass the research course through test scores and other data (66).
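The article describes The Write Class’s phases and outputs but not its internal logic, so the following sketch is purely illustrative: the StudentResponses fields, the 0-10 scales, and every threshold below are invented stand-ins, not the application’s actual algorithm. It models only the general shape of a questionnaire-driven match that returns a primary and secondary placement among three generically labeled courses.

```python
from dataclasses import dataclass

# The three credit-bearing options described above, under the article's
# generic labels; the descriptions paraphrase the summary, and everything
# else in this sketch is hypothetical.
COURSES = {
    "A": "regular first-year writing course",
    "B": "first-year course with an added small-group workshop hour",
    "C": "second-semester research-based course",
}

@dataclass
class StudentResponses:
    reflection_score: int  # self-assessed reading/writing habits; the 0-10 scale is assumed
    semester_load: int     # expected demands from work and family; the 0-10 scale is assumed
    test_evidence: bool    # whether scores or other data support bypassing into the research course

def place(r: StudentResponses) -> tuple[str, str]:
    """Return a hypothetical (primary, secondary) placement pair; the
    thresholds are placeholders, not The Write Class's real cutoffs."""
    if r.test_evidence and r.reflection_score >= 8:
        return ("C", "A")  # strong evidence: research course, regular course as fallback
    if r.reflection_score >= 5 and r.semester_load <= 6:
        return ("A", "B")  # typical case: regular course, workshop version as fallback
    return ("B", "A")      # heavy outside demands or low confidence: workshop support first

# Example: solid self-reported habits, but a demanding semester outside class
primary, secondary = place(StudentResponses(6, 8, False))
print(f"Primary: {COURSES[primary]}; secondary: {COURSES[secondary]}")
```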

In the authors’ view, the process has a number of additional benefits that contribute to student success. Importantly, they write, the faculty are able to reach students prior to enrollment and orientation rather than find themselves forced to deal with placement issues after classes have started (66). Further, they can “control the content and the messaging that students receive” regarding the writing program and can respond to concerns across campus (67). The process makes it possible to have “meaningful conversation[s]” with students who may be concerned about their placement results; in addition, access to the data provided by the application allows the WPAs to make necessary adjustments (67-68).

Overall, the authors present a student’s encounter with their placement process as “a pedagogical moment” (66), in which the focus moves from “getting things out of the way” to “starting a conversation about college-level work and what it means to be a college student” (68). This shift, they argue, became possible through rhetorically savvy conversations that took advantage of calls for reform; by “demonstrating how [The Write Class process] aligned with this larger conversation,” the authors were able to persuade administrators to adopt the kinds of concrete changes WPAs and writing scholars have long advocated (66).



Klausman et al. TYCA White Paper on Placement Reform. TETYC, Dec. 2016. Posted 01/28/2017.

Klausman, Jeffrey, Christie Toth, Wendy Swyt, Brett Griffiths, Patrick Sullivan, Anthony Warnke, Amy L. Williams, Joanne Giordano, and Leslie Roberts. “TYCA White Paper on Placement Reform.” Teaching English in the Two-Year College 44.2 (2016): 135-57. Web. 19 Jan. 2017.

Jeffrey Klausman, Christie Toth, Wendy Swyt, Brett Griffiths, Patrick Sullivan, Anthony Warnke, Amy L. Williams, Joanne Giordano, and Leslie Roberts, as members of the Two-Year College Association (TYCA) Research Committee, present a White Paper on Placement Reform. They review current scholarship on placement and present two case studies of two-year colleges that have implemented specific placement models: multiple measures to determine readiness for college-level writing and directed self-placement (DSP) (136).

The authors locate their study in a current moment characterized by a “completion agenda,” which takes improving student progress toward graduation as a major goal, with an increased focus on the role of two-year colleges (136-37). This goal has been furthered by faculty-driven initiatives such as Accelerated Learning Programs but has also been taken on by foundations and state legislatures, whose approach to writing instruction may or may not accord with scholarship on best practices (137). All such efforts to ensure student progress work toward “remov[ing] obstacles” that impede completion, “especially ‘under-placement’” in developmental courses (137).

Efforts to improve placement require alternatives to low-cost, widely available, and widely used high-stakes tests such as COMPASS. Such tests have been shown not only to fail to measure the many factors that affect student success but also to discriminate against protected student populations (137). In fact, ACT will no longer offer COMPASS after the 2015-2016 academic year (137).

Such tests, however, remain “the most common placement process currently in use at two-year colleges” (138); such models are used more frequently at two-year institutions than at four-year ones (139). These models, the Committee reports, also often rely on “Automated Writing Evaluation (AWE) software,” or machine scoring (138). Scholarship has noted that “indirect” measures like standardized tests are poor instruments in placement because they are weak predictors of success and because they cannot be aligned to local curricula and often-diverse local populations (138). Pairing such tests with a writing sample scored with AWE limits assessment to mechanical measures and fails to communicate to students what college writing programs value (138-39).

These processes are especially troublesome at community colleges because of the diverse population at such institutions and because of the particular need at such colleges for faculty who understand the local environment to be involved in designing and assessing the placement process (139). The Committee contends further that turning placement decisions over to standardized instruments and machines diminishes the professional authority of community-college faculty (139).

The authors argue that an effective measure of college writing readiness must be based on more than one sample, perhaps a portfolio of different genres; that it must be rated by multiple readers familiar with the curriculum into which the students are to be placed; and that it must be sensitive to the features and needs of the particular student population (140). Two-year institutions may face special challenges because they may not have dedicated writing program administrators and may find themselves reliant on contingent faculty who cannot always engage in the necessary professional development (140-41).

A move to multiple measures would incorporate “‘soft skills’ such as persistence and time management” as well as “situational factors such as financial stability and life challenges” (141). Institutions, however, may resist change because of cost or because of an “institutional culture” uninformed about best practices. In such contexts, the Committee suggests incremental reform, such as considering high-school GPAs or learning-skills inventories (142).

The case study of a multiple-measures model, which examines Highline Community College in Washington state, reports that faculty were able to overcome institutional resistance by collecting data that confirmed findings from the Community College Research Center (CCRC) at Columbia University showing that placement into developmental courses impeded completion of college-level courses (142). Faculty were able to draw on high-school portfolios, GED scores, and scores on other assessment instruments without substantially increasing costs. The expense of a dedicated placement advisor was offset by measurable student success (143).

The Committee presents Directed Self-Placement (DSP), based on a 1998 article by Daniel Royer and Roger Gilles, as “a principle rather than a specific procedure or instrument”: the concept recognizes that well-informed students can make adequate educational choices (143). The authors note many benefits from DSP: increases in student agency, which encourages responsibility and motivation; better attitudes that enhance the learning environment; and especially an opportunity for students to begin to understand what college writing will entail. Further program benefits include opportunities for faculty to reflect on their curricula (144).

Though “[e]mpirical evidence” is “promising,” the authors find only two studies that specifically address DSP models in use at community colleges. These studies note the “unique considerations” confronting open-admissions institutions, such as “limited resources,” diverse student bodies, and prescriptive state mandates (145).

The case study of Mid Michigan Community College, which implemented DSP in 2002, details how the college drew on some of the many options available for a DSP model, including two different reading scores, sample assignments from the three course options, an online survey about students’ own writing and reading backgrounds, and an advisor consultation (146). Completion results improved substantially without major cost increases. The college is now addressing the effects of a growth surge as well as the need for the model to accommodate students with dual-enrollment credits (146-47).

Other possible reforms at some institutions include “first-week ‘diagnostic’ assignments”; “differentiated instruction,” which allows students with some degree of college readiness to complete the capstone project in the credit-bearing course; and various options for “challenging” placement, such as submission of portfolios (147-48). The authors caution that students who already understand the institutional culture—possibly “white, middle-class, traditional-aged students”—are the ones most likely to self-advocate through placement challenges (148).

The Committee reports that the nontraditional students and veterans in many two-year-college populations often do not score well on standardized tests and need measures that capture other factors that predict college success, such as “life experiences” (148). Similarly, the diverse two-year student population is best served by measures that recognize students’ abilities to “shuttle among a wealth of languages, linguistic resources, and modalities,” rather than tests that may well over-place students whose strength is grammar knowledge, such as some international students (149).

The Committee recommends that placement procedures should

  • be grounded in disciplinary knowledge.

  • be developed by local faculty who are supported professionally.

  • be sensitive to effects on diverse student populations.

  • be assessed and validated locally.

  • be integrated into campus-wide efforts to improve student success. (150-51)

The White Paper provides a substantial Works Cited list as a resource for placement reform.