Geisler, Cheryl. “Current and Emerging Methods in the Rhetorical Analysis of Texts. Opening: Toward an Integrated Approach.” Journal of Writing Research 7.3 (2016): 417-24. Web. 08 May 2016.
Cheryl Geisler introduces a special section of the Journal of Writing Research focusing on the use of various digital tools to analyze texts. Noting the “rise of digital humanities,” which involves making use of the options software provides for “all sorts of rhetorical purposes,” Geisler and the authors of the articles in the special section ask two related questions: “How can we best understand the costs and benefits of adopting a particular approach? Are they simply alternatives or can they be integrated?” (418).
To experiment with different approaches, the authors of the special-section articles all worked with the same texts, a set of documents “produced by eight pairs of PhD advisors and their advisees” across the disciplines of Computer Science, Chemical Engineering, Materials Science Engineering, and Humanities and Social Sciences (418). This body of texts had been collected for a larger interview-based study of academic citation practices and source use conducted by one of the special-section authors, A. Karatsolis. For the special-section studies, Karatsolis’s coding was provided for half of the documents, and his “coding schemes” were provided for all of them.
Geisler’s overview of the status of digital text analysis draws on the categories of I. Pollach, who proposed three types of analysis; to those categories, Geisler adds two more, hand-coding and text mining. Geisler discusses
- Hand-coding, in which human readers assign text elements to categories developed in a coding scheme;
- Computer-aided content analysis, which draws on “content dictionaries” to “map words and phrases onto content categories”;
- Computer-aided interpretive textual analysis, a.k.a. computer-assisted qualitative data analysis (CAQDAS), which aids human analysts in efforts to “manage, retrieve, code, and link data”;
- Corpus linguistics, which searches texts for “words or terms that co-occur more often that [sic] would be expected by chance”; and
- Text mining, which finds features pre-selected by humans. (419)
Geisler explores various current uses of each process and includes a list of software that combines qualitative and quantitative analysis (420-21). Her examples suggest that approaches like hand-coding and corpus linguistics are often combined with digital approaches. For example, one study used a “concordance tool (AntConc)” to search teacher comments for traces of a “program-wide rubric” (421).
Discussing the possibility of an integrated approach, Geisler summarizes three examples. The first, from Helsinki University of Technology, combined “text-mining techniques with qualitative approaches” (421). A second, from 2011, is referred to as the KWALON Experiment. In this project, as in the study reported in the JOWR special section, multiple researchers examined the same body of texts, in this case a very large data set (421-22). Only one researcher was able to analyze the entire set, an outcome Geisler posits may have resulted from the use of a digital concordance tool to select the texts before the researcher hand-coded them (422).
In the third example of integrated approaches, researchers from the University of Leipzig developed “Blended Reading,” in which digital tools help readers designate appropriate texts; expert human readers use “snippets” from the “most relevant” of these documents to “manually annotate” texts; and finally, these annotations contribute to “automatic detection” over “multiple iterations” to refine the process. The resulting tool can then be applied to the entire corpus. According to Geisler, “[w]hat is intriguing” about this example “is that it seems to combine high quality hand coding with automatic methods” (422).
Geisler offers the articles in the special section as a study of how “a choice of analytic methods both invites and constrains” rhetorical examination of texts (423).