Reflection by Paul Fyfe

Without explicitly saying so, Reductive Reading may seek nothing less than to reconcile the two most conspicuous developments in Victorian studies in the last decade: computational methods on the one hand and the provocations of the V21 Collective on the other. Arguably, these have stood opposed: the V21 manifesto lists two separate objections to the “accumulation of mere information.” While “information” is not synonymous with data, data is at least its subtext. Allison’s book teaches that reductiveness, heard in the adjective “mere,” may instead clear space for considering complexity (34). Reductive Reading makes that case for quantitative research methods as well as for critical discourse about ethics and the novel.

At the start, the book focuses entirely on justifying its methodology, including computational text analysis as well as stylistic approaches to prose and ethics. Each reduces its objects to features. Computational research must settle on measurable features of texts, and those flattened features may challenge the researcher’s sense of where, how, and at what scale meaning is made. Questions of ethics in novels frequently involve consideration of plot, representations, or reader-character or reader-narrator relations. Yet it is possible to reduce that question, too, into syntactical features and to study how their larger contours might shape a novel’s moral project. Reductive Reading uses the first in service of the second, moving from quantitative analysis into literary criticism engaged with ethics, genre, and reading.

Nominations of “[adjective] reading” have become increasingly familiar on the critical landscape. But Reductive Reading does not privilege close, distant, surface, zoomable, suspicious, reparative, its own eponymous reading, or any other method. Ultimately, the book is both original and surprisingly uncontroversial. It argues that novels construct ethical relationships through patterns of style. Thus, “moralizing” in the Victorian novel is less outwardly prescriptive—which the Victorians agreed made for bad art anyway—and more experiential at the level of syntax. As the book explores the “syntactic templates for critical moral judgment” in Victorian texts, it also leaves digital debates behind (11). And as the book concludes, it explains its scholarly contributions without reference to computers at all.

Those debates have recently flared again over “computational literary studies.”[1] But Reductive Reading instead pursues what Stephen Ramsay called “algorithmic criticism”—using computational methods to carry forward a long tradition of hermeneutical reading.[2] Indeed, the case studies in Reductive Reading engage well-defined critical conversations about the novel as a form, as the object of criticism, and as a textual phenomenon within broader tissues of genre and language. By the end, I wondered to what degree the book needed its elaborate opening defense. (A version of the book’s excellent chapter on Eliot previously appeared as an article in ELH: it makes no mention of computation, data, spreadsheets, or the digital.[3]) Of course a book should evolve as it proceeds, but the contrast of Reductive Reading’s beginning and ending left me curious: to what degree are quantitative approaches just part of the ways we argue now?

[1] See the article by Nan Z. Da and the accompanying forum of responses: “Computational Literary Studies: A Critical Inquiry Online Forum,” In the Moment (blog), March 31, 2019,

[2] Stephen Ramsay, “Algorithmic Criticism,” in A Companion to Digital Literary Studies, ed. Ray Siemens and Susan Schreibman (Oxford: Blackwell, 2004),

[3] Sarah Allison, “Discerning Syntax: George Eliot’s Relative Clauses,” ELH 81, no. 4 (2014): 1275-97.