Can humane literacy survive without a grand narrative?
The use of letters is the principal circumstance that distinguishes a civilized people from a herd of savages incapable of knowledge or reflection.—Gibbon
Think what punishment shall come upon us on account of this world, when we have not ourselves loved [the gift of literacy] in the least degree, or enabled others to do so.—King Alfred
I want to begin by recounting a story from the earliest days of literacy in the English-speaking world. It was the end of the sixth century. The flower of Roman antiquity had wilted and then been blasted into fragments by corruption from within and barbarian invasions from without. But Christian missionaries were still being sent out, bearing their book into dark and brooding lands such as that of the islanded Angles, Jutes and dreaded Saxons. The effect of eventual Christianization on the savage Germanic warriors who ruled what is now England was, despite resistance, remarkable in its amelioration of their savagery and its gradual replacement by a love of learning and desire for God. This was evident not only among monks like the Venerable Bede, that signal historian of early Britain from whose pages comes the account I wish here to relate, but among warrior kings turned from bloodshed to the pursuit of books and public wisdom, such as King Alfred, justly called Alfred the Great.
The account Bede gives of the conversion of King Edwin of Deira (roughly present-day York and Northumberland) tells how the king, whose wife had already become a Christian, sought the advice of “friends, princes and counselors” concerning the question of his own possible openness to the new faith. Among the speakers, Coifi, chief among the pagan priests, offered the pragmatic observation that there seemed to be “no truth and no usefulness in the old religion,” so that if the new faith proved “stronger medicine,” they should take to it. But the next speaker, an unnamed counsellor, was concerned rather about wisdom in a deeper sense. In his speech he casts his counsel in the form of metaphor:
Such it seems to me, dear Sovereign, that the life of men present here in earth (for the comparison of our uncertain time, and days to live) is as if a sparrow beaten with wind and weather should chance to fly in at one window of the parlour, and flitting there a little about, straightway fly out at another, while your grace is at dinner in the presence of your dukes, lords, captains, and high guard. The parlour itself being then pleasant, and warm with a soft fire burning amidst thereof, but all places, and ways abroad troubled with tempest, raging storms, winter winds, hail, and snow. Now your grace considereth, that this sparrow while it was within the house felt no smart of tempestuous wind or rain. But after the short space of this fair weather and warm air the poor bird escapeth your sight, and returneth from winter to winter again. So the life of man appeareth here in earth, and is to be seen for a season: but what may, or shall follow the same, or what hath gone before it, that surely know we not. Therefore if this new learning can inform us of any better surety, methinks it is worthy to be followed.1
This second counsel proved decisive. In its proposed matriculation from pagan pragmatism toward the wisdom of the Book it is exemplary of that attraction to a more explanatory human story, a grand narrative by means of which English-speaking tribes began to be transformed from a severe proto-Darwinian servitude to fate and the strongest man toward order, choice and the consensual rule of common law. This transformation, however simply it appeared to come at first, in fact developed slowly over a long period of time. But in our time, 1300 years later, it may much more quickly be coming undone.
My short answer to the question in my title “Can Humane Literacy Survive without a Grand Narrative?,” let me admit it straightaway, is “probably not.” I am not unaware that in our cultural context a wide-ranging effort has been mounted to promote contrary views—at the very least to celebrate the decentralized cultivation of what American philosopher Richard Rorty calls “incommensurable discourses.” My doubtfulness might seem in this light lamentably unfashionable, if not in outright poor taste. After all, ours is the age of the triumphant polymorphous, the multicultural mosaic, the omniversity—an age, in short, where even the idea of a commonly possessed community story has been made to seem an overbearing as well as anachronistic imposition. Indeed, the term “grand narrative,” signifying for many such outdated concepts as “foundational myth,” “metaphoricity” and “canon,” has been generated and deployed as a term of opprobrium.
What then do I mean by calling this current convention into question? Well, what I mean to suggest is that if we still wish to preserve humane culture we should move our focus, for a time, away from theory and other substitutes for empirical evidence to consider a hugely pertinent and yet regularly neglected factor in the demise of the influence of any grand narrative upon our general culture. As a percentage of the total, fewer and fewer of our citizens are in any meaningful sense—that is, any humane sense—literate. In the academic trenches, as the mountains of first-year essays beside our desks grow to gargantuan proportions and we have begun timidly to look our fate, so to speak, in the laser-printed eye, we are all of us aware that the level of even basic literacy possessed by many of our students is often insufficient for them either to think their way through the texts they read or to write their way through commensurable levels of interaction in an essay. We who teach did not need the now numerous literacy polls to tell us that one university graduate in eight cannot read a straightforward newspaper editorial and say in two or three sentences what it is about, despite 16 or 17 years of formal schooling. Who is not aware of the downward pressure upon the language as well as the content of seminar and lecture, the attenuation of reading lists, the levelling of performance and the compensatory inflation of grades? All these, and more, are palpable evidences that the reach of humane literacy into the minds and hearts of our contemporaries doth oft indeed exceed their grasp. Employers berate the university for the lack of basic literacy in our graduates, international testing has long since ceased to favor the products of our schools and even lackadaisical government bureaucrats have begun to complain vociferously. Defenders of the status quo, who like to protest that today’s students are brighter and more articulate than ever before, and that all the talk about literacy levels is hyperinflated by sinister agents of a neoconservative agenda, have, frankly, ceased to be convincing, even to themselves.2
Not everything in this unfortunate state of affairs, let us hasten to agree, is the fault of the schools. Humanists are among those only too happy to be able to point convincingly to other pressures against literacy: is it our fault that only about two percent of North Americans any longer regularly read books? The American novelist Walker Percy, who at least as much as the newspaper chains and beleaguered professors of humanities had a considerable vested interest in the literacy problem, imagined that the
diagnosis of this state of affairs by men of letters might run something like this: literacy in America has declined for a variety of reasons—bad schools, decay of the family, most of all, the six or seven hours of daily TV. This decline of literacy is accompanied by a rise in philistinism in America: a preference for the skillfully marketed and packaged product for the consumption of the mass man—the Top Ten on TV, NFL telecasts with the quite well-done Miller Lite and Mean Joe Green commercials—plus a few big commercial novels, whether the Harold Robbins novel in which sex figures second only to money, the Barbara Cartland novel in which sex becomes something called romance, or the Judy Blume novel in which teenagers are introduced to sex like Tarzan and Jane.3
Touché: his summary will almost do, as far as it goes. (Alas, he died before they could get him “on-line.” Or perhaps he might have said “in line.”)
My own purpose is not to go much further over this somewhat muddy ground. Relativizing so-called “child-centred” primary and secondary education, shot through as it is with a fuzzy and self-justifying 1960s southern California metaphysics, has taken its toll among many we would not wish to think of as philistines. I am more concerned with philistinism of an analogous and, for us, still more immediate sort. One register of this philistinism has been almost obsessively discussed, but too little constructively engaged. I refer to that dissociation between common sense and educational purposes so successfully effected on our campuses by the overt rhetorical conflict over what has come to be called “political correctness.” Most will agree with me, I think, that a slightly less charged term covers more fully the ethos of constraint or repression of its own intellectual tradition in the contemporary university. The term “postmodernism” was first given currency in American academic argot by a literary critic, Leslie Fiedler.4 In the view of another and currently more fashionable culture theorist, philosopher Jean-François Lyotard, we are to understand that postmodernism refers above all to a shift away from traditional theories of knowledge and the knower, particularly such theories of knowledge as imply a philosophy of history—or world view—as a means of legitimating that knowledge. Lyotard’s simplest definition of postmodernism is “incredulity toward metanarratives.”5 In addition to apparent “post-Christian” (or post-Jewish) presuppositions, this incredulity entails a loss of confidence in any modern theory of “progress,” most particularly, any narrative explanation of our educational efforts that portrays them as leading to some type of emancipation for those who labor to learn. In the encapsulation of Anthony Giddens, a British sociologist:
The condition of postmodernity is distinguished by an evaporating of the “grand narrative”—the overarching “story line” by means of which we are placed in history as beings having a definite past and “predictable future.”6
In this sense, it is surely the case that, as Lyotard, Giddens and Charles Taylor among others have argued, the assiduous intellectual cultivation of postmodernism is effectively a phase of acceleration for what we have long called “modernism,” that is, an accumulation of modernism’s full momentum, a time in which “the consequences of modernity are becoming more radicalized and universalized than before.”7 One of these consequences is the fuller politicization of educational culture: from John Dewey’s notion that the purpose of public education was socialization, its goal the “final pooled intelligence” of the mass mind, many educators have come at last to construe the task of the university as the political institution of a prevalent sociology of knowledge in the bureaucratic class. It now appears that some bureaucrats feel this “institution” requires political coercion rather than the exchange of free debate to succeed. For Lyotard, for example, the postmodern cyberspace era will ultimately require of the university’s learners a stern and severe context of constraint. He advised his 1979 audience of Quebec university administrators that their system decisions “do not have to respect individuals’ aspirations”: the aspirations have to aspire to the decisions, or at least to their effects. Administration procedures should make individuals “want what the system needs in order to perform well.” He assured them, further: “It cannot be denied that there is persuasive force in the idea that context control and domination are inherently better than their absence” and, ominously, that indeed the coming system “can count severity among its advantages.”8 At the level of theoretical discussion in the humanities, this technologically supported liquidation of nostalgia for a shared explanatory context, for vestigial impulses toward the “grand narrative,” has the effect of increasing, not reducing, pressures to conformity. Persistent grand narratives may prove an unwelcome source of constraint for movements toward still greater constraint.
In our universities themselves one sees this development in authoritarian pronouncements like that of Barbara Johnson, Professor of English at Harvard, who says that “professors should have less freedom of expression than writers and artists, because professors are supposed to be creating a better community.”9 More disturbingly, it has taken the form of bureaucratic edicts, such as the former (NDP) Ontario government’s “Framework Regarding Prevention of Harassment and Discrimination in Ontario Universities” (1993), with its “zero tolerance” speech code. This “Framework” was successfully resisted by professional associations and the professoriate. Yet within three years university administrators themselves were asking for still harsher constraints on professors’ freedom of speech. These include sanctions against professors publishing in any forum other than the professional organs of their narrow discipline.10 We are discovering that the modernist impulse, which begins in the promise of greater personal liberty, when pressed to its full consequences can suddenly seem to revoke that original promise. In social terms at least, this is one apparent meaning of “postmodernism.”
At another level of educational experience, however, as instanced notably by enthusiastic promotion of the Internet, there is a world of information now available at a whim, suggesting to the contrary a kind of armchair emancipation. Cyberspace can be made to seem the most promising environment for an entertainment-driven, variety-hungry postmodern academic. In the secret confines of our office or study carrel, we enter into a consolatory, therapeutic experience of apparent mastery over all the information in the universe without significant limits to either appetite or expression. In striking contrast to the bureaucratic and administrative systems with which we struggle, the World Wide Web is spun, it would seem, by virtually invisible spiders. While our actual social and educational environments are being more and more compressed by severe speech codes and by a narrowing of other political and judicial freedoms, we are entertainingly diverted as perhaps never before by the silky cocoon world of cyberspace, in which we are given an antidotal, perhaps onanistic, and almost certainly opiate illusion.
This illusion conforms rather well to notions fostered by various expressions of “value-free” education. As Dostoevsky once said would happen in civic morality upon proclamation of the death of God, so too in our palaces of wisdom upon declaration of the death of reason: “Everything is now permitted.” Of itself, increased information produces little in the way of wisdom, and often less in the way of clarity. In some spheres of the printed word, textual interpretation or even historical recollection, for example, there is now deemed to be no such thing as a “wrong answer.” Normative grammar and lexical meaning are, after all, merely the imposition of patriarchy. Pursuit of the “grammatically correct” has in some jurisdictions almost become “politically incorrect,” and that with the blessing of some among my own colleagues. Likewise with logic: for Richard Rorty, to cite an infamous example, what we call “common sense” is “nothing more than a disposition to use the language of our ancestors, to worship the corpses of their metaphors,”11 and so by definition retrograde.
For Rorty, the exponential spread of talk at cross-purposes is not in any sense problematic. Quite the contrary, it is a growth industry to whose uneasy conscience he and his confrères seek to provide their own brand of occupational therapy. The therapy of choice is self-involved “conversation,” a directionless but diverting dance of dialogue among interlocutors whose starting points—and probably ending points—are assumed to be “incommensurable.” In an era of pluralistic and multicultural societies and in which, moreover, there has been a resurgence of tribalism despite (perhaps to some degree because of) the global economy, we can see why therapies for institutionalized incommensurability, even if they should prove to be merely palliative to the inmates of senescent institutions, are attractive. But what, university students may well ask, is the educational value of such an odd, even self-contradictory notion of “conversation”? (Why, when they could be surfing the Net, buzzing the Web or boobing the tube, should they persist with our incommensurable Babel-Tower lingo? Surely a picture, any picture, is worth a thousand empty words.)
Yet the therapeutic and palliative aspect of “conversation” in the university as Rorty sees it makes “self-edification” rather than arrival at truth or even common understanding the goal of participants in conversation, and requires of us, its apprentices, only that we abandon forever “believing that we know ourselves by knowing a set of objective facts” and “thinking that we possess a deep, hidden, metaphysically significant nature which makes us irreducibly different from inkwells or atoms.”12 A problem with such a notion of “self-edification” is that it is non-generative: all too quickly it becomes a prescription for self-defeat as well as for an effective constraint of other selves. What anybody thinks or says, in effect, is not at last really very important. The limitation of Rorty’s notion of “conversation” for large community enterprises—nation building, let us say, or forging some sort of national standards in education—will be fairly apparent.
It may be useful to remind ourselves of the specific bête noire, or perhaps we should say “corpse of ideas,” against which Rorty’s notion of conversation has been set to work. This, too, may be done succinctly, by recollection of a classic defence of the humanities—“the Great Conversation,” he called it—made by Robert M. Hutchins while he was still chancellor of the University of Chicago. In this lecture, Hutchins argued what to many of our contemporaries must now seem an offensive thesis, namely that “the Civilization of the Dialogue is the only civilization worth having.” For Hutchins too, conversation was crucial, but clearly his was a very different notion of conversation:
An educational institution should be a community. A community must have a common aim, and the common aim of the educational community is the truth. It is not necessary that the members of the educational community agree with one another. It is necessary that they communicate with one another, for the basis of community is communication. In order to communicate with one another, the members of the community must understand one another, and this means that they must have a common language and common stock of ideas. Any system of education that is based on the training of individual differences is fraudulent in this sense. The primary object of education should be to bring out our common humanity. For though men are different, they are also the same, and their common humanity, rather than their individual differences, requires development today as at no earlier era in history.13
Hutchins’ “Great Conversation” evidently has not only a synchronic dimension but a diachronic one as well: it is a conversation in this present time against and across time; in short, it is a paradigm example of “grand narrative” educational thinking.
It is precisely this granting of a measure of authority to the past (that is, to teaching from the past), ceding a possible pertinence of voices from history or tradition to our present understanding, that is most to be rejected according to postmodern strategists. Why has this become an agenda item of such evident urgency? Humanists, presumably, can guess at some possible answers. To begin with, it is evident that the “loss of grand narrative” offers an attractive means of justifying exemption from one critical sphere of accountability. Rationalizations—often overstated—for the virtues of this “loss” or jettisoning of a common story are often therefore “theoretical” masks for simpler and more candid declarations, such as Sartre’s “I create myself” or William Blake’s famous response to tradition: “I must create my own system or be enslaved by that of another man.”
Refusing one’s obligation to the past can take a variety of forms of hubristic ingratitude. If I may recollect a classic text from English literature of the Renaissance: one of the humorous preludes to the damnation of Marlowe’s professor Doctor Faustus occurs at the theatrical diversion put on for his benefit by Lucifer and Mephistopheles, a dance of the Seven Deadly Sins—to Faustus the laughably outworn form of an archaic ethical analysis. The first dancer is Pride, whose brazen self-declaration to the besotted Faustus is: “I disdain to own any parents.” This, of course, quite precisely mirrors the professor’s own hubris. Yet if we jump forward to nineteenth-century America, we must acknowledge an evident sea change concerning hubris. What Marlowe regards as a self-deluding vice sufficient to lead his Wittenberg professor to perdition, Professor Ralph Waldo Emerson, following Blake, makes a virtue of “Self-Reliance” in his famous essay of that name: “History is an impertinence and an injury,” he declares inter alia, “if it be anything more than a cheerful apologue or parable of my being and becoming.”14 A more acute self-consciousness about this stance is found in the famous quip by Oscar Wilde: “The one duty we have to history is to rewrite it.”15 It may be the ironic Wilde who has really grasped the nettle that pricks most deftly. History, like myth, tends when rejected to require replacement by something more unequivocally and unambiguously self-justifying—or “therapeutic.” So the expedient proposal is to rewrite it so as to accommodate these frankly self-centring purposes.
Let me consider briefly just one academic example: the sustained connection in North American Marxist versions of postmodern theory between the otherwise patently contradictory pursuits of a radically subjective hermeneutics and a radically determinist socialist politics becomes more comprehensible in the light of the rejection by the autonomous ego of parenting, mentoring and tradition.16 Like many another six-figure salaried bogus leftist, Rorty exemplifies the odd contradiction, echoing self-deifying voices from Faustus and Blake to Sartre when he urges that instead of attempting to understand ourselves as part of an intellectual and social tradition we should follow Nietzsche’s example and insistently define the world from the ego out. Such a world, he concedes, with a pragmatic sigh, is likely to prove a lonely place. If we need palliation, he says, we should “seek consolation, at the moment of death, not in having transcended the animal condition, but in being that peculiar sort of dying animal who, by describing himself in his own terms, had created himself.” Loneliness is not such a bad thing if you are an Übermensch. Rorty makes it hard not to think here of the song made famous by America’s most notorious hoodlum singer, Frank Sinatra: “I did it my way”—the theme song of hell, Peter Kreeft has called it.17 But it is surely an odd concert in which gangsters, directors of multinational corporations and academic Marxists can all be found singing the same tune.
More pitiable banalities of confused self-idolatry live on in the North American academy. At Duke University, for example, the self-advertised “cutting edge” of postmodern literary criticism, several of the most prominent Marxist, poststructuralist and feminist academics have abandoned both literature and criticism for the writing of autobiography. These “leading national figures” in English literary theory—Frank Lentricchia, Alice Kaplan, Marianna Torgovnick, Eve Kosofsky Sedgwick and Jane Tompkins (wife of ex-chairman of the English department, Stanley Fish) among them—by this latest intensification of their disdain for explanatory discourse and turn to direct literary self-creation, merely carry the logic of their romantic theorizing and, as Tompkins calls it, their “trajectory of personal development” to its inevitably embarrassing conclusion. Fish himself, meanwhile, has edged himself out of the (non)community he largely created, apparently under ungrateful social pressure to do so, to take full-time shelter in Duke’s Faculty of Law.18 Undergraduate enrollment in English has meanwhile dropped precipitously, the ample complement of reputedly highest-paid English professors in North America notwithstanding. Their graduate students find it just as difficult as graduate students in humanities and social sciences everywhere else to get an academic job. The prediction of Jean-François Lyotard in 1979 that with the advent of cyber-learning the professoriate itself would dwindle to merely fractional existence may yet prove to have been accurate. His further prediction, that the elite among researchers would gravitate to think-tanks, in many cases leaving the universities behind, has proved him prescient. But his assurance that a stern coerciveness in the reconfigured university would not in the end damage pedagogy has been less persuasive.19 The elevation of a coterie elite attending to theoretical preoccupations in increasingly opaque language and the eventual replacement of most other professors by monitors and modems—his imagination of the future of the university—hardly makes it seem like a propagator of general literacy.
Let me risk the indelicacy of this point a bit further. All culturally accountable persons, one hopes, will want to think about the motives that may lie behind the Nietzschean articulations that tend to recur in much postmodern discourse. Why should any wielder of potentially tyrannous power want to insist on being auto-nomos, a law unto the self, “self-created”? What are the uses of such a myth? And how should such a one escape evidence that might give the lie to this myth of self? Protect his alibi? One of the oldest strategies of all (cf. Genesis 4) is to exclude contrary or implicating witnesses, even if by the simple expedient of not taking them seriously enough to argue their point. And that, it seems, is what some “rewriters” of history evidently wish to do, as much as did those who were burners of books.
In general, we who labor in the university should ponder such developments with concern. If Paul Ricœur is right that the paradigm shift in modernist historiography involves reluctant abandonment of the Enlightenment theory of progress to an age of ambiguity, and that the crisis of ambiguity in turn is bound to resolve itself either by finding alternative grounds for hope or falling into despair, then perhaps what we are living with now is confused irresolution of both impulses.20 It seems beyond question the case that, as Anthony Giddens has suggested, “loss of a belief in ‘progress’... is one of the factors that underlies the dissolution of ‘narratives’ of history.”21 Sustaining narrative rationales for cultural literacy have tended to dissolve with them. Perhaps we ought to take the measure of our colleagues’ anxieties quite seriously. Shamed by the painful failure of the old Enlightenment and Darwinian assurances about the triumph of rational progress, yet unable to admit that this bankruptcy is open to succinct and yet psychologically plausible analysis from within the Christian tradition upon whose rejection the Enlightenment project was constructed, is it possible that some of our contemporaries are drawn to “posthumanist,” “postliberal,” “postmodernist” strategies out of a felt need for what the media handlers of politicians like to call “damage control”? Could it be that much postmodern theory is less forward-looking than after-the-fact apologia for a failed Utopian vision? Whether in their liberal humanist or Marxist guise, modern theories of progress and emancipation have never seemed more hopelessly at variance with the pertinent evidence.
If this predicament—or some version of it—has been an actual motivation for certain kinds of ego-driven yet consensus-demanding postmodern theories, how might one purposefully, yet compassionately and self-critically, query the antirealist theoretical smoke screen that the underlying embarrassments have thrown up? How might more of us together enter into a discourse of sober, grounded and patient rational inquiry concerning the increasingly evident divorce of moral accountability from educational and professional life—a kind of intellectual examination of conscience?
This task, it should be admitted, will not prove easy. As we know, for the postmodernist, language is no longer to be used according to its conventional expectations of reference. For Rorty, for example, the notion of truth external to the self to which language attempts correspondence is purely chimerical, the faded vestige of a world view in which people could believe, as Roger Lundin trenchantly puts it, “in something so demeaning as a Creator-God.”22 In practice, say Rorty and postmodernists generally, it is, as Kant suggested, only more so: “everything can be changed by talking in new terms”—by our language we constitute our world, as well as our “self.” Our saying makes it so. Where conventional associations make this awkward for us, we simply redefine key terms: a key recognition of radical modernism is the discovery that “anything could be made to look bad, important or unimportant, useful or useless, by being redescribed.”23 How convenient—and how resistant to learning, to others, to reality.
It is an irony of more than passing interest that in America, Madison Avenue and the political spin-doctors have been well ahead of the philosophers in developing this theory as cultural practice. (It was the culture of Madison Avenue, we may remember, that popularized the materialist myth of the “self-made man.”) Philosopher Rorty identifies postmodernism with the egocentric “romanticism” of figures like Blake and Rousseau, as well he might. But the mediation of this romanticism turns out to have been by a quite specific and much less esoteric postromantic discourse. When Rorty says that the essential postmodern theme is that “what is most important for human life is not what propositions we believe but what vocabulary we use,” he quickly observes that philosophers like Nietzsche and William James have been instrumental in developing this thesis by teaching us to give up “the notion of truth as a correspondence to reality.” Henceforth, instead of saying that the function of language is to “bring hidden secrets to light, they said that new ways of speaking could help us get what we want.”24 But the rhetorical force of this last, essentially consumerist phrase is clarifying, a kind of giveaway. Here, an admirer of intelligibility may justifiably feel, is “adult” language more or less rationalizing the screeching egocentrism of a spoiled child—perhaps the new “everyman” for North American culture (cf. Bart Simpson or Calvin of “Calvin and Hobbes”). For those who read books to small children, statements like Rorty’s offer an inescapable reminiscence of Humpty Dumpty:
“There’s glory for you!”
“I don’t know what you mean by ‘glory,’” Alice said.
Humpty Dumpty smiled contemptuously. “Of course you don’t—till I tell you. I meant ‘there’s a nice knock-down argument for you!’”
“But ‘glory’ doesn’t mean ‘a nice knock-down argument,’” Alice objected.
“When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.”
“The question is,” said Alice, “whether you can make words mean so many different things.”
“The question is,” said Humpty Dumpty, “which is to be master—that’s all.”25
In such a self-referential linguistic environment, to cite Nietzsche, truths can be dismissed with a flourish as “illusions about which one has forgotten that this is what they are” (“On Truth and Lies”).
When we permit linguistic meaning to be reduced to an effect of power, or mastery, there remains little possibility of truthful exchange. Accordingly, Lyotard claims that the academic question is no longer “Is it true?” but rather, “What use is it?” or, more precisely, “How much is it worth?” (Madison Avenue again. In the marketplace of ideas, option price can be the real bottom line, as anyone who has followed the career of certain luminous contemporary professors will appreciate.) And so Relativism reigns, often as the consort of Opportunism. It may thus be for a complex of not very high-minded reasons that, as Jaroslav Pelikan puts it succinctly, in some quarters of the academy, relativism, “especially relativism about first principles” has been itself elevated “to the status of a first principle (about which it is not permitted to be a relativist).”26 Contemporary demands for “pluralism” are often just another form of this new first principle, and just as often a bedmate of Opportunism.
Contemporary humanist discourse, it is hardly necessary to say, is overrun with varieties of relativism and antirealism. Rorty’s version—that truth is what my peers will let me get away with saying—is just one of them. Philosopher Alvin Plantinga highlights the troubling social and political implications of this view by noting that on Rorty’s view “what is true for me, naturally enough, might be false for you; my peers might let me get away with saying something that your peers won’t let you get away with saying.” More poignantly, “even if we had the same peers, they might not let you and me get away with saying the same things.” As Plantinga puts it, au courant or not, this view has peculiar consequences:
For example, most of us think the Chinese authorities did something monstrous in murdering those hundreds of young people in Tiananmen Square; they then compounded their wickedness by denying that they had done it. On Rorty’s view, however, this is perhaps an uncharitable misunderstanding. What the authorities were doing, in denying that they had murdered those students, was something wholly praiseworthy: they were trying to bring it about that the alleged massacre never happened. For they were trying to see to it that their peers would let them get away with saying that the massacre never happened; if they were successful, then (on the Rortian view) it would have been true that it never happened, in which case, of course, it would never have happened. So in denying that they did this horrifying thing, they were trying to make it true that it had never happened; and who can fault them for that? The same goes for those contemporary neo-Nazis who claim that there was no holocaust; from a Rortian perspective, they are only trying to see to it that such an appalling event never happened; why should we hold that against them? Instead of blaming them, we should cheer them on.
This way of thinking has real possibilities for dealing with poverty and disease: if only we let each other get away with saying that there isn’t any poverty and disease—no cancer or AIDS let us say—then it would be true that there isn’t any; and if it were true that there isn’t, then of course there wouldn’t be any. That seems vastly cheaper and less cumbersome than the conventional methods of fighting poverty and disease. At a more personal level, if you have done something wrong, it is not too late; lie about it, thus bringing it about that your peers will let you get away with saying that you didn’t do it; and, as an added bonus, that you didn’t even lie about it. One hopes Rorty is just joshing the rest of us. (But he isn’t.)27
Thinking about these issues raises discomfiting moral dilemmas for the academic who believes herself responsible for pursuing more than just the acceptance of her professional peers.28 A pertinent task for contemporary scholarship in the humanities—one, it must be admitted, not yet constructively enough accomplished—may thus be identification and analysis of the motivation for wishing relativism a priori to reign, for the banishing of truth questions. Perhaps we need to find ways to ask Lyotard’s question—“What use is it?”—from a more self-transcending, civic-minded perspective.
We may find that we do not necessarily have to have read George Orwell or Aldous Huxley yesterday to come up with some disturbing possible answers. Yet surely we must probe further. What do contemporary humanists, even from the perspective of the normal commerce of their daily life in community, have to say about the counterfactual, antirealist temper of postmodern language in its flight from epistemology to “hermeneutics” (Rorty)? As ethos, is postmodernism merely taking evasive action here, or is it possibly striking out in anger at exposure of a misplaced idolatry? Behind the extremism, then, are there particular grounds for a more constructive, even compassionate understanding? Can we hear in the postmodernist’s belligerent antirealism and self-referential logic a cri de cœur explicable in much more straightforward terms? Perhaps. But in the meantime, one has also to wonder whether too much postmodernist antirealism has not had the effect of subverting our shared civic purpose, specifically our traditional and necessary commitment to the fostering in our citizens of a high-quality humane literacy.29
Literacy is notoriously easier to lose than to acquire. And when it is lost something far larger than the mechanical ability to decode signs goes with it. As the prologues he wrote to those works of classical humane wisdom (Orosius, Boethius, Augustine, Gregory) he translated for his people make clear, King Alfred the Great understood that humane literacy was the key to a world in which it had suddenly become possible to focus hopefully on the future, rather than simply live under the tyranny of the present. As Bede’s narrative about the speech of the counsellors at the time of King Edwin’s conversion likewise illustrates, it was the very notion of “grand narrative” that created a sense of future hope and thus of the value of learning and humane literacy among the pagan Germanic peoples. It was revolutionary for them, and culturally securing, to enter into conversation with a grand narrative. Before Alfred, who could have imagined a model of Germanic kingship of which it might truthfully be said that the pen became mightier than the sword?30 Who even now would want to live in a world in which the pen had been emptied of its power to countervail the clash of steel?
What is it that Germanic pagans, native Africans in the age of missions, Attic Greeks or Augustan Romans saw as attractive and culturally sustaining about their own participation in a grand narrative—be it biblical, Homeric or Vergilian? Presumably what each grand narrative offered was the notion that there is meaning in history and, as the anonymous Anglo-Saxon counsellor captured in his metaphor of the sparrow’s flight through the mead-hall and out into the night, that an understanding of continuity with the past might help to undergird a community’s future. These reflections in their turn engender the notion that there is, after all, something to be learned. They also imply that since language is an inheritance as well as a present instrument, literacy is that which most generously affords us a place in the continuing conversation—in Hutchins’ rational exchange rather than Rorty’s incommensurable bidding and forbidding. From the Judaeo-Christian point of view in particular, a grand narrative is what gives substance to the idea of human beings as persons with a purpose. In this respect, it is perhaps needless to add, it further subtends Western convictions about the fundamental character of human rights.31
In its broadest Erasmian definition, the “Civilization of the Dialogue,” in its advocacy of humane literacy, encourages and permits access to other conversations. This is so precisely because it commits its citizens to read—to interpret the other—and to write rationally and truthfully enough that the other may “read” accountably in turn. That is, the postmodernists notwithstanding, there is no self-edification at last without a complementary commitment to edification of the other: it is no accident that the English medieval poet Chaucer aptly commends his Clerk, “Gladly wolde he lerne and gladly teche.” The survival of humane literacy depends on literate humans personally communicating the literature by which they themselves have learned to become more humane.
Neil Postman observes that
from Erasmus in the sixteenth century to Elizabeth Eisenstein in the twentieth, almost every scholar who has grappled with the question of what reading does to one’s habits of mind has concluded that the process encourages rationality; that the sequential, propositional character of the written word fosters what Walter Ong calls the “analytic management of knowledge.” To engage the written word means to follow a line of thought, which requires considerable powers of classifying, inference-making and reasoning. It means to uncover lies, confusions, and overgeneralizations, to detect abuses of logic and common sense. It also means to weigh ideas, to compare and contrast assertions, to connect one generalization to another.32
These are features, we might think, which could lead to commensurable conversations, even in the university.
In our day, for reasons both far wider and deeper than religious pluralism or multiculturalism, our culture may be in flight from rationality—even from intelligibility—and returning again to various manifestations of pagan insecurity. All of our institutions stagger under the withdrawal of their foundation of belief. But can the Western university—from its foundations and through its history a legacy of one grand narrative—survive without a sustaining belief and articulate practice of communicable rationality and humane literacy which has been the gift of that narrative? That is, can we survive in a world in which centring words themselves have lost their referents, been emptied out, and in which educated persons have lost both the ability and the will to reach toward a common truth? Can there continue a humane literacy without a shared “grand narrative”? At last to be brief: Dubito, as they used to say in the first universities—I doubt it, very much.
Notes
1 Bede, The Ecclesiastical History of the English People, is quoted here in the 1565 translation of Thomas Stapleton (London: Burns, Oates and Washbourne, 1935), vol. 2, p. 13.
2 They can, however, exhibit creative strategies for masking the evidence (see University Affairs, January 1997). When the former (NDP) Minister of Education for the Province of Ontario came to the conclusion that the Ontario education system needed a major overhaul (and established the Royal Commission for Learning, whose findings were published in February 1995), he began by ordering a universal Grade 9 English test. The Queen’s Park bureaucracy quietly produced a travesty of meaningful testing. Students were allowed to: “(1) write comments about a poem chosen by the student from a list of several alternatives, and, (2) write a short narrative or supported opinion on a subject of the student’s choosing, and, (3) submit a piece of prose selected by the student as his/her best from previously written material on any subject, and (4) read short pieces of prose and answer a series of multiple choice or short answer questions.” Teachers could comment on early drafts of students’ efforts so that edited versions would be the final submission. Dictionaries could be used and help from fellow students and parents was also encouraged. Even with all this assistance, any student thought by the teacher to be unable to complete the test could be exempted from it, without any need to report on the number of such exempted students. The reading passages used in the test had difficulty levels (Edward Fry, “Effective School Practices,” 1994) equivalent only to Grade 7 material. Even so, only 41 percent of Ontario students achieved “C” or higher in the reading tests, and 51 percent in writing. See Ted Johnson, “When Is a Test Not a Test,” Learning at Home, vol. 6, no. 3 (1994), 1-4. The Education Minister announced that there would be province-wide standardized tests at Grade 3, 6, 9 and 11 levels, administered by a new bureaucracy of 70 persons overseen by a seven-member volunteer panel of “high profile” lay persons reporting directly to the legislature, an initiative opposed by the teachers’ union. Cf. Graeme Hunter, “Can You Read? Philosophical Reflections on Labour, Leisure and Literacy,” Eidos, vol. 10, no. 1 (1991), 47-61.
3 Walker Percy, Signposts in a Strange Land (New York: Farrar, Straus and Giroux, 1991), 170.
4 It may also be the most misleading. Fiedler’s preoccupation with the subversion of traditional canons and the instituting of texts of heretofore marginalized culture (e.g., African-American, native Indian, homosexual) nevertheless signals a prominent political attribute of postmodernist agendas. See his “In the Beginning Was the Word: Logos or Mythos?” in No! in Thunder: Essays on Myth and Literature (Boston: Beacon, 1960), 295-308; and What Was Literature? Class Culture and Mass Society (New York: Simon and Schuster, 1982).
5 Jean-François Lyotard, The Postmodern Condition: A Report on Knowledge, trans. G. Bennington and M. Massumi (Minneapolis: University of Minnesota Press, 1984), xxiv. Cf. his penetrating observation in the appended essay, “What Is Postmodernism?” that “Modernity, in whatever age it appears, cannot exist without a shattering of belief and without discovery of the ‘lack of reality’ of reality, together with the invention of other realities” (77).
6 Anthony Giddens, The Consequences of Modernity (Stanford: Stanford University Press, 1990), 2. Noting that the decline of Marxist political regimes has accelerated the tendency in postmodern theorizing to disregard—or wish to disregard—any overarching world view whatsoever, Giddens amplifies his definition by reference to the actual disarray in postmodern discourses:
What does postmodernity ordinarily refer to? Apart from the general sense of living through a period of marked disparity from the past, the term usually means one or more of the following: that we have discovered that nothing can be known with any certainty, since all pre-existing “foundations” of epistemology have been shown to be unreliable; that “history” is devoid of teleology and consequently no version of “progress” can plausibly be defended; and that a new social and political agenda has come into being with the increasing prominence of ecological concerns and perhaps of new social movements generally.
Cf. Terry Eagleton, The Illusions of Postmodernism (Oxford: Blackwell, 1996).
7 Giddens, 3; Lyotard, 79.
8 Lyotard, 62.
9 Quoted by Robert Detlefsen, “White Like Me,” New Republic, April 10, 1989. That this constraint on professional free speech is directed against freedom of conscience and of religion has been pointed out before. Recently George Marsden, formerly Professor of History at Duke and now at Notre Dame, has gone so far as to suggest that truth in advertising laws might require American colleges and universities to add a specific disclaimer to their typical antidiscrimination assurances in university calendars and handbooks, saying after the sentences encouraging “diversity of perspectives,” “except of course, religious perspectives; we do, of course, discriminate on the basis of religion” (“Religious Professors Are the Last Taboo,” Wall Street Journal, Dec. 22, 1993; see also “Scholar Calls Colleges Biased against Religion,” by Peter Steinfels, New York Times, Nov. 26, 1993).
10 E.g., the University of Ottawa took this position in bargaining with its Association of University Professors in May 1996.
11 Richard Rorty, Contingency, Irony and Solidarity (Cambridge: Cambridge University Press, 1989), 21.
12 Richard Rorty, Philosophy and the Mirror of Nature (Princeton: Princeton University Press, 1979), 373.
13 Quoted from Edward P.J. Corbett, Classical Rhetoric for the Modern Student, 2nd ed. (New York: Oxford University Press, 1980), 377.
14 Ralph Waldo Emerson, “Self-Reliance” in Collected Essays (New York: Hurst and Co., 1892), 7-48. For Emerson, perhaps, the paradigm American romantic incarnation of Kant’s transcendent ego, “Nothing is at last sacred but the integrity of our own mind.” Accordingly, “No law can be sacred to me but that of my nature. Good and Bad are but names very readily transferable to that or this; the only right is what is after my constitution; the only wrong what is against it.”
15 The Collected Works of Oscar Wilde, ed. G.F. Maine (London, 1954), 962.
16 This impulse is clearly discernible in Stanley Fish, There’s No Such Thing as Free Speech and It’s a Good Thing Too (1994); first essayed as an article with the same title in Paul Berman, ed., Debating Political Correctness (New York: Laurel, 1992), 233.
17 See Rorty, Contingency, Irony and Solidarity, 27 (cf. n. 11). Roger Lundin, in a helpful and penetrating recent study, comments on this and other of Rorty’s central formulations (The Culture of Interpretation: A Christian Encounter with Postmodern Critical Theory [Grand Rapids: Eerdmans, 1993], 224-225). Kreeft’s remark is found in his Back to Virtue (San Francisco: Ignatius, 1992), 100. “My Way,” written by Ottawa native Paul Anka, recently earned for its composer the order of the Chevalier des Arts et des Lettres, perhaps in part because it was in fact Anka’s adaptation of a French song, “Comme d’Habitude,” by Claude François. If repetition is any index to its general significance (since it was written in 1968 it has been sung in 600 versions, with more than 300 million records sold), then this song would seem to be a kind of anthem for our time.
18 For a recent review, see Adam Begley, “The I’s Have It: Duke’s ‘Moi’ Critics Expose Themselves,” Lingua Franca, vol. 4, no. 3 (1994), 54-59.
19 Lyotard, 48-51.
20 Paul Ricœur, History and Truth (Evanston: Northwestern University Press, 1974).
21 Giddens, 10.
22 Lundin, especially chapters 4, 8 (cf. n. 17).
23 Rorty, Contingency, 5, 7.
24 Rorty, Consequences of Pragmatism (Minneapolis: University of Minnesota Press, 1982), 142, 150.
25 From chapter 6 (“Humpty Dumpty”) of Lewis Carroll’s Through the Looking Glass, ed. Martin Gardner, The Annotated Alice (New York: Bramhall, 1960), 268-269.
26 Jaroslav Pelikan, The Idea of the University: A Re-examination (New Haven: Yale University Press, 1992), 29.
27 Alvin Plantinga, “On Christian Scholarship,” in The Challenge and Promise of a Catholic University, ed. Theodore M. Hesburgh (Notre Dame and London: University of Notre Dame Press, 1994), 267-296.
28 According to David Bromwich, Politics by Other Means: The Limits of Institutional Radicalism (New Haven: Yale University Press, 1992), “The new academic professionalism shows that [universities] at this level are just one more casualty of an ethic of market rationalization that controls our society as never before.” Yet properly, he observes, “Professional development really has no more claim upon us than real-estate development... The truth is that with much refinement and convenience, professionalization has brought much damage everywhere. Everywhere: in the medical and legal professions, too; in every discipline the creation of which requires the creation of a new laity. What it most has destroyed... is our common sense of public life” (111).
29 Such questions have begun to be raised recently in a number of quarters, even if tentatively. A useful example from within the discipline is afforded by Morris Dickstein, “Damaged Literacy: The Decay of Reading,” Profession 93 (New York: Modern Language Association, 1993), 34-40.
30 I have taken up the relationship of Anglo-Saxon evangelization to the birth and growth of literacy and literature in chapter 4 of my People of the Book: Christian Identity and Literary Culture (Grand Rapids: Eerdmans, 1996).
31 Failure to understand the connection occasions much confusion in contemporary Western politics, as was illustrated by Clinton’s recent capitulation to China in the negotiations over trade and “most-favored nation status” (and Jean Chretien’s parallel gestures). Within the community of Chinese intellectuals on the other hand, there is considerable interest in the Western “grand narrative” because of their perception that Christian ethics, which apart from its religious foundation they profess to admire, is grounded in it.
32 Neil Postman, Amusing Ourselves to Death: Public Discourse in the Age of Show Business (New York: Penguin, 1984), 51.
Author
Professor Emeritus of English Literature at the University of Ottawa, Guest Professor of Peking University (Beijing) and Professor of Art History at Augustine College (Ottawa). He has written and edited books on medieval and modern literature, including most recently A Dictionary of Biblical Tradition in English Literature (1992) and People of the Book: Christian Identity and Literary Culture (1996).