
Wokeness and the English Language



Language, the soul and tool of politics, is only rarely the subject of politics. But in the past few years, and with baffling speed, language has moved to the center of public life. The political conversation today churns with terms unfamiliar a few years ago: Latinx and BIPOC, cisgender and heteronormative, deadnaming and preferred pronouns. Some of these neologisms were made necessary by changing social realities. Others were created precisely to change those realities. For example, there was no need for Harvard Medical School to coin the phrase “birthing people” as a substitute for “mothers,” other than to topple the notion that only women can give birth.

Such terms emerge from the world of identity politics, the militant branch of the contemporary American left. And it is only natural that a movement that thinks in terms of racial and sexual identity would fixate on the words that define identity, and seek to control them. There are words that you may never say and there are words that you must always say, and a single misstep can bring serious, even career-ending consequences.

A term like birthing people may be good for a laugh, but not for those on the front lines of the identity movement. A California law, Senate Bill 219, would have punished nursing-home employees with up to a year in prison for repeated use of the wrong pronouns, a practice known as “misgendering.” That law was overturned by a California court of appeals in June. In this and other court cases, the First Amendment has acted as a shield against compelled speech. But if freedom of speech has long been regarded in this country as something sacred, so too is the freedom from discrimination. Canada’s Bill C-16, which in 2017 amended the Canadian Human Rights Act to include “gender identity or expression,” makes the use of the wrong pronoun punishable by two years in prison. Canada does not have a First Amendment—and it is now possible to imagine a scenario in which ours is made to give way to an expanded understanding of civil rights.

How did this come about? How did the linguistic ground beneath our feet, the bedrock of Shakespeare and Orwell, turn to quicksand and so swiftly? The rise of identity politics—itself a neologism of the 1980s—offers only a partial explanation. We must look in another quarter entirely to understand what has transformed our relationship to the words we use.

EVERY LANGUAGE is a work in progress, in perpetual flux. By the time one is 20, one knows this from personal experience, having observed new words come and go, some of them sticking. One can see this in graphic form with the Google Ngram Viewer, which shows the frequency with which any given word or phrase appears in printed texts. One can plug in a word and watch its rise and fall over the decades. (My students are usually surprised that no one said pasta before about 1980, for example, or that relatable in the sense of “personally relevant” dates to only about 2010.)

But if the law of language is change, it is not formless change. To have a written language, and not a merely spoken one, is to be in constant contact with its past. So long as a work of literature is still read, its phraseology and vocabulary persist as a substrate; they are the anchor, as it were, that counterbalances the sails of random change.

English has had two great anchors: the King James Bible and the works of Shakespeare. They came into being at about the same time, that moment around 1600 when modern English can be said to have emerged in its definitive form. Here Anglo-Saxon and Norman-French fused in a way that preserved the character of each, giving the speaker of English a linguistic keyboard with a remarkably expressive range, letting one glide in an instant from tangible, pungent concreteness at the lower end to lofty Latinate abstractions at the upper. No other European language has anything like English’s battery of synonyms, which permit us to make the finest of social distinctions. There is a reason why drama is the essential contribution of English to world culture.

Because of their cultural prestige, Shakespeare and the King James Bible influenced every succeeding generation of writers. And so, four centuries later, they remain largely accessible to us (although you might have to refresh yourself on the meaning of quietus or bare bodkin). The same language spoken at the Elizabethan court was still serviceable to the world of the Industrial Revolution and into the Cold War. But the social revolutions of the 1960s put strains on English that went beyond mere words to confront the structure of the language itself.

The civil-rights movement, momentous though it was, put no great emphasis on language. It did effect one significant reform, although from an unlikely direction. In the summer of 1968, the singer James Brown recorded a top-selling soul single with a euphoric call-and-response plea to “Say It Loud—I’m Black and I’m Proud.” The phrase had a galvanizing effect on both blacks and whites. Black was a word that in sound and dignity was equivalent to white; even better, it was a word proposed from within the community and not assigned from outside. It was adopted virtually overnight. Terms such as Negro and colored, which had previously been the polite alternatives, came to sound out of touch, if not outright offensive.

It was not so easy for the women’s movement to achieve a similar linguistic parity. Traditional sex roles had embedded themselves not only in the words of the language but in its structure and grammar. English itself was a “sexist language,” a phrase that first appeared in an essay of 1971 by Ethel Strainchamps. While the words man and woman were ostensibly equal counterparts, they were not used equally. Women were as likely to be called girls, females, or ladies—the connotation of that last word particularly amused Strainchamps. (Why was it, she wondered, that one referred to “Republican ladies” but never to “Communist ladies” or “Black Panther ladies”?)

The principal bone of contention was that women were addressed in a way that declared their marital status, as Mrs. or Miss, and expressed their social identity in terms of their relationship to a man. The solution to this was the neologism Ms., which was to be the pendant to Mr. It had been proposed as early as 1901, but now it was reborn as the title of Ms. magazine, which debuted in December 1971 as a supplement to New York magazine. Like James Brown’s song, that title both identified a problem and offered a solution. It was a curious fiction of a term, as it was an abbreviation with no word behind it. It prevailed, but not without a struggle. One of the last holdouts, oddly enough, was the New York Times, which did not adopt the term until 1986.

That first issue of Ms. featured a furiously ambitious article called “Desexing the Language,” by the feminist writers Casey Miller and Kate Swift. To repair the problems diagnosed by Strainchamps, their solution was radical surgery. Any word or structure suggestive of traditional sex roles was inherently degrading to women, and the language should be remorselessly purged of them. All sex-specific job titles were to be made androgynous; out went fireman and stewardess, in came firefighter and flight attendant. Also to be purged were any conventional words or phrases that included the word man, of which there is an endless variety, e.g., mankind, manpower, manhandle, freshman, one-man show, man the lifeboats, etc. Substitutes were eventually found for each of these, with varying degrees of success.

As successful as Miller and Swift were with these causes, other aspects of their program left the public cold. The authors made much of the iniquity of using the masculine pronoun he as the generic pronoun corresponding to anyone (as in “if anyone comes late, he won’t be admitted”). They were not satisfied with the customary alternatives of he and she or the time-honored they, which was grammatically incorrect but good enough for Shakespeare and Jane Austen. Instead, they introduced a gender-neutral pronoun of their own devising, which they designated the “human pronoun,” complete with a table showing the declension of tey, ter, and tem (as in “if anyone comes late, tey won’t be admitted; oh, alright, let tem in”).

Here the public drew a line. The fact is that the American public has always had a fine ear for linguistic absurdities. That ear provides the limiting principle that restrains the impulse to reform everything.1 When the publisher Robert McCormick introduced “sane spelling” in his Chicago Tribune, the public laughed at follies such as fantom, doctrin, or jaz. And yet at the same time, it saw the elegant logic of simplified spellings such as catalog and thru.

It is one thing to adopt new words, quite another to learn a new set of grammatical rules—just as it is one thing to remove the suffix man from chairman and quite another to remove it from woman. When students at Antioch College proudly formed the Antioch Womyn’s Center in 1978, it was generally taken as a gag. And so, by the end of the decade, the linguistic map had been redrawn, some but not all of the feminist demands met. Flight attendant? Yes. Tey and ter? No.

But how is it that the same culture that could mock these pronouns a generation or so ago is now diligently declaring its own pronouns? The answer: It is not the same culture.

IN THE SUMMER OF 2013, an Army private and intelligence analyst was court-martialed for espionage and sentenced to 35 years (a sentence later commuted by President Obama). One day after receiving that sentence, Bradley Manning publicly became Chelsea Manning and asserted the right to be referred to by female pronouns. Sex-reassignment surgery was no longer a novelty by 2013, but this was a particularly sensational case in which some of the most contentious issues of the day converged: the passing of government secrets to WikiLeaks, the rightness of the Iraq War, and the transgender experience. For the first time, the general public became aware that there was such a thing as a “preferred pronoun” and that it was a matter of common courtesy to use it.

The media complied with Manning’s request, and with a comprehensiveness not possible a generation earlier. There had been earlier transgender celebrities—the writer Jan Morris, the musician Wendy Carlos—and they had been treated respectfully, without any collective soul-searching about one’s own pronouns. In a digital age, when Wikipedia was coming to possess an authority exceeding that of the Oxford English Dictionary and the Encyclopedia Britannica combined, the application of preferred pronouns could be instantaneous and even made retroactive. Earlier references to Bradley Manning as “he” could be digitally scrubbed. Manning did not become a woman but had always been a woman, which would turn out to be more than a semantic difference. To continue to refer to Chelsea as Bradley was to “deadname” her and affront her human dignity. It would not take long for what would have been considered a faux pas to become a demeaning, possibly criminal act.

It is impossible to imagine that any of this could have happened without the digital revolution. Miller and Swift’s call for a human pronoun had fallen on deaf ears. It would have required innumerable editors and publishers to sign on to their crusade, and (barring a catchy new James Brown song to make the case) it still would have faltered. But Miller and Swift did not have Facebook at their disposal.

In 2014, the online platform gave its 1 billion-plus users the option of choosing from among 56 genders (including pangender, neutrois, and two-spirit). So began the second great campaign to reform the language. If feminism had been the prime mover of the first, the second would be dominated by the movement for transgender rights.

Black Lives Matter, for all its ubiquity in public life today, has had no great quarrel with the English language, nothing like that of the transgender movement. The transgender critique does not so much build on the earlier feminist one, which paved its way, as invert it. The feminist critic would say that there were two sexes, and that not to treat them equally was oppressive and discriminatory; the transgender critic would say that it was oppressive and discriminatory to point out that there were two sexes. Such an attitude is scorned as “gender normativity,” which is to be swept from the language by removing sex or gender from all terms for kinship. Not brothers and sisters but siblings; not husband and wife but spouses; not man and woman, but person. A 2017 article in an academic publication with the instructive name of the Journal of Language and Discrimination gives practical suggestions on how to make language trans-inclusive. For example, the sentence “Women often grow up being taught to accommodate others’ needs” can be rewritten as “People assigned female at birth (often) grow up being taught to accommodate others’ needs.” By this logic, a term such as birthing people becomes not only comprehensible but more or less obligatory.

Languages change, as we have seen, but not at random. The tendency is almost always toward concision and clarity, giving rise to that great time-saver, the contraction. “People assigned female at birth,” however, is the very opposite of a contraction; it is not a word but a short story. And a speech or essay peppered with such circumlocutions will be a dreary slog indeed. It grates like fingernails on a chalkboard to anyone who has read Macbeth or knows Psalm 23. But who does nowadays?

The tolerance of the public toward ever more cumbersome circumlocutions is a great paradox. Is it out of a general sense of anxiety, the fear that one will be ostracized for dissenting and canceled (another characteristic word of the day)? Or is it that a nonreading public does not have a sufficiently developed ear to recognize verbal horror when it hears it? A half century of determined attempts to reform the English language—none of which paid the slightest heed to its aural and rhythmic qualities—has done a good deal of collateral damage. It has certainly done the ear no good. One can follow the declension in subsequent renditions of Matthew 4:4, “Man does not live on bread alone,” from “Human beings cannot live on bread alone” (Good News Bible, 1976) to “No one can live only on food” (Contemporary English Version, 1995).

It is not only that the insertion of sex-neutral language has drained the aphorism of its cadence, but that it has made literal what was metaphorical. The gorgeous alignment of idea, imagery, and sound that gave us Matthew’s poetic aphorism has given way to a bald platitude whose individual elements seem to have been pried apart, translated piecemeal by Google Translate, and clicked back into place. (The editor of that last edition noted that he based his language, at least in part, on how it was used on television.)

WHATEVER ELSE our ongoing process of linguistic revisionism has achieved, it has not made the language more beautiful or richer. It has certainly taught writers to be cautious. It is not that they are fearful of taboos, which are great aids to a writer. All languages have taboos, those fixed and stable rules that a child learns naturally. They constitute the boundaries within which the game of language is played, and the testing of those boundaries, even stepping slightly out of bounds with an indelicate aside, constitutes one of the principal delights of language. But when the taboos are unseen and constantly shifting, like buried land mines rather than bright lines at the field’s edge, every step becomes tentative, and language becomes flat and banal. The fear of giving offense to even a single reader is fatal to vibrant prose (although that single reader, we all know by now, can do a great deal of damage with a single tweet).

A language purged of all figurative and allusive imagery, relentlessly literal and radically present-oriented, is a pitiful (and pitiless) instrument of communication. Any piece of prose that rises above the level of an assembly manual operates at multiple levels, from the factual to the imaginative, and requires good will on the part of the listener to grasp a speaker’s idea. After all, an idea is an intangible thing and must be brought into being by figural language.

But that is precisely what our identity-conscious linguistic revisionism has virtually ruled out by teaching people to read literally, rather than imaginatively; to look for certain words and formulations; and to judge prose by their presence or absence. And when those readers, shielded by the digitally enforced information silos in which they are confined, come across unfamiliar words or old-fashioned formulations, they are startled and liable to take the jolt of surprise as something unpleasant. It is the debilitating susceptibility that comes from isolation from outside irritants, much like the recent rise of allergies in children not exposed to certain foods. For the contemporary reader, much of English literature can induce a kind of moral peanut allergy.

All this occurred with little protest from those who were the traditional guardians of the language, the teachers and professors of English and linguistics. But these guardians had long since become “descriptivists,” detached observers of how language is used rather than enforcers of its rules (enforcers now dismissed as benighted “prescriptivists”). For the descriptivist, mistakes of grammar were themselves authentic speech; all dialects were equally valid, and there was no such thing as a standard language. “A language,” so ran the sneer, “is simply a dialect with an army and a navy.” But as Jacques Barzun liked to point out, a standard language is the most democratic thing of all; it makes the dweller of a village, able to communicate with a few hundred people at most, a citizen of a country.

In the end, this is the worst, if unintended, consequence of our half century of linguistic revisionism. It estranges us from our own language, holding it at arm’s length like an anthropological artifact, to be viewed from a distance not only critically but suspiciously, as an instrument of oppression. But to try to cleanse a language of all the bad things that can be done with its words is to confuse ends and means. It is an endless and hopeless task. And to be alienated from your own language is to be alienated from yourself. No wonder people are so angry. And who’s to say that a term such as birthing people won’t soon be supplanted by another on the grounds that it, too, is a criminal violation of fairness, given that it is, shall we say, species-normative?


1 One sometimes reads that Americans were so consumed by anti-German hysteria during World War I that they replaced words such as sauerkraut and German measles with liberty cabbage and liberty measles. In fact, these were ironic suggestions, briefly commented on with amusement at the time, and never really taken seriously.