Conventions of Writing
The human capacity for language has its roots in nature. The combination of physical and cognitive resources necessary to the production and comprehension of symbolic utterance is a result—and perhaps the defining acquisition—of our unique evolutionary history. The biological basis of our ability to speak and write has led a small number of scholars—such as the cognitive scientist Steven Pinker—to argue that humans are born not only with an innate capacity for language but with innate linguistic knowledge—that they are, as it were, prewired with the basic principles of a universal grammar. It seems more plausible to suppose, however, that although the human capacity for language arises from nature, particular languages, and their particular uses, are largely the product of culture.
In other words, language is a social practice. It is, at a deep if not the deepest level, a structure of social conventions, and in that sense thoroughly conventional.
This page considers the conventional aspects of language under the following headings:
- Convention and the varieties of discourse
- Convention and social change: Is English in decline?
- The politics of convention: language and political correctness
- Racism in language
- Sexism in language
Convention and the varieties of discourse
The most obviously conventional aspects of language—treated at length on a separate page of The Guide—are its rules of grammar and usage. But the conventions of language extend beyond grammar and usage to encompass such matters as style and organization. The result is a variety of conventional discourses. Academic discourse has some basic conventions that set it apart from, say, technical, legal, or journalistic discourse; within academic discourse, conventions vary from discipline to discipline. While most academic discourse is, broadly speaking, formal, writers in the natural sciences or social sciences may organize essays differently from writers in the humanities. Documentation styles for citing sources also differ according to discipline.
Convention and social change: Is English in decline?
Like the cultures that produce them, languages vary not only from one another but within themselves, for their constitutive conventions differ over time and place. Over the last ten centuries, English has undergone such dramatic changes in vocabulary, syntax, and morphology that its earliest recorded form, Old English, is for most modern readers a foreign language. Even the last few hundred years have witnessed noticeable shifts in the conventions of English spelling, punctuation, and usage, some of them significant enough to make a page of Dickens present a serious challenge for the twenty-first-century reader. And anyone who has learned English in the United States need only hop a plane to England, or for that matter travel the Eastern seaboard, for a reminder of the many dialectal differences to be found within modern English itself.
Social practices inevitably generate debates about their proper execution. Where you find conventions, in other words, you can depend on finding arguments over “correctness.” In the case of language, the debate takes on further complication from disagreement, not to mention a good deal of misunderstanding, concerning the role of conventions themselves in linguistic practice. Writers interested in learning more about this debate might profitably begin with linguist Geoffrey Nunberg’s “The Decline of Grammar,” an essay that explains the difference between “descriptive” and “prescriptive” grammar and discusses the politics and pitfalls of the latter. Writer Mark Halpern’s response to Nunberg, “A War That Never Ends,” claims that human beings are naturally drawn to construct “standards” to correct their linguistic practice.
The politics of convention: language and political correctness
Politics is largely a function of speech. The rough inverse of this truth—that speech is more than occasionally a function of politics—should not, therefore, surprise us.
The most obviously political conventions of language have to do with names. After all, names are all about identity, and social wrangling over identities—of groups, of places, of events—can be very charged politically. “What’s in a name?” asked Shakespeare’s Juliet. The answer: a lot. If it were otherwise, the names for proponents and opponents of legal abortion would be uncontroversial, taxes would never have to masquerade as “revenue enhancements,” and all Southerners would have long since abandoned calling the events of 1861-65 “The War Between the States.”
For historically marginalized social subgroups, the struggle for political prominence often involves a push to eradicate or at least stigmatize conventionally accepted, sometimes patently demeaning names. As a result of such struggles, once-common names for some racial, ethnic, and religious groups have become conventionally regarded as offensive. Occasionally a marginalized group has adopted the opposite strategy, openly embracing a derogatory or otherwise suspect label as a way to express defiance and drain the label of its conventional power to insult or diminish. “Black,” “gay,” and “queer” are examples.
Though the two strategies are diametrically opposed, the groups who employ them share a single impulse: to wrest control of their own social identities as one step toward winning the ultimate prizes of recognition, respect, equality, and power.
In the 1980s, the term “political correctness” emerged to designate, among other things, a kind of censorship that the term’s users believed had resulted from recent struggles over the politics of social naming. Why, it was asked, can we not simply call people and things by their real names? Why must we engage in euphemism merely to flatter the aggrieved?
It was easy enough for critics of “pc” to find examples of ludicrous euphemism intended to promote self-esteem for underdogs, such as “vertically challenged” as a proposed substitute for “short.” But such simple examples belie the genuine complexity of the problem. In fact, nothing has a “real” name, since, as we’ve seen, all names are human in origin and therefore products of convention. In language, “real” names are those whose conventional acceptance is so widespread that we’ve ceased to think about their artificiality. A challenge to one of these “real” names represents a retort to such unconsciousness, and sometimes a retort to the underlying social reality that makes such unconsciousness possible. The “short” example seems absurd on its face because few people seriously accept the notion that, as a group, persons of lesser physical stature labor under a significant burden of social prejudice and discrimination reflected and perpetuated by the conventional name for them. The case is very different, however, with groups defined by ethnicity, race, religion, sexual orientation, or some other common characteristic that has indeed subjected them to significant prejudice or discrimination.
In any of these more vexed cases, no name will likely settle into “real” status while debate on the group’s social acceptance and influence remains open. As long as the perception of inequality or stigma persists, the conventional name of the moment will likely come, in time, to be associated with, to reflect, perhaps even to help facilitate, the inequality’s persistence.
The problem of establishing a “real” name grows still more complicated when the reality that the name picks out is an unstable social construction itself. Here the seemingly unending uncertainty and dissatisfaction with all conventions reflects not only persisting inequality or stigma but the indeterminacy of what our names attempt to identify. The National Association for the Advancement of Colored People (NAACP) still bears in its title a once-conventional racial label. The organization has outlasted the label, which eventually became offensive because of its evident blindness to the reality that “white” people possess pigmentation. (Student readers of The Guide may not realize that when the authors were of coloring-book age, Crayola manufactured a light-hued crayon hubristically labeled “flesh.”) Formerly “colored” people eventually became, by convention, “black,” though they are not truly black in color and indeed are not, in their genetic multiplicity, any one color, just as “white” people are not of one uniform hue. The currently conventional “African-American” (an addition to rather than a replacement for “black,” which most people still conventionally regard as acceptable) is a historically and geographically accurate description only of some people to whom the name now applies, and is useless either in discussing issues of race and ethnicity in a global (rather than strictly American) context or in capturing the multiracial, multicultural personal histories of many who live in America today. Recognition of this last point has given rise to the circle-completing advent of the name “people of color,” which covers a wider population than “colored,” “black,” or “African-American,” but which for that very reason can only augment, rather than replace, those other terms.
The artificiality of all such labels might seem a reason to dispense with them completely. However, they clearly fill a variety of important social needs—from demographic analysis to personal identification with particular historical, geographic, cultural, and other categories—and the prospect of their disappearance is neither likely nor clearly desirable. The inevitability of social naming must finally lead us to ask, in response to complaints about the inconveniences of “political correctness”: If we must name a group, should we not use the name that, in the group’s own view, most promotes recognition, respect, equality, and power for its members?
Racism in language
Besides taking pains to use names that maximize recognition and respect (see the section above), how can you avoid racism and other forms of bigotry in your writing? For starters, of course, you should scrupulously avoid all stereotypes and undocumented generalizations about social groups of any kind. You should also consult the dictionary on any words that you suspect of carrying offensive denotations or connotations.
However, the dictionary will only take you so far. There are no substitutes, in the end, for an awareness of history and a fine sensitivity to context and implication. The point is nicely illustrated by a minor aspect of the debate over the recent war in Iraq. Before and during the war, political pundits divided over whether the war would be a “cakewalk.” Discussion mainly focused on weapons, intelligence, strategy, and tactics, but columnist Patricia Williams saw a problem with the word “cakewalk” itself. In the April 28, 2003 issue of The Nation, she tartly observed that
one should never enter a fight announcing that it will be a “cakewalk.” A cakewalk was a dance contest popularized during the days of black minstrelsy, for which the prize was, as implied, a fluffy confection. Debussy, as our well-educated senior [military] advisers ought to know, wrote a funny little piece of musical condescension to this effect, Golliwog’s Cakewalk. (A golliwog, for the uninformed, is a charmingly old-fashioned word for “n[***].”) Such are the amusements of colonialism. But in the so-called post colonial era, such references do tend to rankle.
Look up “cakewalk” in Merriam-Webster’s dictionary, and you won’t find any indication that the word is offensive. Williams objects not to the word in isolation, however, but to the word and its attendant imagery and history in the context of an aggressive military enterprise that in her view summons shadows of America’s slave-tainted imperial past.
In glossing the word “golliwog,” Williams herself uses, in quotation marks, what has become one of the most taboo words in English. Was she wrong to use it? Was The Nation wrong to print it in full (which they did), rather than abbreviate it in some way, such as “n–r” or “n***” (as we have done here, substituting for the original)?
Once again, context matters. Williams and The Nation seem to have decided that the clearly condemnatory context, together with the quotation marks, sufficed to remove any message of hatred from the word, and perhaps even that printing the word in full would make the condemnation all the more forceful. Perhaps, too, Nation editors would have reached a different decision if the writer had been white. (A broad consensus affords derogated groups special license to use the derogatory labels historically bestowed on them.)
As a student writer, you must exercise care and thought in making complex judgments about the fully contextualized meaning and impact of your words. It is a safe assumption, however, that you should never use any plainly hate-charged language, even where context might seem to neutralize it, unless explicitly permitted to do so by your instructor.
In addition, you should follow these practical guidelines:
- Modernize or contextualize outdated terms that you cite from your reading. For example, in The Fire Next Time (1963), novelist James Baldwin writes of the misplaced confidence that the systematic murder suffered by Jews in the Holocaust “could not happen to the Negroes in America.” In an essay on Baldwin, you should either quote Baldwin’s exact words or, if you choose to paraphrase, substitute the term “African-Americans” for “Negroes.”
- Keep your terminology consistent and parallel. Write “white people and black people” rather than “white people and blacks.” If you capitalize “White,” capitalize “Black.” (Some writers prefer to capitalize these words when referring to people and groups rather than skin color. Others prefer to use lower case. The important thing, again, is to be consistent.)
- Capitalize words that refer to or are derived from nationality. E.g., “The African-American poet wrote about her experiences traveling with the Vietnamese novelist.” Also capitalize words that describe a religion or a member of a religious faith. E.g., “Many Muslims in Rochester helped the Kosovar refugees settle in western New York.” “The largest Christian sect in Rochester is Roman Catholic.” “Faculty should avoid scheduling exams during Jewish holidays.”
For more on racism in language, see:
- Philip Herbst, The Color of Words: An Encyclopaedic Dictionary of Ethnic Bias in the United States (Yarmouth, ME: Intercultural Press, 1997). In Milne Library this book is catalogued: ref E184.A1 H466 1997.
- Rosalie Maggio, Talking About People: A Guide to Fair and Accurate Language (Phoenix, AZ: Oryx Press, 1997). In Milne Library this book is catalogued: ref P301 .M33 1997.
Sexism in language
A father and son, on their way to a baseball game, are involved in a horrible car wreck. The son is rushed to the hospital and wheeled into the emergency room. The doctor, seeing the patient, exclaims, “I can’t operate—that’s my son!” Question: How can this be?
Answer: the doctor is the boy’s mother.
Many people who believe themselves innocent of sexism will be stumped—if not completely, at least momentarily—by this riddle. Try it on your friends and see what you find.
The riddle works because sexism, like other forms of bigotry (see the sections above), runs deeper in society than we realize, affecting our very understanding of the words used in common discourse. Of all the historical power relationships embedded in the English language, that between men and women probably affects more aspects of speech and writing—from diction to grammar and usage—than any other.
Attempts to promote gender-neutral and nonsexist usage in everyday speech have met with remarkably fierce resistance from some quarters. In a television debate some years ago, a prominent conservative pundit derided the term “chair” as a gender-neutral substitute for “chairman” on the grounds that it was ludicrous to address a human being as a piece of furniture. Why this is any more ludicrous than addressing a human being as an isolated body part—as in the unexceptional and uncontroversial term “head” for a person in a position of primary responsibility—the pundit didn’t say. Nor did he pause to reflect, apparently, that in English we regularly refer to people by using associated objects, such as when we speak of “the crown” (=the monarch) or “the White House” (=the President). Literary scholars and linguists call this perfectly ordinary manner of reference-by-substitute “metonymy.” In formal parliamentary parlance, to postpone consideration of a motion is to “lay it on the table.” At most meetings, the participants who wish to avail themselves of this procedure use a kind of metonymic verb and simply “table” the motion. One would think chairs perfectly at home amidst such linguistic furniture.
Many people still have difficulty with the honorific term “Ms.” They seem to think it merely a substitute for “Miss,” to be supplanted, upon marriage, by “Mrs.” They fail to understand that when English makes a public distinction concerning marital status for one sex only (men are simply labeled “Mr.” except to designate professional status, as in “Dr.,” “Prof.,” or “Rev.”), English is sexist. For the distinction carries the plain message that a woman’s marital status is a public matter, and of primary importance, while a man’s is private and subsidiary to his choice of career.
The sexism in terms such as “mailman,” “fireman,” and “policeman”—not when used to refer to particular male individuals, but to refer to anonymous individuals or to the occupations themselves—also comes from the message bound up with the words: that the normative holder of the occupation is male.
Similarly, the sexism in the word “man” as a generic name for the human race comes from its implication that the normative human being is male, women being some kind of interesting, perhaps unaccountable variant—or, as linguists say, the “marked” case. Men are business as usual. Women are men with an asterisk (the symbol that linguists in fact use to flag nonstandard or unattested constructions, and that, in popular legend at least, Major League Baseball placed beside Roger Maris’s home run record to signal that it was somehow not the real thing).
Attempts to gender-neutralize language, we commonly hear, confuse language and reality, as though we could wish away the actual power relations in society merely by using the correct words. This argument misses the point that words often do influence our perception of social reality—particularly in the childhood years—and that even when reform of language brings no major reform of society, it can be an important gesture of recognition and respect (see the sections above).
For most job titles, gender-neutral words are neither as difficult to find nor as cumbersome as critics claim: mail carrier (or postal worker), firefighter (which is euphoniously alliterative), and police officer are not difficult to use, and the power of human invention being what it is, finding similar substitutes for other man-words should not be a herculean task. (If they can send a man to the moon … )
It’s a striking linguistic fact that anyone can invent a noun, adjective, or verb and, with adequate publicity, hurry it into ordinary people’s speech. Neologisms like “smog” and “blog” assimilate into English with ease, as do new verbs based on nouns or adjectives, such as “access,” “impact,” and “familiarize.” Pronouns are another matter altogether. It may in fact be easier to send a man—or woman—to the moon than to add a gender-neutral third-person singular pronoun (other than “it”) to English.
In spoken English, most people—even if they care nothing about gender equality—solve this problem by treating “they” and “their” as gender-neutral singular pronouns, as in “A writer can’t help where they came from” or “Everyone takes their hat off for the national anthem.” Moreover, historical precedent favors those writers who employ this solution even in formal written discourse, as Henry Churchyard illustrates in his extensive web treatment of singular “their,” which includes numerous examples from that impeccable English stylist, Jane Austen.
There may be problems with singular “their,” but illogic isn’t one of them. It is no more illogical to write, “A person isn’t responsible for their genetic make-up” than to write, “A person isn’t responsible for his genetic make-up” when roughly half of personhood is female. Churchyard’s site cites Steven Pinker, author of The Language Instinct, on why the first type of sentence is actually the more grammatically logical of the two.