🧭 The Guide
SUNY Geneseo’s Writing Guide

Grammar and Usage

This page covers the following topics:

  • What is grammar?
  • Usage and convention
  • Usage and the long run
  • Reasons for usage preferences
  • Reasons for usage preferences: lucidity, simplicity, directness
  • Reasons for usage preferences: character
  • Reasons for usage preferences: aesthetics

What is grammar?

The grammar of any language is the set of underlying rules that make possible meaningful utterances in that language. You may think you know nothing about English grammar, but if you speak the language competently—if others understand you—you have, in fact, a detailed knowledge of English grammar. What you may lack is detailed knowledge about English grammar—knowledge, that is, about the systematic structure of the rules governing the language. You know how to combine nouns, verbs, and modifiers to create sentences, even if you couldn’t define the terms noun, verb, and modifier.

The analytical study of grammar comprises the study of syntax, or sentence structure, and of morphology, or word structure (for example, the small units that differentiate word meanings, such as -tion and -s).

Usage is a much more slippery concept than grammar. It has to do with habitual or customary practices in spoken or written language. The concept is important for understanding how language works, because members of a language community, who by definition all share more or less the same grammar, will nevertheless differ when it comes to certain specific modes of expression. Anyone who speaks English competently understands, and knows how to use, the word ain’t, a word that has been part of the language for nearly two hundred years. Yet, at least in written English, ain’t is widely frowned upon as unacceptable usage.

What makes some usage unacceptable? The answer—tautological as it may sound—is that it is not accepted. Not accepted by whom? By those in a position to make some usages acceptable and others unacceptable—that is, teachers, editors, compilers of dictionaries and, of course, authors of usage manuals.

Usage and convention

If usage seems a highly arbitrary matter, it’s worth remembering that all language is, after all, ultimately a matter of convention. In Lewis Carroll’s Through the Looking-Glass, Alice objects when Humpty Dumpty uses the word glory to mean “a nice knock-down argument.” The word doesn’t have the meaning Humpty gives it—but only, of course, because people don’t use it that way. If tomorrow every English speaker used the word as Humpty does, “a nice knock-down argument” would become one meaning of glory.

Usage, you might say, is but the visibly conventional edge of language. The central conventions of a language don’t immediately strike us as conventional because they command universal, or nearly universal, adherence; and as John Stuart Mill once wrote, “everything which is usual appears natural.” The conventions at the periphery of a language reveal their conventional nature through the disputes they generate.

The rules of usage are no more arbitrary than those of grammar, then; but whereas the core conventions that constitute the grammar of a language are established by the vast majority of users, the canons of “acceptable” usage, as we’ve already indicated, derive their force from the authority of a relative few. The rules of grammar result from humanity’s intrinsically social nature; those of usage result from social hierarchy.

Usage and the long run

History suggests that in the long run the many prevail over the few. In his Dictionary of Modern English Usage (1926), H. W. Fowler recoiled from intrigue in the sense of “puzzle” or “perplex” (as in “His clothes intrigued me”); in Modern American Usage (1966), Wilson Follett took aim at finalize. Despite their protests, both words have moved from the periphery to the core. Follett insisted that “To burgeon means to put out buds; figuratively to come out in a small modest way, not to spread out, blossom, and cover the earth,” but thirty years later, any writer who used the word his way would almost certainly be misunderstood. Language authorities have failed to rebuff the dreaded prioritize or to prevent impact from becoming a verb synonymous with “affect” (e.g., “How will the new requirements impact our curriculum?”).

On the other hand, ain’t, despite enormous popularity, remains peripheral. Perhaps this represents a rare success for language authorities; more likely, it reflects the language community’s need to keep some expressions marginal as handy indicators of educational attainment and social class. It’s hard to believe that logic has kept ain’t from passing into acceptable usage; as R. W. Burchfield points out, ain’t I is no less logical than the perfectly acceptable aren’t I as a substitute for the apparently unusable amn’t I. Similarly, yous and y’all have failed to gain respectability, even though a plural form for you would clear up many an ambiguous English sentence. (Other languages distinguish singular from plural you—for example, the French have tu and vous.)

If language authorities hold so little power in the long run, one might well ask why they even bother to resist linguistic change. Is it mere elitism? At times, alas, it has been just that. But the most responsible language authorities have always provided thoughtful reasons for their preferences. Precisely because conventions embody collective (if often tacit) decisions about how to do things, they are subject to rational debate. In The Guide, we’ve offered reasons for our preferences concerning such matters as pronoun agreement and sexist language. On our Myths page, we’ve presented arguments against a few preferences of others, preferences that most present-day authorities join us in rejecting.

Reasons for usage preferences

At least three kinds of rational consideration lie behind the strictures of judicious authorities: there are considerations of lucidity, simplicity, and directness; of character; and of aesthetics. The section of The Guide devoted to the first consideration provides a number of illustrations; examples from each category also appear below. You will find additional suggestions about usage under Making Improvements. These suggestions are no substitute for a comprehensive manual of English usage, which we believe every student would do well to purchase. Fowler’s Modern English Usage, for many years the standard, is lively if a bit cranky; extensively revamped to improve readability and incorporate modern linguistic perspectives, it has recently been republished as The New Fowler’s Modern English Usage (ed. R. W. Burchfield). As a less expensive (but less expansive) alternative, you could buy a dictionary that includes guidance on usage.

Grammar and usage may be complicated matters, but your task, as a student writer, remains simple: know those usages that generate the most controversy, choose conservatively, and, most important, make it your business to learn the policies of your individual professors. You can discover some of these policies by going to the Disciplines page of The Guide.

Reasons for usage preferences: lucidity, simplicity, directness

Every language has resources for clear, precise, economical expression. If a change in usage threatens some existing resource, other means must be—and always can be—found. Yet it’s reasonable to fight for the preservation of what has worked in the past.

  • Words. Usage authorities often hold out for subtle distinctions among words. Some insist, for example, that viable, which originally meant “capable of maintaining life,” should not be confused with practicable or workable. Practicable is often used interchangeably with practical, whereas sticklers employ the former to mean “possible in practice” and the latter to mean either “concerned with practice as opposed to theory” or “(of persons) inclined to action.” (The preferred negatives are, respectively, impracticable and unpractical.) Some authorities deprecate the use of disinterested to mean “uninterested, bored” as opposed to “impartial.” A few still deprecate the depreciation of deprecate through its confusion with depreciate.

  • Verb forms. Consider the following two sentences:

(1) Smith may have been a great ballplayer.
(2) Smith might have been a great ballplayer.

There is a fine distinction of meaning here that English is in danger of losing. Sentence (1) says, in effect, that it’s possible Smith really was a great ballplayer, whereas (2) says that Smith was not a great ballplayer but could have been one if circumstances had differed. What threatens the distinction is the increasing prevalence of sentences like the following: “If he could only hit the ball, Smith may have been a great ballplayer.” Now consider another two sentences:

(3) It’s very important that Jones has that degree.
(4) It’s very important that Jones have that degree.

The word have in (4) is an example of the subjunctive, a verb form that has been dying a slow death in American English. Readers and writers attuned to the subjunctive will read (3) to mean that Jones actually has the degree in question, and that her having it is a fact of some importance, while they’ll take (4) to mean that Jones doesn’t yet have the degree but must get it. The more speakers and writers use (3) when they mean (4), however, the less reliable the distinction becomes.

Reasons for usage preferences: character

“You mean to say that after all you are really going to be the kind of woman who the baker won’t let near the bread?” asks a character in a Jamaica Kincaid story. In our written and spoken language, as in much of our other behavior, we express our identity. In our accents, some of our vocabulary, and even some of our sentence structures we may unconsciously reveal such characteristics as birthplace, education, social class, and ethnic heritage.

Our language may also reveal personal qualities over which we have more control. How I speak and write says something about the kind of person I am—how I think and act, what I believe, what I value. In “The Decline of Grammar,” linguist Geoffrey Nunberg quotes Joseph Epstein to explain his own aversion to the word life-style:

The objection to the word “lifestyle” is that it is at too many removes from reality; in its contemporary usage are implied a number of assumptions about life that are belied by experience. Chief among these is an assumption about the absolute plasticity of character—change your lifestyle, change your life—that is simply not true; and the popularity of the word “lifestyle” is testimony to how much people want to believe it.

Do you really want to be the kind of person who embraces the life-style view of human identity? Do you want to be the kind of person who mixes metaphors? What kind of person knows the intricacies of the who/whom distinction and can manage them in print? What kind of person insists on obeying the distinction rigidly in speech?

Fowler’s often quoted admonition against the expression if and when is another example of a usage argument based on the manifestation of character in language:

Any writer who uses this formula lays himself open to entirely reasonable suspicions on the part of his readers. There is the suspicion that he is a mere parrot, who cannot say part of what he has often heard without saying the rest also. There is the suspicion that he likes verbiage for its own sake. There is the suspicion that he is a timid swordsman who thinks he will be safer with a second sword in his left hand. There is the suspicion that he has merely been too lazy to make up his mind between if and when. Only when the reader is sure enough of his author to know that in his writing none of these probabilities can be true does he turn to the extreme improbability that here at last is a sentence in which if and when is really better than if or when by itself.

Reasons for usage preferences: aesthetics

This concern overlaps with the two discussed above. A preference for simple, direct prose is to some extent an aesthetic one, and writers who strive to combine words in striking and attractive ways (see Care and Imagination) say something about their character.

In The Practical Stylist, Sheridan Baker attacks what he calls “of-and-which disease” largely on aesthetic grounds. The disease, which principally afflicts passive sentences, is “something like sleeping sickness. With’s, in’s, to’s and by’s also inflamed.” It’s plain that Baker finds the victim disfigured and deformed.

To illustrate “of and which disease,” Baker quotes the following sentence:

Many biological journals, especially those which regularly publish new scientific names, now state in each issue the exact date of publication of the preceding issue. In dealing with journals which do not follow this practice, or with volumes which are issued individually, the biologist often needs to resort to indexes . . . in order to determine the actual date of publication of a particular name.

“The passage,” he comments, “is a sleeping beauty. The longer you look at it, the more useless little attendants you see.” He goes on to advise that “you can cut most which’s, one way or another, with no loss of blood.” He recommends, for example, changing “a car which was going south” to “a car going south,” and “a person who is attractive” to “an attractive person.” The writer should beware “the whole crowd: who are, that was, which are” (Sheridan Baker, The Practical Stylist with Readings, 6th Edition. New York: Harper & Row, 1986, 287-8).