The Grammarphobia Blog

Suitable attachments

Q: It was pointed out to me the other day that I had used “attendence” when it should have been “attendance.” So then I wrote “correspondance” when it should have been “correspondence.” Any rules here or is this one of those things you have to remember?

A: The suffixes “-ance” and “-ence” are used to make nouns out of verbs or adjectives. But there’s no rule for sorting them out. The dictionary is your only hope.

Generally, these suffixes form two kinds of abstract nouns. Some have to do with an action or process (“performance,” “convergence”), and some refer to a state or quality (“elegance,” “absence”).

Why the different spellings?

The short (and rather misleading) answer here is that “-ance” and “-ence” differ because their respective Latin counterparts did, the classical suffixes -antia and -entia.

But the story isn’t that simple.

What complicates things is that a great many Latin nouns ending in -antia and -entia came into English from Old French before 1500. And the Old French spellings all were "levelled," as the OED says, to "-ance," thus ignoring the difference in Latin.

After 1500, however, new nouns coming into both French and English followed the Latin pattern, some ending in “-ance” and some in “-ence.” And some of the old “-ance” endings from the Middle Ages were even changed back to “-ence” to conform to Latin.

Meanwhile, “-ance” took on a life of its own as an English suffix. People began adding it to native English verbs to form nouns (as in “furtherance,” “forbearance,” “riddance,” and “hindrance”).

The result is that there’s no meaningful difference between “-ance” and “-ence” spellings today. Some words reflect a Latin spelling—like “impudence” (from impudentia) and “vigilance” (from vigilantia)—but many do not.

Check out our books about the English language

The Grammarphobia Blog

How notorious is notoriety?

Q: I was reading a press release from a specialty-food company when I noticed this sentence: “Montebello Kitchens soon gained notoriety as the source for nutritious, delicious, healthful, and flavorful specialty foods.” Is the language shifting or is this use of “notoriety” simply wrong?

A: The noun “notoriety” has meant either fame or infamy since it entered English in the 16th century, but it’s especially used, in the words of the Oxford English Dictionary, “for some reprehensible action, quality, etc.”

Although the published references in the OED suggest that “notoriety” has been used more often than not in the negative sense, the word’s roots are free of infamy.

The dictionary defines the word as “the state or condition of being notorious,” so let’s begin with the adjective “notorious,” which entered English in the late 15th century with “neutral or favourable connotations.”

The adjective is derived from the post-classical Latin notorius (simply meaning well-known) and the classical Latin notus (known).

However, one classical Latin relative did hint at things to come: a notoria was a written statement notifying the authorities of a crime.

It wasn’t until the mid-16th century that “notorious” took a turn toward the dark side—or, as the OED puts it, took on “depreciative or unfavourable connotations.”

The dictionary’s earliest citation for this usage is from the original Book of Common Prayer (1549): “Suche persones as were notorious synners.”

Getting back to “notoriety,” can the word be used today to mean fame as well as infamy?

The American Heritage Dictionary of the English Language (4th ed.) defines it as the “quality or condition of being notorious; ill fame.” The adjective “notorious,” meanwhile, is defined as “known widely and usually unfavorably; infamous.”

(Speaking of “infamous,” we’ve written a blog entry about its use to mean merely “famous.”)

Although one could make an etymological case for using “notoriety” in a positive way, the word carries a lot of negative baggage. That’s why you were puzzled by the Montebello Kitchens press release.

Would we use it positively? Perhaps, but only in rare situations.

For example, we might use it to make a hyperbolic point: “Groucho achieved notoriety as a punster.”

Or we might use it to make a play on words: “In Hitchcock’s Notorious, a close-up of the key in Ingrid Bergman’s hand gained a certain notoriety.”

Check out our books about the English language

The Grammarphobia Blog

Fixation and the alchemist’s art

Q: Is “fixate” really a verb? If you’re strongly fixed on something, then you have a fixation. But are you fixated? It seems that people have turned the noun into a verb.

A: Yes, “fixate” is a real verb. It was first recorded in the 1880s in the physical sense (to fix in place or stabilize). It was first used in the 1920s in a psychological sense (to form an abnormal emotional attachment).

This latter meaning is the one familiar to most of us, with the word usually appearing in an adjectival form (“fixated”) or in the passive (“to be fixated”).

The Oxford English Dictionary says this sense of “fixate” was originally a term in Freudian theory.

This is how the OED defines the psychoanalytic term: “to cause (a component of the libido) to be arrested at an immature stage leading a person to abnormal attachments to people or things, etc.”

Oxford’s first citation for this usage comes from William McDougall’s Outline of Abnormal Psychology (1926). In commenting on the Oedipus complex, McDougall wrote that “every infant normally becomes fixated upon the parent of the opposite sex.”

However, “fixated” has a looser, less technical meaning too. It can simply mean “obsessed with,” the OED says.

The dictionary’s first example of this looser usage is from George Orwell’s Critical Essays (1945): “It is clear that for many years he remained ‘fixated’ on his old school.” (He’s referring here to P. G. Wodehouse.)

Some commentators have suggested that this use of “fixate” may be a back-formation from “fixation,” though we haven’t found any standard dictionaries that say so.

A back-formation is a word formed by dropping part of an older one, and “fixation” is certainly older! It has its origins not in psychology but in alchemy.

“Fixation,” which comes from the medieval Latin fixationem (a noun of action), referred in the alchemist’s art to the “process of reducing a volatile spirit or essence to a permanent bodily form,” the OED says.

The dictionary’s earliest citation for the usage is from a reference to alchemy in Confessio Amantis (1393), a Middle English poem by John Gower: “Do that there be fixation / With tempred hetes of the fire.”

The noun “fixation” still has various meanings in the modern physical sciences. But, like “fixate,” it also has a technical meaning in psychology, first recorded in 1910 in a translation of Freud.

And, like “fixate,” the noun also has a looser meaning, one the OED compares to “an obsession, an idée fixe.”

The parent of these words is, of course, the verb “fix,” meaning to fasten—that is, make fast—or stabilize.

The earliest recorded use, says the OED, is “to fix (one’s eyes) upon an object,” although its “use in alchemy is nearly as old in English; it is found in the Romanic languages and in the medieval Latin writers on alchemy.”

For centuries, “fix” has been used to mean to fasten or concentrate one’s mind, attention, or affections on someone or something.

This brings a question to mind: Why the “-ate” in “fixate”? It seems that “fix” would do as well and that the suffix is superfluous.

We can think of two explanations.

It may be that other uses of “fix”—to adjust, arrange, prepare, put to rights, make ready, repair, and so on—just became too numerous. When a new usage came along, the suffix “-ate” was used to differentiate it from the rest.

Or perhaps the explanation is simpler, and “fixate”—at least in the Freudian sense—really is, as you suggest, a back-formation from “fixation.”

Check out our books about the English language

The Grammarphobia Blog

Why not look a gift horse in the mouth?

Q: As part owner of a racehorse, I’ve often wondered about the expression “Don’t look a gift horse in the mouth.” Have you ever written or seen anything written about its origin?

A: There’s a brief explanation buried in a blog entry we wrote five years ago about a related phrase (“straight from the horse’s mouth”).

As we noted in that posting, you can learn a lot about a horse’s age and general health by examining its teeth. So a wise trader looks in a horse’s mouth before buying it.

But when someone gives you a horse as a present, you don’t inspect it before you accept it. “Don’t look a gift horse in the mouth” means “Don’t quibble about a gift.”

Here’s a little more information. Within its entry for the noun “gift,” the Oxford English Dictionary says “gift-horse” means “a horse given as a present.”

The OED includes this citation from Samuel Butler’s mock-heroic poem Hudibras (1663): “He ne’er consider’d it, as loath / To look a gift-horse in the mouth.”

But this sentiment didn’t begin with Butler.

The old saying appeared in another form—with “given horse” instead of “gift horse”—in John Heywood’s A Dialogue of the Effectual Prouerbs in the English Tongue Concerning Marriage (1546): “No man ought to looke a geuen hors in the mouth.”

A 1906 edition of Heywood’s proverbs notes that an earlier, toothier version—“A gyven hors may not be loked in the tethe”—appeared around 1510 in the Vulgaria Stanbrigi, a book of proverbs collected by John Stanbridge.

But even in Stanbridge’s time, the proverb may have been over a thousand years old. The Anglican bishop and philologist Richard Chenevix Trench, in his book Proverbs and Their Lessons (1852), says:

“I will not pretend to say how old it is; it is certainly older than St. Jerome, a Latin father of the fourth century; who, when some found fault with certain writings of his, replied with a tartness which he could occasionally exhibit, that they were voluntary on his part, free-will offerings, and with this quoted the proverb, that it did not behove to look a gift horse in the mouth.”

St. Jerome’s words were “Noli … ut vulgare proverbium est, equi dentes inspicere donati” (“Don’t … as the popular proverb goes, inspect the teeth of a gift horse”).

Many languages have proverbs expressing a similar sentiment.

A letter published in 1873 in the journal Notes and Queries said the old horse-and-teeth proverb found its way into French in the 13th century: “Cheval donné ne doit-on en dens regarder” (“Don’t look at the teeth of a given horse”).

The ancient Greeks said more or less the same thing, minus the horse and the dentistry: “Praise the gift that anyone bestows.” Of course the Trojans would have disagreed.

Check out our books about the English language

The Grammarphobia Blog

Are we defeatists?

Q: You say “bemused” is rarely used in the traditional way and anyone doing it is almost certain to be misunderstood. I think your attitude is defeatist and downright wrong. I just did a Google News search for “bemused” and all the articles used it in the traditional way. Why do language mavens blithely accept the skunking of words, depriving us of useful nuances and handing unwanted victory to the ignorant?

A: We too are not amused by the use of “bemuse” to mean amuse, and you won’t see it in our writing. We prefer the traditional sense: to confuse, bewilder, or cause to be engrossed in thought.

But whether we like it or not, the newer usage has been adopted as standard English by Merriam-Webster’s Collegiate Dictionary (11th ed.).

As we said in the posting you consider defeatist, the M-W entry for the verb “bemuse” includes among its meanings “to cause to have feelings of wry or tolerant amusement.”

The American Heritage Dictionary of the English Language (4th ed.) still adheres to the traditional definition, but we’ll be interested to see what the fifth edition says when it’s released soon.

[A 2012 update: The fifth edition of American Heritage sticks to its guns. It includes a usage note that reads, “The word bemused is sometimes used to mean ‘amused, especially when finding something wryly funny,’ as in The stream of jokes from the comedian left the audience bemused, with some breaking out into guffaws. Most of the Usage Panel does not like this usage, with 78 percent rejecting this sentence in our 2005 survey. By contrast, 84 percent accepted a sentence in which bemused means ‘confused.’ ”]

You make a good point, though. We also did a Google News search and found that “bemused” seemed to be overwhelmingly used the traditional way by news organizations.

So, yes, copy editors have generally managed to stem the tide of language change at many news organizations.

But it’s another story in the real world—the one without copy editors to keep us in line. As Bryan A. Garner writes in Garner’s Modern American Usage (3rd ed.), “many writers mistakenly use bemuse as a synonym for amuse.”

One of the examples of the misuse he gives is from a review of a musical in the Chicago Tribune: “The show has a quirky humor that will bemuse jaded adults and even manages to touch some deeper chords without descending to the saccharine.”

But let’s hope you’re right and we’re too pessimistic about the fate of “bemuse.” We’d love to be proved wrong. And Garner doesn’t include it among his list of skunked terms or candidates for skunkdom.

Nevertheless, it’s the job of language commentators to observe shifts in common usage, and to report when those shifts are recognized by lexicographers as evidence that the language has changed. We wrote a posting earlier this year on our changing language.

Check out our books about the English language

The Grammarphobia Blog

Period piece

Q: When did we start emphasizing a point by saying or spelling out the punctuation at the end of it? Example: “I won’t pay that outrageous fine—period!” If one wants to emphasize a point, why not use a more emphatic punctuation mark? Example: “I won’t pay that outrageous fine—exclamation point!”

A: We’ve been saying or writing the word “period” to emphasize the end of a statement since at least the early 20th century, but we’ve had somewhat similar uses of the term since the 16th century.

The Oxford English Dictionary, which describes the word “period” here as an adverb, says the usage is chiefly North American. Here’s how the dictionary defines it:

“Indicating that the preceding statement is final, absolute, or without qualification: and that is all there is to say about it, that is the sum of it, there is no more to be said.”

The OED’s earliest citation for the usage is from Husbands on Approval, a 1914 comedy by William M. Blatt: “Have you finished what you were saying, Hamilton? Your heart has found its mate, period. That’s all you wanted us to know, isn’t it?”

A more recent example is this 2001 citation from the New York Times Magazine: “Like it or not, you are going to learn something today. Period.”

But as we’ve said, this usage has a history. It didn’t show up out of the blue.

The noun “period” has referred to the end of something since the mid-16th century. In Henry IV, Part 1, believed to have been written in the late 16th century, Shakespeare uses the word in that way: “The period of thy Tyranny approacheth.”

And here’s a recent example from The Liar, a 1991 novel by Stephen Fry: “Peter forbore once more to put a period to the rottenest life in the rottenest den in the rottenest borough in the rottenest city in all the rotten world.”

And since the late 16th century, according to the OED, the noun “period” has meant the “single point used to mark the end of a sentence,” though this usage is now North American. In Britain, a period is called a “full stop.”

If you’d like to read more about what to call this thingy at the end of a sentence, we wrote a blog item earlier this year about periods and full stops.

Why, you ask, don’t people say or write out the phrase “exclamation point” at the end of an emphatic statement? We dunno!

But here’s an interesting use of the phrase from the 1963 English translation of Vladimir Nabokov’s novel The Gift: “She was slowly mixing a white exclamation mark of sour cream into her borshch, but then, shrugging her shoulders, she pushed her plate away.”

By the way, we do often hear one other punctuation mark spoken: the quotation mark—or, rather, the clipped versions “quote” and “unquote.” In fact, we wrote a posting on the subject some time back.

Check out our books about the English language

The Grammarphobia Blog

Infinitive possibilities

Q: I teach business communications and I’ve always taught my students that infinitives are not verbs. But what about a sentence like “I am going to bake a cake”? Isn’t that the same as saying “I will bake a cake”? I am sitting with three English teachers pondering this question.

A: An infinitive is indeed a verb—a verb in its simplest form.

Infinitives often appear as parts of verb phrases, like the ones you mention: “will bake” and “going to bake.” In both phrases, “bake” is an infinitive.

You might be interested in a blog entry we wrote earlier this year about the use of “going to” in the sense of “will.”

As we said then, the most common way of expressing a future action is by using “will” plus an infinitive (“I will study tomorrow”).

But an alternative method has been around since the 1400s—the use of “be going to” plus an infinitive (“I am going to study tomorrow”).

The most obvious difference between the two is that “will” is followed by a bare infinitive (“study”) while “be going” is followed by the preposition “to” plus the infinitive.

Here’s the Oxford English Dictionary’s first citation for the use of “going to” in this sense. It comes from The Revelation to the Monk of Evesham (1482):

“Thys onhappy sowle … was goyng to be broughte into helle for the synne and onleful lustys of her body.”

Originally, the OED says, this use of “going to” meant “on the way to, preparing or tending to.”

But the dictionary says the construction is “now used as a more colloquial synonym of about to, in the auxiliaries of idiomatic complex verb phrases expressing immediate or near futurity.”

(Note that phrase, “immediate or near futurity.” When you say, “I will bake a cake,” you could be speaking of some remote event. But when you say, “I am going to bake a cake,” or “I’m about to bake a cake,” you’re speaking of something more immediate.)

Thus, “am going to bake” is what the OED would call an idiomatic complex verb phrase. Again, the infinitive here is “bake,” and “going to” functions as an auxiliary.

Infinitives work in many different kinds of verb phrases. For instance, it often happens that one verb is followed by a second in the infinitive. We’ve written about this subject before on the blog, but it bears repeating because infinitives are so widely misunderstood.

Many people don’t realize that in a sentence like “I saw her fall,” the verb “fall” is in the infinitive. In English, this is a very common pattern, especially when the first verb is one involving sensory perception (“see,” “feel,” “hear”).

Here are a few examples of the kinds of verbs that are often paired with infinitives (the infinitives are underlined):

“I helped her move” … “They saw us fight” … “We felt it shake” … “He heard her sing” … “You need not worry” … “Dare we ask?” … “I would rather die” … “We will let it rest” … “Let there be light.”

In addition, the auxiliary “do” is often used with an infinitive to form a question: “Do you smoke?” … “Did they drive?”

And the modal auxiliary verbs (“can,” “may,” “must,” etc.) take infinitives as their complements: “She may smoke” [or “May she smoke?”] … “We must leave” [or “Must we leave?”].

(We’ve had several postings on the blog about modal auxiliaries, including one earlier this month.)

In all of these cases, the second verb is in the infinitive because it needs no inflection. (An inflected verb changes in form to indicate number, tense, and so on.)

Some people don’t recognize these verb forms as infinitives because they expect infinitives to be preceded by “to.” As you can see, that’s often not the case.

Even when the “to” is present, it’s not actually part of the infinitive. It’s a prepositional marker indicating that the infinitive is coming up.

So you can’t “split” an infinitive, no matter what anyone tells you. We’ve written before on the blog about the “split infinitive” myth.

Check out our books about the English language

The Grammarphobia Blog

O death, where is thy sting?

Q: Everyone, including Charles Krauthammer, has done a 180 with the word “decimate.” It is everywhere both overused and misused.

A: You’re right that almost nobody uses “decimate” in its traditional sense (to kill every tenth man) or in its original English sense (to tax by a tenth).

Like it or not, “decimate” has acquired wider meanings over the years. The old sense of to kill every tenth man is still one of the word’s meanings—but only one.

“Decimate” is generally used these days to mean destroy in part. This is a standard usage, according to The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.).

As far as we can tell, that’s the way the columnist Charles Krauthammer uses the term.

In a May 5, 2011, column in the Washington Post, for example, he says the last Bush administration’s war on terror “scattered and decimated al-Qaeda and made bin Laden a fugitive.”

Although it’s now OK to use “decimate” in this looser sense, the word still carries some of its old etymological baggage.

That’s why we think it’s best to avoid using “decimate” in a way that clearly conflicts with its old sense of reducing by a tenth.

For example, we wouldn’t use it with a specific figure, as in “the storm decimated two-thirds of the city.”

And we wouldn’t use it to describe total destruction, as in “The crop was completely decimated.” Beware of those adverbs!

In Origins of the Specious, our book about language myths and misconceptions, we discuss the etymology of “decimate.”

As we write, the word comes to us from the Latin decimus, meaning a tenth, and decimare, to take a tenth. To the Romans, the verb meant to take a tax of one tenth. But it had a darker side too.

“Roman military commanders would sometimes ‘decimate’ a mutinous or cowardly unit by taking every tenth man and executing him,” we say in Origins. “This was called a decimatio, or ‘decimation.’ Occasionally only one in twenty were punished (a process called vicesimatio), or one in a hundred (centesimatio).”

Livy, Plutarch, Tacitus, and other historians of the time all describe incidents leading to military decimations.

“From the bits and pieces of information available, it seems that a rebellious unit was divided into groups of ten, with each group forced to choose lots,” we write in Origins. “The unlucky tenth man in each group was flogged or stoned to death, and as a final indignity the corpse might be decapitated. The survivors were later forced to sleep outside their encampment and to eat barley instead of their usual wheat.”

There’s no evidence, however, that this kind of punishment was common. And historians have suggested that commanders often rigged the lottery so only the ringleaders were killed.

When “decimation” first showed up in English, in 1549, it was used to mean a tithe or a tax of one tenth, according to the Oxford English Dictionary. It wasn’t used to refer to the military punishment until 1580, but that was in a translation of Plutarch.

Over the next three centuries, “decimation” and “decimate” were used in both senses— taxing and executing—though the taxation sense was more common.

Most of the punishment usages were references to classical times, though the British did occasionally revive the ancient practice. The second earl of Essex, for example, used it in Ireland in 1599, apparently inspired by reading a translation of Tacitus.

The OED quotes a 17th-century commentator as saying Essex “decimated certain troops that ran away, renewing a peece of the Roman Discipline.” Essex himself was later beheaded for treason.

In the mid-19th century, the word “decimate” lost the sense of taxing or tithing, but it embraced a meaning that had been seen only rarely in earlier times: to destroy in part or cause great damage.

Charlotte Brontë, for instance, wrote in a letter in 1848, “Typhus fever decimated the school periodically.” We’ve used the word that way ever since.

This sense of destroying in part has been firmly established in English for 150 years. War correspondents from the Crimean War to the Civil War to World War II and beyond have used “decimate” to refer to great destruction or loss of life.

Today this meaning is not only standard English, but also the most common meaning of the word.

Check out our books about the English language

The Grammarphobia Blog

Filling a few holes in the origin of spackle

Q: I’m reading your posting on “splatter” and “spatula.” Are those words also related to “spackle,” which I splatter and spread with a type of spatula?

A: Well, there may be a relationship here, or maybe not. We’ll try to fill a few holes, but the etymology is far from certain.

“Spackle” is a registered trademark, though we didn’t realize it until we began looking into your question.

The product, according to the Oxford English Dictionary, is “a compound used to fill cracks in plaster and produce a smooth surface before decoration.”

Though it’s still a proprietary name, The American Heritage Dictionary of the English Language (4th ed.) says, “this trademark often occurs in lowercase and as a verb in print.”

American Heritage gives this example from the New York Times: “Two young men quietly spackled and whitewashed the walls … for an exhibition.”

Merriam-Webster’s Collegiate Dictionary (11th ed.) has separate entries for the noun and the verb: (1) “Spackle,” capitalized, as the trademarked name of the product, and (2) “spackle,” lowercased, for the verb meaning “to apply Spackle paste to.”

Spackle was created and brought to market in 1927 by the Muralo Company, which patented it in 1928 and still has the trademark. Now a paste, Spackle was originally a dry powder that the user mixed with water.

Should the word be capitalized? We’d lowercase it unless we were referring to the Muralo product. The term “spackle” is now used for so many similar hole-and-crack-filling products that the trademark has lost its distinctiveness.

Where did the word come from? The Muralo website doesn’t say. The OED tells us that the word’s origin is “uncertain,” but it suggests a couple of possibilities.

First, there could be a connection between “spackle” and the verb “sparkle,” which was used in the late 18th and early 19th centuries as a technical term meaning “to overlay or daub with cement or the like.” (Who knew?)

Second, there’s a similarity with the German word spachtel (meaning a putty knife, mastic, or filler). The German term, the OED says, was originally a 16th-century variant of Spatel (spatula), from the Italian word for the same thing, spatola.

As we said in our earlier posting, “spatula” has had many variant forms since it entered English in the 15th century, including “spattle,” “spartle,” “spatter,” and “splatter.”

Perhaps there’s a spatula hidden in the etymology of “spackle” as well.

Check out our books about the English language

The Grammarphobia Blog

Is using “lay” for “lie” a hanging offense?

Q: I see that back in 2007 you tackled the misuse of “lay” for “lie.” Today, the demise of “lie” seems complete in spoken English. Shall I just try to ignore this ignorant blunder as I do with gum-cracking?

A: Perhaps it’s time to revisit “lie” and “lay,” but we’ll start by saying we think our old post still holds up, four years later.

This is what we said then:

“The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.) still continue to give the same old principal parts of ‘lie’ and ‘lay’ that our grandparents learned.”

And nothing’s changed in the interval.

Here’s a brief review of the subject, from Pat’s grammar and usage book Woe Is I:

“LIE (to recline): She lies quietly. Last night, she lay quietly. For years, she has lain quietly.

LAY (to place): She lays it there. Yesterday she laid it there. Many times she has laid it there. (When lay means ‘to place,’ it’s always followed by an object, the thing being placed.)”

Them’s the facts, ma’am. But here are a few more.

First, even people who bother to use these words correctly in writing may slip up in speech.

And second, mixing up “lie” and “lay” wasn’t always considered a mistake, though it’s now regarded as one of the classic boo-boos of English grammar.

“These verbs are one of the most popular subjects in the canons of usage,” says Merriam-Webster’s Dictionary of English Usage.

Both words have long histories. They were first recorded in writing in the 800s, according to the Oxford English Dictionary.

Each has a prehistoric Germanic base in its ancestry: leg- for “lie” and lag- for “lay.” The relationship between the two is causative—to lay something is to cause it to lie.

“Lie” has always meant, more or less, to be in a recumbent position. It’s generally an intransitive verb, meaning that it needs no direct object (as in “Don’t lie there”).

“Lay” originally meant to place, set, or cast down. It’s generally a transitive verb, meaning that it needs a direct object (as in “Don’t lay your shoes there”).

When “lay” is used as an intransitive verb—as in “Don’t lay on the couch”—the OED says it’s “only dialectal or an illiterate substitute for lie.” But this apparently wasn’t considered a mistake in the 17th and 18th centuries, the dictionary adds.

What largely accounts for the confusion, in the opinion of many commentators, is that the words overlap in different tenses: the past tense of “lie” is “lay.”

So someone who says “I lay on the couch” could be correctly using the past tense of “lie” or incorrectly using the present tense of “lay.”

All standard dictionaries, the M-W usage guide says, “mark the intransitive lay for lie as nonstandard in one way or another.” Yet the mistake persists, particularly in speech.

“The conflict between oral use and school instruction,” Merriam-Webster’s says, “has resulted in the distinction between lay and lie becoming a social shibboleth—a marker of class and education.”

Despite the stigma, says M-W, even people who know the rules and follow them in formal situations may not bother in “informal, friendly circumstances.”

But as we’ve said before, speech is one thing and written English is another.

As Merriam-Webster’s acknowledges, “by far the largest part of our printed evidence follows the schoolbook rules.”

In summary, the old schoolbook distinction between “lie” and “lay” isn’t going away—at least not in writing.

So it’s worth knowing the difference and keeping them straight, especially when you write. But if you forget yourself when you talk, that’s not a hanging offense.

Check out our books about the English language

The Grammarphobia Blog

Trend games

Q: Would you please discuss the trend of using “trending” in news reports? It seems to have cropped up in the past few months and is now ubiquitous. Nary a news report goes by without saying, “Here’s what’s trending.” It rubs me the wrong way.

A: We find this usage a bit trendy, but it doesn’t bug us all that much. If it gets used too much, it’ll lose its trendiness and you’ll be hearing less of it.

The present participle “trending” is being used here to indicate news that’s hot or breaking or engaging or noteworthy or otherwise interesting (at least to the editors).

The CNN website, for example, often begins headlines with this participle: “TRENDING: McCain cautions Christie” … “TRENDING: Palin threatens lawsuit over book” … “TRENDING: Senators place blame for budget stalemate.”

Although standard dictionaries don’t mention this precise sense in their entries for the verb “trend,” it does call to mind one common meaning of the noun “trend”: a new and popular style or fashion.

When the verb entered English more than a thousand years ago, it meant to revolve or rotate or roll, according to the Oxford English Dictionary, but that sense is now considered obsolete.

In the late 16th century, the verb took on one of its contemporary meanings: to extend in a general direction or follow a general course (as in, “The path trends to the north”).

It wasn’t until the mid-19th century that the verb took on the figurative sense that’s common today: to show a tendency or a shift (“Opinions now trend in the conservative direction”).

The OED’s earliest citation for this figurative use is from the British novelist George Alfred Lawrence’s 1863 account of his failed attempt to fight for the Confederacy:

“In which direction do the sympathies and interests of the Border States actually trend?”

Be patient! Trendy usages tend to be overworked and become stale. Eventually, they rub enough people the wrong way … and then they aren’t trendy anymore.

Check out our books about the English language

The Grammarphobia Blog

Hey, Mr. Taliban

Q: I used to HATE hearing American newscasters use “Taliban” as a plural. I considered it a British affectation. I finally asked my longtime copyeditor, one of the best in the Bay Area, and she politely informed me that “Talib” is singular and “Taliban” is plural. So it’s OK for an American to use “Taliban” as a plural, because it is one!

A: “Taliban” is widely regarded as a plural noun in both the US and the UK.

Most news organizations, both here and abroad, treat “Taliban” as plural. Here are snippets from a few news and opinion articles published in recent weeks:

“The Taliban have not claimed responsibility” (AP) … “the Taliban have ordered their forces” (CNN) … “the Taliban have embarked” (The New Yorker) … “the Taliban are now inveigling children” (Huffington Post) … “even if the Taliban are in the ascendancy” (Financial Times) … “the Taliban are rightly accused” (The Guardian) … “The Taliban are seeking” (Toronto Star) … “The Taliban are extremely unlikely” (New York Times) … “the Taliban are being strangled” (Fox News).

However, it’s not unusual to find “Taliban” used as a singular collective noun when referring to the Islamic group as a single entity. Here are some examples:

“The Taliban is a concern, but it’s not public enemy number one” (Washington Post) … “the Pakistani Taliban has stepped up attacks” (Voice of America) … “the Taliban has no interest in reconciliation” (CNN) … “the Taliban has returned to those arid hills” (Boston Globe) … “The Taliban is using the idiom of justice as its calling card and recruiting card” (Brig. Gen. Mark Martins, quoted in the Wall Street Journal).

For the period we examined in a Google News search, the uses of “Taliban” with plural verbs (“are,” “were,” “have”) outnumbered those with singular verbs (“is,” “was,” “has”) by about two to one.

Who’s right—is “Taliban” plural, or is it a singular collective noun?

Only a few standard dictionaries have weighed in on this, since “Taliban” came into use in English less than 20 years ago.

The American and English versions of the Cambridge Dictionaries Online say “Taliban” can be used with either a singular or plural verb, but no examples are given.

Merriam-Webster’s Online as well as the online Longman Dictionary of Contemporary English say “Taliban” is a plural noun.

Etymology is on the side of the plural usage, and that’s the position taken by most news organizations.

In the Pashto and Persian languages, ṭaliban is a plural form of ṭalib, a word from Arabic that means a student or seeker. (In Arabic, the plural would be tullab or talaba.)

The Oxford English Dictionary has no separate entry for “Taliban,” but under the noun “Talibanization” it notes: “The name Taliban arises from the fact that the movement began amongst Afghan Islamic students exiled in Pakistan.”

The dictionary’s first citation for the name in English is from a January 1995 issue of Asiaweek:

“A powerful new armed faction, known as the Taliban or ‘religious students,’ mysteriously emerged in October and has already transformed the balance of power in southern Afghanistan.”

Etymology aside, there’s an argument to be made for using “Taliban” as a singular in the US when referring to the Islamic organization itself, rather than the members of it.

In American usage, collective nouns like “company” and “government” are routinely treated as singular, while British speakers commonly treat them as plural.

As for how to refer to an individual member of the group, we hear “a Taliban” more often than “a Talib,” etymology be damned. But the usual practice is to use “Taliban” as an adjective in phrases like “a Taliban fighter” or “a Taliban suicide bomber.”

In his book Organizations at War in Afghanistan and Beyond (2008), Abdulkader H. Sinno defends the singular usage of “Taliban” this way:

“Taliban joins the Arabic noun talib (seeker, as in seeker of truth or knowledge) with the plural Dari and Pashto suffix ‘an.’ I refer to the Taliban in both the singular and plural to reflect current practice. It is more accurate to use the singular tense, however, because the Taliban is an organization with a structure and not an amorphous group of students like the name would indicate and the organization’s mythology would imply.”

Our position is that the plural use of “Taliban” for members of the movement is firmly established in common usage and is etymologically sound. But when speaking of the organization rather than its members (as in “The Taliban is growing”), it’s reasonable to use the word as a singular collective noun.

Check out our books about the English language

The Grammarphobia Blog

A foolish consistency?

Q: Is there a difference between “consist in” and “consist of”? If so, when do you use “in” and when do you use “of”?

A: Yes, there is a difference, but many writers, especially Americans, use “consist of” consistently, and we wouldn’t be surprised if “consist in” is eventually lost.

Here’s how The American Heritage Dictionary of the English Language (4th ed.) describes the two verbal phrases:

To “consist of” means to “be made up or composed: New York City consists of five boroughs.”

To “consist in” means to “have a basis; reside or lie: The beauty of the artist’s style consists in its simplicity.”

If you find that a bit fuzzy, Bryan A. Garner, in Garner’s Modern American Usage (3rd ed.), offers a clearer explanation.

Garner explains that “consist of” is used in reference to “the physical elements that compose a tangible thing.”

“The well-worn example,” he writes, “is that concrete consists of sand, gravel, cement, and water.”

Garner says “consist in” means “to have as its essence,” and refers to “abstract elements or qualities, or intangible things.”

“Thus,” he writes, “a good moral character consists in integrity, decency, fairness, and compassion.”

As an example of “consist of” used incorrectly for “consist in,” Garner offers this sentence written by Henry Kissinger in 1994: “The beginning of wisdom consists of recognizing that a balance needs to be struck.”

We’ll end with a brief history of the verb “consist,” which comes from the Latin consistere (to place oneself, stand still, stop, remain firm, exist).

When the verb entered English in the 15th century, according to the Oxford English Dictionary, it meant to “have a settled existence, subsist, hold together, exist, be.” But these senses are now considered obsolete or archaic.

The two usages you ask about (“consist in” and “consist of”) showed up in the 16th century, but it didn’t take long for people to start using them inconsistently.

For example, the 18th-century lexicographer Samuel Johnson, in translating a travel book from Portuguese into English, wrote, “The whole Revenue of the Emperor consists in Lands and Goods.”

But perhaps Johnson, like Emerson, believed, “A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines.”

Check out our books about the English language

The Grammarphobia Blog

Lexical buttinskies

Q: I’ve always thought of tmesis as a newish, nifty linguistic device for being emphatic, folksy, funny, or just plain crude. However, I recently read “A Hymn to Christ,” where John Donne uses it twice in the first stanza. So does tmesis have a respectable literary past? And is “whatsoever” (the word Donne splits) an example of it?

A: Let’s begin by explaining this linguistic critter for readers of the blog who aren’t familiar with the term “tmesis” (pronounced TMEE-sis).

Webster’s Third New International Dictionary, Unabridged, defines “tmesis” as a “separation of parts of a compound word by the intervention of one or more words (as what place soever for whatsoever place).”

When we see tmesis these days, it’s usually used for emphasis or humor, often by inserting a crude term between the parts of the compound word.

Here are a few examples of these buttinskies in compound words: “abso-damn-lutely” … “a whole nother” … “un-fucking-believable” … “any-bloody-body” … “god-freaking-awful.”

The earliest example of tmesis in the Oxford English Dictionary is from a 1592 definition of the term that includes this example: “What might be soeuer vnto a man pleasing” (“What might be soever unto a man pleasing”).

As you suspect, tmesis has a respectable literary past. In fact, it’s been used as a poetic device since classical times, though it didn’t show up in English until the 16th century. The term ultimately comes from the Greek verb temnein (to cut).

Interestingly, “whatsoever,” the compound used in the OED and Webster’s examples, is the same term that’s split twice in the 1619 poem you cite. Here are the opening lines, which describe Donne’s concerns as he sets off on a trip across the Channel:

“In what torn ship soever I embark,
That ship shall be my emblem of Thy ark;
What sea soever swallow me, that flood
Shall be to me an emblem of Thy blood.”

Here are some literary examples of tmesis in Shakespeare, from the late 16th and early 17th centuries:

“This is not Romeo, he’s some other where,” Romeo and Juliet

“That man, how dearly ever parted,” Troilus and Cressida

“If on the first, how heinous e’er it be,” Richard II

You’ve asked if “whatsoever” itself is an example of tmesis.

At first glance, it looks as if it is indeed a textbook example, with “so” inserted between the two parts of “whatever” for emphasis.

But “whatsoever” has been an English word since the early 14th century, a combination of the archaic “whatso” and “ever,” according to the OED. In fact, “soever” was also a word, though it showed up a couple of hundred years after “whatsoever.”

As for other “-soever” words, most have origins similar to that of “whatsoever.”

The pronoun “whosoever” was originally formed from “whoso” and “ever,” the adverb “whensoever” from “whenso” and “ever,”  the adverb “wheresoever” from “whereso” and “ever,” and so on.

The “-soever” word that comes closest to being an example of tmesis is “howsoever,” but the OED says it originated in the 16th century by combining “how” and “so” and “ever,” not by sticking “so” in the middle of “however.”

By the way, we wrote a posting earlier this year about compound words, and included a list of triple compounds from a reader.

We’ll end this with a modern literary example of tmesis, from Kingsley Amis’s 1960 comic novel Take a Girl Like You: “It’s a sort of long cocktail—he got the formula off a barman in Marrakesh or some-bloody-where.”

Check out our books about the English language

The Grammarphobia Blog

Who put the “feck” in “feckless”?

Q: I recently read Consequences by Penelope Lively and came across the words “diffident” and “feckless.” I memorized them for the College Board exams (in the ’60s), but I go to the dictionary now when I come across them, usually in more “literary” writing. I’ve never heard anyone actually say either word. Have you?

A: “Diffident” and “feckless” don’t often come up in conversation—at least not in ours! These are words we encounter mostly in literary works like the novel you just read.

We do see them once in a while in less literary writing. Diplomats, for example, seem to like using “diffident” and “feckless” to say undiplomatic things.

A US cable revealed recently by WikiLeaks described Philippine President Benigno Aquino as “a diffident, unassertive man.” And another cable referred to Italian Prime Minister Silvio Berlusconi as “feckless, vain, and ineffective.”

Today “diffident” means timid, shy, and lacking self-confidence. But this wasn’t always so.

When it first entered English, in the late 1500s, the adjective “diffident” meant distrustful. It came from the Latin diffīdentem, which is traceable to the verb diffīdere (to mistrust) and ultimately to fidere (to trust).

The adjective was preceded by the noun “diffidence,” which was known since before 1400, according to the Chambers Dictionary of Etymology.

In the 17th century, both “diffident” and “diffidence” shifted gears, according to entries in the Oxford English Dictionary. They began to be associated with a lack of confidence in oneself, rather than a lack of confidence in others.

As your ear will probably tell you, the Latin words fides (trust, belief) and fidere have given us a whole family of English words: “confide” and “confidence,” “defy,” “faith,” “fealty,” “fidelity,” “fiduciary,” “perfidy,” and “infidel.” And believe it or not, “federal” is part of the family!

“Feckless,” on the other hand, comes not from Latin (at least not directly) but from dialects spoken in Scotland and northern England. It was first recorded in the late 1500s and means—listen for the echo—ineffective.

We know what you’re thinking. Is there a word “feck,” to which “less” was added? The answer is yes!

“Feck” is in fact a Scottish shortening of “effect,” Chambers says. And the ancestor of “effect” is the Latin verb efficere, meaning to work out or bring about. The Latin word is a compound of ex (out) and facere (to make or do).

The noun “feck” was first recorded in the late 1400s and means, in the words of the OED, “operative value, efficacy, efficiency” and hence also “vigour, energy.” It’s still used in parts of Britain today.

Originally, the OED says, “feckless” was used to describe things (not people) that were considered “valueless, futile, feeble.”

Later, it was used chiefly to describe people believed to be “lacking vigour, energy, or capacity; weak, helpless; (now more usually) irresponsible, shiftless.”

So “feckless,” too, has shifted gears somewhat. These days, it often describes not just incapacity or inability but moral weakness.

They’re both fine old words—“diffident” and “feckless.” It’s nice to come across them in literary writing as well as in leaked diplomatic cables.

Check out our books about the English language

The Grammarphobia Blog

Drive friendlily?

Q: After attending a business conference in San Antonio, I rented a car and did a little sightseeing. What’s with all those “Drive Friendly—The Texas Way” signs? Shouldn’t it be “friendlily”? Don’t they teach grammar in the Lone Star State?

A: We’re not familiar with the state of grammar education in Texas, but the wordsmiths at the Texas Department of Transportation got this right.

“Friendly” has been both an adverb and an adjective since the Middle Ages. In fact, “friendlily” is the klutzy latecomer—it didn’t arrive on the scene until the 17th century. For most of us, it never really did arrive.

Although “friendly” has the telltale mark of most adverbs—an -ly ending—it’s widely considered just an adjective.

In Origins of the Specious, our book about language myths, we discuss this and other misconceptions about adverbs. We’ve also had several items on the blog about adverbs without -ly tails, including a posting back in 2006.

We often hear from people who get bent out of shape when they see a “GO SLOW” sign on a suburban street.

“What’s happening to adverbs?” they complain. “Why is everybody using adjectives instead? Is the -ly disappearing from English?”

The handwringers apparently believe that an adverb, a word that modifies a verb, has to end with -ly. As far as they’re concerned, “slow” is an adjective, “slowly” is an adverb, and never the twain shall meet.

The truth is that adverbs can come with or without tails. The ones without -ly (they’re called simple or flat adverbs) were seen more often in the past, though they may be making a revival now, if our mail is any indication.

Many adverbs, like “slow” and “slowly,” exist in both forms. In such cases, usage experts generally recommend the -ly version for formal writing, but there are lots of exceptions.

No one would insist, for instance, on “lately” in a sentence like “The plane arrived late and we missed our connection.” (“Lately,” as you know, means recently, not tardily.)

The most respected writers use phrases like “sit tight,” “go straight,” “turn right,” “work hard,” “rest easy,” “aim high,” “dive deep,” “play fair,” and “think fast.”

Yes, “straight,” “right,” “hard,” and the rest are bona fide adverbs, and they’ve been adverbs since the Middle Ages.

So why do so many people believe that an adverb must end in -ly? Here’s some history, from Origins of the Specious:

“We’ve had adverbs with and without the -ly (or archaic versions of it) for more than a thousand years. In Old English, adverbs were often formed by adding -e or -lice to the end of adjectives. Over the years, the adverbs with a final e lost their endings and the -lice adverbs evolved into the modern -ly ones. Take the word ‘deep.’ The Old English adjective diop had two different adverbs: diope and dioplice, which eventually became the modern adverbs ‘deep’ and ‘deeply.’

“Sounds simple, right? So how did things get confusing? You guessed it—the Latinists strike again. In the 17th and 18th centuries, they insisted that adjectives and adverbs should have different endings in English, just as they do in Latin. So these busybodies began tacking -ly onto perfectly legitimate flat adverbs, and preferring -ly versions where both kinds existed.

“The lesson? Next time you start to pounce on someone for using an adverb without -ly, go slow. And go to the dictionary.”

Check out our books about the English language

The Grammarphobia Blog

When a helping word makes you cry help!

Q: I transcribe sonogram reports for a doctor who routinely uses this passage: “should any nodule become larger or develop/develops suspicious characteristics.” I’m confused—“develop” or “develops”? I know “nodule” is singular and needs a singular verb, but does “any” change this?

A: The right choice is “develop,” but “any” has nothing to do with it. And it doesn’t matter whether the subject is singular or plural. Here’s the story.

The passage you cite includes two verbal phrases: “should … become” and “should … develop.”

You may be confused because there’s only one “should” (the second is understood) and because a couple of words slipped in between “should” and the first verb.

Here’s the passage with several omitted but understood words in brackets: “should any nodule become larger or [should any nodule] develop suspicious characteristics.”

The “should” at the beginning of the original passage is a helping word, or auxiliary, that indicates the two verbal phrases are conditional. (A conditional verbal phrase describes an action that depends on another situation.)

Technically, the word “should” here is a modal auxiliary, a verb that’s used, among other things, to indicate the conditional mood.

So why doesn’t it matter whether the subject is singular or plural?

Because a modal auxiliary like “should” is accompanied by an infinitive (“become” and “develop” in this case). And the infinitive remains the same for singular and plural subjects.

Examples: “He should become” … “They should become” … “I should develop” … “We should develop.”

Although the doctor’s passage is standard English, it’s often seen in a version with “if” at the beginning: “If any nodule should become larger or develop suspicious characteristics.”

As The Cambridge Grammar of the English Language explains, a version like the doctor’s is created by omitting “if” and reversing the positions of the subject and the auxiliary.

Many people don’t think of “become” and “develop” in the examples above as infinitives because they aren’t preceded by “to.” But as you can see, an infinitive often appears without it.

Even when the “to” is present, it’s not actually part of the infinitive. It’s a prepositional marker indicating that the infinitive is coming up.

In other words, you can’t “split” an infinitive, no matter what anyone tells you. We’ve written before on the blog about the “split infinitive” myth.

Check out our books about the English language

The Grammarphobia Blog

Does “duck and cover” have fowl origins?

Q: When that six-ton NASA satellite fell to earth a few weeks ago, I got to wondering about the expression “duck and cover.” Does it have anything to do with ducks?

A: The phrase “duck and cover” doesn’t have anything to do with ducks—at least not directly.

The verb “duck,” meaning to dip, plunge, or dive, is what gave the waterfowl its name. The bird is called a “duck” because it “ducks” or dives below the water’s surface.

The verb is probably derived from the Old English ducan (to dive), which has prehistoric West Germanic roots, according to John Ayto’s Dictionary of Word Origins.

The Old English verb was the source of the bird’s name in Anglo-Saxon times: duce. The spelling “duck” developed later, for both the noun and the verb.

The Oxford English Dictionary says another (and drier) sense of the verb “duck” came along in the 16th century: “to bend or stoop quickly so as to lower the body or head; to bob; to make a jerking bow.”

And it’s this sense of the verb that gave us the phrase “duck and cover,” which describes a defensive posture in which one lowers the head and covers it with the arms or hands.

Any American who grew up in the 1950s will recall the “duck and cover” drills in which students were instructed to duck under their desks and protect their heads in case of a nuclear attack.

A 1951 civil defense film called Duck and Cover told schoolchildren what to do if an atomic bomb were dropped. The recurring message: “Duck and cover fast!”

An animated opening sequence featured a cartoon character, Bert the Turtle, who demonstrated the “duck and cover” technique by withdrawing into his shell.

A turtle, yes, but not a duck in sight!

Check out our books about the English language

The Grammarphobia Blog

Bells are ringing

Q: Why does saying someone has bats in the belfry mean he (or she) is crazy? What do bats and belfries have to do with it? And, for that matter, why is a belfry called a “belfry”? Is it because of the bells?

A: Let’s begin with the word “belfry.” What comes to mind? Bats and bells, no doubt. That’s where bats hang out, and that’s where bells ring.

It’s reasonable to conclude that the origin of “belfry” had something to do with bells. But we shouldn’t jump to conclusions.

Linguists believe the word ultimately comes from the prehistoric Germanic word bergfrid, meaning a place of shelter.

But the word entered English in the 12th century via the Old French berfrei, a siege tower. No bells here.

For the first few hundred years, the word was spelled all sorts of ways in English (“berefrei,” “berfrey,” “barfray,” etc.), and it meant a siege tower, a movable structure used to protect attackers besieging a fortification.

The word wasn’t used for a bell tower until 1440, about the same time “bel” entered the picture (“belfray,” “belfroy,” “belfrie,” and finally “belfry”).

The lexicographers at the Oxford English Dictionary say the new “bel” spelling was undoubtedly popularized by the use of the term for a bell tower.

But let’s not leave the bats hanging.

For all we know, bats have taken up residence in belfries for hundreds of years. The expression “bats in the belfry,” though, comes from another meaning of “belfry.”

In the early 1900s, “belfry” was a slang expression for someone’s head. To have “bats in the belfry” was to be nuts—in other words, batty.

Check out our books about the English language

The Grammarphobia Blog

How do you say “double entendre”?

Q: How should an English speaker pronounce “double entendre”? Like French? Or like English? Or whatever?

A: Let’s begin with a little history.

English adopted “double entendre” in the 17th century from a now-obsolete French phrase that meant double understanding or ambiguity. But English speakers gave the expression a new, suggestive twist.

The Oxford English Dictionary defines the phrase this way: “A double meaning; a word or phrase having a double sense, esp. as used to convey an indelicate meaning.”

The earliest citation in the OED is from John Dryden’s 1673 comedy Marriage a-la-Mode: “Chagrin, Grimace, Embarrasse, Double entendre, Equivoque.”

And here’s a 1694 example from Dryden’s play Love Triumphant: “No double Entendrès, which you Sparks allow; / To make the Ladies look they know not how.”

Interestingly, there’s no exact equivalent in modern French to our expression “double entendre.” Two near misses, double entente and double sens, don’t have the suggestiveness of the English version.

So how should an English speaker pronounce our illegitimate offspring? Illegitimately, of course.

Dictionaries are all over the place on this, but we treat “double” as an English word (DUB-ul) and “entendre” as if it’s French (ahn-TAN-dr).

Check out our books about the English language

The Grammarphobia Blog

A reprehensible posting

Q: If something is reprehensible, can we reprehend it? Or do we “reprimand” it?  If so, is it reprimandable?

A: Yes, we can (and do!) “reprehend.” And if we “reprehend” something, that means we find it “reprehensible.”

We don’t use the verb “reprehend” much anymore, which is too bad. It’s an expressive word, meaning to reprimand, reprove, find fault with, censure, condemn, or disapprove.

“Reprehend” entered English in the 1300s. It ultimately comes from a classical Latin verb, reprehendere, which the Oxford English Dictionary defines as meaning “to hold back, to retrieve, to censure, to find fault with, to rebuke, to refute.”

That Latin verb is also the source of the Latin adjective reprehensibilis (open to censure, blameworthy), from which “reprehensible” is derived.

The English adjective, says the OED, means “deserving of reprehension, censure, or rebuke; reprovable; morally detestable.”

It was first recorded, according to the dictionary’s citations, in the Wycliffe Bible of 1384: “for he was reprehensyble, or worthi for to be reprouyd [reproved].”

Here’s a more up-to-date citation, from a 2001 issue of the New York Review of Books: “Terrorism is reprehensible and unacceptable.”

A closer look at the Latin reprehendere shows that it consists of the prefix re- plus prehendere (to grasp, seize, or catch), which is the source of the now rare English verbs “prehend” (to seize, arrest, or grasp) and “prend” (to take, understand, or comprehend).

The English words “reprehend,” “comprehend,” and “apprehend” all have similar Latin origins, and have to do with seizing, grasping, or laying hold of something—whether physically or mentally.

By this time you’ve probably noticed a recurring theme here—a “hend” keeps cropping up.

As the OED explains, the Latin prehendere consists of the prefix pre- plus a second element. And this second element comes from the same Indo-European base as our Germanic verb “get” (reconstructed as hed), with a nasal “n” thrown in.

You can hear an Indo-European echo in the Old English verb gehende, which means near or convenient—literally, “at hand.”

You also asked about “reprimand.” It first showed up, both as a noun and as a verb, in the 17th century. Its lineage goes back to the Latin reprimere (to hold in check), source of the 14th-century English word “repress.”

“Reprimandable” isn’t in any dictionary we’ve checked, perhaps because so many other words might fit the bill: “blameworthy,” “censurable,” “contemptible,” “discreditable,” “disreputable,” “reproachful,” etc.

Check out our books about the English language

The Grammarphobia Blog

Is “injust” one of those things?

Q: My daughter came home with a list of words to study for her third-grade class. One was “injustice,” which she had to use in a sentence. She then had to use derivatives and she wrote a sentence using “injust.” I told her “injust” wasn’t a word and she ought to use “unjust,” but she insisted her teacher said it was correct. Can you help clarify?

A: Well, you won’t find “injust” in standard dictionaries, but it is indeed a word—an antiquated adjective that may be having a revival.

The Oxford English Dictionary, which describes the word as “obsolete,” says “injust” means the same as “unjust”: that is, not just.

The earliest citation for “injust” in the OED is from a collection of poems, published sometime before 1430, by John Lydgate. The latest is from a 1711 diary entry by the English antiquarian Thomas Hearne.

A series of Google searches suggests that “injust” began showing signs of a rebirth in the 1970s. Since then, there have been more than 700,000 sightings of the usage on Google.

Despite all its recent fans, we wouldn’t describe “injust” as standard English—at least not yet.

For now, if we meant unjust, we’d use “unjust,” the older adjective and by far the more popular. It entered English in the late 1300s, according to OED citations, and gets more than 25 million hits on Google.

The noun “injustice,” which also entered English in the late 1300s, means the opposite of justice or an action that’s unjust.

All three words are ultimately derived from two Latin terms concerning justice: the noun justitia (justice) and the adjective justus (just).

Interestingly, “injustice” and “injust” have negative Latin prefixes, while “unjust” combines an Old English prefix with a Latin root.

A traditionalist, especially a Latinist, might argue that “injust” is the more “legitimate” adjective because it reflects its Latin roots better than “unjust.”

But the use of negative prefixes in English with words of Latin origin is so capricious that it’s meaningless to use a word like “legitimate” here.

From the 14th century on, the OED notes, the negative prefixes “in-” and “un-” have been added with “considerable variation” to words of Latin origin.

In fact, some of these words had versions using both prefixes. For example, “inability,” “incorrigible,” “incurable,” and “indiscreet” once existed alongside “un-” versions.

Since the 17th century, the OED says, there’s been a tendency “to discard one or other of the doublets, the forms with in-, etc., being very commonly preferred when the whole word has a distinctively Latin character, as inadequate, inadvertence, inarticulate, etc.”

But there’s “no absolute rule” about whether to keep one or both prefixes, the OED adds, “and doublets are still numerous, as in- or un-advisable, in- or un-alienable, etc.”

Getting back to your question: yes, “injust” is a word, but we suspect that your daughter’s third-grade teacher doesn’t know much about etymology and simply mistook it for the far more common adjective “unjust.”

Check out our books about the English language

The Grammarphobia Blog

Cite lines

Q: I was surprised to see “cite” used as a noun in your recent posting about the word “anachronism.” Am I behind the curve? I hope not. I wouldn’t want to see this become standard. I expected to see “citation,” but would have preferred “example.”

A: We do indeed use “cite” every once in a while on our blog when we tire of “citation,” “example,” “reference,” “instance,” “quotation,” etc. As a reader of the blog, you’re aware that we do a lot of citing!

Yes, the word “cite” is a bit jargony, but it’s common among linguists, etymologists, and other language types.

The two standard dictionaries we use the most—The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.)—don’t have entries for “cite” as a noun.

But the Oxford English Dictionary has an entry for the noun and considers it standard English. In fact, the OED lists more than half a century of published references for “cite” used as a shortened form of “citation.”

The usage has been particularly handy in legal and academic writing, where authors frequently have to cite cases and references. (Yikes! This is probably the first time we’ve ever defended a usage by citing legal and academic writing.)

The first example given in the OED is from a 1957 issue of the Atlantic Reporter, a regional case-law publication: “The Legislature in 1951 passed the Police Tenure Act, (cite. omitted).”

Note that the term first appeared with a period, a clear indication that the editors considered it an abbreviation. But subsequent examples in the OED dispense with the period.

Here’s a 1975 cite from Bookletter, a New York literary periodical that was once published by Harper’s: “He has personally collected a file of over 250,000 cites.”

Here's another usage, from the Yale Law Journal (1998): “First a cite to Morrall, then a cite to the source citing Morrall, and so on until the connection to Morrall is forgotten.”

This abbreviation was probably inevitable, given that the noun “quote”—short for both “quotation” and “quotation mark”—has been around since the 19th century. We touched on this issue in a posting a couple of years ago.

In short, the use of “cite” as a noun, short for “citation,” has been around for more than half a century and it’s certain to last. So you might as well make your peace with it.

Check out our books about the English language

The Grammarphobia Blog

Tent and tambourine

Q: Are you familiar with the expression “so happy she wanted to rent a tent and a tambourine”? An ESL teacher in Hungary says it’s on an English test used in the high school where she works. I can’t find anyone who’s ever heard of it.

A: We’d never heard that “tent and tambourine” expression until you brought it to our attention.

We can’t tell you where it originated or when, but such expressions refer to old-fashioned revival meetings, jubilant gatherings held under tents to the accompaniment of tambourines.

When someone says he wants to spread a piece of good news with “a tent and a tambourine,” he’s likening himself to an old-time gospel preacher.

Here are some of the examples we’ve found.

A letter to the editor in Salon in 2009 commented on the conservative pundits Ann Coulter and Rush Limbaugh:

“What they both are doing is comparable to the characters (Elmer Gantry and Sister Sharon) in Sinclair Lewis’ novel Elmer Gantry—minus the tent and tambourine.”

In 2003, an article in Reader’s Digest quoted a doctor eager to spread the word about her research findings: “I was about ready to rent a tent and a tambourine. … Because it’s rare that you see something that clearly in medicine.”

A biography of the evangelist and faith healer Aimee Semple McPherson described a revival meeting held under a big white tent in Philadelphia in 1918. Here’s the passage, from Daniel Mark Epstein’s Sister Aimee (1994):

“With chins and hands raised and eyes closed, the thousands abandoned themselves to the Spirit. The beat of Aimee’s tambourine as it flashed in arcs was irresistible.”

And this is from a memoir first published in 1941, Alaska Challenge, by Ruth Sutton Albee and William Albee with Lyman Anson:

“We heard that once during a streak of bad luck he had borrowed a tent and tambourine and conducted revival meetings, cleaning up handsomely thereby.”

Some, though not all, of the published references we found in various databases used the expression in a negative or condescending way.

Check out our books about the English language

The Grammarphobia Blog

A faux French quirk

Q: I recently read a review of A Death in Summer, the latest Benjamin Black mystery. The writer referred to “Benjamin Black” as the nom de plume of the Booker Prize-winning author John Banville. I think it’s silly to use a French term here when we have a perfectly fine English one, “pen name.” Your thoughts please?

A: We prefer “pen name” too, but nom de plume isn’t French. It’s as English as Big Ben, the Tower of London, and fish and chips.

In Origins of the Specious, our book about language myths, we discuss the unusual birth of this faux French expression.

The real French term for an assumed name is nom de guerre, which the British adopted in the late 17th century. But in the 19th century, British writers apparently thought the original French might be confusing.

One can see why nom de guerre, literally “war name,” could puzzle readers.

The French initially used it for the fictitious name that a soldier often assumed on enlisting, but by the time the British started using the expression, it could mean any assumed name—in English as well as French.

The fake-French nom de plume was introduced in English in the 19th century. An obscure Victorian novelist, Emerson Bennett, is responsible for the first citation in the Oxford English Dictionary.

In his 1850 novel Oliver Goldfinch, he writes that the title character is “better known to our readers as a gifted poet, under the nom de plume of ‘Orion.’ ”

Bennett could have used the word “pseudonym,” which we had borrowed from the French around the same time nom de plume was invented. But perhaps he felt “pseudonym” lacked a certain je ne sais quoi.

Whatever his reasons, nom de plume was a hit with the literary crowd—such a hit that it inspired an English translation, “pen name,” which made its debut in 1864 in Webster’s American Dictionary of the English Language.

The old French expression nom de guerre is still with us, though.

It’s defined in English dictionaries as “pseudonym” or “fictional name,” but these days it seems to be used most often for the sobriquets of terrorists (or freedom fighters, depending on your point of view).

In Origins of the Specious, we mention a story about the poet Coleridge, who not only used noms de plume (“Cuddy” and “Gnome,” among others) but once had a nom de guerre as well.

When a young lady refused his hand, the rejected suitor dropped out of Cambridge and enlisted in the Fifteenth Light Dragoons under the assumed name “Silas Tomkyn Comberbache.” (He’d seen the name Comberbache over a door in Lincoln’s Inn Fields in London.)

Check out our books about the English language

The Grammarphobia Blog

A whole lotta shakin’ goin’ on

Q: Since when did “shaking” one’s head come to mean affirmative agreement? I thought the term “shaking” was negative and “nodding” was affirmative. But I don’t hear anyone say “nodding” anymore. Is it just me?

A: In much of the world, though not everywhere, moving one’s head down and up indicates agreement and moving one’s head from side to side indicates disagreement.

In English, the vertical movement is referred to as “nodding one’s head” and the horizontal movement as “shaking one’s head,” according to standard dictionaries.

The American Heritage Dictionary of the English Language (4th ed.), for example, defines the verb “nod” as “to lower and raise the head quickly, as in agreement or acknowledgement.”

And the Collins English Dictionary says “to shake one’s head” means “to indicate disagreement or disapproval by moving the head from side to side.”

That’s the way we’ve always understood this nodding and shaking business. A bit of googling, however, suggests that an awful lot of people use the terms “nodding” and “shaking” interchangeably these days.

For example, we got 322,000 hits for “nodding his head yes” and 840,000 for “nodding his head no.” Although we got more than 6.5 million hits for “shaking his head no,” we also got 768,000 for “shaking his head yes.”

It’s pretty obvious, as you (and Jerry Lee Lewis) have observed, that there’s a whole lotta shakin’ goin’ on.

We’ll stick with using “nodding” for vertical agreement and “shaking” for horizontal disagreement, but we wouldn’t be surprised if “shaking” in either direction becomes the default term, especially in American English.

British dictionaries tend to make more of a distinction between “nodding” and “shaking” of the head. In fact, most of the US dictionaries we checked don’t even bother to define “shaking” in this sense.

That’s a shame. The verbal phrase “to shake one’s head” has been part of English since before 1300, according to the Oxford English Dictionary.

The OED defines it as “to turn the head slightly to one side and the other in sorrow or scorn, or to express disapproval, dissent or doubt.”

In The Expression of the Emotions in Man and Animals (1872), Charles Darwin cites examples from Asia, Africa, and Europe where the vertical movement means yes and the horizontal movement means no.

But he adds that “these signs are not so universally employed as I should have expected; yet they seem too general to be ranked as altogether conventional or artificial.”

One exception that’s frequently cited now is Bulgaria, where the vertical head movement means no and the horizontal movement means yes.

Check out our books about the English language

The Grammarphobia Blog

Is “also” an also-ran?

Q: What is the correct usage of “not only… but also”? Is “also” necessary here or is it an also-ran?

A: We wouldn’t call “also” an also-ran here, though it’s not always necessary.

There are three ways of dealing with the conjunctive phrases “not only” and “but also,” and all three are perfectly good English.

(1) Use both phrases in their entirety: “He’s not only a doctor but also a lawyer.”

(2) Drop “also”: “He’s not only a doctor but a lawyer.”

(3) Use “as well” instead of “also”: “He’s not only a doctor but a lawyer as well.”

Many people believe that it’s incorrect to drop “also” from a “not only … but also” construction. This isn’t true. Examples 1 and 3 are more formal than example 2, but all are correct.

Which is better? The decision is up to you, as Bryan A. Garner writes in Garner’s Modern American Usage (3rd ed.).

“It’s merely a matter of euphony and formality: let your ear and your sense of natural idiom help you decide in a given sentence,” he writes.

But no matter which style you choose—formal or casual—the elements that you’re joining should be parallel constructions.

In the best writing, what follows “not only” is similar in construction to what follows “but also” or “but.”

For example, if there’s no verb after the first conjunctive phrase (“not only”), there shouldn’t be a verb after the second (“but also”).

Or if there’s a preposition after the first conjunctive phrase, there should be another after the second. Keep the parts parallel. Here’s what we mean:

Parallel: “He attended not only medical school but law school.” Not parallel: “He attended not only medical school but went to law school.”

Parallel: “He went not only to medical school but also to law school.” Not parallel: “He went not only to medical school but also law school.”

One other point. “Not only” and “but also” can be used with the parts of a compound subject, as in “Not only his son but also his daughter went to boarding school.”

Sometimes, however, the choice of a verb with such a compound subject isn’t that obvious. What if one part of the compound is singular and one is plural?

As Pat writes in her book Woe Is I (3rd ed.), “If the part nearer the verb is singular, the verb is singular,” as in “Not only the chairs but also the table was sold.”

And “If the part nearer the verb is plural, the verb is plural,” as in “Not only the table but also the chairs were sold.”

Check out our books about the English language

The Grammarphobia Blog

Do the bridges you burn light the way?

Q: I was wondering if you might know the origin of the saying “may the bridges I burn light the way.” I did a little googling and found something that says it’s from Beverly Hills 90210, but I have a hard time believing it was original with the show’s writers.

A: We did a little googling ourselves as well as searching in several Newsbank databases, including the Archive of America and America’s News.

As far as we can tell, the writers on the TV show did indeed come up with that remark. Here’s the relevant exchange from a 1994 episode:

“Brandon Walsh: Dylan, at this point in time, I’m just about the only friend you’ve got. You sure you want to do this? Push me away like you’ve done to everyone else?

“Dylan McKay: Yeah! May the bridges I burn light the way!”

The remark is also the title of a 2009 CD by the one-man band Bass Clef, a.k.a. Ralph Cumbers.

Of course people have been burning bridges both literally and figuratively for quite a while. The figurative expression “to burn one’s bridges behind one” showed up in the late 19th century, according to citations in the Oxford English Dictionary.

The OED’s earliest example is from Mark Twain’s 1892 novel The American Claimant: “It might be pardonable to burn his bridges behind him.”

The dictionary defines the expression as “to burn one’s boats,” which is defined elsewhere in the OED as “to cut oneself off from all chance of retreat.”

Winston A. Reynolds, in a 1959 article in the journal American Speech, notes that Americans prefer the bridges version of the expression while Britons prefer the boats version.

Reynolds adds that the boats expression, which can be found in Spanish, French, Chinese, Dutch, German, and Latin, is the older version.

He suggests that the origin of the expression lies in historical and legendary accounts of burning one’s boats to encourage military victories in antiquity.

The earliest example of this, he writes, is in a seventh-century BC work attributed to Tso Kiu-Ming, a contemporary of Confucius:

“Miu-Kung, the Earl of Tsin, invaded the marquisate of Tsin: after crossing the river he burnt his boats, took the castle of Wang-Kwan, and even approached its capital.”

Reynolds cites Tu Yii, a third-century AD Chinese scholar, as explaining that Miu-Kung burned his boats to show “his determination never to return without a victory.”

Check out our books about the English language

The Grammarphobia Blog

Splatter proof

Q: I’m still getting over learning that I mispronounced “chimera” for over 60 years. I’d been saying SHIM-era. Who knew? Anyway, I was wondering about the relationship between “spatter” and “splatter”?

A: This will give us a chance to discuss one of our favorite words, “spatula.” (We know you’re eager, but you’ll just have to wait a bit.)

The word “splatter” means splash or spatter. It’s described by the Oxford English Dictionary as chiefly dialectal, and used mostly in the US.

The verb “splatter” dates from the late 18th century and the noun from the 19th. As for its source, the OED says it’s “imitative” in origin, meaning that its sound is an echo of what the word symbolizes.

The Chambers Dictionary of Etymology has another suggestion—that “splatter” is “perhaps a blend of spatter and splash,” which seems logical.

Now, on to “spatter,” which is much older than “splatter” and has Germanic origins. In Dutch and Low German, for example, spatten means to burst or spout, the OED says.

When the verb “spatter” was first recorded in English, in the late 1500s, it meant “to scatter or disperse in fragments,” says Oxford.

Early in the following century, it acquired the meanings familiar today—to splash or fall on something in scattered drops or particles.

The noun “spatter,” meaning a small splash or sprinkle, came along in the late 1700s.

You ask whether there’s a relationship between “spatter” and “splatter.” It’s possible. As we mentioned, Chambers speculates that “splatter” might be a blend of “spatter” and “splash,” but there’s a more solidly documented link.

In the late 1600s, men wore cloth or leather leggings to protect their trousers from spatters, especially while riding horseback. These were called, appropriately, “spatterdashes.” (Yes, this is the granddaddy of the later abbreviation “spats.”)

The old “spatterdashes” had several variants, including “splatterdashes” (18th century) and “spatter-plashes” (17th century).

What’s a “plash”? The noun “plash,” meaning something like a shallow pool or puddle, dates back to Old English and was altered in the 17th century to become “splash.”

OK, we’re now ready to discuss “spatula,” which we like simply for its combination of sounds.

It comes from Latin, in which spatula (or spathula) means a broad piece, but its ultimate source is the Greek spathe (a broad blade).

If you go back far enough, however, the words “spatula,” “spade,” and “spoon” share a prehistoric ancestor, according to John Ayto’s Dictionary of Word Origins.

In English, “spatula” has always meant a long, flat implement for mixing or spreading.

It entered the language in the 15th century but it has had some variant forms over the centuries. These include “spattle,” “spartle,” and (as you’ve probably guessed) “spatter” and “splatter.”

Books on etymology make very entertaining reading!

Check out our books about the English language

The Grammarphobia Blog

A song and dance

Q: In his book A Fine Romance, David Lehman writes about the Gershwin songs that Fred and Adele Astaire “sang and danced to.” This got me to thinking. Why can we simply sing a song but we have to dance TO it? It doesn’t make sense to me.

A: It makes sense to us. A singer is HEARD, while a dancer is SEEN.

A song, or any other piece of music, consists of sounds. In order to be heard, the notes must be sounded—that is, they have to be sung.

But the notes are not danced, because a dancer’s movements don’t make notes that are heard. The notes can only be danced TO, because the dancer isn’t sounding notes. (Yes, a tap dancer makes percussive sounds, but they’re not notes.)

Your question concerning David Lehman’s book about Jewish songwriters in America gives us a chance to discuss the expression “song and dance.”

When the phrase entered English in the early 17th century, according to the Oxford English Dictionary, it meant a “form of entertainment (spec. a vaudeville act) consisting of singing and dancing.”

The earliest published reference in the OED is from a 1628 account of Sir Francis Drake’s circumnavigation of the world. During a landing in California, he witnessed a “song and dance” by Native Americans.

It wasn’t until the 1870s, though, that the expression was used in its vaudeville sense. Here’s an example from an 1872 issue of the Chicago Tribune: “First week of the distinguished song and dance artists.”

By the end of the 19th century, according to the OED, the expression was being used figuratively to mean “an elaborately contrived story or entreaty” as well as “a fuss or outcry.”

The earliest citation for the usage in the OED is from an 1895 collection of short stories by Edward Waterman Townsend: “Den, ’is whiskers gives me a song an’ dance.”

We’ll conclude with an example from A Diversity of Creatures (1917), a collection of Rudyard Kipling’s short stories: “I don’t see how this song and dance helps us any.”

Check out our books about the English language

The Grammarphobia Blog

Trash talk

Q: I recently used the word “dumpster” in a letter published in a local newspaper. The editor changed it to “Dumpster,” saying it was a trademark, like “Kleenex,” and had to be capitalized. I’d like your opinion: Big D or little d?

A: Let’s begin with the New York Times stylebook, which capitalizes “Dumpster” and describes it as “a trademark for a trash hauling bin.”

Elsewhere in the stylebook, the Times says trademarks “are uppercased as a caution to readers who might adopt a name owned by someone else.”

The Associated Press stylebook agrees that “Dumpster” should be capitalized, but it recommends using a generic term like “trash bin” or “trash container” instead.

The two standard dictionaries we consult the most—The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.)—also capitalize “Dumpster.”

But American Heritage’s example of the word’s usage (from the Chicago Tribune) describes a street “lined with low-cost apartment buildings and strewn with blue dumpsters.” (As the dictionary notes, “This trademark often occurs in print in lowercase.”)

The Oxford English Dictionary’s entry for “dumpster” lowercases the word, though most of the quotations cited by the OED capitalize it.

A trademark, as you know, is a distinctive name used by a business to identify its products or services. But when the public begins using this name for all similar products and services, the trademark loses its distinctiveness.

In the case of trash hauling bins, the word “Dumpster” comes from the Dempster-Dumpster system for mechanically loading trash containers onto garbage trucks.

Dempster Brothers, which patented the system in 1937, has several trademarks for “Dumpster,” but the word is often used these days as a generic term for a trash container used in any similar system.

We won’t get into the legal situation that arises when a trademark becomes generic. We’ll leave that to trademark lawyers.

The main concern for a writer is to communicate, not to help a business protect a trademark, especially not a trademark that’s widely used as a generic term.

So what would we do? We’d lowercase “dumpster” if we were referring generically to a container used in a mechanically loading trash system.

The alternatives recommended by AP (“trash bin” and “trash container”) are too vague. And the Times definition (“trash hauling bin”) is too clunky.

Anyway, the term “dumpster” is so widely used now that any effort to preserve its distinctiveness would probably be a lost cause, like trying to revive such old trademarks as “aspirin,” “butterscotch,” “thermos,” and “zipper.”

We can’t end this posting without mentioning the noun phrase “dumpster diving,” which the OED defines as the “practice of searching through a rubbish container (esp. a dumpster or skip) for food, items of value, etc.”

The first citation in the OED is from a 1983 caption in Life Magazine: “Rat and Mike call rummaging for food in trash bins behind restaurants dumpster diving.”

Check out our books about the English language