The Grammarphobia Blog

Nothing but the truth

Q: I’m editing this sentence for the publishing house where I work: “There were nothing but steep cliffs on all sides.” The verb should be “was,” no? “There” is a dummy subject, rendering the true subject “nothing,” which is singular. Can you tell me if my logic is unassailable?

A: You’re right that the verb should be singular, though we can’t say your logic is unassailable. There are exceptional cases, as we’ll explain later.

In that sentence, “there” is a dummy subject—one that’s required by syntax and merely occupies the obligatory subject position.

The true subject is “nothing.” And when used as a subject, “nothing”—even when followed by “but”—traditionally takes a singular verb, regardless of the noun (singular or plural) that follows.

The American Heritage Book of English Usage has this to say: “According to the traditional rule, nothing is invariably treated as a singular, even when followed by an exception phrase containing a plural noun.”

The book gives these examples: “Nothing except your fears stands (not stand) in your way. Nothing but roses meets (not meet) the eye.”

When the American Heritage editors use the word “traditional,” they’re not exaggerating. We found this example in a 1772 edition of Joseph Priestley’s The Rudiments of English Grammar:

“Nothing but the marvellous and supernatural hath any charms for them.” (Note the archaic singular “hath” for “has.”)

Constructions like “nothing but,” “nothing save,” and so on are venerable features of the language.

Since Old English, according to the Oxford English Dictionary, “nothing” has been used with a “limiting particle”—like “but,” “besides,” “except,” “save”—to mean “merely” or “only.”

So you’re right about that sentence, and you can feel justified in editing it to read, “There was nothing but steep cliffs on all sides.”

But here’s a qualification to keep in mind for future use, from the editors of The American Heritage Dictionary of the English Language (5th ed.).

In a usage note with its entry for “nothing,” the dictionary repeats the usual rule about using a singular verb with “nothing but,” then adds this:

“But there are certain contexts in which nothing but sounds quite natural with a plural verb and should not be considered inappropriate. In these sentences, constructions like nothing but function much like an adverb meaning ‘only,’ in a pattern similar to one seen in none but.”

The usage note follows with this example: “Sometimes, for a couple of hours together, there were almost no houses; there were nothing but woods and rivers and lakes and horizons adorned with bright-looking mountains (Henry James).”

In our opinion, the Henry James example is worth remembering because it cries out for symmetry between those two clauses: “there were … there were….”

Help support the Grammarphobia Blog with your donation. And check out our books about the English language.

Is it loo-TEN-ant or lef-TEN-ant?

Q: My daughter wonders why “lieutenant” is pronounced lef-TEN-ant in the UK and loo-TEN-ant in the US. Do you have any clues?

A: The word “lieutenant” came into Middle English in the 1300s from French—lieu for “place” and tenant for “holding.”

(Originally a “lieutenant” was a placeholder, a civil or military officer acting in place of a superior. Think of the phrase “in lieu of” for “in place of.”)

But since the beginning, the British have commonly pronounced the first syllable of “lieutenant” as if it had an “f” or a “v.”

In the early days, this tendency was sometimes reflected in spellings: “leeftenaunt” (1387), “luf-tenend” (late 1300s), “leyf tenaunt” (early 1400s), “lyeftenaunt” (circa 1425), “luff tenande” (late 1400s), “leivetenant” (late 1500s), and so on.

But long after the spelling stabilized and “lieutenant” became the dominant form in writing, the “f” sound has survived in British speech, where the usual pronunciation today is lef-TEN-ant. Nobody knows why.

The Oxford English Dictionary says the origin of the “f” and “v” sounds “is difficult to explain,” and the Chambers Dictionary of Etymology says it “remains uncertain.” In other words, we can only guess.

The OED says one theory is that English readers misinterpreted the letter “u” as a “v,” since in Middle English the two letters were not distinct.

But Oxford says this can’t account for the “f” and “v” pronunciations since it “does not accord with the facts.”

The dictionary is apparently referring to the fact that in Middle English spelling, the letter “v” was generally used at the beginning of a word and “u” elsewhere, regardless of the sound, which accounts for old spellings like “vpon” (upon) and “loue” (love). However, the “u” is in the middle of “lieutenant,” not the beginning.

The OED suggests two possibilities to explain the appearance of the “f” and “v” sounds in “lieutenant.”

One is that some of the “f” and “v” pronunciations “may be due to association” with the noun “leave” or the adjective “lief.”

A likelier theory is “that the labial glide at the end of Old French lieu as the first element of a compound was sometimes apprehended by English-speakers as a v or f.” (A labial glide is a transitional sound in which air is forced through the lips.)

Oxford also notes the existence of “the rare Old French form leuf for lieu,” which may have influenced the English pronunciation. (The language researcher Michael Quinion cites a medieval form of the word, leuftenant, in the records of what is now a Swiss canton.)

However it came about, the usual pronunciation in Britain today begins with “lef” and seems unlikely to change.

As Oxford notes, John Walker in his Critical Pronouncing Dictionary (1793) gives the “actual pronunciation” of the first syllable as “lef” or “liv,” though he “expresses the hope that ‘the regular sound, lewtenant’ will in time become current.” Despite Walker’s advice, that pronunciation “is almost unknown” in Britain, the OED adds.

Noah Webster, in his American Dictionary of the English Language (1828), recommends only one pronunciation for the word, which he renders as “lutenant.”

American dictionaries have followed Webster’s lead and give loo-TEN-ant as the pronunciation, though they usually note the lef-TEN-ant pronunciation in Britain.

Finally, an aside. Another of our correspondents once suggested that the British pronunciation arose through squeamishness: “The Brits didn’t want to refer to their officers with the term ‘loo’!”

Intriguing, but untrue. The word “loo” wasn’t recorded in the bathroom sense until the 20th century. Another theory down the drain.

Noseblind spot

Q: A Febreze commercial uses the apparently new word “noseblind” to describe someone who can’t smell. As far as I know, there are only two common adjectives for sensory deficiencies: “blind” and “deaf.” Aside from obscure medical terms, are there common words for the loss of the three other traditional senses?

A: Although the Febreze commercials have helped popularize “noseblind,” the term had been around for a dozen or so years before Procter & Gamble began using it last summer to promote the air freshener.

The earliest example of the usage we could find is from an Oct. 8, 2002, comment on a Mazda discussion group: “I use 89 octane from esso all the time, but haven’t noticed any smell at all. Maybe I am nose blind, but it hasn’t been a problem for me.”

And here’s an example from “The Revisionist,” a short story by Helen Schulman in a 2004 collection from the Paris Review:

“The resultant odor was strong enough to etherize an elephant, but Hershleder the rebel was nose-blind to it.”

Julia LaFeldt, a P&G spokeswoman, told us that “Febreze first started using the term ‘noseblind’ in July 2014 when we launched our current campaign.”

In a July 9, 2014, press release, P&G announced that the actor-comedian Jane Lynch would be promoting Febreze to counter “noseblindness,” a condition that “occurs naturally over time when a person becomes accustomed to surrounding smells.”

In videos featuring Ms. Lynch and others, the adjective “noseblind” is repeatedly used to describe people who are so used to their own odors that they can’t smell what their guests do.

On a web page that features videos promoting Febreze, P&G offers a mock dictionary entry that defines “noseblind” as a noun but treats it as an adjective in the accompanying example:

“noseblind [nohz-blihnd], noun; The gradual acclimation to the smells of one’s home, car, or belongings, in which the affected does not notice them (even though their guests do).

“Example: I can’t attend Book Club this week. Nancy is completely noseblind to the fact that her house smells like a feral cat shelter.”

As for your question, we don’t know of any common words for the loss of the three other traditional senses: taste, smell, and touch (though “numbness” might describe an inability to feel a touch). The usual medical terms are “ageusia” (taste), “anosmia” (smell), and “analgesia” (touch—actually, the inability to feel pain).

A more general term, “sensory processing disorder,” describes a condition in which the nervous system doesn’t properly organize sensory signals into the appropriate responses. It can affect one or more of the senses, according to the SPD Foundation.

The linguist Arnold Zwicky, who discussed “noseblind” recently on his blog, describes it as “a fairly clever coinage for this sensory saturation effect, treating it as similar to being temporarily blinded by bright lights or deafened by loud noises.”

“But it’s not truly similar to being blind or deaf, which are enduring and more global inabilities,” Zwicky adds.

If you’d like to read more, we’ve answered several questions on the blog about “nose” and “blind,” including a post in 2009 as well as posts in October and November of 2012.

English via Facebook

Q: Does Facebook use “via” incorrectly when your friend A forwards a link to you from his friend B? Facebook describes this as “From A via B,” but surely it should be the other way around, “From B via A.”

A: You’re right—“via” has meant “by way of” since it came into English in the 1700s. A newer sense of the word, “by means of” or “with the aid of,” came into use in the 1930s and is also accepted as standard English in modern dictionaries.

So Facebook has things turned around. In fact, the message is coming from friend B (the original source) by way of (or “via”) friend A, the intermediary who forwards it.

These two examples from Merriam-Webster’s Collegiate Dictionary (11th ed.) illustrate the standard uses of “via”:

  • by way of: “She flew to Los Angeles via Chicago.”
  • by means of: “I’ll let her know via one of our friends.”

The English preposition “via” was taken directly from the Latin noun via, meaning “way” or “road.”

Despite its classical origins, the English word is relatively new as these things go, since it dates back only to the 18th century. This is why it’s sometimes printed in italics in older writing.

The Oxford English Dictionary’s earliest citation is from a letter written by James Lovell, a delegate to the Continental Congress, to John Adams on June 13, 1779:

“This night is the fourteenth since we first had the news of his victory, via New Providence.” (The reference is to Gen. Benjamin Lincoln in a battle against the British.)

Here’s another example that clearly displays the original usage: “Lord Weybridge … is on his way to London viâ Paris.” (From Theodore Edward Hook’s novel The Parson’s Daughter, 1833.)

The newer sense of the word (“by means of”) is nicely illustrated by this 1977 citation from the OED:

“It would in theory be possible to provide five more services with national coverage via satellite.” (From a British government report on the future of broadcasting.)

But no matter which usage you subscribe to—and current dictionaries accept both—“via” always refers to whoever or whatever is in between, not to the origin.

Think of the word “viaduct,” a long high bridge that’s an elevated go-between.

When words change their spots

Q: I see that the online Merriam-Webster has caved to the misuse of “peruse,” which is now apparently an antonym to itself. It means, or so the dictionary says, to examine or read “in a very careful way” (the traditional usage) as well as “in an informal or relaxed way.” Are linguists creating a new type of word?

A: Often a simple question calls for a complicated answer, and this is one of them.

Linguists and lexicographers don’t create new meanings for words. They merely catalog what they perceive as shifts in common usage—shifts that naturally occur as a language develops.

As for the verb “peruse,” it’s been used to mean “both careful and cursory reading” since the 16th century, according to the Oxford English Dictionary.

Take a look at a post about “peruse” that we wrote in 2006 and later updated to reflect recent dictionary definitions. As you can see, the usage you object to is well established.

It’s unfortunate that a language commentator in the early 1900s took a dislike to the word’s “cursory” sense, and that other usage guides unthinkingly followed.

But in the end, the general public took no notice and continued to use “peruse” in the old familiar way.

The truth is that common usage determines what’s “correct.” This is why alterations in meaning, spelling, and pronunciation are normal as a language develops.

Even Classical Latin, when it was a living, spoken language, underwent regular shifts and changes. It only became frozen when it died.

And once Latin words were absorbed into English and the Romance languages, those words continued to shape themselves to their new surroundings and came to reflect common usage in those societies.

For example, we’ve written on our blog about the assimilation of Latin words into English and the consequent shifts in pluralization.

Many words derived from Latin plurals have become accepted over the years as singular nouns in English: “ephemera,” “erotica,” “stamina,” “agenda,” “trivia,” “insignia,” “candelabra,” and more recently “data.”

What’s more, the word “media” now has both singular and plural usages in English, as we wrote in a post four years ago.

This naturalization process is normal and expected. Similarly, we should expect words to change their meanings. As this happens, they can even take on meanings that are opposed.

Sometimes these words retain both opposing senses, as with “cleave” and “sanction.” Such words are often called “contronyms,” and the reader has to judge the writer’s intent by the context.

(Merriam-Webster’s Unabridged, for example, feels that “literally” has joined this group and can be taken to mean “in effect.” However, we aren’t yet recommending that our readers adopt this looser usage.)

We’ve written about words with opposing meanings many times on our blog, including posts in 2008, 2010, and 2012.

At times a word’s earlier meaning is discarded and becomes obsolete. This process can move an originally affirmative word (like “pedant”) in a derogatory direction.

But just as often the reverse happens, and a derogatory word (like “terrific”) takes on a positive meaning.

Words change not only in meaning but in grammatical function. This kind of change, as when a noun becomes a verb, often upsets people, but it’s a natural way in which new words are formed.

As we’ve said before, this process is called conversion, and it’s given us a considerable portion of our modern English words.

Thanks for your question, and we hope we’ve shed a little light here.

The times they are a-changin’

Q: I increasingly hear sentences with two nouns competing to be the subject. Some recent examples, all from local newscasts: “Our producer, she is going to New Hampshire” … “My aunt and uncle, they died of diabetes” … “That guy, he can play on Sunday.” I was told years ago by an English professor that this was incorrect. Have the rules changed?

A: In all of those examples, the pronoun duplicates the subject: “our producer, she” … “my aunt and uncle, they” … “that guy, he.”

The pronoun in all these cases isn’t technically necessary. It’s sometimes called a “pleonastic subject pronoun” (pleonastic means redundant or superfluous).

Although such a pleonasm is sometimes used as a literary device in poems and songs, the Oxford English Dictionary says this usage is “now chiefly regional and nonstandard.”

We’d add that speakers of standard English use pleonastic subject pronouns for emphasis in casual conversation, though rarely in prose writing, especially in formal prose.

For centuries, poets and balladeers have used this device to force a pause in the meter of a line and give it a songlike air.

Consider, for example, this line from the 13th-century poem Amis and Amiloun, the Middle English version of an old French legend: “Mine hert, it breketh.” How much duller it would be without that superfluous pronoun!

Modern poets, too, have employed this usage. Here’s a line from A Shropshire Lad (1896), by A. E. Housman: “I tell the tale that I heard told. / Mithridates, he died old.”

The OED has many examples, dating back to Old English. Here’s a sampling: 

“My sister, shee the jewell is” (from an anonymous Elizabethan play, Common Conditions, 1576).

“ ‘Fair and softly,’ John he cried, / But John he cried in vain” (William Cowper, 1782).

“The worms they crept in, and the worms they crept out” (the novelist Matthew Gregory Lewis, 1795).

“The skipper he stood beside the helm” (Henry Wadsworth Longfellow, 1839).

“My wife she cries on the barrack-gate, my kid in the barrack-yard” (Rudyard Kipling, 1892).

“The times they are a-changin’” (Bob Dylan, 1964).

We’ve found many nonstandard uses of pleonastic subject pronouns in speech or dialogue from the 19th and 20th centuries.

Here’s an example from Mark Twain’s Adventures of Huckleberry Finn (1884): “The king he spread his arms, and Mary Jane she jumped for them.”

In recent decades, as you’ve noticed, speakers of standard English have been using the device for emphasis in conversation.

Here’s a quote from Bruce Springsteen in the Feb. 5, 1981, issue of Rolling Stone: “My mother and father, they’ve got a very deep love because they know and understand each other in a very realistic way.”

Is the usage legit? Well, the OED doesn’t consider it standard English. But we see nothing wrong with its emphatic use in casual speech.

Is “close proximity” redundant?

Q: I would love to hear your perspective on “close proximity.” If “in proximity to” means “close to,” what does “in close proximity to” mean? Including “close” seems redundant to me, but it feels odd to leave it out.

A: Well, the phrase “in close proximity” isn’t very graceful (we’d prefer “near” or “close to”), but we don’t consider it redundant, as we’ll explain.

The Oxford English Dictionary defines “proximity” as “nearness” or “the fact, condition, or position of being near or close by in space.”

So theoretically the noun “proximity” should need no help from an adjective like “close.”

But theory is one thing and fact is another. In reality, there are degrees of nearness, so it’s reasonable to indicate how near with the use of an adjective like “close,” “closer,” or “closest.”

Used by itself, “proximity” sometimes seems inadequate, which may be why the naked word feels odd to you. A statement like “There’s no restaurant in proximity to my apartment” could mean within a city block or a ten-minute drive.

As Merriam-Webster’s Dictionary of English Usage says, “Of course there are degrees of proximity, and close proximity simply emphasizes the closeness.” The usage guide gives many examples, including these:

“Swallow means porch-bird, and for centuries and centuries their nests have been placed in the closest proximity to man” (from Richard Jefferies’s book The Open Air, 1885).

“Mr. Beard and Miss Compton disagreed on the distance of meat from heat, probably because Mr. Beard had in mind a smaller fire-bed to which the steak could be in closer proximity” (from the New York Times, 1954).

“The herb [tansy] works only on plants in very close proximity” (from the New York Times Magazine, 1980).

The OED has dozens of examples of the usage, dating back to the early 1800s. Here’s one from an 1872 travel guide to the English Lake District: “Owing to the close proximity to the sea.”

Elsewhere in the same guide, we found this example in a description of the city of Carlisle: “It dates back to the time of the Romans, and was in close proximity to the wall of Hadrian.”

The word “proximity” came into English from the French proximité (near relationship), the OED says. It was derived from the Latin noun proximitas (nearness or kinship), which came from the adjective proximus (nearest, next).    

When first recorded in English, in 1480, “proximity” referred to blood relationship or kinship (as in the phrase “proximity of blood,” first recorded in the 16th century and still occasionally used).

The noun was soon being used to refer to other kinds of nearness—time, space, distance, and so on. Today, “proximity” in relation to distance is the dominant usage.

John Ayto’s Dictionary of Word Origins says the Latin proximus (nearest) was the superlative form of an “unrecorded” Latin word that’s been reconstructed as proque (near).

This reconstructed word, Ayto adds, was “a variant of prope, from which English gets approach and propinquity.”

Another English relative, Ayto says, is “approximate,” which ultimately comes from the Latin verb proximare (“come near”).

We’ll close with another example from the M-W usage guide. It’s from Iolanthe (1882), by W.S. Gilbert and Arthur Sullivan.

But then the prospect of a lot
Of dull M.P.’s in close proximity,
All thinking for themselves, is what
No man can face with equanimity.

Paralinguistically speaking

Q: My wife and I were alone in our car and having a general discussion when she lowered her voice and said, “Everyone knows her husband is having an affair.” Has anyone studied this strange behavior in mentioning a sensitive topic?

A: Yes, language scholars have indeed looked into this behavior. The study of pitch, loudness, speed, hesitation, and similar qualities of speech is referred to as “paralinguistics,” and this aspect of communication is called “paralanguage.”

In his 1975 paper “Paralinguistics,” the linguist David Crystal says the “para-” prefix (meaning “beyond” or “outside of”) “was originally chosen to reflect a view that such features as speed and loudness of speaking were marginal to the linguistic system.”

However, Crystal writes, studies in social psychology, psychiatry, sociolinguistics, and other areas “suggest that the vocal effects called paralinguistic may be rather more central to the study of communication than was previously thought.”

“Certainly, observations of people’s everyday reactions to language suggest that paralinguistic phenomena, far from being marginal, are frequently the primary determinants of behaviour in an interaction,” he says.

Although “the most widely recognized function” of paralanguage is for emotional expression, he adds, a “far more important and pervasive” function “is the use of paralinguistic features as markers of an utterance’s grammatical structure.”

In other words, the use of paralanguage in speech replaces the punctuation and spacing that’s so important in making written language intelligible.

Crystal discusses several kinds of paralanguage. An extended low pitch, for example, may be used “as a marker of parenthesis (e.g. ‘My cousin—you know, the one who lives in Liverpool—he’s just got a new job’).”

A rise in loudness may be used “as a marker of increased emphasis (‘I want the red one, not the green one’).”

An increase in speed may indicate “that the speaker wishes to forestall an interruption, or to suggest that what he is saying need not be given careful attention.”

A sentence spoken with a noticeable metrical beat may “suggest irritation, e.g. ‘I really think that John and Mary should have asked.’ ”

The kind of voice-lowering you’re asking about could be considered a marker of parenthesis. Crystal doesn’t cite an example like yours, but other language researchers say a whispered or lowered voice may accompany confidential or embarrassing comments.

In Principles of Phonetics (1994), for example, the linguist John Laver writes that the paralinguistic use of whisper may “signal secrecy and confidentiality.”

And in Simultaneous Structure in Phonology (2014), the linguist D. Robert Ladd writes, “A speaker’s voice may be raised in anger or lowered to convey something confidential.”

The linguist Carlos Gussenhoven, writing in The Phonology of Tone and Intonation (2004), says people may raise their pitch “to express surprised indignation” and “lower it to suggest confidentiality.”

And in a study entitled “The Roles of Breathy/Whispery Voice Qualities in Dialogue Speech” (2008), Carlos Toshinori Ishi, Hiroshi Ishiguro, and Norihiro Hagita say that a “more whispered and low-powered voice quality” may reflect embarrassment.

(The three authors, who specialize in robotics, attempt to apply paralinguistics to synthesized speech.)

We can’t end this without mentioning a book that we came across while researching your question.

In Playing With My Dog Katie: An Ethnomethodological Study of Dog-Human Interaction (2007), the sociologist David Goode discusses his embarrassing “over-reliance on paralinguistic features of vocalization” in relating to his pet.

As the owners of two young golden retrievers, we know what he’s ethnomethodologically talking about.

Why not “one headquarter”?

Q: To my ear, “one headquarter” sounds better than “one headquarters.” Why is the plural “headquarters” used for both the singular and the plural?

A: When the term first showed up in English in the early 1600s, it was “headquarter” (or, rather, “head quarter”), but the “s”-less singular is rarely seen now except in South Asian English.

We’ll have more to say later about the history of “headquarter”/“headquarters,” but first let’s look at how the word is generally used in contemporary English.

Today, “headquarters” is a noun that’s plural in form but can be used with either a singular or a plural verb.

As Pat writes in her grammar and usage book Woe Is I (3rd ed.), “headquarters” is one of those words, like “series” and “species,” that ends in “s” but can mean either one thing or more—a base or bases.

She gives these examples: “Gizmo’s headquarters was designed by Rube Goldberg. The two rival companies’ headquarters were on opposite sides of town.”

The Oxford English Dictionary has these modern examples of “headquarters” used in each way:

singular: “Hundreds of Home Office staff should be moved out of London because a new headquarters is too small to accommodate them.” (From the Daily Telegraph, 2004.)

plural: “He set up two headquarters, one to control Japan … and the other to command U.S. forces in the Far East.” (From Richard B. Finn’s book Winners in Peace, 1992.)

Now let’s look at the history of “headquarter” and “headquarters.”

Both of these forms showed up in written English in the first half of the 17th century, with “headquarter” used in the singular sense and “headquarters” used initially in the plural sense.

But within a few years, “headquarters” was being used in both singular and plural contexts—or, as the OED puts it, the plural form “headquarters” was being used “with pl. or sing. concord.”

The dictionary’s earliest example of “headquarter” used in the singular sense is from a 1622 issue of the Continental Newes that refers to military forces “about to draw away their Ordnance into their head quarter.”

The OED’s earliest example of “headquarters” used as a plural is in a 1639 issue of a newsletter, Curranto This Weeke From Holland: “This Campe is divided into 2 Head-quarters, on one side commandeth Monsieur de Lambert, and on th’other side Colonell Gassion.”

And the first example of the plural form used in a singular sense is from a 1644 issue of the Weekly Account: “The Hoptonian Forces are as yet at their head quarters at Winchester.”

In the 1500s and 1600s, the dictionary points out, other Germanic languages had singular forms for the singular sense: German Hauptquartier (1588 or earlier), Swedish huvudkvarter (1658), Dutch hoofdkwartier (1688 or earlier).

Why did the plural form “headquarters” come to be used in English for both singular and plural senses?

Perhaps because the plural “quarters” was being used around the same time for a singular place of residence, as in this example from Every Man in His Humor, a 1616 play by Ben Jonson: “Turnebull, White-chappell, Shore-ditch, which were then my quarters.”

As we’ve said earlier, the singular “headquarter” is rarely seen now except in South Asian English. Here are some recent examples from military and corporate writing:

“My headquarter was at Chandhi Mandir which is easily the best laid out military cantonment in the country.” (From S. K. Sinha’s book A Soldier Recalls, 1992.)

“Another network was established by National Informatics Centre … whose headquarter is at Delhi.” (From Computer Technology for Higher Education, 1993, by Sarla Achuthan and others.)

When the subject is a dummy

Q: I’ve read your recent post on deconstructing “it” and I have one additional question. What does “it” refer to in sentences like “It is raining” and “It is snowing”? I’ve heard various explanations of this usage, but I’d appreciate your take on it.

A: English speakers have been using the pronoun “it” to talk about the weather since Anglo-Saxon days. The “it” that we use to denote weather conditions (“it was drizzling” … “it’s hot”) is often called a “dummy” or “empty” or “artificial” subject.

The Cambridge Grammar of the English Language says the “it” here has no semantic meaning and serves “the purely syntactic function of filling the obligatory subject position.”

The Oxford English Dictionary describes this “it” as “a semantically empty or non-referential subject” that dates back to Old English, where it was frequently used in statements about the weather.

The OED’s earliest recorded usage in reference to weather is from an Old English translation, possibly written around the 10th century, of Bede’s Ecclesiastical History of the English Nation, which he probably completed in Latin in 731.

In the relevant passage, “hit rine & sniwe & styrme ute” (“it rain & snow & storm out”), the verbs are in the subjunctive.

We’ll expand the OED citation and give a modern English translation: “as if you at feasting should sit with your lords and subjects in winter-time, and a fire be lit and your hall warmed, and it should rain and snow and storm outside.”

This Middle English example from around 1300 needs no translating: “Hor-frost cometh whan hit is cold.”

The “it” we use in statements about the weather, according to the OED, is part of a broader category of usages in which the pronoun is “the subject of an impersonal verb or impersonal statement, expressing action or a condition of things simply, without reference to any agent.”

These usages would include statements about the time or the season (“it was about noon” … “it was winter”); about space, distance, or time (“it was long ago” … “it’s too far”); and about other kinds of conditions (“how is it going?” … “it was awkward” … “if it weren’t for the inconvenience”).

The Cambridge Grammar wouldn’t use the term “dummy subject” to describe most of these non-weather usages. In its view, a dummy subject “cannot be replaced by any other NP [noun phrase].”

So Cambridge regards the “it” in a sentence like “It is five o’clock” or “It is July 1” as a predicative complement rather than a dummy subject, because “it” could be replaced by “the time” or “the date.”

Some linguists, however, might argue that none of the “it” usages we’ve discussed are true dummy subjects, but we’ll stop here.

To quote Shakespeare (Macbeth, around 1606), “If it were done, when ’tis done, then ’twer well, / It were done quickly.”



How “colonel” became KER-nel

Q: How did a “colonel” in the military come to be pronounced like a “kernel” on an ear of corn?

A: The word for the military officer once had competing spellings as well as competing pronunciations. When the dust settled, it ended up being spelled in one way and pronounced in the other.

The word was actually “coronel” when it entered English in the mid-16th century, according to the Oxford English Dictionary.

Here’s the messy story of how a word once spelled “coronel” in English came to be spelled “colonel” and pronounced KER-nel.

English acquired the original “coronel” from the Middle French coronnel, which came from colonello, the Italian word for the commander of a regiment, the OED says.

Colonello is derived from colonna, Italian for a column, which in turn comes from columna, Latin for a pillar.

Oxford cites the English philologist Walter William Skeat as saying the colonello got his name because he led “the little column or company at the head of the regiment.”

The first company of the regiment—the colonel’s company—was called la compagnia colonnella in Italian and la compagnie colonelle in French, according to the OED.

The confusion began when the Italian colonello entered Middle French in the 16th century. The two “l” sounds apparently didn’t sit well with French speakers, so the first “l” changed to “r” and the word briefly became coronel.

The process by which two neighboring “l” sounds were “dissimilated” (or rendered dissimilar) was common in the Romance languages, the OED says.

However, the French coronel “was supplanted in literary use, late in 16th cent., by the more etymological colonnel,” according to the dictionary. (The word is now colonel in modern French.)

But meanwhile both English and Spanish had borrowed coronel, the dissimilated version of the word, from Middle French in the mid-1500s.

When it entered English, in 1548, it was spelled “coronel,” with a three-syllable pronunciation (kor-uh-NEL) similar to that of the Middle French word.

Although it’s still spelled coronel in Spanish, English speakers soon followed the French and returned to the more etymologically correct spelling.

As the OED explains, “under this influence [the French spelling change] and that of translations of Italian military treatises colonel also appeared in English c1580.”

By the mid-1600s, the OED says, “colonel” was the accepted English spelling and “coronel” had fallen by the wayside.

But the word’s pronunciation took much longer to get settled.

The two competing pronunciations (kor-uh-NEL, kol-uh-NEL) existed until the early 19th century, according to the Chambers Dictionary of Etymology, along with a popular variation, KER-uh-nel.

In the early 1800s, Chambers says, the KER-uh-nel pronunciation was shortened to KER-nel. (The awkward KOL-nel, a shortened version of kol-uh-NEL, was recorded in Samuel Johnson’s dictionary of 1755, but it eventually fell out of use.)

Although the KER-nel pronunciation became universally accepted, Chambers says, “the familiar literary form colonel remained firmly established in printing.”

So you might say that the word’s spelling today reflects its Italian heritage while the pronunciation reflects its French side—that is, its brief period of dissimilation in French.



A message from Pat and Stewart

Dear readers,

We could use a little help to keep the Grammarphobia Blog going, and we’re not too embarrassed to ask for it (well, Stewart is a bit).

If you read the blog regularly, you may have noticed that something is missing—advertising. This is because we find ads just as annoying as you do.

Something else we don’t have is a paywall. You don’t have to buy a subscription to read our blog and to search the thousands of posts we’ve written over the past eight years. We’re free!

The downside, of course, is that the blog makes no money. So besides spending several hours a day researching and writing for the blog, the two of us pay all the expenses.

What expenses? To begin with, there’s the technical stuff: Web hosting, domain registration, maintenance, and so on. Then there are the costs of keeping our standard dictionaries and other reference books up to date, and of paying for annual subscriptions to the Oxford English Dictionary, the Dictionary of American Regional English, and other online resources.

We had hoped to cover our costs—and earn a little something for our work—with donations from readers. But unfortunately this hasn’t happened … yet.  With your help, it can happen in 2015.

Our heartfelt thanks to those of you who have contributed, especially those who donate regularly (one reader has even arranged a small monthly contribution through PayPal). Bless you all.

To the rest of you, if you like what we do and would like to pitch in, please help support the Grammarphobia Blog with a donation. No contribution is too small!

Thank you.

Pat and Stewart


Was the storm a shoo-shoo?

Q: I woke up in my Hell’s Kitchen apartment the other day, looked out the window expecting to see a storm-wracked New York, and thought, “Well, that was a shoo-shoo.” Growing up in New Orleans, we learned that an unexploded firecracker was a shoo-shoo. I wondered if this went beyond my hometown and I found an article saying the reduplicative usage was brought home to Louisiana by doughboys returning from World War I.

A: Yes, the recent “storm of the century” was indeed a shoo-shoo in New York City as well as in our part of southern New England. And “shoo-shoo” is a fine example of reduplication—a subject we recently discussed on the blog.

However, we doubt that doughboys from Louisiana brought the usage home with them from the battlefields of World War I. Or that the usage was inspired, as the article says, by problems with the Chauchat light machine gun.

The Dictionary of American Regional English has an example of the usage in Louisiana dating from 1917, when the doughboys were still heading for Europe, not returning home.

The first members of the American Expeditionary Force arrived in Europe in June of 1917, and the force wasn’t involved in significant combat until 1918, the last year of the war.

DARE defines “shoo-shoo” as “a failed firecracker that is later broken open and lit.” The dictionary suggests that the name is probably “echoic”—an imitation of the hissing sound made when the powder from a split firecracker is ignited.

The dictionary’s earliest citation is from a list of Louisiana terms submitted by James Edward Routh of Tulane University to Dialect Notes, a publication of the American Dialect Society:

“A fire-cracker that has failed to go off. The ‘shoo-shoo’ is broken and lighted for the flare of the loose powder.”

DARE says the usage is “chiefly” seen in Louisiana. Nearly all of its most recent reports of “shoo-shoo” (1967-68) are from Louisiana, though the dictionary does have a couple from Hawaii for “shoo-shoo baby.”

We suspect that the Hawaiian reports were inspired by “Shoo Shoo Baby,” an Andrews Sisters hit, or by a B-17 Flying Fortress named after the song. The World War II plane is now on display at the National Museum of the US Air Force in Ohio.



Can’t win for losing

Q: Is the expression “You can’t win for losing” as simple as it sounds? Or is there a deeper meaning and significance?

A: We don’t see anything particularly deep about the expression. It’s just another way of saying “You can’t win if you’re losing all the time.”

The Dictionary of American Slang (4th ed.) says the usage refers to someone “entirely unable to make any sort of success” or “persistently and distressingly bested.”

The authors, Barbara Ann Kipfer and Robert L. Chapman, give this example: “We busted our humps, but we just couldn’t win for losing.”

Kipfer and Chapman date the expression from the 1970s, but we’ve found earlier examples in Internet searches.

The earliest is from a 1955 issue of the Postal Supervisor, a journal of the National Association of Postal Supervisors:

“You can’t win for losing, it seems. Who are our friends, and who is the snake in the grass in Congress. There must always be a villain in the plot. Will it be the outer-space missile this time?”

A search with Google’s Ngram viewer indicates that the use of the expression increased sharply in the 1960s, reaching a peak in the late ’80s and early ’90s.

We’ll end with a more recent example of the usage from Any Woman’s Blues: A Novel of Obsession (2006), by Erica Jong:

“I want to be the best man for you, but you’re never satisfied. Whatever I do, it’s not enough—I can’t win for losing!”



Hear Pat on Iowa Public Radio

She’s on Talk of Iowa today from 10 to 11 AM Central time (11 to 12 Eastern) to discuss the English language and take questions from callers.


No problem at all

Q: I’m struck by the strangeness of the phraselet “at all.” It seems to pop up everywhere, with a clear connotation but not much denotation at all. Is it shorthand for “at all events”? Seems to me it’s used in cases where the full phrase wouldn’t work at all.

A: “At all” is one of those phraselets (we like your term) that defy literal interpretation.

It functions as an adverb, but taken individually the words “at” and “all” don’t seem to add up to what the idiom means. And what exactly does “at all” mean?

The Oxford English Dictionary says “at all” has been used three ways since it showed up in the mid-1300s: in negative or conditional statements, in interrogative usages, and in affirmative statements (though this sense has generally died out).

When used in negative or conditional “if” statements, according to the OED, “at all” means “in any way,” “to any degree,” “in the least,” or “whatsoever.”

Examples date back to the 15th century and include “stryve not at al” (1476); “no peace at all” (1535); “If thy father at all misse me” (1611); “not at all visible” (1664); “If he refuses to govern us at all” (1849), and “no problem at all” (1975).

When used in questions, the OED says, “at all” has somewhat similar meanings—“in the least,” “in any way,” “for any reason,” “to any extent,” and “under any circumstances.”

Interrogative usages date back to the 16th century, and among the OED’s citations are “what power can it haue on you at all?” (1566); “shall I not vse Tabacco at all?” (1600); “why should he at all regard it?” (1683), and “Why should people care about football at all?” (2008).

But before these negative, conditional, and interrogative usages came into being, “at all” was used in affirmative statements to mean “in every way,” “altogether,” “wholly,” and “solely,” the OED says.

The dictionary’s earliest example, from about 1350, is “I þe coniure & comande att alle” (I thee conjure and command at all).

The affirmative use of the phrase has died out in common usage, however, and now survives only in some regional dialects of American and Irish English.

A 1945 article in the journal American Speech says this affirmative use “lives on in Irish dialect and in colloquial speech in certain parts of America, especially after a superlative.”

The article, which gives “We had the best time at all” for an example, says the usage was reported in Georgia, Kentucky, North Carolina, Oklahoma, elsewhere in the South, and the Midwest.

The Dictionary of American Regional English has 20th-century examples of the affirmative usage from Virginia, Louisiana, West Virginia, Indiana, and Wisconsin.

In affirmative constructions in US regional English, “at all” means “of all” or “only,” according to the DARE.

The regional dictionary cites such examples as “He is the greatest man at all” (1916), “We had the best time at all” (1936), “She’s the finest girl at all” (1942), and “Use one statement at all” (1976).

As for the preposition in “at all,” the OED has this to say: “At is used to denote relations of so many kinds, and some of these so remote from its primary local sense, that a classification of its uses is very difficult.”

Well, we hope this sheds a little light on an idiomatic phrase (or phraselet) that today eludes a word-for-word interpretation.

Finally, a few words about “all,” an extremely useful word.

It functions as many parts of speech: adjective (“all day” … “we all went”); pronoun (“all you need” … “all is well”); noun (“he gave his all” … “the one versus the all”); and adverb (“all dirty” … “it’s all a dream”).

For many centuries, since the days of Old English, the adverb has been used with prepositions in interesting ways to emphasize, affirm, or otherwise modify a verb.

This is where “at all” comes in. But there are many other such phrases, too many to mention in all (there’s one now!).

For example, we use “all” with prepositions to mean “the entire way” or “fully.” The OED’s citations, dating back to early Old English, include quotations from Lord Nelson (“all round the compass,” 1795); Thomas Macaulay (“all down the Rhine,” 1849); and Bob Dylan (“all along the watchtower,” 1968).

We use both “all of” and “of all,” but for different purposes. Similarly we use “in all” and “all in” (as in “I’m all in”). And we often use “all” with “for” and “to”—as in “all to [or for] nought,” “all to hell,” “a free for all,” “all for it,” “all for one and one for all,” and many others.

“All” is also used with words that look like prepositions but are in fact adverbs: “I knew all along” … “they’re all alone” … “go all out” … “look all over” … “fall all round” … “lie all around” … “hemmed all about,” and more.

“All” is such an ancient part of the language that its fossilized traces were evident in words from as far back as early Old English, when it appeared as æl- in compounds.

Remnants are seen today in words like “also,” “always,” “although,” “altogether,” “almighty,” and others.

We mentioned above that “all” can be an adjective, a pronoun, a noun, or an adverb. But once upon a time it was a conjunction as well.

The use of “all” as a conjunction is almost unknown today, but a trace of the old conjunction lives on in the word “albeit,” which is derived from the old phrase “all be it so.”

With that, we’re all done.



In our humble opinion

Q: The new CEO of a local organization recently emailed this: “It is with humbleness and excitement that I take on this leadership role.” Why back-form a clumsy-sounding noun from an adjective when we already have a perfectly good noun—“humility”?

A: One of the blessings of English is its flexibility. We have umpteen different ways of saying something with umpteen different shadings.

That CEO could have taken on his new job with humility, humbleness, modesty, diffidence, meekness, selflessness, and so on.

Clumsiness is in the eye (and ear) of the beholder. It’s not even clear whether “humility” or “humbleness” is the more concise, let alone the nicer. “Humility” has two fewer letters, but “humbleness” has one fewer syllable.

More important, both nouns showed up in English around the same time (back in the 1300s!) and writers have been choosing one or the other ever since, depending on tone, cadence, intonation, and so on.

Shakespeare, for instance, used “humbleness” in the late 1500s in The Merchant of Venice (“With bated breath, and whispring humblenes”) and he used “humility” in the early 1600s in Coriolanus (“Enter Coriolanus in a gowne of Humility, with Menenius”).

He had a way with words, didn’t he? We especially like the idea of whispering humbleness.

Both “humility” and “humbleness” have Gallic roots, though “humbleness” has more of an Anglo-Saxon flavor because of its Old English suffix.

English got “humility” from the Middle French humilité, but the ultimate source is humilis, Latin for low or humble, according to the online Merriam-Webster Unabridged.

“Humbleness” comes from the Middle English adjective humble and the Old English suffix -ness. The adjective, in turn, is derived from the Old French umble or humble, which ultimately comes from humilis, the same Latin source as “humility.”

The first of these nouns to show up in English was “humility,” according to written examples in the Oxford English Dictionary.

The earliest citation in the OED is from “The Five Joys of the Virgin Mary” (circa 1315), a poem by William of Shoreham: “Thorȝ clennesse and humylyte” (“Through pureness and humility”).

The dictionary’s earliest example for “humbleness” is from the Wycliffe Bible of 1388: “He knowynge her pride, and schewinge his owene humblenesse.”

We’ll end with these not-so-humble remarks by Uriah Heep to David Copperfield:

“Ah! But you know we’re so very umble. And having such a knowledge of our own umbleness, we must really take care that we’re not pushed to the wall by them as isn’t umble. All stratagems are fair in love, sir.”



On brooch, broach, and broccoli

Q: How come the ornament pinned over my wife’s clavicle, a “brooch,” is pronounced like “roach” and not like “smooch”?

A: Yes, “brooch” is usually pronounced in the US and the UK to rhyme with “roach,” but some American dictionaries recognize a variant pronunciation that rhymes with “smooch.”

And some US dictionaries also recognize the variant spelling “broach” when the word for the ornamental pin is pronounced like “roach.”

In fact, the noun was spelled neither “brooch” nor “broach” when it first showed up in Middle English in the 1300s, according to the Oxford English Dictionary. (The OED has a questionable citation from the 1100s.)

The word was originally spelled “broche” when Middle English adopted it from broche, Old French for a pointed weapon or instrument.

In Middle English, “broche” was pronounced with a long “o” (as in “hope”), which accounts for the pronunciation you’re asking about, according to the Chambers Dictionary of Etymology.

For a few hundred years, the word “broche” referred to both the ornamental pin and a pointed implement (lance, spear, skewer, awl, and so on). However, “brooch” was occasionally used for the pin, as in the OED‘s earliest example of the ornamental usage.

In the 1500s, English speakers began routinely using the “brooch” spelling for the ornament and the “broach” spelling for the sharp implement, but the spellings weren’t consistent and were often reversed, according to Oxford.

The contemporary acceptance of “brooch” for the pin and “broach” for the tapered tool is relatively recent. As Oxford puts it, “the differentiation of spelling” is “only recent, and hardly yet established.”

In the OED’s earliest definite example for “broche” (from Legends of the Rood, circa 1305, a collection of tales based on the Bible), the word refers to a lance or spear: “A Broche þorw-out his brest born” (“A lance borne through his breast”).

The dictionary’s earliest definite example for the ornamental usage is from The Legend of Good Women, a poem by Chaucer from around 1385: “Send hire letters, tokens, brooches, and rynges.”

The usage ultimately comes from the classical Latin broccus (pointed or projecting). In late Latin, brocca referred to a pointed tool.

The Latin and French sources have given English several other words, according to John Ayto’s Dictionary of Word Origins.

The verb “broach,” for example, meant “to pierce” when it entered English in the 1300s, then came to mean to tap a keg in the 1400s. And English speakers began using “broach” metaphorically in the 1500s to mean “introduce a subject.”

The French verb brocher (to stitch), Ayto adds, has given both French and English the noun “brochure” (literally “a stitched work”).

Finally, the late Latin brocca has given English (via Italian) “broccoli.” (In Italian, brocco is a shoot or stalk, and broccolo is a cabbage sprout.)



Hear Pat live today on WNYC

She’ll be on the Leonard Lopate Show around 1:20 PM Eastern time to discuss the English language and take questions from callers. Andy Borowitz will be filling in for Leonard as guest host. If you miss the program, you can listen to it on Pat’s WNYC page.

Why wine drinks well

Q: Why does a wine critic say a Bordeaux “drinks well”? A food critic wouldn’t say the carpaccio “eats well.” Does this usage have a history or is it just recent jargon?

A: Yes, the usage has a history—a long history!

The verb “drink” has been used intransitively (that is, without an object) since the early 1600s to mean “have a specified flavour when drunk,” according to the Oxford English Dictionary.

All six examples of the usage in the OED refer to wine, though one of the wines is made from fermented plantains, not grapes.

The earliest Oxford citation is from A Woman Kilde With Kindnesse, a play by Thomas Heywood that was first performed in 1603 and published in 1607:

“Another sipped to give the wine his due / And saide unto the rest it drunke too flat.” (We’ve gone to the original text to expand on the dictionary’s citation.)

And here’s an 18th-century example from John Armstrong’s Sketches or Essays on Various Subjects (1758): “The Burgundy drinks as flat as Port.”

The dictionary’s most recent citation is from the May 23, 1969, issue of the Guardian: “Every one of these wines will drink well now: most of them will improve by keeping.”

This use of “drink” is often referred to as “mediopassive,” a middle voice somewhere in between active and passive. In a post last year, we discussed mediopassives like “My new silk blouse washes beautifully” … “Your house will sell in a week” … “The car drives smoothly.” A friend recently sent us her favorite example: “That dress buttons up the back.”

Why, you ask, is this usage common among wine critics, but not other food critics?

The OED suggests that it may have been influenced by the passive use of se boire (the reflexive form of the French verb “drink”).

Other than that, we don’t know. Some questions can’t be answered. That’s one reason why etymology is so fascinating. Let’s drink to that.

The verb “drink” itself is of Germanic origin (drincan in Old English, drinkan in Old Saxon, trinkan or trinchan in Old High German, drekka in Old Norse, and so on).

When the verb showed up in Old English around the year 1000, it was transitive (a transitive verb needs an object to make sense). It meant “to swallow down, imbibe, quaff” a liquid, according to the OED.

Oxford’s first example of the usage is from the Book of Luke in the West Saxon Gospels: He ne drincð win ne beor. (He drinks neither wine nor beer.)

We’ll end by returning to the usage you asked about. Here’s a poetic example from The Compleat Imbiber: An Entertainment (1967), by the wine and food writer Cyril Ray: “I sipped the wine, which drank like velvet.”

[Update, Feb. 13, 2015. A retired English teacher writes: “I was reminded of an old rural Alabama saying, circa 1940-’50, used by cooks who may have overcooked, over salted, or otherwise prepared food not up to their usual standards, but needed to serve said food anyway. ‘This will eat,’ or ‘It’ll eat,’ was used in those cases as a slight apology for the less than perfect dish.” We can think of another example, the advertising slogan for Campbell’s Chunky Soup: “The soup that eats like a meal.”]



Goody goody

Q: I’m fascinated by reduplicatives, especially those whose segments have no particular meaning on their own: “bow-wow,” “choo-choo,” “flim-flam,” “helter-skelter,” etc. I’ve often wondered why we refer to them as “reduplicatives” rather than “duplicatives.”

A: We once wrote a post on the reduplicative copula (“the thing is … is”), a usage that bugs a lot of people. But we haven’t written about the kind of reduplicatives you’re talking about.

In their Dictionary of Linguistics (1954), Mario Pei and Frank Gaynor define “reduplication” as “the complete or partial repetition of an element or elements.” And “reduplicative words” are “words of recurring sound and meaning (e.g., chit-chat).”  

In the 60 years since then, other linguists have defined “reduplication” in other ways. Some, for example, have drawn a distinction between repeated sounds and repeated meanings. But we won’t get into that.

Suffice it to say that “reduplication” is a technical term in linguistics, and that the Oxford English Dictionary’s definitions for “reduplication” and “reduplicative” used in the linguistic sense are similar to those of Pei and Gaynor. 

But as the OED says, “reduplication” in a more general, nonlinguistic sense simply means a doubling, repetition, or duplication.

So you ask a very good question—why do linguists use the “re-” prefix? If “duplication” means copying something once, then “reduplication” would imply copying something more than once, wouldn’t it?

Well, not necessarily, because “re-” doesn’t always mean “again” or “once more.” Sometimes it implies “back” or “backward,” as in words like “respect” (whose Latin roots mean to look back at), “revoke” (call back), “repay” (pay back), “remit” (send back), “remove” (move back), and others. 

It could be that the “re-” of “reduplication” in the linguistic sense originally had this same meaning, implying “back” instead of “again.” Unfortunately, we can’t say for sure that this is the case; we can only suggest it.

The word ultimately comes from the classical Latin verb reduplicare, meaning to double. (The Latin verb duplicare also meant to double.)

In the Latin of the third century and later, reduplication- or reduplicatio came to be used as a rhetorical term for the repetition of a word, according to the OED.

But it’s difficult to tell how the Romans—classical or later—viewed the “re-” in these words, and whether it originally meant “back” or “again.”

As the OED explains, “The original sense of re- in Latin is ‘back’ or ‘backwards,’ but in the large number of words in which it occurs it shows various shades of meaning.”

“Even in Latin,” the dictionary continues, “the precise sense of re- is not always clear, and in many words the development of secondary meanings tends greatly to obscure its original force. This loss of distinct meaning is naturally increased in English, where a word has often been adopted in a sense more or less remote from its original sense.”

In English, “reduplication” has had several meanings since it first entered the language, perhaps as long ago as the early 1400s. Early on, it was used in anatomy and zoology, for instance, to mean a doubling over or folding.

The “reduplication” we’re talking about, the name we now use for words like “mishmash” and “namby-pamby,” came into English in the 16th century. It’s defined in the OED as the “exact or partial repetition of a word, phrase, etc.”

But in some early uses of “reduplication” in this linguistic sense, it meant something similar to “epanalepsis,” a rhetorical device in which an earlier word or phrase is repeated at some later point. This might be interpreted as a looking backward. Here are two OED citations:

“Marke heere againe, how the Prophet resumeth his first admiration, by a Poeticall Epanalepsis or reduplication.” (From the undated Atheomastix, a posthumously published religious treatise by Martin Fotherby, 1560-1620.)

“Reduplication … is a figure in Rhetoric, when the same word that ends one part of a verse or sentence, is repeated in that which follows.” (From Thomas Blount’s dictionary Glossographia, 1656.)

So it’s reasonable to suggest that “reduplication” in poetry or rhetoric originally meant something like “backward duplication” instead of “repeated duplication.”

The modern linguistics terms “reduplication” and “reduplicative” are derived from those earlier literary and rhetorical uses. But though the words have been handed down intact, today the “re-” seems far removed from the “back” sense and apparently means simple repetition.

Here, for example, is a contemporary citation, from L. J. Brinton’s Structure of Modern English (2000): “In English, reduplication is often used in children’s language (e.g., boo-boo, putt-putt) … or for humorous or ironic effect (e.g., goody-goody, rah-rah).”   

And this OED citation, for the adjective “reduplicative,” is from B. F. Skinner’s Verbal Behavior (1957): “A fragmentary self-echoic behavior … may be shown in reduplicative forms like helter-skelter, razzle-dazzle, and willy-nilly.”

In the end, what we’re suggesting is that the “re-” in “reduplication” and “reduplicative” may have originally implied “back” or “backward.” And the modern terms in linguistics have preserved the “re-” prefix even though the meaning of it has changed. That would explain why today the prefix looks redundant.

In our readings about reduplication, we came across an interesting use of the term in art criticism to refer to a visual doubling.

In an essay on photography, the art critic Craig Owens uses the word “reduplication” to characterize a mid-19th-century “double portrait” of a woman who is seen alongside her reflection in a mirror:

“If we speak of this image, and of others like it, as reduplicative, it is because reduplication signifies ‘to reproduce in reflection,’ ” he says in his book Beyond Recognition (1994).

Owens seems to be using the term in the going-back sense to refer to an image seen in reflection. In fact, he draws a parallel to rhetorical reduplication.

“In classical rhetoric,” he writes, “reduplication was a species of repetition, distinguished by the reiteration of a word or phrase within the same part of a sentence or clause.”



REZ-oo-may or RAY-zoo-may?

Q: You say in your post about the American term for a curriculum vitae that it can be spelled “resume,” “resumé,” or “résumé.” But how is it pronounced? If one uses two accents, for example, is it pronounced REZ-oo-may or RAY-zoo-may?

A: British dictionaries (which define the term as a summary, not a list of accomplishments) use two accents.  But American dictionaries (which accept both definitions) are all over the place.

Merriam-Webster’s Collegiate Dictionary (11th ed.), as we noted in our earlier post, lists the spellings in this order: “résumé” or “resume,” also “resumé.” (The wording indicates that the first two are equal in popularity, and the third is somewhat less common.)

However, The American Heritage Dictionary of the English Language (5th ed.) lists the spellings this way: “resumé” or “resume” or “résumé.” (The wording indicates that the three are equally popular.)

In spite of differences in spelling, all the dictionaries we’ve consulted (three British and three American) list REZ as either the only or the primary pronunciation of the first syllable.

When English borrowed the word from French in the early 19th century, it meant only a summary of something.

The earliest example of the usage in the Oxford English Dictionary is from a Feb. 21, 1782, letter from Samuel Andrews to Benjamin Franklin: “I have taken the Liberty to send your Exellency two of my Résumé memoirs.”

The next example, from an 1804 issue of the Edinburgh Review, is clearer: “After a short resumé of his observations on coffee-houses and prisons, Mr. Holcroft leaves Paris.”

The word wasn’t used for a career summary until the 20th century, when this sense began appearing in the US and Canada.

The OED’s first citation is from an advertisement in the Jan. 10, 1926, issue of the Lincoln (Neb.) Sunday Star: “Send resume of previous business connections in letter of application.”

However, the dictionary encloses the entire citation in brackets, which “indicates a quotation is relevant to the development of a sense but not directly illustrative of it.”

The first unequivocal example is from an April 3, 1938, ad in the Hartford Courant: “Recent insurance company experience. $1800-$2000. Send full resume with snapshot.”

In Britain and France, a “résumé” is a summary while a list of accomplishments is a “curriculum vitae.”

Although some Americans also use the term “curriculum vitae” for a list of accomplishments, most refer to it as a “resume,” “resumé,” or “résumé.”

We prefer “resume.” Since the word is usually pronounced REZ-oo-may in English, it seems silly to keep the first accent and even sillier to leave only the second.

Yes, the noun and the verb would then be spelled the same, but it seems unlikely that anyone would confuse them in an actual sentence.

When English borrows words from other languages, they typically become anglicized over time, losing their accents and taking on new pronunciations. We think the time has come for “résumé” to be naturalized as “resume.”



When “inept” is inapt or unapt

Q: I recently wrote a criticism of a certain individual, calling him “incompetent,” then escalating to “inept.” Or so I thought. Are those two terms in fact synonyms, as some on the Internet claim? I thought ineptitude was a step further than incompetence.

A: When “inept” and “incompetent” took on their usual modern meanings in the 1600s, “inept” was the more negative term, but the two words have grown closer over the years, and a few standard dictionaries now define “inept” as “incompetent.”

Merriam-Webster’s Collegiate Dictionary (11th ed.), for example, says one of the senses of “inept” is “generally incompetent” while the online Collins English Dictionary gives one meaning as “awkward, clumsy, or incompetent.”

In the 17th century, both “inept” and “incompetent” meant incapable of doing something, but “inept” had the additional sense of silly or foolish.

Some standard dictionaries still include the silly or foolish sense in their definitions of “inept,” so you’re right to think that “inept” is “a step further,” as you put it, than “incompetent.”

A complication is that “inept” is sometimes confused with “inapt” (not suitable or appropriate) and “unapt” (not likely or inclined)—three words that overlap somewhat.

An “inept” job seeker, for example, may be “inapt” for a certain position or “unapt” to be hired for it.

Some usage guides say “inept” is the more negative of the three terms, and consider its use rude or insulting.

Fowler’s Modern English Usage (rev. 3rd ed.) describes “inept” as an “impolite use” while “inapt” and “unapt” are “reasonably polite.”

Garner’s Modern American Usage (3rd ed.) says “inept” is “usually intended as an insult” and “it’s an inapt choice in other contexts.”

When “inept” first showed up in English in 1603, according to the Oxford English Dictionary, it meant “not adapted or adaptable; not suited [for or to] a purpose; without aptitude; unsuitable, unfit.”

The OED’s first citation is from John Florio’s 1603 translation of a Montaigne essay: “A maner peculiar vnto my selfe, inept to all publike Negotiations.”

A year later, “inept” was recorded in the sense of “absurd; wanting in reason or judgement; silly, foolish,” according to the dictionary.

The first example is from A Counterblaste to Tobacco, a 1604 treatise by King James I of England (James VI of Scotland), expressing his distaste for tobacco:

“As to the Proposition, That because the braines are colde and moist, therefore things that are hote and drie are best for them, it is an inept consequence.”

Later in the 1600s, “inept” came to mean “not suited to the occasion; not adapted to circumstances; out of place, inappropriate.”

The OED’s earliest example is from a 1675 religious treatise by Richard Baxter: “If they mean Negative Propositions, it’s true, but inept.”

We’ll skip the legal senses of “incompetent” (not qualified as testimony or lacking the mental ability to stand trial), which showed up in English in the late 1500s.

In the mid-1600s, “incompetent” took on its modern sense of “inadequate ability or fitness; not having the requisite capacity or qualification; incapable.”

The first Oxford citation is from Fragmenta Regalia, Robert Naunton’s 1641 account of the reign of Queen Elizabeth I: “Sir Francis Knowles was somewhat neare in the Queenes affinitie, and had likewise noe incompetent issue.”

We’ve written before on our blog about the Latin origins of “inept” (also “ept,” “adept,” and “apt”): As we wrote, there was no English word “ept” until it was deliberately created as a humorous antonym to “inept” in the 1930s.

“Inept,” as we said, can be traced to the Latin ineptus, which the OED defines as “unsuited, absurd, foolish.” The Latin word is composed of the negative prefix in- plus the adjective aptus, meaning suited or fitted.

“Competent,” the opposite of “incompetent,” has been part of the language since the 1400s. It comes from the Latin adjective competentem (suitable, fitting, proper, lawful), which is derived from the verb competere (from which we get “compete”).

So etymologically, the notion of being “competent” is related to the idea of being “competitive,” and an “incompetent” person can’t “compete.”

The classical Latin competere, by the way, was formed from com- (together) and petere (aim at, fall upon, strive, reach for). To the Romans, competere originally meant to fall together, coincide, or be suitable.

But in medieval Latin competere came to mean “strive together,” as John Ayto explains in his Dictionary of Word Origins. That’s the sense that gave us our word “compete.”

This is a reminder that Latin, while it was still a living language, grew and changed with the times—just as English does today.



Is “basis” loaded?

Q: I had hoped your “ongoing” article would opine about “on an ongoing basis” and similar constructions. I see phrases like “on a going-forward basis” and “on an expedited basis” more and more (perhaps because I read a lot of documents written by government lawyers). They set my teeth on edge and seem at best wordy.

A: Yes, many of these “basis” expressions could be replaced by simpler modifiers.

Instead of “on a going-forward basis” (which to our surprise produced 90,000 hits on Google), how about “in the future” or “from now on”? And instead of “on an expedited basis” (235,000 hits), why not “quickly”?

We’re not surprised that you find “on an ongoing basis” (a whopping 6.3 million hits!) annoying. It often serves no purpose (other than to give one’s writing an air of stuffiness) and could be deleted.

Here are some recent examples from news stories. Just imagine them without the phrase “on an ongoing basis”:

“It is the most common anti-clotting drug, and most people with heart disease are advised to take it daily in low dose on an ongoing basis” (from the New York Times).

“The Red Cross hopes to have 15 to 20 Canadians cycling through West Africa on an ongoing basis for the next six to 12 months” (from Canada’s CTV News).

“Mobile platforms have changed not only how people shop, but have also enabled them to look for deals and bargains on an ongoing basis and make the most of them on the spot” (from Retail Times online).

To be fair, we did find some examples in which “on an ongoing basis” served a legitimate purpose. But even then, it could have been replaced with something simpler. For example:

“It’s no longer about selling them a game once every year. It’s about being able to offer value on an ongoing basis” (quoted in the Washington Post). That one could be replaced with “every day.”

These “basis” constructions also serve a useful purpose when a writer or speaker wants to emphasize an underlying condition or state of affairs: “She was hired on a trial basis” or “They’re on a first-name basis.”

And the constructions are handy when used to emphasize a fixed pattern or system for doing something: “Our employees are paid on a monthly basis.” Though in normal usage, “I’m paid monthly” seems more felicitous to us.

The Oxford English Dictionary has no entry for the phrase “ongoing basis,” but in searches of various databases we found examples dating from the late 1950s.

The earliest was from the September 1959 issue of the journal Biometrics: “providing meaningful data to the clinician on an ongoing basis as opposed to providing him with results based on mere endpoint observations.”

The expression cropped up occasionally during the 1960s, then with increased frequency throughout the 1970s and beyond. It has proved especially popular among scientific, corporate, and governmental writers.

For instance, it appeared no fewer than six times in a 119-page instructor’s guide, “Professional Career Systems in Housing Management,” published in 1979 for a workshop sponsored by the Department of Housing and Urban Development.

In each of the passages (you’ll have to trust us on this, since we’re in no mood to quote them), the phrase could have been replaced by the adverb “regularly.”

The OED has no discussion of the use of “basis” with temporal adjectives. (The last new citation in the “basis” entry is from 1958.)

But throughout the dictionary, in citations for other words, there are scores of examples in which “basis” is modified by “daily,” “hourly,” “weekly,” “monthly,” “yearly,” “annual,” “regular,” “irregular,” and “continuing.” So “ongoing basis” was probably inevitable.

A final word about “basis,” which English adopted directly from the Latin noun basis (foundation). The earlier Greek noun basis (something to step or stand on) is derived from the verb bainein (to step or go), according to the Chambers Dictionary of Etymology.

When “basis” entered English in the 1500s it meant the same as “base,” a word that had come into the language through Old French in the 1200s. And for a time, “base” and “basis” had the same meaning—the foundation, pedestal, support, or foot of some material thing.

But around 1600, according to OED citations, “basis” acquired several figurative or transferred meanings in respect to immaterial things: a principal ingredient or constituent, an underlying foundation, a principle, or a fact.

The result is that today “basis” retains only its newer meanings, and it’s no longer used in its original, material sense.



A hit and a lick

Q: I’m trying to find the origin of “a hit and a lick,” a saying I learned while living in East Texas. I found an article about “a lick and a promise” on your site. I suspect the meaning is similar, but I’d like to have your input.

A: We haven’t been able to find “a hit and a lick” in any of our slang or idiom references, and it doesn’t seem to be used much.

In searches of news and book databases, we’ve found only a couple of dozen examples of the usage, with the earliest dating back to the late 19th century.

We suspect that the usage may be a conflation of two similar expressions: the adjectival phrase “hit-or-miss” (meaning sometimes successful and sometimes not) and the noun phrase “a lick and a promise” (a superficial effort).

Or it may be a variation on the verbal expression “hit a lick,” which Green’s Dictionary of Slang says can mean, among other things, “to make an effort; usu. in negative combs. implying laziness on behalf of the subj. of the phr. e.g. He hasn’t hit a lick all week.”

Whatever its origin, the expression “a hit and a lick” seems to be used, as you suspect, in the same sense as “a lick and a promise.”

The earliest example of the usage we could find is from the Feb. 27, 1891, issue of the biking journal Wheel and Cycling Trade Review:

“X Bones will plead that, like other members of the club, he has not seen enough of the gentleman recently to be able to tell as much about him as he would like, and so, in the homely old phrase, he will give this sketch ‘a hit and a lick’ and let it go.”

And here’s an example from a 1920 issue of the Institution Quarterly, the journal of the Illinois Department of Public Welfare:

“A hit and a lick here and there have been all it has ever received. Much improvement could be made in its typographical appearance and in the character and preparation of its contents.”

We’ll end with a comment on the Ticketmaster website about a B. B. King concert at the Horseshoe Southern Indiana Hotel on Nov. 16, 2013:

“His band started out the first 15 minutes with instrumentals. He was on stage himself 75 minutes, and only sang parts of two (2) songs with a hit and a lick on both. Most of the time he spent conversing with people on the first row and asking band members if they remembered things and asking his number one back up rhythm guitar player to take over and play some instrumentals.”



When “George” was “Geo.”

Q: Why are certain men’s names abbreviated in old books and records? Examples: “Geo.” for George, “Thos” for Thomas, “Jos.” for Joseph, “Wm” for William, and “Chas” for Charles?

A: Men’s names aren’t the only ones. Women’s names are shortened in old writing too: “Abig.” for Abigail, “Const.” for Constance, “Lyd.” for Lydia, “My” for Mary, “Urs.” for Ursula, and so on.

Names and other words were abbreviated in old documents to save time and writing material. A census taker, tax collector, or scribe could speed up his work and cut down on paper, parchment, vellum, or papyrus.

Writing material was expensive until the introduction of steam-driven machines to mass-produce paper out of wood pulp in the 19th century.

However, the abbreviating of names and other words didn’t die out with scribes and parchment. Writers now abbreviate in email, texts, tweets, and instant messages.

And some analog types still abbreviate the old-fashioned way. We have a friend in Iowa City who writes only letters for personal correspondence, using every last inch of her stationery and abbreviating like a scribe of yore.

Paleographers, philologists, and linguists have studied the practice of shortening names and other words over the years.

In a 2013 paper, “Manuscript Abbreviations in Latin and English,” the language researcher Alpo Honkapohja discusses the practice in classical and medieval times.

“The two main reasons to use abbreviations are the economy of time and the economy of space,” Honkapohja writes.

He says economy of time “was the more important one in Ancient Rome, where abbreviations were needed for making quick transcriptions of spoken language.”

“In late Antiquity and the early Middle Ages,” he adds, “saving parchment became the driving principle.”

In The Handwriting of English Documents (1958), L. C. Hector writes that medieval abbreviations “saved time and space by allowing the scribe to drop letters from his writing of individual words.”

“A word of which the beginning is written and the end omitted is said to be suspended: the most extreme form of suspension is, of course, the representation of a whole word by its initial letter alone,” he says.

When a writer “omits a letter or letters from the middle of a word, so that its beginning and end remain, the word is said to be contracted,” Hector says.

When an abbreviated name is contracted, the last letter can appear either on the line or raised in superscript. So a contraction of the name William appears in old writing as “Wm” (with the “m” sometimes superscript), while a contraction of Jonathan (or John) appears as “Jno.” (again, with the final letter sometimes superscript).

When letters are eliminated from the end of an abbreviated name, the shortened form is often followed by a colon or a dot, but the punctuation is usually dropped with a contracted name. We’re using dots in this post for all abbreviated names except contractions.

Several genealogical websites include lists of names that are often shortened in old documents. GenealogyInTime, for example, has a page of abbreviated first names, minus the dots.

And the Treasure Maps Genealogy site has a page that shows how some common abbreviated given names look in handwritten manuscripts.

We’ve had many items on our blog about abbreviations, including a posting in 2013 about the singer-songwriter Prince’s use of letters and numbers in his lyrics as shorthand for sound-alike words.

On a related subject, we wrote a post in 2012 about palimpsest and crossed (or cross) writing, two techniques used to conserve writing material in bygone days.

In crossed writing, a poor or frugal letter writer would fill a page of paper with writing, then turn it sideways and fill the page again with text running perpendicular to the original.

In palimpsest, old writing is scraped or rubbed away from parchment or vellum, so the material can be recycled. Documents made of more fragile papyrus were sometimes washed and used again.

On still another related subject, we’ve discussed nicknames several times on the blog, including posts in 2008, 2011, and 2013.



A hand in the game

Q: Do you have any idea as to the origins of the expression “a hand in the game” and how old it might be?

A: We’ve found examples of “a hand in the game” in British and American writing—fiction as well as collections of letters and so on—dating back to the early 1800s.

The word “hand” had been used for centuries before that to mean involvement, or “a part or share in doing something,” according to the Oxford English Dictionary.

That sense of the word is chiefly used, the OED says, in the expression “to have a hand in,” which was first recorded in the 1580s.

This 18th-century quotation from the OED is a good example: “I solemnly protest I had no hand in it,” from Oliver Goldsmith’s novel The Vicar of Wakefield (1766).

And here’s a contemporary OED citation quoting Queen Elizabeth II, from the Coventry Evening Telegraph (2012):

“Prince Philip and I want to take this opportunity to offer our special thanks and appreciation to all those who have had a hand in organising these Jubilee celebrations.”

So the notion of having “a hand in” may have led to the longer expression “a hand in the game,” with “game” used literally to mean a card game or figuratively to mean some activity or project.

Since the mid-1500s, the OED says, “hand” has been used to mean the set of cards held by a player. And this sense of “hand” has been used figuratively since 1600 to mean one’s lot or fate.

So by extension, to have “a hand in the game” may refer to being a player—if not in an actual card game, then in some other enterprise, good or bad. 

This interpretation seems to make sense, considering the contexts in which 19th-century writers used the expression. Some of them also threw in metaphorical references to cards or gambling.

For instance, this passage describing the character of a miser comes from an essay written in 1815 by Conrad Speece, a newspaper columnist in Staunton, Va.:

“He is capitally skilled in the making of bargains, and makes a great many. On this subject his maxim is, ‘all the world is a cheat, and he is a fool that has no hand in the game.’ ”

And this quotation is from a letter written in 1824 by the Scottish painter Sir David Wilkie. Here he writes to a fellow artist, mentioning that the two of them had promised to do a joint drawing of a nobleman’s stately home:

“I successfully showed his Lordship that the delay did not rest with me, that you were the first hand in the game, and that it was not my turn till you had played your card.” (Notice that Wilkie also uses the image of “playing a card” to mean “making a move.”)

The expression cropped up a few years later, in 1827, when Thomas Carlyle translated a passage from the German writer Jean Paul Friedrich Richter:

“However, I could not speak to her, nor as little to the Devil, who might well be supposed to have a hand in the game.”

And later in the century, we find this example in Robert Louis Stevenson’s travelogue An Inland Voyage (1878): “In a place where you have taken some root you are provoked out of your indifference; you have a hand in the game.” 

Over the centuries, card games have given us many phrases that have acquired meanings beyond the poker or bridge table:

to “show [or declare] one’s hand”; to “lay the cards on the table”; to “be dealt a bad [or good] hand”; to “trump” someone; to play to one’s “strong [or weak] suit”; to “have a difficult hand to play,” and others.

Our guess is that “a hand in the game” is one more.



Ruminations on chewing the cud

Q: Some sources list “cud” as an uncountable noun while others say it’s countable. What’s your opinion?

A: A countable or count noun, as you know, is one that can be modified by an indefinite article (“a” or “an”) or a number: “a book,” “three dogs,” “seven dollars,” etc.

A mass or uncountable noun represents something that can’t be counted—a substance, a quality, an abstract idea, and so on. In ordinary usage, it’s singular and not modified by indefinite articles or numbers, like “water,” “fragility,” and “happiness.”

“Cud” is a substance (partly digested food that’s chewed again), so it’s a singular mass noun in the ordinary sense: “The cow seems contented to chew its cud.”

But the plural “cuds” is sometimes used, especially by agricultural writers, when referring to more than one cow (“the Jerseys were chewing their cuds”) or to several instances of cud chewing by a single cow (“the ailing Holstein spit out three of her cuds”).

We’ve checked seven standard dictionaries, and all the examples given use “cud” as a mass noun and refer to a singular cow chewing its singular cud.

However, the Oxford English Dictionary and farming journals have examples of multiple cows chewing multiple cuds. And the journals also have examples of a single cow chewing multiple cuds.

The OED entry for “cud” has this definition: “The food which a ruminating animal brings back into its mouth from its first stomach, and chews at leisure.” The word usually appears, the OED adds, in the verbal phrase “to chew the cud.”

The word “cud” is derived from old Germanic sources meaning glue or a glutinous substance. It’s been part of our language since Old English, and was later adapted to mean anything held in the mouth and chewed repeatedly, such as chewing tobacco.

As John Ayto notes in his Dictionary of Word Origins, the word “quid,” which means a plug of chewing tobacco, is a variant of “cud.”

The OED’s examples for written uses of “cud” date back to about the year 1000. In most of them, the noun is in the singular, “cud,” but there are plural examples too, like these:

“The whiles his flock their chawed cuds do eate.” (From a poem by Edmund Spenser, Virgil’s Gnat, 1591.)

“They … began grazing and chewing their cuds.” (From Nathaniel Hawthorne’s novel The Blithedale Romance, 1852.)

We notice that all the uses of “cuds” in the OED refer to plural animals. But in checking journals devoted to cattle raising and dairy farming, we find both “cud” and “cuds,” with the plural used to refer to multiple instances of cud chewing.

In one farm journal, a sick cow “frequently spit out her cuds”; in another, a cow “was chewing her cuds all right.”

As you know, there’s more than one way to “chew the cud.” As far back as the 14th century, the phrase has been used in a figurative sense to mean to meditate, ponder, or reflect.

We like this 18th-century example from Tobias Smollett’s novel The Expedition of Humphry Clinker (1771): “I shall for some time continue to chew the cud of reflection.”

And of course to meditate or turn something over in one’s mind is to ruminate.

As you probably suspect, the verb “ruminate” literally means to chew the cud. Its etymological ancestor is the Latin rumen (gullet), according to the Chambers Dictionary of Etymology.

A “ruminant” is an animal, like a sheep or cow or goat, that gets nutrients from plant roughage by chewing it, then swallowing it so fermentation can take place, then regurgitating and chewing it again.

Chambers says we owe both “ruminate” and “ruminant,” as well as the Latin verb ruminare (to chew the cud or to think over), to a prehistoric Indo-European root, reconstructed as reu-, that had a humble meaning—to belch.



The original tiger mother?

Q: I’m curious to know if Amy Chua originated the phrase “tiger mother” or if it’s something that was around before her book. I (a possible Tiger Mom) can’t remember if I ever used it before Ms. Chua’s book and the subsequent media blitz.

A: No, Amy Chua didn’t coin the phrase in Battle Hymn of the Tiger Mother, a 2011 memoir about her strict parenting techniques, but she did help popularize the term.

Saul Bellow, for example, used the phrase several times before Ms. Chua’s book appeared in print. Here’s an example from his 1975 novel Humboldt’s Gift:

“When she was in her busy mood, domineering and protecting me, I used to think what a dolls’ generalissimo she must have been in childhood. ‘And where you’re concerned,’ she would say, ‘I’m a tiger-mother and a regular Fury.’ ”

And here’s one of two examples in his 1989 novella The Bellarosa Connection: “They were married and, thanks to him, she obtained her closure, she became the tiger wife, the tiger mother, grew into a biological monument and a victorious personality.”

In fact, the phrase “tiger mother” has been around since the 19th century, although many early examples use it in the sense of a protective mother rather than one who is strict or domineering, a meaning reinforced by Ms. Chua’s memoir.

The earliest example we’ve found is from an 1878 English translation of Bilder aus Oberägypten, an 1876 book about upper Egypt by the German zoologist and physician Karl Benjamin Klunzinger.

In the English translation, Klunzinger says the fear of mothers-in-law among the Bedouin of upper Egypt “perhaps naturally arises from the relationship itself, being expressed also in our proverb ‘Mother-in-law—tiger mother’ or ‘Devil’s darling.’ ”

In the original German, Klunzinger refers to the expressions as “Schwiegermutter—Tigermutter” and “Schwiegermutter—Teufelsunterfutter.”

Comrades Two, a 1907 novel by Elizabeth Fremantle (the pseudonym of Elizabeth Rockfort Covey), has an early example of the phrase used in its protective sense.

In the novel, which is set in Saskatchewan, the mother of a son suffering from typhoid fever says “the instinct of the tiger-mother is tearing my heart to pieces.”

This more recent example of the protective usage appears in an article (“Be My Baby,” by Jane Hutchinson) published on May 8, 2005, in the Sunday Telegraph Magazine: “My sister calls us ‘tiger mothers,’ because we’re so protective.”

Interestingly, the phrase is used with the words reversed in Mother Tiger, Mother Tiger (1974), the title of Rolf Forsberg’s short film about an angry mother who struggles to accept the fact that her child is severely handicapped.

Although “tiger mother” didn’t show up in English until the 19th century, the word “tiger” itself has been used figuratively since the 1500s in reference to someone who is fierce, cruel, active, strong, or courageous.

The Oxford English Dictionary, for example, cites a 1585 prayer that thanks God for foiling a plot against Queen Elizabeth and saving “her from the jaws of the cruel Tigers that then sought to suck her blood.”

The OED also has citations from around the same time of the word “tiger” used adjectivally and adverbially in a similar sense.

Here’s one from The Theatre of Gods Iudgements, a 1597 book by the English clergyman Thomas Beard about divine retribution: “The poore old man thus cruelly handled … departed comfortlesse from his Tygre-minded sonne.”

And the OED also has examples from the 1500s of “tigerlike” used both adjectivally and adverbially.

In The Historie of England (1587), Raphael Holinshed writes of men who avenged the wrongs of the past with “more than tigerlike crueltie.”

And in “The Complaynt of Phylomene,” a 1576 poem retelling the Greek myth of Philomela, George Gascoigne writes that Progne, the boy’s mother, took him “Tygrelike” and stabbed him in the heart.

We’ll end with an example of “tiger mother” from a 2014 review of Daniel E. Sutherland’s biography of James Abbott McNeill Whistler. Here’s how the New York Times reviewer describes the artist’s mom:

“So there she sits, old Mrs. Whistler, in her black dress and lacy bonnet. Call her the original tiger mother. If she looks back to dour Puritans, she looks forward to an American culture of self-display, where you are only as good as your most recent publicity.”

Note: Amy Chua tells us that she used the term “tiger mother” in her memoir because she was born in the Year of the Tiger. She also reminds us that Jacqueline Kennedy once used the term to refer to her father-in-law, Joseph P. Kennedy. In an interview with Arthur M. Schlesinger Jr. for a 1964 oral history, she said, “I always thought he was the tiger mother.”



Is there a disconnect here?

Q: Is the word “disconnect” properly used as a noun?

A: Yes, “disconnect” has been a noun for more than a century, though the contemporary sense of a difference or an incompatibility is relatively new.

Webster’s New World College Dictionary (4th ed.) considers the newer sense informal, but the other six standard dictionaries we’ve checked list it without comment, indicating that it’s used in formal as well as informal English.

Although the noun “disconnect” is a relative newcomer (it dates from the early 1900s), “disconnection” has been a noun since the mid-1600s, meaning lack of connection, separation, or detachment.

The earliest example for “disconnection” in the Oxford English Dictionary is from Jasper Mayne’s 1663 translation of Lucian’s Dialogues:

“He still raises the derision of the auditory by his disconnections, and tautologies, and Nonplusses.”

The shorter word “disconnect” first showed up in English in the mid-1700s as a verb meaning to destroy the logical connection between things or to cause things to become disjointed.

The verb ultimately comes from Latin: the prefix dis- (apart) and the verb conectere (to join together).

The earliest example in the OED is from Moravian Heresy, a 1751 treatise by John Roche denouncing the Moravian Church:

“And if the Text does not chance to have Words enough sufficient to make a full Answer to the Question put, then the Sense is defective; if too many Words, then do they disconnect the Tenor, and confound the Sense.” (We’ve expanded the citation.)

Over the years, according to OED citations, the verb took on many related senses, including to break a physical connection (1758), to detach an electrical device from its power supply (1826), to end a telephone call (1877), to withdraw from society or reality (1961), and to terminate a computer connection (1977).

When the noun showed up in the early 20th century, the dictionary says, it referred to an “act or instance of disconnecting something; esp. a break of an electrical or telephone connection.”

The OED’s earliest example of the noun is from Telephony, a 1905 book by A. V. Abbott about the design, construction, and operation of telephone exchanges: “These signals must appear as a disconnect as soon as the receivers are replaced.”

The noun took on its contemporary sense of “a lack of consistency, understanding or agreement; a discrepancy” in the early 1980s, according to Oxford.

The dictionary’s earliest example is from a 1982 issue of Parameters, a journal of the US Army War College: “The result was the same: a disconnect between the security policy and the military strategy needed to achieve the political objective.”



Who put the “X” in “Xmas”?

(We’re repeating this post for Christmas Day. It originally ran on Dec. 26, 2006.)

Q: I haven’t seen the word “Xmas” much for the last few years, probably because of all the attacks on it as part of a secularist plot against Christmas. In any case, what is the origin of “Xmas” and how did an “X” come to replace “Christ”?

A: Anybody who thinks “Xmas” is a modern creation that represents the secularization and/or commercialization of Christmas should think again. The term “Xmas” has been around for hundreds of years and “X” stood in for “Christ” for many hundreds of years before that.

The first recorded use of the letter “X” for “Christ” was back in 1021, according to the Oxford English Dictionary. But don’t blame secularists. Blame the monks in Great Britain who used “X” for Christ while transcribing manuscripts in Old English.

It turns out that the Greek word for Christ begins with the letter “chi,” or “X.” It’s spelled in Greek letters this way: ΧΡΙΣΤΟΣ. In early times the Greek letters “chi” and “rho” together (“XP”) and in more recent centuries just “chi” (“X”) were used in writing as an abbreviation for “Christ.” Sometimes a cross was placed before the “X” and sometimes it wasn’t.

Thus for nearly ten centuries, books and diaries and manuscripts and letters routinely used “X” or “XP” for “Christ” in words like “christen,” “christened,” “Christian,” “Christianity,” and of course “Christmas.” The OED’s first recorded use of “X” in “Christmas” dates back to 1551.

One other point. Although the St. Andrew’s Cross is shaped like an “X,” there’s no basis for the belief that the “X” used in place of “Christ” is supposed to represent the cross on Calvary.



Bogus origins

Q: I keep seeing “bogus” used in ways that seem too colloquial. Somehow saying Colin Powell made bogus claims about WMDs just doesn’t possess the right connotation. So is my claim of excessive informality correct or bogus?

A: We’ve checked seven standard dictionaries and none of them suggest that “bogus” is anything but standard English when used to mean counterfeit, fake, or spurious.

But one of the sources, The American Heritage Dictionary of the English Language (5th ed.), considers “bogus” slang when used in two less common senses:

(1) “Not conforming with what one would hope to be the case; disappointing or unfair” and (2) when used as an interjection “to indicate disagreement or displeasure with another’s actions or a circumstance.”

American Heritage gives this example of the first slang sense: “It’s bogus that you got to go to the party, and I had to stay home.” It doesn’t have any example for the second.

Although “bogus” is considered standard English today when used in its false sense, the word did originate in the late 1700s as US slang.

The Random House Historical Dictionary of American Slang says the term was originally underworld argot for “counterfeit coins; counterfeit money,” and in the early 1800s it came to mean “a machine for coining counterfeit money.”

The earliest Random House citation for “bogus” is from Band of Brothers (circa 1798): “Coney means Counterfeit paper money … Bogus means spurious coins, &c.”

The slang dictionary’s first example of “bogus” used for a machine to make phony money is from the July 6, 1827, issue of the Painesville (Ohio) Telegraph: “He never procured the casting of a Bogus at one of our furnaces.”

The earliest Random House cite for “bogus” used as an adjective to mean fraudulent or phony is from The Banditti of the Prairies, an 1855 book by Edward Bonney about his work as a private detective to expose criminal gangs in Illinois:

“I have a little bogus gold but have been dealing mostly in horses.”

The Oxford English Dictionary has several earlier citations for the adjective, including this one from A New Home—Who’ll Follow? (1839), by Caroline Matilda Kirkland, writing under the pen name Mrs. Mary Clavers:

“And in the course of the Tinkerville investigation the commissioners had ascertained by the aid of hammer and chisel, that the boxes of the ‘real stuff’ which had been so loudly vaunted, contained a heavy charge of broken glass and tenpenny nails, covered above and below with half-dollars, principally ‘bogus.’ ” (We’ve expanded on the citation.)

The OED says “many guesses have been made, and ‘bogus’ derivations circumstantially given” about the origin of the word.

The dictionary notes, for example, that Eber D. Howe, editor of the Painesville, Ohio, newspaper cited above, wrote in his 1878 autobiography that “bogus” might “have been short for tantrabogus, a word familiar to him from his childhood, and which in his father’s time was commonly applied in Vermont to any ill-looking object.”

We suspect, however, that Howe’s suggestion as well as several others we’ve seen (a forger named Borghese, the French word bagasse, etc.) are bogus.



Hark! the Herald Angels Sing

Q: Is the subject grammatically correct in the title “Hark! the Herald Angels Sing”? That’s how it appears in our hymnal. Astonishingly, this is a practical issue, since we display the words during church services via video projection.

A: Yes, the subject is grammatically correct.

The plural subject “Angels” (part of the noun phrase “the Herald Angels”) agrees with the plural verb (“Sing”). The word “Hark!” in the title is a stand-alone imperative verb meaning “Listen!”

Although the grammar is correct, the punctuation and capitalization might seem odd to modern readers. If the title were written today, it would probably be either “Hark, the Herald Angels Sing” or “Hark! The Herald Angels Sing.”

However, we see no reason to modernize the title. In fact, we prefer the old-fashioned punctuation and capitalization. It gives the 18th-century hymn a patina of age.

Interestingly, the original hymn, written by Charles Wesley, was entitled “Hymn for Christmas-Day” and had nothing in it about “Herald Angels.”

Here are the opening lines from the earliest version of the hymn, as published in Hymns and Sacred Poems (1739), a collection of verse compiled by Charles and John Wesley, leaders of the Methodist movement:

Hark how all the welkin rings
“Glory to the King of kings,
Peace on earth, and mercy mild,
God and sinners reconcil’d!”

George Whitefield, a preacher and friend of the Wesley brothers, rewrote the first two lines in A Collection of Hymns for Social Worship (1753):

Hark! The herald angels sing
“Glory to the new-born King!”

In 1855, the English musician William H. Cummings made several other changes, including adding the refrain, when he set the hymn to music by Felix Mendelssohn.

The hymn has had other titles over the years (“On the Nativity,” “Christmas Hymn,” “An Ode,” “The Song of the Angels,” and so on), but it was often referred to simply by either its first line or a number in a hymnal.

The earliest examples we’ve found for “Hark! the Herald Angels Sing” used as the title are in a list of sheet music for Christmas hymns in the Nov. 1, 1864, issue of the Musical Times.

In five different arrangements of the hymn for four voices, the title is written in all capital letters: “HARK! THE HERALD ANGELS SING.”

In an article that discusses the editing of Charles Wesley’s hymn, C. Michael Hawn, a sacred-music scholar, notes that changes in the texts of hymns are quite common.

“The average singer on Sunday morning would be amazed (or perhaps chagrined) to realize how few hymns before the twentieth century in our hymnals appear exactly in their original form,” Hawn writes.

He considers the replacement of the term “welkin” in the first line as “perhaps the most notable change” in the Wesley hymn.

And what, you’re probably wondering, is a welkin? As Hawn explains, it refers to “the sky or the firmament of the heavens, even the highest celestial sphere of the angels.”

Hawn cites a light-hearted comment by the Wesley scholar Ted Campbell that suggests the term may not have been a household word even in the 18th century:

“I have wondered if anybody but Charles knew what a welkin was supposed to be.”



Bread and dripping

Q: The next time Pat appears on the Leonard Lopate Show, she should tell Leonard that here in England we don’t all eat “drippings” (“dripping” in British English) for breakfast! The last time I tasted dripping was after the Second World War when food was still rationed. I’ve certainly never heard of it for breakfast. Fried bread is still popular, though my GP wouldn’t be too pleased if I indulged.

A: We do recall that Leonard once mentioned “drippings” in a discussion of British breakfast habits. Like us, he probably enjoys vintage British fiction, stories in which kids slip away from Nanny and sneak into the kitchen, where Cook gives them a treat of “bread and dripping.”

We’re big fans of Angela Thirkell, and we recall such scenes in her Barsetshire novels, which begin in the early 1930s and end in the late ’50s. In either kitchen or nursery, children are indulged with lavish helpings of “dripping,” spread on fresh warm bread.

We always assumed “dripping” meant bacon grease, but we should have consulted the Oxford English Dictionary! We would have found it defined this way: “the melted fat that drips from roasting meat, which when cold is used like butter. Formerly often in pl.”

The Cambridge Dictionaries Online says the noun is singular in the UK and plural in the US, though all the American dictionaries we’ve checked list “dripping” as the principal noun, with “drippings” as a common variant.

Gravy, as every cook knows, is made from the drippings (we prefer the variant) that come from roast meats—hot fat plus crispy morsels and bits of meat that have fallen off.

In many parts of the US, “biscuits and gravy” is a staple, and you can order it for breakfast in diners, alongside your eggs. (Tell THAT to your GP!)

So from now on, we’ll think of “dripping” as a sort of pre-gravy, before the flour and extra liquid are stirred in.

We’ve occasionally skipped the flour and used this pre-gravy with bread or mashed potatoes, but we’ve never used the cold congealed stuff like butter, as the OED suggests.

The British have used the noun “dripping” since as far back as the 15th century. The word is implied in a reference to “drepyngpannes” (dripping-pans) that was published in an Act of Parliament in 1463, according to the OED.

References to “dripping” itself began appearing in 1530 (“drepyng of rost meate”) and continued until well into modern times.

The OED’s citations conclude with this one, from Rosa Nouchette Carey’s novel Uncle Max (1887): “A piece of bread and dripping.”

However, the tradition has apparently lived on. We’ve found plenty of subsequent references to “bread and dripping,” eaten at breakfast or tea or even for supper, in the works of George Orwell, Doris Lessing, Somerset Maugham, P.D. James, Margaret Atwood, and too many others to mention.

And the online Oxford Dictionaries offers this example of the usage: “I still carry around a hankering for bread and dripping, steamed pudding, and sweet macaroni, but I know they will do me no good, so I avoid them.”

Contributors to British cooking websites often wax nostalgic about “bread and dripping.” Some recall it as a humble working-class dish, or as a byproduct of food rationing. But others still eat it with relish (that is, with enjoyment) just because they like it.




Deconstructing “it”

Q: I’m flummoxed by the word “it” in a sentence such as “I like it when you sing.” What in the world is “it” doing there?

A: The sentence that puzzles you, “I like it when you sing,” is a familiar construction, especially in spoken English. We find nothing grammatically wrong here, as we’ll explain later.

But you’re right—on close examination, this familiar old pattern seems curiouser and curiouser.

In sentences like this a verb, often one expressing a state of mind (“like,” “love,” “hate,” “appreciate,” etc.), has as its object the pronoun “it,” followed by a clause beginning with “when.” (A clause, as you know, is a group of words that has a verb and its subject.)

Here are some similar examples: “She loves it when he smiles” … “I hate it when people swear” … “Mom and dad appreciate it when you do the dishes” … “He always regrets it when he’s rude.”

All of these examples seem quite innocent on the surface. But what’s happening underneath?

As you can see, there are two clauses here. Using your original sentence as our model, the clauses are “I like it” and “when you sing.”

In the main clause, “it” is the direct object of the verb “like.” And to identify what “it” is, the speaker follows with a subordinate clause that begins with “when” and names an event or circumstance.

So the “when”-clause is an object too, in a sense. It explains what the object “it” refers to: an occasion on which someone sings. So in that sense the “when”-clause resembles a noun clause.

But it also seems to have an adverbial use, since it says when something happens. It describes the condition required for the main clause to be true. So instead of referring to a time, this “when” refers to a situation.

Often sentences like these can be reversed: “I like it when you sing” neatly corresponds to “When you sing, I like it.” In the second version, “it” refers back, instead of forward, to the explanatory “when”-clause.

But you wouldn’t want to move a “when”-clause to the front unless it’s fairly short and simple. Here’s a sentence that would sound clunky if flipped:

“I hate it when a birthday invitation says ‘No gifts, please’ and then everyone but you brings one anyway.” There’s no felicitous way to move “I hate it” to the end.

Linguists have interpreted this kind of construction in many different ways over the years. For example, they’ve used a variety of terms in discussing the role played by “it.”

In A Grammar of the English Language (1931), George O. Curme interprets this “it” as “an anticipatory object” that points forward to a fuller object clause.

In his book When-Clauses and Temporal Structure (1997), Renaat Declerck calls this a “cataphoric” or “anticipatory” pronoun, one that depends on the “when”-clause for its meaning. (A “cataphoric” pronoun is one that refers to a following word or phrase.)

Other commentators have described this “it” as an “expletive” or “pleonastic” pronoun—one with no meaning of its own, but merely a sort of placeholder required by the word order.

But we’ve also found arguments that the pronoun is not “pleonastic”: it’s not without meaning, since it refers to an event.

Linguists have also disagreed in their views of the “when”-clause in sentences like these—is it a relative clause, a noun clause, an adverbial clause, or perhaps some combination of those?

Declerck regards these clauses as adverbial. And when preceded by “it” acting as an object, he writes, they are “extraposed when-clauses.” (Essentially, an element is “extraposed” when the pronoun takes its place and shoves it aside.)

Without the “it” (as in “I don’t like when people argue”), the “when”-clause itself “fills the object position,” Declerck writes. So in that case the clause is not “extraposed.”

As we mentioned above, we find nothing grammatically wrong with sentences such as “I like it when you sing.” They seem natural and idiomatic, and they work well. But they do seem more at home in informal or spoken English.

No usage authorities, to our knowledge, have condemned the use of a “when”-clause to describe an event. And the use of “it” as an object that’s then echoed by the “real” object is also a common feature of English, as in “I like it, this movie,” and “He loathes it, that old eyesore.”

So we have no quarrel with these “when”-clauses in spoken or informal English, but if you prefer to avoid them you certainly can. Many constructions are similar, though in some cases they may be subtly different.

Declerck says, for example, that “I hate it when you talk like that” will generally be interpreted as similar to “I hate your talking like that.”

But the two don’t mean precisely the same thing. One refers to an occasion, the other to what could be habitual behavior. If the person you’re addressing always talks like that, then either construction would be appropriate.

Another kind of substitution comes to mind. You can often replace “when” by “that” and still make grammatical sense.

But again, your meaning may be changed. “I like it when you sing” isn’t the same as “I like it that you sing.” In the first sentence, the object of the liking—“it”—is not the fact that the person sings, but occasions when the person sings.

A few years ago we wrote a post on a similar subject, the use of “when”-clauses in definitions after forms of the verb “be.” (Example: “Despair is when you see no way out.”)

As we wrote then, this construction is common and has a long history, but it’s been considered colloquial since the mid-19th century. It’s common in speech and casual writing, but it’s generally avoided in formal English.

If you see “when”-clauses after the verb “be” in formal writing, it’s usually in reference to time, as in “Yesterday was when I heard the news” or “This is when you should change the oil.”

And now is when we should sign off.

