The Grammarphobia Blog

The secret life of sources

Q: On the news, I often hear a source quoted “on condition of anonymity” and several variants thereof. This usage sounds like journalese or legalese. Can you clarify its origin and the subtleties of its forms?

A: The expression “on condition of anonymity” is associated with news reporting, but we see it in legal contexts as well. So you’d be justified in calling it both “journalese” and “legalese.”

On the legal front, someone might wish to give information without having it linked to his name. This might happen, for instance, in organized-crime or securities-fraud cases, as well as other kinds of law-enforcement investigations.

But anonymous sources are probably most common in journalism, and have been for quite some time. As a British weekly, the Publishers’ Circular, noted in an editorial in 1893, “Anonymity in the press is not a new subject of discussion.”

Someone who talks to a reporter “on condition of anonymity” is willing to give information—but only if he’s not named. He wants certain information to be made public, but he’s not willing to take responsibility for it.

In both law and journalism, such information carries a taint of suspicion, even when it’s perfectly legitimate. The informant could have an ulterior motive, since anonymity allows him to smear another person’s name while remaining nameless himself.

But sometimes journalists and investigators can’t get certain information in any other way, so they promise to protect their source. And it could be that the informant’s reason for anonymity is simply to protect himself—he might face retaliation if identified.

As we said, the anonymous source—even the anonymous reporter—isn’t new. Periodicals of the 17th, 18th, and 19th centuries commonly featured articles written anonymously or under pen names.

In the 17th century, a satirical pamphlet entitled Whimsies caricatured the anonymous journalist as a weekly “newes-monger” whose “owne genius is his intelligencer” (in other words, his source is himself).

“No matter though more experienced judgements disprove him; hee is anonymos, and that wil secure him,” the pamphlet said. (Here we’ve expanded on a 1631 quotation that appears in the Oxford English Dictionary.)

In the early 18th century, the Spectator, founded by Joseph Addison and Richard Steele, was notorious for its anonymous columns and for its “letters” (often fictional) from nameless or pseudonymous writers.

The OED quotes one letter-writer in 1712 as begging to be heard “amongst the crowd of other anonymous correspondents.”

And in 1820 Blackwood’s Magazine spoke of “the merit due to us, for being the first to carry on a periodical work, without that vile anonymous disguise, under which such unwarrantable liberties are frequently taken with You, my public.”

Oxford cites this 1882 quotation from the Times of London: “Academical dignitaries, writing … under a disguise of transparent anonymity.” Notice how the writer of that article recognized the evasive nature of the writings he was quoting.

The OED has no citations for the exact expression “on condition of anonymity,” and we can’t say for sure when it first appeared in print.

In 1925 E. M. Forster wrote in the Atlantic that “all literature tends toward a condition of anonymity.” But he used the expression in a different sense than the one we’re discussing here. Forster meant “condition” merely as a state, not as a requirement.

We find the “requirement” sense of the word in this 1925 quotation from the British Medical Journal:

“The Society had since received from the same generous and anonymous source a further munificent gift of something over £28,000, to be applied on the same terms and under the same condition of anonymity.”

And a 1949 article in the Proceedings of the American Association for Public Opinion Research referred to “the condition of anonymity” as “a condition which had for many years been resorted to on the assumption that servicemen would otherwise not give frank reports on the state of their morale.”

The expression as used by journalists quoting unnamed sources wasn’t common, as far as we can tell, until the latter half of the 20th century.

The first such use in the New York Times appears in a 1964 article: “But, though Kennedy himself kept his silence, some of his intimates, on condition of anonymity, did not.”

We’ve found 19th-century examples of the journalistic usage that come close, without using that exact wording.

This one, for example, is from a letter written by Jean Joseph Louis Blanc in 1863 and published in Letters on England (1876):

“Whence comes it that in such a country as England journalism is anonymous? Whence comes it that, generally speaking, anonymity is considered an indispensable condition of journalism? I confess that I am at a loss to explain it.”

And here’s an 1895 example from the Harvard Graduates’ Magazine:

“The first consideration should be, therefore, to create the condition most favorable for the critic to produce an unbiased opinion, and one of the elements of the condition is often anonymity, because it allows him to work impersonally.”

By now, the phrase “on condition of anonymity” is almost a journalistic cliché. 

Walter Shapiro wrote a humorous column on the subject in the Atlantic in 2005, entitled (naturally) “On Condition of Anonymity.”

And Matt Carlson wrote a book on the uses and abuses of anonymity, On the Condition of Anonymity: Unnamed Sources and the Battle for Journalism (2011).

We can’t leave without giving you a little etymology.

The adjective “anonymous,” according to the OED, was first recorded in 1601 (an earlier form, “anonymal,” died out). It literally means “without name.”

English owes the word “anonymous” to Latin (anonymus) and ultimately Greek (anonymos). But it’s not classical at heart.

Its root is the ancient Indo-European word nomen, the source of the word for “name” in the Germanic languages as well as Latin and Greek.

“Anonymous” is the source of the short-lived noun “anonymousness” (1802) and the more durable “anonymity” (1820).

Check out our books about the English language

From minutia to minutiae

Q: I never hear people say “minutia” and mean “minutia” (i.e., a minor detail). They always use it to mean “minutiae” (minor details). And they pronounce it mi-NOO-shee-uh or mi-NOO-shuh, which is understandable considering how ungainly mi-NOO-shee-ee is. Are we witnessing the conflation of these singular and plural forms?

A: Standard dictionaries define “minutia” as a small or trivial detail, and “minutiae” as small or trivial details. Yet “minutia” is often used to mean “minutiae,” and “minutiae” is often pronounced like “minutia.”

This is nothing new, however. Both words have been used as singulars and plurals since they first showed up in English in the 18th century. And their pronunciations have been all over the place.

Confused? Well, don’t look to Latin for help.

In classical Latin, “minutia” didn’t even mean a small or minor detail, nor did “minutiae” mean small or minor details. Here are the details.

The source of these two words was minutus, which meant small in classical Latin. Minutia and minutiae were singular and plural nouns for smallness—the quality or state of being small.

In the late Latin of the 4th century, minutiae came to mean small or trivial details, but minutia continued to mean smallness.

It wasn’t until “minutia” showed up in English in the 18th century that it took on its small or trivial sense—in both singular and plural versions!

The first of these Latin words to enter English was “minutiae,” according to the Oxford English Dictionary, which defines it initially as a plural meaning “precise details; small or trivial matters or points.”

The earliest example in the OED is from Samuel Richardson’s 1748 epistolary novel Clarissa: “I have always told you the consequence of attending to the minutiæ.”

However, the dictionary also has examples from the late 18th century to the year 2000 of “minutiae” used in the singular to mean “a precise detail; a small or trivial matter or point”—that is, “minutia.”

The first OED citation for “minutiae” used in the singular is from The Beggar Girl and Her Benefactors, a 1797 novel by Anna Maria Bennett (she wrote as “Mrs. Bennett”): “Strict attention to every minutiæ of her domestic arrangement.”

The dictionary’s earliest citation for the singular “minutia” is from Elizabeth Blower’s 1782 novel George Bateman: “On the observance of some little minutias, no small share of the beauty … depended.”

The first Oxford example that refers to just one “minutia” is from Washington Irving’s 1841 biography of the poet Margaret Miller Davidson:

“That holy patriotism which could toil and bleed, ere it would yield one single minutia of that independence bequeathed to them by the valour of their immortal sires.”

The earliest written example of “minutia” used in the plural is from Charles Burney’s Memoirs of the Life and Writings of the Abate Metastasio (1796): “Descending to the minutia of all the events and occasions which may be imagined.”

The OED offers the singular and plural versions of both “minutia” and “minutiae” with no warning labels—in other words, no indication that these usages are anything but standard English.

Only a handful of standard dictionaries in the US and the UK have entries for the singular “minutia,” perhaps because the word is used so rarely to mean a small or trivial detail. When we see “minutia” online, it’s almost always used as a plural.

In fact, The American Heritage Dictionary of the English Language dropped its entry for the singular “minutia” from the latest edition, the fifth.

Pronunciation guides in standard American dictionaries indicate that “minutia” is pronounced mi-NOO-shee-uh and “minutiae” is pronounced mi-NOO-shee-ee, mi-NOO-shee-eye, mi-NOO-shee-uh, or mi-NOO-shuh. The NOO in both words can also be NYOO.

However, our experience is that many, if not most, Americans pronounce “minutiae” as mi-NOO-shuh. And few Americans use “minutia” to mean a small detail.

As for British pronunciations, the OED says “minutia” can be either my-NYOO-shee-uh or mi-NYOO-shee-uh, while “minutiae” can be my-NYOO-shee-eye, mi-NYOO-shee-eye, my-NYOO-shee-ee, or mi-NYOO-shee-ee.

In other words, you can probably defend just about any likely use of “minutia” and “minutiae.”

As for us, we pronounce “minutiae” as mi-NOO-shuh. And we don’t use “minutia.” If we did want to refer to a small or trivial detail, we suppose we’d call it something like a “trifle,” a “triviality,” or perhaps even a “trivial detail.”


Hear Pat on Iowa Public Radio

She’s on Talk of Iowa today from 10 to 11 AM Central time (11 to 12 Eastern) to discuss the English language and take questions from callers.


Three little Wendys

Q: Please tell me the correct way to “pluralize” (is this a word?) a person’s name. For example, “The Marxes (a couple named Marx) will be presenting their ideas to the board.” Is there a rule about when to add an “es” or only an “s”?

A: We wrote a post a few years ago about how names are pluralized (yes, it’s a word), but the subject comes up so often that the commonly accepted rules bear repeating.

Any name or other word ending in a sibilant (a hissing, shushing, or buzzing sound, as in “s,” “ch,” “sh,” “x,” or “z”) is pluralized with the addition of “es.”

Examples: “Joneses” … “Foxes” … “Birches” … “Lopezes” … “Schultzes” … “Gishes” … “Fitches.”

In general, names are pluralized the same way as ordinary nouns—except for irregular nouns like “sheep,” “fish,” “children,” and others.

A family named “Child,” for example, would be pluralized as “the Childs.” And a family named “Childs” would be pluralized as “the Childses.”

However, names ending in “y” are treated differently from ordinary nouns ending in “y.”

When a name ends in “y,” it’s pluralized with “s” instead of “ies.” So a family named “Brady” would be pluralized as “the Bradys.” And three little girls named “Wendy” would be “three little Wendys.”
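As a rough illustration, the rules above can be sketched in a few lines of code. (This is our own toy sketch, not anything from a usage guide; the function name is ours, and matching spellings is only an approximation of the sibilant sounds the rule actually turns on—a name like “Bach,” where the “ch” sounds like “k,” would slip through.)

```python
def pluralize_name(name: str) -> str:
    """Pluralize a personal name, following the usual style-guide rules."""
    # Names ending in a sibilant spelling ("s", "ch", "sh", "x", "z") take "es".
    if name.endswith(("s", "ch", "sh", "x", "z")):
        return name + "es"
    # All other names simply take "s" -- including names ending in "y",
    # which (unlike ordinary nouns) never change to "ies": Brady -> Bradys.
    return name + "s"
```

So “Jones” becomes “Joneses,” “Child” becomes “Childs,” and “Childs” becomes “Childses,” just as described above.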

You’ll find these rules in all major usage guides, including Fowler’s Modern English Usage (rev. 3rd ed.), Garner’s Modern American Usage (3rd ed.), Merriam-Webster’s Dictionary of English Usage, and Pat’s grammar and usage book Woe Is I (3rd ed.).


“Very” similitude

Q: My mother, an old-school English teacher/grammarian, insisted that it was incorrect to say “very amused.”  She said that it should be “very much amused.”

A: Both of these sentences—“He was very amused” and “He was very much amused”—are perfectly good English. (“He was much amused” is fine, too, for that matter.)

Your mother was thinking of an old prohibition against “very” that first reared its head in the late 19th century and was criticized even then.

Although you can still find a few sticklers who haven’t heard the word, that old taboo has long been discredited by language authorities.

Why did it rear its head in the first place? The reasoning goes something like this:

Since “very” doesn’t modify verbs (only adjectives and adverbs), it shouldn’t be used right before a verb’s past participle, like “amused.”

When “very” appears with a past participle, the critics said, an adverb like “much” should be inserted so that “very” modifies “much,” not the participle.

That’s the argument, and because of it some language commentators of the past have objected even to usages like “We were very annoyed” and “I am very concerned.”

But fortunately modern usage authorities have brought common sense to the rescue.

Words like “amused,” “annoyed,” and “concerned” are indeed the past participles of verbs (“amuse,” “annoy,” “concern”).

But in sentences like the ones above they’re also adjectives—the kind of adjectives that are formed from past participles. So there’s nothing wrong with “very amused” and the rest.

We’re more likely to use “very” with a participial adjective having to do with a condition, feeling, mental state, and the like—“bored,” “amused,” “relaxed,” “surprised,” “mistaken,” etc.

And the participle is often used passively—that is, with a form of “be,” as in “was very mistaken,” “am very surprised,” “were very bored,” and so on.

Past participles like those have firmly established themselves in the wide family of English adjectives. It’s become natural to see, hear, and use them with “very.”

Some other past participles, though, haven’t yet reached this point, as R. W. Burchfield points out in Fowler’s Modern English Usage (rev. 3rd ed.).

“There are many passive participles which by their very nature are incapable of being qualified by either much or very,” Burchfield writes, citing “defeated,” “finished,” “forced,” “located,” “undetected,” and “unsolved.”

Merriam-Webster’s Dictionary of English Usage suggests, however, that English speakers could even be “very”-ing words like those one day.

“The movement of past participles into adjective function—based on premodification by very—began in the 17th century,” Merriam-Webster’s says.

This is “a continuing process,” M-W’s editors write, and “participles that sound awkward with very today may sound fine in another generation.”


Running dogs of rhetoric

Q: In reading some of the old Cold War literature, I frequently come across the term “running dog” in reference to the Chinese disdain for America. What are its origins and implications?

A: The English term “running dog” has been around a lot longer than you might suspect—for hundreds of years before Mao Zedong was a gleam in his father’s eye.

When it first showed up in the early 1600s, according to the Oxford English Dictionary, the phrase referred to “an animal, esp. a dog: that is raised or kept for pursuing animals in the course of a hunt.”

The earliest citation in the OED is from A Helpe to Discourse (1619), a collection of miscellaneous writings of questionable authorship: “Miso, because I hunted in his grounds / Let lose his running dogges, and baukt my hounds.”

The dictionary notes that a similar term, “running hound,” showed up a couple of centuries earlier.

In The Master of Game (1425), a book on hunting, Edward, Duke of York, writes about “rennyng houndis hunten in diuers maners” on the moors.

In fact, the OED says the use of the term “running dog” by the Communists is derived from zougou, a Chinese term for a hunting dog (from zou, to run, and gou, dog).

The dictionary says the Chinese used the term figuratively in the “18th cent. or earlier in political contexts” to refer to a “servile follower, lackey.”

In the 20th century, according to Oxford, the Chinese Communists used the term for “a person who is subservient to a foreign power, esp. to one that threatens revolutionary interests.” Later, the dictionary notes, the Communists used the term in a “generalized” sense.

Interestingly, the OED’s earliest written example of the term used in the political sense is from an American newspaper in which the Chinese are referred to as “running dogs” of the Russians.

Here’s the citation, from the June 8, 1925, issue of the Los Angeles Times: “The Communists cry ‘overthrow imperialism,’ but they themselves are the running dogs of red Russian imperialists.”

The next Oxford example is from China: A Nation in Evolution (1928), by Paul Monroe: “The intelligent Chinese … may believe that missionaries in general are but the ‘running dogs’ … of the imperialistic business and political interests.”

The earliest example citing a Chinese source is from Selected Works of Mao Tse-tung, Volume IV (1961):

“Without a revolutionary party … it is impossible to lead the working class and the broad masses of the people in defeating imperialism and its running dogs.”

We’ll end with an example from The Honourable Schoolboy, a 1977 novel by John le Carré:

“Czarist imperialist running dogs drank tasteless coffee with divisive, deviationist, chauvinist Stalinists.”


The power of appositive thinking

Q: When a quote comes right after a verb like “said” or “asked,” we use a comma (e.g., God said, “Thou shalt not kill”). But do we still need a comma if we don’t use a verb (e.g., God’s statement “Thou shalt not kill” is one of the Ten Commandments)?

A: No, you wouldn’t use a comma to introduce the quotation in your second example.  

The Chicago Manual of Style (16th ed.) notes (as you point out) that a quotation “in the form of dialogue or from text is traditionally introduced with a comma.”

Elsewhere, the manual says, “a comma is used after said, replied, asked, and similar verbs.” 

But not every quotation requires an introductory comma.

For instance, the Chicago Manual says no comma is needed before a quote introduced by “that,” “if,” “whether,” or a similar conjunction.

We’ll invent some examples: “He wondered if ‘Thou shalt not kill’ was the fifth or sixth commandment” … “She asked whether ‘Thou shalt not kill’ or ‘You shall not murder’ was the proper wording” … “How can a murderer believe that ‘Thou shalt not kill’ is God’s law?” 

And your sentence—“God’s statement ‘Thou shalt not kill’ is one of the Ten Commandments”—illustrates another kind of quotation that doesn’t need a comma. 

In this case the quotation (“Thou shalt not kill”) is the explanatory equivalent of the subject (“God’s statement”).

An English teacher would call the quotation an appositive—something placed in apposition to a noun or noun phrase. Grammatically, “apposite” means equivalent (not to be confused with “opposite”).

Sometimes these explanatory equivalents are surrounded by commas and sometimes they’re not.

The Chicago Manual explains the situation in a nutshell here (brace yourself for more grammatical terminology):

“A word, abbreviation, phrase, or clause that is in apposition to a noun (i.e., provides an explanatory equivalent) is normally set off by commas if it is nonrestrictive—that is, if it can be omitted without obscuring the identity of the noun to which it refers. …

“If, however, the word or phrase is restrictive—that is, provides essential information about the noun (or nouns) to which it refers—no commas should appear.”

In plain English, put commas around an explanatory equivalent that’s dispensable—one that could be dropped without losing the point of the sentence.  But don’t put commas around one that’s essential to the point.

In your sentence (“God’s statement ‘Thou shalt not kill’ is one of the Ten Commandments”), the appositive (“Thou shalt not kill”) is essential, so you don’t need commas. If you dropped the appositive, the point of the sentence would be lost.

In a section on “maxims, proverbs, mottoes and other familiar expressions,” the Chicago Manual gives examples of two appositive sayings, one with commas and one without:

Commas used: “Tom’s favorite proverb, ‘A rolling stone gathers no moss,’ proved wrong.”

Commas omitted: “The motto ‘All for one and one for all’ appears over the door.” 

We’ve written before, including posts in 2011 and 2009, about why some explanatory equivalents are surrounded by commas and some aren’t. Here are examples of both:

“My husband, John, will be joining us for dinner.” (Commas used; you have only one husband, so the name isn’t essential information.)

“My friend Susan will be joining us for dinner.” (Commas omitted; she’s not your only friend, so her name is essential to identify which friend.)


Drown, drowned, and drownded

Q: You’ve written an article about the clipped infinitive and past tenses for the verb “text.” What about the converse—the elongated past tense and past participle “drownded”?

A: We’ll get to “drownded” in a moment, but here’s an aside that illustrates another example of unnecessary elongation.

Pat recently opened a new package of socks that the manufacturer claimed were “pre-shrunked.” Ouch! Now, on to your question.

The familiar “drownded,” which many of us perpetrated in childhood, is another example of the tendency to add an extra “-ed” ending to a verb that doesn’t require one.

The past tense and past participle of the verb “drown” (which has been part of English since around 1300) is simply “drowned.”

So we say, “The victim drowned” and “The victim has drowned.” 

But little children, as well as adults who don’t know any better, sometimes use “drownded” as the past tense, past participle, and participial adjective. (A Google search produced 129,000 hits.)

This is a nonstandard usage, though the word “drownded” has shown up at times in the past, according to the Oxford English Dictionary.

The OED lists it as a past form of “drownd,” a variant of “drown” that came along in the 1500s and was “widely prevalent in dialectal and vulgar use.”

This variant verb “drownd” is described by the OED as “parallel in development to astound, bound, compound, sound, etc.”

The OED has several citations for this nonstandard verb. Here are two that use it in the present:

“Thy curate (that otherwise wold mumble in the mouth & drounde his wordes).” From Robert Crowley’s The Way to Wealth (1550).  

“He had a beautiful voice. He could drownd out the whole choir.” From Harper’s Magazine (1884). 

And here are several examples using the past tense “drownded”:

“God … drownded Pharaoh and his host in the read sea.” From William Prynne’s Vindic. Psalme (1644).

“In my own Thames may I be drownded.” From a dialogue of Jonathan Swift (1727).

“They dy’d … in Seas of sorrow Drownded.” From The Roxburghe Ballads (circa 1679).

“ ‘Just fill that mug up with lukewarm water, William, will you?’… ‘Why, the milk will be drownded.’ ” From Charles Dickens’s Nicholas Nickleby (1839).  

Still, as we’ve said, these examples of “drownded” are past tenses of a nonstandard verb “drownd.” In other words, they’ve been grounded.

In modern usage, the standard verb is “drown” and the past tense or past participle is “drowned”—no extra “-ed” is needed.

Similarly, the participial adjective is now “drowned” and it’s been spelled that way since around 1500.


Harebrained or hairbrained?

Q: Here’s a term I’ve seen used but I’m unsure of the origin or its precise intention. Is it “harebrained,” like a hare? Or “hairbrained,” like a brain stuffed with hair? If the former, how is a hare involved?

A: The short answer is “harebrained,” but the short answer doesn’t do justice to your question. Here’s the story.

Both the “hare” and “hair” versions showed up in the 1500s, though both of those usages referred to the animal of the genus Lepus rather than the stuff that grows from follicles.

It turns out that “hair” and “haire” were variant spellings of “hare” in the 1500s, especially in Scottish English.

The earliest example of the usage in the Oxford English Dictionary is from The Union of the Two Noble and Illustre Families of Lancastre and Yorke (1548), a book by the English historian Edward Hall:

“My desire is that none of you be so unadvised or harebrained as to be the occasion that I in my defence shall colour and make red your tawny ground with the deaths of yourselves and the effusion of Christian blood.” (We’ve expanded the OED citation.)

The next example of the usage is from the English writer George Pettie’s 1581 translation of La civil conversazione, a work written in Italian by Stefano Guazzo: “If his sonne be haughtie, or haire brained, he termeth him courageous.”

The OED defines the term “hare-brained” (which it hyphenates) as “having or showing no more ‘brains’ or sense than a hare; heedless, reckless; rash, wild, mad. Of persons, their actions, etc.”

The “hair” version of the usage later inspired two alternative definitions: “having hair-sized brains” and “having brains stuffed with hair.” However, those are considered products of folk etymologies.

The American Heritage Dictionary of the English Language (5th ed.) notes that the “hair” spelling of “hare” was preserved in Scotland into the 1700s, making it impossible to tell exactly when “hairbrained” came to be associated with hair rather than hares.

Standard dictionaries now define “harebrained” as foolish, silly, or impractical. A few list “hairbrained” as a variant spelling, but “harebrained” is far more popular, with roughly twice as many hits on Google.

American Heritage, one of the dictionaries that include the “hair” version as a variant, says in a usage note: “While hairbrained continues to be used, the standard spelling of the word is harebrained.”

Fowler’s Modern English Usage (rev. 3rd ed.), edited by R. W. Burchfield, describes the “hair” version as “an erroneous variant,” but Merriam-Webster’s Dictionary of English Usage accepts it as an “established” though secondary usage.

What do we think? We’ll stick with “harebrained.” Our brains are a bit woolly at times, but not quite hirsute.


Fain vs. feign

Q: Sandra Boynton has a cartoon mug collection. One of my favorites depicts a snail declaiming its love: “Oh, inch by inches / Doth my love grow; / I feign would call thee / ‘My Escargot.’ ” My question: ought this not be “fain” instead of “feign”?

A: Yes, it should read “fain,” not “feign.”

The word “fain” here is an archaic adverb that means gladly or happily. The word “feign,” on the other hand, is a verb meaning to present falsely, to fabricate, or to pretend.

We like the poem, though, and would fain see Sandra Boynton fix the wording.

How, you may be wondering, did two words that look so different come to sound the same?

Interestingly, the word “fain” had a “g” sound in Old English, where it was spelled fægen or fægn, according to the Oxford English Dictionary. But the “g” was dropped during the Middle English period (from the late 12th to late 15th centuries).

In this OED example from the epic poem Beowulf, believed to date from the early 700s, Beowulf and his comrades happily carried Grendel’s head after killing the ogre:

Ferdon forth thonon fethelastum ferhthum fægne. Modern English: “They headed away along the footpaths happy at heart.” (We changed the Old English letters eth and thorn to “th.”)

As for “feign,” the verb didn’t have a “g” when it entered English in the late 1200s. Here’s a “g”-less example from a 1297 history of England by Robert of Gloucester: “Somme feynede a delay.”

So how did the “g” get there? The Chambers Dictionary of Etymology explains: “The introduction of g into the spelling of Middle English feinen was an imitation of the original French.”

The English verb was derived from feign-, an Old French stem of a verb meaning to pretend or shirk. The French verb was derived in turn from the Latin fingere (to make or shape).

John Ayto’s Dictionary of Word Origins points out that the “semantic progression from ‘make, shape’ to ‘reform or change fraudulently,’ and hence ‘pretend,’ had already begun in classical Latin times.”

The Latin verb fingere, Ayto says, has given English many other “pretend” words, including “effigy,” “fiction,” “figment,” and “feint.”

The Latin word also gave us the adjective, noun, and verb “faint.” When “faint” entered English sometime before 1300, it carried over the French meanings of pretended, simulated, lazy, shirking, and cowardly.

So in the 1300s someone described as “faint” was lazy or cowardly, pretending to pass out in order to shirk responsibility.

This sense is now considered obsolete, Ayto notes, except in expressions like “faint hearted” and “faint of heart.”


Counterintuitive: true or false?

Q: It bugs me to hear someone say something “seems” counterintuitive. “Counterintuitive” is one of those words suddenly everywhere (which might account for my annoyance). If inherent in the word’s meaning is the notion that something seems improbable, is it not then redundant to qualify it with words like “seems,” “sounds,” and “appears”? Would you like to comment? Or should I just ask Noam?

A: Is it redundant, you ask, to qualify an adjective that describes a qualified condition? Perhaps, though as we’ve written many times on our blog, not all apparent redundancies are redundant. In fact, our post yesterday says it’s not necessarily redundant to call a corpse a “dead body.”

In the case of “counterintuitive,” however, we’re not sure the adjective in question describes a qualified state.

Most of the standard dictionaries we’ve checked—two American and two British—define “counterintuitive” as contrary to what intuition or common sense would lead one to expect. Only one source qualifies it as seemingly contrary to common sense.

A more interesting question might be this: Why is “counterintuitive” sometimes qualified—that is, weakened with words like “seems,” “sounds,” and “appears”—and sometimes not?

A bit of googling suggests that the adjective is usually qualified when it describes a situation that seems contrary to fact but really isn’t. It generally isn’t qualified when it describes something that actually is—or probably is—contrary to fact.

The American Heritage Dictionary of the English Language (5th ed.), quoting from Natalie Angier, a New York Times science writer, provides this qualified example of the word used to describe a factual situation:

“Scientists made clear what may at first seem counterintuitive, that the capacity to be pleasant toward a fellow creature is … hard work.”

And Merriam-Webster’s Collegiate Dictionary (11th ed.) provides another qualified example of the word used to describe something factual: “It may seem counterintuitive, but we do burn calories when we are sleeping.”

The online Collins English Dictionary, however, includes this unqualified example from the Globe and Mail in Toronto of “counterintuitive” used to describe a factually doubtful situation:

“Brother Jeff’s theories are counterintuitive at best, and have regularly baffled lawyers and judges.”

The Oxford English Dictionary, which includes both unqualified and qualified definitions of “counterintuitive,” has examples of both senses.

Here’s a 1974 example from the American Philosophical Quarterly in which “counterintuitive” describes a dubious situation: “The formulas offered by Day lead to results so counter-intuitive that they had best be called simply false.”

And here’s an example from the April 1979 issue of Scientific American that describes a factual situation: “At first the effect of the dimples seems counterintuitive because the dimpling surely also increases the skin-friction drag.”

The OED’s earliest example of the usage is from The Logical Structure of Linguistic Theory (1955) by Noam Chomsky:

“If we construct linguistic theory in such a way that the grammar can present a phrase structure for every sentence directly … then this counter-intuitive analysis of (25) as analogous to (26) will follow.”

We’ll leave it to you and the other readers of our blog to decide which way Chomsky is using “counterintuitive” here.

Check out our books about the English language

The Grammarphobia Blog

Body English

Q: Why do people say “dead body” instead of just “body”? In a news story about a murder, one would assume that the body found in the woods or in the water was dead.

A: Yes, one would assume that a body found floating in the water was dead. And yes, in many cases it’s unnecessary or redundant to add the adjective “dead” to the noun “body.”

But we might want to add “dead” as an intensifier to emphasize the deadness of the body. And of course “body” doesn’t always refer to a corpse. In fact, the word was around for hundreds of years before it came to mean a dead body.

When the word first showed up in early Old English (spelled bodæi or bodeg), it referred to the “complete physical form of a person or animal,” according to the Oxford English Dictionary.

The OED’s earliest citation is from a translation of Historia Ecclesiastica Gentis Anglorum, a church history written in the 8th century by the English monk Bede.

That original sense of “body” is still one of the major meanings of the word. It wasn’t until the 13th century, according to Oxford, that “body” took on the meaning of a corpse.

In fact, the OED says the “corpse” sense of the word “perhaps originally” was “a euphemistic shortening of ‘dead body.’ ” And the dictionary has 115 written examples of the phrase “dead body” used over the last five centuries.

John Ayto’s Dictionary of Word Origins says the early etymological history of the word “body” is surprisingly sketchy.

“For a word so central to people’s perception of themselves, body is remarkably isolated linguistically,” Ayto writes.

With the exception of a connection with an Old High German term, “it is without relatives in any other Indo-European language.”

“Attempts have been made, not altogether convincingly, to link it with words for ‘container’ or ‘barrel,’ ” he adds.

All this talk about bodies reminds us of these lines from Robert Burns’s poem “Comin Thro’ the Rye”:

Gin a body meet a body
Comin thro’ the rye,
Gin a body kiss a body,
Need a body cry?

(“Gin” means “if” in Scots and dialectal English.)

And of course there’s J. D. Salinger’s version in Catcher in the Rye, where Phoebe corrects Holden for thinking it’s “catch a body.”

“It’s ‘If a body meet a body coming through the rye’!” old Phoebe said.

Check out our books about the English language

The Grammarphobia Blog

What’s a female cuckold?

Q: Pat said on WNYC the other day that she didn’t know of a name for a female cuckold. Ooh, I know one! A Salon article refers to the “unsurprisingly underused Canterbury Tales-tastic term cuckquean.”

A: Ooh, you’re right! There is a term for a female cuckold, “cuckquean,” though it showed up a couple of hundred years after Chaucer wrote the Canterbury Tales.

The Oxford English Dictionary describes it as an obsolete noun, which explains why we could find “cuckquean” in only one standard dictionary, Webster’s Third Unabridged.

The OED says “cuckquean” is derived from the stem of “cuckold” and the noun “quean,” which meant simply a woman in Old English but later was a term of disparagement, like “hussy” or “prostitute.”

(Though “quean” and “queen” sound alike and have similar prehistoric roots, they’re separate words in English.)

Oxford’s earliest example of the usage (spelled “cookqueane”) is from a 1562 collection of proverbs and epigrams by the English writer John Heywood:

“And where reason and customs (they say) asoords, / Alwaie to let the loosers haue their woords, / Ye make hir a cookqueane, and consume hir good.”

The dictionary’s only modern example of the usage is from James Joyce’s Ulysses (1922):

“A wandering crone, lowly form of an immortal serving her conqueror and her gay betrayer, their common cuckquean, a messenger from the secret morning.”

(We’ve expanded on both of those OED citations to provide more context.)

If you’d like to read more, we discussed “cuckold” a few years ago in a post about whether the term “horny” is related to “horns of the cuckold.”

Check out our books about the English language

The Grammarphobia Blog

Einstein? It’s all relative

Q: I used to work in management training, where this saying was cited in arguing for innovation: “Insanity: doing the same thing over and over again and expecting different results.” BrainyQuote attributes it to Einstein, but gives no evidence. Is this one of those “quotes” that float around until someone decides to give a brainy person credit for it?

A: The words are correct—more or less—but the attribution is wrong.

The Yale Book of Quotations says the American novelist Rita Mae Brown, not Albert Einstein, is the source of the earliest known appearance of the quotation in print.

However, a similar quotation appeared around the same time in a book published by Narcotics Anonymous and two years earlier in an unpublished draft of the NA book.

There are also tantalizing suggestions that the quote may have been floating around in the addiction-recovery movement even earlier than that.

The quotation can be found in chapter four of Brown’s novel Sudden Death (1983). We’ll quote a couple of relevant paragraphs to provide some context:

“The trouble with Susan was that she made the same mistakes repeatedly. She’d fall in love with a woman and consume her. Susan thought that her mere presence was enough. What more was there to give? When she tired, usually after a year or so, she’d find another woman.

“Unfortunately, Susan didn’t remember what Jane Fulton once said. ‘Insanity is doing the same thing over and over again, but expecting different results.’ ” 

(The “Jane Fulton” referred to is another character in the novel.)

A similar quote—“Insanity is repeating the same mistakes and expecting different results”—appeared on page 11 of an unpublished 1981 draft of a book on recovery prepared by Narcotics Anonymous.

But that was a working draft. The approved version wasn’t published until 1983, when it appeared in the book Narcotics Anonymous. That was the same year Brown’s novel appeared.

Another version of the quotation appeared in a pamphlet, Step 2: Coming to Believe (Rev. ed.), published in 1992 by the Hazelden Foundation, an addiction-treatment organization.

In the pamphlet, a recovering addict is quoted as saying, “When I came into the program, I heard that insanity is doing the same thing over and over and expecting different results.”

We’ve seen suggestions that a 1980 version of the Step 2 pamphlet might have contained that quotation. But we’ve read the 1980 pamphlet, which is very different, and the quote isn’t there.

We wouldn’t be surprised, though, if an earlier source shows up, perhaps in the addiction-treatment movement, as more published works become digitized.

However, it’s not likely to be Einstein, whose writings are well known. Nor Mark Twain or Benjamin Franklin, as some Internet sites have claimed.

We’ve written before on our blog about “quote magnets,” famous people who get credited for every catchy quote that comes down the pike.

Perhaps the most popular quote magnets of all time are Twain and Winston Churchill. Runners-up include Franklin, George Bernard Shaw, Oscar Wilde, Abraham Lincoln, and Dorothy Parker.

They all said and wrote many quotable things—but not all the quotable things they’re credited with.

Check out our books about the English language

The Grammarphobia Blog

If not, why not?

Q: My question is about a sentence like “Jones is smart, if not brilliant.” Does this mean “Jones is smart, but he isn’t brilliant”? Or does it mean “Jones is smart and maybe even brilliant”? It seems to me I’ve heard this “if not” construction used both ways.

A: We’re not surprised that you’re confused by this use of “if not.” It can be downright confusing, especially in writing when you aren’t able to use intonation and emphasis to get your meaning across.

The usage authority Bryan A. Garner says “if not” can mean either “but not” or “maybe even.”

Writing in Garner’s Modern American Usage (3rd ed.), he says it’s “often an ambiguous phrase to be avoided.”

Garner gives several examples of the expression used ambiguously in each sense. In all the examples, he says, it’s possible for a reader to arrive at the unintended meaning.

Here’s a “maybe even” example from the Dec. 8, 1996, issue of the Dallas Morning News: “The greater Phoenix area is one of the fastest—if not the fastest—growth areas for call centers nationwide.”

And here’s a “but not” example from the April 12, 1996, issue of the Los Angeles Times: “She gave proficient, if not profound, readings.”

Theodore M. Bernstein, another usage authority, says “if not” is “usually perfectly clear in spoken language,” though it becomes “a tantalizing ambiguity” in writing.

In The Careful Writer, Bernstein gives this example of an ambiguous “if not” sentence: “The proposed taxes would be levied primarily, if not exclusively, on New York and Pennsylvania residents.”

He says a speaker would use his voice to emphasize or deemphasize the word “exclusively,” leaving no doubt about his meaning. But a writer can’t “indicate a rise or fall in tonal register.”

His recommendation: “The solution to the present problem should have become evident in its very discussion: if you mean perhaps, say so; if you mean but not, say so.”

We think that makes sense. As we’ve said many times—if not many, many times—the whole point of writing is communicating. And nothing should interfere with that.

One last point. Garner thinks the “perhaps” sense of “if not” is more common than the “but not” sense. Perhaps, but we’re not sure about that.

In fact, there’s only one citation in the Oxford English Dictionary for “if not” in the senses we’ve been talking about, and it’s a “but not” example.

The English author-priest Mark Pattison used the phrase in an essay published posthumously in 1845 in the Anglican periodical Christian Remembrancer: “The style of Bede, if not elegant Latin, is yet correct, sufficiently classical.”

Standard dictionaries generally don’t have entries for “if not.” Merriam-Webster’s Collegiate Dictionary (11th ed.) doesn’t define “if not,” but it gives this “maybe even” example of the usage: “difficult if not impossible.”  

Check out our books about the English language

The Grammarphobia Blog

Memento memorious

Q: Is “memorious” a word? I’ve heard it used a couple times on podcasts by educated speakers, but I can’t find it in my dictionary. Please give me any thoughts you have on it.

A: Yes, “memorious” is a word, but you won’t find it in a standard dictionary—or at least not in any of the 10 standard dictionaries we checked.

We did find an entry for it in the online collaborative reference Wiktionary, where it’s defined as “having an unusually good memory” or “easy to remember.”

More important, the Oxford English Dictionary has written examples for the use of the adjective “memorious” dating back to the early 1500s.

The OED defines it as “having a good memory” or “memorable; evocative of or rich in memories.” The dictionary describes a third meaning, “mindful of,” as obsolete.

Oxford’s earliest citation for the good-memory sense of the word is from The Lyfe of Saynt Radegunde, written by the English poet and Benedictine monk Henry Bradshaw sometime before his death in 1513:

“Of speciall frendes / honest and vertuous / Whiche lately requyred me full memorious / With synguler request / and humble instaunce / This lyfe to discrybe / with due circumstaunce.”

(We’ve gone to the original to expand on the OED citation.)

The memorable sense of the word first showed up in the late 19th century. The dictionary’s earliest citation is from The Human Inheritance, The New Hope, Motherhood and Other Poems (1882), by the Scottish writer William Sharp:

“So may have blown some wind of thought Memorious from a past forgot.”

Although we couldn’t find “memorious” in any standard dictionary, it’s alive and well online. A Google search got 48,000 hits, including references to Funes el Memorioso, a 1942 short story by the Argentine writer Jorge Luis Borges.

In the story, which is usually translated into English as “Funes the Memorious,” a teenage boy named Ireneo Funes develops an astonishing memory after a fall from a horse.

Check out our books about the English language

The Grammarphobia Blog

The right percent

Q: I’m a journalism student at Mizzou and recently disagreed with an editor about the word “percentage.” I thought it was interchangeable with “percent,” but she wasn’t so sure. We checked the AP stylebook, but it didn’t illuminate anything. What’s the verdict?

A: These words aren’t necessarily interchangeable. A “percent” is a hundredth part of something, but a “percentage” can mean any part of a whole. 

This is why “percent” is generally used with a number: “50 percent of the flour was ruined.”

And this is why “percentage” is not used with a number, just an ordinary adjective: “a large percentage of the flour was ruined.”

Still, “percent” is sometimes used in place of “percentage,” as in “What percent of the flour was ruined?”

This usage has been discouraged by some language authorities, but it’s recognized in most standard dictionaries and seems idiomatic to us.

The Columbia Guide to Standard American English, by Kenneth G. Wilson, has this to say about the subject:

Percentage is the more widely accepted noun, especially in Edited English, but Informal use of percent (What percent of your time do you spend watching TV?) seems thoroughly established.”

So if that’s what you and the editor disagreed about, you can both relax. If you’re writing formal English, however, you might want to stick with “percentage.”

Now comes the sticky part.

“Percentage” is a noun. (The noun can also be used attributively as a modifier, as in “percentage point.”)

And “percent” is a noun when it means “percentage.” But there’s some disagreement about how to classify “percent” in other cases.

The Oxford English Dictionary, for example, classifies “per cent” (it’s two words in British English) as an adverb in almost all the other cases.

The dictionary describes “percent” as an adverb when it appears with a number to form a noun phrase that expresses a proportion in hundredths (for example, “10 percent of the students”).

That definition covers a lot of territory. Too much, in our opinion and in the opinion of some standard dictionaries.

Those dictionaries include The American Heritage Dictionary of the English Language (5th ed.), Webster’s Third New International Dictionary, Unabridged, and the online Macmillan Dictionary in both its US and UK editions. 

All standard dictionaries, including those three, would agree with the OED that “percent” is an adverb when it modifies a verb or an adjective.

In this adverbial Oxford citation, for example, it modifies a verb: “The Funds rose 1 per cent. on the news” (1804).

However, American Heritage, Webster’s Third, and Macmillan would disagree with the OED that “percent” is an adverb in these Oxford citations:

“The Blank Tickets bear seven per Cent. Interest” (1710); “At the rate of ten per cent. therefore …” (1776); “Ninety per cent of the cooks do their full share” (1904); “cut my social life by about 35 per cent” (1973).

The three standard dictionaries would consider “percent” an adjective or a noun in those citations. We’ll quote some of their own examples of “percent” used as an adjective, a noun, and an adverb.

Adjective: “a 0.75 percent increase in interest rates” … “harvested 50 percent more wheat” … “another 100 percent result” … “a 3½ percent government bond” … “a 2.25 percent checking account.”

Noun: “provided 40 percent of Europe’s requirements” … “42 percent of the alumni contributed” … “owns 20 percent of the business” … “represent 50 percent of the workforce.”

Adverb: “agreed with her suggestions a hundred percent” … “sales increased 30 percent” … “if he is even one percent responsible for the accident.”  

Why does the OED call “percent” an adverb in cases where some standard dictionaries do not? This probably has a lot to do with the fact that “percent” started out as an adverbial phrase. 

The OED says “per cent” (it uses the British form) was modeled on the Italian phrase per cento, which can be translated as “for (every) hundred.”

The dictionary says the phrase appeared in Italian in 1263 or earlier. (In the following century, incidentally, the Italians invented the % sign.)

“Per cent” was first recorded in English in 1568, but a slightly earlier form showed up in 1565—“per centum,” abbreviated as “per cent.” with a period.

As the OED explains, “per centum” was “the usual form in Acts of Parliament and most legal documents.”

This coinage too was modeled after the Italian per cento, though it was fashioned out of Latin elements (per plus centum). In fact, per centum did not exist in Latin.

The facts remain that in Britain the word is still mostly written as a phrase—“per cent”—and is still regarded as adverbial in some standard dictionaries.

The Cambridge Dictionaries Online, for instance, says it’s an adverb in the examples “You got 20 percent of the answers right” and “Only 40 percent of people bothered to vote.”

American dictionaries would generally regard “percent” as a noun in those examples, though perceptions about the linguistic function of “percent” aren’t unanimous even in the United States.

The Chicago Manual of Style (16th ed.), for example, sort of agrees with Cambridge and sort of doesn’t.

“Despite changing usage,” the manual says, “Chicago continues to regard percent as an adverb (‘per, or out of, each hundred,’ as in 10 percent of the class)—or, less commonly, an adjective (a 10 percent raise).”

And by the way, the manual, which is widely used in the publishing industry (that means in formal, edited English), also recommends “percentage as the noun form (a significant percentage of her income).”

While we’re on the subject, many people use “percent” and “percentage point” incorrectly—the terms are not interchangeable.

For instance, if a mortgage rate falls to 6 percent from 8 percent, that’s a decline of 2 percentage points, or 25 percent. 
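The arithmetic above can be sketched in a few lines of Python. (The function names are ours, purely for illustration; the mortgage-rate figures are the ones in the example.)

```python
# The difference between a percentage-point change and a percent change.

def point_change(old_rate, new_rate):
    """Change expressed in percentage points: simple subtraction."""
    return new_rate - old_rate

def percent_change(old_rate, new_rate):
    """Change expressed as a percent of the starting rate."""
    return (new_rate - old_rate) / old_rate * 100

# A mortgage rate falling to 6 percent from 8 percent:
print(point_change(8, 6))    # -2 (percentage points)
print(percent_change(8, 6))  # -25.0 (percent)
```

The same 2-point drop would be a 33 percent decline if the starting rate were 6 percent, which is why the two measures can’t be swapped.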

So beware. There’s no percentage in getting things wrong.

By the way, an old friend of ours from the New York Times graduated from the Missouri School of Journalism. Good luck with your career!

Check out our books about the English language

The Grammarphobia Blog

An opinion on opinionated

Q: Are we seeing a shift in the meaning of “opinionated”? Merriam-Webster’s defines it as “unduly adhering to one’s own opinion or to preconceived notions,” but lately the meaning seems to have expanded to include the less negative denotation of having many firm opinions. I’m curious about what you think.

A: The adjective “opinionated” has indeed gained a new, less negative sense—at least in American English—though this meaning isn’t recognized by most standard dictionaries.

It first showed up in the US in the 1960s, according to citations in the Oxford English Dictionary, and it represents somewhat of a return to the original neutral sense of the word.

We’ve found only one standard dictionary that includes both senses of the word, Merriam-Webster.com, the online version of the dictionary you mentioned.

Merriam-Webster online includes the definition you cited (“unduly adhering to one’s own opinion or to preconceived notions”) and a relatively neutral one (“expressing strong beliefs or judgments about something: having or showing strong opinions”).

The first definition is from M-W’s Collegiate Dictionary (11th ed.) and the second is from M-W’s Learner’s Dictionary, which is intended for students of English as a second language.

Peter Sokolowski, editor at large at the Merriam-Webster company, explained in an email to us that the Learner’s reference is M-W’s most recent dictionary and “its definitions sometimes do reflect new subtleties and changes.”

When the adjective “opinionated” entered English in the late 1500s, it meant simply “having a (specified) opinion” or “of the opinion (that),” but the OED describes this sense of the word as obsolete.

The dictionary’s first citation is from Robert Dallington’s 1592 translation of Hypnerotomachia Poliphili, a romance by Francesco Colonna written in Latinate Italian:

“Being perswaded and firmly opinionated, that this sight was a traunce in loue.”

The OED says the negative sense of the word (“thinking too highly of, or holding obstinately to, one’s own opinion; conceited; dogmatic”) showed up in the early 1600s.

The first citation is from Joseph Taylor’s Life and Death of the Virgin Mary (1630):

“Though hee be but a Botcher, or a Button-maker, and at the most a lumpe of opinionated ignorance, yet he will seeme to wring the Scriptures to his opinions.”

In the mid-20th century, according to Oxford, a more or less neutral sense of the word showed up in the US: “Holding firm views or opinions.”

However, the dictionary notes that it’s often difficult to tell whether writers are using the term to describe people with firm opinions or with obstinate ones.

The OED’s first example of this new usage is from Woman of Valor, Irving Fineman’s 1961 biography of the American Zionist Henrietta Szold:

“How to handle a young man as high-spirited and opinionated as herself.”

The dictionary’s most recent example of the usage is from the Nov. 8, 2002, issue of the Journal and Courier in Lafayette, IN:

“I mean, being opinionated and pushing the envelope is all fine and good, as long as you show a little self-respect.”

A bit of googling suggests to us that many people who appear to be using the new sense are actually using the old one in a humorously hyperbolic way, especially to refer to themselves.

The word pops up a lot in the titles of blogs or posts. Here’s a sample: “Opinionated About Dining” … “Opinionated Democrat” … “Opinionated Catholic” … “Opinionated Geek” … “Opinionated Palate” … “Ms. Opinionated” … “Little Miss Opinionated.”

We think the word “opinionated” in those titles, which refer to the authors’ own blogs or posts, is being used in the traditional, negative sense, though with tongue in cheek.

Perhaps the prevalence of this ironic or facetious usage accounts for the OED’s difficulty in telling whether the old or new sense of the word is being used.

Does the new sense of “opinionated” have legs? We’re not sure, and lexicographers apparently aren’t either.

We checked nine standard dictionaries in the US and the UK, and only Merriam-Webster.com (as we’ve said) has both the old and new senses.

Check out our books about the English language

The Grammarphobia Blog

“Good-paying” or “well-paying”?

Q: Wish you would address “good-paying job” versus “well-paying job.”

A: We’re taught that “good” is an adjective, not an adverb, so it shouldn’t be used to modify a verb or another adjective.

That, in a nutshell, is why many people regard “good-paying job” as inferior to “well-paying job.”

They think a verb form like the participle “paying” should be modified by “well,” an adverb, not “good,” an adjective.

However, language authorities say “good” has been used as an adverb or a quasi-adverb since the Middle Ages, and this adverbial usage wasn’t criticized until the latter half of the 19th century.

In fact, the phrase “good-paying job” doesn’t strike us as bad English—informal, perhaps, but not incorrect.

It seems more natural and idiomatic than the stiffer, consciously correct “well-paying job,” especially in speech and casual writing.

Google searches of the phrases “good-paying job” and “well-paying job” turn up almost identical numbers—5.3 million hits apiece. So “good-paying job” has become an acceptable idiom to a good number of people.

Is one phrase really technically better than the other?

Well, 5.3 million hits for “well-paying job” would indicate that many people think so.

But that other 5.3 million for “good-paying job” would indicate a real split in usage, one that shouldn’t be brushed off on “technical” grounds.

We believe that in “good-paying job,” the word “good” is being used idiomatically as an adverb to modify the present participle “paying.”

Merriam-Webster’s Dictionary of English Usage agrees, noting that “good” has been used as an adverb since the 13th century, and that this adverbial use wasn’t criticized until the 19th.

The Oxford English Dictionary doesn’t go that far, but it describes a category of usages in which the adjective “good” is used “in quasi-adverbial combination” with present participles.

Such a combination, the OED says, is “used adjectivally”—that is, the combined phrase modifies a noun (like “job”).

While the OED says that “in none of these instances is good adverbial in origin,” it nevertheless sometimes functions as an adverb.

There are other common usages in which “good” plays an adverbial role, though in most of them the OED stops short of classifying the word as an adverb:

● The adverbial phrase “as good as.” This modifies a verb or an adjective and means the same as an adverb like “practically.” Examples: “He as good as confessed” … “The victim was as good as dead.”

● Phrases like “a good long time,” “a good sharp knife,” “a good many people,” and so on. Here, “good” acts as an adverb like “very” or “properly.” It modifies the adjective and serves as an intensifier.

● The phrase “good and.” This colloquial expression is an adverbial phrase that intensifies, as in “his hair was good and red” … “until we’re good and ready.”

● The phrases “for good” (as in “he left for good”) and “but good” (“you socked him but good”). These adverbial idioms can be compared to “finally” and “well.”

Merriam-Webster’s, which calls “good” an adverb when it functions as one, says English speakers of all educational levels use “good” adverbially. 

“Our evidence shows that adverbial good is common in the speech of the less educated, but is also known and used by the better educated,” M-W’s editors write.

Ironically, they add, “The schoolmasterly insistence on well for the adverb may have contributed to the thriving condition of adverbial good.”

We can imagine a couple of reasons why “good-paying job,” in particular, seems natural to many.

(1) “Good” has other forms—“better” and “best.” There’s nothing at all wrong with “better-paying job” or “best-paying job.” So why not “good-paying job”?

A critic might argue that “better” and “best” are adverbs—forms of “well”—when used with participles like “paying.” (In fact, the OED does categorize them as adverbs when so used.)

But isn’t this a circular argument?

If one calls “good” a misused adjective in “good-paying job,” then why aren’t “better” and “best” adjectives when used in the same expression?

We could just as easily call all three—“good,” “better,” “best”—adjectives being used informally in an adverbial way to modify the participle “paying.”

(2) We use the adjective “good” in phrases like “good-looking boy” and “good-tasting pie,” so why not in “good-paying job”?

A critic might answer that “look” and “taste” are grammatically different from most verbs. They’re known as linking verbs, which are modified by adjectives like “good” instead of adverbs like “well.” (We’ve written about linking verbs before, including posts in 2012 and 2010.)

Here we would respond that when “good” modifies the participle of a linking verb (as in “good-looking,” “good-tasting”), it’s clearly an adjective—not an adverb.

But when it modifies the participle of a non-linking verb (“good-paying”), then it’s being used informally as an adverb.

We admit that “good-paying” is informal English. So do the editors at Merriam-Webster’s, who suggest that “adverbial good is still primarily a speech form.”

“Our evidence,” they write, “is mostly from reported or fictional speech, letters, and similar breezy and familiar contexts. It is not likely to be needed in a book review or a doctoral dissertation.”

We’ll end with an example of the usage from a 1936 letter by Archibald MacLeish, the poet and Librarian of Congress: “It pays good and keeps the boys in school.”

Check out our books about the English language

The Grammarphobia Blog

“I” strain

Q: I hear more and more people substitute “I’s” for “my” or “mine.” For example, “My friends had a wonderful time at Jason and I’s party.” Ouch! That hurts my ears! Is this something that will fade away, or will it eventually become accepted?

A: You’re right. A lot of people are using “I’s” as a possessive of “I,” though mostly in place of “my,” not “mine.”

We got millions of hits when we googled “John and I’s,” “Bob and I’s,” and so on. Here are a few examples:

● “You really captured the spirit of John and I’s relationship and I absolutely cannot wait to see more of the shots!”

● “Thank you for supporting Jason and I’s celebration.”

 ● “Bob and I’s story is short, pathetic and kind of sweet!”

 ● “Tomorrow is Sam and I’s 2nd wedding anniversary.”

This usage seems to be relatively new. The earliest examples we could find were from 2004. Here’s an early one from the film producer Harvey Weinstein about the Weinstein brothers’ rocky relationship with Disney:

“They get to see my books all the time, so there’s no hiding what we do. On the flip side, Bob and I’s pay is determined by accounting from Disney. I don’t get to see everything unless I ask for an audit.”

Getting back to your question, what should be used in place of “Jason and I’s party”?

Well, “Jason’s and my party” would be correct, but that’s not the most felicitous phrasing. We’d prefer something like “our party” or, if you need to mention Jason, “our party, Jason’s and mine.”

Is this “I’s” business something that will fade away? Probably, but only time will tell.

In looking into your question, we came across a paper by Karen Milligan, a linguist at Wayne State University, about the grammar of joint possession.

In “Expressing Joint Possession: Or, Why me and Mary’s paper wasn’t accepted (but Bob and I’s was),” she suggests that the natural way of expressing joint possession (she calls it “the default construction”), is “me and Sean’s.”

She says this is “syntactically the most economical choice and the one utilized by children first. It is also the one adults revert to subconsciously—when under stress or in unguarded speech.”

Ms. Milligan writes that the early usage “declines with age, leading to eventual abandonment” as the “over-extension or misappropriation of prescriptive rules results in constructions that are semantically altered and/or unpredicted by the grammar of English.”

In other words, she seems to be arguing, we got it right as toddlers and we can blame English teachers for our difficulties in expressing joint possession as adults.

Well, that’s enlightening. But since you’re a grownup who expresses joint possession in an adult (albeit syntactically uneconomical) way, you might be interested in a brief post we wrote a few years ago on the subject.

[Update, Oct. 5, 2013. A reader had this interesting comment: “I think that you may find a commonality among most uses of ‘X and I’s’—it is predominantly used when the speaker thinks of the pair of people as a single entity. So couples (or a pair of siblings) will use it when they are talking about themselves as a unit. The possessive gets applied to the noun phrase ‘John and I.’

“This structure would be much less likely to occur with temporary or less significant pairs. If I were to team up with a colleague at work to solve a problem, I would never refer to ‘Trevor and I’s plan.’ However, it would not be uncomfortable to make reference to ‘Heather and I’s wedding anniversary.’ ”]

Check out our books about the English language


In-laws and other impediments

Q: I wonder why English has only one term, “brother-in-law,” for three different kinds of relatives: your spouse’s brother, your sibling’s husband, and your spouse’s sibling’s husband.

A: You might also ask why something similar can be said of “sister-in-law,” which refers to three different relatives too.

This is probably because “brother-in-law” and “sister-in-law” originally referred not only to the various relatives involved but also to a prohibited relationship shared by them.

When “brother-in-law” entered English around 1300 and “sister-in-law” about 1440, the phrase “in-law” meant “in canon law,” as opposed to “in blood” or “by nature,” according to the Oxford English Dictionary.

The OED says the phrase was appended to names of relationship to indicate “the degrees of affinity within which marriage is prohibited; a brother-in-law or sister-in-law being, as regards intermarriage, treated ‘in law’ as a brother or sister.”

The word “affinity” here, Oxford says, refers to “relationship by marriage (as distinguished from relationship by blood).”

We won’t discuss the ins and outs of affinity in canon law, the ecclesiastical rules of the Catholic, Orthodox, and Anglican churches. Let’s just say that different churches have different rules for which relationships are impediments to marriage.

In an earlier post, we noted that the term “in-law” was once used in English to describe relationships that are now referred to with the term “step.” So, the expression “sister-in-law” once also meant stepsister.

Another old term, “inlaw,” used to mean the opposite of “outlaw” or, as the OED puts it, “one who is within the domain and protection of the law.”

Finally, in case you’re wondering, the use of the term “in-law” as a colloquial noun for any relative showed up first in the late 1800s. The earliest example in the OED is from the Jan. 24, 1894, issue of Blackwood’s Edinburgh Magazine:

“The position of the ‘in-laws’ (a happy phrase which is attributed with we know not what reason to her Majesty, than whom no one can be better acquainted with the article) is often not very apt to promote happiness.”


A jitney lunch

Q: My wife went to a country schoolhouse that was being swallowed up as Omaha grew. Unlike the modern schools in town, hers had no facilities for a hot lunch. But once a month the school system would deliver a hot lunch (usually hot dogs) called a “jitney lunch.” What does “jitney” mean and where does it come from?

A: You’d be surprised at how much time and effort language scholars have spent trying to find out where the word “jitney” comes from.

In Studies in Etymology and Etiology (2009), for example, David L. Gold explores possible French, Russian, Spanish, British, Yiddish, and other sources of the word. His conclusion: origin unknown.

All the other references we’ve checked, including the Oxford English Dictionary and the Random House Historical Dictionary of American Slang, agree: origin unknown.

Of all the suggested origins of “jitney,” the most likely—or rather the least unlikely—is that it’s derived (via the French or Creole spoken in Louisiana) from jeton, French for a token.

However, language sleuths haven’t found any evidence linking the word to French or Creole, and the earliest known sighting of the word was in Kentucky, hundreds of miles from New Orleans.

We may not know where “jitney” comes from, but we do know a lot about its life after the word first showed up in late 19th-century American English.

The earliest known example of “jitney,” according to Green’s Dictionary of Slang, appeared in the Dec. 16, 1899, issue of the Morning Herald in Lexington, KY:

“ ‘Can’t spare de change. Me granmaw died in Sout’ Afriky an’ I need dis to float me over ter de fun’ral.’  ‘Quit yer kiddin’ an’ let me have a jitney.’ ”

The word “jitney” here (it’s sometimes spelled “gitney”) meant either five cents or a nickel, the fare to ride minibuses at the time, according to slang dictionaries.

But by the early 20th century, the term was being used adjectivally to refer to the minibuses themselves. The OED’s earliest example is in a Nov. 28, 1914, letter from Los Angeles published in the Jan. 14, 1915, issue of the Nation:

“This autumn automobiles, mostly of the Ford variety, have begun in competition with the street cars in this city. The newspapers call them ‘Jitney buses.’ ”

Soon the word was being used by itself as a noun for the minibuses. Here’s an OED example from the April 16, 1915, issue of the New York Evening Post: “The jitney wears out the streets and should contribute to their repair.”

You’ll be especially interested in the next step in the evolution of “jitney”—as a noun used attributively (that is, adjectivally) to mean cheap or shoddy or inferior. Here’s how Oxford explains the new usage:

“So, on account of the low fare or the poor quality of these buses, used attrib. to denote anything cheap, improvised, or ramshackle.”

The earliest published reference in the OED for this new usage is from Somewhere in Red Gap (1916), Harry Leon Wilson’s sequel to his better-known novel Ruggles of Red Gap (1915):

“It would be an ideal position for him. Instead of which he runs this here music store, sells these jitney pianos and phonographs and truck like that.”

As for those hot dogs served at your wife’s country school once a month, we imagine the meal was referred to as a “jitney lunch” either because it was cheap or uninspiring or because it was delivered by a jitney.


Leaning out

Q: Which of these is correct: (1) “I leaned out the window” or (2) “I leaned out of the window”? And which of these: (3) “I looked out the window” or (4) “I looked through the window”? I often see #1 and #3, but I prefer #2 and #4.

A: All of those are correct, though some usage commentators have objected to the “of” in #2 as redundant. However, Merriam-Webster’s Dictionary of English Usage pooh-poohs the objections.

“A few commentators observe that the of is superfluous most of the time, or sometimes—depending on whose opinion you are reading—when out is used with verbs of motion,” Merriam-Webster’s says.

However, the usage guide adds that this observation “is not especially useful, for out and out of are interchangeable only in a few very restricted contexts; out simply cannot be substituted for out of in most cases.”

“Out” is generally an adverb (“Get out!”), but it’s sometimes used prepositionally as a substitute for “out of.”

When used as a preposition, according to M-W, “out” seems “most often to go with door or window” (“She looked out the window, then ran out the door”).

Merriam-Webster’s notes that “out” and “out of” are “about equally common” in constructions with the word “window.”

The dictionary gives this “out” example from the February 1969 issue of Harper’s: “He stares out the window.”

And it gives this “out of” example from “A Summer’s Reading,” a 1956 short story by Bernard Malamud: “as he read his Times, upstairs his fat wife leaned out of the window, seeming to read the paper along with him.”

The usage guide says “out of” is more common with “nouns that designate places or things that can be thought of as containing or surrounding.”

Here’s an example from And More by Andy Rooney, a 1982 essay collection by the 60 Minutes commentator: “A bathtub is, at best, a makeshift place to take a shower. It’s hard to get into and out of gracefully.”

M-W says “out” is sometimes used this way too, “but it sounds not quite part of the mainstream.”

This is an example from “The Heart of the Park,” a 1949 short story by Flannery O’Connor: “The woman came out the bath house and went straight to the diving board.”

If you’d like to read more, we had a post a couple of years ago about the use of “out” by itself in such constructions, a usage heard mostly in the South and the Gulf region.
