The Grammarphobia Blog

Theater (or theatre) piece

Q: I review “theatre.” Or should I say “theater”? Which do you prefer, and why? Actually, why is there a choice at all?

A: There’s been a lot of nonsense written about “theater” and “theatre”—that one is for movies and the other is for plays; or that one refers to a building and the other to an art form; or that one spelling is lowbrow while the other is refined.

But these are merely variant spellings of the same noun.

“Theatre” is the only spelling now recognized in Britain. “Theater” is the traditional American spelling, but “theatre” is now equally acceptable in the US, according to standard dictionaries.

Personally, we prefer “theater,” but you’re free to make your own choice. No matter how you spell it, the meaning is the same.

We suspect that some Americans lean toward “theatre” because of its British associations (just as the spelling “colour” appeals to Anglophile cosmetics manufacturers). In other words, it has snob appeal.

The truth is that the spelling of this word has fluctuated over the centuries, and “theatre” hasn’t always been the preference in the British Isles.

The Oxford English Dictionary says the “earliest recorded English forms, c1380, are theatre and teatre.” But, the OED adds, “from c1550 to 1700, or later, the prevalent spelling was theater.”

So Chaucer, writing in Middle English in the late 1300s, used “theatre.” Two hundred years later, Shakespeare and Spenser used “theater.”

Why the change?

It helps to know that the word is ultimately derived from the Latin theatrum, and that its spellings in other languages are roughly divided along linguistic lines—Romance versus Germanic.

In Romance languages, the final syllable is spelled with -tr rather than -ter. For example, teatro in Italian, Spanish, and Portuguese; teatru in Romanian; and théâtre in French.

The word was teatre in Old French, and theatre in 12th- to 13th-century French (a spelling that, in light of the Norman Conquest, may have influenced the Middle English).

In Germanic languages, on the other hand, the word ends in -ter. For example, theater in German and Dutch, and teater in Norwegian, Danish, and Swedish. So it’s not surprising that English, a Germanic language, would have adopted the “ter” spelling at some point.

So far so good. But then why did the British switch back to “theatre” in the 1700s?

At the time, all things French were fashionable among the English upper classes. Besides, French became established as the language of diplomacy early in that century. Et voilà—French spellings crept into British usage.

As the OED says, “between 1720 and 1750, theater was dropped in Britain, but has been retained or (?) revived in U.S.” The question mark seems to indicate that it’s more likely the Colonists kept the old spelling.

We included a section about French-influenced British spellings in Origins of the Specious, our book about language myths and misconceptions.

As we wrote, British and American preferences today reflect those of the language’s two great lexicographers—the Englishman Samuel Johnson in the 18th century and the American Noah Webster in the early 19th:

“Many of the words that are now spelled one way here and another there had multiple spellings once upon a time. When the two lexicographers wrote their influential dictionaries, Webster chose one and Johnson another. But the story isn’t as simple as that. Johnson adopted many Frenchified spellings that had been introduced in Britain in the eighteenth century. But Webster often stuck with older spellings, the ones the Colonists had brought from England in the seventeenth century.

“Webster wanted, among other things, to purge English of words ‘clothed with the French livery’ and rid spelling of the ‘egregious corruptions’ imposed by Francophiles. He considered the eleventh-century conquest of Britain by French-speaking Norman princes the ‘dark ages of English.’ Johnson, on the other hand, wanted to preserve the spelling of his day, even if ‘it is in itself inaccurate, and tolerated rather than chosen.’ He was well aware of the Gallic corruptions but chose not to fiddle with them ‘without a reason sufficient to balance the inconvenience of change.’ ”

So we can largely blame two cranky old men for the fact that we have both “theatre” and “theater” today.

Something similar happened with other “er” words (“center/centre,” “fiber/fibre,” “luster/lustre,” and others). The Colonists took the “-er” endings with them to the New World, but British writers shifted their allegiance to French spellings.

Check out our books about the English language

Positively negative

Q: Though it sounds quite stilted, the double negative is often used in medicine to be more precise. I hate the sound of “non-inferiority,” but it’s useful to describe a statistical result that’s not necessarily superior. It’s often seen in the oncology literature to describe results of clinical trials—inelegant but necessary.

A: We agree with you about the usefulness of double negatives. But if we were writing about a clinical trial on our blog, we’d skip the jargon and use a longer, simpler, and equally precise phrasing.

In describing a non-inferiority trial, for example, we might say it shows that a new robotic treatment for prostate cancer is equivalent to, but no better than, the standard robotic procedure.

Getting back to double negatives, they can be quite expressive and somehow “just right” in all sorts of writing.

For example, a woman’s style of dressing might be described as “eccentric, but not inelegant.” Calling something “elegant” is very different from calling it “not inelegant.”

To use another example, an odd sensation or an unusual-tasting spice might be described as “a bit startling, but not unpleasant.” Again, “pleasant” and “not unpleasant” are worlds apart.

Blanket prohibitions against the use of the double negative are misguided, to put it kindly. We’ve written before on our blog about this subject, including posts in 2007 and 2008.

In Pat’s grammar and usage book Woe Is I, she says a double negative can be “handy when you want to avoid coming right out and saying something: Your blind date is not unattractive. I wouldn’t say I don’t like your new haircut.”

We go a little deeper into the subject in our book Origins of the Specious: Myths and Misconceptions of the English Language:

“There’s nothing wrong with using two negatives together to say something positive (‘I can’t not buy these Ferragamos’) or to straddle the fence (‘He’s not unintelligent’). So anybody who says all double negatives are bad is badly informed. The only double negative that’s a no-no is one that uses two negatives to say something negative (‘I didn’t see nothing!’). Modern grammarians regard this usage as substandard, insisting on only one negative element in a simple negative statement (‘I didn’t see anything’ or ‘I saw nothing’).

“But why outlaw any kind of double negative? What’s wrong with ‘I didn’t see nothing’?”

As we go on to explain, such a statement “would be correct in French, Italian, Spanish, Polish, Russian, and other languages. And it used to be commonplace in English, too, as a way to accentuate the negative.”

Chaucer, for instance, uses double, triple, and even quadruple negatives in The Canterbury Tales. Here’s how he describes the Friar: “Ther nas no man no-wher so vertuous,” or as one would say today, “There wasn’t no man nowhere so virtuous.”

We say in Origins of the Specious that it “wasn’t until the eighteenth century that a sentence like ‘I didn’t see nothing’ was pronounced a crime against English.”

“If ever a prohibition had staying power, this one did,” we write. “By the time Dickens came along, only a poorly educated person, like Peggotty in David Copperfield, would say, ‘Nobody never went and hinted no such a thing.’ Many linguists argue that there’s nothing wrong with speaking like Peggotty today.”

But, as we add, we don’t hear no linguists saying nothing like that. Why? Because no PhD wants to sound like a high school dropout.

Our advice? “Don’t use two negatives to say something negative (‘You never take me nowhere’), but go ahead when you want to be emphatic (‘We can’t not go home for Thanksgiving’) or wishy-washy (‘Mom’s mince pie is not unappetizing’).”

When the past isn’t perfect

Q: I’m a little embarrassed to admit this, but my guilty pleasure is watching Judge Judy on TV. I’ve noticed that many of the “litigants” on the show use the past perfect merely to explain the events that brought them to “court.” Example: “I had bought a car.” It seems to me the simple past tense would be more appropriate. I’m not losing any sleep over this, but it’s something I don’t get.

A: As you say, this isn’t something to lose sleep over, but we find that the past perfect tense is probably used too little these days, not too much.

The past perfect comes into play when people speak of two separate times, both of them in the past. For example, someone might begin by using the simple past tense to set the scene, then shift to the past perfect to refer to an even earlier time.

Here’s how it’s supposed to work. Let’s assume a well-spoken plaintiff appears before Judith Sheindlin, the outspoken ex-judge who presides over the daytime court reality show.

The plaintiff might introduce his case by saying, “Last July, the brakes on my car failed [simple past]. I had bought [past perfect] the car with the understanding that there was a warranty.”

But if the defendant’s English isn’t quite up to snuff, he might use the past perfect when there’s only one time frame: “I had sold him the car in January with a standard three-month warranty. He had refused to pay an extra $100 for a one-year warranty.” (Neither “had” is necessary.)

If this is what Judge Judy is hearing show after show, it may explain why she loses her temper so often.

Why do her litigants do it? Well, a courtroom (even one in a TV studio on Sunset Boulevard) can seem like a pretty formal place, and perhaps people overuse the past perfect tense because they think it’s more formal.

As we’ve said above, it’s been our experience that people don’t use the past perfect enough. For example, we wrote a blog entry last December on the tendency of many people to begin sentences with “If I would have known …” instead of “If I had known.”

But the past perfect isn’t always necessary when speaking of different times in the past. If the time frames are obvious, the simple past will do: “I got out of the slammer in December, a week after Judge Judy sentenced me to hard time for subject-verb disagreement.”

Watchwords

Q: I recently watched Man on Wire, a documentary about the man who walked on a tightrope between the Twin Towers in the ’70s. In the film, a police officer says “everybody was spellbound in the watching of it.” I was really struck by his eloquence and wondered what you thought of this type of construction.

A: Philippe Petit’s tightrope walk between the Twin Towers on Aug. 7, 1974, left spectators gaping.

One of them was Sgt. Charles Daniels of the Port Authority Police Department, who had been dispatched to arrest Petit. In the 2008 documentary Man on Wire, Daniels recalled the experience:

“I observed the tightrope ‘dancer’—because you couldn’t call him a ‘walker’—approximately halfway between the two towers. And upon seeing us he started to smile and laugh and he started going into a dancing routine on the high wire. And when he got to the building we asked him to get off the high wire but instead he turned around and ran back out into the middle. … He was bouncing up and down. … His feet were actually leaving the wire and then he would resettle back on the wire again. … Unbelievable really… everybody was spellbound in the watching of it.”

We’re quoting here from a PBS American Experience webpage.

Daniels’s phrase “spellbound in the watching of it” is indeed eloquent. This kind of construction isn’t heard that often, and its uncommonness makes it all the more poetic.

Here the word “watching” is a gerund, a verbal form used as a noun (as in “the watching was tiresome”).

Daniels of course meant that in watching the performance, everyone was spellbound. But to say “everybody was spellbound in the watching of it” was much more elegant.

We’ll use another example to illustrate how a gerund acts as a noun: “Lord Carnarvon searched long for Tut’s tomb and was overjoyed in the finding of it.” Note that “finding” could easily be replaced with the noun “discovery.”

We’ve written often on our blog before about gerunds, including postings in January and March of 2011.

Here are a few more examples of the same kind of construction: “The art of the cake is in the baking of it” … “The iron’s strength is in the forging of it” … “The pie was quick to make but the boys were quicker in the eating of it.”

And here’s one we didn’t invent: “The proof of the pudding, is in the eating of it,” from Tobias Smollett’s 1755 translation of Cervantes’s Don Quixote.

Bathroom language

Q: Just catching up on posts, including the one about “gazebo” and its possible relationship with the Latin lavabo. My British-born mother always called the bathroom sink a “lavabo”—but NOT the kitchen sink. Until I read your post, I hadn’t put it together with the future tense of “I wash” in Latin—and I’ve had TWO YEARS of Latin! Is this use of “lavabo” British? I used to picture it in my mind as “lavabeau” (a beautiful place to wash up). I knew the Latin “lav” root meant wash, and I seized on the Gallic “beau” to explain the other part.

A: It’s interesting that your mother should use “lavabo” as a noun for the bathroom sink. As it happens, this use of “lavabo” is standard English for a bathroom sink in both British and American dictionaries! (Who knew?)

As we said in our “gazebo” post, one theory about the origin of the word is that it’s a quasi-Latin coinage.

“Gazebo,” according to this theory, would be translated as “I shall gaze,” mimicking Latin verb forms ending in -bo, like lavabo (“I shall wash”).

Never underestimate the capacity of English to absorb new words. In the mid-19th century, it adopted the Latin lavabo—the first-person singular future tense of lavare (to wash)—as a noun.

The noun was first used, according to the Oxford English Dictionary, in connection with Christian rites, where it had several meanings.

For example, the “lavabo” meant the ritual washing of the celebrant’s hands at the offertory, performed before touching the offerings.

In the Roman Catholic rite, the hand washing was accompanied by a recitation from Psalm 26, beginning Lavabo inter innocentes manus meas (“I will wash my hands in innocence”).

As the OED explains, “lavabo” was also used to mean “the small towel used to wipe the priest’s hands” as well as “the basin used for the washing.”

It was also used by at least one historian in the late 19th century to refer to “a washing trough used in some mediæval monasteries.”

More secular uses of the word began to show up in the early 20th century, according to citations in the OED. This is when “lavabo” came to mean a household wash-stand or a lavatory (in the sense of a small room for washing the hands and face).

In 1909, Webster’s New International Dictionary of the English Language defined a “lavabo” as something like a sink: “a wash basin with its necessary fittings, esp. one set in place and supplied with running water and a waste pipe.”

And Dorothy L. Sayers used the word in the sense of “lavatory,” according to the OED, in her novel Strong Poison (1930): “The little lavabo in the passage.”

This calls for a brief look into “lavatory,” a 14th-century word that’s also derived from the Latin verb lavare.

When first recorded in writing, sometime before 1375, it meant a bath or a vessel for washing. But in the 16th century it was also used to mean the Christian purification rite that was later called the “lavabo.”

The modern sense of “lavatory” can be traced to the 17th century, when it first came to mean a small room equipped with a wash basin.

Here’s how the OED defines this use of “lavatory”: “An apartment furnished with apparatus for washing the hands and face, subsequently also including water-closets, etc. In the 20th c. one of the more usual words for a W.C. (and in turn giving way to more recent euphemisms: lav., loo, toilet, etc.).”

In some of its citations, the OED delicately adds, “lavatory” is used elliptically “for the appliance itself”—that is, the toilet bowl. We’ll quote a couple of those examples:

“Albert closed the door and sat down on the lavatory,” from Jack Trevor Story’s novel Something for Nothing (1963).

“Flush Conscience down the lavatory,” from the now-defunct BBC publication The Listener (1965).

Healthy choices

Q: A friend has a soup container with a label that reads “fresh healthy delicious.” Is “healthy” correct? I realize it’s supposed to mean you’ll be healthy if you eat the soup, but doesn’t it actually mean the soup is healthy?

A: We have to disagree here. Not many people reading that label would think the soup itself was enjoying robust health.

We’ve written before on our blog about “healthful” and “healthy.” But that was more than five years ago, so we’ll revisit the subject.

In traditional usage during much of the 20th century, “healthy” people led “healthful” lives—that is, they ate “healthful” foods and did “healthful” things.

So a person was “healthy” if the vegetables he ate and the exercises he sweated over were “healthful.” That’s how a lot of early- to mid-20th-century usage guides explained the difference.

But language authorities no longer insist on this distinction. As we said back in 2006, “It’s become almost universal for people to refer to ‘healthy food,’ even though a literal-minded person might imagine a stalk of broccoli lifting weights!”

Today, dictionaries regard this use of “healthy” as correct, standard English. So it’s not a mistake to refer to a healthful thing as “healthy.”

As it turns out, history is on the side of this broader interpretation. For hundreds of years, “healthy” was freely used to mean good for you, and nobody minded until a distinction was drawn in the late 19th century.

The American Heritage Dictionary of the English Language (5th ed.) has an interesting usage note on the subject (we’ll add paragraph breaks):

“Some people insist on maintaining a distinction between the words healthy and healthful. In this view, healthful means ‘conducive to good health’ and is applied to things that promote health, while healthy means ‘possessing good health,’ and is applied solely to people and other organisms. Accordingly, healthy people have healthful habits.

“However, healthy has been used to mean ‘healthful’ since the 1500s, as in this example from John Locke’s Some Thoughts Concerning Education: ‘Gardening … and working in wood, are fit and healthy recreations for a man of study or business.’

“In fact, the word healthy is far more common than healthful when modifying words like diet, exercise, and foods, and healthy may strike many readers as more natural in many contexts. Certainly, both healthy and healthful must be considered standard in describing that which promotes health.”

Merriam-Webster’s Collegiate Dictionary (11th ed.) agrees, defining “healthy” as both “enjoying health” and “conducive to health.” M-W quotes General George S. Patton: “walk three miles every day … a beastly bore, but healthy.”

Finally, the Oxford English Dictionary has citations going back to the 16th century in which “healthy” is used to mean “possessing or enjoying good health” as well as “conducive to or promoting health.”

A belated valentine

Q: I wished a colleague happy Valentine’s Day earlier in the month and was told there is no apostrophe plus “s” in the name of the holiday. Could you shed some light?

A: Yes, there is an apostrophe + “s” in “Valentine’s Day.” The longer form of the name for the holiday is “St. Valentine’s Day.”

And, in case you’re wondering, the word “Valentine’s” in the name of the holiday is a possessive proper noun while the word “valentines” (for the cards we get on Feb. 14) is a plural common noun.

“Valentine’s Day” has the possessive apostrophe because it’s a saint’s day. In Latin, Valentinus was the name of two early Italian saints, both of whom are commemorated on Feb. 14.

Published references in the Oxford English Dictionary indicate that the phrase “Valentine’s Day” was first recorded in about 1381 in Geoffrey Chaucer’s Middle English poem The Parlement of Foules:

“For this was on seynt Volantynys day / Whan euery bryd comyth there to chese his make.” (In Middle English, possessive apostrophes were not used.)

Chaucer’s lines would be translated this way in modern English: “For this was on Saint Valentine’s Day / When every bird comes here to choose his mate.” (The title means a parliament or assembly of fowls—that is, birds).

As a common noun, “valentine” was first used to mean a lover, sweetheart, or special friend. This sense of the word was first recorded in writing in 1477, according to OED citations.

In February of that year, a young woman named Margery Brews wrote two love letters to her husband-to-be, John Paston, calling him “Voluntyn” (Valentine).

As rendered into modern English, one of the letters begins “Right reverend and well-beloved Valentine” and ends “By your Valentine.” (We’re quoting from The Paston Letters, edited by Norman Davis, 1963.)

In the mid-1500s, the OED says, the noun “valentine” was first used to mean “a folded paper inscribed with the name of a person to be drawn as a valentine.”

It wasn’t until the 19th century, adds Oxford, that “valentine” came to have its modern meaning: “a written or printed letter or missive, a card of dainty design with verses or other words, esp. of an amorous or sentimental nature, sent on St. Valentine’s day.”

Here’s the OED’s first citation, from Mary Russell Mitford’s book Our Village (1824), a collection of sketches: “A fine sheet of flourishing writing, something between a valentine and a sampler.”

This later example is from Albert R. Smith’s The Adventures of Mr. Ledbury and his Friend Jack Johnson (1844): “He had that morning received … a valentine, in a lady’s hand-writing, and perfectly anonymous.”

What could be more intriguing than that?

That old college cheer

Q: When my wife and I attended City College after World War II, we’d cheer on our basketball team with this nonsense: “Allagaroo, garoo, gara, Ee-yah, ee-yah, Sis boom bah, Yay, team!” Is there a hidden meaning in those lines?

A: “Allagaroo, garoo, gara!” was the City College of New York’s battle cry during the postwar years, and students used it to cheer on the basketball team when it won both the NIT and NCAA basketball championships in 1950.

Where did “allagaroo” come from? The origin isn’t really known, but the word sleuth Barry Popik has come up with a couple of speculative theories.

Popik tracked down a 1950 article in the Sporting Times with this explanation: “According to school legend, an allagaroo either was a cross between an alligator and a kangaroo or a corruption of the French phrase ‘allez guerre’ (on to the war).”

In an article on his Big Apple website, Popik also points out that City College wasn’t the only school with an “allagaroo” cheer.

Hutchinson High School in Kansas has had one since 1901, but eliminated a stanza in 2003 after complaints about racial overtones. Here’s the first stanza of the revised cheer, from a Hutchinson alumni website:

Allagaroo, garoo, garoo; Wah, hoo, bazoo; Hicer, picer, dominicer; Sis! Boom! Bah! Hutchinson High School Rah! Rah! Rah!

As for the old City College cheer, we’ve seen various versions of it on the Web, including this one from an article in the April 3, 2000, issue of Sports Illustrated:

Allagaroo garoo gara, Allagaroo garoo gara, Ee-yah ee-yah, Sis boom bah, Team! Team! Team!

The Sports Illustrated article notes that in early 1951, “with CCNY’s grand season still fresh in the city’s memory,” seven members of the basketball team were arrested and charged with conspiring to fix games.

But let’s get back to the City College cheer you asked about.

The “ee-yah” part of the cheer has a history of its own. The yell was popularized by Hughie Jennings, a big-league baseball player and manager from 1891 to 1925, according to the Dickson Baseball Dictionary (3rd ed.).

The dictionary says Jennings made the ear-splitting yell famous while he was the manager and third-base coach of the Detroit Tigers before World War I.

Dickson discounts stories that Jennings picked up the yell while working as a mule driver in his youth or that it originated from his mangling a Hawaiian phrase, weeki-weeki (watch out), used by a pitcher from Hawaii.

The Jennings yell became so well known, according to Dickson, that American infantrymen shouted it during trench warfare in World War I.

Now, let’s look at the interjection “sis boom bah.”

The Oxford English Dictionary describes it as an “echoic” expression that represents “the sound of a skyrocket: a hissing flight (sis), an explosion (boom), and an exclamation of delight from the spectators (bah, ah).”

The OED says the expression originated in 19th-century America, and is “a shout expressive of support or encouragement to a college team.”

The dictionary adds that it’s also used as a noun meaning “enthusiastic or partisan support of spectator sports, esp. football.” (Here’s a possible noun use: “Give us a sis-boom-bah.”)

Finally, we’ve written before on the blog about the word “yay,” so we won’t repeat ourselves.

The human equation

Q: As a mathematician (PhD), I’d like to take issue with the WNYC caller who said Leonard Lopate misused the word “equation” on the air. It’s true that in mathematics, an equation is an equality between two expressions involving at least one object that is unknown and must be found. The ideal situation occurs when only one object can actually be determined—the solution of the equation. However, an equation can have no solution, or many, even infinitely many solutions. On the other hand, “equation” has a quite different meaning in standard English, and I can attest that Leonard’s metaphorical use of the word was quite correct.

A: Thank you for setting the record straight. For readers of the blog who didn’t hear Pat’s appearance last month on the Leonard Lopate Show, here’s the story.

A caller to the show said Leonard used the word “equation” incorrectly. The caller insisted that it should be used in a non-mathematical sense only when referring to situations involving two equal things.

But as Leonard and Pat noted on the show, the term is commonly used in a broad, metaphorical sense as well as the more literal one.

The American Heritage Dictionary of the English Language (5th ed.) has as one of its definitions “a complex of variable elements or factors.” Merriam-Webster’s Collegiate Dictionary (11th ed.) has “a complex of variable factors.”

Some dictionaries allow even broader meanings. But first, a little history.

The noun “equation” came into English in the late 1300s from the Latin æquationem. The Latin noun was derived from the verb æquare (to make equal), which in turn came from the adjective æquus (equal).

As it happens, “equation” was used by astrologers long before mathematicians adopted the word.

In Middle English, according to the Oxford English Dictionary, its original meaning was “equal partition,” a reference to the astrological division of the heavens.

For example, “equations of houses” meant “the method of dividing the sphere equally into ‘houses’ for astrological purposes.”

Nearly 200 years later, in 1570, the mathematical sense of “equation”—that is, a “statement of equality”—was introduced.

And one of the senses of this definition, the OED says, is “a formula affirming the equivalence of two quantitative expressions, which are for this purpose connected by the sign =.”

A century later, more general uses of the word came along.

In astronomy, for example, “equation” meant “the action of adding to or subtracting from any result of observation or calculation such a quantity as will compensate for a known cause of irregularity or error.”

This is where the terms “personal equation” and “human equation” came from.

“Personal equation” was a phrase introduced by 19th-century astronomers, the OED says, and originally meant the correction required to account for inaccuracy on the part of the observer.

A variation on this theme, “human equation,” came along in the mid-20th century. Here are a couple of OED citations:

“The Oakland Bridge suffers from such a simple, unpredictable human equation as the preference of truck drivers to loaf on a ferry” (from a 1938 issue of Reader’s Digest).

“We must throw out the human equation as much as we can in our search to find an explanation for seeming aberrancies” (from Fredson T. Bowers’s book Bibliography and Textual Criticism, 1964).

Current standard dictionaries, as we said, have endorsed even wider metaphorical uses of “equation.”

The Collins English Dictionary includes these definitions: “a situation, esp one regarded as having a number of conflicting elements (‘what you want doesn’t come into the equation’)”; and “a situation or problem in which a number of factors need to be considered.”

The Macmillan Dictionary, in both the British and American editions, says “the equation” can mean “all the different aspects that you have to consider in a situation (‘In a choice between the use of rail and car, the question of cost will come into the equation’).”

Ethics vs. morals

Q: Many people use “morals” and “ethics” interchangeably, but I think the former refers to values imposed by the community while the latter refers to a personal sense of right and wrong. What do you think?

A: You believe a person’s morals come from outside—they’re determined by the surrounding community—while ethics come from within and are determined by one’s character.

We have roughly the same impression about these nouns and their adjectives, “moral” and “ethical.”

And so, more or less, do the editors of The American Heritage Dictionary of the English Language (5th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.).

In an explanatory note, American Heritage says “moral” applies to “personal character and behavior: ‘Our moral sense dictates a clearcut preference for these societies which share with us an abiding respect for individual human rights’ (Jimmy Carter).”

The word “ethical,” the explanation continues, “stresses idealistic standards of right and wrong: ‘Ours is a world of nuclear giants and ethical infants’ (Omar Bradley).”

Merriam-Webster’s explains that “moral” implies “conformity to established sanctioned codes or accepted notions of right and wrong (‘the basic moral values of a community’).”

The dictionary says “ethical” may suggest “the involvement of more difficult or subtle questions of rightness, fairness, or equity (‘committed to the highest ethical principles’).”

This difference, however, isn’t so apparent in the etymologies of the two words. In fact, their linguistic ancestors were nearly identical.

As the Oxford English Dictionary explains, the classical Latin word moralis (moral) was formed by Cicero as a rendering of the ancient Greek word ethikos (ethical).

Cicero apparently took as his model the Latin mores (habits, morals), which was already in use as the Latin equivalent of the Greek ethe (customs, manners, habits).

Similarly, when “moral” and “ethical” first came into English, they meant much the same thing.

The adjective “moral” (which predates the noun “morals”) was first recorded in Chaucer’s The Canterbury Tales (circa 1387-95).

It originally meant, in the words of the OED, “of or relating to human character or behaviour considered as good or bad; of or relating to the distinction between right and wrong, or good and evil, in relation to the actions, desires, or character of responsible human beings; ethical.”

We still use “moral” in that way, but we also use it in a less abstract sense when applied to an action or a person, a meaning that emerged in the late 16th century.

When applied to an action, the OED says, it means “having the property of being right or wrong, or good or evil; voluntary or deliberate and therefore open to ethical appraisal.”

And when applied to a person, it means “capable of moral action; able to choose between right and wrong, or good and evil.”

Also, we sometimes use “moral” to mean “virtuous with regard to sexual conduct,” a meaning the OED says was first recorded in 1803.

The noun “ethics” (originally used in the singular, “ethic”) entered English at about the same time as “moral,” in the late 1300s. At first, it meant a scheme of moral science or the study of moral science.

The adjective “ethical” came along in the early 1600s and meant “of or pertaining to morality or the science of ethics.”

A later meaning emerged in the 19th century: “in accordance with the principles of ethics; morally right; honourable; virtuous; decent; spec. conforming to the ethics of a profession, etc.”

So by the 19th century, the distinction between “moral” and “ethical” was established, although there was still a lot of overlapping.

Of the two, “moral” has taken on more senses over the years. We had a posting some time ago about one of them—the use of “moral” in the expression “moral support.”

It occurs to us that the difference between “moral” and “ethical” is more pronounced in their negative forms: “immoral” and “unethical.”

As applied to a person, “immoral” conveys possible meanings (like impure, dissolute, licentious) that aren’t found in “unethical.”

Check out our books about the English language

The Grammarphobia Blog

Politics and prose

Q: ’Tis the season, I suppose, but twice in the past week, I’ve heard the word “politics” used as though it were plural. Here, for example, is Mayor Bloomberg on the recent Susan G. Komen flap: “Politics have no place in health care.” I’m an editorial cartoonist and I always refer to “politics” in the singular: Can “politics” really be like “deer”?

A: “Politics” can be used with either a singular or a plural verb, depending on your meaning. In general, it’s singular. It’s plural only when it means a particular set of political beliefs.

Here are examples of the word used both ways:

Singular: “Politics is my favorite subject” … “Politics has muddied the waters.”

Plural: “I like him personally but his politics are repellent” … “Dad’s politics have changed.”

We agree with you, and think the word should have been used with a singular verb in that statement by the New York City mayor: “Politics has no place in health care.”

Garner’s Modern American Usage (3rd ed.) explains the difference this way: “Politics may be either singular or plural. Today it is more commonly singular than plural (politics is a dirty business), although formerly the opposite was true. As with similar -ics words denoting disciplines of academic and human endeavor, politics is treated as singular when it refers to the field itself (all politics is local) and as plural when it refers to a collective set of political stands (her politics were too mainstream for the party’s activists).”

And here’s how Pat discusses words like “politics” in her grammar and usage book Woe Is I:

“Figuring out the mathematics of a noun can be tricky. Take the word mathematics. Is it singular or plural? And what about all those other words ending in ics (economics, ethics, optics, politics, and so on)? Fortunately, it doesn’t take a PhD in mathematics to solve this puzzle.

“If you’re using an ics word in a general way (as a branch of study, say), it’s singular. If you’re using an ics word in a particular way (as someone’s set of beliefs, for example), it’s plural.”

Here are the examples given in the book: “Politics stinks,” said Mulder. … “Mulder’s politics stink,” said Scully. … Statistics isn’t a popular course at the academy. … Alien-abduction statistics are scarce.

Check out our books about the English language

The Grammarphobia Blog

Let’s look sharp

Q: It’s odd that Pat includes “look sharp” in Woe Is I among her examples of adverbs without “ly” endings. It strikes me that “look” here is an adjective, not an adverb.

A: The paragraph you refer to in Woe Is I is about the use of these “ly”-less adverbs, and it says:

“Adverbs can come with or without ly, and many, like slow and slowly, exist in both forms. Those without the tails are called ‘flat adverbs,’ and we use them all the time in phrases where they follow a verb: ‘sit tight,’ ‘go straight,’ ‘turn right,’ ‘work hard,’ ‘arrive late,’ ‘rest easy,’ ‘look sharp,’ ‘aim high,’ ‘play fair,’ ‘come close,’ and ‘think fast.’ Yes, straight, right, hard, and the rest are bona-fide adverbs and have been for many centuries.”

But “look sharp” may or may not belong with those other phrases, depending on how it’s used.

If it’s being used in the sense of “watch out”—that is, “look sharply”—then “sharp” is indeed a flat adverb.

If it’s being used in the sense of “be quick” or “look alive,” however, “look” is a linking verb, and it’s modified by an adjective, not a flat adverb.

“Look” in this sense is a linking verb because it means to seem or appear to be (not to use one’s eyes). And linking verbs—like “seem,” “be,” “appear,” “feel,” and so on—are always modified by adjectives.

In case you’d like to read more about linking verbs, we’ve written about them on the blog, including posts in 2010 and 2009.

As for the expression “look sharp,” it’s been around since the early 18th century. The Oxford English Dictionary’s earliest citation is from a story written by Richard Steele for The Spectator in 1711:

“The Captain … ordered his Man to look sharp, that none but one of the Ladies should have the Place he had taken fronting the Coachbox.”

When the phrase was first used, the OED says, “sharp” was an adverb and the phrase had a more literal meaning—“ ‘to look sharply after something,’ ‘to keep strict watch.’ ”

But in later usage, according to the dictionary, “the sense is commonly ‘to bestir oneself briskly,’ ‘to lose no time.’ ”

So “look” here means “to have a certain appearance,” or “to have the appearance of being,” a meaning that Oxford compares with “the similar use in passive sense of other verbs of perception, like smell, taste, feel.”

And “sharp” in this sense, the dictionary adds, is not an adverb but an adjective complement.

For a few more examples, we need go only to Chapter 39 of Charles Dickens’s The Old Curiosity Shop (1841), where Kit walks into an oyster shop, “as bold as if he lived there.”

Kit tells the waiter “to bring three dozen of his largest-sized oysters, and to look sharp about it! Yes, Kit told this gentleman to look sharp, and he not only said he would look sharp, but he actually did, and presently came running back with the newest loaves, and the freshest butter, and the largest oysters, ever seen.”

Thank you for calling our attention to this dual usage of “look sharp.” When there’s a fourth edition of Woe Is I, Pat will remove “look sharp” from that paragraph to avoid confusion.

(In addition, the hyphen will be removed from “bona fide” in that same paragraph, another problem pointed out recently by an eagle-eyed reader of the blog.)

Check out our books about the English language

The Grammarphobia Blog

Mythematics: a decimal point

Q: Just thought I’d let you know why the young lady in your Sept. 28, 2008, post lost the multiplication bee. When you’re speaking or writing numbers, the word “and” is actually the decimal point. So one hundred thirty-two is 132, but one hundred and thirty-two is 100.32. It may seem a small matter, but it makes a big difference in the number.

A: This is a common misconception, but in spoken or written numbers the conjunction “and” does not mean decimal point. So someone who says, “Twelve times eleven is one hundred and thirty-two” means the result is 132, not 100.32.

The number 132 can correctly be spoken or written as “one hundred thirty-two” or “one hundred and thirty-two.” British speakers and most Americans use “and,” but the conjunction is sometimes omitted in North American usage.

The “and” merely means “plus” and indicates that more of the number is coming. What follows “and” can be a whole number or a fraction, whether decimal or not (as in “three and five-eighths”).
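To make the arithmetic concrete, here’s a minimal sketch of our own (a hypothetical illustration, not from any dictionary or contest rulebook) that spells out a whole number under 1,000 both ways. The optional “and” changes the wording, never the value:

```python
def spell_out(n, use_and=True):
    """Spell out an integer from 0 to 999. When use_and is True,
    tens and units are joined to hundreds with "and" (the usual
    British form); otherwise the "and" is omitted (the form some
    American contests insist on). Either way, the number named
    is the same."""
    ones = ["zero", "one", "two", "three", "four", "five", "six",
            "seven", "eight", "nine", "ten", "eleven", "twelve",
            "thirteen", "fourteen", "fifteen", "sixteen",
            "seventeen", "eighteen", "nineteen"]
    tens = ["", "", "twenty", "thirty", "forty", "fifty",
            "sixty", "seventy", "eighty", "ninety"]

    def under_hundred(m):
        # Numbers below 100 never take "and" in either style.
        if m < 20:
            return ones[m]
        word = tens[m // 10]
        if m % 10:
            word += "-" + ones[m % 10]
        return word

    if n < 100:
        return under_hundred(n)
    result = ones[n // 100] + " hundred"
    if n % 100:
        joiner = " and " if use_and else " "
        result += joiner + under_hundred(n % 100)
    return result

# Both renderings name the same whole number, 132 -- the "and"
# does not turn it into 100.32:
print(spell_out(132))                 # one hundred and thirty-two
print(spell_out(132, use_and=False))  # one hundred thirty-two
```

In this sketch the “and” is just a joiner between the hundreds and what follows; a decimal like 100.32 would need the word “point” (or an explicit fraction such as “thirty-two hundredths”), which no amount of “and” supplies.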

Normally, someone speaking a decimal number like 100.32 would say “one hundred point thirty-two” or “one hundred point three two” or “one hundred and thirty-two hundredths.” When you use “and” in speaking a decimal number, the size of the decimal fraction (“tenths,” “hundredths,” and so on) must be included.

Our posting mentioned a child who lost a multiplication bee because she answered that 12 times 11 was “one hundred and thirty-two.” The correct answer, according to the contest sponsors, was “one hundred thirty-two,” so she was penalized for using “and.”

Apparently there was a special rule in the math contest against using “and” in answers. Contest rules are a kingdom unto themselves, and it may be that the usual practice in multiplication bees is to forbid the use of “and.”

But we can assure you that “and” does not mean “decimal point.” That is not among the definitions of “and” in any dictionary that we can find, and that includes standard dictionaries, the Oxford English Dictionary, and mathematical dictionaries.

As we said in our earlier posting, the OED’s entry for “and” lists this as one of its definitions: “to connect (units or) tens to hundreds (or thousands), as two hundred and one, three thousand and twenty-one, six thousand two hundred and fifty-six.”

In English, the earliest written references for this use of “and” are from the Old English gloss added to the Lindisfarne Gospels in the 10th century. And the usage has been common practice ever since. We’re talking about more than a thousand years of history.

The OED further explains that the “and” in numbers “is frequently omitted colloquially in North American usage.” This implies that including “and” is the more standard usage, while omitting it is an informal or conversational usage.

The use of “and” in numbers is a very ancient practice. You’re probably familiar with the now old-fashioned use, for example, of “one and twenty” to mean twenty-one, a convention that English inherited from the older Germanic languages.

“With numerals of the type one and twenty,” the OED says, “compare this type of composition in other Germanic languages, e.g. Old High German fiarzug inti sehso forty-six, Middle High German einz und drizic thirty-one, German einundzwanzig twenty-one, Old Icelandic einn ok tuttugu (also tuttugu ok einn ) twenty-one, etc.”

We don’t know where the misconception that “and” means “decimal point” came from. But unfortunately, it’s all over the Internet and apparently elementary-school teachers are even passing it on to their students. Never mind. It’s nonsense.

Check out our books about the English language

The Grammarphobia Blog

“At” tricks

Q: I realize that the taboo against ending a sentence with a preposition is a myth, but I’ve been reading with increasing frequency such sentences as “I don’t know where he’s at.” Is this use of a superfluous “at” incorrect as well as awkward sounding?

A: We’ve written before about these “where … at” constructions, but after four years it’s time for an update.

Many people criticize sentences like “Do you know where Dad’s at?” and “Tell me where they’re at” and “Did she say where she’s at?” But they often do so for the wrong reason.

The problem with such sentences isn’t that they place the preposition at the end. As you know, and as we’ve written many times, there’s nothing wrong with ending an English sentence with a preposition.

So what’s the issue here? The problem—if there is one—is simply that the “at” is redundant. “Where she’s at” is just a redundant way of saying “where she is.”

Needed or not, people persist in using “at” with “where” in their speech—very seldom in writing.

When we asked ourselves why, it occurred to us that this “at” very frequently follows a contraction: “where he’s at,” “where it’s at,” “where they’re at,” and so on.

Aha! A light began to dawn.

People naturally use contractions when they talk, but not at the end of a sentence.

This is because when you end a sentence with a contraction, like “she’s,” the verb (“is”) gets swallowed up. And a swallowed-up verb at the end of a sentence—as in “Did she say where she’s?”—is not idiomatic English.

So anyone who uses a contraction is going to want to put something after it—like “at.”

We were pleased to see our suspicions verified in Merriam-Webster’s Dictionary of English Usage.

“In current speech,” Merriam-Webster’s says, “the at serves to provide a word at the end of the sentence that can be given stress. It tends to follow a noun or pronoun to which the verb has been elided, as in the utterance by an editor here at the dictionary factory: Have any idea where Kathy’s at?”

As M-W explains, “You will note that at cannot simply be omitted: the ’s must be expanded to is to produce an idiomatic sentence if the at is to be avoided.”

The usage guide says the “where … at” combination has been a part of American speech since at least 1859, when it was recorded in Bartlett’s Dictionary of Americanisms.

M-W adds that the Dictionary of American Regional English says it’s used mostly in the US South and Middle America.

So we know why this usage turns up so often in American speech. But is it a crime?

If it is, the folks at Merriam-Webster’s seem to think it’s a pretty small one. “A more harmless idiom would be hard to imagine,” they write.

We agree that a redundant “at” is not a hanging offense. But you’ll probably be taken to task for using it.

Unless your conversation is very casual indeed, the unnecessary “at” may give your speech an uneducated flavor.

And of course it should be avoided when you want your writing to be at its best—that is, unless you’re quoting someone else.

But, as we say in our previous posting, there’s a related expression that’s become an accepted idiom. This is the colloquial expression “where it’s at,” as in “Dylan really knows where it’s at!”

The Oxford English Dictionary defines the idiomatic “where it’s at” this way: “the true or essential nature of a situation (or person); the true state of affairs; a place of central activity.”

The OED has published references for this expression going back to a 1903 article in the New York Sun, but it really took off in the 1960s.

Here’s an OED citation from 1967, in the now-defunct BBC magazine The Listener: “As Dylan says, ‘I’ll let you be in my dream, if I can be in yours.’ I think I know where he’s at.”

And here’s one from Robert M. Pirsig’s Zen and the Art of Motorcycle Maintenance (1974): “That, today, is where it is at, and will continue to be at for a long time to come.”

We’re pretty sure that this use of “where it’s at” will be part of the language for a long time to come.

Check out our books about the English language

The Grammarphobia Blog

A participle in gerund’s clothing

Q: I have a question about these two sentences: (1) “The girl hurt her foot playing soccer.” (2) “While playing soccer, the girl hurt her foot.” I believe that “playing soccer” is a gerund phrase in both sentences. Is my assessment correct?

A: Sorry, but “playing” isn’t a gerund and “playing soccer” isn’t a gerund phrase in either of those sentences. Not every word made up of a verb plus “-ing” is a gerund.

In both #1 and #2, “playing” is a participle and “playing soccer” is a participial phrase. In this case, with or without “while,” the phrase is used adverbially because it tells when or how the girl hurt her foot.

Although participles and gerunds are both forms of verbs, they act differently.

A gerund is a verb form ending in “-ing” that functions as a noun. Here’s an example of “playing soccer” as a gerund phrase: “Her favorite pastime is playing soccer.” (Or, conversely, “Playing soccer is her favorite pastime.”)

Participles come in two varieties. Past participles generally end in “-ed” (like “played”), and present participles end in “-ing” (like “playing”).

Participles can function as adverbs (“She hurt herself playing”), adjectives (“She hurt herself on the playing field”) or parts of verbs (“She was playing”).

We should mention here that over the years, some grammarians have drawn a distinction between different kinds of “-ing” adjectives. They regard some as participles and some as gerunds, depending on their function.

For example, George O. Curme, in A Grammar of the English Language (Vol. I), says “sleeping” is used adjectivally as a participle in the phrase “sleeping children” but as a gerund in the phrase “sleeping quarters.”

Why? Because in the first phrase, “sleeping” tells us what the children are DOING; in the second, it tells us what the quarters are FOR. So Curme would call “playing” a gerund in the phrase “playing field,” while we choose to call it a participle.

Some other grammarians draw no distinction one way or the other. The Cambridge Grammar of the English Language would refer to both as “gerund-participles.” It maintains that there’s “no viable distinction” to be made.

By the way, English got the word “participle” from Old French, according to the Chambers Dictionary of Etymology, but the ultimate source is the Latin participium, which literally means a sharing or a partaking.

The Oxford English Dictionary explains that the Latin term was used to refer to “a non-finite part of a verb” that shares “some characteristics of a verb and some of an adjective.”

Chambers says the word “gerund” comes from the Late Latin gerundium, which is a bit of a mishmash. It combines the first part of the Classical Latin gerundum, the gerund form of gerere (to bear or carry), with the ium ending of participium.

We’ve written before about gerunds, including a posting a year ago that explores the differences between gerunds and participles.

Check out our books about the English language

The Grammarphobia Blog

Hear Pat live today on WNYC

She’ll be on the Leonard Lopate Show around 1:15 PM Eastern time to discuss the English language and take questions from callers. If you miss the program, you can listen to it on Pat’s WNYC page.

Check out our books about the English language

The Grammarphobia Blog

Neck of the woods

Q: I’ve heard the phrase “neck of the woods” many times, and just accepted it. But when I heard it again the other day, I started wondering. Why do people refer to their neighborhood as their neck of the woods?

A: Several hundred years ago, early American settlers used the word “neck” to describe a narrow stretch of wood, pasture, meadows, and so on.

Our expression “neck of the woods,” according to the Oxford English Dictionary, is a surviving remnant of that old usage.

The use of “neck” to describe a narrow piece of land was of course an extension of the anatomical term “neck”—that narrow stretch located between the head and the shoulders.

The original word dates back to the 800s (it was first recorded in Old English as hneccan), and comes from old Germanic sources.

Since the 14th century, people have used “neck” to describe a variety of things that were narrow or constricted, like the top of a bottle, a mountain pass, an inlet of water, the fingerboard of a stringed instrument, and so on.

So the early colonists were merely carrying on a tradition when they used “neck” to describe a narrow piece of land. (You might say they weren’t sticking their necks out.)

The usage was first recorded in colonial property deeds.

The Oxford English Dictionary’s earliest citation is from a document written in Dedham, Mass., in 1637: “Graunted to Samuell Morse yt necke of medowe lying next unto ye medowes graunted unto Edward Alleyn.”

Here’s another example, from Providence, R.I., in 1699: “A percell of Meadow which … is scituate in a neck of Meaddow on the north side of Pautuxett River.”

In modern usage, the OED says, “neck of the woods” can mean “a settlement in wooded country, or a small or remotely situated community.” But more generally, it means “a district, neighbourhood, or region.”

And when people speak of “this neck of the woods,” they mean “around here” or “in this vicinity.”

Check out our books about the English language

The Grammarphobia Blog

The earwig in fact and fiction

Q: How did earwigs get their name and is there any truth to the belief that they like to crawl into people’s ears to lay their eggs?

A: Before we get to the etymology, let’s clear up the entomology.

It’s a myth that earwigs lay their eggs in human ears. And it’s an even yuckier myth that they bore into human brains to lay their eggs, driving the poor hosts crazy.

Yes, earwigs like to hang out in moist, dark places, and ear canals fit that description.

But the entomologist May Berenbaum says she knows of only “one single reference in about ten centuries of literature to an earwig actually being found in an ear.”

In The Earwig’s Tail, her 2009 book about mythological bug stories, she suggests that the belief in the earwig’s attraction to human ears may have its roots in Roman times.

“Like so much entomological misinformation,” she writes, “the notion that earwigs infect ears may have originated with Pliny the Elder, first-century polymath who, among other things, believed that caterpillars originate from dew on radish leaves.”

Berenbaum cites Pliny’s advice in Historia Naturalis that if “an earwig … be gotten into the eare … spit into the same and it will come forth anon.” (We’re using the same 1601 translation that Berenbaum quotes.)

Now, on to the etymology. The modern word “earwig” comes from an Old English term, earwicga, a compound of words for ear and an insect of some sort. The two earliest written examples in the Oxford English Dictionary date from around 1000.

One of those citations, from the Old English Leechdoms, an Anglo-Saxon medical work, includes a cure in which a blade of grass or straw is used to drive an earwig out of the ear.

So, the myth about earwigs and ears was alive and well in Anglo-Saxon times, and we suspect that it played a role in the naming of the insect itself.

An etymology note in The American Heritage Dictionary of the English Language (5th ed.) says wicga, the second part of the Old English word, is a member of the same family of words that has given us “wiggle” and “wag.”

“This group of terms,” American Heritage adds, “denotes quick movements of various sorts and the prehistoric ancestor of the Old English word wicga probably meant something like ‘wiggler.’ ”

Although there’s no truth to the belief that earwigs inhabit ears, many other languages have similar terms for the insect, according to Berenbaum, head of the entomology department at the University of Illinois.

The French call the earwig perce-oreille (ear piercer), the Germans Ohrwurm (ear worm), the Russians ukhovertka (ear turner), and so on.

Since earwigs don’t inhabit human ears or brains, where do they hang out?

In a garden, you’ll find them in clumps of mulch and bark. In a house, you’ll find them in cracks and crevices. That is, if you really want to look!

A final note: Berenbaum says an earwig’s hind wings look a lot like human ears when unfolded. But she doesn’t buy that as an explanation for the insect’s name. And neither do we.

Check out our books about the English language

The Grammarphobia Blog

Need a PhD to pluralize “master’s degree”?

Q: I’ve been asked to edit some policy papers for the graduate dean at the university where I teach. My inclination is to change “Masters degree” to “Master’s degree.” Do you have an opinion? Also, after reading through some rather convoluted arguments on the Web, I’m at sea over how to punctuate “I have two Master’s degrees.” Do you have a clear explanation of the issue?

A: On our blog, we follow The Chicago Manual of Style (16th ed.), and use the apostrophe: “master’s degree.”

We also lowercase “master’s” when used generically (as in “My son is working on a master’s degree”).

Note that when you pluralize the phrase as a whole, only “degree” gets the plural “s.” The possessive adjective “master’s” doesn’t itself become plural.

So that sentence you’re editing should be written this way: “I have two master’s degrees.”

This practice is also followed in publications of the Modern Language Association.

For example, here’s a sentence from a paper published by the MLA last June (“Rethinking the Master’s Degree in English for a New Century”): “Across all fields of study, the number of master’s degrees granted between 1980–81 and 2007–08 increased 111%, from 295,731 to 625,023.”

We wouldn’t be surprised if the master’s and bachelor’s degrees lost their apostrophes someday, just as the abbreviations—MA and BA—have lost their periods.

Punctuation does tend to fall away over time, as with USA, MD (for medical doctor), NFL, and so on.

But for now, the apostrophe is still used with those degrees in the Chicago Manual as well as in standard dictionaries, including The American Heritage Dictionary of the English Language (5th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.).

Check out our books about the English language

The Grammarphobia Blog

Gone fishing

Q: In a recent posting, you note that we still use “be” as an auxiliary with some verbs of motion, like “go” and “grow.” Then you add: “So today we can say either ‘he is gone’ or ‘he has gone,’ ‘they are grown’ or ‘they have grown.’” To me, there’s a subtle distinction in emphasis, if not meaning, between “he is gone” and “he has gone.” Am I off base?

A: No, you’re right on base. There’s a difference between “he is gone” and “he has gone,” and between “they are grown” and “they have grown.” That’s why both forms—with “be” and “have”—are still in the language. They’re both useful.

To explain, we have to back up a bit. As we said in our earlier posting, many verbs originally had some form of “be” (like “is,” “am,” or “are”) as their auxiliary.

This was true of verbs of motion including “come,” “go,” “rise,” “fall,” “grow,” “depart,” “return,” and others. These verbs once had “be” as their auxiliary, not “have.”

This accounts for old usages like “he is come” (for “he has come”), “Troy is fallen” (for “Troy has fallen”), and “we are lately returned” (for “we have lately returned”).

With most of those verbs, the old “be” forms have long since been dropped and the modern auxiliary is “have.” But the “be” forms have been retained in some poetic and religious usages (“He is risen,” “the Lord is come,” “miracles are not ceased”).

In the case of “go” and “grow,” they too have adopted “have” as their auxiliary verb. But they’ve kept the old “be” too—with a difference.

In modern usage, “is gone” and “are grown” are no longer construed as perfect tenses. Instead, “gone” and “grown” are interpreted as adjectives.

So the old forms are still here, but with new meanings.

In a previous blog item about the difference between “he is gone” and “he has gone,” we quoted the grammarian Otto Jespersen:

“While he has gone calls up the idea of movement, he is gone emphasizes the idea of a state (condition) and is the equivalent of ‘he is absent.’ ”

In summary, “gone” and “grown” are past participles of “go” and “grow.” But in modern English, when they’re used with “be” they’re adjectives. If you want to get technical about it, they’re past participles, used predicatively as adjectives.

We hope this sheds some light. And as for “gone fishing” (the title of this blog item), we discuss the expression briefly in a posting about the lyrics of the Pink Floyd song “The Trial.”

Check out our books about the English language

The Grammarphobia Blog

A bona fide boner

Q: Does “bona fide” require a hyphen? In Woe Is I, I read these two phrases: “a bona fide pebble” (on page 160) and “bona-fide adverbs” (on page 221). Is there a difference?

A: You found a style mistake in the new third edition of Pat’s grammar and usage book!

The adjectival phrase “bona fide,” according to standard dictionaries, should not have a hyphen.

When Pat gets a chance to do a fourth edition of the book, this error on page 221 will be fixed.

For readers of the blog who don’t have the latest edition of Woe Is I, “bona fide” first shows up in a section about the pronunciation of English words and phrases that come from foreign languages:

“BONA FIDE. This means ‘genuine’ or ‘sincere’ (it’s Latin for ‘good faith’). There are several ways to say it, but the most common is also the most obvious: BONE-uh-fied. ‘Veronica owns a bona fide pebble from Graceland.’ ”

The second appearance is in “The Living Dead,” a chapter about bogus or dead rules. In the interest of laying them to rest, a tombstone is dedicated to each. Here’s the item with the surplus hyphen:

TOMBSTONE: Don’t say ‘Go slow’ instead of ‘Go slowly.’

R.I.P. Both slow and slowly are legitimate adverbs. In fact, slow has been a perfectly acceptable adverb since the days of Shakespeare and Milton.

“Adverbs can come with or without ly, and many, like slow and slowly, exist in both forms. Those without the tails are called ‘flat adverbs,’ and we use them all the time in phrases where they follow a verb: ‘sit tight,’ ‘go straight,’ ‘turn right,’ ‘work hard,’ ‘arrive late,’ ‘rest easy,’ ‘aim high,’ ‘play fair,’ ‘come close,’ and ‘think fast.’ Yes, straight, right, hard, and the rest are bona-fide adverbs and have been for many centuries.”

If you’d like to read more about flat adverbs, we had a posting about them on the blog last year.

And in case you’re curious about “bona fide,” it entered English in the 16th century as an adverbial phrase meaning “in good faith, with sincerity; genuinely,” according to published references in the Oxford English Dictionary.

The dictionary’s earliest citation, dated 1542-43, is from a parliamentary act during the reign of Henry VIII: “The same to procede bona fide, without fraude.”

The phrase, which comes from the adverbial Latin for “in good faith,” was first used adjectivally in a 1788 essay by John Joseph Powell: “Act not to extend to bona fide purchasers for a valuable consideration.”

Thanks for catching that error.

Check out our books about the English language

The Grammarphobia Blog

A likely story: “like” vs. “such as”

Q: I’ve heard that one should use “like” for comparisons and “such as” for examples, but everyone I know uses “like” for examples as well. What’s the story?

A: Respected writers have been using the preposition “like” in the sense of “such as” since at least the early 1800s. And as far as we can tell, no language authority objected to this usage until the second half of the 20th century.

Since then, a handful of commentators have criticized the usage for one reason or another. But other usage authorities have either ignored the issue or pooh-poohed the objections.

Count us among the pooh-poohers.

American and British lexicographers, the people who keep track of how English is actually used, agree with us that one standard meaning of the preposition “like” is “such as.”

We checked a dozen standard dictionaries published on both sides of the Atlantic and they were unanimous on this point.

The American Heritage Dictionary of the English Language (5th ed.), for instance, gives this example of the usage: “saved things like old newspapers and pieces of string.”

And the Cambridge Dictionaries Online gives this one: “She looks best in bright, vibrant colours, like red and pink.”

When the ancestors of “like” and “such as” entered English in Anglo-Saxon times, the meanings of the two terms were pretty much alike, according to citations in the Oxford English Dictionary.

The Old English source of “like” (gelic) meant “like one another, similar, of identical form or character,” while the Old English ancestor of “such as” (swelce swa) meant “of the kind or degree that; the kind of (person or thing) that.”

If anything, the earliest ancestor of “like” was more specific and suggested an example while the earliest ancestor of “such as” was less specific and suggested a comparison.

It wasn’t until the late 17th century, according to OED citations, that “such as” took on the sense of “for example.”

Here’s an early usage from A History of the Earth and Animated Nature, a 1795 work by Oliver Goldsmith: “All of the cat kind, such as the lion, the tiger, the leopard, and the ounce.”

Not long after Goldsmith wrote that, other writers began using “like” in the same way, according to published references collected by the language researcher Mark Israel with the help of the Merriam-Webster editorial department.

Here are a couple of examples from Jane Austen’s novels:

“Good sense, like hers, will always act when really called upon,” Mansfield Park (1814).

“A straightforward, open-hearted man, like Weston, and a rational unaffected woman, like Miss Taylor, may be safely left to manage their own concerns,” Emma (1816).

And here’s an example from Charles Darwin: “to argue that because a well-stocked island, like Great Britain, has not, as far as is known” (On the Origin of Species, 1859).

The OED—from its earliest “like” entry, published in 1903, to its latest online entry—has consistently said “like” often has the sense of “such as.”

(The earliest entry was published in a fascicle, or book part, before the first edition was completed or even called the OED.)

The dictionary’s first citation for the usage is from an 1886 letter by Robert Louis Stevenson: “A critic like you is one who fights the good fight, contending with stupidity.”

OK, lexicographers like the usage, but what about usage authorities?

Well, Henry Fowler, the language maven’s language maven, certainly didn’t see anything wrong with using “like” this way.

In the 1911 first edition of the Concise Oxford Dictionary of Current English, which Fowler edited with his brother, Francis, one meaning of “like” is listed as “resembling, such as.”

As an example of the usage, the Fowlers give “a critic like you,” and say “like” is being used to mean “of the class that you exemplify.” Yup, as an example!

Interestingly, neither the original 1926 edition of Henry Fowler’s A Dictionary of Modern English Usage nor the 1965 second edition, edited by Sir Ernest Gowers, cites any problem with using “like” in the sense of “such as.”

It’s not until the third edition, edited by Robert Burchfield in 1996 and 1998, that an eyebrow is raised about the usage. Burchfield says the use of “like” for “such as” is sometimes questioned because of possible ambiguity.

As an example, he says the title of Kingsley Amis’s 1960 novel Take a Girl Like You could be read as referring to the girl herself or a girl resembling her.

We think that he’s nitpicking and that it would be silly to use a clunky title like Take a Girl Such as You to help the one reader in a million who might misread the original. And remember, Henry Fowler himself used “a critic like you” as an example of proper usage.

So where did Burchfield, a pretty tolerant language guy, get the idea that the use of “like” for “such as” may be confusing?

We can’t ask him, since he died in 2004, but we assume he was influenced by the few objections raised in the second half of the 20th century to a usage that had passed without notice since the early 1800s and perhaps earlier.

Merriam-Webster’s Dictionary of English Usage has a half-page entry on how some language mavens blew this issue of ambiguity out of proportion in the 1960s, ’70s, and ’80s.

Wilson Follett appears to be the first language authority to write about the “shade of difference” between “such as” and “like” used in this sense.

In Modern American Usage (1966), Follett says the two terms “may often be interchanged,” but “such as leads the mind to imagine an indefinite group of objects” while “like” suggests “a closer resemblance among the things compared.”

Because of “this extremely slight distinction,” he says, some critics may object to the phrase “a writer like Shakespeare” on the ground that no writer is like Shakespeare.

He adds, however, that “context usually makes clear what the comparison proposes to our attention. Such as Shakespeare may sound less impertinent, but if Shakespeare were totally incomparable such as would be open to the same objection as like.”

A few years after Follett’s book came out, another language authority, Theodore M. Bernstein, made light of the issue and used Beethoven instead of Shakespeare as an example.

In Miss Thistlebottom’s Hobgoblins, Bernstein’s 1971 book about language myths and misconceptions, he writes that “only some nit-pickers object to saying, ‘German composers like Beethoven.’ ”

Interestingly, both Follett and Bernstein seem to feel that “like” may be somewhat more specific than “such as”—that is, “like” may suggest an example and “such as” a similarity.

Leslie Sellers, in Keeping Up the Style (1975), appears to be the first language writer to suggest that “such as” should refer to examples and “like” to similarities.

H. Ramsey Fowler and Quentin L. Gehle then picked up the idea in The Little, Brown Handbook (1980), followed by James Kilpatrick, in Reflections on the Writing Art (1993), and a few other commentators.

After reviewing these “rather diverse opinions,” Merriam-Webster’s concludes that there’s no agreement on standard usage here and that “the issue of ambiguity, which evidently underlies the opinion of those who urge the distinction, is probably much overblown.”

The usage guide goes on to list eight 20th-century examples of “like” used for “such as,” including a 1956 letter in which Flannery O’Connor refers to “reading someone like Hemingway,” and two books in which language mavens use “like” this way:

Words on Paper (1960), by Roy H. Copperud: “Phrases like three military personnel are irreproachable and convenient.”

American English Today (1985), by Hans P. Guth: “Avoid clipped forms like bike, prof, doc.”

In none of the examples, the M-W editors add, “can you detect any ambiguity of meaning, either as they are written with like or as they would read if you substituted such as.”

In summary, most English speakers don’t recognize a distinction between these two terms, and the few usage writers who believe in a distinction can’t agree on what it is.

We use both “like” and “such as” to mean “for example,” though Pat considers “such as” a bit stuffy and uses it less than Stewart.

What do we do when we want to emphasize that we’re referring to an example? We simply use “for example” or “including” or a similar term:

“The writers we reread the most—for example, Jane Austen, Anthony Trollope, P. G. Wodehouse, and Angela Thirkell—all have a sense of humor.”

Finally, if you’re up for reading more about “like,” Pat wrote an article for the New York Times Magazine in 2007 about its use in “She’s like, ‘No way,’ ” and a blog post that same year about its use in “Winston tastes good like a cigarette should.”

Check out our books about the English language

The Grammarphobia Blog

When “for ever” isn’t forever

Q: I seem to recall reading somewhere that “forever” means continually and “for ever” means eternally. I checked my dictionary and it only has the one-word version. Is there really a difference or is the one-word version enough for both senses?

A: In American English, the one-word version is the only version for the adverb meaning continually, incessantly, or eternally.

The American Heritage Dictionary of the English Language (5th ed.), Merriam-Webster’s Collegiate Dictionary (11th ed.), and all the other standard US dictionaries we checked agree on this.

In British English, the situation isn’t quite so simple.

The Oxford English Dictionary says the two-word version can mean eternally as well as continually or incessantly, but it has a half-dozen citations, beginning as far back as the 17th century, for the one-word version used in both senses.

The original 1926 edition of Fowler’s Modern English Usage doesn’t mention the issue, but the 1965 second edition insists on the two-word version.

But, wait, the latest Fowler’s (the revised third edition) says the one-word version means continually or persistently and the two-worder means eternally—except in the US, where one word can do for all those senses.

The lexicographers at standard British dictionaries, however, don’t generally buy that arbitrary approach.

The Cambridge Dictionaries Online, the Longman Dictionary of Contemporary English, and the other British dictionaries we checked list “forever” and “for ever” for all senses in their British editions. And the one-word version is listed first.

In other words, the British seem to be coming around to the American usage here.

In fact, Garner’s Modern American Usage (3rd ed.) says “the solidified version has become standard in both AmE and BrE, and the two-word version is best described as archaic.”

The two-word version, according to OED citations, is by far the oldest, first showing up around 1300 in Cursor Mundi, a Middle English poem: “This folk … that suld vs serue for euer and ai” (“This folk … that should us serve for ever and always”).

The one-word version first appeared in a 1670 satire by John Eachard that expressed “honest and hearty wishes that the best of our Clergy might forever continue as they are.”

By the 1800s, however, sticklers were complaining about the one-word version. We’ll end with an excerpt from “Forever,” a poem by Charles Stuart Calverley, one of the 19th-century complainers:

Forever; ’tis a single word!
Our rude forefathers deem’d it two:
Can you imagine so absurd
A view?

Forever! What abysms of woe
The word reveals, what frenzy, what
Despair! For ever (printed so)
Did not.

Check out our books about the English language

The Grammarphobia Blog

A room with a view

Q: Let’s talk about pronunciation: ga-ZEE-bo or GAZE-bo, what do you think? Anxious to hear your thoughts on this.

A: The verb “gaze” may have something to do with the origin of “gazebo,” but not with its pronunciation. The word “gazebo” has three syllables, not two.

It can be pronounced as ga-ZEE-bo or ga-ZAY-bo, according to The American Heritage Dictionary of the English Language (5th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.).

The Oxford English Dictionary, however, gives only one pronunciation: ga-ZEE-bo.

The OED’s earliest citation for the word is from a 1752 publication about the design of Chinese bridges, temples, arches, and so on. One example was described as “The Elevation of a Chinese Tower or Gazebo.”

But an enterprising word sleuth, Stephen Goranson, recently discovered an earlier citation, from 1741.

Writing on the American Dialect Society’s Linguist List, Goranson said he found the word in a poem by Wetenhall Wilkes. We’ll give an excerpt:

“Unto the painful summit of this height / A gay Gazebo does our Steps invite. / From this, when favour’d with a Cloudless Day, / We fourteen Counties all around survey. / Th’ increasing prospect tires the wandring Eyes: / Hills peep o’er Hills, and mix with distant Skies.”

In modern usage, a gazebo is a freestanding, roofed structure that’s generally open on the sides, similar to a summerhouse or belvedere.

But in the 18th and 19th centuries, a gazebo could also be a part of a house, like a projecting window or balcony, or a roof turret affording distant views.

Where did the word come from? One common theory is that “gazebo” is a quasi-Latin coinage. As the OED says, it’s “commonly explained as a humorous formation” on the verb “gaze.”

According to this theory, “gazebo” would be translated as “I shall gaze,” mimicking first-person future-tense Latin verbs ending in -bo, like videbo (“I shall see”), lavabo (“I shall wash”), placebo (“I shall please”), and so on.

But there’s another theory about the origin of “gazebo,” and that “Chinese Tower” mentioned above is a clue. Some of the early quotations, according to the OED, “suggest that it may possibly be a corruption of some oriental word.”

Ultimately, however, the true origin remains a mystery.

Check out our books about the English language

The Grammarphobia Blog

What’s your weekend look like?

Q: The other day I got a brochure in the mail with a cover that read, “What’s your weekend look like?” Yikes! So embarrassing!

A: The writer of that brochure used “what’s” as a casual or informal contraction of “what does.” But in standard written English, “what’s” is normally a contraction of either “what is” or “what has” (as in “What’s your name?” or “What’s he done now?”).

In written English, the verb “do” is normally contracted only with “not”—in “don’t,” “didn’t,” and “doesn’t.” It’s not usually contracted with a pronoun (like “what”).

When people use “what’s” to mean “what does”—shrinking “does” to an apostrophe and “s”—they’re more likely to be talking than writing.

In discussing contracted forms of the verb “do,” Sidney Greenbaum writes in the Oxford English Grammar: “The contracted form ’s is only occasionally found in writing: Who’s she take after?, What’s he say? It is more common in informal speech.”

In another oral contraction we sometimes hear, “did” is reduced to an apostrophe and “d,” as in “What’d they say?” This too is often heard in speech but very seldom used in writing (except written dialogue).

So we don’t think this use of “what’s” belongs in good writing, unless you’re deliberately trying to sound colloquial, and it probably shouldn’t have appeared in a brochure seeking business. But we don’t consider it an egregious error, and it’s fairly common in casual conversation.

Check out our books about the English language

The Grammarphobia Blog

Assume, presume, and exhume

Q: It occurred to me this morning that “assume” and “presume” are very close in meaning, especially when taking something to be true, and they present, at least to me, a bit of a semantic problem. Am I the only one who finds these verbs confusing?

A: No, you’re not the only one who finds them confusing. In fact, Pat has included them in a section on confusing pairs of words in her grammar and usage book Woe Is I.

Here’s the way she described them:

“ASSUME/PRESUME. They’re not identical. Assume is closer to ‘suppose,’ or ‘take for granted’; the much stronger presume is closer to ‘believe,’ ‘dare,’ or ‘take too much for granted.’ I can only assume you are joking when you presume to call yourself a plumber!

“NOTE: Presume in the sense of ‘believe’ gives us the adjective presumptive. And presume in the sense of ‘take too much for granted’ gives us the adjective presumptuous. As her favorite nephew, Bertie was Aunt Agatha’s presumptive heir. Still, it was presumptuous of him to measure her windows for new curtains.”

Now let’s exhume a few ancestors. You’re right in thinking that these words are closely connected.

Both have at their roots the Latin verb sumere (to take), so both have to do with taking something—a fact, a thing, or whatever—to oneself.

As the Oxford English Dictionary explains, “assume,” which showed up in English in the 15th century, comes from the Latin verb adsumere “to take to oneself, adopt, usurp.” Here, the Latin prefix ad- means “to.”

The older and more forceful “presume,” first recorded in the 14th century, comes from the Latin verb praesumere, in which the prefix prae- means “before.”

In classical Latin, the OED says, praesumere meant “to consume beforehand, to take upon oneself beforehand, to anticipate, to take for granted, presuppose, assume.”

Later, in post-classical Latin, the word also meant to be arrogant, to rely on, to expect, to take the liberty, to dare, and to claim.

In your question, you mention a specific use of these words—taking something to be true. Depending on the strength of your conviction, you might either “assume” or “presume” that something is true.

In this sense of the word, the OED says, “assume” means “to take for granted as the basis of argument or action,” or “to suppose.”

Here “presume” has a very similar, though stronger meaning: “to assume; to take for granted; to presuppose; to anticipate, count upon, or expect (in early use with a suggestion of overconfidence).”

And in law, according to the OED, “presume” has a specific meaning: “to take as proved in the absence of evidence to the contrary.”

Check out our books about the English language

The Grammarphobia Blog

Palimpsestuous

Q: I suspect you’ve already heard this from other WNYC listeners: Pat might want to recheck the definition of “palimpsest” that she gave during her January appearance on the Leonard Lopate Show.

A: Yes, the definition of “palimpsest” that Pat mentioned on the air isn’t the one found in standard dictionaries, though some writers have used the term figuratively since the 19th century in the way she did.

Pat was referring to crossed (or cross) writing, an old practice in which a letter writer who was poor or frugal would fill a page of paper with writing, then turn it sideways and fill the page again with text running perpendicularly to the original.

The term “palimpsest,” as you point out, refers to a very different way of conserving writing material.

In bygone days, when documents were written on sturdy stuff like parchment or vellum, the writing could be at least partially scraped or rubbed away so the material could be reused. Documents made of more fragile papyrus were sometimes washed and used again too.

Such a recycled document is called a “palimpsest,” and sometimes the ghost of the old writing can be seen beneath the new.

Although writing material has been recycled since classical times, the term “palimpsest” didn’t show up in English until the early 19th century, according to citations in the Oxford English Dictionary.

A bit later in the 19th century, the term was extended to various figurative uses, including multilayered records. The OED doesn’t specifically mention cross-written letters in its citations for “palimpsest,” but the usage can be found in 19th-century texts.

For example, Sir Philip Grey Egerton, an English paleontologist, uses the term in this sense in an 1869 family history that refers to “letters themselves converted into palimpsests by cross writing.”

This practice was extremely common among writers who wanted to save on postage, paper, or both. The poet Keats and the novelist Jane Austen, to mention two examples, were known to have written letters like this.

Such letters were of course a real challenge to read! Charles Dodgson, otherwise known as Lewis Carroll, invented an appropriate proverb in a booklet he wrote on letter-writing in 1888: “Cross-writing makes cross reading.”

True, the intention with the ancient palimpsests was to obscure the old writing, while 19th-century letter writers intended that both layers of writing would be legible.

Here are the OED’s three definitions of “palimpsest” (from the Greek for “scraped again”):

(1) “Paper, parchment, or other writing material designed to be reusable after any writing on it has been erased.” This meaning is now obsolete.

(2) “A parchment or other writing surface on which the original text has been effaced or partially erased, and then overwritten by another; a manuscript in which later writing has been superimposed on earlier (effaced) writing.”

(3) “In extended use: a thing likened to such a writing surface, esp. in having been reused or altered while still retaining traces of its earlier form; a multilayered record.”

We’ll end with the OED’s earliest extended use of “palimpsest,” from an essay by Thomas De Quincey in the June, 1845, issue of Blackwood’s Edinburgh Magazine: “What else than a natural and mighty palimpsest is the human brain?”

Check out our books about the English language

The Grammarphobia Blog

Eating with the fishes

Q: Just read your blog about “fishes.” Are you aware of an Italian Christmas Eve tradition known as the Feast of the Seven Fishes? My husband and I attended one at a huge restaurant in South Jersey. It was served banquet style, minimum of seven fish courses. I lost track of the number of species—something like eleven. I remember fried smelt, fish stew, clams casino, shrimp, baked cod with tomatoes, stuffed flounder, on and on. It lasted for hours, and ended with limoncello and biscotti.

A: We might have known there would be a South Jersey angle (not to mention an Italian one) in that blog item. As we noted, both “fish” and “fishes” are proper plurals, with “fishes” usually referring to more than one species of fish.

No, we hadn’t heard about the Feast of the Seven Fishes. But we found out more about it in a 1987 article about the feast by the New York Times writer Craig Claiborne.

The seven dishes, according to Claiborne, are for the seven Roman Catholic sacraments. (A little googling offers a few other explanations.) Each dish uses a different main ingredient or is cooked in a different way: broiled, fried, baked, and so on.

In his Times article, Claiborne offers recipes for a Feast of Seven Fishes served by a friend of his whose parents came to the US from a small fishing village in Italy.

The tradition you enjoyed in South Jersey apparently originated in southern Italy and isn’t known in some other Italian regions.

In fact, Claiborne’s friend said his parents only began serving the seven dishes after moving to the US. They picked it up from neighbors in Waterbury, CT, who were immigrants from southern Italy.

By the way, you might be interested in our posting on “sleeping with the fishes.” And something tells us that we may find ourselves eating with the fishes in South Jersey next Christmas Eve!

Check out our books about the English language

The Grammarphobia Blog

Religious orientation

Q: Your recent “orientation” post got me to wondering. Jews of Europe (and America) pray to the east, the direction of Jerusalem (roughly), as do Christians of Europe, where the standard church orientation is with the chancel facing east and the faithful facing east toward the altar. Muslims originally prayed toward Jerusalem, before changing to Mecca. Does the need to orient oneself come from these religious practices?

A: There is indeed a religious dimension to the noun and verb “orient,” but let’s begin with some geography.

In classical Latin, oriens meant “the eastern part of the world, the part of the sky in which the sun rises, the east, the rising sun, daybreak, dawn,” in the words of the Oxford English Dictionary.

And when the noun “orient” entered English in the 14th century, it was “originally used with reference to countries lying immediately to the east of the Mediterranean or Southern Europe (i.e. east of the Roman Empire); now usually understood to mean East Asia, or occas. Europe or the Eastern hemisphere, as opposed to North America.”

Now for the religious significance. When the verb “orient” came along in the 18th century, it had as one of its meanings “to place or arrange (a thing or a person) so as to face the east; spec. (a) to build (a church) with the longer axis running due east and west, and the chancel or chief altar at the eastern end; (b) to bury (a person) with the feet towards the east.”

Here are the OED’s citations for this sense of the word:

1728, in Ephraim Chambers’s Cyclopædia: “In most Religions, particular Care has been taken to have their Temples oriented. St. Gregory Thaumaturgus is said to have made a Mountain give way, because it prevented the orienting of a Church he was building.”

1884, in an issue of the journal Science: “The coffins were of plank or stone, and were not oriented.”

1896, in The Classical Review: “The primitive Aryan in taking his bearings literally oriented himself and turned to the east.”

1993, in Joan E. Taylor’s book Christians and the Holy Places: The Myth of Jewish-Christian Origins: “The basilica is, like other Byzantine churches, oriented to the east.”

These practices, however, predated the use of the word “orient” in reference to them. Here’s an explanation we found in a 1907 edition of The Catholic Encyclopedia (we’re adding paragraph breaks for readability):

“The custom of praying with faces turned towards the East is probably as old as Christianity. The earliest allusion to it in Christian literature is in the second book of the Apostolic Constitutions (200-250, probably) which prescribes that a church should be oblong ‘with its head to the East.’ Tertullian also speaks of churches as erected in ‘high and open places, and facing the light.’ ”

Why did the custom develop? The encyclopedia goes on to explain: “The reason for this practice, which did not originate with Christianity, as given by St. Gregory of Nyssa … is that the Orient is the first home of the human race, the seat of the earthly paradise. In the Middle Ages additional reasons for orientation were given, namely, that Our Lord from the Cross looked towards the West, and from the East He shall come for the Last Judgment.”

The writer goes on to say that “the existence of the custom among the pagans is referred to by Clement of Alexandria, who states that their ‘most ancient temples looked towards the West, that people might be taught to turn to the East when facing the images.’”

In discussing church construction, the encyclopedia adds: “The form of orientation which in the Middle Ages was generally adopted consisted in placing the apse and altar in the Eastern end of the basilica. A system of orientation exactly the opposite of this was adopted in the basilicas of the age of Constantine. … Thus, in these cases the bishop from his throne in the apse looked towards the East.”

In practice, however, ecclesiastical architects found that prior construction, street arrangements, and terrain often interfered with strict adherence to the rules of orientation. Not everybody could move a mountain!

As for Jews, rabbinical opinion has generally held that those living outside the Land of Israel should pray toward the Holy Land, and those living in Israel should pray toward Jerusalem.

Most Jews of the Diaspora live to the west of Israel and thus pray to the east. From what we’ve read, ancient synagogues usually conformed to this tradition and may have influenced the orientation of Christian churches.

Check out our books about the English language

The Grammarphobia Blog

Well, clutch the pearls!

Q: This is from a posting on Care2 about funding for the Senate campaigns of Elizabeth Warren and Scott Brown: “Plenty of politicians pearl-clutch over the impact of Citizens United but complain that they are helpless to do anything about it.” Can you explain the phrase “pearl-clutch”?

A: The verbal phrase “pearl-clutch” and several similar expressions, including “pearl clutching,” “clutch my pearls,” and “clutch the pearls,” are often accompanied by a gesture suggesting the clutching of an imaginary string of pearls.

These expressions, with or without the gesture, usually refer to surprise or shock of one sort or another—real shock, mock shock, amused shock, awed shock, or shock over something that’s not considered really shocking.

But this is an evolving usage and in the posting you cited from the social network Care2, the lawyer-writer Jessica Pieklo uses “pearl-clutch,” minus the shock, to mean carry on or grumble about something.

Pieklo praises Warren and Brown for trying to reduce the impact of the Citizens United case on their Senate race, but then says (in the sentence you cite) that a lot of other politicians just kvetch about the Supreme Court ruling without doing anything about it.

You won’t find any of these relatively new pearl-clutching expressions in the Oxford English Dictionary, standard dictionaries, or general slang reference books—at least not yet.

But you can find them in some works on gay slang, including Speaking in Queer Terms (2003), edited by William L. Leap and Tom Boellstorff, a collection of essays about the globalization of gay men’s English.

So did these pearl-clutching expressions originate among gay men?

Well, Speaking in Queer Terms gives several examples of gay men using them in the sense of shock, surprise, and awed admiration, but it doesn’t include any dates.

However, a gay character on the Fox TV show In Living Color is responsible for the earliest example of the usage mentioned in discussions over the last six months on the American Dialect Society’s Linguist List.

In an April 15, 1990, sketch, the flamboyant cultural critic Blaine Edwards (played by Damon Wayans) gushes over how daring the producers were to cast a male actor as the female lead in Dangerous Liaisons.

When told that Glenn Close is actually a woman, Edwards squeals in mock shock and says, “Well, clutch the pearls! What a sneaky thing to do.”

Of course people have been literally clutching their pearls in shock or otherwise for a long time. Here, for example, is a citation from a 1910 issue of Chambers’s Journal, a weekly magazine that published fiction and nonfiction:

“Without being aware that I had stirred, I found myself close to the table. I drew a gasping breath, and my hand went out without any conscious volition and clutched the pearls.”

Although the usual image is of a shocked woman doing the pearl-clutching, we found a couple of references in Google Books to male clutching, including a strange one from Arabian Antic, a 1938 book by Ladislas Farago about Jews in the Arab world.

In a chapter on Yemen, Farago describes a rabbi whose “skinny fingers clutched the pearls of a rosary” while “his withered lips sucked the tube of a gigantic” hookah.

Check out our books about the English language