The Grammarphobia Blog

In the beginning were the words

Q: Several years ago, I read a story about research being done on the “original language.” I have tried, to no avail, to find any reference to the subject. Of course I have no idea where I read it, or anything else about it, just that it was very interesting. Do you have any suggestions?

A: The article may have been referring to the prehistoric Proto-Indo-European language, which was spoken “in an as yet unidentified area between eastern Europe and the Aral sea around the fifth millennium B.C.,” according to The American Heritage Dictionary of Indo-European Roots, edited by Calvert Watkins.

The languages of the Indo-European family – including English and the other Germanic languages, the Celtic languages, Latin and its descendants, Greek, the Balto-Slavic languages, Sanskrit, the Iranian languages, and many others – are all descended from prehistoric Proto-Indo-European.

Of course, many, many more languages and language groups fall outside the Indo-European family: Arabic, Hebrew, ancient Egyptian, Finnish, Hungarian, Turkish, Mongolian, the Dravidian languages of southern India, Korean, Japanese, the Chinese languages, Polynesian, Amerindian, Eskimo-Aleut, and on and on.

While some linguists like to speculate that everything ultimately came from one pot – some “Proto-World” language that may have existed 100,000 years ago – that at the moment is just a leap of the imagination.

Buy Pat’s books at a local store or Amazon.com.


Fun and games

(An updated post about “fun” and “so fun” appeared on the blog on July 25, 2014.)

Q: My husband and I have always said things like “That was such fun” or “This is so much fun.” Nowadays, I hear “That was so fun.” Is this new usage acceptable?

A: No, the usage isn’t acceptable, but it’s now so common that someday it just might be.

“Fun” is traditionally a noun (a thing, as in “We had fun”), not an adjective. So you usually wouldn’t use it as a modifier (“We had a fun day”). An exception would be when “fun” is a predicate nominative – a noun that follows a linking verb and renames the subject (“This is fun”).

Therefore, it would be OK to use “fun” in a sentence like “Skiing is fun,” but not in one like “We had a fun day on the slopes.”

For the same reason, the ubiquitous and annoying “so fun” is incorrect, but “so much fun” is not. If you mentally substitute a noun like “entertainment” you can see why. You wouldn’t say “so entertainment,” but you could say “so much entertainment.” Similarly you could say “This is entertainment” (predicate nominative).

I can’t say exactly when “so fun” came into being. (I don’t see any published references for the expression in the Oxford English Dictionary.) But I think it was probably inevitable.

People are used to putting “so” before predicate adjectives – that is, adjectives that describe the subject of a sentence (“This is so pink” or “This is so hard”). It’s natural, if not proper, that some people would want to put “so” before a noun that describes a subject (“This is so fun”). And if enough people do it, “fun” may become a legitimate adjective one day.

One more thing: Although “fun” is primarily a noun, it has been used since the 17th century as a verb meaning to make fun or to cheat (this usage is now obsolete), according to the OED.

Now, isn’t English fun!


It is what it is, or is it?

Q: I cringe every time I hear someone say, “It is what it is.” This phrase is ridiculous and it fails to add any substance to a conversation. It was bad enough when I heard it only from friends and students, but now I hear it on the news, even on NPR. What is its origin? And when will it disappear?

A: This annoying and insipid expression (and it’s older than you think) has gotten a lot of attention in the last few years, often from people who are just as irritated by it as you are.

Gary Mihoces, in a December 2004 article in USA Today, picked “It is what it is” as the No. 1 sports quote of 2004, but noted that it wasn’t new even then. “Although the origin is uncertain,” he wrote, “it has been around for years.”

In March 2006, William Safire devoted his On Language column in the New York Times Magazine to the mushrooming usage. The first published reference Safire could find came from a 1949 column by J.E. Lawrence in the Nebraska State Journal.

Here’s an excerpt from Lawrence’s column, describing the way pioneer life molded character: “New land is harsh, and vigorous, and sturdy. It scorns evidence of weakness. There is nothing of sham or hypocrisy in it. It is what it is, without apology.”

Safire likened the expression to Popeye’s “I yam what I yam.” If it was mushrooming two years ago, when he wrote about it, it’s a mushroom cloud today. But the online version of the Oxford English Dictionary still doesn’t have any citations for it.

In 2007, the Creative Group, a California company representing marketers and advertisers, surveyed clients nationwide and asked them to choose the most annoying and overused of their industry’s clichés. They chose “it is what it is,” along with “outside the box,” “synergy,” “low-hanging fruit,” and others.

If advertisers and marketers hate it, then its fate is probably sealed.


The hoodoo on “who do”

Q: Am I the only one bothered by the recent surge in the use of “who” to refer to non-human subjects? I haven’t noticed it in print but I hear it on radio and on TV – often by folks with otherwise impeccable grammar. Here’s an example: “The banks who did poorly were hit hard by the sub-prime mortgage crisis.” Is this usage acceptable? Or are the perpetrators just too lazy or unschooled to decide between “that” and “which”?

A: This is new to me. I’ve never heard anyone refer to a lending institution as a “who.” It certainly isn’t acceptable. “Who” is restricted to people and sometimes pets. For example, The New York Times Manual of Style and Usage recommends that animals be referred to with “he,” “she,” and “who” (instead of the impersonal “it” or “that”) if the animal’s name or sex is mentioned.

Your theory about the source of this boo-boo – confusion over “that” and “which” – may be right. The same thing happens regularly with people who can’t decide between “I” and “me,” and decide to play it safe (they think) and go with “myself.”

I often hear from people who have the opposite complaint, and tell me they’re bothered when “that” is used to refer to people and not objects. In fact, this is perfectly good English. A person can be a “who” or a “that.”

The American Heritage Dictionary of the English Language (4th ed.) notes that there’s no foundation either in logic or in usage for the widespread misconception that “that” should refer only to things and not to people.

Both “who” and “that,” as relative pronouns, are appropriate for referring to people, according to A Dictionary of Contemporary American Usage, by Bergen and Cornelia Evans. The authors write:

“That has been the standard relative pronoun for about eight hundred years and can be used in speaking of persons, animals, or things. Four hundred years ago, which became popular as a substitute for the relative that and was used for persons, animals, and things. Three hundred years ago, who also became popular as a relative. It was used in speaking of persons and animals but not of things. This left English with more relative pronouns than it has any use for. … Who may in time drive out that as a relative referring to persons, but it has not yet done so.”

One more point. While a bank can’t be a “who,” it can be referred to in the possessive with “whose” (as in “The bank whose loss was greatest was Acme”). There’s an old grammar myth to the effect that you can’t use the possessive “whose” to refer to an inanimate object because things can’t own things. Not so. The possessive “whose” is not restricted to animate antecedents.


An algebraic occasion

Q: I hear people who should know better use the term “unknown quantity” when “unknown entity” is what they are really getting at. It is the quality of whatever is referred to that is unknown, not the quantity.

A: The expression “unknown quantity” originated in the terminology of algebra in the 17th century. A simple algebraic equation has an “unknown quantity” (often referred to as “x”) plus several known quantities. Solving the equation gives you the value of “x.” For example, in the equation x – 2 = 3 + 4, the unknown quantity (x) equals 9.
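For readers who want to see the algebra spelled out, here’s a minimal sketch in Python (our own illustration, not part of the original column) that checks the worked example:

```python
# A toy check of the equation above: in x - 2 = 3 + 4,
# solve for the unknown quantity x.
known_right = 3 + 4        # the known quantities on the right-hand side: 7
x = known_right + 2        # add 2 to both sides to isolate x
print(x)                   # prints 9
```

Solving simply means moving the known quantities to one side so the unknown stands alone.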

The Oxford English Dictionary gives two early citations for this usage: “The degree of Composition in the unknown Quantity of the Æquation” (1676). And “The Root of an Equation, is the Value of the unknown Quantity in the Equation” (1728). So an “unknown quantity,” in this case, is the object in a mathematical operation.

But “unknown quantity” has long been used figuratively as well. The essayist Walter Bagehot wrote in the Fortnightly Review in 1865: “The first election of Mr. Lincoln … was government by an unknown quantity.” The expression likens an unfamiliar or mysterious person or situation (say, a “wild card” or a “loose cannon”) to that part of an equation that remains hidden – the yet-to-be-solved “unknown quantity.”

All things considered, I’d say this is a fair use of metaphor.


A justifiable usage

Q: I heard someone on the radio say the goal of the MacArthur Foundation is “to help develop a more just society.” I’m of the opinion that “just” is an absolute adjective and cannot be modified. Before I offer a correction to such an esteemed organization, however, I’d like to check with the word maven herself.

A: An absolute adjective is one (like “dead” or “pregnant” or “unique”) that doesn’t have any degrees. You can’t, for example, be “more dead” or “very pregnant” or “extremely unique.”

The adjective “just,” in the context you cite, means fair, lawful, or morally right, according to modern dictionaries. I’m not absolutely certain it’s an absolute adjective, but let’s assume you’re right.

Sometimes an absolute adjective can legitimately be qualified for rhetorical effect. I think the phrase “a more just society” might be an example of this, similar to the phrase “a more perfect union” in the Preamble to the Constitution.

I believe both phrases (“more just” and “more perfect”) can be effective and can be justified when they refer to a striving further toward justice or perfection, rather than an ideal of justice or perfection. The expressions “closer to just” or “closer to perfect” may be more logical, but they have less rhetorical power.

In a 2006 speech in Philadelphia, former President Bill Clinton spoke of “how smart the Founding Fathers were” to come up with the expression “a more perfect union.”

“They knew we would never be perfect,” he said. “But they knew we could always be more perfect.”

And that just about sums it up!


What’s new, pussens?

Q: Watching a Pinter play recently, I was struck by the use of the word “pussens” in referring to a cat. I could not find it in the dictionary, but my wife saw the word in Ulysses. Did Joyce make it up?

A: “Puss” and “pussy” have been used for calling cats since the 1500s, and “pussycat” since the 1600s. The origin of “puss” isn’t clear, but there has been some speculation that it evolved in imitation of sounds one makes to call a cat.

There are similar cat terms in several other Germanic languages, including the Scandinavian languages (pus, puse), Low and Middle Low German (pus, puus, puse), and Dutch (poes, puis).

The word “pussens,” according to the Oxford English Dictionary, is an extended form of “puss” that was first recorded in 1866 in a book by the Victorian novelist Charlotte Riddell: “‘Oh! you dear, dear old pussens’ – and the child made a dive at the tabby.”

It next appeared in 1915 in one of Lucy Maud Montgomery’s books for girls: “‘Him was a nice old pussens, him was,’ vowed Anne, cuddling her pet defiantly.”

The OED’s third citation for “pussens” does indeed come from James Joyce’s Ulysses (1922): “Milk for the pussens, he said.” (Here’s another example, not in the OED, from Ulysses: “I never saw such a stupid pussens as the pussens.”)

A 2007 draft revision for the OED calls “pussens” a “nursery and colloquial” usage. The addition of “ens” to “puss” seems to be arbitrary. Another form of the word, “pussums,” first recorded in 1912, may be formed by analogy with “diddums,” which began as 19th-century baby-talk.

So Joyce didn’t make up the word. God only knows where he got it. Do you think he was a fan of Montgomery’s Anne of Green Gables books? No, I don’t think so either. Maybe he picked it up from his governess. I seem to remember that she had the same nickname as Stephen Dedalus’s Dante.


High muckamucks

Q: I listen to you on XM Public Radio and you’ve reignited my passion for language. I wonder if you can tell me the origin of the term “mucky muck” in reference to a well-heeled, fancy person?

A: The expression, which has a long and interesting history, is usually seen as “high muckamuck,” “muckamuck,” or “muckety-muck” these days. The American Heritage Dictionary of the English Language (4th ed.) has entries for the first two while Merriam-Webster’s Collegiate Dictionary (11th ed.) goes for the third. Both references say the expression refers to an important, often arrogant person.

The term “high muck-a-muck” began life as mid-19th century American slang for an important or self-important person, according to Cassell’s Dictionary of Slang. (Variations included “big muck-a-muck,” “muckety-muck,” “muckti-muck,” “muckty-muck,” “mucky-muck,” and “mucky-mucky.”)

The first citation for the expression in the Oxford English Dictionary comes from an 1856 article in the Democratic State Journal in Sacramento: “The professors – high ‘Muck-a-Mucks’ – tried fusion, and produced confusion.”

Why “muck”? Doing a little exploring, we find that the noun “muck” was first recorded in 1268, when it was a term for excrement, particularly farm manure. In the following century, it came to be used as a general term for dirt or filth.

But from the year 1325 until well into the 19th century, according to the Oxford English Dictionary, “muck” was also a word for “worldly wealth, money, especially regarded as sordid, corrupting, etc.” This sense of “muck” as filthy lucre is now obsolete.

The verb “muck,” not surprisingly, meant to hoard money or wealth back in the 1400s, another meaning that’s become obsolete. And in the 1500s, a “mucker” was a miser or a hoarder of wealth.

All this would seem to point to Middle English origins for the “muckamuck” who’s a VIP in American slang. But surprisingly, the expression apparently comes from a completely different direction – not English at all, but Native American languages of the Pacific Northwest.

As it turns out, according to the OED, “muckamuck” was originally a Chinook word for food, probably derived from a Nootka word meaning choice whalemeat.

In fact, European explorers, settlers, and others once used “muckamuck” to refer to food or provisions. (There was a verb, too: in the old Northwest, to “muckamuck” meant to eat.)

It’s this use of “muckamuck” that led to the noun for a big shot. The word in this sense was originally part of the expression “high muck-a-muck,” which the OED says was apparently adapted from Chinook: hiu (“plenty”) plus muckamuck (“food”). In other words, someone with a lot of food was a big cheese.

The “muckamuck” that refers to an important person today is a shorter version of “high muckamuck.” And while we now interpret the beginning of the phrase as the English word “high,” its original meaning was plenty different.

And that’s enough mucking around for today!


Grandfather clause

Q: I’m in the construction business and I use the term “grandfather clause” all the time. But the other day, somebody jumped down my throat for using a racist expression. Is this true?

A: The term “grandfather clause” usually refers to a provision exempting an existing activity from a new regulation affecting it. Under such a clause, for example, an existing business might be allowed to remain in an area rezoned from commercial to residential.

Although this specific usage has no racist overtones, the expression can also refer to now-defunct statutes some southern states adopted to exempt poor whites from the poll taxes and literacy requirements that kept blacks from voting.

The racist statutes, adopted in the late 19th and early 20th centuries, generally provided that anyone (or his descendants) who had been allowed to vote before African-Americans were enfranchised would be exempt from the anti-black requirements.

The first published reference in the Oxford English Dictionary for “grandfather clause” is in the Jan. 22, 1900, Congressional Record: “The grandfather clause will not avail those citizens who … are unable to pay their poll tax.”

But Dave Wilton, an independent language researcher, reported on the Linguist List forum in 2006 that he had found an even earlier one in the Aug. 3, 1899, issue of the New York Times: “It provides, too, that the descendents of any one competent to vote in 1867 may vote now regardless of existing conditions. It is known as the ‘grandfather’s clause.’”

The verb “grandfather,” meaning to exempt from new regulations, is undoubtedly derived from “grandfather clause.” At least that’s how Merriam-Webster’s Collegiate Dictionary (11th ed.) defines it: “to permit to continue under a grandfather clause.”

But the verb, which is often accompanied by the prepositions “in” or “out,” doesn’t appear to have been used in the racial sense. All five citations in a 1993 addition to the online version of the OED use the verb only in the modern way. Here’s an example from the Kentucky Revised Statutes of 1953: “All certificates or permits grandfathered shall be subject to the same limitations and restrictions.”

As for your original question, the use of “grandfather clause” in the modern sense is not racist and doesn’t have anything to do with race, but the expression has its roots in an infamous chapter in US racial history.


Moonshines

Q: Do you have any insights into the verb “moon,” which my dictionary says originated as student slang in the ’60s?

A: Hmm. Your question reminds me of a “streaking” incident when I was a college student in Iowa in the early ’70s. (In the lexicon of student pranks, the verb “streak” means to run naked in public.) But back to the subject at hand.

The verb “moon” has been around a bit longer than the 1960s. In fact, the earliest published reference in the Oxford English Dictionary comes from a 1601 translation of Pliny’s Historia Naturalis. In those days, it meant to expose to, bask in, or glow like moonlight.

By the mid-18th century, the verb also meant to act listlessly or aimlessly, according to the OED. And by the late 19th century, it took on the additional meaning of to daydream or act sentimentally.

But your question is obviously about a very different meaning of the word, a slang usage that the OED defines this way: “To expose one’s buttocks, esp. as a gesture intended to insult or shock.”

The OED’s earliest citation for this usage is from a 1968 issue of the journal Current Slang. But I prefer this 1971 citation from National Lampoon: “Have a few ‘brews,’ gross out some chicks, ‘moon’ a townie.”

Although this meaning of the verb “moon” is relatively new, the noun “moon” has been used to refer to buttocks for more than 250 years. The OED’s earliest published reference, which dates from 1756, refers to someone’s uncovered moon.

Here’s a more literary citation, from Joyce’s Ulysses (1922): “Or their skirt behind, placket unhooked. Glimpses of the moon.” And here’s one from the Beckett novel Murphy (1938): “Placing her hands upon her moons, plump and plain.”

Well, enough moonshines for today.


Ups and downs of cyberspace

Q: Why do you think computer information is “uploaded” to a network and “downloaded” from one? Why should the network be above us?

A: The verbs “upload” and “download” are defined somewhat differently in various dictionaries. But generally the term “upload” means to transfer data from a smaller computer to a larger, often remote one. And the term “download” means to transfer data from a larger, often remote computer to a smaller one.

Why “up” and “down,” rather than “left” and “right,” or “in” and “out,” or whatever?

Well, many people think of computer networks, especially the Internet, as somewhere up there in the ether – that is, in cyberspace. And networks are often arranged in a hierarchy, with the important computers on top and the subordinate ones below. But I suspect the answer to your question is more etymological than geographical, technical, or philosophical.

The verb “upload,” meaning to load up, has been with us for almost a century and a half, well before anyone ever thought of uploading computer data.

The first citation in the Oxford English Dictionary, from 1870, is in a poem by William Barnes: “Low-headed horses slowly hail / The newly-made hay, uploaded high.”

The OED’s first published reference for “upload” used with computer information is in a 1977 article in Aviation Week & Space Technology: “The joint program office … uploaded navigation data into the satellite by readjusting the phasing of the pseudo random noise signal employed by the navigation system.”

Interestingly, Aviation Week had used the term a year before in the older way in reference to loading a C-130 airplane: “At present most C-5/C-141 pallets have to be reconfigured prior to uploading the C-130, particularly the two pallet spaces in the wheel-well area.”

So it seems to me that the digital use of “upload” probably evolved from the analog term. We can see similar evolutions with “cut,” “paste,” “spam,” “send,” “attach,” “clipboard,” and so on.

The first citation in the OED for the verb “download” is in a 1980 article in Electronic Design: “These programs are downloaded into the Microsystem Analyzer for debug and execution.”

I suspect, therefore, that the term “download” is a natural outgrowth of “upload.” But if anyone out there (or up there) in the ether has a better explanation, please let me know.


A sigh is just a sigh

Q: A Hebrew speaker has asked me to describe the difference between these three words: “moan,” “groan,” “sigh.” There’s only one word for all of them in Hebrew. Can you explain?

A: English is believed by many to have the largest lexicon – that is, the most words – of any modern language. Although this depends on how one decides what a word is, English does indeed have lots of words. And that gives English speakers lots of flexibility.

When we need a noun, for example, we can often choose between three or four or more of them while the speakers of other languages may have only one to do the job.

So, as you point out (and The New Bantam-Megiddo Hebrew & English Dictionary confirms), we can utter a moan, a groan, or a sigh when we’re feeling down in the dumps, but a Hebrew speaker must be content with an anachah.

As for your question, I’d describe a moan and a groan as similar (they’re both vocal sounds), with the groan more intense. A sigh, on the other hand, is a breathing out that doesn’t involve the vocal cords.

A moan is described in the Oxford English Dictionary as “a low, inarticulate sound” expressing mental or physical suffering (or, in some cases, pleasure) that’s “less harsh and deep than a groan.”

A groan, according to the OED, is “a low vocal murmur, emitted involuntarily under pressure of pain or distress, or produced in voluntary simulation as an expression of strong disapprobation.”

A sigh, the dictionary says, is a “sudden, prolonged, deep and more or less audible respiration, following on a deep-drawn breath, and especially indicating or expressing dejection, weariness, longing, pain, or relief.”

I hope this helps. And tell your friend to lighten up!


Myriad images

Q: Some people use the word “myriad” as an adjective meaning numerous, others as a noun meaning a large number, and still others as a noun meaning 10,000. Which is correct?

A: “Myriad” is both a noun meaning a great number and an adjective meaning numerous. In fact, the noun came first (circa 1555, vs. 1735 for the adjective). Although “myriad” is derived from two Greek words – one meaning 10,000 and the other innumerable – the sense of 10,000 is archaic today.

I agree with Bryan A. Garner, who notes in Garner’s Modern American Usage that the word “is more concise as an adjective (myriad drugs) than as a noun (a myriad of drugs).” But Garner adds that the choice between noun and adjective “is a question of style, not correctness.”

In my opinion, “myriad,” in the sense of an indefinite large number (which is the modern meaning), is redundant in the plural (“myriads”). I’ve maintained this view in both the first and second editions of my book Woe Is I.

I’m well aware that the plural “myriads” is a very old usage (in fact, it reeks of antiquity). But since the noun “myriad” now means a great number, the plural construction (“myriads”) seems unnecessary. And while we’re on the subject of style, not correctness, I regard “a myriad of” as infelicitous for much the same reason.

But it would be wrong to call those usages incorrect. In fact, they’re “parallel with those of the original ancient Greek,” according to a usage note in The American Heritage Dictionary of the English Language (4th ed.).

[Update: The third edition of Woe Is I (2009, 2010) puts petty prejudice aside and bows to current usage, infelicitous though it may be. Here’s the updated entry: “MYRIAD. It originally meant ‘ten thousand,’ but myriad now is an adjective meaning ‘numerous’ (Little Chuckie has myriad freckles) or a noun meaning ‘great number’ (He has myriads [or a myriad] of them).”]


A murder of crows

Q: When my wife and I grew up in Monmouthshire in the UK back in the ’40s and ’50s, a common collective noun for crows was “murder,” as in “a murder of crows.” Please comment on the origin of this usage.

A: The term “murder” was used to describe a flock of crows as far back as the 15th century, according to the Oxford English Dictionary. (Here’s a spine-chilling version from 1475: “A morther of crowys.”)

But the OED cautions that this usage is “one of many alleged group names found in late Middle English glossarial sources.” In other words, it might have been some glossary writer’s fanciful invention, rather than a legitimate expression in common use in the 15th century – like “gaggle of geese” or “pack of dogs.”

But why a “murder”? The OED suggests this is an allusion to “the crow’s traditional association with violent death” or “its harsh and raucous cry.” If you’ve ever heard dozens of agitated crows in full cry, it really does sound as if they’re yelling bloody murder.

This usage, which apparently died out after the 1400s, was revived in the 20th century. The first modern citation in the OED comes from 1939, but the usage was undoubtedly popularized by its appearance in An Exaltation of Larks (1968), a compendium of “nouns of multitude” by James Lipton.

Some of the group nouns (they’re called terms of venery) in that book and its successors are actual archaic terms, and some (like “parliament of owls”) are more likely to be inventions. The term “parliament,” for example, has been used since 1400 to describe a congregation of things: fowls, birds, fools, bees, women, masts (on a ship), and rooks, according to the OED. But not until Lipton’s book does it appear in association with owls.

The American Heritage Dictionary of the English Language (4th ed.) includes many collective nouns for animals under its entry for “flock.” But keep in mind that some of them are found only in very old compilations and may have had little or no existence aside from these lists.

Since Lipton’s very popular book and its successors appeared, the naming of groups of things has become something of a game with his many fans. While these inventions aren’t legitimate historically, some of them are quite imaginative and even beautiful. A favorite of mine: “a chandelier of hummingbirds.”


Do you know where it’s at?

Q: Something that drives me crazy these days is hearing people say “where it’s at” rather than “where it is.” Is the former phrase now acceptable, or is it still incorrect?

A: There are two versions of the expression “where it’s at.” One can get you a failing grade in grammar (in the few places where grammar is still being taught), but the other is an acceptable idiomatic expression.

First, the no-no. Using “where it’s at” instead of “where it is” is redundant. I’m talking about sentences like “Where are my car keys at?” The “at” is superfluous.

This usage (or, rather, misusage) isn’t especially new, however. Here’s an excerpt from John R. Bartlett’s Dictionary of Americanisms (1859): “At is often used superfluously in the South and West, as in the question ‘Where is he at?’”

Now, on to the other usage. There’s nothing wrong with the colloquial expression “where it’s at,” as in “Dylan really knows where it’s at!” It may not be perfect grammar, but it’s accepted as an idiom.

The Oxford English Dictionary defines the idiomatic “where it’s at” this way: “the true or essential nature of a situation (or person); the true state of affairs; a place of central activity.”

This colloquial usage isn’t new either. The OED has published references for it going back to a 1903 article in the New York Sun, and I imagine it was used in speech quite a bit before that.

So that’s where it’s at!


No problemo?

Q: Since when did “no problem” or “not a problem” become an acceptable substitute for “you’re welcome”? Go into ANY restaurant, request that your server bring you extra sauce for your shrimp cocktail, and you’ll hear one of those problematic responses in 8 out of 10 instances. Thank you for allowing me to vent my frustration!

A: This is a tic (like “have a nice day”) that will probably go away eventually. But we’ll all be retired to our rocking chairs by then, and complaining about a whole new set of verbal tics.

The expression “no problem,” meaning OK or thanks, has been with us for half a century at least, according to the Oxford English Dictionary. The first published reference in the OED is in a 1955 collection of letters between the writers Ralph Ellison and Albert Murray.

The OED citation I like best, however, is in Martin Amis’s first novel, The Rachel Papers (1977): “Finally, every time I emptied my glass, he took it, put more whisky in it, and gave it back to me, saying ‘No problem’ again through his nose.”

I can’t find a reference in the OED for the phrase “not a problem” used this way. But the dictionary does have quite a few published references dating back to 1985 for the ersatz Spanish “no problemo” (the actual Spanish word is problema).

As you might imagine, the word “problem” itself is quite old, appearing in the first complete English translation of the Bible (the Wycliffe version of about 1382).

Some of the more recent expressions in the OED that use the word are “problem child” (1920), “problem-oriented” (1946), “problem hair” (1967), and “problem dandruff” (1997).

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Keeping the faith

Q: Could you tell us about the history of the word “faith” and its use in US politics. I have complete faith that you can deliver the truth.

A: “Faith” is a noble old word. Old because it goes back eight centuries, and noble because it means belief, confidence, reliance, trust, loyalty, allegiance, fidelity.

The word was adopted into Middle English in about 1250, according to The Barnhart Concise Dictionary of Etymology. It was originally spelled “feth” or “feith” and was borrowed, as many words were in those days, from Old French (feit or feid), which ultimately got it from the Latin fides (trust).

The Latin fides, by the way, has given us a whole family of words including “confide” and “confidence,” “defy,” “diffident” (which originally meant distrustful, not shy), “fealty,” “fidelity,” “fiduciary,” “perfidy,” and (no kidding!) “federal”!

It has also given us expressions like “bona fides” (good faith) and “auto-da-fé” (act of faith). The term “auto-da-fé” became a notorious expression during the Inquisition, when it was used to refer to the sentencing of an “infidel” and his subsequent burning at the stake.

At first “faith” referred principally to religious faith, according to the Oxford English Dictionary, but the word’s meaning eventually widened to include broader usages, as in the adjectival phrase “faith-based,” which we hear so much these days.

Draft additions to the OED in 2007 show that “faith-based” dates back, believe it or not, to 1874. The dictionary says the phrase is chiefly American and means “(a) based on religious faith; (b) designating or relating to a charitable institution, social program, etc., created or managed by a religious organization.”

The first citation for this usage in the OED is from a poem by John Henry Vosburg: “All God’s gifts are free, and hope is real, / Faith-based, to those who will not think, but feel.”

The expression may have lain dormant for another hundred years, because the next citation in the OED is from a piece of literary criticism in 1972: “Ahab equates fear with orthodox belief, the faith-based laws of which he defies.”

Next comes a 1986 mention in a newspaper: “Witness for Peace is a grassroots, non-violent, faith-based movement committed to changing U.S. policy toward Nicaragua.” And then we have a citation from Newsweek in June 1998: “Congress … has swung behind a series of policy changes … which allow federal, state and local funds to flow to faith-based anti-poverty groups.”

There’s some irony, I think, in the fact that the same Latin word, fides, gave us both “faith” and “federal.” According to a recent report by the Pew Forum on Religion and Public Life, “The United States has a long tradition of separating church from state, yet a powerful inclination to mix religion and politics.”

This inclination must be behind the rapid spread of the term “faith-based,” which has become a ubiquitous expression in politics, government, and nearly all facets of public life.

As for your faith in me, I hope it’s been justified.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Don’t know much geometry

Q: I was watching an old “Star Trek” episode in which one of the characters referred to power levels as increasing “geometrically.” (This from a supposed graduate of Starfleet Academy.) Shouldn’t the levels have increased “exponentially”?

A: These two terms can be confusing, and most dictionaries aren’t much help. Despite what you seem to think, however, “geometrically” and “exponentially” can refer to the same kind of numeric growth. Here’s what I mean.

When numbers in a series increase by addition, they form an “arithmetic” (accent on the third syllable) or “linear” sequence. Example: 2, 4, 6, 8, 10, 12. (This sequence has a constant difference of 2.)

When numbers in a series increase by multiplication, they form a “geometric” or “exponential” sequence. Example: 3, 6, 12, 24, 48. (This sequence has a constant ratio of 2.)
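The two kinds of growth above can be sketched in a few lines of Python (a minimal illustration; the function names are my own, not standard terminology):

```python
# Arithmetic (linear) sequence: each term ADDS a constant difference.
def arithmetic(start, diff, n):
    return [start + diff * i for i in range(n)]

# Geometric (exponential) sequence: each term MULTIPLIES by a constant ratio.
def geometric(start, ratio, n):
    return [start * ratio ** i for i in range(n)]

print(arithmetic(2, 2, 6))  # [2, 4, 6, 8, 10, 12] -- constant difference of 2
print(geometric(3, 2, 5))   # [3, 6, 12, 24, 48]   -- constant ratio of 2
```

The second function makes the point of the answer visible: each geometric term is start × ratio^i, with the index in the exponent, which is exactly why “geometric” growth is also called “exponential.”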

I hope this helps.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Food for thought

Q: When did the word “restaurant” come into the English language? My understanding (which I’m not sure is correct) is that it’s a very recent invention (not more than 200 years old).

A: Your instincts about the history of “restaurant” are right on the money. It’s a relatively young word, first used in English in 1827 by the novelist James Fenimore Cooper, author of such wilderness thrillers as The Deerslayer and The Last of the Mohicans.

The earliest citation in the Oxford English Dictionary, which comes from Cooper’s book The Prairie, refers to “the most renowned of the Parisian restaurans.” (Note the lack of the final “t,” which Cooper probably left off either out of ignorance or in an attempt to imitate the French pronunciation.)

It’s significant that Cooper used the word in reference to Parisian eateries, since “restaurant” is said to have originated in Paris in 1765. The French word was a noun based on the present participle of the verb restaurer (“to restore”), which came into French from the Latin verb restaurare. So when Cooper used it, the word was relatively new even in French.

The word “restaurateur,” which confuses so many people today, has had an interesting, if not confusing, history. A similar word, “restaurator,” entered English in the 1600s, but it meant “one who restores” and was first used in the medical sense.

The OED‘s first citation, from Peter Heylin’s Cosmographie (1652), refers to a “great Herbalist and Restaurator of Physick.” This word didn’t come from French, but directly from the Latin restaurare.

Surprisingly, the modern word “restaurateur” came into English before “restaurant.” The OED‘s first citation for “restaurateur” in the sense of a restaurant keeper is from a 1796 letter by Edmund Burke that mentioned the French and “all their former restaurateurs.”

The word “restaurateur,” it turns out, was also briefly used to mean the establishment itself. That usage first showed up in 1801 in a book by Catherine Wilmot about the travels of an Irish peer on the Continent, in which she referred to “Libraries, Restaurateurs, Gambling Houses, Coffee Houses.”

The erroneous spelling “restauranteur” first showed up in the 1940s and has stubbornly persisted. In fact, modern dictionaries now list it as an acceptable variant, but I’m not quite ready to throw in the towel on this one.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Grammar Moses

Q: I’m writing a story for my college newspaper about where grammar comes from and how it’s evolving. As someone who has struggled with the rules of English, I want to find out why grammar is so embedded in our language and why it’s so necessary.

A: That’s a tall order. I can’t do the subject justice in a blog entry, but I can offer a few random thoughts that you may find useful.

We all help define what “proper” English is. It’s merely a set of conventions to make communication easier and more sensible.

Vocabulary changes very quickly, especially in the age of the Internet. Style and usage, documented in dictionaries, usage manuals, and stylebooks, evolve and change slowly over the years as public tastes come and go.

The foundations of grammar, however, are much more resistant to change. These include things like subject/verb agreement, how we form plurals, the case of pronouns (object vs. subject, for example), and so on. All these are very resistant to change, and for a good reason. Grammar is a communications tool, not a fashion accessory that is one flavor today and another tomorrow.

Some have suggested that “proper” grammar is elitist. I disagree. I come from a working-class family in which no one before me had attended college. By no means can my origins be considered elitist or “dominant class.” On the contrary.

But the one thing that was available to me, as well as to the more privileged youngsters in my Midwestern community, was a public education that gave me access to a good grounding in English grammar and composition. (Of course, this was back when grammar was actually taught in school!) I went on to become a journalist, an editor at the New York Times, and finally an author of books about language.

In the world we live in, smart kids have to know several grammars, each one with complex rules. And believe me, they do. We do youngsters a grave disservice when we label particular grammars as “elitist” or “street” or “urban” or “white” or “black.” Some kids simply have to know them all. Once they do, nothing can stop them. They must be able to think, write, read, and speak in several languages.

It’s a mistake to let any child suspect that a certain level of learning is out of his reach. That’s what we do when we expect less of some children than we do of others. Human children are geniuses at soaking up language, and we should expect the max from them, from the very earliest ages.

Grammatical quibbles may seem silly, and some of them are! But what isn’t silly is that communication is important. The ability to read and to communicate one’s thoughts unambiguously in clear English is the best gift we can pass on to any child, no matter where he or she comes from. If this is elitism, so be it. It’s elitism that should belong to everyone, and that can belong to everyone.

Grammar has not been considered an important subject in American public schools for at least the past 30 years. (Ironically, children all over the world are now learning English grammar: in China, Japan, the Scandinavian countries, India, everywhere but here.) The best way a kid in the public schools these days can get a solid grounding in English grammar is to study Spanish, French, Latin, or German. He will learn that language in general has a meaningful structure, roots that are clues to vocabulary, and so on. It’s also important to read! Children who read have better vocabularies and a better grasp of sentence structure than children who don’t.

This is a bit rambling and repetitive, and if I had more time it would be more useful to you, no doubt. But I hope it helps.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Gone by the waistline

Q: I’ve noticed something peculiar about the graduate students who help out at the senior center I attend. These students (master’s candidates in social work) all use the word “particular” for “peculiar.” Perhaps they picked it up from a textbook. In another vein, I hear people who say “in the mixed of” or “in the mist of” instead of “in the midst of.” And I’ve heard someone say, in reference to a custom no longer practiced, “It’s gone by the waistline.”

A: Thanks so much for sharing these. My favorite is “It’s gone by the waistline.” Hilarious!

As for “peculiar” and “particular,” the two words have different meanings, and I can’t imagine why those graduate students confuse them these days.

That said, I should mention that “particular” has sometimes been used in parts of England to mean, yes, eccentric. A regional definition for “particular” in the Oxford English Dictionary, with citations dating back to 1712, is “so unusual as to excite attention; peculiar, odd, strange.” One of the citations is in George Eliot’s Silas Marner (1861): “A partic’lar thing happened … a very partic’lar thing.”

I suspect, however, that those grad students may not read much aside from textbooks. If there is a decline in reading for pleasure, it might account for a lot of mis-hearings (“in the mist/mixed of” and so on). But people (both readers and nonreaders) have been tripping over their tongues for ages.

The editor of a book I’m now writing said that in childhood he thought the expression “eke out a living” was actually “eek out a living.” And come to think of it, earning a living is very often an “Eek!” experience.

These mistakes actually have a name: eggcorns. If you’d like to read more about this phenomenon, check out my March 14, 2007, blog item about a goof similar to the ones you mentioned.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

I yam a sweet potato

Q: I was watching a cooking show on TV and the chef got all upset about people who call sweet potatoes “yams.” Isn’t it just a regional thing from the South? Or another sense of the word?

A: Sweet potatoes and true yams are different, unrelated vegetables. Sweet potatoes come from the tropical Americas and yams from Africa and Asia, and they’re from different plant families.

Despite all that, what we call “yams” in this country (especially in the South and the West) are really sweet potatoes. How did the names of the sweet potato and yam get confused? An article by Jonathan R. Schultheis and L. George Wilson, horticultural scientists at North Carolina State University, explains it this way:

“Several decades ago, when orange-fleshed sweet potatoes were introduced in the southern United States, producers and shippers desired to distinguish them from the more traditional, white-fleshed types. The African word nyami, referring to the starchy, edible root of the Dioscorea genus of plants, was adopted in its English form, yam. Yams in the U.S. are actually sweet potatoes with relatively moist texture and orange flesh. Although the terms are generally used interchangeably, the U.S. Department of Agriculture requires that the label ‘yam’ always be accompanied by ‘sweet potato.’ “

The American Heritage Dictionary of the English Language (4th ed.) notes that the sweet potato is “also called regionally yam.” I’ve always heard the two terms used interchangeably, except when it comes to dessert. Sweet potato pie is always sweet potato pie in my experience, never “yam pie.”

As for the chef on that cooking show, he should relax. A sweet potato by any other name would taste as sweet. Or, as Popeye would say, “I yam what I yam.”

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

An inviting abbreviation

Q: My brother-in-law says “R.S.V.P.” should read “R.s.v.p.” I have always seen it with all capital letters. What is the correct way?

A: The correct style, according to both Merriam-Webster’s Collegiate Dictionary (11th ed.) and The American Heritage Dictionary of the English Language (4th ed.), is “RSVP,” all caps and no periods. “RSVP” is one of many abbreviations that have lost their dots in recent years.

The term, an abbreviation for the French phrase Répondez, s’il vous plaît (Please reply), has been used by English writers since the mid-19th century.

The earliest published reference in the Oxford English Dictionary comes from The Ingoldsby Legends (1845), a collection of supernatural stories by the poet and novelist Richard Harris Barham: “Quadrilles in the afternoon, R.S.V.P.”

The OED’s first citation for the term as a verb appears in a 1969 detective novel by Raymond Vernon Beste in which the sleuth “R.S.V.Ped” to a Spanish Duchess.

If you’d like to read more about “RSVP,” check out my May 7, 2007, blog item.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Is he a Yankees fan?

Q: When I go to see my beloved team playing in the Bronx, I go to Yankee Stadium. I am, therefore, a Yankee fan. It hurts my ears to hear somebody say, “I’m going to a Yankees game.” Why that superfluous “s”? I am on a mission to correct all who use the plural for a sports team.

A: I have to disagree with you. I think you’re a Yankees fan and you go to Yankees games. But common practice is all over the place on this. When I googled, I got twice as many hits for “Yankees game” as for “Yankee game,” but half as many for “Yankees fan” as for “Yankee fan.”

Here are my thoughts on whether a noun should be singular or plural when treated like an adjective. When a noun (say, “night” or “spring”) acts like an adjective, it’s usually singular, as in “night game” or “spring training.” This is usually true even when a singular noun looks plural (as in “news reporter” or “measles vaccine”).

But “usually” doesn’t mean all the time. Some singular nouns (such as “arm” or “farmer”) become plural in certain cases when treated like adjectives (“arms control” or “farmers market,” for example). And some plural collective nouns (like team names) can be plural in adjectival phrases if considered as a whole unit.

As for the Yankees, it depends on whether you view them as a whole team or as members of the team. For me, the word “Yankees” represents the team, so one would be a team fan (that is, “a Yankees fan”) and go to a team game (“a Yankees game”).

To use another sports example, “Red Sox” is inherently plural as a team name. A Boston fan wouldn’t say he was going to a “Red Sock game.” If the team is plural (Yankees, Patriots, Giants, whatever) I would think the plural form would be appropriate in most adjectival usages.

This is my opinion, and I may bring the wrath of “Yankee fans” down on my head. I can’t say you’re wrong, just that I disagree. But you may be interested to know that expressions like “Yankees games,” “Yankees uniforms,” “Yankees news,” and the like are listed that way on the official website of the New York Yankees.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Is synergy a sin?

Q: Here’s my pet peeve: I’ve had my fill of the word “synergy.” It seemed to come out of nowhere 15 or 20 years ago, and now it’s everywhere. Enough already! Let’s show this newbie the door.

A: You’re right that “synergy” is seen a lot these days, especially in the corporate world, but it’s definitely not a newbie.

The Oxford English Dictionary has published references dating back to a 1660 ecclesiastical history by Peter Heylin: “They speak only of such a Synergie, or cooperation, as makes men differ from a sensless stock, or liveless statua, in reference to the great work of his own conversion.”

At first the word referred simply to cooperation or working together. By the mid-19th century, scientists were using it to refer to the combined action of bodily organs, mental faculties, or remedies. Here’s an 1867 citation from the writings of the philosopher George Henry Lewes: “It is to be observed that the phrenologists have been fully alive to the synergy of organs in producing mental phenomena.”

The most common meaning of “synergy” now (a combined effort that’s greater than its parts) is relatively new, dating from the 1950s, according to the OED. Here’s a definition of “synergy” from a 1966 book on corporate strategy by Harry Igor Ansoff: “It is frequently described as the ‘2 + 2 = 5’ effect to denote the fact that the firm seeks a product-market posture with a combined performance that is greater than the sum of its parts.”

A related term, “synergism,” which dates from 1764, refers to a religious doctrine that human will cooperates with divine grace in the work of regeneration.

As for your pet peeve, “synergy” may sometimes be overused, but it serves a purpose and I don’t know of any other word that means quite the same thing. On balance, I’d give it a thumbs-up.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

User friendly

Q: Let’s dispense with “utilize.” I can’t think of a single instance when “use” would not be better (and FAR less pretentious). A professor once explained to me that “use” refers to using something in a way that’s intended while “utilize” refers to using something in a way NOT intended. For instance, he’d use a toothbrush to brush his teeth, but he’d utilize one to clean grout. Is this true? No dictionary I’ve consulted makes such a distinction, and it positively GRATES on me to hear people talk about “utilizing” ANYTHING!

A: You’re right about the inflated verb “utilize.” It usually means the same thing as “use”: to make use of or employ. People who utilize “utilize” when they could utilize “use” want to sound more authoritative than they are.

To be fair to your old professor, however, some dictionaries do make a distinction between “use” and “utilize,” though not the one he made. Although both words can mean to make use of something, “utilize” can also mean to use something in a practical or profitable way, according to The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.).

The Oxford English Dictionary has published references dating back to 1807 for “utilize” as a verb meaning to make or render useful. The far-older verb “use” has meant to make use of or to put into practice from at least the early 14th century.

So, yes, there is a slight difference in meaning between the two words, but I don’t think most people who use “utilize” are aware of it. They just want a longer word.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

A goldenrod by any other name

Q: A group of rather literary friends recently corrected me for using the word “goldenrods.” They said the plural of the wildflower is the same as the singular. Does “goldenrod” become plural by adding an “s,” like “flower,” or does it stay the same, like “deer”?

A: The dictionaries I’ve consulted don’t indicate that the singular and plural of this word are the same (as they do for invariable nouns like “deer,” “moose,” “sheep,” “swine,” and so on).

So I would conclude that “goldenrod” forms its plural the normal way, with the addition of an “s,” as in “The bouquet included three irises, two lilies, and four goldenrods.”

As an amateur gardener, however, I do know that people with green thumbs often use the singular in referring to plants: “We planted five dozen iris and two dozen crocus last fall” or “I like the way you’ve grouped your three daphne” (instead of “irises,” “crocuses,” and “daphnes”).

And plants are often spoken of in the singular, as in “Slender fragrant goldenrod flowers in the summer” (instead of “goldenrods flower”) or “That field of lance-leaved goldenrod is striking.”

Despite these common conventions, “goldenrod” isn’t treated as an invariable noun by dictionaries, which means that technically it has a separate plural.

By the way, the term “goldenrod,” which refers to a plant of the genus Solidago, first appeared in English in a 1568 book by the British botanist William Turner.

All this reminds me of the headline on a “Cuttings” column in the New York Times some years ago: “Just Call Them Glads and Move On From There.”

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Discomfit zone

Q: Why do so many people misuse the word “discomfited”? It doesn’t mean the same as “discomforted,” which ain’t even a word.

A: The verbs “discomfit” and “discomfort” (as well as their past participles “discomfited” and “discomforted”) are indeed real words. They once meant different things, but their meanings (long confused) have now pretty much merged.

“Discomfit,” which dates back to 1225, originally meant to defeat or overthrow. It comes from an Old French word, desconfire, meaning to defeat.

“Discomfort,” which dates from 1330, originally meant to discourage or dishearten, but it eventually weakened and now means to disconcert or make uneasy. It comes from an Old French word, desconforter (to make uncomfortable).

Although many sticklers insist on the original meaning of “discomfit,” the word has been used for hundreds of years to mean disconcert, as in this quote from the Dickens novel Dombey and Son (1848): “Dombey was quite discomfited by the question.”

In fact, this “new” usage is so prevalent that it has become the principal meaning of “discomfited,” according to The American Heritage Dictionary of the English Language (4th ed.), while the old sense of defeated or overthrown has become a secondary meaning.

A usage note in American Heritage says the use of “discomfit” to mean disconcert probably arose in part because of confusion with “discomfort,” but this meaning “is now the most common use of the verb in all varieties of writing and should be considered entirely standard.”

Thus does language change!

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Seeing is believing

Q: Why do radio hosts, when finishing a show, say, “We’ll see you tomorrow,” when it’s physically impossible? A better signoff: “Until tomorrow (or next time), goodbye.”

A: You’re right that “We’ll see you tomorrow” can’t be literally correct as a radio signoff. But I think many people consider “we’ll see you” an idiomatic expression meaning something like “we’ll be back” or “we’ll be in touch” or “we’ll be with you.”

In case you’re interested, there are many long-established non-literal uses of “see.” I’ll mention just a few.

We often say “I see” when we mean “I understand”; this use of “see” in the sense of perceiving intellectually dates back to the year 1200, according to the Oxford English Dictionary.

To “see” that something is done means to make sure it’s done. This sense of the word dates back to 1300, according to the OED.

To “see” someone’s bet (say, in a game of cards) is to match it. This usage dates from 1599.

“Let me see” or “let us see” (in an attempt to recall something to memory) is a usage dating from 1520.

The OED says the 19th-century expression “see you” (and variations like “see you soon” and “see you around”) is a “colloquial formula of farewell,” and is often used “without reference to an anticipated meeting,” much in the way the French say au revoir and the Germans auf Wiedersehen.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

The whiching hour

Q: I thought I had “that” vs. “which” nailed down — “that” for restrictive clauses and “which” for nonrestrictive. But I’ve seen the New Yorker, which I revere as a bastion of great writing, use “which” without a comma to introduce a restrictive clause. Is this usage acceptable where it would allow a writer to avoid using one “that” close to another? I’m thinking of a sentence like this: “I told you that the books which I ordered are not available.”

A: You do have the rule nailed down. Rather than stick in an inappropriate “which” to avoid using “that” twice, I’d write, “I told you that the books I ordered are not available.”

For readers of The Grammarphobia Blog who may not be familiar with restrictive and nonrestrictive clauses, let me explain the rule about “that” vs. “which” in nontechnical language.

When you’ve got a clause (a group of words with a subject and a verb) that you could start with “that” or “which,” and you can’t decide between them, here’s a hint: If you can drop the information and not lose the point of the sentence, use “which.” If you can’t drop it, use “that.”

The examples I use in my grammar book Woe Is I are: (1) “Buster’s bulldog, which had one white ear, won best in show.” (2) “The dog that won best in show was Buster’s bulldog.”

In the first example, the information in the “which” clause is not essential. In the second example, the clause starting with “that” is essential; it’s the whole point of the sentence. Also, as you note, “which” clauses are set off with commas, and “that” clauses aren’t.

As for the New Yorker, it doesn’t religiously observe the distinction (maintained at least in American usage) between “that” and “which.” Neither do British publications, a fact you may already have noticed.

As it happens, William Safire, in two “On Language” columns in the New York Times Magazine in 1989, commented on this practice by the New Yorker.

In the first column, Safire wrote that it was easy to find “outright mistakes” in the New Yorker, and he cited this example: “It is only those drugs which are illegal that inspire the present public furor….”

He then explained the rule about “which” and “that” clauses, and added this quote from Strunk and White: “It would be a convenience to all if these two pronouns were used with precision.”

A month later, Safire stuck to his guns despite a challenge from the scholar and critic Jacques Barzun. “Your general principle is right,” Barzun wrote, “ ‘which’ is nondefining and the other defines. But Fowler, who invented the rule, makes clear that following it should not take precedence over other considerations and he mentions euphony.”

I’m with Safire on this one. I think the distinction between “that” and “which” is often an aid to clarity, and clarity in expression is nothing to sneeze at. Using “which” in restrictive clauses can lead to misunderstandings. For example:


(1) “Fran collected books that were scarce in wartime.” (No way to misread this.)

(2) “Fran collected books which were scarce in wartime.” (Did she collect only scarce books, or were all books scarce in wartime? The meaning isn’t clear.)

By the way, Barzun exaggerated when he said Henry Fowler invented the rule for “that” and “which” clauses. Fowler didn’t, though he certainly popularized it.

Buy Pat’s books at a local store or Amazon.com.

The Grammarphobia Blog

Don’t cry for me, Argentina

Q: I’ve heard someone, or something, from Argentina called an ar-gen-TINE, an argen-TEEN, and an argen-TIN-i-an. Which is correct?

A: The Random House Webster’s College Dictionary prefers the following English nouns (in order) for a native of Argentina:

(1) “Argentine” (pronounced ar-gen-TEEN).
(2) “Argentine” (ar-gen-TINE).
(3) “Argentinean” (ar-gen-TIN-i-an).

The adjectives referring to someone or something from Argentina are the same.

My husband, who worked as an American journalist in Buenos Aires in the 1960s, uses only the first one (pronounced ar-gen-TEEN) for both the noun and adjective.

The now old-fashioned way of referring to the country of Argentina (“the Argentine”) is usually pronounced the ar-gen-TINE.

Buy Pat’s books at a local store or Amazon.com.