The Grammarphobia Blog

Historic vs. historical: A history

Q: A headline on the CBC website: “23 historical black Canadians you should know.” Wouldn’t “historic” be more accurate?

A: We think that headline writer could justifiably have used either “historical” or “historic.”

The article on the CBCnews website referred to “23 black Canadians who made major contributions to Canada’s culture and legacy.”

As we’ve written before on our blog, “historical” is generally used to mean having to do with history or the past. And “historic” is generally used to mean important in history.

These black Canadians were all real people who lived in the past, so they can be called “historical” figures. They were also important in the past, so they were “historic” figures as well.

But even back in 2006, when we wrote that post, the two terms were often used interchangeably.

The then-current fourth edition of The American Heritage Dictionary of the English Language accepted “historical” as a secondary meaning of “historic,” and the new fifth edition does too.

Merriam-Webster’s Collegiate Dictionary (11th ed.) agrees. Both dictionaries say that either “historic” or “historical” can be used to mean famous or important in history. So the headline writer could have meant “historical” in this sense.

In fact, the difference between these words isn’t nearly as pronounced as some people think. Here’s what Merriam-Webster’s Dictionary of English Usage has to say on the subject:

“Historic and historical are simply variants. Over the course of two or three hundred years of use, they have tended to diverge somewhat.”

Evidence in the Oxford English Dictionary supports this view.

The first on the scene was “historical.” In the mid-16th century, the OED says, it meant “belonging to, constituting, or of the nature of history; in accordance with history.”

The adjective “historic” showed up in writing a little later, in the late 16th century, when its meaning was much the same as “historical.”

The OED says it originally meant “relating to history; concerned with past events.” So the two words were more or less synonymous.

Then in the 18th century, both words took on an additional meaning—important or famous in history.

And ever since, according to OED citations, writers have used both “historic” and “historical” in two senses: relating to history and famous in history.

But, as the Merriam-Webster’s usage guide points out, preferences have emerged and the two words have “tended to diverge.” So how are these words used today?

“Historical is the usual choice for the broad and general uses relating to history,” the usage guide says. “Historic is most commonly used for something famous or important in history.”

Merriam-Webster’s conclusion: “We would suggest that you go along with the general trend.”

Although a case can be made for using the two words interchangeably, we use them the way M-W suggests.

Check out our books about the English language


Making sense

Q: Your posting about “make sure” has raised a question in my mind. We seem to use “make” differently. We can say, “I made sure my students thought about that.” But we have to say, “I made my students think about that.” Why is it that we can use “thought” in the first example, but we have to use “think” in the second?

A: The verb “make” and the verbal phrase “make sure” illustrate two different grammatical constructions.

In the sense you’re asking about, “make sure” can be followed by verbs in any form, but “make” alone is always followed by a verb in the infinitive.

This explains why your second sentence has the past tense of “make” followed by an object (“my students”) plus an infinitive (“think”).

The Oxford English Dictionary says the verb “make” here means “to cause (a person or thing) to do something.”

This use of “make,” as the OED notes, is seen in such familiar constructions as “don’t make me laugh,” “to make (one’s) mouth water,” and “to make (one) think.”

When “make” is used this way, the second verb remains in the infinitive, even when “make” shifts from tense to tense: “we made them think” … “we will make them think” … “we would have made them think,” and so on.

That’s why you never see a construction like “we made them thought.” And that’s why this use of “make” is grammatically different from “make sure,” which doesn’t lock in the form or tense of the verb (or verbs) that follow.

In your first sentence, the verbal phrase “made sure” is followed by a clause: “my students thought about that.”

As the OED says, this sense of “make sure” means “to make something certain as a fact … to preclude risk of error; to ascertain,” and it can be followed either by a clause or by “of.”

But unlike the use of “make” we described above, “make sure” can be followed by verbs in any tense. The form isn’t set in stone.

There are many possible constructions: “I make sure my students think about that” … “I’ll make sure my students will think about that”… “I made sure my students would think about that” … and so on.

The verb “make,” meaning to construct something, first showed up in early Old English in the writings of Alfred the Great, King of the West Saxons and Anglo-Saxons, according to the Oxford English Dictionary.

Although the OED has several Old English citations, John Ayto’s Dictionary of Word Origins says “make” wasn’t a particularly common verb in Anglo-Saxon times.

Ayto writes that gewyrcan, the Old English ancestor of the modern word “work,” was “the most usual way of expressing the notion ‘make.’ ”

It wasn’t until the Middle English period (from the late 12th to the late 15th centuries) that the use of “make” became common, according to the Dictionary of Word Origins.


Contrite and sarcastic?

Q: I recently came across a complaint on the Web about a “contrite and sarcastic” worker at a pizza place. I don’t see how someone can be both contrite and sarcastic. Have you noticed this usage? Can you shed any light on it?

A: It’s difficult to imagine someone who’s both “contrite” and “sarcastic,” at least at the same time, since those words describe conflicting attitudes.

Someone who’s “contrite” is sorry, and feels regret or sadness about an offense. But someone who’s “sarcastic” is expressing contempt or ridicule.

A Google search did turn up a handful of instances in which writers mistakenly combined “contrite” and “sarcastic.” All the examples seemed to come from blogs, discussion groups, or social networks.

For example, a contributor to a forum about video-game websites was described as “abusive, contrite, sarcastic and just plain mean.”

A contributor to a scuba-diving discussion group wrote, “So I hope you do not take my responses as contrite, sarcastic, flip or disrespectful.”

And a political tweet accused Hillary Clinton of offering “a contrite, sarcastic response” when asked why she didn’t make the rounds more on the Sunday talk shows.

Huh? The use of “sarcastic” is understandable, since these remarks were generally negative. But “contrite” is definitely out of place.

Perhaps these writers are confusing “contrite” with some other word, but what could it be? “Contemptuous” … “contentious” … “contrary”?

A more likely explanation is that they simply don’t know what “contrite” means, and are using it to mean something like rude or dismissive or blunt. Everyone who writes for public consumption should have access to a standard dictionary!

“Contrite,” as we indicated above, is far from rude. The Oxford English Dictionary says it’s been used by writers since the 14th century, when it had a religious flavor.

The original meaning was “crushed or broken in spirit by a sense of sin, and so brought to complete penitence.”

This was a figurative adaptation of the word’s Latin ancestor. As the OED explains, the Latin adjective contritus means bruised or crushed, and comes from the verb conterere (to rub or grind together).

In English, the word still means what it meant almost 700 years ago, though it has a secular sense as well. Here are some examples from famous sources, courtesy of the OED:

“Create and make in vs newe and contrite heartes.” (From The Book of Common Prayer, 1549.)

“Her contrite sighes vnto the clouds bequeathed / Her winged sprite.” (From Shakespeare’s poem The Rape of Lucrece, 1594.)

“With our sighs … sent from hearts contrite, in sign / Of sorrow unfeign’d, and humiliation meek.” (From Milton’s Paradise Lost, 1667.)


Are you a Japanophone? はい

Q: Is there an English word to describe a Japanese-speaker? Perhaps a something-phone, along the lines of “Anglophone” or “Francophone”?

A: We’ve seen the words “Japanophone” and “Nippophone” (either uppercase or lowercase) used on the Internet to describe a speaker of Japanese.

The preferred term, by a wide margin, appears to be “Japanophone.” The also-ran, “Nippophone,” shows up on French websites more than on English-language sites.

You won’t find either term in standard English dictionaries, however. We checked a half-dozen dictionaries in the US and the UK.

Some people obviously feel a need for such a word, so they’re creating one—or in this case two.

There are certainly precedents, as you’ve pointed out, for using the word element “-phone,” from the Greek term for “sound,” to create a noun referring to the speaker of a specific language.

The most familiar examples are “Anglophone” and “Francophone,” for speakers of English and French. (American dictionaries tend to capitalize the two words while British dictionaries tend to lowercase them.)

In the neologisms we’ve seen online, people are adding “-phone” to versions of “Japan” or “Nippon” to mean a speaker of Japanese.

Interestingly, both the English and Japanese names for the country are ultimately derived from an old Chinese phrase meaning “origin of the sun.” Why? Because the sun rose to the east of China, where Japan was located.

By the way, the terms “Anglophone” and “Francophone” are relatively new, first recorded in English in the early 20th century, according to their entries in the Oxford English Dictionary.

The earliest examples of each are from the same book, The Races of Man (1900), by the anthropologist Joseph Deniker: “In Canada two-thirds of the white population are Anglophones, and the rest Francophones.”

Deniker’s book appeared in French and in an English translation the same year. The French nouns anglophone and francophone had appeared earlier, in 1894, and the French adjective francophone in 1880.

Here’s a more recent example using both words, from the Canadian magazine Saturday Night (1967):

“It is because our fizzy Canadian cocktail has intoxicating qualities, because a dazzling future lies in wait for francophones and anglophones … that we should hold together, along with the valuable New Canadians.”

Other “-phone” words used in this sense are much less common, and few are recognized in dictionaries. The OED does have an entry for the noun “Russophone” (which it capitalizes), from 1899, for a speaker of Russian.

Oxford also has an entry for an adjective, “lusophone,” meaning Portuguese-speaking, but not for a noun. The usage dates from 1974. (The “luso-” part is from “Lusitania,” an old Latin name for Portugal.)

As for other such words, we’ve found examples of “hispanophone” and “italophone” in literary usage, but generally not as nouns. They’re usually adjectives referring to Spanish and Italian literature (as in “hispanophone proverbs,” “italophone writings,” etc.).

We’ve also found many examples—from books, newspapers, and the Internet—for the noun “slavophone” used in reference to Greeks or ethnic Greeks who speak a Slavic language. But the usage is controversial and caught up in Balkan politics.

It’s easy to invent these words, but some of them are bound to remain oddities, like “netherlandophone.” The simple phrase “Dutch speaker” does the job very well.

As for that Japanese word in our headline, it’s pronounced hai and it means yes.


The earliest Johnny-come-lately

Q: Do you guys have any idea who the “Johnny” is in “Johnny-come-lately”?

A: The phrase “Johnny-come-lately” originated as a 19th-century American expression for a newcomer or a novice. It’s now also used for an upstart, a late adherent to a trend or cause, and someone who’s late for an event.

There’s no particular significance in the use of the name “Johnny” here.

Since the 17th century, according to the Oxford English Dictionary, this familiar diminutive of “John” has been used “humorously or contemptuously” to mean “a fellow, chap.”

For example, the OED cites Allan Ramsay’s poem And I’ll Awa’ to Bonny Tweedside (1724), in which Edinburgh is described as a place “Where she that’s bonny / May catch a Johny.”

Over the years, both in the US and in the UK, people have used the name “Johnny” as a generic term for a guy. (We wrote blog postings in 2007 and 2009 about a similar usage, “Tom, Dick, and Harry.”)

This generic use of “Johnny” is found in many familiar phrases whose origins are explained in the OED.

For example, “Johnny Reb,” a Northern term for a Confederate soldier, emerged during the American Civil War.

And “Johnny-on-the-spot,” for someone who’s always ready and available when needed, was first recorded in an American novel, Artie (1896), by George Ade.

In Britain, “Johnny raw” and “Johnny Newcome” were early 19th-century phrases for a rookie, a newcomer, or a raw recruit. Those were at least the spiritual forerunners of the American phrase “Johnny-come-lately.”

OED citations indicate that “Johnny-come-lately” first appeared in The Adventures of Harry Franco (1839), a humorous novel by Charles Frederick Briggs, a journalist and former sailor.

Here’s the quotation from Briggs’s novel: “ ‘But it’s Johnny Comelately, aint it, you?’ said a young mizzen topman.”

(Briggs’s claim to fame is that he gave Edgar Allan Poe a job on his short-lived magazine, the Broadway Journal, in 1845.)

The phrase may have originated in America but it didn’t stay there.

One OED citation is from the Christchurch Press in New Zealand, which offered this definition for its readers in 1933: “Johnny-come-lately, nickname for a cowboy or any newly-joined hand or recent immigrant.”

Finally, this 1972 example is from the former BBC publication The Listener, in a reference to the state of Utah: “Here man himself is a Johnny-come-lately.”


Downton’s steep learning curve

Q: If I type “anachronisms” in a Google search box, Autocomplete suggests adding the words “downton abbey.” So this is not an original topic, but I spotted two possible slip-ups in a recent episode: “learning curve” and “a lot on my plate.”

A: Yes, anachronism-spotting has become something of a sport to watchers of Downton Abbey, and now we can chalk up a couple more.

The period TV drama is set in the years between 1912 and 1921, and it’s highly unlikely people who lived then would have known either “learning curve” or “a lot on my plate.”

Let’s look at “learning curve” first. It’s barely possible that a layman in 1920s England would have known the term, but it’s quite a stretch.

The phrase was in use at the time, in scholarly papers by research psychologists who used it in its literal, scientific sense—a curved line on a graph, representing the rate at which a certain skill is learned.

It’s even less likely that the expanded form of the phrase heard on the show—“steep learning curve”—would have been used then.

The linguist Ben Zimmer has also been following Downton anachronisms, and he had this to say in a recent Word Routes column on his Visual Thesaurus website:

“Matthew Crawley, the presumptive heir of Downton Abbey and now the co-owner of the estate, says, ‘I’ve been on a steep learning curve since arriving at Downton.’ By this he means that he’s had a difficult time learning the ways of Downton. Unfortunately, people didn’t start talking that way until the 1970s.”

Although the term “learning curve” was around in the early 1900s, Zimmer notes, “it didn’t become a common phrase until the ’70s, and it was then that the word steep began to be used to modify it in a rather peculiar way.” A “steep learning curve,” he says, came to mean “an arduous climb.”

He says “learning curve” was apparently first recorded in 1903 in a paper published in the American Journal of Psychology. This is also the earliest usage we’ve been able to find.

The author of the 1903 paper, Edgar James Swift, wrote: “Bryan and Harter (6) found in their study of the acquisition of the telegraphic language a learning curve which had the rapid rise at the beginning followed by a period of retardation, and was thus convex to the vertical axis.”

We checked out the earlier study that Swift refers to, but it didn’t actually use the term “learning curve,” so his usage does appear to be the first.

That earlier study, by William Lowe Bryan and Noble Harter of Indiana University, was published in the Psychological Review in 1897.

The article, “Studies in the Physiology and Psychology of the Telegraphic Language,” described experiments to determine the rates at which telegraph operators learned to send and to receive messages in Morse code.

Bryan and Harter used lines plotted on graphs to illustrate the rates at which the skills were learned. They described the lines with phrases like “sending curve,” “receiving curve,” and “curve of improvement,” but they never used “learning curve.”

Many people credit the concept of a learning curve—if not the phrase itself—to studies in memory published in 1885 by the German psychologist Hermann Ebbinghaus.

But his work doesn’t include the words Lernkurve or Erfahrungskurve, either of which might be translated into English as “learning curve.”

The Oxford English Dictionary hasn’t yet updated its entry for “learning curve” to reflect the earlier usages now available in digitized data banks.

The OED’s earliest example is from a paper published in 1922, and it defines only the literal meaning of the term: “a graph showing progress in learning.”

All of Oxford’s citations for “learning curve” use the phrase in this scientific sense, and the dictionary doesn’t mention any figurative uses.

The other expression you’re asking about, “a lot on my plate,” is another likely anachronism in Downton Abbey. The OED’s earliest citation is from 1928, and we haven’t found an earlier one.

The OED labels the phrase and its variants as colloquialisms meaning “to have a lot of things occupying one’s time or energy.”

Oxford’s earliest example is from the July 4, 1928, issue of a British newspaper, the Daily Express: “I cannot say. I have a lot on my plate. … Mr. Justice Horridge: A lot on your plate! What do you mean? Elton Pace: A lot of worry, my lord.”

This more contemporary example is from Dermot Bolger’s novel Ladies’ Night at Finbar’s Hotel (1999): “I have enough on my plate without worrying about you.”

Want to hear about more Downton Abbey anachronisms? Ben Zimmer has spoken on NPR’s “Morning Edition” about some others, like “I’m just sayin’ ” and “When push comes to shove.”


Whether … or not?

Q: When you use “whether,” do you need “or not”? I find “whether” being used alone for “if,” and I wonder what is correct.

A: In the phrase “whether or not,” the “or not” is often optional. When the choice is up to you, you can generally use either “whether” or “if.”

But you definitely need “or not” when you mean “regardless of whether,” as in, “I’m out of here whether you like it or not!”

Pat discusses this in her grammar and usage book Woe Is I. Here’s the passage:

“When you’re talking about a choice between alternatives, use whether: Richie didn’t know whether he should wear the blue suit or the green one. The giveaway is the presence of or between the alternatives. But when there’s a whether or not choice (Richie wondered whether or not he should wear his green checked suit), you can usually drop the or not and use either whether or if: Richie wondered if [or whether] he should wear his green checked suit. You’ll need or not, however, if your meaning is ‘regardless of whether’: Richie wanted to wear the green one, whether it had a gravy stain or not. (Or, if you prefer, whether or not it had a gravy stain.)”

Merriam-Webster’s Dictionary of English Usage has some very good advice: “Of course, the simplest way to determine whether the or not can be omitted is to see if the sentence still makes sense without it.”

In case you’re interested, our word “whether” developed from the Old English term hwæther, meaning which of the two. (We’ve used “th” here to represent the letter thorn.)

The Old English term was derived from two prehistoric Germanic roots: khwa- or khwe- (source of such English words as “what” and “who”) and -theraz (a source of “other”), according to John Ayto’s Dictionary of Word Origins.


Noun entities

Q: One of my pet peeves is the use of a verb in place of a noun, a practice I often see in the NY Times. Examples: a letter to the editor refers to somebody’s “physical or mental dissolve” … a book review speaks of “a good read” … a secret revelation in a movie is called “the reveal.” You’ll probably tell me that this use of “reveal” dates back to the Elizabethan era. If so, I’ll take your word for it, but it still sounds illiterate to me.

A: The use of “dissolve” to mean a mental or physical decline is a new one on us. But as you probably know, the use of “dissolve” as a noun is common in cinematography. In show biz, a “dissolve” is a sequence in which one scene fades out as the next fades in.

The noun “dissolve” in the motion picture sense, first recorded in 1918, was adapted from a similar use of the verb in 1912. The original verb, from the Latin dissolvere, first appeared in the 14th century.

We’ve found only isolated examples of the noun “dissolve” used as in that letter to the editor. But it’s not an inappropriate metaphor—aging as the Great Dissolve. (It seems better than “dissolution,” which implies a moral disintegration as well.)

At any rate, this practice of adapting verbs for use as nouns is nothing new. For example, we wrote blog entries last year on the nouns “remit” and “hit,” both derived from the earlier verbs.

We’ve also written a more general blog entry on the process, known as “conversion,” whereby one part of speech begins functioning as another.

In that post, we gave several examples of nouns adapted from earlier verbs, as in “a winning run,” “a long walk,” “a constant worry,” “take a call,” “a vicious attack.”

We might have added “a good read,” a usage you ask about. This is an example of a noun that was adapted from the verb a very long time ago, subsequently fell out of use, and finally was reinvented centuries later.

Let’s start with the verb. As we’ve said before on our blog, to “read” once meant more than to peruse written words.

According to the Oxford English Dictionary and other sources, the Old English verb rædan originally meant many other things besides “to scan or study writing.”

It also meant to consider, interpret, discern, guess, discover, expound the meaning (of a riddle, say, or an omen), and so on.

The noun “read,” derived from the verb, also dates back to Old English (ræde). In its earliest uses, the OED says, it meant “an act of reading aloud” or “a lesson,” a usage that survived into the 1300s and then became obsolete.

Half a millennium later, in the 19th century, another noun “read” came into being: “an act of reading or perusing written matter; a spell of reading,” in the words of the OED.

Oxford’s earliest example is from a novel by William Makepeace Thackeray, The History of Samuel Titmarsh and the Great Hoggarty Diamond (1838): “When I arrived and took … my first read of the newspaper.”

Charles Darwin used the same noun in a letter written in 1862: “I have just finished, after several reads, your paper.”

Such usages led to a similar sense of the word, described in the OED as “something for reading (usually with modifying word, as good, bad, etc., indicating its value as a source of entertainment or information).”

The word was used this way in a British literary magazine, John o’ London’s Weekly, in 1961: “My Friend Sandy can be hugely recommended … as a pleasantly light, bright sophisticated read.”

Another example the OED cites is also from the British press. It appeared in The Independent on Sunday in 2002: “This is an authentic, funny, edgy read.”

So “read” was used in that Times book review in a familiar and well-established sense. It’s recognized in standard dictionaries as well as the OED.

You also mention the noun “reveal,” which does indeed date from the Elizabethan era. When first recorded in the late 1500s, it meant “an act of revealing something; a revelation; a disclosure; an unveiling,” the OED says.

This meaning is still seen today. For example, the OED cites this passage from an essay William Goldman wrote in 1997 about his screenplay for the movie Maverick:

“This is how the concluding moments read in rehearsal, starting with the reveal of the spade ace as the next card.” (We’ve expanded the citation to provide more context.)

But the sense of “reveal” that you’re talking about is somewhat different. The OED describes this noun as a term in broadcasting and advertising to mean “a final revelation of something previously kept from an audience, a participant in a programme, etc.”

The earliest citation in the OED is from Allen Funt’s book Eavesdropper at Large (1952): “This is the process we call ‘the reveal’—the point, toward the end of each candid portrait, where we reveal to the subject what we’ve been doing.”

Funt was the creator of TV’s Candid Camera, which he originated on radio in the 1940s as Candid Microphone. This 1975 OED citation, from the New York Times, is another reference to him:

“But now the final coup, Allen’s trademark—the ‘reveal.’ ‘Madame, did you know that at this moment you are on nationwide TV?’ ”

We’ll give one more example, from Gwendolyn A. Foster’s Class-passing: Social Mobility in Film and Popular Culture (2005):

“After a barrage of commercials, we are presented with what the show describes as ‘the reveal,’ the first view of her face.”


Hear Pat live today on WNYC

She’ll be on the Leonard Lopate Show around 1:20 PM Eastern time to discuss the English language and take questions from callers. Today’s topic: inspired by a puff of white smoke from the Vatican, Pat will discuss communicating through smoke signals. If you miss the program, you can listen to it on Pat’s WNYC page.


NOO-kya-lur reactions

Q: The more I learn about English, the more I find myself wondering whether something is an error or just an acceptable variant. Now for my question: Is it acceptable to pronounce “nuclear” as NOO-kya-lur instead of NOO-klee-ur?

A: We discussed this subject several years ago on our blog when a reader complained about President George W. Bush’s pronunciation of the word.

As we wrote back in 2008, Bush was far from the only US President to take liberties with “nuclear.” At least three others—Eisenhower, Carter, and Clinton—did so too.

Although the NOO-kya-lur pronunciation is very widespread, we said in that posting, it’s frowned on by many.

We wrote then that both The American Heritage Dictionary of the English Language (4th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.) noted the objections.

We’ve now checked a newer edition of American Heritage and a newer printing of Merriam-Webster’s, but not much has changed.

A usage note in the new fifth edition of American Heritage says the NOO-kya-lur pronunciation “is generally considered incorrect” and is “an example of how a familiar phonological pattern can influence an unfamiliar one.”

AH adds that the “usual pronunciation of the final two syllables” is klee-ur, “but this sequence of sounds is rare in English.”

The usage note says the kya-lur sequence is “much more common” and “occurs in words like particular, circular, spectacular, and in many scientific words like molecular, ocular, and vascular.”

It says the “NOO-kya-lur” pronunciation “is often heard in high places” and “is not uncommon in the military in association with nuclear weaponry.”

Despite “the prominence of these speakers,” American Heritage concludes, the NOO-kya-lur pronunciation “was considered acceptable to only 10 percent of the Usage Panel in our 2004 survey.”

A usage note from the latest printing of Merriam-Webster’s Collegiate says the NOO-kya-lur pronunciation is “disapproved of by many.”

But Merriam-Webster’s notes that the pronunciation is “in widespread use among educated speakers,” including scientists, lawyers, professors, congressmen, cabinet members, and presidents.

The dictionary adds that the NOO-kya-lur pronunciation has “also been heard from British and Canadian speakers.”

Merriam-Webster’s Dictionary of English Usage makes many of the same points and suggests that people use the variant kya-lur ending because they have trouble pronouncing “nuclear” with klee-ur at the end.

The usage guide adds that “there is no other common word in English” with a klee-ur ending. (The italics are in the entry.)

We take issue with this last point. At least two common English words, “likelier” and “sicklier,” have that ending. And English speakers don’t seem to have problems pronouncing them.


A blizzard of etymology

Q: I read the other day that the term “blizzard” was first used in Estherville, Iowa. I grew up in northern Iowa, not far from Estherville, and experienced my share of blizzards, but I’d never heard this. Is it true?

A: Several towns in the upper Midwest—Marshall, Minnesota; Sturgis and Vermillion, South Dakota; and Spencer and Estherville, Iowa—have been mentioned over the years as the source of the word “blizzard.”

As a Midwesterner who’s experienced stormy winters, you won’t be surprised to hear this. But did the word really originate in your neck of the woods?

Well, Estherville can indeed take credit for the first use of “blizzard” in reference to a severe snowstorm, but the term had been around for dozens of years in another sense.

Allen Walker Read, a Columbia University etymologist and lexicographer who died in 2002, wrote two papers in the journal American Speech about his efforts to track down the roots of the word “blizzard.”

In an article published in February 1928, Read says the earliest example of the usage he found was from the April 23, 1870, issue of the Northern Vindicator, a newspaper in Estherville serving Emmet County in northwest Iowa. (Someone should write an article about the names of small-town newspapers.)

That issue of the Vindicator debunked a “glowing account” in another newspaper, the Algona Upper Des Moines, that an Emmet County resident was endangered by a severe storm that had struck the Midwest on March 14-16, 1870:

“Campbell has had too much experience with northwestern ‘blizards’ to be caught in such a trap, in order to make sensational paragraphs for the Upper Des Moines.”

A week later, on April 30, 1870, the Vindicator spelled “blizzard” with a double “z.” Under the headline “Man Frozen at Okoboji, Iowa,” an article says:

“Dr. Ballard who has just returned from a visit to the unfortunate victim of the March ‘blizzard’ reports that his patient is rapidly improving.”

In both of these articles, the word is enclosed in quotation marks, suggesting that the usage was relatively new or considered colloquial.

A couple of weeks later, in its May 14, 1870, issue, the newspaper endorsed a proposal to rename a local baseball team as “the Northern Blizzards”:

“We confess to a certain liking for it, because it is at once startling, curious and peculiarly suggestive of the furious and all victorious tempests which are experienced in this northwestern clime.”

Read notes in American Speech that O. C. Bates, the editor of the Northern Vindicator in 1870, had a fondness for coining new words, including “weatherist,” “baseballism,” and “lollygagging.”

Did he coin “blizzard”? From the available evidence, it’s likely that he either coined it or popularized it.

Read cites several 19th-century reports that suggest the term may have been in use in Estherville before the Northern Vindicator published it.

One account, for example, says the term “blizzard” was coined by a local character in Estherville who was known as Lightning Ellis and “was given to drollery and quaint expressions.”

In a February 1930 article in American Speech, Read discounts reports that the term originated elsewhere in the Midwest or even in Texas. He cites the reports as examples of “what legendary material can … grow up around a word.”

As for the earlier incarnation of “blizzard,” the term showed up for the first time in the Virginia Literary Museum, a weekly journal published at the University of Virginia.

Robley Dunglison, a co-editor of the journal, included it in a list of Americanisms published in 1829: “Blizzard, a violent blow, perhaps from blitz (German: lightning).”

Davy Crockett, in his 1834 memoir, An Account of Colonel Crockett’s Tour to the North and Down East, used the term figuratively to mean a burst of speech:

“A gentleman at dinner asked me for a toast; and supposing he meant to have some fun at my expense, I concluded to go ahead, and give him and his likes a blizzard.”

Is the English word derived from the German blitz, as Dunglison and others have suggested?

The Oxford English Dictionary apparently thinks not. It doesn’t mention blitz and debunks speculation that the French blesser (to wound) may be the source.

The OED suggests instead that “blizzard” is “probably more or less onomatopoeic; suggestive words are blow, blast, blister, bluster.”

Oxford defines the term in its original sense as “a sharp blow or knock; a shot. Also fig. U.S.”

In his February 1930 article, Read notes the appearance of the word “blizz” in a weather sense in a May 31, 1770, entry in the diary of Col. Landon Carter: “At last a mighty blizz of rain.” He cites this usage as a “semantic shift in the very process of making.”

In the same article Read notes examples of the surname “Blizzard” (or “Blizard”) dating back to the mid-17th century. In 1658, one citation reports, “a Capt. Charles Blizard left this country for Antigua.”

However, Read seems skeptical about the relevance of the surname “to the semantics of the content word blizzard.”

Did the original sense of “blizzard” as a sharp blow or a shot lead to the use of the word to mean a severe storm?

The Merriam-Webster New Book of Word History notes the early evolution of the term and concludes:

“From a shotgun blast to a verbal blast to a wintry blast would seem to be a reasonable enough development, but we cannot demonstrate it.”

We can’t prove it either, but we think that’s a reasonable explanation.

And while we’re on the subject of extreme-weather terms coined in Iowa, here’s another one: “derecho.”

As we wrote on our blog last August, it was created by a University of Iowa professor in the late 19th century to describe a variety of severe thunderstorm.

Check out our books about the English language


“Inalienable” or “unalienable”?

Q: When President Obama quoted from the Declaration of Independence in his Inaugural Address, he used the word “unalienable.” But I’ve also seen the word as “inalienable.” Which is correct English? Which is actually in the Declaration?

A: Both “inalienable” and “unalienable” are legitimate English words, and they have identical meanings.

The word in the final version of the Declaration of Independence is “unalienable,” though it’s “inalienable” in earlier versions of the document. Here’s the word in context:

“We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”

You can see an image of the final version on the National Archives page for the Declaration. Click “read transcript” to see a copy in ordinary print.

President Obama has used both words over the years. In his Inaugural Address on Jan. 21, 2013, he referred to “unalienable rights,” but in remarks about gun violence on Jan. 16, 2013, he used the phrase “inalienable rights.”

Although both words are correct, the one we see most often now is “inalienable.” And that’s the word some dictionaries seem to prefer.

For example, Merriam-Webster’s Collegiate Dictionary (11th ed.) has an entry for “inalienable” (defined as “incapable of being alienated, surrendered, or transferred”). But under “unalienable,” the dictionary simply says it means “inalienable.” 

Over the years, many other Americans have puzzled over which word is “correct” and which one actually appears in the Declaration. The nonprofit Independence Hall Association, based in Philadelphia, has a page devoted to this question on its website.

As you’ll see, the site has photocopies of the various drafts of the Declaration, some with “inalienable” (in Thomas Jefferson’s handwriting) and some with “unalienable” (in John Adams’s).

The website quotes a footnote from Carl Lotus Becker’s The Declaration of Independence: A Study in the History of Political Ideas (1922):

“The Rough Draft reads ‘[inherent &] inalienable.’ There is no indication that Congress changed ‘inalienable’ to ‘unalienable’; but the latter form appears in the text in the rough Journal, in the corrected Journal, and in the parchment copy. John Adams, in making his copy of the Rough Draft, wrote ‘unalienable.’ Adams was one of the committee which supervised the printing of the text adopted by Congress, and it may have been at his suggestion that the change was made in printing. ‘Unalienable’ may have been the more customary form in the eighteenth century.”

As we said, both words are legitimate. They’ve been part of the language since the early 17th century.


Make sure you’re sure

Q: A friend of mine, a Stevie Wonder fan, has a “Make Sure You’re Sure” ringtone on his cell. After listening to it a few hundred times, the phrase “make sure” started to sound funny to me. Is it proper English?

A: The phrase “make sure” is a fine old usage dating back to the 16th century, and Stevie Wonder is using it properly in that song, part of the score and soundtrack he composed for the 1991 Spike Lee movie Jungle Fever.

In its earliest usage, according to the Oxford English Dictionary, the phrase meant “to make something certain as an end or result … to preclude risk of failure.”

The OED’s earliest example in writing is from Cardinal William Allen’s A Defence and Declaration of the Catholike Churches Doctrine, Touching Purgatory (1565):

“And therefore to make sure, I humbly submit my selfe, to the iudgement of suche [as] … are made the lawful pastors of our soules.”

Here are some more OED citations:

1698: “To make sure, he made another Shot at her.” (From a description of a tiger hunt in John Fryer’s A New Account of East-India and Persia.)

1891: “It is difficult to make sure of finding the birds.” (From Chambers’s Journal.)

The phrase is still used in that sense. But “make sure of” is also used to mean “to act so as to be certain of getting or winning; to secure,” as the OED says.

Oxford has citations ranging from the 17th to the 19th centuries. The earliest is from a letter written in 1673 by Sir William Temple: “A Peace … cannot fail us here provided we make sure of Spain.”

The phrase can also be followed by a clause, as in this example from Frances Eliza Millett Notley’s novel The Power of the Hand (1888): “That fellow rode up to the house to make sure Tristram was away.”

In another practice dating from the 19th century, the OED says “make sure” is used loosely to mean “to feel certain, be convinced.”

This citation is from Frederick C. Selous’s Travel and Adventure in South-east Africa (1893): “I made sure I should get finer specimens later on.”

Stevie Wonder uses it in this looser sense, and we’ll end with a few lines of his lyrics:

Well the night is young
And the stars are out
And your eyes are all aglow
And you say you feel
Ways you’ve never felt
But are you sure, make sure you’re sure


You better believe it

Q: I’m Australian, and an American friend often says things like “I better not forget it” instead of “I’d better not forget it.” Is this correct? Is it a case of US usage differing from UK/Australian usage?

A: The idiomatic phrase “had better” (as in “I had better study” or “We’d better go”) is a venerable usage with roots far back in Old English.

The shortened form “better” (as in “I better study” or “We better go”) dates from the 1830s and is used informally in both British and American English.

In fact, Fowler’s Modern English Usage (rev. 3rd ed.) says it’s not unheard of in your neck of the woods: “In practice this use of an unsupported better is much more common in North America, Australia, and NZ than in Britain.”

Using “better” by itself is fine except in formal English. “In a wide range of informal circumstances (but never in formal contexts) the had or ’d can be dispensed with,” Fowler’s says.

Merriam-Webster’s Dictionary of English Usage calls “had better” a standard English idiom and agrees with Fowler’s that “better,” when used alone in this sense, “is not found in very formal surroundings.”

The Oxford English Dictionary’s earliest citation for the construction without “had” is from a pseudonymous letter to a newspaper by “Major Jack Downing”:

“My clothes had got so shabby, I thought I better hire out a few days and get slicked up a little.” (The letter was published in a book in 1834 but was written in 1831.)

The OED says the abbreviated usage originated in the US, and labels it a colloquialism. But Merriam-Webster’s Collegiate Dictionary (11th ed.) lists it without reservations.

The Merriam-Webster’s editors give the example “you better hurry” and say that “better” in this sense is a “verbal auxiliary.”

It should be noted that even the full phrase, “had better,” was criticized by some in the 19th century on the ground that it was illogical and couldn’t be parsed.

An 1897 issue of the Ohio Educational Monthly says many teachers found “had better” and other idioms “very difficult to dispose of grammatically.”

“Because some teachers do not understand how to dispose of them, they teach that they are incorrect,” the monthly adds. “They insist upon changing ‘had better’ to ‘would better.’ ”

In other words, the schoolmasters condemned what they couldn’t understand, and offered a replacement that was even harder to justify.

Even the poet Robert Browning disgraced himself here. In early editions of his dramatic poem Pippa Passes, first published in 1841, the final scene has the line “I had better not.” In later editions, Browning changed the line to “I would better not.”

According to William J. Rolfe and Heloise E. Hersey, who edited an 1886 edition of Select Poems of Robert Browning, the poet took a dislike to “the good old English form ‘had better.’ ”

Why? Because he mistook the “I’d” in “I’d better” as a contraction of “I would” instead of “I had.”

Browning once explained in a letter that he was repudiating “the slovenly I had for I’d, instead of the proper I would,” on the advice of his friend Walter Savage Landor, who hotly criticized many well-known English idioms.

As Rolfe and Hersey write in a footnote: “This is essentially the familiar grammar-monger’s objection to had better, had rather, had as lief, etc., that they ‘cannot be parsed’—which is true of many another well-established idiom, and merely shows that the ‘parsers’ have something yet to learn.”

A look at the history of “had better” helps to illuminate its meaning.

The idiom was first recorded in writing in the 10th century, according to the OED.

The original form was “were better,” and it was used with object (or, more properly at that time, dative) pronouns: “him,” “me,” “us,” and so on.

As the OED explains, the phrase me were betere meant “it would be more advantageous for me,” and him wære betere meant “it would be better for him.”

The OED’s earliest example in writing is from a collection known as the Blickling Homilies (971): “Him wære betere thæt he næfre geboren nære.” (“Better it were for him never to have been born.”)

During the Middle English period, the pronouns began changing into the nominative (“he were better,” “I were better,” etc.).

And finally, beginning in the 16th century, “were better” gave way to the modern “had better.” As the OED says, “I had better = I should have or hold it better, to do, etc.”

Oxford’s earliest example is from Nicholas Udall’s Thersytes, a farce that some scholars date to 1537: “They had better haue sette me an errande at Rome.”

The OED also cites this line from a letter written by Sir John Harington in the early 1600s: “Who livethe for ease had better live awaie [from Court].”

Historical note: Harington was a courtier to Elizabeth I, and one of his claims to fame is that he designed Britain’s first flushable toilet, which he installed in his manor house in Somerset. He included an image in a work he wrote on the subject.


Invasion of the brainworms

Q: During the college football bowls, an advertiser proclaimed that “by the end of this game, you or your company can have its own [x].” That sentence is now a worm in my brain. Should “its” have been “your”? Help!

A: When a compound subject is joined by “or” or “nor,” the verb agrees with the part that’s closer (“Cookies or cake is fine” … “Cake or cookies are fine”).

The same is true of any accompanying possessive pronoun (“Cookies or cake has its uses” … “Cake or cookies have their uses”). Take your cue from the part of the subject that’s nearer the verb.

But correct or not, this rule of subject-verb agreement can lead to extremely awkward sentences.

If the problem is that one part of the subject is singular and the other plural (as in the examples above), it often pays to put the plural part last: “Cake or cookies have their uses.”

This solution won’t give anybody a brainworm, because despite the “or” there’s a notion of plurality in that kind of sentence.

To use another example, it may be correct to write, “Neither they nor she has paid her tab.” But it sounds better to turn the subject around: “Neither she nor they have paid their tab.”

The problem isn’t as easy to fix when a compound consists of a “you” and an “it.” Technically, that advertiser was correct: “By the end of this game, you or your company can have its own [x].” But ouch!

And turning the subject around doesn’t help: “your company or you can have your own [x].” Ouch again!

Any sentence that leaves a worm in your brain should be recast, even if it’s written by the rules. There’s always a better way.

For example, the advertiser could have said, “By the end of this game, you or your company can have an all-new, one-of-a-kind [x].”

Speaking of brainworms, you don’t hear the usage much nowadays, except in zoology, where the term “brainworm” refers to a parasitic roundworm that infects the brains of deer, moose, and other large hoofed animals.

However, the term has been used figuratively since the early 1600s to describe an imaginary worm infecting the brain, according to citations in the Oxford English Dictionary.

Here’s an example from Antiquity Revived, a 1693 religious tract: “Which undutiful and turbulent Allegation has not seldom created such a restless Brain-worm in the noddles of the multitude.”

The latest OED citation for the figurative use is from Musicophilia: Tales of Music and the Brain, a 2007 book in which Oliver Sacks discusses the earworms set off by movie, TV, and advertising music:

“This is not coincidental, for such music is designed, in the terms of the music industry, to ‘hook’ the listener, to be ‘catchy’ or ‘sticky,’ to bore its way, like an earwig, into the ear or mind; hence the term ‘earworms’—though one might be inclined to call them ‘brainworms’ instead.”

We hope this helps you get rid of that brainworm of yours.


Why “won’t” isn’t “willn’t”

Q: I was having a conversation with one of my co-workers about “won’t” and grabbed my office copy of Woe Is I to resolve the issue, only to find (or fail to find) that the use of this word is not explained in the book. Can you render an opinion as to its acceptability?

A: “Won’t” is a perfectly acceptable contraction of “will” and “not.” However, it’s an odd bird that’s been condemned at times for not looking quite like other contractions.

Merriam-Webster’s Dictionary of English Usage describes it as “one of the most irregular looking of the negative contractions that came into popular use during the 17th century.” Others include “don’t,” “han’t,” “shan’t,” and “an’t” (an early form of “ain’t”).

Why, you may ask, do we contract “will” and “not” as “won’t” instead of “willn’t”? Here’s Merriam-Webster’s explanation:

Won’t was shortened from early wonnot, which in turn was formed from woll (or wol), a variant form of will, and not.”

The M-W editors give early examples of “won’t” from several Restoration comedies, beginning with Thomas Shadwell’s The Sullen Lovers (1668): “No, no, that won’t do.” 

By the way, the verb “will” has been spelled all sorts of ways since first showing up as wyllan around the year 1000 in Aelfric’s Grammar, an Old English introduction to Latin grammar.

The Oxford English Dictionary has many Middle English examples of the wole or wol spelling dating back to the 1200s.

So etymologically, there’s a case to be made for contracting “will” and “not” as “won’t.” Nevertheless, some language commentators have grumbled about the usage.

Joseph Addison, for example, complained in a 1711 issue of the Spectator that “won’t” and other contractions had “untuned our language, and clogged it with consonants.”

“Won’t,” in particular, “seems to have been under something of a cloud, as far as the right-thinkers were concerned, for more than a century afterward,” Merriam-Webster’s says.

“This did not, of course, interfere with its employment,” the usage guide adds.

It was popular enough, M-W says, “to enjoy the distinction of being damned in the same breath as ain’t in an address delivered before Newburyport (Mass.) Female High School in December 1846.”

Both “won’t” and “ain’t” were condemned by the Newburyport speaker as “absolutely vulgar.”

“How won’t eventually escaped the odium that still clings to ain’t is a mystery,” M-W Usage says, “but today it is entirely acceptable.”

Of course a few sticklers still feel that all contractions aren’t quite quite. Well, we beg to differ. As we’ve written on the blog, contractions are impeccably good English.


Zero-sum games

Q: I see references to both “zero-sum games” and “zero-sum gains” on the Internet. Which is correct?

A: The term “zero sum” is widely misunderstood as meaning that nobody wins—or perhaps that nobody loses. In fact it means quite the opposite.

In any competitive situation, one side can’t win unless the other loses. “Zero-sum” means that when the losses are subtracted from the gains, the sum is zero.

The adjective “zero-sum” originated in the field of game theory in the mid-1940s, and it’s still commonly used to modify the word “game.” But “zero-sum” is also used to modify all kinds of nouns and to describe a wide variety of situations.

It would be inappropriate, however, to use it in the phrase “zero-sum gain.” That’s because “zero-sum” implies an equal balance between gain and loss.

We suspect that people are simply misunderstanding the phrase and hearing “gain” instead of “game.”

You’re right, though, that there’s a lot of zero-sum gaining on the Web. We got nearly 200,000 hits when we googled “zero-sum gain.” But we had nearly ten times as many hits for “zero-sum game.”

In game theory, as the Oxford English Dictionary explains, the adjective “zero-sum” is “applied to a game in which the sum of the winnings of all the players is always zero.”

In other words, the losses offset the gains, and the sum of losses and gains is zero.
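The arithmetic can be sketched in a few lines of Python. The three-outcome game below is our own hypothetical illustration (not from the OED or game theory literature); the only point is that in every outcome one side’s gain exactly equals the other side’s loss:

```python
# Hypothetical payoff table for a two-player zero-sum game.
# Each outcome maps to a (player A payoff, player B payoff) pair.
payoffs = {
    "A wins": (+10, -10),
    "B wins": (-10, +10),
    "draw":   (0, 0),
}

# The defining property: in every outcome, gains and losses cancel,
# so the sum of all payments is always zero.
for outcome, (a, b) in payoffs.items():
    assert a + b == 0, f"{outcome} is not zero-sum"

print("every outcome sums to zero")
```

If any payoff pair summed to something other than zero (say, a trade where both sides profit), the game would not be zero-sum, which is why “zero-sum gain” gets the idea backwards.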

But “zero-sum” is also used, the OED explains, to denote “any situation in which advantage to one participant necessarily leads to disadvantage to one or more of the others.”

So, for example, in “zero-sum diplomacy,” both sides can’t be winners.

The adjective was first used, according to OED citations, in John Von Neumann and Oskar Morgenstern’s book Theory of Games and Economic Behavior (1944):

“An important viewpoint in classifying games is this: Is the sum of all payments received by all players (at the end of the game) always zero; or is this not the case? … We shall call games of the first mentioned type zero-sum games.”

Here are a few more of the quotations cited in the OED:

“Perhaps the contestants in most important games nowadays (from labour disputes … to international diplomacy) too readily regard their games as zero-sum.” (From Stafford Beer’s book Decision and Control, 1966.)

“Everybody can win. Manufacturing is not a zero-sum game.” (A quote by L. B. Archer, from Gordon Wills and Ronald Yearsley’s Handbook of Management Technology, 1967.)

“C. Wright Mills … used a zero-sum conception of power (i.e., the more one person had the less was available to others).” (From the Times Literary Supplement, 1971.)

“We live in a zero-sum world.” (From the former BBC magazine The Listener, 1983.)


A noun for being upside down

Q: Why is there no word that describes the state of being upside down?

A: There’s a hyphenated word that may be what you’re looking for. It’s a noun, “upside-downism” (what else?), and the Oxford English Dictionary has exactly one citation for its use.

The word appeared in a book called The Oxonian in Iceland (1861), a travel book by Frederick Metcalfe about a trip taken in the summer of 1860.

We’ll expand the OED citation to provide some context. Here’s Metcalfe, describing a horseback ride through a volcanic region known as a “hraun” (Icelandic for “lava”):

“It was a ruin indeed, the abomination of desolation; as if the elements of some earlier world had melted with fervent heat; and as they cooled had burst asunder and been hurled by the Demons of Misrule and Upside-downism into a disjointed maze of confusion worse confounded.”

(Makes the eruption sound like a moral failing on the part of the volcano, doesn’t it?)

The OED doesn’t define “upside-downism,” but it describes it as a derivative of “upside down,” which has had an appropriately topsy-turvy history since it entered English in the 1300s.

For many centuries, “upside down” was exclusively an adverb (as in “turned upside down”). The adjective, usually hyphenated (as in “upside-down cake”), came along in the mid-19th century.

When originally recorded, the Middle English adverb was up so doun (or up swa doun in northern dialects), and it apparently meant something like “up as if down.”

It first appeared in writing around 1340 in a Northumbrian religious poem, The Pricke of Conscience, which the OED attributes to the Oxford-educated mystic and hermit Richard Rolle:

“Tharfor it es ryght and resoune, / That they be turned up-swa-doune.” (We’ve converted the letter thorn to “th” throughout.)

The term appeared in another 14th-century poem, a verse rendition of The Seven Sages of Rome, an ancient Eastern collection of tales found in many languages and probably about 2,500 years old.

Here’s the couplet: “The cradel and the child thai found / Up so doun upon the ground.”

As the OED says, “The use of so is peculiar, the only appropriate sense being that of ‘as if.’ ”

At any rate, the “so” eventually disappeared. The OED explains that the compound was “frequently reduced to upsa-, upse-, and subsequently altered to upset and upside down, in the endeavour to make the phrase more intelligible.”

During the 15th and 16th centuries there were many versions of the term, including “opsadoun,” “upsedoun,” “up set doune,” “upset downe,” “upsydowne,” “vpsyde downe,” and others.

By the early 17th century, the modern spelling “upside down” had become established.

You didn’t ask, but the playful interjection “upsy-daisy,” which we’ve written about on our blog, is no relation—apart from the presence of “up.”


“Like” minded

Q: During the apocalyptic talk about the Mayan calendar, I wrote, “Planet Earth will not blow up like Krypton or be smashed by Planet X.” Is “like” OK here? (Krypton, the home planet of Superman, blew up just after little Kal-el left.)

A: The passage you wrote is fine as it is. In the sentence “Planet Earth will not blow up like Krypton or be smashed by Planet X,” the words “like” and “by” are prepositions. The underlined parts, “like Krypton” and “by Planet X,” are prepositional phrases.

This represents the traditionally correct use of “like”—as a preposition. The problem you’re thinking of is the use of “like” as a conjunction, a usage many sticklers frown on.

“Like” is used as a conjunction when it introduces a clause, as in “like Krypton did.” (A clause, you probably know, contains a verb and its subject.) A stickler would insist on “as” instead: “as Krypton did” (or “as did Krypton”).

However, the English in your original example is impeccable, even if you regard the verb “did” as implied but not expressed.

But what if you had included the verb (“like Krypton did”)? Here we part company with the sticklers, because even then we’d give you a passing grade.

You’re not writing elevated, formal prose. And as we’ve said before on our blog, the use of “like” as a conjunction is no crime in less than formal writing.

In fact, it represents a return to the past, before the 19th-century prohibition against the conjunctive “like” came along. And you don’t have to take our word for it.

Here’s what Merriam-Webster’s Collegiate Dictionary (11th ed.) has to say in a usage note:

“Like has been used as a conjunction since the 14th century. In the 14th, 15th, and 16th centuries it was used in serious literature, but not often; in the 17th and 18th centuries it grew more frequent but less literary. It became markedly more frequent in literary use again in the 19th century.”

It wasn’t until the mid-19th century, according to Merriam-Webster’s, that the usage came under fire. The dictionary’s conclusion:

“There is no doubt that, after 600 years of use, conjunctive like is firmly established. It has been used by many prestigious literary figures of the past, though perhaps not in their most elevated works; in modern use it may be found in literature, journalism, and scholarly writing. While the present objection to it is perhaps more heated than rational, someone writing in a formal prose style may well prefer to use as, as if, such as, or an entirely different construction instead.”

By the way, you might like to see a posting of ours about the use of “like” for “such as.” (Yes, it’s OK.)

And if you’re still “like”-minded, you might look at an article that Pat wrote for the New York Times Magazine about the use of “like” to quote or paraphrase people, as in “She’s like, what unusual taste you have.”


Elliptical reasoning

Q: I’m a court reporter working in Baton Rouge. When someone ends a sentence with “so,” we have differing thoughts here amongst the 20 plus of us. Example: “Question, Why did you buy the drugs? Answer, I had the money, so.” Some here end the answer with “money … so.” What are your thoughts?

A: In the sense you’re talking about, “so” is a conjunction meaning “therefore” or “consequently” or “with the result that.”

A sentence ending with this kind of “so” is incomplete. The speaker is indicating that a fuller reply is possible but isn’t being offered. So the sentence is deliberately incomplete—that is, the speaker wasn’t cut off mid-sentence.

Such a deliberate omission, according to the ordinary rules of English punctuation, should be indicated with ellipsis points at the end.

The Chicago Manual of Style (16th ed.), under section 13.53 (Deliberately incomplete sentence), says, “Three dots are used at the end of a quoted sentence that is deliberately left grammatically incomplete.”

So your sentence would be transcribed this way: “I had the money, so …” Note the space between “so” and the first ellipsis point. No period follows (three dots, not four).

You definitely should not use ellipsis points BEFORE the “so.” If any are used at all, they should FOLLOW the “so.”

But if ellipsis points would lead to any ambiguity, don’t use them.

We wouldn’t use them, for example, if there’s any chance that they could be interpreted as meaning that something unintelligible followed or that the speaker was interrupted.

We weren’t familiar with the punctuation conventions of court reporting, but we took a crash course by visiting the website of Margie Wakeman Wells, author of Court Reporting: Bad Grammar/Good Punctuation.

Wells says the use of ellipses to mark a trailing off has been gaining favor among court reporters. She says the use of a dash should be avoided, even with a space before the dash and a period after it.

A dash would be misleading, in our opinion: “I had the money, so—” It would indicate that the speaker was cut off.

If any misinterpretation is possible, we’d throw the rules to the wind and use a simple period: “I had the money, so.”

Your principal aim is not to follow the conventions of ordinary English punctuation, but to accurately convey the sense of what a witness said. And a simple period would do that.

In short, ellipsis points would be our choice—but ONLY if you’re sure that ellipses couldn’t be interpreted as signaling that the speaker was cut off or unintelligible.


“Each other” vs. “one another”

Q: Some good writers use “each other” and “one another” interchangeably, while others use them in distinctly different ways. What are your thoughts?

A: These terms are interchangeable, despite a common belief that “each other” is properly used in reference to two people or things, and “one another” for more than two. In fact, we’re revising our own thinking on this one.

In a 2006 posting, we said it was OK to use “one another” in either case. But we didn’t go far enough. We said most usage experts would object to using “each other” for three or more, though we acknowledged that the distinction was being relaxed.

Seven years later, our opinion has changed. The old distinction isn’t worth preserving—even for “each other”—and it wasn’t valid in the first place.

Merriam-Webster’s Dictionary of English Usage explains that “the prescriptive rule that each other is to be restricted to two and one another to more than two” can be traced to a 1785 grammar book written by George N. Ussher. But it notes that there’s no foundation for such a rule.

Evidence in the Oxford English Dictionary shows “that the restriction has never existed in practice,” the M-W editors write, adding:

“The interchangeability of each other and one another had been established centuries before Ussher or somebody even earlier thought up the rule.”

The usage guide concludes that the restriction is a mere invention (or, as M-W puts it, “was cut out of the whole cloth”) and “there is no sin in its violation.”

The OED’s entries for the expressions confirm this. The dictionary says “each other” means the same thing as “one another.” And it defines “one another” as a “compound reciprocal pronoun” referring to “two or more.”

R. W. Burchfield, the editor of Fowler’s Modern English Usage (rev. 3rd ed.), agrees that the traditional restriction isn’t valid:

“The belief is untenable,” Burchfield writes. He goes on to quote many respected writers who use “one another” for two and “each other” for three or more.

Standard dictionaries also recognize the terms as interchangeable.

The American Heritage Dictionary of the English Language (5th ed.) says the distinction between the two “is often ignored without causing confusion and should be regarded more as a stylistic preference than a norm of Standard English.”

Webster’s Third New International Dictionary, Unabridged, says “each other” means “each of two or more in reciprocal action or relation.” And “one another,” the dictionary says, means “each other.”

In short, this is an issue of style rather than correctness. There’s no harm in following that “traditional” rule if you like, but there’s no harm in ignoring it either.



Lex appeal: Does size matter?

Q: How many words do most native English speakers know? Do Brits know more than Americans? How many do language mavens know? How about Shakespeare, Samuel Johnson, etc.? And what about age or educational level?

A: We’re afraid this will disappoint you. Many of your questions are impossible to answer. And even if we could contrive numbers for you, they wouldn’t be very meaningful.

We had a post on our blog a few years ago about the difficulty of counting words and comparing the lexicons of different languages.

The linguists Robert P. Stockwell and Donka Minkova, who discuss this in their book English Words: History and Structure (2001), write:

“A question which everyone wonders about, and often asks of instructors, is ‘How many words does English have?’ And even more commonly, ‘How many words does the typical educated person know, approximately?’ There are no verifiable answers to these questions.”

They do say that Shakespeare is known to have used about 30,000 different words in his plays, and that “a really well-educated adult” may have a vocabulary of up to 100,000 words—“but this is a wildly unverifiable estimate.”

As for the size of the lexicon, they conclude: “Nobody knows how many words English has.”

The linguist David Crystal said more or less the same thing in a 1987 article in the journal English Today:

“How many words are there in English? And how many of these words does a native speaker know? These apparently simple little questions turn out to be surprisingly complicated. In answer to the first, estimates have been given ranging from half a million to over 2 million. In answer to the second, the estimates have been as low as 10,000 and over ten times that number.”

We can tell you that the biggest English dictionaries have about half a million words, but that’s no help because dictionaries are selective.

The editors at Oxford Dictionaries Online and Merriam-Webster’s Online discuss the difficulties of counting the number of words in English.

The principal problem in coming up with a number is which words to count. Are “do” and “does” two separate words? How about “doing,” “doer,” “don’t,” and “undo”? What about “cat” and “cats,” not to mention “catlike,” “catty,” and “anti-cat”?

That 30,000-word estimate for Shakespeare, as Stockwell and Minkova say, would drop to “about 21,000 if you count play, plays, playing, played as a single word,” and do the same in similar cases.

Do features like prefixes (“anti-,” “re-,” “un-,” etc.) and suffixes (“-ly,” “-er,” “-ing”) swell the number of possible words we count? Is a word with two meanings (say, “cleave”) counted as one word or two? Should we count symbols, acronyms, initialisms, spelled-out numbers? The questions go on and on.

We’ve also found varying statements about the number of words the average person knows or uses.

In their book Theory of Language (1999), the linguists Steven Weisler and Slavoljub P. Milekic estimate that “an average-educated English-speaking adult knows more than 50,000 words.”

But they say a person’s “lexical capacity” is larger. As current events and new technology create the need for new language, the authors write, “English-speakers are free to make up new words and to create new uses of existing words at the spur of the moment.”

You ask about age and educational level and how they affect vocabulary. Here’s what the British language writer Michael Quinion says on his website World Wide Words:

“It’s common to see figures for vocabulary quoted such as 10,000-12,000 words for a 16-year-old, and 20,000-25,000 for a college graduate. These seem not to have much research to back them up.”

So much for vocabulary size. But how many of those tens of thousands of words do we actually use? According to the Collins Corpus, an analytical database of English, “around 90% of English speech and writing is made up of approximately 3,500 words.”

That doesn’t sound like a lot, but let’s call it a day. We’ve run out of our daily quota of words.
