Frequently asked questions about English
- Is it wrong to end a sentence with a preposition?
- Is it wrong to split an infinitive?
- Why do people pronounce “ask” like “axe”?
- Is it “I feel bad” or “I feel badly”?
- How about “I’m good” vs. “I’m well”?
- What’s the origin of “the whole nine yards”?
- Should I say “I wish I was” or “I wish I were”?
- Which is right: “If I was” or “If I were”?
- Is it OK to say “he graduated college”?
- How do you pronounce “forte”?
- Is it all right to use “that” instead of “who” to refer to a person?
- Is it “an historic” or “a historic”?
- Should I use “toward” or “towards”?
- Is it OK to say something fell “off of” a truck?
- What’s wrong with “Where’s it at?” and “Where it’s at”?
- Why do people begin sentences with “so”?
- Can one soldier be called a “troop”?
- When should I use “whom” instead of “who”?
- What’s the difference between “that” and “which”?
- What’s wrong with “not that big of a deal”?
- Is it “taller than I” or “taller than me”?
- What is the subjunctive?
- What’s the story with “whole nother”?
- Why do New Yorkers say “on line” instead of “in line”?
- What does “beg the question” mean?
- How can an exception “prove the rule”?
- Where does “spitting image” come from?
- Is it OK to use “myself” instead of “I” or “me”?
- Why do people say, “The problem is, is …”?
- Why do English spellings make no sense? Shouldn’t “daughter” and “laughter” rhyme? And why the “l” in “should” and “would”?
- Is it wrong to say, “It was so fun,” or “It’s a fun movie”?
- Why do people say “I could care less”?
- Why is English a Germanic language, if it has so many words from Latin?
1. Preposition at the end:
There’s no grammatical reason not to end a sentence with a preposition. The 17th-century poet John Dryden concocted this so-called rule, apparently to make English act more like Latin. But we can blame Robert Lowth, an 18th-century clergyman and Latin scholar, for popularizing it.
The prohibition caught on, perhaps because of its simple-mindedness, even though great literature from Chaucer to Shakespeare to Milton is full of sentences ending in prepositions (positioning words like “at,” “on,” “over,” “up,” and so on).
The reason we’re inclined to end sentences with prepositions is that this is normal Germanic sentence structure. And English is a Germanic (not a Romance) language. (See FAQ #33.)
While there’s nothing wrong with ending an English sentence with a preposition, one can overdo it. A quotation attributed to E. B. White shows how silly a pile of prepositions can sound. The child supposedly says to his father, who has brought the wrong book upstairs for bedtime reading: “What did you bring me that book I don’t want to be read to out of up for?”
2. Split infinitive:
The belief that it’s wrong to split an infinitive is a notorious myth that grammarians have been trying to debunk for generations. This never was a genuine rule. It was merely a misconception based on the wrong-headed notion that English (a Germanic language) should conform to the rules of Latin sentence structure.
An infinitive is a verb in its simplest form and often has the word “to” in front of it: “Darcy helped to find Lydia and Wickham.” But the “to” isn’t actually part of the infinitive and it isn’t always necessary: “Darcy helped find Lydia and Wickham.”
The prohibition against “splitting” an infinitive was born in the 19th century, when Latin scholars misguidedly called it a crime to put a descriptive word between the prepositional marker “to” and the infinitive: “Darcy helped to quickly find Lydia and Wickham.”
Since “to” isn’t part of the infinitive, however, nothing is being split, and the whole idea of “splitting” an infinitive is nonsense. (In Latin, the infinitive is a single word without a prepositional marker, and obviously can’t be split.)
A sentence often sounds better when the “to” is close to the infinitive, but there’s no harm in separating them by putting an adverb or two in between. Writers of English have been happily “splitting” infinitives since the 1300s. So if you want to happily join them, feel free.
3. “Axe” for “ask”:
Contrary to popular belief, the “axe” pronunciation of “ask” isn’t limited to some African Americans or Southern whites. It’s widespread, and is even heard in parts of Britain. Today this pronunciation is considered nonstandard, but it wasn’t always so.
The verb entered Old English in the eighth century and had two basic forms, “ascian” and “acsian.” During the Middle English period (1100-1500), the latter form (“acsian”) became “axsian” and finally “axe” (or “ax”), which was the accepted written form until about 1600.
Chaucer, in The Parson’s Tale (1386), writes of “a man that … cometh for to axe him of mercy.” And in Miles Coverdale’s 1535 translation of the Bible, there are lines like “Axe and it shal be given you,” and “he axed for wrytinge tables.”
In the early 17th century, “ask” (which had been lurking in the background) replaced “ax.” Though the spelling changed and the consonant sounds were switched in standard English, the old pronunciation can still be heard in the Midland and Southern dialects of Britain, according to the Oxford English Dictionary, as well as in the US.
Some have speculated that perhaps the earlier “axe” was somehow passed on to slaves (which could help explain why it survived among some African Americans). But there’s no evidence to support that theory, and besides, whites use the pronunciation too.
4. “Feel bad”:
“Feel” (meaning to sense rather than to touch) is a linking verb, along with “be,” “appear,” “become,” “grow,” “look,” “remain,” “seem,” “smell,” “sound,” and “taste.” Linking verbs are modified by adjectives (like “bad” and “good”) rather than adverbs (“badly” and “well”).
A linking verb differs from other verbs in that it conveys a state or condition, rather than an activity. It’s called a linking verb (or copula) because it merely links (or couples) the subject with the complement, as in “He seems tired.”
In other words, the complement is an adjective rather than an adverb; in effect, it modifies the subject rather than the verb.
If you use “feel” to mean touch, however, then it’s NOT a linking verb, and it would take an adverb (“His fingertips are numb from the cold, so he feels badly” … “Now that his fingertips have warmed up, he feels well”). Few people talk this way, though.
For more on the subject, see the answer to FAQ #5.
5. “I’m good”:
Some people who say “How are you?” object to hearing “I’m good” or “I feel good” in reply. But you can correctly say either “I’m well” or “I’m good,” and “I feel well” or “I feel good,” depending on your meaning.
Most usage guides will tell you that “well,” when applied to a person, means “healthy.” So if you said, “I’m well” or “I feel well,” you’d be talking about your state of health.
If you said “I’m good” or “I feel good,” according to these usage guides, you’d be talking about your state of being (how you felt in general). A dying man might say, “I feel good, knowing my affairs are in order.” He wouldn’t mean that he felt well.
Here’s the reasoning.
When health isn’t involved, adjectives (like “good”) are used instead of adverbs (like “well”) to modify linking verbs (like “is” or “feel” or “smells”). A linking verb is one that describes a state or condition rather than an action. That’s why we say “He smells good,” “It tastes good,” “She looked good,” and so on.
But adverbs (like “well”) are used to modify verbs showing activity (such as “skate” or “tango”). That’s why we say “He skates well” and “She tangos well.”
One way to remember this is to recall the song lyric “I feel pretty.” With a linking verb like “feel,” you use an adjective (“pretty”), not an adverb (“prettily”). Same with “is” and “smells” and “seems” and other linking verbs: “He is nice” … “They seem nice” … “That smells nice.”
In theory, you can “feel badly” when your sense of touch is awry, or “smell badly” when your nose is stuffed, but we’ve never actually heard anyone use those expressions in real life. (“Feel” and “smell” are action verbs here: they refer to the acts of feeling and smelling, not the conditions.)
For more on the subject of linking verbs, see the answer to FAQ #4.
6. “Whole nine yards”:
As of this writing (late 2009), we simply don’t know for sure when or how “the whole nine yards” originated. Many theories (involving cement mixers, ammunition belts, nuns’ habits, Scottish kilts, ships’ sails, shrouds, garbage trucks, a maharaja’s sash, a hangman’s noose, and others) have been proposed, but there’s no evidence for any of them.
To date, the earliest known use of “the whole nine yards” in print comes from Senate testimony by Vice Admiral Emory S. Land in 1942 about production at nine shipyards:
“You have to increase from 7.72 to 12 for the average at the bottom of that fifth column, for the whole nine yards.” (The admiral is obviously using the phrase literally here to mean nine shipyards, not in its usual sense as the whole enchilada. And it’s highly unlikely that the figurative use grew out of this literal one, since the admiral’s testimony at an obscure committee meeting was not widely known.)
As of now, the earliest published reference for the expression in its usual sense is from “Man on the Thresh-Hold,” a short story by Robert E. Wegner printed in the fall 1962 issue of the literary magazine Michigan’s Voices. A rambling sentence in the story refers to “house, home, kids, respectability, status as a college professor and the whole nine yards, as a brush salesman who came to the house was fond of saying, the whole damn nine yards.”
It’s clear that the author didn’t coin this usage. We can safely assume that it was an expression familiar to him (though perhaps not to his readers, since he felt the need to explain it somewhat).
The next known appearance is from a letter to the editor published in the December 1962 issue of Car Life magazine: “Your staff of testers cannot fairly and equitably appraise the Chevrolet Impala sedan, with all nine yards of goodies, against the Plymouth Savoy which has straight shift and none of the mechanical conveniences which are quite common now.”
But perhaps the most tantalizing early citation so far is from an article by the World Book Encyclopedia Science Service about jargon in the space program. (A reprint appeared in the April 18, 1964, issue of the San Antonio Express and News and elsewhere.)
The article (entitled “How to Talk ‘Rocket’”) defined “the whole nine yards” as “an item-by-item report on any project.” The author, Stephen Trumbull, added that “the new language” from the space program was spreading “across the country—like a good joke—with amazing rapidity.”
Could NASA, which was established on July 29, 1958, be the ultimate source of this usage? We don’t know, but stay tuned. (Sam Clements, Bonnie Taylor-Blake, Stephen Goranson, and Joel S. Berson are among the word detectives who helped track down the latest footprints of “the whole nine yards.”)
7.-8.-22. “Was” vs. “were” (use of the subjunctive):
When you express a wish, or when you use an “if” statement to talk about a condition contrary to fact, you use the subjunctive mood instead of the indicative.
So you’d say, “Last year I was in Honolulu” (indicative), but “Today I wish I were in Honolulu” and “If I were in Honolulu I wouldn’t be here” (both subjunctive). This is why we say things like “If I were king …” or “If only she were here …” or “I wish he were nicer to his parents,” and so on. In the subjunctive mood, “was” becomes “were.”
English speakers use the subjunctive mood (instead of the normal indicative mood) on three occasions:
(1) When expressing a wish: “I wish I were taller.” [Not: “I wish I was taller.”]
(2) When expressing an “if” statement about a condition that’s contrary to fact: “If I were a blonde …” [Not: “If I was a blonde …”]
(3) When something is asked, demanded, ordered, suggested, and so on: “I suggest he get a job.” [Not: “I suggest he gets a job.”]
9. “Graduated college”:
Traditionally, it’s the school that graduates the student, not the other way around.
In its original meaning, to “graduate” was to confer a degree on someone, so it was an action by the school. The student himself, on the other hand, “was graduated” by the school. But for roughly the last 200 years (since the early 1800s), it’s also been standard practice to say the student “graduated from” the school. The misuse (“she graduated college” instead of “she graduated from college”) dates from the mid-20th century. Here’s the scoop:
- CORRECT: “Princeton graduated him in 1986.”
- CORRECT: “She was graduated from college in 1997.”
- CORRECT: “We graduated from Georgia Tech in 2003.”
- INCORRECT: “They graduated Stanford in the early ’80s.”
So in modern usage, only the last example is a crime.
10. “Forte”:
You can correctly pronounce it like the English word “fort” (as in Fort Knox), or as two syllables, FOR-tay. Dictionaries now accept either pronunciation.
This wasn’t always the case.
The English word “forte” is derived from a French word (fort) meaning “strong,” so traditionally it has been pronounced as one syllable. And the other pronunciation, FOR-tay, was long frowned upon. It was probably influenced by the Italian word forte, a musical term meaning “loud.” (Perhaps we have musicians to blame for the Italian pronunciation of the nonmusical term.)
At any rate, the two-syllable version has become so entrenched, doubtless because of the Italian influence, that it’s no longer condemned by usage experts.
Although both the FORT and FOR-tay pronunciations are now acceptable in American English, common usage seems to prefer the two-syllable version.
But sticklers take note! As we write in our book Origins of the Specious, neither pronunciation is etymologically correct. Historically, both pronunciations resulted from mistakes. In French the word is spelled fort, and the “t” is silent.
11. “That” instead of “who”:
Despite what many people believe, a person can be either a “that” or a “who.” There’s no grammatical foundation for the belief that it’s incorrect to refer to a person as a “that” (“the man that I marry,” “the girl that married dear old dad,” and so on).
There may be a politeness issue here, a belief that using “that” in place of “who” or “whom” demeans or objectifies a human being. But there’s no grammatical reason for such a rule, even though many style books persist in spreading the misconception.
A Dictionary of Contemporary American Usage, by Bergen Evans and Cornelia Evans, has this to say on the issue: “That has been the standard relative pronoun for about eight hundred years and can be used in speaking of persons, animals, or things. … Three hundred years ago who also became popular as a relative. It was used in speaking of persons and animals but not of things. This left English with more relative pronouns than it has any use for. … Who may in time drive out that as a relative referring to persons, but it has not yet done so.”
12. “An historic”?
The short answer is that “an historic,” in the mouth of somebody who pronounces the “h,” is an affectation.
The rule is that you use “a” before a word or acronym that starts with a consonant SOUND (“a eulogy,” “a hotel,” “a unicorn,” “a YMCA”), and you use “an” before a word or acronym that starts with a vowel SOUND (“an uproar,” “an hour,” “an unending saga,” “an M&M cookie”).
It’s not the LETTER at the start of the word or acronym that determines the article; it’s the SOUND. So a word starting with “h” can go either way, depending on whether the “h” is sounded or not. Similarly, it’s correct to say “a rhinoceros” but “an RFP from a client,” because the letter “r” is pronounced as if it begins with a vowel.
Now for the longer answer.
The pronunciation of the initial “h” in an unstressed syllable (as in “historic” and “hotel”) has fluctuated over the centuries. In literary usage, it was long common in England to drop the “h” sound if the syllable was not stressed, and to use “an” instead of “a.” This is no longer the case. (In the US, in Ireland, in Scotland, and in extreme northern England, people never did drop their aitches.)
Nowadays standard English pronunciation, both here and in Britain, calls for sounding the “h.” Not all Brits do, though, so it’s natural that those who don’t would say “an ’istory.”
When you see “an” before a word beginning with “h” in British literature, that means the “h” was pronounced either weakly or not at all. In the 1500s, this was true even for words of one syllable (“hill”), and of words in which the first syllable was stressed (“history,” “hundred,” “humble”). That’s why you will see “an hundred” in Shakespeare and “an hill” in the King James Version of the Bible.
Later on, aitches were dropped in literary usage only with unstressed syllables, and to this day some British writers persist in using “an hotel” or “an historic.” But that too is now falling away and is considered overly “literary,” even in England. In fact, Henry Fowler called it “pedantic” back in 1926.
To sum up, the article “an” before a sounded “h” is unnatural in English and in fact is discouraged in Britain these days as well as in the United States. The British dictionary Longman’s, under its entry for “historic,” gives these examples of its proper use: “a historic meeting of world leaders” and “‘It is a historic moment,’ he told journalists.”
13. “Toward” vs. “towards”:
Both “toward” and “towards” are fine.
“Toward” is more common in American English and “towards” in British English. So an American is likely to say “toward me” and a Briton “towards me.” The meanings are identical, and both versions have been around for more than a thousand years.
Many people also ask about “forward/forwards” and “backward/backwards.” In American English, the words in each set are interchangeable, and the ones without the “s” are more common.
In British English, however, there’s a slight difference between the words with “s” and those without. Those without the “s” are used as adjectives (“forward motion,” “backward glance”), while the others are used for the most part as adverbs (“move forwards,” “run backwards”).
There are signs, though, that “forward” is gaining in popularity as an adverb in the British Isles.
14.-15. “Off of …” and “Where’s it at?”/“Where it’s at”:
“Off of” is no way to talk (as in “I took it off of the table”). Don’t use “of” if you don’t need it, especially in written or more formal usage: “I took it off the table.” The extra “of” is a common redundancy that can’t be justified.
But as we’ve written on our blog, “Where’s it at?”/“Where it’s at” is another story. The use of the contraction, the normal rhythms of speech, and the stress on the locative “at” all seem to demand this idiomatic construction.
And by the way, some people think “Where’s it at?” is incorrect because the sentence ends with a preposition. That’s not the case. See FAQ #1.
16. “So …”:
Many people use “so” indiscriminately at the beginning of sentences. Why do they do it?
Our guess is that interviewers may begin their questions with “so” because it’s an easy way to get into a topic without taking the trouble to find a more graceful entry. And interviewees may use “so” to respond because it gives them a moment to gather their thoughts – that is, to stall for time.
Although some WNYC listeners find this “so” business annoying, it’s not ungrammatical. In fact, the Oxford English Dictionary says the use of “so” as “an introductory particle” goes back to Shakespeare’s day.
Still, perhaps it’s true that “so” has become overused of late. Interviewers as well as interviewees tend to run out of new ideas after a while, and when one of them starts briskly with “so,” then others tend to jump on the usage. Thus the thing snowballs as it becomes more popular, and eventually starts to resemble a verbal tic permeating the airwaves.
Scientists and academics may be more prone to this habit, since “so” is a handy way of leading from one related idea to another.
The overuse of “so” in interviews will probably go away when it starts to sound worn-out. And so it goes.
17. “Troop”:
One soldier does not a troop make. But the use of “troops” to refer to a small number of soldiers, sailors, marines, and so on is gaining in acceptance.
Both The American Heritage Dictionary of the English Language (5th ed.) and Merriam-Webster’s Collegiate Dictionary (11th ed.) say “troops” can refer to soldiers. Neither suggests that the number of soldiers has to be large.
Even a relatively fussy language guide like Garner’s Modern American Usage (3rd ed.) accepts the use of the plural “troops” to mean a small number of soldiers: “In this sense, troops refers to individual soldiers (three troops were injured in the raid), but only when the reference is plural,” the usage guide says. “That is, a single soldier, sailor, or pilot would never be termed a troop.”
Although “some may object” to using “troops” to refer to individuals, the book adds, “the usage is hardly new.” It cites an 1853 item in the New York Times reporting that “three of the Government troops were killed and five wounded.”
“Today,” Garner’s says of the usage, “it’s standard.”
Here’s a little history. The English military term “troop” (16th century) comes from Middle French troupe, which comes from Old French trope, probably derived from tropeau (flock, herd).
Tropeau, in turn, comes from the Latin troppus, which may derive from the ancient Germanic sources that gave us “thorp” and “throp,” the Middle English terms for hamlet or village. Hundreds of years later, we reborrowed it from the French. In language, it’s often the case that what goes around comes around.
The Oxford English Dictionary says the original English meaning of the singular “troop,” from the mid-1500s, was “a body of soldiers,” and soon after that it meant “a number of persons (or things) collected together; a party, company, band.”
In 1590 it acquired a technical meaning in the military: “A subdivision of a cavalry regiment commanded by a captain, corresponding to a company of foot and a battery of artillery.” By 1598, the plural “troops” had come to mean “armed forces collectively,” the OED says. Here’s a 1732 citation: “Certain sums of money to raise troops.” And here’s another, from 1854: “The courage displayed by our troops.”
18. “Who” vs. “whom”:
The most important point to remember in this “who”-vs.-“whom” business is that “who” does something to “whom.” In other words, “who” is a subject, like “he,” and “whom” is an object, like “him.”
As a test, if you could substitute “him,” use “whom”; if you could substitute “he,” use “who.” (You may have to rearrange the word order to make this work.)
Although a preposition is often followed by “whom” (or “whomever”), that’s not always the case. Sometimes a preposition is followed by a clause that begins with “who” (or “whoever”) as in: “Give a ticket to whoever needs one.”
That said, many modern grammarians believe that in conversation or informal writing, “who” is acceptable in place of “whom” at the beginning of a sentence or clause: “Who’s the package for?” … “You’ll never guess who I ran into the other day.”
Where “whom” should be used after a preposition, you can substitute “who” by reversing the order and putting “who” in front: “From whom?” becomes “Who from?”
Once strongly resisted, this pronoun usage is slipping into standard English, like the informal use of “me” after forms of the verb “be,” as in “It’s me.”
19. “That” vs. “which”:
When you have a clause that could start with “that” or “which” and you can’t decide between them, here’s a hint: If you can drop the information and not lose the point of the sentence, use “which.” If you can’t drop it, use “that.”
Here are a couple of examples: “The dog, which came from a pet store, bit their neighbor.” … “The dog that bit their neighbor came from a pet store.”
In the first example, the information in the “which” clause is not essential. In the second example, the clause starting with “that” is essential; it’s the whole point of the sentence.
You’ll also notice that “which” clauses are set off with commas, and “that” clauses aren’t. So if you find yourself wanting to insert little pauses before and after the information, it’s probably a “which” clause.
20. “That big of a deal”:
The expression is redundant. The correct phrase is “not that big a deal.” The same goes for similar phrases: “not that bad a storm,” “not too old an athlete,” “not so good a movie,” and so on.
In expressions like these, where an adjective is being used to describe a noun, “of” is unnecessary. Articles in the journal American Speech have referred to this usage as the “big of” syndrome.
Perhaps the confusion arises because of phrases like “a hell of a storm” and “a whale of a good time” and “a monster of a party.” In expressions like those, where a noun is being used to describe another noun, the “of” is required. (Technically, the two nouns are in apposition, a grammatical construction where one noun is the explanatory equivalent of the other.)
21. “Than I” or “than me”?
The answer depends on how formal you want to be.
First, here’s the traditionalist view. In the most strictly formal contexts, “He’s bigger than me” is considered unacceptable. The preferred versions are “He’s bigger than I” or “He’s bigger than I am.” This is because for the last couple of hundred years, traditionalists have regarded “than” as a conjunction, not a preposition.
So much for the traditionalists. But some usage gurus, including the late William Safire, have insisted that “than” can indeed be used as a preposition, which means that an object pronoun (“me,” “him,” “us,” and so on) afterward is just fine. Certainly common usage is on their side.
Here’s a little historical background. From the 15th until the mid-18th century, “than” was used both as a preposition (as in “She’s taller than me”) and as a conjunction (“She’s taller than I [or I am]”). The choice was up to the speaker or writer, and nobody seemed to mind.
Then along came Robert Lowth, the Latin scholar who helped popularize the myth that it’s wrong to end a sentence with a preposition (see FAQ #1). In a 1762 grammar book, Lowth decreed that “than” should be treated as a conjunction, not a preposition, before a personal pronoun.
To this day, some pedants take the view that “than” is a conjunction and only a conjunction. Nevertheless, millions of educated people use “than” as a preposition too. And the more sensible contemporary language authorities are on their side.
“Than is both a preposition and a conjunction,” says Merriam-Webster’s Dictionary of English Usage. “In spite of much opinion to the contrary, the preposition has never been wrong.” Even the more traditional usage guides now accept this in speech or casual writing.
23. “Whole nother”:
People who use “whole nother” are probably conflating two terms: “whole other” and “another.” It has also been suggested that “nother” reflects a misunderstanding of the word “another” as two words: “a … nother.”
But this may not be such a “misunderstanding” after all. “Another” was originally two words, according to the Oxford English Dictionary: “an other (often a nother).”
Until well into the 17th century, “nother” was a common pronoun and adjective meaning “a second or other; a different one,” the OED says. This use of “nother,” according to the OED, is largely obsolete today and survives only as a colloquialism in the United States, where it’s commonly used with “whole.”
Still, “whole nother” is, as the OED says, a colloquialism. It’s not standard English and isn’t recommended in formal speech or writing.
24. “On line”:
It’s an accepted idiom in New York City to stand “on line,” though it sounds odd to people from other parts of the country.
Somebody from Atlanta or Chicago or Omaha or Phoenix gets “in line” and then stands or waits “in line”; somebody from New York gets “on line” and then stands or waits “on line.” Similarly, New York shopkeepers and such will always say, “Next on line!” instead of “Next in line!”
This is a good example of a regionalism. In Des Moines you get black coffee when you ask for “regular” coffee. In New York, “regular” means coffee with milk. It’s a big country.
Interestingly, New Yorkers aren’t the only ones who stand on line. A dialect survey that maps North American speech patterns found that the idiom was most prevalent in the New York metropolitan area, but that it occurred in pockets around the country, especially in the East.
25. “Beg the question”:
The much-abused expression “beg the question” is almost never used in its traditional meaning.
Strictly speaking, what it DOESN’T mean is raising, avoiding, prompting, mooting, ignoring, or inspiring the question. And it also doesn’t mean countering with another, different question in an ironic way.
What it DOES mean is engaging in a logical fallacy, namely, basing your argument on an assumption that itself is in need of proof.
Here’s one example Bryan A. Garner gives in Garner’s Modern American Usage (3rd ed.): “Reasonable people are those who think and reason intelligently.” As Garner writes, “This statement begs the question, What does it mean to think and reason intelligently?”
He also gives this example: “Life begins at conception, which is defined as the beginning of life.” As Garner says: “This comment is patently circular.”
Frankly, the expression “beg the question” is so frequently used to mean so many different things that today it’s virtually useless.
26. “The exception proves the rule”:
To call something “the exception that proves the rule” does seem nonsensical. If there’s an exception, then it should disprove the rule, right? Many word lovers have turned themselves inside out in an attempt to explain this seeming contradiction.
But the word “proves” isn’t the key to the problem. (Contrary to statements in several reference books, “proves” here does indeed mean “proves,” not “tests.”) The key is the word “exception,” which English adopted from French in the 14th century.
When the word (spelled “excepcioun”) showed up in Chaucer’s writings in 1385, it meant a person or thing or case that’s allowed to vary from a rule that would otherwise apply.
That sense of the word led to the medieval Latin legal doctrine exceptio probat regulam in casibus non exceptis (exception proves the rule in cases not excepted), according to the Oxford English Dictionary.
By the 17th century, the Latin expression was being quoted in English as “the exception proves the rule” or variations on this. And the exception, the OED tells us, was something that “comes within the terms of a rule, but to which the rule is not applicable.”
Let’s say, for example, that all students in a school are required to attend gym class. If a kid with a sprained ankle is excused from gym, then the exception made for him proves that there is in fact a rule for everybody else.
27. “Spitting image”:
“Spitting image” is one of those phrases that can’t be pinned down with certainty.
Michael Quinion, on his website World Wide Words, discusses the various versions of the phrase (“spitting image,” “spitten image,” “spit and image,” “the very spit of,” and “dead spit for”) and several theories of their origin. Here’s an excerpt:
“The two most common suggest that our modern phrase [‘spitting image’] is, via one or other of these forms, a corruption of spit and image. This contains the even older spit, which existed by itself in phrases such as the last two above. Larry Horn, Professor of Linguistics at Yale, argues convincingly that the original form was actually spitten image, using the old dialectal past participle form of spit. He suggests that the phrase was reinterpreted when that form went out of use, first as spit ’n’ image and then as spit and image or spitting image.
“But why spit? One view is that it’s the same as our usual meaning of liquid ejected from the mouth, perhaps suggesting that one person is as like the other as though he’d been spat out by him. But some writers make a connection here with seminal ejaculation, which may account for the phrase being used originally only of the son of a father.
“Quite a different origin is suggested by other writers, who argue that spit is really an abbreviation of spirit, suggesting that someone is so similar to another as to be identical in mind as well as body. Professor Horn is sure that this supposed derivation is wrong.”
28. “Myself” vs. “I” or “me”:
“Myself” is heard a lot these days because many people have forgotten how to use “I” and “me.” Faced with the decision (“I” or “me”), they resort to “myself” as a fallback position.
But using “myself” instead of “I” or “me” is bad grammar.
As a general rule, “myself” and the other reflexive pronouns (“-self” words like “herself,” “themselves,” etc.) should not be used to replace an ordinary pronoun like “I/me,” “she/her,” “they/them,” “he/him,” and so on.
A good rule to follow is that if you can substitute an ordinary pronoun, then don’t use a reflexive pronoun. Here’s when to use one:
(1) For emphasis. (“I made it myself.”)
(2) To refer to a subject already named. (“He beats up on himself.”)
That said, however, people often use “myself” or “himself” or “herself” deep in a sentence when the ordinary pronoun would almost get lost. This is more of a stylistic issue than a crime against good grammar.
29. “The problem is, is …”:
This is what happens when a speaker gets as far as the first verb (“The thing is …”), hesitates, and then continues, forgetting that a verb has already been used (“ … is that I forgot my wallet”).
People who do this are essentially treating a clause like “the thing is” or “the point is” or “the problem is” as the subject of the sentence as a whole.
They then have to give that subject a verb, so they forge ahead with another verb (“ … is that”), mentally shifting gears and grinding them in the process. This often happens when the clause after “is” starts with “that.”
Two linguists, Michael Shapiro and Michael C. Haley, wrote about the subject in the journal American Speech in 2002, calling this kind of “is is” a “reduplicative copula.” (A copula is a linking verb, like “is.”)
The simple “double copula,” on the other hand, isn’t grammatically incorrect. Examples: “What that is is an armadillo,” or “What he is is a felon.” (Or, to use an example quoted by Shapiro and Haley: “What the problem is is still unclear.”)
But the kind of sentence we’re talking about (“The problem is is that I’m too busy”) is grammatically incorrect, or, as Shapiro and Haley would say, it’s a “nonstandard syntactic construction.” The phenomenon is relatively recent, but it’s not brand-new: the two linguists cite examples going back to 1993.
30. Those English spellings:
Many English spellings appear to be almost nonsensical. Words like “caught,” “ought,” “thought,” “laughter,” “daughter,” and “should” have letters unrelated to the way they sound.
This wasn’t always so. Our spelling system began as an attempt to reproduce the speech of the day. But because most spellings became fixed centuries ago, they no longer reflect modern pronunciations.
The “gh” in words like “caught” and “ought” is annoying to people who’d like to reform English spelling. Many wonder, for example, why “laughter” and “daughter” don’t rhyme. Well, they once did.
“Daughter” has had several pronunciations over the centuries, including DOCH-ter (with the first syllable like the Scottish “loch”), DAFF-ter (rhyming with “laughter”), and DAW-ter. We know which one survived.
The Middle English letter combination “gh” is now pronounced either like “f” (as in “cough/trough/laugh/enough,” etc.) or not at all (“slaughter/daughter/ought/through,” and others).
The word “night,” to use another example, went through dozens of spellings over 600 years, from nact, nigt, niht, and so on, to “night” around 1300. It’s a cousin not only to the German nacht but probably to the Greek nyktos and the Old Irish innocht, among many others.
The odd-looking consonants in the middle of “night” (as well as “right” and “bright”) were once pronounced with a guttural sound somewhere between the modern “g” and “k.” But though the pronunciation moved on, the spelling remained frozen in time.
To use another example, the letter “l” in “should” and “would” looks entirely superfluous. But it was once pronounced, as it was in “walk,” “chalk,” “talk,” and other words.
The same goes for the “w” in “sword” and the “b” in “climb.” They were once pronounced. Similarly, the “k” in words like “knife,” “knee,” and “knave” was not originally silent. It was once softly pronounced. But while pronunciation changed, spelling did not.
There are several reasons that English spellings and pronunciations differ so markedly.
Much of our modern spelling had its foundation in the Middle English period (roughly 1100 to 1500). But in the late Middle English and early Modern English period (roughly 1350 to 1550), the pronunciation of vowels underwent a vast upheaval.
Linguists call this the Great Vowel Shift, and it’s too complicated to go into in much detail here. To use one example, before the Great Vowel Shift the word “food” sounded like FODE (rhymes with “road”).
While the pronunciations of many words changed dramatically, their spellings remained largely the same. That’s because printing, which was introduced into England in the late 1400s, helped retain and standardize those older spellings.
Complicating matters even further, the first English printer, William Caxton, employed typesetters from Holland who introduced their own oddities (the “h” in “ghost” is an example, borrowed from Flemish).
In addition, silent letters were introduced into some English words as afterthoughts to underscore their classical origins. This is why “debt” and “doubt” have a “b” (inserted to reflect their Latin ancestors debitum and dubitare).
Sometimes, a letter was erroneously added to reflect a presumed classical root that didn’t exist. This is why “island” has an “s” (a mistaken connection to the Latin insula).
Finally, as David Crystal has written, “Much of the irregularity of modern English spelling derives from the forcing together of Old English and French systems of spelling in the Middle Ages.”
English spelling is a vast subject. In summary, spellings eventually settle into place and become standardized, but pronunciations are more mercurial and likely to change.
31. “So fun”; “a fun movie”:
Sentences like “That was such fun” or “This is so much fun” are perfectly correct. Usages like “That was so fun” and “This is a fun movie,” however, are natural in speech but frowned upon by usage experts.
Why? Because traditionally “fun” is a noun (a thing, as in “We had fun”), not an adjective. So in formal English you usually wouldn’t use it as a modifier (“We had a fun day”). An exception would be when “fun” is a predicate nominative – a noun that follows a linking verb and refers back to the subject (“This is fun”). Therefore, “fun” would be correct in a sentence like “Skiing is fun,” but questionable in one like “We had a fun day on the slopes.”
For the same reason, the ubiquitous “so fun” is considered excessively casual, but “so much fun” is impeccable. If you mentally substitute another noun, like “entertainment,” you can see why. You wouldn’t say “so entertainment,” but you could say “so much entertainment.” Similarly you could say “This is entertainment” (predicate nominative).
But despite the language mavens, the use of “fun” as an adjective is so common nowadays that someday it will undoubtedly become accepted.
The expression “so fun” was probably inevitable. People are used to putting “so” before predicate adjectives – that is, adjectives that describe the subject of a sentence (“This is so pink” or “It was so hard”). It’s natural, if not proper, that some people would want to put “so” before a noun that describes a subject (“This is so fun”).
32. “I could care less”:
The traditional idiomatic expression is “I couldn’t care less.” It first appeared in print in 1946 and means “I’m completely uninterested” or “I’m utterly indifferent.”
A shortened version, “I could care less,” has been gaining popularity in the United States since the 1960s. It has much the same meaning as the original expression plus an ironic twist. The OED’s first published reference appeared in 1966.
Although many people are disturbed by the abbreviated American idiom, it is probably here to stay. It’s obviously intended ironically. The message is something like “I don’t even distinguish this by identifying it as the thing I care least about in the world.”
You might be interested in reading what Steven Pinker says on the subject of “I could care less,” which he calls “an alleged atrocity” and a favorite target of language mavens.
As he points out in his book The Language Instinct, the melodies and stresses of “I couldn’t care less” and “I could care less” are completely different, indicating youthful sarcasm: “By making an assertion that is manifestly false or accompanied by ostentatiously mannered intonation, one deliberately implies its opposite. A good paraphrase is, ‘Oh yeah, as if there was something in the world that I care less about.’ ”
33. Why is English a Germanic language?
Let’s begin with where linguists place English among the world’s languages.
English, Icelandic, Faroese, Norwegian, Swedish, Danish, Frisian, Flemish, Dutch, Afrikaans, German, and Yiddish are the living languages that are part of the Germanic family.
This family is divided into North Germanic (Icelandic, Faroese, Norwegian, Swedish, Danish) and West Germanic (English, Frisian, Flemish, Dutch, Afrikaans, German, Yiddish). A third branch, East Germanic, consisted of Gothic, which is now extinct.
The other principal European language family is the Italic (popularly called Romance). This consists of the modern languages derived from Latin: Portuguese, Spanish, Catalan, Provençal, French, Italian, Rhaeto-Romance, and Romanian.
These two families, Germanic and Italic, are branches of a single prehistoric language called Indo-European or Proto-Indo-European. The language group descended from Indo-European includes the Balto-Slavic, Albanian, Celtic, Italic, Greek, and Germanic families of languages. It’s estimated that about half the earth’s population speaks a language from the Indo-European group, which is only one of several language groups that have been identified worldwide.
But back to English. Why do we call it a Germanic language?
As Calvert Watkins writes in The American Heritage Dictionary of Indo-European Roots, one of the dialects of Indo-European “became prehistoric Common Germanic, which subdivided into dialects of which one was West Germanic.”
This in turn, Watkins says, “broke up into further dialects, one of which emerged into documentary attestation as Old English. From Old English we can follow the development of the language directly, in texts, down to the present day.”
But while English is Germanic, it has acquired much of its vocabulary (or lexicon) from other sources, notably Latin and French.
As Watkins explains: “Although English is a member of the Germanic branch of Indo-European and retains much of the basic structure of its origin, it has an exceptionally mixed lexicon. During the 1400 years of its documented history, it has borrowed extensively and systematically from its Germanic and Romance neighbors and from Latin and Greek, as well as more sporadically from other languages.”
Where exactly does our modern vocabulary come from? The website AskOxford cites a computerized analysis of the roughly 80,000 words in the old third edition of the Shorter Oxford English Dictionary.
The study, published in 1973, offered this breakdown of sources: Latin, 28.34 percent; French, 28.3 percent; Old and Middle English, Old Norse, and Dutch, 25 percent; Greek 5.32 percent; no etymology given, 4.03 percent; derived from proper names, 3.28 percent; all other languages, less than 1 percent.