
From A to &, et cetera

Q: What is the story of “&” and why is it replacing “and”?

A: The “&” character, or ampersand, is seen a lot these days in texting, email, and online writing, but the use of a special character for “and” isn’t a new phenomenon. English writers have been doing this since Anglo-Saxon days, a usage borrowed from the ancient Romans.

In his book Shady Characters: The Secret Life of Punctuation, Symbols & Other Typographical Marks (2013), Keith Houston writes that the Romans had two special characters for representing et, the Latin word for “and.” They used either ⁊, a symbol in a shorthand system known as notae Tironianae, or the ancestor of the ampersand, a symbol combining the e and t of et.

The Tironian system is said to have been developed by Tiro, a slave and secretary of the Roman statesman and scholar Cicero in the first century BC. After being freed, Tiro adopted Cicero’s praenomen and nomen, and called himself Marcus Tullius Tiro.

Houston says the earliest known recorded version of the ampersand was an et ligature, or compound character, scrawled on a wall in Pompeii by an unknown graffiti artist and preserved under volcanic ash from the eruption of Mount Vesuvius in AD 79.

He cites the research of Jan Tschichold, author of Formenwandlungen der &-Zeichen (1953), which was translated from German to English in 1957 as The Ampersand: Its Origin and Development. An illustration that Houston based on Tschichold’s work shows the evolution of the ampersand over the years.

(Image #1 is from Pompeii, while the modern-looking #13 is from the Merovingian Latin of the eighth century.)

In Shady Characters, Houston describes how the ampersand competed with the Tironian ⁊ in the Middle Ages. “From its ignoble beginnings a century after Tiro’s scholarly et, the ampersand assumed its now-familiar shape with remarkable speed even as its rival remained immutable,” he writes.

“Whatever its origins, the scrappy ampersand would go on to usurp the Tironian et in a quite definitive manner,” he says, adding, “Tiro’s et showed the way but the ampersand was the real destination.”

Today, Houston writes, the Tironian character “survives in the wild only in Irish Gaelic, where it serves as an ‘and’ sign on old mailboxes and modern road signs,” while the ampersand “ultimately earned a permanent place in type cases and on keyboards.” (We added the links.)

Although the ampersand was common in medieval Latin manuscripts, including works written in Latin by Anglo-Saxon scholars, it took quite a while for the character to replace the Tironian et in English. In most of the Old English and Middle English manuscripts we’ve examined, the Tironian symbol is the usual short form for the various early versions of “and” (end, ond, ænd, ande, and so on).

A good example is the original manuscript of Beowulf, an epic poem that may have been written as early as 725. The anonymous author uses ond for “and” only a few times, but the Tironian symbol appears scores of times. However, modern transcriptions of the Old English in Beowulf often replace the “⁊” with ond or “&.” When the Tironian character does appear, it’s often written as the numeral “7.”

Here are the last few lines of the poem with the Tironian characters (or notes) intact: “cwædon þæt he wære wyruldcyning, / manna mildust ⁊ monðwærust / leodum liðost, ⁊ lofgeornost” (“Of all the world’s kings, they said, / he was the kindest and the gentlest of men, / the most gracious to his people and the most worthy of fame”).

Although you can find dozens of ampersands in transcriptions of Old English and Middle English manuscripts, an analysis of the original documents shows that most of the “&” characters were originally Tironian notes.

Dictionaries routinely transcribe the Tironian note as an ampersand in their citations from Old and Middle English. As the Oxford English Dictionary, the most influential and comprehensive etymological dictionary, says in an explanatory note, “In this dictionary the Old and Middle English Tironian note is usually printed as &.”

However, the ampersand does show up at times in early English. For example, it’s included in an Anglo-Saxon alphabet dating from the late 10th or early 11th century. A scribe added the alphabet to an early 9th-century copy of a Latin letter by the scholar, cleric, and poet Alcuin of York (British Library, Harley 208, fol. 87v).

The alphabet is in the upper margin of the image. It includes the 23 letters of the classical Latin alphabet (with a backward “b”) followed by the ampersand, the Tironian et, and four Anglo-Saxon runes: the wynn (ᚹ), the thorn (þ), the aesc (ᚫ), and an odd-looking eth (ð) that resembles a “y.” At the end of the alphabet, the scribe added the first words of the Lord’s Prayer in Latin (pater noster). The British Library’s digital viewer lets readers examine the image in more detail.

At the end of Harley 208, which includes copies of 91 letters by Alcuin and one by Charlemagne, the scribe wrote a line in Old English, “hwæt ic eall feala ealde sæge” (“Listen, for I have heard many old sagas”), which is reminiscent of line 869 in Beowulf: “eal fela eald gesegena” (“all the many old sagas”). Is the scribe suggesting that the letters are ancient tales?

A similar alphabet appears in Byrhtferð’s Enchiridion, or handbook (1011), a wide-ranging compilation of information on such subjects as astronomy, mathematics, logic, grammar, and rhetoric. However, the alphabet in the Enchiridion (Ashmole Ms. 328, Bodleian Library, Oxford), differs somewhat from the one above—the æsc rune is replaced by an ae ligature at the end.

We’ve seen several other Old English alphabets arranged in similar order. In most of them, an ampersand follows the letter “z.” Fred C. Robinson, a Yale philologist and Old English scholar, has said the “earliest of the abecedaria is probably” the one in Harley 208 (“Syntactical Glosses in Latin Manuscripts of Anglo-Saxon Provenance,” published in Speculum, A Journal of Medieval Studies, July 1973). An “abecedarium” (plural “abecedaria”) is an alphabet written in order.

We haven’t seen any examples of the ampersand used in Old English other than in alphabets. The earliest examples we’ve found for the ampersand in actual text are in Middle English. Here’s an example from The Knight’s Tale of the Hengwrt Chaucer, circa 1400, one of the earliest manuscripts of The Canterbury Tales:

The middle line in the image reads: “hir mercy & hir grace” (“her mercy & her grace”). Here’s an expanded version of the passage: “and but i have hir mercy & hir grace, / that i may seen hire atte leeste weye / i nam but deed; ther nis namoore to seye” (“And unless I have her mercy & her grace, / So I can at least see her some way, / I am as good as dead; there is no more to say”).

Middle English writers also used the ampersand in the term “&c,” short for “et cetera.” In a 1418 will, for example, “&c” was used to avoid repeating a name: “quirtayns [curtains] of worsted … in warde of Anneys Elyngton, and … a gowne of grene frese, in ward, &c” (from The Fifty Earliest English Wills in the Court of Probate, edited by Frederick James Furnivall, 1882).

Although literary writers didn’t ordinarily use a symbol for “and” in early Modern English, the ampersand showed up every once in a while. For example, the character slipped into this passage from The Shepheardes Calender (1579), Edmund Spenser’s first major poem: “The blossome, which my braunch of youth did beare, / With breathed sighes is blowne away, & blasted.”

And in the 1603 First Quarto of Hamlet, Shakespeare has Hamlet telling Horatio, “O the King doth wake to night, & takes his rouse [a full cup of wine, beer, etc.].” But “and” replaces the ampersand, and the “O” disappears, in the Second Quarto (1604) and the First Folio (1623).

As for today, we see nothing wrong with using an ampersand in casual writing (we often use “Pat & Stewart” to sign our emails), but we’d recommend “and” for formal writing and noteworthy informal writing.

Nevertheless, formal use of the ampersand is common today in company names, such as AT&T, Marks & Spencer, and Ben & Jerry’s. And some authors, notably H. W. Fowler in A Dictionary of Modern English Usage (1926), have used it regularly in formal writing.

Finally, we should mention that the term “ampersand” is relatively new. Although the “&” character dates back to classical times, the noun “ampersand” didn’t show up in writing until the 18th century.

The earliest OED example for “ampersand” with its modern spelling is from a travel book written in the late 18th century. Here’s an expanded version:

“At length, having tried all the historians from great A, to ampersand, he perceives there is no escaping from the puzzle, but by selecting his own facts, forming his own conclusions, and putting a little trust in his own reason and judgment” (from Gleanings Through Wales, Holland and Westphalia, 1795, by S. J. Pratt).

The expression “from A to ampersand” (meaning from the beginning to the end, or in every particular) is an old way of saying “from A to Z.” It was especially popular in the 19th century.

As we’ve noted, the ampersand followed the letter “z” in some old abecedaria, a practice going back to Anglo-Saxon days. And when children were taught that alphabet in the late Middle Ages, they would recite the letters from “A” to “&.”

In Promptorium Parvulorum (“Storehouse for Children”), a Middle English-to-Latin dictionary written around 1440, English letters that are words by themselves, including the ampersand, are treated specially in reciting the alphabet, according to The Merriam-Webster New Book of Word Histories (1991), edited by Frederick C. Mish.

As Mish explains, when a single letter formed a word or syllable—like “I” (the personal pronoun) or the first “i” in “iris”—it was recited as “I per se, I.”  In other words, “I by itself, I.”

“The per se spellings were used especially for the letters that were themselves words,” Mish writes. “Because the alphabet was augmented by the sign &, which followed z, there were four of these: A per se, A; I per se, I; O per se, O, and & per se, and.”

Since the “&” character was spoken as “and,” children reciting the alphabet would refer to it as “and per se, and.” That expression, Mish says, became “in slightly altered and contracted form, the standard name for the character &.” In other words, “ampersand” originated as a corruption of “and per se, and.”

The two earliest citations for “ampersand” in the OED spell it “ampuse and” (1777) and “appersiand” (1785). Various other spellings continued to appear in the 1800s—“ampus-and” (1859), “Amperzand” (1869)—before the modern version became established.

We’ll end with “The Ampersand Sonnet,” the calligrapher A. J. Fairbank’s take on Shakespeare’s Sonnet 66. In this version of the sonnet, each “and” in Shakespeare’s original is replaced by a different style of ampersand:

Help support the Grammarphobia Blog with your donation. And check out our books about the English language and more.


Try and stop us!

Q: How old is the use of “and” in place of “to” to join infinitives? For example, “He wants to try and kill her” instead of “He wants to try to kill her.” I heard the usage on British TV, so it’s not just American.

A: The use of “and” to link two infinitives is very old, dating back to Anglo-Saxon days. And as you’ve discovered, it’s not just an American usage. In fact, the use of “try and” plus a bare, or “to”-less, infinitive (as in “He wants to try and kill her”) is more common in the UK than in the US.

As the Oxford English Dictionary explains, “and” is being used here for “connecting two verbs, the second of which is logically dependent on the first, esp. where the first verb is come, go, send, or try.” With the exception of “come” and “go,” the dictionary adds, “the verbs in this construction are normally only in the infinitive or imperative.”

In other words, the “come” and “go” versions of the usage can be inflected—have different verb forms, such as the future (“He’ll come and do it”) or the past (“He went and did it”). But the “try” version is normally restricted to the infinitive (“He intends to try and stop us”) or the imperative (“Try and stop us!”).

The earliest example of the construction in the OED is from Matthew 8:21 in the West Saxon Gospels, dating from the late 900s: “Drihten, alyfe me ærest to farenne & bebyrigean minne fæder” (“Lord, let me first go and bury my father”). The verbs “go” (farenne) and “bury” (bebyrigean) here are infinitives.

The dictionary’s first citation for a “try” version of the construction is from the records of the Burgh of Edinburgh. A 1599 entry reports that the council “ordanis the thesaure [orders the treasurer] to trye and speik with Jhonn Kyle.”

However, we’ve found an earlier “try” example in the Early English Books Online database: “they are also profitable for the faithfull / for they trye and purefye the faith of goddes [God’s] electe.” From A Disputacio[n] of Purgatorye, a 1531 religious treatise by John Frith, a Protestant writer burned at the stake for heresy.

In the 19th century, some language commentators began criticizing the use of “try and” with a bare infinitive. For example, George Washington Moon chided a fellow British language authority, Henry Alford, Dean of Canterbury, for using “try and” in a magazine article based on Alford’s A Plea for the Queen’s English: Stray Notes on Speaking and Spelling (1863):

“Near the end of a paragraph in the first Essay occurs the following sentence, which is omitted in the book:— ‘And I really don’t wish to be dull; so please, dear reader, to try and not think me so.’ Try and think, indeed! Try to think, we can understand. Fancy saying ‘the dear reader tries and thinks me so’; for, mind, a conjunction is used only to connect words, and can govern no case at all” (The Dean’s English: A Criticism on the Dean of Canterbury’s Essays on the Queen’s English, 1865).

Moon was apparently bugged by Alford’s use of “try and think” because the phrase couldn’t be inflected. But as we’ve shown, English writers had been using “try and” phrases this way for hundreds of years before any commentator raised an objection.

Although some later language authorities have echoed Moon’s objection to the usage, others have said it’s acceptable, especially in informal English.

As Henry W. Fowler says in A Dictionary of Modern English Usage (1926), “It is an idiom that should not be discountenanced, but used when it comes natural.” It meets “the proper standard of literary dignity,” he writes, though it is “specially appropriate to actual speech.”

Jeremy Butterfield, editor of the fourth edition of Fowler’s usage guide, expands on those comments from the first edition, adding that the “choice between try to and try and is largely a matter of spontaneity, rhythm, and emphasis, especially in spoken forms.”

In Garner’s Modern English Usage (4th ed.), Bryan A. Garner describes “try and” as a “casualism,” which he defines as “the least formal type of standard English.” And Pat, in her grammar and usage guide Woe Is I (4th ed.), recommends “try to” for formal occasions, but says “try and, which has been around for hundreds of years, is acceptable in casual writing and in conversation.”

Merriam-Webster’s Dictionary of English Usage, which has dozens of examples of “try and” and similar constructions used by respected writers, says, “About the only thing that can be held against any of these combinations is that they seem to be more typical of speech than of high-toned writing—and that is hardly a sin.” Here are a few of the usage guide’s “try and” citations:

“Now I will try and write of something else” (Jane Austen, letter, Jan. 29, 1813).

“ ‘Stand aside, my dear,’ replied Squeers. ‘We’ll try and find out’ ” (Charles Dickens, Nicholas Nickleby, 1839).

“The unfortunate creature has a child still every year, and her constant hypocrisy is to try and make her girls believe that their father is a respectable man” (William Makepeace Thackeray, The Book of Snobs, 1846).

“to try and soften his father’s anger” (George Eliot, Silas Marner, 1861).

“We are getting rather mixed. The situation entangled. Let’s try and comb it out” (W. S. Gilbert, The Gondoliers, 1889).

“If gentlemen sold their things, he was to try and get them to sell to him” (Samuel Butler, The Way of All Flesh, 1903).

Some linguists and grammarians describe the “and” here as a “quasi-auxiliary,” and its use in “try and” constructions as “pseudo-coordination,” since “and” is functioning grammatically as the infinitive marker “to,” not as a conjunction that coordinates (that is, joins) words, phrases, or clauses.

Getting back to your question about the usage in American versus British English, Lynne Murphy, an American linguist at the University of Sussex in Brighton, England, discusses this in The Prodigal Tongue (2018). She cites two studies that indicate “try and” is more popular in the UK than in the US.

One study found that the British “used try and (where they could have said try to) over 71% of the time in speech and 24% of the time in writing,” Murphy writes. “The American figures were 24% for speech and 5% in writing.” (The study she cites is “Try to or Try and? Verb Complementation in British and American English,” by Charlotte Hommerberg and Gunnel Tottie, ICAME Journal, 2007.)

Another study, Murphy says, shows that older, educated people in the UK “prefer try to a bit more, but those under forty-five say try and 85% of the time, regardless of their level of education.” (The study here is “Why Does Canadian English Use Try to but British English Use Try and?” by Marisa Brook and Sali A. Tagliamonte, American Speech, 2016.)

In a Dec. 14, 2016, post on Murphy’s website, Separated by a Common Language, she has more details about the studies. She also notes a theory that some people may choose “try and get to know” over “try to get to know” because of a feeling that the repetition of “to” in the second example sounds awkward. Linguists refer to the avoidance of repetition as horror aequi, Latin for “fear of the same.”

Some grammarians and linguists have suggested that there may be a difference in meaning between the “try and” and “try to” constructions. But the linguist Åge Lind analyzed 50 English novels written from 1960 to 1970 and concluded: “If a subtle semantic distinction exists it does not seem to be observed” (from “The Variant Forms Try and/Try to,” English Studies, December 1983).


Something wicked this way comes

Q: We were reading Shakespeare, and wondered about the pronunciation of the final “-ed” in words like “beloved” and “blessed.” I just assumed that people in Elizabethan England spoke that way, but my partner thought it was merely a poetical device to fill out a metrical line. What do you say?

A: When the “-ed” suffix first appeared in Old English writing, according to scholars, it sounded much like the modern pronunciation of the last syllable in adjectives like “crooked,” “dogged,” and “wicked.”

In Old English, spoken from the mid-5th to the late 11th centuries, the “-ed” suffix was one of several endings used to form the past participle of verbs and to form adjectives from nouns. For example, the past participle of the verb hieran (to hear) was gehiered (heard). And the adjectival form of the noun hring (ring) was hringed.

The “-ed” syllable was still usually pronounced in Middle English, which was spoken from around 1150 to 1450, but writers occasionally dropped the “e” or replaced it with an apostrophe, an indication that the syllable was sometimes lost in speech. The Old English gehiered (heard), for instance, was variously hered, herrd, herd, etc., in Middle English writing.

It’s clear from the meter that Chaucer intended the “-ed” of “perced” (pierced) to be pronounced at the beginning of The Canterbury Tales (1387): “Whan that Aprill with his shoures soote [its showers sweet] / The droghte of March hath perced to the roote.”

As far as we can tell, the word “pierced” was two syllables in common speech as well as poetry when Chaucer was writing, but a one-syllable version showed up in writing (and probably speaking) by the mid-1500s.

Here’s an example, with the past participle written as “perst,” from “The Lover Describeth His Being Stricken With Sight of His Love,” a sonnet by Thomas Wyatt:

“The liuely sparkes, that issue from those eyes, / Against the which there vaileth [avails] no defence, / Haue perst my hart, and done it none offence” (Songes and Sonettes, 1557, a collection of works by Wyatt, Henry Howard, Nicholas Grimald, and various anonymous poets).

In the early Modern English period, when Shakespeare was writing, the “-ed” ending was often contracted in writing to “-d” or “-t,” indicating that this was the usual pronunciation. Here are a few examples from Shakespeare:

“O, never will I trust to speeches penn’d, / Nor to the motion of a schoolboy’s tongue” (Love’s Labour’s Lost, believed written in the mid-1590s).

“I remember the kissing of her batlet [butter paddle] and the cow’s dugs [udders] that her pretty chopt hands had milked” (As You Like It, circa 1599). The adjective “chopped” here meant cracked or chapped.

“This would have seem’d a period / To such as love not sorrow” (King Lear, early 1600s).

However, writers in the early Modern English period tended to keep the full “-ed” ending in many words where the syllable is still heard now, as in these examples from Shakespeare:

“To cipher what is writ in learned books, / Will quote my loathsome trespass in my looks” (The Rape of Lucrece, 1594).

“And the stony-hearted villains know it well enough” (King Henry IV, Part I, late 1500s).

“O heaven, the vanity of wretched fools!” (Measure for Measure, early 1600s).

“Something wicked this way comes” (Macbeth, early 1600s).

Although people began dropping the “e” of “-ed” in writing and apparently pronunciation in early Modern English, the full syllable was still being written and pronounced in the 18th and 19th centuries in some words where it’s now lost.

In A Critical Pronouncing Dictionary and Expositor of the English Language (1791), John Walker says the adjectives “crabbed,” “forked,” “flagged,” “flubbed,” “hooked,” “scabbed,” “snagged,” “tusked,” and others are “pronounced in two syllables.” An 1859 update of the dictionary, edited by Townsend Young, adds “hawked,” “scrubbed,” “tressed,” and a few more.

However, writers continued to drop the final syllable of “-ed” words despite the objections of lexicographers and pronunciation guides. In the early 18th century, one of the sticklers, Jonathan Swift, condemned the loss of the final syllable in verbs written as “drudg’d,” “disturb’d,” “rebuk’d,” and “a thousand others, everywhere to be met with in Prose as well as Verse.”

In a 1712 letter to Robert, Earl of Oxford, Swift argued that “by leaving out a Vowel to save a Syllable, we form so jarring a Sound, and so difficult to utter, that I have often wondred how it could ever obtain.” Yes, “wondred” used to be a past tense of the verb “wonder,” which was originally wondrian in Old English and wondri or woundre in Middle English. Thus language changes.

Today, the “-ed” suffix is used in writing for the past tense and past participle of regular (or weak) verbs, for participial adjectives, and for adjectives derived from nouns. It’s usually not pronounced as a syllable, but there are some notable exceptions.

As the Oxford English Dictionary explains, “this -ed is in most cases still retained in writing, although the pronunciation is now normally vowelless.” The dictionary says “-ed” is usually pronounced as either “d” (as in “robed”) or “t” (“reaped”). The “t” sound follows a voiceless consonant, one produced without the vocal cords.

The OED says the “full pronunciation” of “-ed” as a syllable (pronounced id) “regularly occurs in ordinary speech only in the endings -ted, -ded” (that is, after the letters “t” and “d” as in “hated” and “faded”).
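For readers who like to see a rule spelled out step by step, the modern “-ed” pattern the OED describes can be sketched as a small decision procedure. This is only a rough illustration of the sound rule, not anything from the OED itself: the function name is our own, and it uses spelling as a crude stand-in for the stem’s final sound, so exceptional adjectives like “wicked” and “learned” are not handled.

```python
def ed_pronunciation(verb):
    """Rough sketch of the modern '-ed' rule: a full 'id' syllable after
    a 't' or 'd' sound, a 't' sound after other voiceless consonants,
    and a 'd' sound after voiced endings. Spelling only approximates
    sound, so exceptions like 'wicked' and 'learned' are ignored."""
    # Strip a final silent 'e' so 'hate' and 'robe' expose their last consonant.
    stem = verb[:-1] if verb.endswith("e") else verb
    last = stem[-1].lower()
    if last in ("t", "d"):
        return "id"   # full syllable: 'hated', 'faded'
    if last in {"p", "k", "f", "s", "c", "x", "h"}:  # rough voiceless proxies
        return "t"    # 'reaped', 'hooked'
    return "d"        # voiced endings: 'robed', 'played'

print(ed_pronunciation("hate"))  # id
print(ed_pronunciation("reap"))  # t
print(ed_pronunciation("robe"))  # d
```

The letter sets are approximations: a phonetically faithful version would classify the stem’s final phoneme, not its final letter.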

“A few words, such as blessed, cursed, beloved, which are familiar chiefly in religious use, have escaped the general tendency to contraction when used as adjectives,” the OED says, adding that “the adjectival use of learned is distinguished by its pronunciation” as two syllables. Additional exceptions include the adjectives “aged,” “jagged,” “naked,” “ragged,” “wretched,” and others mentioned in this post.

As we said at the beginning, the suffix “-ed” was used in Old English  to form the past participle of verbs and to turn nouns into adjectives.

The past participle of a weak verb was formed by adding “-ed,” “-ad,” “-od,” or “-ud” to the stem. The past participle of a strong verb (now commonly called an irregular verb) was formed by changing the stressed vowel or by adding the suffix “-en.”

And as we said earlier, the use of “-ed” to turn nouns into adjectives has also been around since Anglo-Saxon times. Nevertheless, some language commentators objected to the usage in the 18th and 19th centuries.

Samuel Johnson, for example, apparently considered the usage new and was surprised to see it in these lines from “Ode on Spring” by Thomas Gray: “The insect youth are on the wing, / Eager to taste the honied spring.” Here’s Johnson’s comment:

“There has of late arisen a practice of giving to adjectives derived from substantives, the termination of participles; such as the cultured plain, the daisied bank, but I was sorry to see, in the lines of a scholar like Gray, the honied spring” (from Johnson’s Lives of the Most Eminent English Poets, 1779-81).

We’ll end with a grumpy comment about the adjective “talented,” written by Samuel Taylor Coleridge on July 8, 1832. This is from Specimens of the Table Talk of Samuel Taylor Coleridge (1836), edited by Henry Nelson Coleridge, a frequent visitor to his uncle’s home:

“I regret to see that vile and barbarous vocable talented, stealing out of the newspapers into the leading reviews and most respectable publications of the day. … The formation of a participle passive from a noun is a licence that nothing but a very peculiar felicity can excuse. If mere convenience is to justify such attempts upon the idiom, you cannot stop till the language becomes, in the proper sense of the word, corrupt. Most of these pieces of slang come from America.” (The OED’s earliest examples for the adjective “talented” used to mean “possessing talent” come from British sources.)


What’s up with ‘below’?

Q: Merriam-Webster describes “below” as an adverb in these two examples: “gazed at the water below” and “voices from the apartment below.” My understanding is that adverbs modify verbs, adjectives, and other adverbs. But “below” here is modifying two nouns, “water” and “apartment.” So what am I missing?

A: You raise a very good question. As it happens, linguists have asked themselves the same thing, and in the last few decades they’ve abandoned the traditional thinking about the status of “below” and similar words that express spatial relationships.

Traditionally, “below” has been classified as either a preposition or an adverb. It’s a preposition if an object follows, as in “the water below the bridge” and “the apartment below ours.” It’s an adverb if it doesn’t have an object, as in “the water below” and “the apartment below.” As far as we can tell, that’s been the thinking among grammarians since the late 18th century.

But as we’ll explain later, linguists now regard “below” solely as a preposition, a view reflected in recent comprehensive grammar books but not yet recognized in popular grammars and standard dictionaries.

Of course, for all practical purposes the word hasn’t changed, either in its meaning or in the way it’s used. In the scholarly comprehensive grammars, the word has merely shifted in some cases from one lexical category (adverb) to another (preposition).

Standard dictionaries haven’t yet caught up to this new way of thinking about “below.” The 10 standard dictionaries we usually consult say it can be either an adverb or a preposition in constructions like those above.

Cambridge, for example, calls it a preposition in “below the picture” but an adverb in “the apartment below.” The dictionary adds: “When the adverb below is used to modify a noun, it follows the noun.” (We know what you’re thinking: An adverb modifying a noun? Stay tuned.)

Despite the differing labels, the adverb and the preposition have virtually the same meaning. By and large, the standard dictionaries that define them say the adverb means “in or to a lower position” or “beneath,” while the preposition means “lower than” or “beneath.”

And in the Oxford English Dictionary, an etymological dictionary based on historical evidence, the broad definitions for the adverb and the preposition are identical: “Expressing position in or movement to a lower place.”

As we mentioned above, this view of “below” and words like it has a long history. Some similar words of this kind, prepositions that have traditionally been called adverbs when used without an object, include these:

“aboard,” “about,” “above,” “across,” “after,” “against,” “ahead,” “along,” “around,” “before,” “behind,” “below,” “beneath,” “besides,” “between,” “beyond,” “by,” “down,” “for,” “in,” “inside,” “near,” “off,” “on,” “opposite,” “out,” “outside,” “over,” “past,” “round,” “since,” “through,” “throughout,” “to,” “under,” “underneath,” “up,” “within,” “without.”

For example, Lindley Murray’s English Grammar, Adapted to the Different Classes of Learners (1795), says that in some instances a preposition “becomes an adverb merely by its application.” The word “since,” he says, is a preposition in “I have not seen him since that time” and an adverb in “Our friendship commenced long since.”

Murray also says, “The prepositions after, before, above, beneath, and several others, sometimes appear to be adverbs, and may be so considered,” giving as an example “He died not long before.” But when a complement follows, he writes, the word is a preposition, as in “He died not long before that time.”

A generation later, the philosopher John Fearn echoed Murray, referring to “the known Principle” that prepositions at the end of a sentence “become Adverbs by Position.”

Fearn also distinguishes between prepositions that require an object (like “with” and “from”) and those that don’t (like “through”). Those in the second group, he says, are “prepositional adverbs” when they’re used without an object (as in “He went through”).

(From Fearn’s Anti-Tooke: Or an Analysis of the Principles and Structure of Language, Vol. II, 1827, an extended argument against the language theories of John Horne Tooke.)

As we said above, the traditional view persists in standard dictionaries but is no longer found in up-to-date comprehensive grammars. Thinking began to change in the late 1960s, when some academic linguists began questioning the “adverb” label and widening the definition of “preposition.”

In the early ’90s, the linguist Ronald W. Langacker gave four examples of “below” as a preposition—“the valley below; the valley below the cliff; A bird flew below; A bird flew below the cliff.” (From “Prepositions as Grammatical(izing) Elements,” published in the journal Leuvense Bijdragen, 1992.)

Note that in those examples “below” is classified as a preposition (1) whether it’s used alone or with a complement, and (2) whether it follows a noun or a verb—thus resembling an adjective in one case (“valley below”) and an adverb in the other (“flew below”).

Most linguists today would agree with that interpretation: “below” and words like it are prepositions. Used with a complement, they’re said to be “transitive prepositions”; used without one, they’re “intransitive prepositions.”

The newer interpretation has only gradually made its way into major books on English grammar.

For example, the old view persisted at least through the publication in 1985 of A Comprehensive Grammar of the English Language, by Randolph Quirk et al. It uses the terms “postmodifying adverb” and “prepositional adverb” for “below” and similar words in constructions like these.

A “postmodifying adverb,” according to the Comprehensive Grammar, is identical to a preposition except that it has no complement and modifies a preceding noun. Examples given include “the sentence below” … “the way ahead” … “the people behind.”

A “prepositional adverb,” the book says, is identical to a preposition but has no complement and modifies a verb. Examples include “She stayed in” … “A car drove past.”

The word is a preposition, according to Quirk, only if a complement is present (and regardless of what it seems to modify). Examples include “below the picture” … “She stayed in the house” … “A car drove past the door.”

The Comprehensive Grammar doesn’t use the words “transitive” and “intransitive” for prepositions, but it comes close: “The relation between prepositional adverbs and prepositional phrases may be compared to that between intransitive and transitive uses of certain verbs.”

The next exhaustive grammar book to come along, The Cambridge Grammar of the English Language (2002), does use those terms. In this book, words that Quirk had previously classified as either postmodifying adverbs or prepositional adverbs are newly categorized as prepositions. The Cambridge Grammar uses “transitive” for prepositions that have a complement, “intransitive” for those that don’t—and it’s the first important English grammar to do so.

The book calls “in” and “since” intransitive prepositions here: “He brought the chair in” … “I haven’t seen her since.” And it calls them transitive prepositions here: “He put it in the box” … “I haven’t seen her since the war.”

The authors of the Cambridge Grammar, Rodney Huddleston and Geoffrey K. Pullum, don’t discuss “below” at length, but they do say that it “belongs only to the preposition category.” It’s also included among a list of prepositions that are used with or without a complement, and these examples show it without one: “the discussion below” … “the room below.”

Huddleston and Pullum essentially redraw the boundary between prepositions and adverbs, defining prepositions more broadly than “traditional grammars of English.” In this, they say, they’re “following much work in modern linguistics.” And they give two chief reasons why they reject the traditional view and reclassify words like “below” as prepositions.

(1) The traditional view “does not allow for a preposition without a complement.” The Cambridge Grammar argues that the presence or absence of a complement has no bearing on the classification. So “the traditional definition of prepositions,” one that says they require a complement, is “unwarranted.”

The book makes an important point about these newly recognized prepositions. Their ability to stand alone, without a complement, “is not a property found just occasionally with one or two prepositions, or only with marginal items,” the book says. “It is a property found systematically throughout a wide range of the most central and typical prepositions in the language.”

(2) The “adverb” label is inappropriate for words like “below” because they don’t behave like adverbs. In “The basket is outside,” for instance, the word “outside” is traditionally defined as an adverb. But as the authors point out, typical adverbs, such as those ending in “-ly,” aren’t normally used to modify forms of the verb “be.”

That role is normally played by adjectives, or by prepositions of the kind we’re discussing—“inside,” “outside,” “above,” “below,” and so on. And such words, the authors write, “no more modify the verb than does young in They are young.”

[Here you might ask, Then why aren’t these words adjectives? “Below” certainly looks like an adjective in uses like “the water below.” The Cambridge Grammar discusses this at length and gives reasons including these: Prepositions can have objects but adjectives can’t. Prepositions are fixed, while adjectives can be inflected for degree (as in “heavy,” “heavier,” “heaviest”) or modified by “very” and “too.” As we wrote on the blog in 2012, the adjectival use of “below” premodifying a noun, as in “Click on the below link,” is not generally accepted.]

In summary, Huddleston and Pullum suggest that if an “-ly” adverb cannot be substituted for the word, then it’s not an adverb. And if a complement could be added (as in “The basket is outside the door”), then it’s not an adverb.

The next influential scholarly grammar to be published, the Oxford Modern English Grammar (2011), written by Bas Aarts, reinforces and builds on this distinction between transitive and intransitive prepositions. And it includes “below” in a list of prepositions that can be used either way—with or without a complement.

Aarts also discusses prepositions that follow a verb and can either stand alone or have a complement: “We might go out” or “We might go out for a meal” … “I shall probably look in” or “I shall probably look in at the College.”

In short, modern developments in linguistics have given “below” a new label—it’s a preposition, and only a preposition. The traditional view lives on in dictionaries, and no doubt it will persist for quite some time. But in our opinion, the new label makes more sense.

Help support the Grammarphobia Blog with your donation. And check out our books about the English language and more.

Subscribe to the blog by email

Enter your email address to subscribe to the blog by email. If you’re a subscriber and not getting posts, please subscribe again.


Tawk of the Town

[Pat’s review of a book about New York English, reprinted from the September 2020 issue of the Literary Review, London. We’ve left in the British punctuation and spelling.]

* * * * * * * * * * * *

PATRICIA T O’CONNER

You Talkin’ to Me? The Unruly History of New York English

By E J White. Oxford University Press 296 pp

You know how to have a polite conversation, right? You listen, wait for a pause, say your bit, then shut up so someone else can speak. In other words, you take your turn.

You’re obviously not from New York.

To an outsider, someone from, say, Toronto or Seattle or London, a conversation among New Yorkers may resemble a verbal wrestling match. Everyone seems to talk at once, butting in with questions and comments, being loud, rude and aggressive. Actually, according to the American linguist E J White, they’re just being nice.

When they talk simultaneously, raise the volume and insert commentary (‘I knew he was trouble’, ‘I hate that!’), New Yorkers aren’t trying to hijack the conversation, White says. They’re using ‘cooperative overlap’, ‘contextualization cues’ (like vocal pitch) and ‘cooperative interruption’ to keep the talk perking merrily along. To them, argument is engagement, loudness is enthusiasm and interruption means they’re listening, she writes. Behaviour that would stop a conversation dead in Milwaukee nudges it forward in New York.

Why do New Yorkers talk this way? Perhaps, White says, because it’s the cultural norm among many of the communities that have come to make up the city: eastern European Jews, Italians, and Puerto Ricans and other Spanish speakers. As for the famous New York accent, that’s something else again.

White, who teaches the history of language at Stony Brook University, New York, argues that ‘Americans sound the way they do because New Yorkers sound the way they do’. In You Talkin’ to Me? she makes a convincing case that the sounds of standard American English developed, at least in part, as a backlash against immigration and the accent of New York.

Although the book is aimed at general readers, it’s based on up-to-the-minute research in the relatively new field of historical sociolinguistics. (Here a New Yorker would helpfully interrupt, ‘Yeah, which is what?’) Briefly, it is about how and why language changes. Its central premise is that things like social class, gender, age, group identity and notions of prestige, all working in particular historical settings, are what drive change.

Take one of the sounds typically associated with New York speech: the oi that’s heard when ‘bird’ is pronounced boid, ‘earl’ oil, ‘certainly’ soitanly, and so on. Here’s a surprise. That oi, White says, was ‘a marker of upper-class speech’ in old New York, a prestige pronunciation used by Teddy Roosevelt and the Four Hundred who rubbed elbows in Mrs Astor’s ballroom. Here’s another surprise. The pronunciation is now defunct and exists only as a stereotype. It retired from high society after the First World War and by mid-century it was no longer part of New York speech in general. Yet for decades afterwards it persisted in sitcoms, cartoons and the like. Although extinct ‘in the wild’ (as linguists like to say), it lives on in a mythological ‘New York City of the mind’.

Another feature of New York speech, one that survives today, though it’s weakening, is the dropping of r after a vowel in words like ‘four’ (foah), ‘park’ (pahk) and ‘never’ (nevuh). This was also considered a prestige pronunciation in the early 1900s, White says, not just in New York City but in much of New England and the South as well, where it was valued for its resemblance to cultivated British speech. Until sometime in the 1950s, in fact, it was considered part of what elocutionists used to call ‘General American’. It was taught, the author writes, not only to schoolchildren on the East Coast, but also to aspiring actors, public speakers and social climbers nationwide. But here, too, change lay ahead.

While r-dropping is still heard in New York, Boston and pockets along the Eastern Seaboard, it has all but vanished in the South and was never adopted in the rest of the United States. Here the author deftly unravels an intriguing mystery: why the most important city in the nation, its centre of cultural and economic power, does not provide, as is the case with other countries, the standard model for its speech.

To begin with, White reminds us, the original Americans always pronounced r, as the British did in colonial times. Only in the late 18th century did the British stop pronouncing r after a vowel. Not surprisingly, the colonists who remained in the big East Coast seaports and had regular contact with London adopted the new British pronunciation. But those who settled inland retained the old r and never lost it. (As White says, this means that Shakespeare’s accent was probably more like standard American today than Received Pronunciation.)

Posh eastern universities also helped to turn the nation’s accent westward. Towards the end of the First World War, White says, Ivy League schools fretted that swelling numbers of Jewish students, admitted on merit alone, would discourage enrolment from the Protestant upper class. Admissions practices changed. In the 1920s, elite schools began to recruit students from outside New York’s orbit and to ask searching questions about race, religion, colour and heritage. The result, White says, was that upper-crust institutions ‘shifted their preference for prestige pronunciation toward the “purer” regions of the West and the Midwest, where Protestants of “Nordic” descent were more likely to live’. Thus notions about what constituted ‘educated’ American speech gradually shifted.

Another influence, the author writes, was the Midwestern-sounding radio and television ‘network English’ that was inspired by the Second World War reporting of Edward R Murrow and the ‘Murrow Boys’ he recruited to CBS from the nation’s interior. Murrow’s eloquent, authoritative reports, heard by millions, influenced generations of broadcasters, including Walter Cronkite, Chet Huntley and Dan Rather, who didn’t try to sound like they had grown up on the Eastern Seaboard. The voice of the Midwest became the voice of America.

This book takes in a lot of territory, all solidly researched and footnoted. But dry? Fuhgeddaboutit. White is particularly entertaining when she discusses underworld slang from the city’s ‘sensitive lines of business’ and she’s also good on song lyrics, from Tin Pan Alley days to hip-hop. She dwells lovingly on the ‘sharp, smart, swift, and sure’ lyrics of the golden age of American popular music – roughly, the first half of the 20th century. It was a time when New York lyricists, nearly all of them Jewish, preserved in the American Songbook not only the vernacular of the Lower East Side but also the colloquialisms of Harlem and the snappy patois of advertising.

You Talkin’ to Me? is engrossing and often funny. In dissecting the exaggerated New York accents of Bugs Bunny and Groucho Marx, White observes that ‘Bugs even wielded his carrot like Groucho’s cigar’. And she says that the word ‘fuck’ is so ubiquitous in Gotham that it has lost its edge, so a New Yorker in need of a blistering insult must look elsewhere. ‘There may be some truth to the old joke that in Los Angeles, people say “Have a nice day” and mean “Fuck off,” while in New York, people say “Fuck off” and mean “Have a nice day.”’


Lex education

Q: I recently received a list of the finalists in a wordplay contest for lexophiles. The winner: “Those who get too big for their pants will be totally exposed in the end.” So, what can you say about the term “lexophile”? I love a clever turn of aphasia.

A: That wordplay list, which has been making the rounds online, purports to be from an annual New York Times lexophile contest. As far as we know, the Times has never had such a contest. In fact, we couldn’t find the word “lexophile” in a search of the newspaper’s archive.

We also couldn’t find “lexophile” in the Oxford English Dictionary or any of the 10 standard dictionaries we regularly consult. However, we did find it in two collaborative references, the often helpful Wiktionary and the idiosyncratic Urban Dictionary.

Wiktionary defines “lexophile” as “a lover of words, especially in word games, puzzles, anagrams, palindromes, etc.” The elements are Greek: “lex-” from λέξις (lexis, word), and “-phile” from ϕίλος (philos, loving).

One contributor to Urban Dictionary defines “lexophile” as “a lover of cryptic words,” while another defines “lexiphile” as “a word used to describe those that have a love for words.”

A more common noun for a word lover, “logophile,” is found in eight standard dictionaries as well as the OED, which is an etymological dictionary. The element “log-” is from the Greek λόγος (logos, word); both logos and lexis are derived from λέγειν (legein, to speak).

The earliest OED citation for “logophile” is from the Feb. 1, 1959, issue of the Sunday Times (London): “We are pretty sure that since all Sunday Times readers are natural and inveterate logophiles … he [the lexicographer R. W. Burchfield] will get some invaluable assistance.”

We’ve found an earlier example for “logophile” in a California newspaper, but the term was used to mean someone who loves to talk, not someone who loves words: “One who loves to talk, but does not carry it to the point of mania, is a logophile, pronounced: LOG-uh-file” (San Bernardino Sun, Jan. 17, 1951).

Interestingly, the noun logophile appeared in French in the mid-19th century with a similar voluble sense. Dictionnaire National ou Dictionnaire Universel de la Langue Française (1850), by Louis-Nicolas Bescherelle, defines a logophile as “Qui aime à parler, à faire de longs discours.”

Merriam-Webster says the first known use of “logophile” in English was in 1923, but it doesn’t include a citation. We haven’t been able to find any examples earlier than the mid-20th century.

As for your “clever turn of aphasia,” the less said the better.


Whomspun history

Q: I often see the use of “whomever” as an object in a subordinate clause like “whomever he chooses.” I can see the logic of this, but it feels awkward to me. Is it because I grew up surrounded by grammatical laxity that “whomever” seems like a neologism born of pedantry? Was it already established as correct English before my time?

A: If “whomever” seems awkward to you, its stuffier sidekick “whomsoever” must strike you as even more awkward. The roots of both pronouns, as well as of “whom” itself, go back to Anglo-Saxon times, though it looks as if all three may be on the way out.

In Old English, the language was much more inflected than it is now—that is, words changed their forms more often to show their functions. You can see this in some of the forms, or declensions, of hwa, the ancestor of “who,” “whom,” and “what.”

When used for a masculine or feminine noun, as we use “who” and “whom” today, the Old English forms were hwa (subject), hwam or hwæm (indirect object), and hwone or hwæne (direct object). When used for a neuter noun, as we use “what” today, the forms were hwæt (subject), hwam or hwæm (indirect object), and hwæt (direct object).

As for “whoever” and “whomever,” the two terms ultimately come from swa hwa swa, the Old English version of “whoso,” and swa hwam swaswa, the early Middle English rendering of “whomso.”

An Old English speaker would use swa hwa swa (literally “so who so”) much as we would use “whoever” and “whosoever.” And his Middle English-speaking descendants would use swa hwam swaswa (“so whom soso”) as we would use “whomever” and “whomsoever.”

Here’s an early “whoso” example that we’ve found in King Alfred’s Old English translation (circa 888) of De Consolatione Philosophiae, a sixth-century Latin work by the Roman philosopher Boethius: “swa hwa swa wille dioplice spirigan mid inneweardan mode æfter ryhte” (“whoso would deeply search with inner mind after truth”).

And here’s a “whomso” citation in the Oxford English Dictionary from a 12th-century document in the Anglo-Saxon Chronicle: “Þæt hi mosten cesen of clerchades man swa hwam swaswa hi wolden to ercebiscop” (“that they could choose from the secular clerks whomso they wished as archbishop”).

“Whosoever” (hwase eauer) and “whoever” (hwa efre) also first appeared in writing in the 12th century, while “whomever” (wom euer) showed up in the 14th century and “whomsoever” (whom-so-euyr) followed in the 15th.

The first OED citation for “whoever,” which we’ve expanded, is from an Old English sermon in the Lambeth Homilies (circa 1175):

“Hwa efre þenne ilokie wel þene sunne dei. oðer þa oðer halie daʒes þe mon beot in chirche to lokien swa þe sunne dei. beo heo dalneominde of heofene riches blisse” (“Whoever looks well on Sunday and on the other holy days that man must also be in church, then he shall participate in the heavenly kingdom’s bliss”).

The dictionary’s earliest example for “whomever” is from Arthour and Merlin (circa 1330): “Wom euer þat he hitt, Þe heued to þe chinne he slitt” (“Whomever he hit, he beheaded, to the chin he slit”). Arthurian legends can get gory at times.

So as you can see, “whomever” was indeed established in English before your time—quite a few centuries before.

As for the use of these terms today, you can find “whoso” and “whomso” in contemporary dictionaries, but they’re usually labeled “archaic,” while “whosoever” and “whomsoever” are generally described as formal versions of “whoever” and “whomever.”

“Who,” of course, is still one of the most common pronouns in English, but “whom” and company are falling out of favor, and many usage writers now accept the use of “who” and “whoever” for “whom,” “whomever,” and “whomsoever” in speech and informal writing.

As Jeremy Butterfield puts it in Fowler’s Dictionary of Modern English Usage (4th ed.), “In practice, whom is in terminal decline and is increasingly replaced by who (or that), especially in conversational English, in which in most cases it would be inappropriately formal.”

Butterfield’s recommendation: “Despite exceptions, the best general rule is as follows: who will work perfectly well in conversation (except the most elevated kind) and in informal writing.” The main exception he notes is that “who” should not be used for “whom” right after a preposition.

Traditionally, as you know, “who” (like the Old English hwa) is a subject, and “whom” (like hwam) is an object. As Pat explains in Woe Is I, her grammar and usage book, “who does something (it’s a subject, like he), and whom has something done to it (it’s an object, like him).”

Pat recommends the traditional usage in formal writing, but she has a section in the new fourth edition of Woe Is I on how to be whom-less in conversation and informal writing:

A Cure for the Whom-Sick

Now for the good news. In almost all cases, you can use who instead of whom in conversation or in informal writing—personal letters, casual memos, emails, and texts.

Sure, it’s not a hundred percent correct, and I don’t recommend using it on formal occasions, but who is certainly less stuffy, especially at the beginning of a sentence or a clause: Who’s the letter from? Did I tell you who I saw at the movies? Who are you waiting to see? No matter who you invite, someone will be left out.

A note of caution: Who can sound grating if used for whom right after a preposition. You can get around this by putting who in front. From whom? becomes Who from? So when a colleague tells you he’s going on a Caribbean cruise and you ask, “Who with?” he’s more likely to question your discretion than your grammar.

[Note: The reader who sent us this question responded, “Your example involving a Caribbean cruise seems fraught with danger in these pan(dem)icky times. If a colleague were to tell me that, my first instinct would be to ask, ‘Who would dare?’ ”]


When negatives collide

Q: I often encounter a sentence such as “I wouldn’t be surprised if she didn’t steal the necklace,” when it actually means the opposite—the speaker or writer wouldn’t be surprised if she DID steal it. Is there a term for this (a type of double negative, maybe)? And how did it come to be so widespread?

A: We’ve seen several expressions for this kind of construction. Terms used by linguists today include “expletive negation,” in which “expletive” means redundant; “negative concord,” for multiple negatives intended to express a single negative meaning; and, more simply, “overnegation.”

Yes, it’s also been called a “double negative,” the term H. L. Mencken used for it more than 80 years ago. Like linguists today, Mencken didn’t find this particular specimen odd or unusual. As he wrote in The American Language (4th ed., 1936), “ ‘I wouldn’t be surprised if it didn’t rain’ is almost Standard American.”

The linguist Mark Liberman discussed this usage—“wouldn’t be surprised” followed by another negative—on the Language Log in 2009. He called it a “species of apparent overnegation” along the lines of “fail to miss” and “cannot underestimate.” (More on those two later.)

Of course, what appears to be an overnegation may not be so. For instance, if everyone but you is predicting rain, you might very well respond with “I wouldn’t be surprised if it didn’t rain” (i.e., you wouldn’t be surprised if it failed to rain). No overnegation there, just two negatives used literally, nothing redundant.

But the usage we’re discussing is a genuine redundancy with no literal intent. And it’s a type of redundancy that’s very common, especially in spoken English. Yet it seldom causes confusion. People generally interpret those dueling negative clauses just as the writer or speaker intends.

You’re a good example of this. While you noticed the redundancy there (“I wouldn’t be surprised if she didn’t steal the necklace”), you correctly interpreted the writer’s meaning (if she did steal it). And no doubt most people would interpret it that way, whether they encountered the sentence in writing or in speech. Why is this?

In the case of written English, our guess is that readers interpreting the writer’s intent take their cues not only from the surrounding context but also from their own past experience. They’re used to seeing this construction and don’t automatically interpret it literally.

In the case of spoken English, where the usage is more common, listeners have the added advantage of vocal cues. Take these two sentences, which are identical except for the stressed words (capitalized here). A listener would interpret them as having opposite meanings:

(1) “I wouldn’t be surprised if he didn’t WIN” = I wouldn’t be surprised if he won.

(2) “I wouldn’t be surprised if he DIDN’T win” = I wouldn’t be surprised if he lost.

In #1, the redundant or overnegated example, the speaker emphasizes the verb and whizzes past the superfluous second negative (“didn’t”). But in #2, the literal example, the speaker emphasizes the second negative, so there’s no doubt that it’s intentional and not redundant.

Language types have been commenting on the overnegated “wouldn’t be surprised” usage since the 19th century.

On the Language Log, Liberman cites this entry from “Some Peculiarities of Speech in Mississippi,” a dissertation written by Hubert Anthony Shands in 1891 and published in 1893: “Wouldn’t be surprised if it didn’t. This expression is frequently used by all classes in the place of wouldn’t be surprised if it did.”

The usage wasn’t peculiar to Mississippi, though. In old newspaper databases, we’ve found 19th-century examples from other parts of the country.

These two 1859 sightings, the earliest we’ve seen, appeared in a humorous story, written in dialect, from the May 7, 1859, issue of the Columbia Spy, a Pennsylvania newspaper:

“ ‘There’s been so much hard swearin’ on that book’ (pointing to Logan’s Bible) ‘I wouldn’t be surprised if the truth was not pretty considerably ranshacked outen it.’ ”

“ ‘I wouldn’t be surprised if you wa’nt vain arter this.’ ”

This example is from a newspaper serving the twin cities of Bristol in Virginia and Tennessee: “I wouldn’t be surprised if some of them didn’t run away after all without paying their bills.” (The Bristol News, Feb. 8, 1876.)

And here’s one from the Midwest: “The business interests of Salina feel the weight of their power, and we wouldn’t be surprised if even Nature did not pause for a moment and measure their colossal proportions.” (The Saline County Journal in Salina, Kansas, Jan. 25, 1877.)

As mentioned above, there are other varieties of overnegation besides the “wouldn’t be surprised” variety. Here are some of the more common ones, along with their intended interpretations.

“You can’t fail to miss it” = You can’t miss it

“We can’t underestimate” = We can’t overestimate

“Nothing is too trivial to ignore” = Nothing is too trivial to consider

“I don’t deny that she doesn’t have some good qualities” = I don’t deny that she does have some good qualities

“We don’t doubt that it’s not dangerous” = We don’t doubt that it is dangerous

As we’ve said, even readers or listeners who notice the excess negativity will understand the intended meaning.

The Dutch linguist Wim van der Wurff uses the term “expletive negation” for usages of this kind. As he explains, the first clause “involves a verb or noun with the meaning ‘fear,’ ‘forbid,’ ‘prohibit,’ ‘hinder,’ ‘prevent,’ ‘avoid,’ ‘deny,’ ‘refuse,’ ‘doubt’ or another predicate with some kind of negative meaning.” What follows is a subordinate clause with “a negative marker” that’s “semantically redundant, or expletive.”

He gives an example from a letter written by Charles Darwin: “It never occurred to me to doubt that your work would not advance our common object in the highest degree.” (From Negation in the History of English, edited by Ingrid Tieken-Boon van Ostade and others.)

Historical linguists have shown that this sort of overnegation exists in a great many languages and in fact was a common usage in Old English and early Middle English.

“Negative concord has been a native grammatical construction since the time of Alfred, at least,” Daniel W. Noland writes, referring to the 9th-century Saxon king (“A Diachronic Survey of English Negative Concord,” American Speech, summer 1991).

But after the Middle Ages, the use of overnegation in English began to fall off, at least in the writings that have been handed down. Little by little, from around the late 15th to the 18th century, multiple negations became less frequent until they finally came to be considered unacceptable. Why?

Don’t point to the grammarians. It seems that this transition happened naturally, not because people started to object on logical or grammatical grounds.

In her monograph A History of English Negation (2004), the Italian linguist Gabriella Mazzon says the claim “that multiple negation was excluded from the standard as a consequence of the grammarians’ attacks is not correct, since the phenomenon had been on its way out of this variety [i.e., standard English] for some time already.”

As for today, Noland says in his American Speech paper, this type of overnegation “still has a marginal status even in standard English.”

We wouldn’t be surprised!


To internet, or not to internet?

Q: I saw this in a New York Times article the other day: “The Virus Changed the Way We Internet.” And this was the tagline of a recent Bayer TV commercial: “This is why we science.” Am I just an old fogy or can any noun be turned into a verb these days?

A: You won’t find the verbs “internet” or “science” in standard dictionaries, but there’s a case to be made for the verbing of the noun “internet.” In fact, the verb showed up in print just a year after the noun, though not in the sense you’re asking about.

When the noun “internet” first appeared in 1975, it referred to “a computer network comprising or connecting a number of smaller networks,” according to the Oxford English Dictionary. When the verb appeared in 1976, it meant “to connect by means of a computer network.”

The usual sense of the noun now—a global computer network that allows users around the world to communicate and share information—evolved over the 1980s and ’90s. The verb took on the sense you’re asking about—to use the internet—in the 1990s.

Here are the two earliest OED citations for the verb used in that way: “A number of providers want you to Internet to their services” (Globe & Mail, Toronto, May 13, 1994) … “I didn’t sleep, I didn’t eat. I just internetted” (Associated Press, Aug. 21, 1994).

Oxford doesn’t include a usage label that would suggest the verb is anything other than standard English. However, none of the 10 standard dictionaries that we regularly consult have an entry for “internet” as a verb. (The collaborative Wiktionary includes it as an “informal” verb meaning to use the internet, and offers this example: “Having no idea what that means, I am internetting like mad.”)

As for the verb “science,” we couldn’t find an entry for it in either the OED or standard dictionaries. However, Oxford and four standard dictionaries include the adjective “scienced” as a rare or archaic usage.

Oxford describes the adjective as “now rare” when used to mean “knowledgeable, learned; skilled or trained in a specified profession or pursuit; (in later use also) adopting a scientific approach.” It says the term is “now somewhat archaic” when used in the sense of “well versed or trained in boxing.”

(Wiktionary includes the “colloquial, humorous” use of the verb “science,” meaning “to use science to solve a problem.” It also includes the adjective “scienced,” meaning “knowledgeable, learned; skilled or trained in a specified profession or pursuit.” It doesn’t cite any examples.)

Speaking for ourselves, we aren’t likely to use “internet” or “science” as a verb, at least not yet. Neither usage is widespread enough. However, we see nothing wrong in principle with the verbing of nouns. In a 2016 post, we defended it as a process that dates back to the early days of English.


Sticking in a knife with a smile

Q: I have recently heard two instances of someone prefacing a criticism by saying, “I am telling you this lovingly.” It sounds to me like sticking in a knife with a smile. It’s similar to prefacing a remark with “clearly,” an indication that things may not be all that clear. Any thoughts about this?

A: We haven’t yet noticed “lovingly” used to criticize with a smile. But like you, we’re bugged by deceptive preludes to faultfinding.

As you know, these introductory remarks are often followed by the word “but” and the critical statement. Some of the more common ones: “I don’t want to criticize, but …,” “I hate to be the one to tell you, but …,” “Don’t take this the wrong way, but …,” and “I don’t want to hurt your feelings, but ….”

These “contrary-to-fact phrases” have been called “false fronts,” “wishwashers,” “but heads,” and “lying qualifiers,” according to the lexicographer Erin McKean, as we noted in a 2012 post.

McKean says the object of these opening remarks is “to preemptively deny a charge that has yet to be made, with a kind of ‘best offense is a good defense’ strategy” (Boston Globe, Nov. 14, 2010).

“This technique,” she notes, “has a distinguished relative in classical rhetoric: the device of procatalepsis, in which the speaker brings up and immediately refutes the anticipated objections of his or her hearer.”

Once you start looking for these deceptive introductions, McKean says, “you see them everywhere, and you see how much they reveal about the speaker. When someone says ‘It’s not about the money, but …,’ it’s almost always about the money. If you hear ‘It really doesn’t matter to me, but …,’ odds are it does matter, and quite a bit.”

“ ‘No offense, but …’ and ‘Don’t take this the wrong way, but …’ are both warning flags, guaranteed to precede statements that are offensive, insulting, or both,” she adds. “ ‘I don’t mean to be rude, but …’ invariably signals the advent of breathtaking, blatant, write-in-to-Miss-Manners-style rudeness. (And when someone starts out by saying ‘Promise me you won’t get mad, but …’ you might as well go ahead and start getting mad.)”

McKean doesn’t mention the use of “clearly” at the beginning of a sentence, but she discusses a few similar sentence adverbs: “Someone who begins a sentence with ‘Confidentially’ is nearly always betraying a confidence; someone who starts out ‘Frankly,’ or ‘Honestly,’ ‘To be (completely) honest with you,’ or ‘Let me give it to you straight’ brings to mind Ralph Waldo Emerson’s quip: ‘The louder he talked of his honor, the faster we counted our spoons.’ ”

We should also mention a 2013 post of ours about “Just sayin’,” an expression that follows a critical comment: “ ‘You might look for a new hair stylist. Just sayin’.”

Why do people use deceptive phrases in criticizing others? McKean suggests that “our real need for these phrases may be rooted in something closer to self-delusion. We’d all like to believe we aren’t being spiteful, nosy or less than forthcoming. To proclaim our innocence in this way is to assert that we are, indeed, innocent.”

However, we think that many of us—including the two of us—use these sneaky expressions simply because we don’t feel comfortable criticizing others, even when criticism may be warranted. Unfortunately, a sneaky criticism often stings more than one that’s plainspoken.


Now I am become Death

Q: I recently read a reference to J. Robert Oppenheimer’s comment about the first test of an atomic bomb: “Now I am become Death, the destroyer of worlds.” I assume that “I am become” is an old usage. How would it be expressed in modern English?

A: That quotation illustrates an archaic English verb construction that’s now found chiefly in literary, poetic, or religious writings. This is the use of forms of “be” in place of “have” as an auxiliary verb in compound tenses: “The prince is [or was] arrived” instead of “The prince has [or had] arrived.”

The passage you ask about, “I am become Death,” is a present-perfect construction equivalent to “I have become Death.” (We’ll have more to say later about Oppenheimer and his quotation from the Bhagavad Gita.)

As we wrote on the blog in 2015, Lincoln’s Gettysburg Address has a well-known example of this usage: “We are met on a great battle-field.” Another familiar use is from the Bible: “He is risen” (King James Version, Matthew 28:6). And Mark Twain uses “I am grown old” in his Autobiography (in a passage first published serially in 1907). All of those are in the present-perfect tense.

Though usages like this were rare in Old English, they became quite frequent during the early Modern English period—roughly from the late 1400s to the mid-1600s, according to The Origins and Development of the English Language (4th ed., 1992), by Thomas Pyles and John Algeo.

The verbs affected were mostly intransitive (that is, without objects) and involved movement and change. The Oxford English Dictionary mentions “verbs of motion such as come, go, rise, set, fall, arrive, depart, grow, etc.”

The dictionary’s citations from the mid-1400s include “So may þat boy be fledde” (“That boy may well be fled”) and “In euell tyme ben oure enmyes entred” (“Our enemies are entered in evil times”).

In Modern English (mid-17th century onward), this auxiliary “be” faded from ordinary English and was largely replaced by “have.” So by Lincoln’s time, the auxiliary “be” was considered poetic or literary. You can see why if you look again at the examples above.

Lincoln used “we are met” to lend his speech a gravity and stateliness that wouldn’t be conveyed by the usual present-perfect (“we have met”). “He is risen” is nobler and more elevated than the usual present perfect (“He has risen”). And Twain’s poetic “I am grown old” is weightier and more solemn than the prosaic version (“I have grown old”).

Apart from matters of tone, the auxiliary “be,” especially in the present perfect, conveys a slightly different meaning than the auxiliary “have.” It emphasizes a state or condition that’s true in the present, not merely an act completed in the past.

As Oxford says, this use of “be” expresses “a condition or state attained at the time of speaking, rather than the action of reaching it, e.g. ‘the sun is set,’ ‘our guests are gone,’ ‘Babylon is fallen,’ ‘the children are all grown up.’ ”

Even today verbs are sometimes conjugated with “be” when they represent states or conditions. A modern speaker might easily say, “The kids were [vs. had] grown long before we retired,” or “By noon the workmen were [vs. had] gone,” or “Is [vs. has] she very much changed?”

In older English, those participles (“grown,” “gone,” “changed”) would have been recognized as verbs (“grow,” “go,” “change”) conjugated in the present perfect with the auxiliary “be,” and the OED interprets many such examples that way. However, in current English they can also be analyzed as participial adjectives modifying a subject, with “be” as the principal verb.

In its entry for the verb “grow,” for example, Oxford has this explanation: “In early use always conjugated with be, and still so conjugated when a state or result is implied.” And in the case of “gone,” the dictionary says that its adjectival use “developed out of the perfect construction with be as auxiliary, reinterpreted as main verb with participial adjective.”

We can never write enough about the word “be.” As David Crystal says, “If we take its eight elements together—be, am, are, is, was, were, being, been—it turns out to be the most frequent item in English, after the” (The Story of Be, 2017).

And a word that’s in constant, heavy use for 1,500 years undergoes a lot of transformations. It’s entitled to be complicated, and no doubt further complications are still to come. To use an expression first recorded in the 1600s, miracles are not ceased.

As for Oppenheimer’s comment, various versions have appeared since he witnessed the atomic test at Alamogordo, NM, on July 16, 1945. You can hear his words in The Decision to Drop the Bomb, a 1965 NBC documentary:

“We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad Gita: Vishnu is trying to persuade the Prince that he should do his duty and, to impress him, takes on his multi-armed form and says, ‘Now I am become Death, the destroyer of worlds.’ I suppose we all thought that, one way or another.”


A ‘they’ by any other name

Q: You have defended the singular “they” when it applies to an unknown person of unknown gender. OK. But how about for a known person of unknown gender? A recent news article that said “they were fired” caused me to search back and forth to find who else was fired. A waste of time.

A: We have indeed defended the use of “they” in the singular for an unknown person—an individual usually represented by an indefinite pronoun (“someone,” “everybody,” “no one,” etc.). Some examples: “If anyone calls, they can reach me at home” … “Nobody expects their best friend to betray them” … “Everyone’s looking out for themselves.”

As we’ve said on the blog, this singular use of “they” and its forms (“them,” “their,” “themselves”) for an indefinite, unknown somebody-or-other is more than 700 years old.

You’re asking about a very different usage, one that we’ve also discussed. As we wrote in a later post, this singular “they” refers to a known person who doesn’t identify as either male or female and prefers “they” to “he” or “she.” Some examples: “Robin loves their new job as sales manager” … “Toby says they’ve become a vegetarian.”

This use of “they” for a known person who’s nonbinary and doesn’t conform to the usual gender distinctions is very recent, only about a decade old.

When we wrote about the nonbinary “they” three years ago, we noted that only one standard dictionary had recognized the usage. The American Heritage Dictionary of the English Language included (and still does) this definition within its entry for “they”: “Used as a singular personal pronoun for someone who does not identify as either male or female.”

American Heritage doesn’t label the usage as nonstandard but adds a cautionary note: “The recent use of singular they for a known person who identifies as neither male nor female remains controversial.”

Since then, a couple of other standard dictionaries have accepted the usage, this time without reservation.

Merriam-Webster’s entry for “they” was updated in September 2019 to include this definition: “used to refer to a single person whose gender identity is nonbinary.”

A British dictionary, Macmillan, now has a similar definition: “used as a singular pronoun by and about people who identify as non-binary.” Macmillan’s example: “The singer has come out as non-binary and asked to be addressed by the pronouns they/them.”

Neither dictionary has any kind of warning label or cautionary note. Other dictionaries, however, are more conservative on the subject, merely observing in usage notes that the nonbinary “they” is out there, but not including it among the standard definitions of “they.”

For instance, Dictionary.com (based on Random House Unabridged) says in a usage note that the use of “they” and its forms “to refer to a single clearly specified, known, or named person is uncommon and likely to be noticed and criticized. … Even so, use of they, their, and them is increasingly found in contexts where the antecedent is a gender-nonconforming individual or one who does not identify as male or female.”

And Lexico (the former Oxford Dictionaries Online) has this in a usage note: “In a more recent development, they is now being used to refer to specific individuals (as in Alex is bringing their laptop). Like the gender-neutral honorific Mx, the singular they is preferred by some individuals who identify as neither male nor female.”

The Oxford English Dictionary, an etymological dictionary based on historical evidence, added the nonbinary use of “they” and its forms in an October 2019 update.

This is now among the OED’s definitions of “they”: “Used with reference to a person whose sense of personal identity does not correspond to conventional sex and gender distinctions, and who has typically asked to be referred to as they (rather than as he or she).”

The dictionary’s earliest example is from a Twitter post (by @thebutchcaucus, July 11, 2009): “What about they/them/theirs? #genderqueer #pronouns.” Oxford also has two later citations:

“Asher thought they were the only nonbinary person at school until a couple weeks ago” (the Harvard-Westlake Chronicle, Los Angeles, Sept. 25, 2013).

“In 2016, they got a role on Orange Is the New Black as a wisecracking white supremacist” (from a profile of Asia Kate Dillon on the Cut, a blog published by New York magazine, June 3, 2019).

We agree with you that this usage can confuse a reader. When a writer uses “they” in an article, we’re sometimes left to wonder how many people are meant.

But a careful writer can overcome this problem. The use of “they” in that last OED citation (“they got a role”) is not confusing because it links the pronoun with a single role. And elsewhere in the article, the author, Gabriella Paiella, took pains to be clear (“they’re arguably Hollywood’s most famous nonbinary actor, one whose star turn came on an unlikely television series”).

As we noted in our nonbinary “they” post, “Clarity is just as important as sensitivity. Be sure to make clear when ‘they’ refers to only one person and when it refers to several people.” We also noted that “when ‘they’ is the subject of a verb, the verb is always plural, even in reference to a single person: ‘Robin says they are coming to the lunch meeting, so order them a sandwich.’ ”


Complementary remarks

Q: I teach writing to foreign students and was asked a question that I just cannot answer. Both of these sentences are normal English: “I was happy being alone” … “I was happy to be alone.” But only the first of these is normal: “I became [or got] lonely being alone” … “I became [or got] lonely to be alone.” What’s wrong with that last sentence?

A: Your question has to do with adjectives and their complements—that is, the words or phrases that complete them. In this case, you’re attempting to complement the adjectives “happy” and “lonely” with two different phrases: one formed with a gerund or “-ing” participle (“being alone”), and the other with a “to” infinitive (“to be alone”).

As any native speaker of English would know immediately, the “happy” sentences work with both complements, but the “lonely” sentences don’t. “I got lonely to be alone” doesn’t sound like normal English.

This is because “lonely” is not among the adjectives that can invariably be complemented by an infinitive. If you replace “lonely” with “afraid,” the original examples work: “I became [or got] afraid being alone” … “I became [or got] afraid to be alone.”

We wish we could tell you that there’s a predictable pattern here—that certain types of adjectives can always be complemented by both participles and infinitives, while other types are always restricted to one or the other.

Unfortunately, no clear pattern emerges. Different adjectives simply act differently in different contexts.

For instance, with another subject and another verb, those “lonely” sentences work with both complements: “It is lonely being alone” … “It is lonely to be alone.” (There “it” is a dummy subject; the real subject is the complement: “Being alone is lonely” … “To be alone is lonely.”)

While we can’t give you a rule about all this, we can make a few broad observations.

Dozens of evaluative adjectives (like “educational,” “interesting,” “lovely,” “pleasant,” etc.) can be used with either “to” infinitives or “-ing” participles if the subject is a dummy “it” and the verb is a form of “be” (like “is,” “was,” “might have been,” and so on). With adjectives like these, the complements are pretty much interchangeable: “It was lovely to see you” … “It was lovely seeing you.”

Many adjectives that modify a subject, and that have to do with the subject’s attitude or capabilities, are often complemented by infinitives. These include “able,” “afraid,” “anxious,” “bound,” “delighted,” “determined,” “eager,” “happy,” “hesitant,” “liable,” “likely,” “quick,” “reluctant,” and “unwilling.” (Example: “The pianist was delighted to perform.”)

Some of those adjectives can also be complemented by “-ing” participles if a preposition is added, like “about” (as in “delighted about performing”), or “in” (“quick in replying”).

Still other adjectives, ones that refer to the experiencing or doing of something rather than to the thing itself, can be complemented by “to” infinitives. In a sentence like “This piece is difficult to perform,” the adjective “difficult” refers more to the performing than to the piece. These adjectives include “boring,” “delicious,” “difficult,” “easy,” “enjoyable,” “hard,” “impossible,” “tough,” and “tiresome.”

Adjectives that are usually complemented by “-ing” participles are much less numerous than the other kind. Among them are “busy,” “pointless,” “useless,” “worth,” and “worthwhile.” For instance, we can say, “She’s busy eating,” but not “She’s busy to eat.”

As for “busy,” notice what happens when we modify the adjective with “too”—both complements work: “She’s too busy eating” … “She’s too busy to eat.” Completely different meanings! This is because “too busy eating” implies a missing element—“… to do [something else].”

Some adjectives that are usually complemented by infinitives—like “absurd,” “annoying,” “awkward,” “fortunate,” “happy,” “logical,” “odd,” and “sad”—can be complemented with participles as well.

Here a point should be made. Sometimes the choice of adjective complement—infinitive or participle—makes no difference in meaning, especially if the subject is the dummy “it.” (Examples: “It’s exhausting to cook for twenty” … “It’s exhausting cooking for twenty.”)

But sometimes a different complement produces a different meaning. “He was happy to carry your suitcase” does not mean “He was happy carrying your suitcase.” Similarly, “I became afraid to be alone” is not the same as “I became afraid being alone.”

This is also true of verbs with these complements or objects. For instance, “I stopped to think” does not mean “I stopped thinking,” and “I remembered to call” does not mean “I remembered calling.” We’ve written several posts, most recently in 2019, about verbs that can have infinitives or gerunds or both as their objects or complements.

With verbs, too, linguists have found no clear pattern that could help a foreign student predict which types work with gerunds, or with infinitives, or with both. As we wrote in 2014, there are only broad outlines that don’t work reliably in all cases.

You can find more on this subject in The Cambridge Grammar of the English Language, by Rodney Huddleston and Geoffrey K. Pullum (pp. 1246, 1259), and A Comprehensive Grammar of the English Language, by Randolph Quirk et al. (pp. 1224, 1230-31, 1392-93).


Not a man but felt this terror

Q: I have a question about the strange use of “but” in the following letter of Emerson to Carlyle: “Not a reading man but has a draft of a new Community in his waistcoat pocket.” I see no modern definition of “but” that fits here. Is the usage archaic?

A: Yes, Ralph Waldo Emerson’s use of “but” is archaic in that sentence, but the usage is still occasionally seen in contemporary historical novels.

The sentence is from a letter Emerson wrote to Thomas Carlyle on Oct. 30, 1840. In it, Emerson refers to the plans of American social reformers to set up utopian communities inspired by the ideas of the French social theorist Charles Fourier.

The passage is especially confusing because it has principal and subordinate clauses with elliptical, or missing, subjects. The “but” is being used to replace a missing pronoun (the subject) in the subordinate clause and to make the clause negative.

Here’s the sentence with all the missing or substitute parts in place: “There is not a reading man who has not a draft of a new Community in his waistcoat pocket.”

As the Oxford English Dictionary explains, “but” is being used here “with the pronominal subject or object of the subordinate clause unexpressed, so that but acts as a negative relative: that … not, who … not (e.g. Not a man but felt this terror, i.e. there was not a man who did not feel this terror, they all felt this terror). Now archaic and rare.”

The earliest OED example of the usage is from a medieval romance: “There be none othir there that knowe me, but wold be glad to wite me do wele” (“There are none there that know me who would not gladly expect me to act well”). From The Three Kings’ Sons, circa 1500. Frederick James Furnivall, who edited the manuscript in 1895 for the Early English Text Society, suggested that David Aubert, a French calligrapher for the Duke of Burgundy, may have been the author.

The most recent Oxford example for this use of “but” is from a 20th-century historical novel for children:

“There is scarce one among us but knows the fells as a man knows his own kale-garth” (“There is scarce one among us who doesn’t know the hills as a man knows his own cabbage garden”). From The Shield Ring, 1956, by Rosemary Sutcliff.


Ethos, logos, pathos

Q: A friend and I were recently discussing “ethos,” “logos,” and “pathos.” Having studied classical Greek, I asserted they should be pronounced as the ancients did: eth-ahs, lah-gahs, and pa-thahs. My friend said English has adopted the words so the commonly used pronunciations of eth-ohs, loh-gohs, and pay-thohs are now acceptable. Any help?

A: As you know, ἦθος (“ethos”), λόγος (“logos”), and πάθος (“pathos”) are in Aristotle’s ῥητορική (Rhetoric), a treatise on the art of persuasion. In the work, he uses “ethos” (character), “logos” (reason), and “pathos” (emotion) in describing the ways a speaker can appeal to an audience. (The classical Greek terms have several other meanings, which we’ll discuss later.)

When English adopted the terms in the 16th and 17th centuries, they began taking on new senses. Here are the usual English meanings now: “ethos,” the spirit of a person, community, culture, or era; “logos,” reason, the word of God, or Jesus in the Trinity; “pathos,” pity or sympathy as well as a quality or experience that evokes them.

So how, you ask, should an English speaker pronounce these Anglicized words?

In referring to the Rhetoric and other ancient texts, we’d use reconstructed classical Greek pronunciations (EH-thahs, LAH-gahs, PAH-thahs), though there’s some doubt as to how Aristotle and others actually pronounced the terms. But in their modern English senses, we’d use standard English pronunciations for “ethos,” “logos,” and “pathos.”

As it turns out, the 10 online standard dictionaries we regularly consult list a variety of acceptable English pronunciations that include the reconstructed ones:

  • EE-thohs, EE-thahs, EH-thohs, or EH-thahs;
  • LOH-gohs, LOH-gahs, or LAH-gahs;
  • PAY-thohs, PAY-thahs, PAY-thaws, PAH-thohs, or PAH-thahs.

(Our preferences would be EE-thohs, LOH-gohs, and PAY-thohs for the modern senses, though these aren’t terms we use every day in conversation.)

When English adopts a word from another language, the spelling, pronunciation, meaning, number, or function of the loanword often changes—if not at once, then over the years. This shouldn’t be surprising, since English itself changes over time. The Old English spoken by the Anglo-Saxons is barely recognizable now to speakers of modern English.

Similarly, the Attic dialect used by Aeschylus (circa 525-455 BC) differed from the Attic of Aristotle (384-322 BC), the Doric dialect of Pindar (c. 518-438 BC), the Aeolic of Sappho (c. 630-570 BC), and the Ionic of the eighth-century BC Homeric epics, the Iliad and the Odyssey.

You were probably taught a reconstructed generic Attic pronunciation of the fifth century BC. The reconstruction originated with Erasmus in the early 16th century and was updated by historical linguists in the 19th and 20th centuries. The linguists considered such things as the meter in poetry, the way animal sounds were written, the spelling of Greek loanwords in Latin, usage in medieval and modern Greek, and the prehistoric Indo-European roots of the language.

But Attic, the dialect of classical Greek spoken in the Athens area, wasn’t generic—it was alive and evolving. And to use a fifth-century BC Attic reconstruction for all classical Greek spoken from the eighth to the fourth centuries BC is like using a generic Boston pronunciation of the 19th century for the English spoken in Alabama, New York, Ohio, and Maine from the 18th to the 21st centuries.

As for the etymology, English borrowed “ethos” from the classical Latin ēthos, which borrowed it in turn from ancient Greek ἦθος, according to the Oxford English Dictionary. In Latin, the word meant character or the depiction of character. In Greek, it meant custom, usage, disposition, character, or the delineation of character in rhetoric.

When “ethos” first showed up in English in the late 17th century, the OED says, it referred to “character or characterization as revealed in action or its representation.” The first Oxford example is from Theatrum Poetarum (1675), by the English writer Edward Phillips:

“As for the Ethos … I shall only leave it to consideration whether the use of the Chorus … would not … advance then diminish the present.” Some scholars believe that the poet John Milton, an uncle who educated Phillips, contributed to the work, which is a list of major poets with critical commentary.

In the mid-19th century, according to the OED, “ethos” came to mean “the characteristic spirit of a people, community, culture, or era as manifested in its attitudes and aspirations; the prevailing character of an institution or system.”

The first citation is from Confessions of an Apostate, an 1842 novel by Anne Flinders: “ ‘A sentiment as true as it is beautiful,’ I replied, ‘like the “austere beauty of the Catholic Ethos,” which we now see in perfection.’ ”

English adopted “logos” in the late 16th century from λόγος in classical Greek, where it meant word, speech, discourse, or reason. The OED’s first English citation uses it as “a title of the Second Person of the Trinity,” or Jesus:

“We cal him Logos, which some translate Word or Speech, and othersome Reason” (from A Woorke Concerning the Trewnesse of the Christian Religion, a 1587 translation by Philip Sidney and Arthur Golding of a work by the French Protestant writer Philippe de Mornay).

The OED says modern writers use the term “untranslated in historical expositions of ancient philosophical speculation, and in discussions of the doctrine of the Trinity in its philosophical aspects.”

English got “pathos” in the late 16th century from the Greek πάθος, which meant suffering, feeling, emotion, passion, or an emotional style or treatment. In English, it first meant “an expression or utterance that evokes sadness or sympathy,” a usage that Oxford describes as rare today.

The dictionary’s earliest English example is from The Shepheardes Calender (1579), the first major poetical work by the Elizabethan writer Edmund Spenser: “And with, A very Poeticall pathos.” (The original 1579 poem uses παθός, but a 1591 version published during Spenser’s lifetime uses “pathos.”)

In the mid-17th century, according to the OED, the term took on the modern sense of “a quality which evokes pity, sadness, or tenderness; the power of exciting pity; affecting character or influence.” The first citation is from “Of Dramatic Poesie,” a 1668 essay by John Dryden: “There is a certain gayety in their Comedies, and Pathos in their more serious Playes.”

Help support the Grammarphobia Blog with your donation. And check out our books about the English language. For a change of pace, read Chapter 1 of Swan Song, a comic novel.

Subscribe to the Blog by email

Enter your email address to subscribe to the Blog by email. If you’re a subscriber and not getting posts, please subscribe again.

Categories
English English language Etymology Expression Grammar Language Linguistics Usage Word origin Writing

Shrink, shrank, shrunk

Q: Is it OK to use “shrunk” as the past tense of “shrink,” as in Honey, I Shrunk the Kids?

A: Yes, it’s OK if you’re American, like that 1989 Disney film. However, British dictionaries are divided over the usage.

As we wrote in 2010, most standard American dictionaries recognize either “shrank” or “shrunk” as a legitimate past tense of “shrink.” So as far back as nine years ago, a sentence like “His trousers shrunk in the laundry” was widely accepted as standard in the US.

These were the recommended American forms: “shrink” as the present tense; “shrank” or “shrunk” as the past tense; “shrunk” or “shrunken” as the past participle (the form used in perfect tenses, requiring an auxiliary like “have” or “had”).

Today, acceptance of the past tense “shrunk” is even more pronounced, as we found in checking the 10 standard American and British dictionaries we usually consult.

All five of the American and three out of the five British dictionaries now accept “shrunk” as well as “shrank.” (One of those last three, Cambridge, qualified its acceptance by saying that “shrunk” is standard in the US but not in the UK.)

Only two holdouts insist on “shrank” alone as the past tense, the British dictionaries Longman and Lexico (formerly Oxford Dictionaries Online). They accept “shrunk” solely as a past participle.

Despite the increasing respectability of the past tense “shrunk,” it’s apparently regarded by some as casual or informal.

Merriam-Webster’s Dictionary of English Usage says that while “shrunk” is “undoubtedly standard” in the past tense, “shrank” is the usual preference in written English. (As we’ll show later, “shrunk” is widely preferred in common usage, if not in edited writing.)

However, we see no reason to avoid “shrunk,” even in formal writing. The Oxford English Dictionary, an etymological dictionary based on historical evidence, says “shrunk” has been used this way since the 1300s. It apparently fell out of favor—at least in written English—sometime in the 19th century and became respectable again in the latter half of the 20th.

The fact that modern lexicographers have come around to accepting “shrunk” is not an indication that standards are slipping or that English is becoming degraded. On the contrary, this development echoes a pattern seen with other verbs of that kind. Here’s the story.

The verb “shrink” was first recorded around the year 1000, as scrincan in Old English. It was inherited from other Germanic languages, with cousins in Middle Dutch (schrinken), Swedish (skrynka), and Norwegian (skrekka, skrøkka).

In Anglo-Saxon days “shrink” had two past-tense forms—“shrank” (scranc) in the singular and “shrunk” (scruncon) in the plural—along with the past participle “shrunken” (gescruncen). So originally (and we’ll use the modern spellings here), the past-tense vowel changed only when the verb shifted from singular to plural, as in “I shrank” vs. “we shrunk.”

But in the 14th century, the dictionary says, the originally plural past tense “shrunk” began appearing with a singular subject (as in “I shrunk,” “he shrunk”). The dictionary’s earliest example is dated circa 1374:

“Sche constreynede and schronk hir seluen lyche to þe comune mesure of men” (“She contracted and shrunk herself to the common measure of men”). From Geoffrey Chaucer’s translation of Boethius’s De Consolatione Philosophiae.

This use of “shrunk,” the OED says, went on to become “frequent in the 15th cent.,” and was “the normal past tense in the 18th cent.”

Dictionaries of the time agree. A New Dictionary of the English Language (William Kenrick, 1733) and A General Dictionary of the English Language (Thomas Sheridan, 1780) both prefer “shrunk” over “shrank” as the past tense. They use the same illustration—“I shrunk, or shrank”—treating “shrank” as a secondary variant.

The preference for “shrunk” persisted among some writers well into the 19th century, as these OED citations show:

“Wherever he went, the enemy shrunk before him” (Washington Irving, A History of New York, 1809) … “Isaac shrunk together, and was silent” (Sir Walter Scott, Ivanhoe, 1819) … “She shrunk back from his grasp” (Scott’s novel Kenilworth, 1821) … “Opinions, which he never shrunk from expressing” (Edward Peacock’s novel Narcissa Brendon, 1891).

But in the meantime “shrank” was also being used, and during the 19th century its popularity gradually revived in written English. Soon it came to be regarded as the better choice.

By the early 20th century, textbooks and usage guides were recommending “shrank” as the proper past-tense form. Henry Fowler, in A Dictionary of Modern English Usage (1926), said “shrunk” had become archaic. (He was wrong, as we now know. Far from being archaic, “shrunk” had stubbornly persisted in common use.)

Here a question arises. If “shrunk” was the normal past tense in the 18th century, why did commentators in the early 20th century suggest that “shrank” was better?

Apparently arbiters of the language felt that forms of “shrink”—the present, past, and perfect tenses—should conform with those of similar verbs: “drink/drank/drunk,” “sink/sank/sunk,” “swim/swam/swum,” and so on. They felt that the legitimate past tenses should be spelled with “a,” the past participles with “u,” and the distinction preserved.

But they overlooked the fact that many similar verbs had adopted “u” in the past tense with no objections. These all belong to a class that in Old English had “i” as the present-tense vowel and had two past-tense vowels: “a” in the singular (“I shrank,” “he shrank”) and “u” in the plural (“we shrunk,” “they shrunk”).

Examples of verbs like this include “cling,” “spin,” “swing,” and “wring.” By the 18th century, they had abandoned the old past tenses spelled with “a” (“clang,” “span,” “swang,” “wrang”) and adopted “u” forms identical to their past participles (“clung,” “spun,” “swung,” “wrung”).

The linguist Harold B. Allen has described “shrink” as “typical” of that class—Old English verbs that “in moving toward a single form for past and participle have popularly used the vowel common to both” (The English Journal, February 1957).

Unlike those other verbs, however, “shrink” was arrested in the process. Instead of dropping its “a” form completely, it has kept both past tenses, “shrank” and “shrunk.” (The same is true of the verbs “spring” and “stink,” which have retained both of their old past tense forms, “sprang/sprung” and “stank/stunk.”)

As we mentioned above, “shrunk” is the past tense favored in common usage. More than 60 years ago, Allen wrote that although textbooks listed “shrank” as the proper past tense, “shrunk” was more popular.

“The findings of the fieldwork for The Linguistic Atlas of the Upper Midwest,” he wrote, “indicate that 86.5% of all informants responding to this item use shrunk as the preterit [past tense].” And there was no evidence of a “small educated minority clinging to a favored shrank.”

The preference for “shrunk,” he said, was “nearly the same in all three groups: 89% of the uneducated, 89% of the high school graduates, and 86% of the college graduates.” Though preferences were divided, he wrote, “the general dominance of shrunk is certain, despite the contrary statements of the textbooks.”

A final word about “shrunken,” which dictionaries still list alongside “shrunk” as a past participle. Today it’s “rarely employed in conjugation with the verb ‘to have,’ ” the OED says. There, too, “shrunk” has become the popular choice (as in “The trousers have shrunk”), and “shrunken” is seen mostly as a participial adjective (“the shrunken trousers”).

The same thing has happened with the verb “drink.” The usual past participle is now “drunk” (as in “he had drunk the poison”), while the old past participle “drunken” is now used only as an adjective.

But as for its past tense, “drink” has held on to “drank” in modern English, and a usage like “he drunk the poison” is not considered standard.


Categories
English English language Etymology Expression Grammar Language Linguistics Phrase origin Usage Word origin Writing

Like more, only more so

Q: I’m seeing “more so” or “moreso” where I would expect “more.” Am I suffering from the usual recency illusion? Can I change it to “more” when editing? I sometimes have trouble knowing whether a language change is far enough along to indulge it.

A: The two-word phrase “more so” is standard English and showed up nearly three centuries ago. You can find it in two of James Madison’s essays in The Federalist Papers and in Jane Austen’s novel Emma.

The one-word version “moreso” has been around for almost two centuries, though it’s not accepted by any modern standard dictionary. The Oxford English Dictionary, an etymological reference, says it’s mainly an American usage.

The OED says “more so” (as well as “moreso”) is derived from the earlier use of “more” with “ellipsis of the word or sentence modified.” That is, it comes from the use of “more” by itself to modify missing words, as in “I found the first act delightful and the second act even more.” (Here “delightful” is missing after “more” but understood.)

The earliest Oxford example for this elliptical “more” usage, which we’ll expand here, is from a Middle English translation of a 13th-century French treatise on morality:

“He ssolde by wel perfect and yblissed ine þise wordle and more ine þe oþre” (“He shall be morally pure and blessed in this world and more in the other”; from Ayenbite of Inwyt, a 1340 translation by the Benedictine monk Michel of Northgate of La Somme des Vices et des Vertus, 1279, by Laurentius Gallus).

Today, the OED says in a December 2002 update to its online third edition, the usage is seen “frequently with anaphoric so” in the phrase “more so (also, chiefly U.S., moreso).” An anaphoric term refers back to a word or words used earlier, as in “I saw the film and so did she.”

The dictionary’s first citation for “more so” is from an early 18th-century treatise by the Irish philosopher George Berkeley: “This is so plain that nothing can be more so” (A Defence of Free-Thinking in Mathematics, 1735). Berkeley, California, was named after the philosopher, who was also the Anglican bishop of Cloyne, Ireland.

The next Oxford example is from a Federalist essay in which Madison discusses the size of districts that choose senators: “Those of Massachusetts are larger than will be necessary for that purpose. And those of New-York still more so” (Federalist No. 57, “The Alleged Tendency of the New Plan to Elevate the Few at the Expense of the Many Considered in Connection with Representation,” Feb. 19, 1788).

In the OED’s citation from Emma, published in 1815, Emma and Mr. Knightley are discussing Harriet’s initial rejection of Mr. Martin: “ ‘I only want to know that Mr. Martin is not very, very bitterly disappointed.’  ‘A man cannot be more so,’ was his short, full answer.”

The one-word version “moreso” soon appeared in both the US and the UK. The earliest British example that we’ve seen is from a clinical lecture on amputation delivered at St. Thomas’s Hospital, London, Nov. 25, 1823:

“In all these cases, it is of infinite importance to be prompt in your decision, moreso almost than in any other cases to be met with in the practice of the profession” (from an 1826 collection of surgical and clinical lectures published by the Lancet).

The earliest American example we’ve found is from an Indiana newspaper: “Cure for the Tooth ache—This is one of the most vexatious of the ills that flesh (or rather nerves) is heir to. The following simple prescription can do no injury, & from actual experiment, we know it to be highly efficacious, moreso than any specific the dread of cold iron ever induced the sufferer to” (Western Sun & General Advertiser, Vincennes, April 29, 1826).

A few months later, the one-word spelling appeared in the published text of a Fourth of July speech at the University of Vermont in Burlington. Here’s the relevant passage from the speech by George W. Benedict, a professor of natural philosophy and chemistry at the university:

“Much has been said of the ingratitude of popular governments. That in those of ancient times, the very individuals to whom they were under the greatest obligations were as liable as others, sometimes apparently moreso, the victims of sudden resentment or the objects of a cold, unfeeling neglect, is doubtless true.”

The OED’s only example for “moreso” is from a late 20th-century book published in Glasgow: “Anyone perceived as being different from society’s norms was a potential target—no-one moreso than the local wise-woman” (Scottish Myths and Customs, 1997, by Carol P. Shaw).

However, the dictionary does have a hyphenated example from the 19th century: “The English servant was dressed like his master, but ‘more-so’ ” (The Golden Butterfly, an 1876 novel by the English writers Walter Besant and Samuel James Rice).

The linguist Arnold Zwicky notes in a May 30, 2005, post on the Language Log that “more” could replace “more so” or “moreso” in all of the OED citations, though the anaphoric versions (those with “so”) may add contrast or emphasis:

“The choice between one variant and the other is a stylistic one. One relevant effect is that, in general, explicit anaphora, as in more so, tends to be seen as more emphatic or contrastive than zero anaphora, as in plain more.”

In the 21st century, people seem to be using the one-word “moreso” in several new nonstandard senses. For example, Zwicky points out that “moreso” is now being used as a simple emphatic version of “more,” without referring back to a word or words used earlier: “alternating more and moreso have been reinterpreted as mere plain and emphatic counterparts, with no necessary anaphoricity.”

Here’s a recent example from an NPR book review of Brynne Rebele-Henry’s Orpheus Girl, an updated version of the myth of Orpheus and Eurydice (Oct. 13, 2019): “Moreso than Hades’s mythic underworld of old, this camp is Actual Hell (and all the trigger warnings that go with that).”

In another innovative reinterpretation, Zwicky says in his 2005 post, “moreso” is being used as a sentence adverb “without any specific standard of comparison implicated.”

It means “moreover” or “furthermore” in this recent sighting on Amazon.com: “Moreso, infants and preschoolers do not have the ability to express feelings of sadness in apt language” (from a description of How to Detect and Help Children Overcome Depression, 2019, by J. T. Mike).

And  “moreso” is being used in the sense of “rather” in this example: “Scientist Kirstie Jones-Williams, who will be helping to train and guide the volunteer researchers, says the goal of the program isn’t to create more scientists, but moreso global ambassadors on the dangers of pollution and more” (from a Sept. 25, 2019, report on NBC Connecticut about a trip to Antarctica).

We’ve occasionally seen the two-word “more so” used in such creative ways too, perhaps influenced by the newer uses of “moreso.” The phrase is a sentence adverb meaning “more importantly” in this query about the hip-hop career of the former Pittsburgh Steelers wide receiver Antonio Brown:

“But we have to know, if Brown starts releasing music, will you listen? More so, will you buy it? Let us know” (an item that appeared Oct. 16, 2019, on Instagram from USA Today’s Steelers Wire).

Although lexicographers are undoubtedly aware of the evolution of “moreso” in the 21st century, none of these new senses have made it into either the OED or the 10 standard dictionaries we regularly consult. Webster’s New World, the only standard dictionary to take note of “moreso,” merely labels it a “disputed” spelling of “more so.” The online collaborative Wiktionary says it’s a “nonstandard” spelling of the phrase.

Getting back to your question, are you suffering from the recency illusion? Well, perhaps a bit. The term, coined by Zwicky, refers to the belief that things you recently notice are in fact recent. Yes, the anaphoric use of “more so” and “moreso” has been around for centuries, but “moreso,” with its new senses, seems to have increased in popularity in recent years.

Historically, “moreso” has been relatively rare in relation to “more so,” according to Google’s Ngram viewer, which compares words and phrases in digitized books published through 2008. However, recent searches with the more up-to-date iWeb corpus, a database of 14 billion words from 22 million web pages, suggest that “moreso” sightings may now be on the rise. Here’s what we found: “moreso,” 8,022 hits; “more so,” 107,837.
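Raw hit counts like those are easier to compare as relative frequencies. Here’s a minimal sketch in Python using the iWeb counts cited above (the helper name `share` is ours, not part of any corpus tool):

```python
def share(hits: int, total_hits: int) -> float:
    """Return hits as a percentage of total_hits."""
    return 100 * hits / total_hits

# iWeb corpus hit counts cited above
moreso = 8_022
more_so = 107_837
total = moreso + more_so

print(f"moreso:  {share(moreso, total):.1f}%")   # 6.9%
print(f"more so: {share(more_so, total):.1f}%")  # 93.1%
```

In other words, even with “moreso” apparently on the rise, it still accounts for only about one in fourteen occurrences of the pair.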

Should you change “moreso” or “more so” to “more” when editing? That depends.

Since “moreso” isn’t standard English, we’d change it to an appropriate standard term, depending on the sense—“more,” “more so,” “moreover,” “rather,” and so on.

As for “more so,” we’d leave it alone if it’s being used anaphorically. Otherwise, we’d change it to an appropriate standard term.

But as you note in your question, the English language is evolving. If you ask us about this in a few years, we may have a different answer.


Categories
English English language Etymology Expression Grammar Language Linguistics Phrase origin Usage Word origin Writing

Can not, cannot, and can’t

Q: Can you please dwell in some detail on why “can not” is now usually written as “cannot”? Is there a linguistic reason for this uncontracted form? Or is it just one of those irregularities that cannot be accounted for?

A: When the usage showed up in Old English, the language of the Anglo-Saxons, it was two words.

One of the oldest examples in the Oxford English Dictionary is from the epic poem Beowulf, perhaps written as early as the 700s: “men ne cunnon” (“men can not”).

And here’s an expanded version that offers context as well as a sense of the Anglo-Saxon poetry:

“ac se æglæca etende wæs, / deorc deaþscua duguþe ond geogoþe, / seomade ond syrede; sinnihte heold / mistige moras; men ne cunnon / hwyder helrunan hwyrftum scriþað” (“all were in peril; warriors young and old were hunted down by that dark shadow of death that lurked night after night on the misty moors; men on their watches can not know where these fiends from hell will walk”).

The combined form “cannot” showed up in the Middle English period (1150 to 1450), along with various other spellings: cannat, cannatte, cannouȝt, connat, connott, conot, conott, cannote, connot, and cannott.

The earliest OED example with the modern spelling is from Cursor Mundi, an anonymous Middle English poem that the dictionary dates at sometime before 1325: “And þou þat he deed fore cannot sorus be” (“And thou that he [Jesus] died for cannot be sorrowful”).

In contemporary English, both “cannot” and “can not” are acceptable, though they’re generally used in different ways. The combined form, as you point out, is more common (Lexico, formerly Oxford Dictionaries Online, says it’s three times as common in the Oxford English Corpus).

Here’s an excerpt from the new, fourth edition of Woe Is I, Pat’s grammar and usage book, on how the two terms, as well as the contraction “can’t,” are generally used today:

CAN NOT / CANNOT / CAN’T. Usually, you can’t go wrong with a one-word version—can’t in speech or casual writing, cannot in formal writing. The two-word version, can not, is for when you want to be emphatic (Maybe you can hit high C, but I certainly can not), or when not is part of another expression, like “not only . . . but also” (I can not only hit high C, but also break a glass while doing it). Then there’s can’t not, as in The diva’s husband can’t not go to the opera.

Getting back to your question, why is “cannot” more popular than “can not”? We believe the compound is more common because the two-word phrase may be ambiguous.

Consider this sentence: “You can not go to the party.” It could mean either “You’re unable to go” or “You don’t have to go.” However, the sentence has only the first meaning if you replace “can not” with “cannot” (or the contraction “can’t”).

In The Cambridge Grammar of the English Language (2002), Rodney Huddleston and Geoffrey K. Pullum say that “You can’t/cannot answer their letters” means “It is not possible or permitted for you to answer their letters,” while “You can not answer their letters” means “You are permitted not to answer their letters.”

In speech, Huddleston and Pullum write, any ambiguity is cleared up by emphasis and rhythm: “In this use, the not will characteristically be stressed and prosodically associated with answer rather than with can by means of a very slight break separating it from the unstressed can.” The authors add that “this construction is fairly rare, and sounds somewhat contrived.”


Categories
English English language Etymology Grammar Language Linguistics Usage Writing

Pat reviews 4 language books

Read Pat in the New York Times Book Review on four new books about the English language.


Categories
English English language Expression Grammar Language Linguistics Usage Writing

Making sense of mixing tenses

Q: I mixed tenses in two news items I wrote about a legal decision. In the original, I wrote, “the judge ruled such passenger fees are constitutional.” After a settlement months later, I wrote, “he said such fees were legal.” Both seem right, but I’m not sure why I used the present tense in the first and the past in the second.

A: Both seem right to us too, even though you combined the tenses differently. The first verb in each passage is in the past tense, but the tense of the second verb varies. As we’ll explain, this mixing of tenses is allowed.

The problem you raise—how to use tenses in a sequence—is particularly common among journalists, who are often required to use what The Cambridge Grammar of the English Language calls “indirect reported speech.”

This construction is used to report what somebody said, but not in a direct quote. The principal verb in your examples is in the past tense (“the judge ruled” … “he said”), but then you’re faced with the problem of what tense to use in the verbs that follow.

As we wrote in a 2015 post, the tenses that follow need not be identical to the first; in some cases the choice is optional.

For instance, even when the second verb expresses something that is still true (those fees are still legal now), a writer may prefer to echo the past tense of the first verb. In fact, the default choice here is the past tense; the present tense may be used, but it’s not required.

In explaining how this works, the Cambridge Grammar begins with this quotation spoken by a woman named Jill: “I have too many commitments.”

Her “original speech,” the book says, may be reported indirectly as either “Jill said she has too many commitments” or “Jill said she had too many commitments.”

“The two reports do not have the same meaning,” write the authors, Rodney Huddleston and Geoffrey K. Pullum, “but in many contexts the difference between them will be of no pragmatic significance.”

So when would the difference matter? One factor that might make a writer choose one tense over the other is the time elapsed between the original speech and the reporting of it. Did Jill say this last year or five minutes ago?

In a sentence like “Jill said she had/has a headache,” the authors say, “Jill’s utterance needs to have been quite recent for has to be appropriate.”

In the case you raise, the original version is closer in time to the judge’s ruling, and the present tense is reasonable: “ruled that such passenger fees are constitutional.” But your follow-up story came much later, which may be why the past tense seemed better to you: “he said such fees were legal.”

In a post that we wrote in 2012, we note that the simple past tense takes in a lot of territory—the very distant as well as the very recent past. A verb like “said” can imply a statement made moments, years, or centuries ago—about situations long dead or eternally true. So the verbs that follow can be challenging.

As the Cambridge Grammar explains, there are no “rules” for this. But in our opinion, if an experienced writer like you thinks the tense in a subordinate clause is reasonable and logical, it probably is.


Categories
English English language Etymology Expression Grammar Language Linguistics Usage Word origin Writing

Hear Pat on Iowa Public Radio

She’ll be on Talk of Iowa today from 10 to 11 AM Central time (11 to 12 Eastern) to discuss the wonders of adjectives, and to take questions from callers.


Categories
English English language Expression Grammar Language Linguistics Phrase origin Usage Word origin Writing

Are you down on “up”?

Q: How did “heat up” replace “heat” in referring to heating food? And why has the equally awful “early on” become so popular?

A: “Heat up” hasn’t replaced “heat” in the kitchen, but the use of the phrasal verb in this sense has apparently increased in popularity in recent years while the use of the simple verb has decreased.

A search with Google’s Ngram Viewer, which compares phrases in digitized books, indicates that “heat the soup” was still more popular than “heat up the soup” as of 2008 (the latest searchable date), though the gap between them narrowed dramatically after the mid-1980s.

However, we haven’t found any standard dictionary or usage guide that considers “heat up” any less standard than “heat” in the cooking sense.

Merriam-Webster online defines the phrasal verb as “to cause (something) to become warm or hot,” and gives this example: “Could you heat up the vegetables, please?”

You seem to think that “heat up” is redundant. We disagree.

As you probably know, “up” is an adverb as well as a preposition. In the phrasal verb “heat up,” it’s an adverb that reinforces the meaning of the verb. (A phrasal verb consists of a verb plus one or more linguistic elements, usually an adverb or a preposition.)

In a 2012 post entitled “Uppity Language,” we quote the Oxford English Dictionary as saying the adverb “up” in a phrasal verb can express “to or towards a state of completion or finality,” a sense that frequently serves “to emphasize the import of the verb.”

The OED, an etymological dictionary based on historical evidence, doesn’t mention “heat up” in that sense, but it cites “eat up,” “swallow up,” “boil up,” “beat up,” “dry up,” “finish up,” “heal up,” and many other phrasal verbs in which “up” is used to express bringing something to fruition, especially for emphasis.

Our impression is that people may also feel that it’s more informal to “heat up” food than simply “heat” it, though dictionaries don’t make that distinction. The phrasal verb “hot up” is used similarly in British English as well as in the American South and South Midland, and dictionaries generally regard that usage as informal, colloquial, or slang.

We also feel that people may tend to use “heat up” for reheating food that’s already cooked, and “heat” by itself for heating food that’s prepared from scratch. An Ngram search got well over a hundred hits for “heat up the leftovers,” but none for “heat the leftovers.” However, we haven’t found any dictionaries that make this distinction either.

In addition to its food sense, “heat up” can also mean “to become more active, intense, or angry,” according to Merriam-Webster online, which cites these examples: “Their conversation started to heat up” …. “Competition between the two companies is heating up.”

And the adverb “up” can have many other meanings in phrasal verbs: from a lower level (“pick up,” “lift up”), out of the ground (“dig up,” “sprout up”), on one’s feet (“get up,” “stand up”), separate or sever (“break up,” “tear up”), and so on.

When the verb “heat” appeared in Old English (spelled hǽtan, haten, hatten, etc.), it was intransitive (without an object) and meant to become hot. The earliest citation in the Oxford English Dictionary is from a Latin-Old English entry in the Epinal Glossary, which the OED dates at sometime before 700: “Calentes, haetendae.”

The first OED citation for the verb used transitively (with an object) to mean make (something) hot is from Old English Leechdoms, a collection of medical remedies dating from around 1000: “hæt scenc fulne wines” (“heat a cup full of wine”).

As far as we can tell, the phrasal verb “heat up” appeared in the second half of the 19th century, though not in its cooking sense. The earliest example we’ve seen is from an April 9, 1878, report by the US Patent Office about an invention in which a system of pipes “is employed to heat up the feedwater of a steam-boiler.”

A lecture in London a few years later touches on cooking: “Now a Bunsen burner will roast meat very well, provided that the products of combustion are not poured straight on to whatever is being cooked; the flame must be used to heat up the walls of the roaster, and the radiant heat from the walls must roast the meat.” (The talk on the use of coal gas was given on Dec. 15, 1884, and published in the Journal of the Society of Arts, Jan. 9, 1885.)

The earliest example we’ve seen for “heat up” used in the precise sense you’re asking about is from a recipe for shrimp puree in Mrs. Roundell’s Practical Cookery Book (1898), by Mrs. Charles Roundell (Julia Anne Elizabeth Roundell):

“bring to the boil, skimming off any scum that may rise, then cool, and pass all through the sieve into another stewpan, stir in the shrimps that were reserved for garnish and heat up.”

As for the adverbial phrase “early on,” it’s been used regularly since the mid-18th century to mean “at an initial or early stage,” according to the OED. The dictionary also cites examples of the variant “earlier on” from the mid-19th century.

Oxford’s earliest example of “early on” is from a 1759 book about tropical diseases by the English physician William Hillary: “When I am called so early on in the Disease … I can strictly pursue it” (from Observations on the Changes of the Air, and the Concomitant Epidemical Diseases in the Island of Barbados).

And the first “earlier on” example is from the Manchester Guardian, April 21, 1841: “It took place earlier on in the year.”

You’re right that “early on” has grown in popularity lately, though “earlier on” has remained relatively stable, according to a comparison of the phrases in the Ngram Viewer.

However, we don’t see why the usage bothers you. The four online standard dictionaries we’ve consulted (Merriam-Webster, American Heritage, Oxford, and Longman) list it without comment—that is, as standard English.

Help support the Grammarphobia Blog with your donation.
And check out our books about the English language.

Subscribe to the Blog by email

Enter your email address to subscribe to the Blog by email. If you are an old subscriber and not getting posts, please subscribe again.

Why foxes have fur, horses hair

Q: Why do we say some animals have “hair” while others have “fur”?

A: All mammals have hair—dogs, cats, foxes, pigs, gerbils, horses, and people. Even dolphins have a few whiskers early in their lives. Scientifically speaking, there’s no difference between hair and fur.

“This is all the same material,” Dr. Nancy Simmons, a mammalogist with the American Museum of Natural History, said in a 2001 interview with Scientific American. “Hair and fur are the same thing.”

She added that there are many norms for hair length, and that different kinds of hair can have different names, such as a cat’s whiskers and a porcupine’s quills.

Well, science is one thing, but common English usage is another. Most of us do have different ideas about what to call “hair” and what to call “fur.”

For example, we regard humans as having “hair,” not “fur.” And we use “hair” for what grows on livestock with thick, leathery hides—horses, cattle, and pigs.

But we generally use “fur” for the thick, dense covering on animals like cats, dogs, rabbits, foxes, bears, raccoons, beavers, and so on.

Why do some animals have fur and others hair? The answer lies in the origins of the noun “fur,” which began life as an item of apparel.

In medieval England, “fur” meant “a trimming or lining for a garment, made of the dressed coat of certain animals,” according to the Oxford English Dictionary.

The source, the dictionary suggests, is the Old French verb forrer, which originally meant to sheathe or encase, then “developed the sense ‘to line,’ and ‘to line or trim with fur.’ ”

When the word “fur” first entered English, it was a verb that meant to line, trim, or cover a garment with animal hair. The earliest OED use is from Kyng Alisaunder, a Middle English romance about Alexander the Great, composed in the late 1200s or early 1300s:

“The kyng dude of [put on] his robe, furred with meneuere.” (The last word is “miniver,” the white winter pelt of a certain squirrel.)

The noun followed. Its first known use is from The Romaunt of the Rose, an English translation (from 1366 or earlier) of an Old French poem. The relevant passage refers to a coat “Furred with no menivere, But with a furre rough of here [hair].”

The noun’s meaning gradually evolved over the 14th and 15th centuries. From the sense of a lining or trimming, “fur” came to mean the material used to make it. Soon it also meant entire garments made of this material, as well as the coats of the animals themselves.

Oxford defines that last sense of “fur” this way: “The short, fine, soft hair of certain animals (as the sable, ermine, beaver, otter, bear, etc.) growing thick upon the skin, and distinguished from the ordinary hair, which is longer and coarser. Formerly also, the wool of sheep” [now obsolete].

Note that this definition establishes the distinction between the special hair we call “fur” (short, fine, soft), and “ordinary hair” (longer, coarser).

The dictionary’s earliest citation is a reference to sheep as bearing “furres blake and whyte” (circa 1430). The first non-sheep example was recorded in the following century, a reference to the “furre” of wolves (Edmund Spenser, The Shepheardes Calender, 1579).

From the 17th century on, examples are plentiful. Shakespeare writes of “This night wherin … The Lyon, and the belly-pinched Wolfe Keepe their furre dry” (King Lear, 1608). And Alexander Pope writes of “the strength of Bulls, the Fur of Bears” (An Essay on Man, 1733).

But a mid-18th-century example in the OED stands out—at least for our purposes—because it underscores that “fur” was valued because it was soft and warm: “Leave the Hair on Skins, where the Fleece or Fir is soft and warm, as Beaver, Otter, &c.” (From An Account of a Voyage for the Discovery of a North-west Passage, 1748, written by the ship’s clerk.)

Elsewhere in the account, the author notes that deer or caribou skins were “cleared of the Hair” to make use of the skin as leather.

As for “hair,” it’s a much older word than “fur” and came into English from Germanic sources instead of French.

Here’s the OED definition: “One of the numerous fine and generally cylindrical filaments that grow from the skin or integument of animals, esp. of most mammals, of which they form the characteristic coat.”

The word was spelled in Old English as her or hær, Oxford says, and was first recorded before the year 800 in a Latin-Old English glossary: “Pilus, her.” (In Latin pilus is a single hair and pili is the plural.)

By around the year 1000, “hair” was also used as a mass or collective noun, defined in the OED as “the aggregate of hairs growing on the skin of an animal: spec. that growing naturally upon the human head.”

In summary, most of us think of “fur” as soft, cuddly, warm, and dense. We don’t regard “hair” in quite the same way (even though it technically includes “fur”). “Hair,” in other words, covers a lot more bases.

But in practice, English speakers use the words “hair” and “fur” inconsistently. People often regard some animals, especially their pets, as having both “fur” and “hair.”

They may refer to Bowser’s coat as “fur,” but use the word “hair” for what he leaves on clothes and furniture. And when he gets tangles, they may say that either his “hair” or his “fur” is matted and needs combing out.

Furthermore (no pun intended), two different people might describe the same cat or dog differently—as having “hair” or “fur,” as being “hairy” or “furry,” and (particularly in the case of the cat) as throwing up a “hairball” or a “furball.” They simply perceive the animal’s coat differently.

Our guess is that people base their choice of words on what they perceive as the thickness, density, or length of a pet’s coat. The heavy, dense coat of a Chow dog or a Persian cat is likely to be called “fur.” And the short, light coat of a sleek greyhound or a Cornish Rex is likely to be called “hair.”

A new ‘Woe Is I’ for our times

[This week Penguin Random House published a new, fourth edition of Patricia T. O’Conner’s bestselling grammar and usage classic Woe Is I: The Grammarphobe’s Guide to Better English in Plain English. To mark the occasion, we’re sharing the Preface to the new edition.]

Some books can’t sit still. They get fidgety and restless, mumbling to themselves and elbowing their authors in the ribs. “It’s that time again,” they say. “I need some attention here.”

Books about English grammar and usage are especially prone to this kind of behavior. They’re never content with the status quo. That’s because English is not a stay-put language. It’s always changing—expanding here, shrinking there, trying on new things, casting off old ones. People no longer say things like “Forsooth, methinks that grog hath given me the flux!” No, time doesn’t stand still and neither does language.

So books about English need to change along with the language and those who use it. Welcome to the fourth edition of Woe Is I.

What’s new? Most of the changes are about individual words and how they’re used. New spellings, pronunciations, and meanings develop over time, and while many of these don’t stick around, some become standard English. This is why your mom’s dictionary, no matter how fat and impressive-looking, is not an adequate guide to standard English today. And this is why I periodically take a fresh look at what “better English” is and isn’t.

The book has been updated from cover to cover, but don’t expect a lot of earthshaking changes in grammar, the foundation of our language. We don’t ditch the fundamentals of grammar and start over every day, or even every generation. The things that make English seem so changeable have more to do with vocabulary and how it’s used than with the underlying grammar.

However, there are occasional shifts in what’s considered grammatically correct, and those are reflected here too. One example is the use of they, them, and their for an unknown somebody-or-other, as in “Somebody forgot their umbrella”—once shunned but now acceptable. Another has to do with which versus that. Then there’s the use of “taller than me” in simple comparisons, instead of the ramrod-stiff “taller than I.” (See Chapters 1, 3, and 11.)

Despite the renovations, the philosophy of Woe Is I remains unchanged. English is a glorious invention, one that gives us endless possibilities for expressing ourselves. It’s practical, too. Grammar is there to help, to clear up ambiguities and prevent misunderstandings. Any “rule” of grammar that seems unnatural, or doesn’t make sense, or creates problems instead of solving them, probably isn’t a legitimate rule at all. (Check out Chapter 11.)

And, as the book’s whimsical title hints, it’s possible to be too “correct”— that is, so hung up about correctness that we go too far. While “Woe is I” may appear technically correct (and even that’s a matter of opinion), the lament “Woe is me” has been good English for generations. Only a pompous twit—or an author trying to make a point—would use “I” instead of “me” here. As you can see, English is nothing if not reasonable.

(To buy Woe Is I, visit your local bookstore or Amazon.com.)

Who, me?

Q: In Michelle Obama’s memoir, Becoming, she uses this sentence to describe the sacrifices her parents made in raising her and her brother Craig: “We were their investment, me and Craig.” Surely that should be “Craig and I.”

A: Not necessarily. We would have written “Craig and I.” But the sentence as written is not incorrect. It’s informal, but not ungrammatical.

Here the compound (“me and Craig”) has no clear grammatical role. And as we wrote in 2016, a personal pronoun without a clear grammatical role—one that isn’t the subject or object of a sentence—is generally in the objective case.

In our previous post, we quoted the linguist Arnold Zwicky—the basic rule is “nominative for subjects of finite clauses, accusative otherwise.” In other words, when the pronoun has no distinctly defined role, the default choice is “me,” not “I.”

The Merriam-Webster Online Dictionary has this usage note: “I is now chiefly used as the subject of an immediately following verb. Me occurs in every other position.” The examples given include “Me too” … “You’re as big as me” … “It’s me” … “Who, me?”

“Almost all usage books recognize the legitimacy of me in these positions,” M-W says.

As we said, we think the compound “me and Craig” has no clear grammatical role. But digging deeper, we could interpret it as placed in apposition to (that is, as the equivalent of) the subject of the sentence: “we.” And technically, appositives should be in the same case, so the pronoun in apposition to “we” should be a subject pronoun: “I [not “me”] and Craig.”

That’s a legitimate argument, and if the author were aiming at a more formal style, she no doubt would have taken that route.

On the other hand, the same argument could be made against “Who, me?” Those two pronouns could be interpreted as appositives, but forcing them to match (“Whom, me?” or “Who, I?”) would be unnatural.

In short, the choice here is between formal and informal English (not “correct” versus “incorrect”), and the author chose the informal style.

By the way, as we wrote in 2012, the order in which the pronoun appears in a compound (as in “me and Craig” versus “Craig and me”) is irrelevant. There’s no grammatical rule that a first-person singular pronoun has to go last. Some people see a politeness issue here, but there’s no grammatical foundation for it.

That said, when the pronoun is “I,” it does seem to fall more naturally into the No. 2 slot. “Tom and I are going” seems to be a more natural word order than “I and Tom are going.” This is probably what’s responsible for the common (and erroneous) use of “I” when it’s clearly an object—as in “Want to come with Tom and I?”

Can ‘clear’ mean ‘clearly’?

Q: Is “clear” an adverb as well as an adjective? Can one say “I speak clear” or is it always “I speak clearly”?

A: The word “clear” can be an adverb as well as an adjective, but it’s not used adverbially in quite the same way as “clearly” in modern English.

A sentence like “I speak clearly” is more idiomatic (that is, natural to a native speaker) than “I speak clear.” However, “I speak loud and clear” is just as idiomatic as “I speak loudly and clearly.” And “I speak clear” would have been unremarkable hundreds of years ago. Here’s the story.

As Merriam-Webster’s Dictionary of English Usage explains, “Both clear and clearly are adverbs, but in recent use they do not overlap. Clear is more often used in the sense of ‘all the way.’ ”

The usage guide gives several “all the way” examples, including one from a Jan. 18, 1940, letter by E. B. White (“there is a good chance that the bay will freeze clear across”) and another from Renata Adler in the April 24, 1971, issue of the New Yorker (“a model son who had just gone clear out of his mind”).

The Oxford English Dictionary notes that “clear” is also used adverbially to mean distinctly or clearly, as in “loud and clear” and “high and clear.” The OED adds that “in such phrases as to get or keep (oneself) clear, to steer clear, go clear, stand clear, the adjective passes at length into an adverb.”

We’d add the use of “see (one’s way) clear” in the sense of agreeing to do something, as in “Can you see your way clear to lending me the money?”

In Fowler’s Dictionary of Modern English Usage (4th ed.), Jeremy Butterfield writes that “it would be absurd to substitute clearly for clear in such phrases as go clear, keep clear, stand clear, stay clear, steer clear, loud and clear, or in sentences like the thieves got clear away.”

However, Butterfield adds, “Clearly is overwhelmingly the more usual adverbial form of the two.”

So how is the adverb “clearly” used in modern English?

It can mean “in a clear manner,” as in this M-W example from At Swim-Two-Birds, a 1939 novel by the Irish writer Flann O’Brien, pseudonym of Brian O’Nolan: “His skull shone clearly in the gaslight.” And this M-W citation from the November 1982 issue of Smithsonian: “looked clearly at their country and set it down freshly.”

The “-ly” adverb can also mean “without a doubt,” as in this M-W citation from the Oct. 2, 1970, Times Literary Supplement: “He clearly knows his way about the complex and abstruse issues.” And this one from James Jones in Harper’s (February 1971): “walked toward them calmly and sanely, clearly not armed with bottles or stones.”

In addition, the M-W usage guide says, “clearly” can be a sentence adverb meaning “without a doubt,” as in this passage by Sir Richard Livingstone in the March 1953 Atlantic: “Clearly it is a good thing to have material conveniences.” And this citation from Barry Commoner in the Spring 1968 Columbia Forum: “Clearly our aqueous environment is being subjected to an accelerating stress.”

In an adverbial phrase that combines different adverbs, the form of the adverbs is usually consistent: either flat (“loud and clear”) or with a tail (“loudly and clearly”). We’ll cite recent pairs of each that we’ve found in the news.

This “-ly” example is from an opinion piece in the Nov. 5, 2018, Boston Globe: “As concerned citizens committed to our democratic values, we must be willing to stand up and say loudly and clearly that we will not stand for that kind of governance.”

And this tailless example is from a Nov. 11, 2018, report in the Washington Post about President Trump’s recent trip to Paris: “Trump was not making a sound, but his presence could still be heard loud and clear.”

When English borrowed “clear” from Old French in the late 13th century, it was an adjective “expressing the vividness or intensity of light,” according to the OED. It ultimately comes from the Latin clārum (bright, clear, plain, brilliant, and so on).

The dictionary’s earliest example for the adjective is from The Chronicle of Robert of Gloucester, an account of early Britain written around 1300, perhaps as early as 1297: “a leme swythe cler & bryȝte” (“a light very clear and bright”).

The adverbs “clear” and “clearly” both showed up in writing around the same time in the early 1300s. The adverbial “clear” initially described visual clarity, while “clearly” referred to brightness.

The earliest OED example for “clear” used as an adverb is from Cursor Mundi, an anonymous Middle English poem composed before 1325 and possibly as early as 1300: “Þe sune … schines clere” (“The sun … shines clear”).

The dictionary’s first citation for “clearly” (clerliche in Middle English) is from the Life of St. Brandan (circa 1300): “Hi seȝe in the see as clerliche as hi scholde alonde” (“He sees on the sea as clearly as he should on land”). The medieval Irish saint, usually called St. Brendan, is known for a legendary sea journey from Ireland to the Isle of the Blessed.

Why do some adverbs have tails while others don’t? Here’s a brief history.

In Anglo-Saxon days, adverbs were usually formed by adding –lice or –e at the end of adjectives. Over the years, the –lice adverbs evolved into the modern “-ly” ones and the adverbs with a final –e lost their endings, becoming tailless flat adverbs that looked like adjectives.

Sounds simple, but things got complicated in the 17th and 18th centuries, when Latin scholars insisted that adjectives and adverbs should have different endings in English, as they do in Latin. As a result, people began sticking “-ly” onto perfectly good flat adverbs and preferring the “-ly” versions where both existed.

Although the adjective “clear” comes from Old French, not Old English, the flat adverb “clear” may have been influenced by the loss of the adverbial –e in native Anglo-Saxon words, first in pronunciation and later in spelling.

As the OED explains, the adverbial use of “clear” arose “partly out of the predicative use of the adjective” and “partly out of the analogy of native English adverbs,” which by loss of the final –e had become identical in form with their adjectives.

Syllables gone missing

Q: I just heard a BBC interviewer pronounce “medicine” as MED-sin. I’m pretty sure that Doc Martin attended MED-i-cal school, so why do the British drop the vowel “i” when speaking of pharmaceuticals?

A: The pronunciation of “medicine” as MED-sin is standard in British speech. It’s part of a larger phenomenon that we wrote about in 2012, the tendency of British speakers to drop syllables in certain words.

What’s dropped is a weak or unstressed next-to-last syllable in a word of three syllables or more. So in standard British English, “medicine” is pronounced as MED-sin, “necessary” as NESS-a-sree, “territory” as TARE-eh-tree, and so on.

The dropped syllable or vowel sound is either unstressed (like the first “i” in “medicine”) or has only a weak, secondary stress (like the “a” in “necessary”).

This syllable dropping apparently began in 18th- and 19th-century British speech, and today these pronunciations are standard in Britain. You can hear this by listening to the pronunciations of “medicine,” “secretary,” “oratory,” and “cemetery” in the online Longman Dictionary of Contemporary English (click the red icon for British, blue for American).

We know roughly when such syllable-dropping began because, as we wrote in our book Origins of the Specious, lexicographers of the time commented on it.

It wasn’t until the late 18th century that dictionaries—like those by William Kenrick (1773), Thomas Sheridan (1780), and John Walker (1791)—began marking secondary stresses within words, and providing pronunciations for each syllable.

Sheridan in particular made a point of this, lamenting what he saw as a general “negligence” with regard to the pronunciation of weakly stressed syllables.

“This fault is so general,” Sheridan wrote, “that I would recommend it to all who are affected by it, to pronounce the unaccented syllables more fully than is necessary, till they are cured of it.” (A Complete Dictionary of the English Language, 1780.)

Despite such advice, syllable dropping continued, and these abbreviated pronunciations became more widely accepted throughout the 1800s. By 1917, the British phonetician Daniel Jones had recognized some of these pronunciations as standard.

In An English Pronouncing Dictionary, Jones omitted the next-to-last syllable in some words (“medicine,” “secretary,” “cemetery”) while marking it as optional in others (“military,” “necessary,” “oratory”). As the century progressed, later and much-revised editions of Jones’s dictionary omitted more of those syllables.

As Jones originally wrote, his aim was to describe what was heard in the great English boarding schools, the accent he called “PSP” (for “Public School Pronunciation”). In the third edition of his dictionary (1926), he revived the older, 19th-century term “Received Pronunciation” and abbreviated it to “RP” (here “received” meant “socially accepted”).

Americans, meanwhile, continued to pronounce those syllables.

In The Origins and Development of the English Language (4th ed., 1993), Thomas Pyles and John Algeo write that while British speech lost the subordinate stress in words ending in “-ary,” “-ery,” and “-ory,” this stress “is regularly retained in American English.”

As examples of American pronunciation, the authors cite “mónastèry, sécretàry, térritòry, and the like,” using an acute accent (´) for the primary stress and a grave accent (`) for the secondary stress.

Similarly, The Handbook of English Pronunciation (2015), edited by Marnie Reed and John M. Levis, says that in words “such as secretary, military, preparatory, or mandatory,” the next-to-last vowel sound “is usually deleted or reduced in Britain but preserved in North America.”

The book adds that North American speech also retains unstressed vowels in the word “medicine,” in the names of berries (“blackberry,” “raspberry,” “strawberry,” etc.), in place names like “Birmingham” and “Manchester,” and in names beginning with “Saint.”

However, not every unstressed next-to-last syllable is dropped in standard British pronunciation. The one in “medicine” is dropped, but the British TV character Doc Martin would pronounce the syllable in “medical,” as you point out.

And the word “library” can go either way. As Pyles and Algeo write, “library” is “sometimes reduced” to two syllables in British speech (LYE-bree), though in “other such words” the secondary stress can be heard. Why is this?

In The Handbook of English Pronunciation, Reed and Levis write that some variations in speech are simply “idiosyncratic.” They discuss “secretary,” “medicine,” “raspberry,” and the others in a section on “words whose pronunciation varies in phonologically irregular ways.”

However you view it—“idiosyncratic” or “phonologically irregular”—this syllable-dropping trend is not irreversible. As Pyles and Algeo note, “Some well-educated younger-generation British speakers have it [the secondary stress] in sécretàry and extraórdinàry.”

There’s some evidence for this. A 1998 survey of British speakers found that those under 26 showed “a sudden surge in preference for a strong vowel” in the “-ary” ending of “necessary,” “ordinary,” and “February.” (“British English Pronunciation Preferences: A Changing Scene,” by J. C. Wells, published in the Journal of the International Phonetic Association, June 1999.)

So has American pronunciation influenced younger British speakers? Not likely, in the opinion of Pyles and Algeo: “A restoration of the secondary stress in British English, at least in some words, is more likely due to spelling consciousness than to any transatlantic influence.”

And Wells seems to agree: “English spelling being what it is,” he writes, “one constant pressure on pronunciation is the influence of the orthography. A pronunciation that is perceived as not corresponding to the spelling is liable to be replaced by one that does.”

When a bomb goes boom

Q: I’ve come across a cartoon online that raises a good question: If “tomb” is pronounced TOOM and “womb” is pronounced WOOM, why isn’t “bomb” pronounced BOOM?

A: In the past, “bomb” was sometimes spelled “boom” and probably pronounced that way too. In fact, a “bomb” was originally a “boom,” etymologically speaking.

The two words have the same ancestor, the Latin bombus (a booming, buzzing, or humming sound). The Romans got the word from the Greek βόμβος (bómbos, a deep hollow sound), which was “probably imitative in origin,” according to the Chambers Dictionary of Etymology.

The Latin noun produced the words for “bomb” in Italian and Spanish (bomba), French (bombe), and finally English, where it first appeared in the late 1500s as “bome,” without the final “b.”

The “bome” spelling was a translation of the Spanish term. It was first recorded in Robert Parke’s 1588 English version of a history of China written by Juan González de Mendoza. Here’s the OED citation:

“They vse … in their wars … many bomes of fire, full of olde iron, and arrowes made with powder & fire worke, with the which they do much harme and destroy their enimies.”

After that, however, the word disappeared for almost a century, reappearing as a borrowing of the French bombe, complete with the “b” and “e” at the end.

The earliest English example we’ve found is from A Treatise of the Arms and Engines of War, a 1678 English translation of a French book on war by Louis de Gaya. A section entitled “Of Bombes” begins:

“Bombes are of a late Invention. … They are made all of Iron, and are hollow … they are filled with Fire-works and Powder, and then are stopped with a Bung or Stopple well closed; in the middle of which is left a hole to apply the Fuse to.”

The Oxford English Dictionary’s earliest “bombe” example appeared a few years later: “They shoot their Bombes near two Miles, and they weigh 250 English Pounds a piece” (from the London Gazette, 1684).

The first appearances we’ve found of the modern spelling “bomb,” without the “e” on the end, are from a 1680 edition of The Turkish History, by Richard Knolles. The word “bomb” appears more than a dozen times, as both noun and verb.

Here’s a noun example: “twenty of them were killed that day by one Bomb.” And here’s one with the verb: “the Captain General form’d all the Trenches and Traverses for an Attack, and Bomb’d the Town with twenty Mortar-pieces.”

By the mid-1690s the “bomb” spelling had become established enough to appear in an English-to-French dictionary, Abel Boyer’s A Complete French Mastery for Ladies and Gentlemen (1694): “a bomb, une bombe.” That final silent “b” remained in the word, probably for etymological reasons, forever after.

The pronunciation of “bomb” has varied over the centuries, and it still does. Today three pronunciations are considered standard, according to the OED.

The dictionary, using the International Phonetic Alphabet, gives them as /bɒm/, /bʌm/, and /bɑm/, which we might transcribe as BOM, BUM, and BAHM (the first two are British, the third American).

The three vowels sound, respectively, like the “o” in “lot,” the “u” in “cup,” and the “a” in “father.” Furthermore, the British pronunciations are short and clipped in comparison with the American, which is more open and drawn out.

The second British pronunciation, BUM, was “formerly usual” in the British Army, Oxford says. And it apparently was widespread in the 18th century, since it’s the only pronunciation given in several dictionaries of the time, including the most popular one, John Walker’s A Critical Pronouncing Dictionary (1791).

As for the BOOM pronunciation, “bomb” was sometimes spelled “boom” or “boomb,” suggesting that it was pronounced that way too. The OED cites both spellings in an anonymous 1692 diary of the siege and surrender of Limerick: “600 Booms” … “800 Carts of Ball and Boombs.”

And the dictionary points readers to rhymes in poetry, where “bomb” is sometimes rhymed with “tomb” and “womb,” which were pronounced TOOM and WOOM at the time.

Here’s an Oxford citation from “The British Sailor’s Exultation,” a poem Edward Young wrote sometime before his death in 1765: “A thousand deaths the bursting bomb / Hurls from her disembowel’d womb.”

We’ve found a couple of additional examples in poetry of the 1690s.

In a 1692 poem written in rhyming couplets and based on Virgil’s Dido and Aeneas, John Crown rhymes “bomb’d” with “entomb’d.” Here are the lines: “The wealthy Cities insolently bomb’d, / The Towns in their own ashes deep entomb’d.”

And Benjamin Hawkshaw’s poem “The Incurable,” written in rhyming triplets, rhymes “womb,” “tomb,” and “bomb.” These are the lines: “It works like lingring Poyson in the Womb, / And each Day brings me nearer to my Tomb, / My Magazin’s consum’d by this unlucky Bomb.” (From Poems Upon Several Occasions, 1693.)

What’s more, the word “boom” (for a loud hollow noise) was sometimes spelled “bomb” or “bombe,” which suggests that the pronunciations occasionally coincided.

This example, cited in the OED, is from Francis Bacon’s Sylva Sylvarum, a natural history, or study of the natural world, published in 1627, a year after his death:

“I remember in Trinity Colledge in Cambridge, there was an Vpper Chamber, which being thought weake in the Roofe of it, was supported by a Pillar of Iron … Which if you had strucke, it would make a little flat Noise in the Roome where it was strucke; But it would make a great Bombe in the Chamber beneath.” (We’ve expanded the citation to give more context.)

And we found this example in a work that discusses sound production, Walter Charleton’s A Fabrick of Science Natural (1654): “As in all Arches, and Concamerated or vaulted rooms: in which for the most part, the sound or voyce loseth its Distinctness, and degenerates into a kind of long confused Bombe.”

In short, it’s safe to say that “bomb” was probably pronounced BOOM by some educated speakers in the 17th century.

As we’ve noted, the word didn’t appear until 1588, during the early modern English period. As far as we know, the final “b” was never pronounced. But the other words you mention, “womb” and “tomb,” are much older, and the “b” in their spellings was originally pronounced.

In the case of “womb,” a Germanic word that dates back to early Old English, it originally had a different vowel sound, too. But beginning in the Middle English period (roughly 1150 to 1500), the “oo” vowel sound developed and the “b” became silent.

As for “tomb,” a Latin-derived word that English borrowed from the French toumbe around 1300, it came with the “oo” vowel sound, and the “b” became silent in later Middle English. The “b” remained in the spelling, though in the 16th and 17th centuries the word occasionally appeared as “toom” or “toome,” according to OED citations.

Several other words ending in “b” (“lamb,” “dumb,” “comb,” “climb,” “plumb”) originally had an audible “b,” but it became silent during the Middle English period. Linguists refer to this shift in pronunciation from “mb” to “m” as an example of “consonant cluster reduction.”

We wrote a post in 2009 about other kinds of spelling puzzles—why “laughter” and “daughter” don’t rhyme, and why silent letters appear in words like “sword” and “knife.” And in 2017 we discussed “-ough” spellings (“enough,” “ought,” “though,” “through,” etc.), which are pronounced in many different ways.

Help support the Grammarphobia Blog with your donation.
And check out our books about the English language.

Subscribe to the Blog by email

Enter your email address to subscribe to the Blog by email. If you are an old subscriber and not getting posts, please subscribe again.


What the rooster useter do

Q: I run a class for language-obsessed retirees in Australia, where “useter” is commonly used for “used to,” as in “I useter drive a Volvo” or “Didn’t you useter drive a Volvo?” May I ask you to write about this usage?

A: The word spelled “useter” represents the way some people pronounce “used to”—same meaning, different spelling. And it’s found in the US and Britain as well as in Australia.

So a sentence spoken as “I useter drive a Volvo” would be written more formally as “I used to drive a Volvo.” And the question “Didn’t you useter drive a Volvo?” would be written as “Didn’t you use to drive a Volvo?”

The spelling “useter” arose as a variant “representing a colloquial pronunciation of used to,” the Oxford English Dictionary explains. When “useter” appears in the dictionary’s written examples, it’s always an attempt to imitate the spoken usage.

The OED cites published examples of “useter” in both American and British English dating from the mid-19th century. In its earliest appearance, the word is spelled “use ter”:

“You don’t know no more ’bout goin’ to sea than I knows about them ’Gyptian lookin’ books that you use ter study when you went to College.” (From an 1846 novel, The Prince and the Queen, by the American writer and editor Justin Jones, who wrote fiction under the pseudonym Harry Hazel.)

The dictionary’s most recent example is from a British newspaper, the Evening Gazette (Middlesbrough), dated June 14, 2003: “They useter ’ave a big Rockweiler … but it got nicked.”

Among the OED’s examples is one spelled “useta,” representing what’s probably the more common American pronunciation:

“I useta beg her to keep some of that stuff in a safe-deposit box.” From The Burglar in the Closet (1980), by the American mystery writer Lawrence Block.

As we said in a recent post, this sense of “use” in the phrase “used to” refers to an action in the past that was once habitual but has been discontinued.

We won’t say any more about the etymology of “use,” since we covered it in that post. But we’ll expand a bit on the sense of “use” as a verb that roughly means “customarily do.”

This sense of “use” has died out in the present tense. A 17th-century speaker might have said, “John uses to drink ale,” but today the present-tense version would be “John usually [or customarily or habitually] drinks ale.”

In modern English, this sense of “use” is found only in the past tense: “used” or “did use.” We now say, for example, “Normally he drives a Ford, but he used [or did use] to drive a Volvo.”

Since the “d” in “used to” is not pronounced, the phrase sounds like “use to,” and people sometimes write it that way in error.

As the OED explains, the “d” and the “t” sounds in “used to” became “assimilated” in both British and American English, and “attempts to represent these pronunciations in writing gave rise to use to as a spelling for used to.” The “use to” spelling “occurs from at least the late 17th cent. onwards,” the dictionary says.

Another irregularity is that people commonly—but redundantly—use “did” and “used” together, as in “Did he used to drive a Volvo?” But with “did,” the normal form is “use” (“Did he use to drive a Volvo?”).

As Pat explains in her book Woe Is I, “did use” is another way of saying “used,” just as “did like” is another way of saying “liked.” And just as we don’t write “did liked,” we shouldn’t write “did used.” She gives this usage advice:

  • If there’s no “did,” choose “used to” (as in “Isaac used to play golf”).
  • If there’s a “did,” choose “use to” (as in “Isaac did use to play golf” … “Did Isaac use to play squash?” … “No, he didn’t use to play squash”).

As you’ve noticed, questions and negative statements like those last two are sometimes constructed differently.

Americans, and many speakers of British English, typically say, “Did he use to drive a Volvo?” … “No, he didn’t use to drive a Volvo.”

But sometimes, sentences like these get a different treatment in British English: “Used he to drive a Volvo?” … “Usedn’t he to drive a Volvo?” … “No, he used not [or usedn’t] to drive a Volvo.”

What’s happening in those negative examples? The OED says that “not” sometimes directly modifies “use,” resulting in “the full form used not … although usedn’t occasionally occurs as well as usen’t.”

In closing, we’ll share a few lines from Irving Berlin’s 1914 song “I Want to Go Back to Michigan (Down on the Farm)”:

I miss the rooster,
The one that useter
Wake me up at four A.M.


Hear Pat on Iowa Public Radio

She’ll be on Talk of Iowa today from 10 to 11 AM Central time (11 to 12 Eastern) to discuss the English language and take questions from callers. Today’s topic: new words and how they make it into the dictionary.


Me, myself, and I

Q: In the 1960s, I began noticing the use of “myself” as a cover for the inability of the speaker/writer to know whether “I” or “me” is correct. Can you predate that?

A: English speakers have been using “myself” in place of the common pronouns “I” and “me” since the Middle Ages, and the usage wasn’t questioned until the late 1800s, according to Merriam-Webster’s Dictionary of English Usage.

Critical language mavens argued that “myself” (and other “-self” words) should be used only for emphasis (“Let me do it myself”) or reflexively—that is, to refer back to the subject (“She saw herself in the mirror”).

Alfred Ayres was apparently the first language writer to question the broader usage. In The Verbalist, his 1881 usage manual, Ayres criticizes the routine use of “myself” for “I”:

“This form of the personal pronoun is properly used in the nominative case only where increased emphasis is aimed at.”

Some modern usage writers still insist that “-self” pronouns should be used only emphatically or reflexively, but others accept their broader use as subjects and objects.

In Garner’s Modern English Usage (4th ed.), Bryan A. Garner objects to the wider usage, while in Fowler’s Dictionary of Modern English Usage (4th ed.), Jeremy Butterfield accepts it.

We believe that “myself” and company should primarily be used for emphasis or to refer back to the subject. And we suspect that some people fall back on “myself” when they’re unsure whether “I” or “me” would be grammatically correct.

However, there’s nothing grammatically wrong with using “myself” and other reflexive pronouns more expansively for euphony, style, rhythm, and so on. Respected writers have done just that for centuries, both before and after the language gurus raised objections.

This example is from a letter written on March 2, 1782, by the lexicographer Samuel Johnson: “Williams, and Desmoulins, and myself are very sickly.”

And here are some of the many other examples that Merriam-Webster has collected from writers who were undoubtedly aware of the proper uses of “I” and “me”:

“the pot I placed with Miss Williams, to be eaten by myself” (Samuel Johnson, letter, Jan. 9, 1758);

“both myself & my Wife must” (William Blake, letter, July 6, 1803);

“no one would feel more gratified by the chance of obtaining his observations on a work than myself” (Lord Byron, letter, Aug. 23, 1811);

“Mr. Rushworth could hardly be more impatient for the marriage than herself” (Jane Austen, Mansfield Park, 1814);

“it will require the combined efforts of Maggie, Providence, and myself” (Emily Dickinson, letter, April 1873);

“I will presume that Mr. Murray and myself can agree that for our purpose these counters are adequate” (T. S. Eliot, Selected Essays, 1932);

“with Dorothy Thompson and myself among the speakers” (Alexander Woollcott, letter, Nov. 11, 1940);

“which will reconcile Max Lerner with Felix Frankfurter and myself with God” (E. B. White, letter, Feb. 4, 1942);

“The Dewas party and myself got out at a desolate station” (E. M. Forster, The Hill of Devi, 1953);

“When writing an aria or an ensemble Chester Kallman and myself always find it helpful” (W. H. Auden, Times Literary Supplement, Nov. 2, 1967).

In those examples, “myself” is being used for “I” or “me” in three ways: (1) for “I” as the subject of a verb; (2) for “me” as the object of a verb; and (3) for “me” as the object of a preposition.

When “myself” is used as a subject, it’s usually accompanied by other pronouns or nouns, as in the Auden example above. However, the M-W usage guide notes that “myself” is sometimes used alone in poetry as the subject of a verb:

“Myself hath often heard them say” (Shakespeare, Titus Andronicus, 1594);

“My selfe am so neare drowning?” (Ben Jonson, Ode, 1601);

“Myself when young did eagerly frequent” (Edward FitzGerald, translation, Rubáiyát of Omar Khayyám, 1859);

“Somehow myself survived the night” (Emily Dickinson, poem, 1871).

The Cambridge Grammar of the English Language describes “-self” pronouns used expansively as “override reflexives”—that is, “reflexives that occur in place of a more usual non-reflexive in a restricted range of contexts where there is not the close structural relation between reflexive and antecedent that we find with basic reflexives.”

The authors, Rodney Huddleston and Geoffrey K. Pullum, say (as we do above) that the substitution of “myself” for “me” and “I” may sometimes be the result of uncertainty about the rules for using the two common pronouns.

“Much the most common override is 1st person myself,” Huddleston and Pullum write. “The reflexive avoids the choice between nominative and accusative me, and this may well favour its use in coordinate and comparative constructions, where there is divided usage and hence potential uncertainty for some speakers as to which is the ‘approved’ case.”

The use of override reflexives, especially “myself,” has been “the target of a good deal of prescriptive criticism,” the authors say, adding: “there can be no doubt, however, that it is well established.”

The M-W usage guide, which accepts the moderate use of “myself” for “I” and “me,” notes that the prescriptive criticism has often been contradictory, relying on such labels as “snobbish, unstylish, self-indulgent, self-conscious, old-fashioned, timorous, colloquial, informal, formal, nonstandard, incorrect, mistaken, literary, and unacceptable in formal written English.”

It’s hard to tell when people confused by “I” and “me” began using “myself” as a substitute. But it may have begun in the late 19th century, prompting those early complaints about the usage. Some of those adjectives used by critics (“nonstandard,” “incorrect,” “mistaken,” etc.) may have referred to the English of people with a shaky grasp of grammar.

As for the early etymology, all three of those first-person singular pronouns showed up in Anglo-Saxon times—“I” as the Old English ic, ih, or ich; “me” as mē or mec; “myself” as mē self. In the 12th century the ic spelling was shortened to i, and it gradually began being capitalized in the 13th century, as we wrote in a 2011 post.

In Old English, spoken from roughly the 5th to the 12th centuries, “myself” was used emphatically or reflexively. In Middle English, spoken from about the 12th to the 15th centuries, “myself” was also used as a subject of a verb, an object of a verb, and an object of a preposition.

Here’s an early example from the Oxford English Dictionary of “myself” used as the subject of a verb: “Sertes, my-selue schal him neuer telle” (“Certainly, myself shall never tell him”). It’s from The Romance of William of Palerne, a poem translated from French sometime between 1350 and 1375.

And this is an example of “myself” as the object of a verb: “Mine þralles i mire þeode me suluen þretiað” (“My servants and my people shall threaten myself”). From Layamon’s Brut, a poem written sometime before 1200.

Finally, here’s “myself” used as the object of a preposition: “Þe londes þat he has he holdes of mi-selue” (“The lands that he has he holds for myself”). Also from The Romance of William of Palerne.


‘More’ or ‘-er’? ‘Most’ or ‘-est’?

Q: Is there a rule for when to use “more” and “most” to form comparatives and superlatives, and when to use “er” and “est”? Why do we have two ways to do this?

A: There’s no “rule” about using “more” and “most” versus “-er” and “-est” to express the comparative and superlative. But there are some common conventions.

With “most adjectives and adverbs of more than one syllable, and with all those of more than two syllables,” the Oxford English Dictionary says, “the normal mode” of forming the comparative and superlative is by using “more” and “most.”

A few one-syllable words (like “real,” “right,” “wrong,” and “just”) also normally form comparatives and superlatives with “more” and “most” instead of with “-er” and “-est” suffixes, according to the OED.

The dictionary adds that “more” is also sometimes used with words of one or two syllables that would normally have “-er” comparatives, like “busy,” “high,” “slow,” “true,” and so on. Why? Here’s how Oxford explains it:

“This form is often now used either for special emphasis or clearness, or to preserve a balance of phrase with other comparatives with ‘more,’ or to modify the whole predicate rather than the single adjective or adverb, especially when followed by than.”

So we might choose “much more humble” instead of “much humbler.” Or we might say “so-and-so’s voice was more quiet but no less threatening.” Or “that’s more true than false.” Or even “his feet are more big than ungainly.”

The OED offers additional details about the use of the “-er” and “-est” suffixes with adjectives and adverbs.

In modern English, the dictionary says, “the comparatives in -er are almost restricted to adjectives of one or two syllables,” while longer adjectives as well as two-syllable adjectives not ending in “-ly” or “-y” form the comparative “by means of the adverb more.”

The same goes for the “-est” suffix, which is used similarly to form the superlative of adjectives (Oxford points to its “-er” comparative entry for the “present usage” of the “-est” superlative).

As for the use of “-er” and “-est” with adverbs, those that have the same form as corresponding adjectives (“hard,” “fast,” “close,” etc.) chiefly form the comparative and superlative with “-er” and “-est,” while adverbs that end in “-ly” form the comparative with “more” and the superlative with “most.”

There are quite a few exceptions, of course. For a more comprehensive guide to how the comparative and superlative are expressed in English today, check out Jeremy Butterfield’s entry for “-er and -est, more and most” in Fowler’s Dictionary of Modern English Usage (4th ed.).

How did we end up with two ways to express the comparative and superlative in English? In a 2008 post, we discuss the etymology of “more” and “most” as well as the history of the suffixes “-er” and “-est.”

As we say in that post, the “-er” and “-est” suffixes have been used to make comparisons since the earliest days of English, and it’s a practice handed down from ancient Indo-European.

The Old English endings were originally spelled differently than they are today: -ra for the comparative, and -ost (sometimes -est) for the superlative.

Taking the word “old” as an example, the Old English forms were eald (“old”), yldra (“older”), yldest (“oldest”). And taking “hard” as another, the forms were heard (“hard”), heardra (“harder”), heardost (“hardest”).

Meanwhile, there was another set of Old English words: micel (meaning “great” or “big”), mara (“more”), and maest (“most”).

While “more” and “most” (or their ancestors) were around since the earliest days of English, it wasn’t until the early 1200s that we began using them as adverbs to modify adjectives and other adverbs in order to form comparatives and superlatives—that is, to do the job of the “-er” and “-est” suffixes.

For a few centuries, usage was all over the place. In fact, it wasn’t uncommon for even one-syllable words to be used with “more” and “most,” according to The Origins and Development of the English Language, by Thomas Pyles and John Algeo. The authors cite the frequent use of phrases like “more near,” “more fast,” “most poor,” and “most foul.”

And multi-syllable words were once used with “-er” and “-est,” like “eminenter,” “impudentest,” and “beautifullest.”

Pyles and Algeo say there were even “a good many instances of double comparison, like more fitter, more better, more fairer, most worst, most stillest, and (probably the best-known example) most unkindest.”


When Dickens don’t use ‘doesn’t’

Q: While reading Dickens, I’ve noticed the use of “don’t” where we would now use “doesn’t.” In The Mystery of Edwin Drood, for example, the boastful auctioneer Thomas Sapsea says, “it don’t do to boast of what you are.”

A: What standard dictionaries say today about these contractions is fairly clear cut:

  • “Doesn’t” (for “does not”) should be used in the third person singular—with “he,” “she,” “it,” and singular nouns.
  • “Don’t” (for “do not”) is correct in all other uses—with “I,” “we,” “you,” “they,” and plural nouns. In the third person singular, “don’t” is considered nonstandard.

As you’ve noticed, however, it’s not unusual to find “don’t” used in place of “doesn’t” in 18th- and 19th-century fiction, like the example you found in that unfinished 1870 novel.

Was the usage ever “correct”? As is often the case with English, this is not a “yes or no” question.

In our opinion, this way of using “don’t” was always somewhat irregular (the Oxford English Dictionary suggests that it was regional or nonstandard from the start).

And as we’ll explain later, we think that in your example Dickens used “it don’t” colloquially to show that Mr. Sapsea didn’t speak the very best English.

The history of these contractions begins two centuries before Dickens. Both were formed in the 17th century, at a time when all forms of “do” were unsettled, to say the least.

For one thing, “does” and “doth”—both spelled in a variety of ways—were competing for prominence, as Merriam-Webster’s Dictionary of English Usage points out.

For another, some writers used the bare (or uninflected) “do” as the third person singular, according to M-W. The usage guide cites Samuel Pepys, writing in 1664: “the Duke of York do give himself up to business,” and “it seems he [the king] do not.”

With the verb itself so unsettled, it’s not surprising that the state of the contractions was even more chaotic.

In fact, M-W suggests that the use of the uninflected “do” for “does,” as in the Pepys citations, may have influenced the use of “don’t” as a contracted “does not.”

It’s significant that “don’t” was on the scene first; for a long while it was the only present-tense contraction for “do.” It was used as short for “do not” and (rightly or wrongly) for “does not.”

The earliest known written uses of “don’t” are from plays of the 1630s, though spoken forms were surely around long before that. And in the earliest OED examples, it’s used in the standard way—as short for “do not.”

The dictionary’s first example is dated 1633: “False Eccho, don’t blaspheme that glorious sexe.” (From Jasper Fisher’s Fuimus Troes, a verse drama; though published in 1633, it was probably performed a decade or so earlier.)

The next example is from William Cartwright’s The Ordinary, believed written about 1635: “Don’t you see December in her face?”

The OED also has a citation (with “I don’t”) from a comedy first acted in 1635 and published in 1640, Richard Brome’s The Sparagus Garden. And we’ve found a couple of interrogative uses (“dont you” and “dont they”) in a 1639 comedy, Jasper Mayne’s The City Match.

But “doesn’t,” with various spellings, wasn’t recorded until decades later—spelled “dozn’t” in 1678 and “doesn’t” in 1694, according to OED citations.

Even after “doesn’t” came on the scene, it apparently wasn’t common until at least a century later. Most uses of “doesn’t” that we’ve found in historical databases are from the 1760s or later, and it didn’t start appearing regularly (at least in writing) until the 1800s.

Before then, most writers used the uncontracted form, “does not,” even in fictional dialogue. The use of “don’t” in the third person singular was apparently irregular. The OED cites “he don’t,” “she don’t,” and “it don’t” among examples of regional or nonstandard uses, dating from 1660.

But to be fair, it seems only natural that mid-17th century British writers seeking a contraction for “does not” would use “don’t” in colloquial dialogue if “doesn’t” was unknown to them.

And no one can dispute that the earliest contraction people used for “does not” was “don’t.” Many continued to do so long after “doesn’t” came into the language.

M-W says, for example, that from the 17th through 19th centuries, the third person singular “don’t seems to have had unimpeachable status.” It cites examples (mostly in letters) by Horace Walpole, Charles Lamb, George Bernard Shaw, and Oliver Wendell Holmes.

Only after the usage was condemned in the latter half of the 19th century, M-W says, was this sense of “don’t” considered nonstandard.

We don’t agree entirely with M-W here. We’ve found hints that this use of “don’t” was regarded as less than exemplary by novelists of the 18th century.

For example, there are no irregular uses of “don’t” in Daniel Defoe’s Robinson Crusoe (1719), in his Moll Flanders (1722), or in Laurence Sterne’s Tristram Shandy (completed in 1767).

All three novels freely use “don’t” in the standard way and “does not” in the third person singular.

In Samuel Richardson’s novel Pamela (1740), we counted 14 examples of “don’t” in the third person singular—all but four used by servants—compared with 54 of “does not.”

We found no irregular uses of “don’t” in Henry Fielding’s Joseph Andrews (1742) and only two in his Tom Jones (1749)—spoken by a clerk and a servant.

Tobias Smollett’s Humphry Clinker (1771) has four uses of this irregular “don’t,” three by servants and one by an eccentric duke. Otherwise Smollett uses “does not” in the third person singular.

So apparently the principal novelists of the 18th century did not consider the third person singular “don’t” a normal usage, except sometimes among the rural or working classes. (None of them ever used “doesn’t” in writing, as far as we can tell.)

Even in 19th-century fiction, it’s mostly working-class characters who use “don’t” in a nonstandard way (though the occasional aristocrat uses it in a slangy, casual manner).

Let’s consider your quotation from Charles Dickens. When he wrote The Mystery of Edwin Drood, he deliberately put the nonstandard “it don’t” into the mouth of Mr. Sapsea, a conceited fool who is convinced he’s brilliant and has pretensions to good breeding. The character is introduced with these words:

“Accepting the Jackass as the type of self-sufficient stupidity and conceit—a custom, perhaps, like some few other customs, more conventional than fair—then the purest Jackass in Cloisterham is Mr. Thomas Sapsea, Auctioneer.”

Sapsea isn’t the only character in the novel to use this irregular “don’t,” but the others are mostly laborers or servants. Those with higher education (teachers, clergy, etc.) use “does not.”

You don’t have to read 18th- or 19th-century fiction, however, to find nonstandard uses of “don’t.” They can be found in modern writing, too, mostly when the author intends to convey dialectal, regional, or uneducated English.

Graham Greene’s novel Brighton Rock (1938), for instance, has many examples in the speech of working-class characters: “That don’t signify” … “it don’t make any odds” … “it don’t seem quite fair.”

But modern British authors sometimes use this irregular “don’t” in portraying sophisticated, affluent characters who are deliberately (even affectedly) careless or casual in their speech.

Take, for example, Lord Peter Wimsey, the aristocratic, Oxford-educated detective in Dorothy L. Sayers’s novels of the ’20s and ’30s. He not only drops a “g” here and there (“an entertainin’ little problem”), but he often uses “don’t” in the third-person singular.

To cite just a handful of examples: “gets on your nerves, don’t it?” … “it don’t do to say so” … “when he don’t know what else to say, he’s rude” … “it don’t do to wear it [a monocle] permanently” … “it don’t do to build too much on doctors’ evidence” … “it don’t account for the facts in hand.”

Lord Peter isn’t an 18th-century character. He’s a 20th-century snob, and when he uses such English, he’s slumming linguistically.


How to say you’re not quite sure

Q: What is the difference in meaning between “John didn’t come yesterday—he must have been ill” and “John didn’t come yesterday—he will have been ill”? I realize that “must” is more popular than “will” in such constructions, but does one express more certainty than the other?

A: The words “will” and “must” in your examples are epistemic modal verbs, auxiliary verbs that express probability.

As Rodney Huddleston and Geoffrey K. Pullum explain in the Cambridge Grammar of the English Language, “epistemic modality qualifies the speaker’s commitment to the truth.”

“While It was a mistake represents an unqualified assertion,” Huddleston and Pullum write, “It must have been a mistake suggests that I am drawing a conclusion from evidence rather than asserting something of whose truth I have direct knowledge.”

In your first example (“John didn’t come yesterday—he must have been ill”), the auxiliary “must” indicates that the writer (or speaker) believes John was probably ill.

In our opinion, the expression “he will have been ill” indicates somewhat more probability than “he must have been ill” (though some might argue the point). And both of them indicate a much greater probability than “he may have been ill”—another example of epistemic modality.

Huddleston and Pullum note that epistemic modality is “commonly expressed by other means than modal auxiliaries.” For example, by adverbs (“he was probably ill”), verbs (“I believe he was ill”), adjectives (“he was likely to be ill”), and nouns (“in all likelihood, he was ill”).

There are two other principal kinds of modality: deontic, which expresses permission or obligation (“He may have one more chance, but he must come tomorrow”), and dynamic, which expresses willingness or ability (“I won’t come today, but I can come tomorrow”).

In A Comprehensive Grammar of the English Language, the authors, Randolph Quirk et al., say, “At its most general, modality may be defined as the manner in which the meaning of a clause is qualified so as to reflect the speaker’s judgment of the likelihood of the proposition it expresses being true.”

Quirk divides the modal verbs into two types:

“(a) Those such as ‘permission,’ ‘obligation,’ and ‘volition’ which involve some kind of intrinsic human control over events, and

“(b) Those such as ‘possibility,’ ‘necessity,’ and ‘prediction,’ which do not primarily involve human control of events, but do typically involve human judgment of what is or is not likely to happen.”

Quirk adds that the two categories “may be termed intrinsic and extrinsic modality respectively,” since “each one of them has both intrinsic and extrinsic uses: for example, may has the meaning of permission (intrinsic) and the meaning of possibility (extrinsic); will has the meaning of volition (intrinsic) and the meaning of prediction (extrinsic).”

“However, there are areas of overlap and neutrality between the intrinsic and extrinsic senses of a modal: the will in a sentence such as I’ll see you tomorrow then can be said to combine the meanings of volition and prediction.”

Another point to consider, Quirk writes, “is that the modals themselves tend to have overlapping meanings, such that in some circumstances (but not in others), they can be more or less interchangeable.”

In other words, there’s a lot of ambiguity here. Or, as Quirk puts it, “the use of modal verbs is one of the more problematic areas of English grammar.”

Help support the Grammarphobia Blog with your donation.
And check out our books about the English language.



The truth about trees

Q: The words “tree” and “true,” according to a TED video, have a common ancestor. From what I gather, this was back in prehistoric times, before there was writing. So how do we know the first thing about an ancient language if there’s no written record of it?

A: Historical linguists believe that “tree” and “true” have a common prehistoric ancestor, a belief that’s based on studies of a reconstructed, hypothetical ancient language known as Proto-Indo-European (PIE for short), not on any written evidence.

By studying members of the present-day Indo-European family, linguists have extrapolated back to the presumed prehistoric language. The Indo-European family comprises many European and Asian languages, including English.

In reconstructing the PIE vocabulary, for example, linguists have used the comparative method, a technique that compares related words in various Indo-European languages in order to work out the ancestral forms from which they descend.

As the linguist and phonologist Calvert Watkins explains, similar words, or cognates, in present Indo-European languages “provide evidence for the shape of the prehistoric Indo-European word.”

Watkins, author of The American Heritage Dictionary of Indo-European Roots, says “tree” and “true” are ultimately derived from deru-, a Proto-Indo-European root meaning to “be firm, solid, steadfast.”

This PIE root gave prehistoric Germanic the terms trewam (“tree”) and treuwithō (“truth”), and Old English the words trēow (“tree”) and trēowe (“true”), Watkins writes. (Old English spellings vary considerably.)

The earliest example of “tree” in the Oxford English Dictionary (with the plural treo) is from the Vespasian Psalter, an illuminated manuscript dated around 825:

“Muntas and alle hyllas, treo westemberu and alle cederbeamas” (“Mountains and all hills, fruit trees and all cedars”). We’ve expanded the citation, from Psalm 148.

And in Pastoral Care, King Alfred’s late ninth-century translation of a sixth-century work by Pope Gregory, an unbeliever is compared to a barren tree:

“Ælc triow man sceal ceorfan, þe gode wæstmas ne birð, & weorpan on fyr, & forbærnan” (“Every tree that does not bear good fruit shall be cut down and cast into the fire and burnt”). We’ve expanded the OED citation, which refers to Matthew 7:19. The dictionary describes triow here as a variant reading of treow.

When “true” showed up in Old English, it meant loyal, faithful, or trustworthy. Here’s an example from Beowulf, an epic poem that may have been written as early as 725:

“Þa gyt wæs hiera sib ætgædere, æghwylc oðrum trywe” (“The two were at peace together, true to each other”). Here trywe is a variant spelling of trēowe.

As Old English gave way to Middle English in the 12th century, the word “true” came to mean accurate or factual, as in this example from Layamon’s Brut, an early Middle English poem written sometime before 1200:

“Belin ihærde sugge þurh summe sæg treowe of his broðer wifðinge” (“Belin heard it said through some true report of his brother’s marriage”). The passage refers to Belin and Brennes, brothers who vie in British legend to rule Britain.

And this example is from Wohunge Ure Lauerd (“The Wooing of Our Lord”), a Middle English homily written sometime before 1250. The author, presumably a woman, tells Jesus of her passionate love for him:

“A swete ihesu þu oppnes me þin herte for to cnawe witerliche and in to reden trewe luue lettres” (“Ah, sweet Jesus, you open your heart to me, so that I may know it inwardly, and read inside it true love letters”).