The Grammarphobia Blog

Ing-lish spoken here

Q: What do you think of the recent Doonesbury strip on the use of present participles in TV talk? I’ve been foaming at the mouth over this for years.

A: We’re not foaming at the mouth, but too much of any trendy usage can be annoying.

The linguist Geoffrey Nunberg, who commented on this usage more than a dozen years before the cartoonist Garry Trudeau, coined a term for the use of “-ing” participles in broadcasting: “ing-lish.”

In a Dec. 8, 2002, article in the New York Times, Nunberg notes that “the all-news networks have begun to recite their leads to a new participial rhythm.”

“Fox News Channel and CNN have adopted it wholesale, and it is increasingly audible on network news programs as well,” he says.

A sentence like “The Navy has used the island for 60 years but will cease its tests soon,” Nunberg explains, comes out in ing-lish as “The Navy using the island for 60 years but ceasing its tests soon.”

“What ing-lish really leaves out is all tenses, past, present or future, and with them any helping verbs they happen to fall on—not just be, but have and will,” he says.

Interestingly, Nunberg adds, this usage “doesn’t actually save any time—sometimes, in fact, it makes sentences longer. ‘Bush met with Putin’ is one syllable shorter than ‘Bush meeting with Putin.’ ”

If it doesn’t save time, why do broadcast journalists use ing-lish?

The linguist Asya Pereltsvaig suggests that it may be because the present progressive tense (“I am dancing”) denotes “something that happens at this very moment” while the simple present (“I dance”) refers to “a broader range of temporal points.”

In a Sept. 21, 2015, post on Languages of the World, she explains that the present tense (“I dance”) can refer to dancing “often/every day/from time to time” and so on.

The linguist Mark Liberman, in a Sept. 20, 2015, comment on the Language Log about the Doonesbury strip, says the “idea that short phrases convey urgency is a well-established principle of writing advice.”

“But it’s not obvious to me that either in headlines or in broadcast news, the use of present participles rather than tensed verbs is generally the more urgent-seeming choice,” he says.

Liberman gives these two examples to make his point: “The town reels, its dreams of a better tomorrow up in smoke!” versus “The town reeling, its dreams of a better tomorrow up in smoke!”

He also points out that “there are famous examples where a sense of urgency is associated with long run-on sentences,” like Molly Bloom’s soliloquy at the end of James Joyce’s Ulysses.

And yes we’ll end with the last few lines of the soliloquy: “and then I asked him with my eyes to ask again yes and then he asked me would I yes to say yes my mountain flower and first I put my arms around him yes and drew him down to me so he could feel my breasts all perfume yes and his heart was going like mad and yes I said yes I will Yes.”

Help support the Grammarphobia Blog with your donation.
And check out our books about the English language.

When the present is past

Q: I’m trying to figure out this sentence: “Something Grandma let me do that my parents wouldn’t is/was eat cake.” Which is it? I spent an hour online looking for the answer, and now I’m more confused than before!

A: Let’s reduce your sentence to its relevant parts: “Something Grandma let me do was/is eat cake.” (The intervening clause, “that my parents wouldn’t,” does not affect the grammar here.)

Do we choose “was eat cake” because the principal verb (“let”) is in the past tense? Or do we choose “is eat cake” because at the time Grandma allowed it, the question of cake-eating existed in the present?

You might argue either way. But since you’re talking about something Grandma allowed in the past (and probably the distant past), we think “was” is the better choice: “Something Grandma let me do was eat cake.”

Note that we said “the better choice,” not “the grammatically correct choice.” In our opinion, “was” is more natural here, but we’ve found no hard-and-fast rule about this, at least not one that’s convincing.

A similar but harder question has to do with a situation that is relevant to the present, but is mentioned in the past tense because the sentence’s main verb is in the past tense. This is what we mean:

“Pasteur believed that intense heat was the key to killing bacteria” (it still is the key) … “Skeptics denied that the Earth revolved around the sun” (it still does) … “Claire didn’t know where Idaho was” (it’s still there).

The grammarian Otto Jespersen says that in this kind of sentence even an “eternal truth” may be expressed in the past tense. He cites the example “My father convinced me that nothing was useful which was not honest.” (Essentials of English Grammar, 1933.)

And we find the linguist Renaat Declerck saying much the same 60 years later. “Contrary to what is sometimes claimed,” he writes, “the past tense can be used even if the complement clause expresses an ‘eternal truth.’ Using the present tense is never obligatory.” (Tense in English, 1991.)

In fact, the tense in the lesser clause is more likely to echo the past tense of the main clause. Or, as Declerck puts it, “temporal subordination is the default choice.”

Jespersen gives these examples of cases in which the situation in the second clause is still relevant to the present, and yet the past tense is used: “I tried to forget who I was” and “What did you say was your friend’s name?”

Obviously, the speaker could just as well have used the present tense, but didn’t. Why not?

Jespersen suggests that frequently the use of the past tense here “is due simply to mental inertia: the speaker’s mind is moving in the past, and he does not stop to consider whether each dependent statement refers to one or the other time, but simply goes on speaking in the tense adapted to the main idea.”

“A typical example,” Jespersen writes, “is found when the speaker discovers the presence of someone and exclaims, ‘Oh, Mr Summer, I didn’t know you were here.’ ”

Jespersen, Declerck, and others suggest that this sort of tense adaptation is especially common in indirect or reported speech—that is, a second-hand report of what someone said.

The Cambridge Grammar of the English Language says that in the following examples of reported speech, the choice of tense in the second clause is optional:

“Jill said that she had too many commitments” … “Jill said that she has too many commitments.”

Both sentences are correct, though the first suggests that Jill had too many commitments in the past, while the second suggests that she may still have too many.

As the Cambridge Grammar says, “the two reports do not have the same meaning, but in many contexts the difference between them will be of no pragmatic significance.”

However, in the sentence “Jill said she had/has a headache,” the book notes that “Jill’s utterance needs to have been quite recent for has to be appropriate.”

Declerck says the decision about the tense of the secondary clause “will generally be based on pragmatic considerations.” For instance, a speaker might shift to the present tense to indicate he thinks the situation is still valid or relevant.

He uses the example “He said that Betty is a very clever girl.” This shift “from a past to a present domain,” Declerck says, “is optional, since the speaker could also have chosen to keep the domain constant,” as in “He said that Betty was a very clever girl.”

As you can see, the choice here isn’t always clear.

It’s safe to say that when we speak of the immediate or recent past, we’re more likely to use the present tense in the lesser clause (“He learned that he has cancer”).

But when a shift to the present would be jarring, we stick to the past tense even when the situation is still true (“She knew Wednesday was his poker night”).

We can think of further examples in which the tense is optional but the choice of one over the other makes a difference, as in this sentence:

“He said on the Today show that gluten is becoming a national obsession.” (The use of “is” stresses that the situation is still unfolding.)

And in the next two sentences, the differing tenses indicate differing views of an event:

“Did you know that what you were doing is wrong?” (The speaker is stressing that it’s still wrong.)

“Did you know that what you were doing was wrong?” (The speaker is emphasizing what the person knew at the time.)

Finally, we’ll end this post with an example in which the secondary verb is clearly better in the present tense.

We were just thinking that it’s time to sign off.


Do we doff only hats?

Q: Why is the verb “doff” almost exclusively linked to hats of one sort or another? It’s a great word and I was wondering about its history.

A: The verb “doff” has been used with all sorts of clothing since it showed up in English in the 1300s.

The Oxford English Dictionary describes “doff” as “a literary word with an archaic flavour,” and defines it as “to put off or take off from the body (clothing, or anything worn or borne); to take off or ‘raise’ (the head-gear) by way of a salutation or token of respect.”

The earliest example of the usage in the OED is from The Romance of William of Palerne, a poem written around 1375 and edited in the 19th century by the English philologist Walter William Skeat:

“Dof bliue þis bere-skyn” (“Doff quickly this bearskin”). The reference here is to a wrap made from the skin of a bear; the term “bearskin” didn’t refer to a hat until the 19th century.

In fact, most of the citations for the verb in the OED refer to doffing items of clothing other than hats.

In the history play King John (believed written in the 1590s), Shakespeare refers to a cloak of lion’s hide: “Thou weare a Lyons hide! doff it for shame.”

And in the epic poem Marmion (1808), Sir Walter Scott uses the term for both outerwear and headgear: “Doffed his furred gown, and sable hood.”

There are even examples for doffing things other than clothing. Shakespeare’s Macbeth (late 1500s to early 1600s) refers to making Scottish women fight “to doffe their dire distresses.” And in Romeo and Juliet (1590s), Juliet says, “Romeo doffe thy name.”

As for today, the verb “doff” is often associated with hats, but not “almost exclusively,” as you seem to believe. Here are the results of two Google searches: “doffed his hat,” 41,100 hits; “doffed his shirt,” 26,600.

We’ve checked eight standard dictionaries and all but one say “doff” may refer to any type of clothing. However, most of them note its specific use for tipping or removing a hat in greeting or to show respect.

Etymologically, the word “doff” is a “coalesced form of do off,” according to the OED. It’s derived from the expression “to do off,” meaning “to put off, take off, remove (something that is on).”

Similarly, the verb “don” (to put on), which dates from the 1560s in written English, is a contracted form of “do on.”

Oxford says the expression “do off,” which dates from early Old English, is now archaic. However, it has a recent example from The Sharing Knife: Legacy, a 2007 fantasy novel by Lois McMaster Bujold: “She wriggled up to do off her boots and belt.”


Instead of … what?

Q: I recently came across this headline online: “Here’s What Happens When You Color Instead of Watch TV for a Week.” I thought we have to use a gerund (“watching”) after the preposition “of.” Isn’t there something wrong here?

A: Cortney Clift’s article about the adult coloring-book trend, published on the website Brit + Co, is interesting, but that headline is debatable.

The compound preposition “instead of” is usually followed by a noun or noun surrogate, as in this example with a gerund, a verb form that acts like a noun:

“Here’s What Happens When You Color Instead of Watching TV for a Week.”

The original headline might perhaps be defended as an elliptical way of saying “Here’s What Happens When You Color Instead of [When You] Watch TV for a Week.”

In fact, a majority of the usage panel at The American Heritage Dictionary of the English Language (5th ed.) considers a similar sentence acceptable English: “We would have liked to buy instead of rent, but prices were just too high.”

However, the dictionary’s editors note that this usage “is somewhat informal” and would “seem a grammatical error” to traditionalists.

A Comprehensive Grammar of the English Language (by Randolph Quirk et al.) defends another nontraditional usage—following “instead of” with an infinitive to maintain parallelism in a sentence.

The authors argue that “instead of may be classified as a marginal preposition … since it can have an infinitive clause as a complement.”

They give this example of the infinitive usage from A Summer Bird-Cage (1963), Margaret Drabble’s first novel: “It must be so frightful to have to put things on in order to look better, instead of to strip things off.”

“Although instead of + infinitive has been attested in good written English,” the authors explain, “many would here prefer ‘… instead of stripping …’ (which, however, would spoil the parallelism with to put that may have motivated the use of to strip here).”

George O. Curme, in A Grammar of the English Language, goes a step further and says “instead of” can sometimes act as a conjunction when two verbs are contrasted.

Curme gives this example from Shadows Waiting, a 1927 novel by Eleanor Carroll Chilton: “I saw that you were the real person; someone I admired as well as loved, and respected instead of—well, patronized.”

What do we think? If we were writing the headline you cited, we’d use “watching.” But if we wanted to keep the verbs parallel (“color” and “watch”), we’d replace the preposition “instead of” with “rather than,” a compound conjunction with a similar meaning:

“Here’s What Happens When You Color Rather Than Watch TV for a Week.”

We should mention here that even the traditional usage allows some exceptions to the use of a noun or noun-like wording after “instead of.”

“Instead of,” the Oxford English Dictionary explains, “may also be used elliptically before a preposition, adverb, adjective, or phrase.” Here are several OED citations, dating back to the early 1800s:

“People … called upon to conform to my taste, instead of to read something which is conformable to theirs.” (An 1834 citation from the Autobiography of Henry Taylor, published in 1885.)

“The Law was to be written on the hearts of men instead of on tables of stone.” (From The Jewish Temple and the Christian Church, 1865, by R.W. Dale.)

“I found the patient worse instead of better” … “You should be out instead of in, on such a fine day” … “I found it on the floor instead of in the drawer.” (Examples from A New English Dictionary on Historical Principles, the original title of the OED’s first edition, published in the late 19th and early 20th centuries.)

The compound preposition “instead of” showed up in the 1200s, meaning “in place of, in lieu of, in room of; for, in substitution for,” according to OED citations.

The dictionary says the phrase was sometimes written as three words (“in stead of”) and sometimes as four (“in the stead of”). In Old English, a stede was a point or place.

The adverb “instead” was “rarely written as one word before 1620,” Oxford says, and “seldom separately after c1640, except when separated by a possessive pronoun or possessive case, as in my stead, in Duke William’s stead.”

Finally, you mentioned gerunds in your question. As we’ve written on the blog over the years, a gerund can be a subject (“skating is restful”), a complement (“her hobby is skating”), a direct object (“she enjoys skating”), or the object of a preposition (“she has no interests apart from skating”).

We could go on, but instead we’ll conclude with a few lines from the “winter of our discontent” soliloquy in Shakespeare’s Richard III:

And now, instead of mounting barded steeds
To fright the souls of fearful adversaries,
He capers nimbly in a lady’s chamber
To the lascivious pleasing of a lute.


Master piece

Q: Yale is in an uproar about the use of “master” for the head of a residential college, given the term’s historical ties with slavery. I wonder what you usage experts think of this. If you defend the usage, the PC/Language Police will jump all over your insensitivities.

A: We’ll try to be sensitive as well as sensible in writing about “master,” a term whose association with education dates back to Anglo-Saxon times.

But first let’s look at the story behind your question. Although abolitionists at Yale vigorously opposed slavery, the university relied on slave-trading money in its early days and later named many buildings after slave traders or defenders of slavery.

In fact, the university’s namesake, Elihu Yale, had ties to the slave trade. And in the 20th century, Yale named several of its 12 residential colleges after slave owners, including John C. Calhoun, a South Carolina politician and white supremacist.

Since the mass shooting last June at the Emanuel African Methodist Episcopal Church in Charleston and the removal of the Confederate battle flag from the South Carolina capitol, students, alumni, and faculty have pressed Yale to rename Calhoun College.

And Stephen Davis, the master of another residential college, Pierson, has asked that his title be dropped, saying no African American “should be asked to call anyone ‘master.’ ”

Now let’s look at the history of “master,” a word that by itself or in compounds has been used in an educational sense since the early days of Old English—many hundreds of years before the word showed up in reference to slavery in the US.

The term (spelled “mægster,” “magester,” or “magister” in Old English) was borrowed from Latin, where a magister was a chief, head, director, or superintendent.

The “master” spelling gradually evolved in Middle English after the Norman Conquest, influenced by the Anglo-Norman spellings maistre and mastre.

When the word first appeared in English, according to the Oxford English Dictionary, it referred to “a person (predominantly, a man) having authority, direction or control over the action of another or others; a director, leader, chief, commander; a ruler, governor.”

Oxford adds that “its meaning has been extended to include women (either potentially or in fact) in many of the senses illustrated.”

The dictionary’s first written citation is from King Ælfred’s Old English translation in the late 800s of a Latin treatise by Pope Gregory I commonly known in English as Pastoral Care:

“Ðonne he gemette ða scylde ðe he stieran scolde, hrædlice he gecyðde ðæt he wæs magister & ealdormonn” (“When he saw the sin that he should punish, he showed that he was master and lord”).

The use of “master” for a teacher showed up around the same time in Ælfred’s translation of De Consolatione Philosophiae, by Boethius. We’ve expanded this OED citation to give it context:

“Hwæt, we witon ðæt se unrihtwisa Neron wolde hatan his agenne magister, his fostorfæder ácwellan, þæs nama wæs Seneca; se wæs uðwita, þa he þa onfunde þa he dead bion.”

(“Do we not know that the wicked king Nero was willing to order that his own teacher and foster father, whose name was Seneca, a philosopher, be put to death?”)

Although the term usually referred to a man in Anglo-Saxon days, one of the earliest OED examples uses a feminine version for a woman who teaches.

The citation, using “magistra” for “magister,” is from an Old English translation of Bede’s Ecclesiastical History of the English People that many scholars believe was sponsored, though not written, by King Ælfred.

In Middle English, the term’s meaning as well as spelling evolved to include scholar (early 1200s), holder of a senior degree (late 1300s), and presiding officer of a society, institution, college, etc. (late 1300s).

And in the 20th century, a “master teacher” came to mean one who was highly skilled or experienced.

(We’ve written posts in 2012 and 2015 about pluralizing “master’s degree,” and a post in 2010 about whether a woman is a “mistress of ceremonies” or a “master of ceremonies.”)

The OED’s earliest citation for the college sense of “master” that you’re asking about is from The Way to Wealth (1550), by the Protestant clergyman Robert Crowley: “A maister of an house in Oxforde or Cambridge.”

And its earliest example for “master” used to mean the owner of a slave is from an 1833 work by John Greenleaf Whittier: “A majority of the masters … are disposed to treat their … slaves with kindness.”

However, we’ve found several examples from the late 1700s for “master” used in reference to American slavery, including this one in Notes on the State of Virginia (1794), by Thomas Jefferson:

“The whole commerce between master and slave is a perpetual exercise of the most boisterous passions, the most unremitting despotism on the one part, and degrading submissions on the other.”

Getting back to your question, we don’t see any reason to avoid using “master” in such academic terms as “schoolmaster,” “master teacher,” “master’s degree,” “master of arts,” and “master” to mean the head of a college in the UK.

Should “master” also be used for the head of a residential college in the United States, a country that still bears the scars of its slave past?

Though etymologically blameless, the use of “master” for a college head may be hurtful to African Americans at Yale.

But Jonathan Holloway, an African American and the dean of Yale College, found it “deliciously ironic” when he served as the master of Calhoun.

“I worry about historical amnesia,” Holloway said in an article earlier this month in the New York Times. “But in the wake of the Charleston shooting, I found myself disillusioned.”

Should Yale give up its name because of its namesake’s profiting from slavery? “History is filled with ugliness,” he said, “and we can’t absolve ourselves of it by taking down something that offends us.”

What do we think? We agree with Holloway. We worry about etymological amnesia. Yale may be blamed for its past associations with the slave trade, but not for the use of “master” at its residential colleges.


Are these dates of-putting?

Q: I’ve been reading a book that often uses this construction: “in April of 1887.” The “of” strikes me as superfluous, but is it wrong, as an editor I knew used to insist? I can’t find a rule in the usual publishing stylebooks.

A: Yes, this “of” isn’t necessary, but it isn’t necessarily wrong either, despite criticism from some usage authorities.

Garner’s Modern American Usage (3rd ed.), for example, considers the preposition “superfluous in dates.”

Garner’s suggests that “December of 1987 should be December 1987,” and that “February 2010 is better than February of 2010.”

We aren’t told why the “of”-free version is better, however. Apparently the judgment is based on conciseness—if a word can be dispensed with, it should go.

But we disagree. While it’s true that “of” isn’t required here, we don’t think it’s incorrect—and we’ve found no good reason to think it is.

Both The Associated Press Stylebook and The Chicago Manual of Style (16th ed.) say that where only the month and year are given, no comma is used between them. But they don’t say that “of” can’t be used.

In our opinion, this isn’t a matter of right or wrong. A writer’s decision to use “of” in a date or leave it out is simply a style choice.

For instance, “of” inserts a rhythmic beat that can give a measure of dignity to a sentence. Here’s what we mean:

“The images, sounds and stories of the second Tuesday in September of 2001 will be seared forever in the nation’s memory” (from Sept 11, 2001, an anthology compiled by the Poynter Institute for Media Studies).

And sometimes adding “of” to a date can make a sentence sound informal or conversational: “Aggressive stocks have really tanked since June of 1983” (from the New York Times, 1984).

The usage is found in literary writing as well: “In June of 1845 Emerson was writing to Elizabeth Hoar about a new enthusiasm” (from The Letters of Ralph Waldo Emerson, edited by Ralph L. Rusk, 1941).

For one reason or another, writers often choose to insert “of” between the month and the year. All of these examples appeared in the news during the first couple of weeks of August 2015:

“This isn’t the first time something heartwarming has happened at the restaurant that opened in December of 2013” (the Fresno Bee).

“In December of 1988, this culture of violence came to my very doorstep” (Huffington Post).

“The government had a deficit of $94.6 billion in July of 2014” (Reuters).

“Durham wrote that, in April of 2010, the FBI ‘tasked’ a mob informant ‘to go see Gentile and engage him in general conversation’ ” (Hartford Courant).

A usage like “April of 2010,” with “of” preceding the year, may be a clipped version of older formulations:

“in March of the year 1781” … “in February of the year 1742” … “in the August of 1750” … “in December of the year 1832.” (All examples are taken from searches of 18th- and 19th-century literature.)

And “of” has long been used before the month in day/month formulations:

“the fifth day of May next” … “on the 15th day of September last” … “the 22nd of October” … “the 14th of this month.”

So all things considered, we see no reason to avoid “of” in dates, unless you want your writing to be clipped and fat-free—and there’s certainly nothing wrong with that.

Speaking of dates, you may be interested in a 2012 post of ours about how to punctuate them with commas, and a 2009 post on the use of the suffix “-th” in dates (as in “September 6th”).


Two chips off an old block

Q: How did “traduce” come to mean translate in Spanish and denigrate in English? Maybe there are zillions of such deviations, but I just stumbled upon this one.

A: The verb “traduce” once meant translate in English too, but that sense of the word is now considered obsolete or an affectation, according to the Oxford English Dictionary.

The OED has examples of the usage from the early 1500s to the mid-1800s. The most recent is from Alton Locke, an 1850 novel by Charles Kingsley about a young tailor who educates himself with the help of a Scottish bookseller:

“If ye canna traduce to me a page o’ Virgil by this day three months, ye read no more o’ my books.” (We’ve expanded the Oxford citation to add context.)

When the verb “traduce” showed up in English in the 16th century, it meant “to convey from one place to another; to transport,” according to the OED, but that sense is also considered obsolete.

The dictionary says the English word is derived from the Latin traducere, meaning “to lead across, transport, transfer, derive; also, to lead along as a spectacle, to bring into disgrace.”

The sense of “to lead across, transport, transfer, derive” inspired the translate meaning in the Romance languages (traducir in Spanish, traduire in French, tradurre in Italian, and so on).

As we’ve noted, English speakers used “traduce” similarly for a couple of centuries, but the OED considers that sense now obsolete or “an affectation after French traduire or Latin traducere.”

In the late 1500s, according to Oxford citations, the English word took on a new meaning: “to speak evil of, esp. (now always) falsely or maliciously; to defame, malign, vilify, slander, calumniate, misrepresent.”

The first example of the usage in the dictionary, dated 1586-’87, is from The Register of the Privy Council of Scotland: “To detract, traduce and utter speichis full of dispyte.”

The Chambers Dictionary of Etymology says the defame sense of “traduce” is probably derived from the use of the Latin traducere to mean “lead along as a spectacle, exhibit or expose (esp. captives, prisoners, etc.) to scorn or disgrace.”

So English borrowed one sense from traducere and the Romance languages borrowed another.


Slash talk

Q: What is your take on using the word “slash” when speaking of something with various attributes, as in “He is a husband slash father”? We don’t usually pronounce symbols, but it seems as if I hear “slash” spoken more often than I used to, maybe due to its use in Internet addresses.

A: Is “slash” in that example merely a lexical rendering of the / symbol? Or does it have a life of its own apart from the symbol? Or has the symbol itself come to represent an actual word, as the ampersand is a stand-in for “and”?

We’ve checked eight standard dictionaries and all but one of them describe the word “slash” as a lexical rendering of the diagonal symbol. Oxford Dictionaries online, for example, defines the noun this way:

“An oblique stroke (/) in print or writing, used between alternatives (e.g., and/or), in fractions (e.g., 3/4), in ratios (e.g., miles/day), or between separate elements of a text.”

The Oxford Guide to Style says the most common use for the symbol is “as shorthand to denote alternatives,” but adds that the symbol is “sometimes misused for and rather than or.”

The sentence you cite (“He is a husband slash father”) is an example of “slash” used for “and” rather than “or.”

Is the term, as the style manual suggests, “misused” in your example?

Well, most standard dictionaries don’t recognize this use of the term—or, for that matter, this use of the symbol itself. An exception is The American Heritage Dictionary of the English Language (5th ed.).

American Heritage describes a “slash” as either a symbol used in the traditional way or an informal conjunction (represented by word or symbol) meaning “as well as” or “and.”

The dictionary gives these examples of the conjunction: “an actor-slash-writer; a waiter/dancer.” It adds that the symbol is often used in print.

We suspect that the appearance of “slash” in American Heritage is a sign of things to come. In fact, the usage isn’t all that new. The word has been used this way for more than a dozen years.

The linguist Brett Reynolds, who blogs about language at English, Jack, has found a couple of examples from the 1990s.

This one is from the Sept. 28, 1992, issue of Time magazine: “Meet urban planner Campbell Scott (‘a realist slash dreamer’).” And this one is from the script for the 1999 movie Mumford: “sexual surrogate slash companion.”

In an Aug. 27, 2010, posting, Reynolds compares “slash” to “cum” (a Latin preposition meaning “with” that is often used in the sense of “and” or “along with”).

Although most standard dictionaries still consider “cum” a preposition, Merriam-Webster’s Collegiate Dictionary (11th ed.) recognizes it as a conjunction and has this example from George Bernard Shaw: “a credible mining camp elder-cum-publican.”

In an Aug. 27, 2010, post on the Language Log, the linguist Geoffrey K. Pullum discusses the use of “slash” in two sentences like yours: “There is also a study slash guest bedroom” and “We need a corkscrew slash bottle opener.”

In considering which part of speech this use of “slash” falls into, Pullum concludes that it’s a coordinator (also known as a coordinating conjunction) like “and” or “but.”

“We seem to have actually added a coordinator to the language,” says Pullum, the co-author (with Rodney Huddleston) of The Cambridge Grammar of the English Language.

(The scholarly Cambridge Grammar lists the parts of speech as noun, verb, adjective, adverb, preposition, determinative, subordinator, coordinator, and interjection. Cambridge includes what most people would call conjunctions among coordinators and subordinators.)

Pullum describes the coordinator use of “slash” as “a new discovery about English” and “a fairly surprising one.”

He points out that “the class of coordinators is thought of as an extremely small, closed category that has hardly ever expanded since the Middle Ages (when at some point the preposition buton, meaning ‘outside,’ turned into the modern-day coordinator but).”

As for the etymology of “slash,” it showed up as a noun in the 1500s, when the word meant a cutting stroke with a sword or whip, according to the Oxford English Dictionary.

The earliest OED example for the noun is from A Panoplie of Epistles, a 1576 letter-writing manual in which Abraham Fleming translates works by Cicero, Pliny, and others into English:

“Because euery one was ready to cutte his throte as to haue a slash at his fleshe.”

English adopted “slash”—the noun as well as the verb—from esclachier, an Old French verb meaning to break.

The noun didn’t become a term for the symbol / until well into the 20th century. The OED’s earliest example is from a 1961 entry in Webster’s New International Dictionary of the English Language (3rd ed.).

Webster’s Third says “slash” and “slash mark” mean the same as a similar sense of the noun “diagonal,” which it defines as “the symbol / used especially to denote ‘or’ (as in and/or), ‘and or’ (as in straggler/deserter form), ‘per’ (as in feet/second), ‘in’ or ‘of’ (as in U.S. Embassy/Paris), ‘shilling’ (as in 6/8d),” and several other uses.

Of the various terms for the / symbol (“solidus,” “slash,” “slash mark,” “stroke,” “oblique,” “virgule,” “diagonal,” and “shilling mark”) the oldest is “solidus,” which dates from the late 1800s. (Some of the other terms appeared earlier, but not in the symbol sense.)

Here’s a 19th-century example for “solidus” from George Chrystal’s Introduction to Algebra (1898): “The symbols / (solidus notation) and : (ratio notation) are equivalent to ÷.”

The OED doesn’t have an entry for the word “slash” used as a coordinator. It has entries only for the noun or verb.

However, the lexicographer Jesse Sheidlower, a former OED editor, has cited several examples of the usage from the dictionary’s files.

In commenting on Pullum’s Language Log post, Sheidlower listed these examples with multiple slashes:

“I’m a dishwasher slash cake maker slash cookie scooper slash and whatever else they want me to do,” from the Dec. 15, 2002, issue of the New York Times.

“Halcyon, the café-slash-restaurant-slash-record store, is closing its doors in April,” from the Feb. 25, 2004, issue of the Village Voice.

“I’m an actress-slash-model-slash-hostess,” from Beth Kendrick’s 2005 novel Fashionably Late.

Getting back to your question, we believe that “slash” is evolving as a part of speech—in writing as well as speech. In our opinion, it’s only a matter of time before more standard dictionaries accept its use as a coordinator/coordinating conjunction/conjunction.

Help support the Grammarphobia Blog with your donation.
And check out our books about the English language.

The Grammarphobia Blog

Is a ranking a rating?

Q: I teach a class in how to use a risk-mitigation tool that assigns a number between 1 and 10 to potential problems. The documentation refers to this number as a “ranking.” I think a “ranking” is a position on a list, and use the less specific term “rating” in my class. Am I causing needless confusion over a picayune statistical distinction?

A: Standard dictionaries generally agree with you that a “ranking” is a position on a list or scale based on achievement, as in “a number-three tennis ranking,” while a “rating” is merely a classification based on quality, as in “a four-star restaurant rating.”

Although some of these dictionaries use the word “rating” in defining “ranking” and “ranking” in defining “rating,” the examples given fall into the two categories above.

Oxford Dictionaries online, for example, defines a “ranking” as a “position in a scale of achievement or status,” and gives this example: “Victory at the Deutsche Bank championship lifts Singh to number one in the world rankings.”

Oxford defines “rating” as a “classification or ranking of someone or something based on a comparative assessment of their quality, standard, or performance,” and offers this example: “The hotel regained its five-star rating.”

Yes, the difference is subtle, but there is a difference. Vijay Singh was the best of the best golfers in the world, while the hotel was one of those with five stars.

Are you causing needless confusion over a picayune statistical distinction?

Well, you may be causing confusion and the distinction may be something less than earth-shattering. But you’re the teacher and it’s your call.

If you think it’s important enough to maintain this distinction in your class, then maintain it. You have the lexicographers at most standard dictionaries on your side.

Both nouns showed up in English in the 1500s, according to citations in the Oxford English Dictionary. “Ranking” then referred to the act of classifying people or things, and “rating” meant assessing for taxation.

It wasn’t until several hundred years later that the two terms developed the meanings you’re asking about.

In the early 1800s, “ranking” came to mean a position on a scale of comparison, OED citations indicate, and in the early 1900s, “rating” took on the sense of a measurement of one’s achievement. Here are the first examples for each usage.

● “A preparation too well known to require describing, except in regard to its mode of formation, which the preparer, in spite of his ranking as a scientific druggist, has hitherto kept a profound secret” (from the April 1836 issue of the American Journal of Pharmacy).

● “He has been elected President of the Blanker Banking Co., which means that his rating is first-class in the business world” (from Out of the Ashes: A Possible Solution to the Social Problem of Divorce, by Harney Rennolds, 1906).

The word “ranking” is an offspring of “rank,” which in turn comes from ranc, Old French for row or rank, but the ultimate source is khrengaz, a prehistoric Germanic root meaning circle or ring, and source of the English word “ring.”

How did that prehistoric Germanic word give English both “ring” and “rank”?

The OED suggests that “the sense of the French word apparently arose from application originally to a circular or cross-shaped disposition of forces in battle.” So a rank of soldiers may once have been in a circle, not in a row.

The ultimate source of “rating” is the Latin phrase pro rata parte (according to a fixed part, or proportionately), John Ayto writes in his Dictionary of Word Origins.

Ayto notes that rata is the feminine form of ratus, past participle of reri (to think or calculate), which has given English “ratio,” “ration,” “reason,” and similar words.



Had I but known …

Q: Ogden Nash named a category of mystery stories “HIBK” (for “Had I But Known”). The heroine (it’s almost always a woman) says something like, “Had I but known the killer escaped, I would not have gone for a walk in the woods.” The “but” changes a reasonable statement into one that’s melodramatic. How does it do that?

A: You bring up an interesting use of “but,” and one that’s often found in older literature.

Here, “but” means something like “only,” so “had I but known” is another way of saying “had I only known” or “if I had only known.”

Yes, the “but” could be dispensed with (“had I known”). However, its presence tightens the screw and adds a hand-wringing portent of doom.

This “had I but” formula is used with other verbs too: “had we but stayed,” “had she but gone,” “had they but seen,” and so on.

The construction uses elements of the past perfect tense (“I had known,” “they had seen,” and so on), rearranged and with “but” inserted.

Shakespeare’s plays have quite a few examples of the “had I but + verb” construction. We’ll quote only a couple:

“Had I but served my God with half the zeal / I served my king, he would not in mine age / Have left me naked to mine enemies” (Henry VIII, 1613).

“Had I but died an hour before this chance, I had lived a blessed time” (Macbeth, 1606).

And you’ll find the same construction in Jane Austen’s Pride and Prejudice (1813): “Had I but explained some part of it only—some part of what I learnt, to my own family! … But it is all, all too late now.”

Perhaps the best-known of these formulations is “had I but known.” The expression has its own Wikipedia entry and, as you mention, Ogden Nash poked fun at what he referred to as “the H.I.B.K. school” of 20th-century mystery fiction.

However, this didn’t start with the 20th century. There are many examples of “had I but known” in the literature of the 17th, 18th, and 19th centuries.

The earliest we’ve been able to find is from William Haughton’s comedy Englishmen for My Money: Or, A Woman Will Have Her Will (1616):

“O God, had I but known him; if I had, / I would have writ such letters with my sword / Upon the bald skin of his parching pate, / That he should ne’er have lived to cross us more.”

John Dryden used the device in his play The Spanish Fryar: Or, The Double Discovery (1680): “Had I but known that Sancho was his Father, / I would have pour’d a Deluge of my Bloud / To save one drop of his.”

And the playwright William Mountfort used it in his tragedy The Injur’d Lovers (1688): “Had I but known / The evil Meanings of his Soul.”

There’s even more melodrama in these lines from Bussy D’Ambois: Or, The Husbands Revenge (1691), by Thomas D’Urfey: “Oh! Curs’d, Curs’d, Fate, had I but known the Fiends, / Not all the Powers of Heaven and Earth had sav’d ’em.”

In the 18th century, “had I but known” became quite common, appearing in the works of many prominent English writers.

In his erotic poem “The Delights of Venus” (1702) the Earl of Rochester writes: “Had I but known the Bliss, or had I guess’d / At the Delights with which I’m now posses’d, / I had not staid [waited] for Marriage.”

And here’s a sighting from Henry Fielding’s comedy The Wedding Day (written in the 1720s but not produced until 1743):

“Oh! Plotwel, had I but known thee sooner! had I but known a Friend like you, who could have armed my unexperienced Soul against the wicked Arts of this deceitful Man.”

Samuel Richardson’s novel Clarissa (1748) has this passage: “Had I but known that your ladyship was not married, I would have eat my own flesh, before—before—before” (much sobbing and weeping ensues).

The rest is history. By now, “had I but known” has become a literary cliché, especially in potboiler mysteries.

In case you’re wondering about the grammar here, we won’t leave you hanging.

The “had known” construction, as we said above, is identical to the past perfect tense. But in clauses like “had I but known,” “had I known,” and “if I had known,” the formulation is being used in a hypothetical way, so the mood is subjunctive—specifically, the past perfect subjunctive.

This isn’t as intimidating as it sounds, so stay tuned.

“Had I known” (or “had I but known”) is a conditional clause—it expresses a supposition. And as we’ve written before on the blog, we use the subjunctive with some conditional clauses of an “iffy” or hypothetical kind: those that are contrary to fact (as in “if I were you”).

Well, “had I known” (or “had I but known”) is similarly a conditional clause that’s contrary to fact. But in this case, the subject and verb are reversed and there’s no “if.”

Take this sentence, for example: “Had I (but) known, I would never have opened the door to the laboratory.”

Within its entry for the verb “have,” the OED discusses sentences like that under “specialized uses of the past perfect subjunctive” (that is, “had”).

The OED would call that example “a counterfactual conditional sentence, with inversion of subject and verb instead of an if-clause.”

This is nothing new. The OED has many citations, beginning in late Old English and ending with this one from the November 2010 issue of Time Out New York:

“Had the show opened out of town, many of its narrative troubles might have been fixable.”

We should note that “but” conveys a sense of “only” in a similar construction, one in which it precedes a noun or noun phrase instead of a verb.

Here, the OED says, the conjunction “but passes into the adverbial sense of: Nought but, no more than, only, merely.” Oxford has several examples, including these:

“Premature consolation is but the remembrancer of sorrow” (Oliver Goldsmith’s novel The Vicar of Wakefield, 1766).

“My Love She’s but a Lassie Yet” (the title of a poem by Robert Burns, 1794).

“In arms the kingdom had but a single rival” (John Richard Green’s A Short History of the English People, 1876).

We might add an old chestnut that’s a favorite of Pat’s, “it was but the work of a moment,” an expression found in dozens of melodramatic 19th-century novels.

We’ll end with a few lines from Ogden Nash’s poem “Don’t Guess, Let Me Tell You,” which appeared in the April 20, 1940, issue of the New Yorker:

Personally, I don’t care whether a detective-story writer was educated in night school or day school
So long as he doesn’t belong to the H.I.B.K. school,
The H.I.B.K. being a device to which too many detective-story writers are prone;
Namely the Had-I-But-Known.



They sore what they seen

Q: Is there a reason people use the pronunciation “sore” for “saw” or use “seen” instead of “saw,” as in “I sore her yesterday” or “I seen her last week”?

A: These are two entirely different issues, and they have different causes.

The use of what sounds like “sore” for “saw” is merely a regional pronunciation.

The speaker here is being grammatically correct, since he or she is actually using the word “saw” (and would write it that way), but is pronouncing it with a regional accent.

In this case, the accent represents a speech pattern often heard on the East Coast, and one that we’ve written about before on our blog.

As we wrote in 2008, the speaker inserts an “r” sound, sometimes called the intrusive “r.”

This “r” is sometimes inserted just before a word beginning with a vowel sound. So, for instance, the speaker would say, “That’s a bad idea” (normal pronunciation), but “That idear annoys me” (intrusive “r”).

As we’ve said, this pronunciation should not be considered a mistake, merely a regionalism.

The use of “I seen,” on the other hand, isn’t standard English; it’s a grammatical error.

The mistake is using the past participle (“seen,” the form used with “have” or “had”) instead of the simple past tense (“saw”).

The basic tense forms for the verb “see” are “I see” (present), “I saw” (past), “I have seen” (present perfect), and “I had seen” (past perfect).

Interestingly, “saw” has been spelled many different ways since it showed up in Old English, suggesting that its pronunciation has varied too.

The word is spelled “saeh” in the Lindisfarne Gospel of John, which is believed to date from the early 700s. Some other early spellings in the Oxford English Dictionary are “seah,” “sauh,” “saue,” and “sawhe.”

The use of “I seen” for “I saw” may not be standard English (the OED describes it as colloquial and dialectal), but it’s been around for quite a while.

The earliest Oxford example of the usage is from the Sept. 30, 1796, issue of the Philadelphia Aurora newspaper: “So fine a sight (says Yankee to his friend) I swear I never seen—you may depend.”

And here’s an 1861 example from Tom Brown at Oxford, a sequel to the better-known Thomas Hughes novel Tom Brown’s School Days: “ ‘Hev’ee seed aught o’ my bees?’ … ‘E’es, I seen ’em.’ ”



Hear Pat on Iowa Public Radio

She’s on Talk of Iowa today from 10 to 11 AM Central time (11 to 12 Eastern) to discuss the English language and take questions from callers.


Appositively speaking

Q: I seem to be the only person who feels that this construction requires a comma after “Delmonico” to offset the appositional phrase: “The oldest resident of the nursing home, Delmonico is given to reciting bawdy limericks.” Thanks for any light you can shed.

A: Grammatically, as you know, “apposite” means equivalent (not to be confused with “opposite”), and an appositive is the explanatory equivalent of a noun or noun phrase previously mentioned.

In the sentence you cite, the appositive “Delmonico” helps identify the noun phrase “the oldest resident of the nursing home.”

An appositive, as we wrote in a 2013 post, is sometimes surrounded by commas and sometimes not.

The appositive is set off between two commas only if it’s not essential—that is, if it could be deleted without losing the point of the sentence. If it’s essential and couldn’t be dropped, it isn’t followed by a comma.

In your example, the point of the sentence is that Delmonico, the real subject, likes to spout racy limericks. The introductory phrase merely adds information about Delmonico.

The point of the sentence would be lost if we dropped “Delmonico,” so the appositive here isn’t followed by a comma.

If we rewrite the sentence and make “Delmonico” the introductory element, then the nonessential stuff that follows becomes the appositive and is surrounded by commas:

“Delmonico, a long-term resident of the nursing home, is given to reciting bawdy limericks.”

In short, put commas around an appositive that’s dispensable—one that could be dropped without losing the point of the sentence.  But don’t put commas around one that’s essential to the point.

By the way, an appositive is usually found right after its equivalent, but that isn’t always the case.

The Cambridge Grammar of the English Language gives this example of an appositive that’s separated from its “anchor,” or original noun phrase: “I met a friend of yours at the party last night—Emma Carlisle.”

Here are some additional examples that may help illustrate the use of appositives and commas:

● “My youngest son, a whiz in shop class, is handy with a hammer and nails.” The appositive (“a whiz in shop class”) can be deleted without losing the point of the sentence. Arrange it differently, and you can drop the second comma: “A whiz in shop class, my youngest son is handy with a hammer and nails.”

● “A trim athlete, my sister tries to watch what she eats.” The introductory phrase adds information, but it’s not essential. The appositive “my sister” is essential—it’s the whole point of the sentence. So it’s restrictive and should not be followed by a comma.

You sometimes see this construction in cases where a name has already been mentioned and it’s replaced by a pronoun as an appositive.

Consider this passage: “Delmonico is a born comic. A long-term resident of the nursing home, he’s given to reciting bawdy limericks.”

Help support the Grammarphobia Blog with your donation.
And check out
our books about the English language.