It’s necessary for authors to be careful and explicit regarding the use of pronouns when more than one person is present in the context of a discussion. The antecedents of pronouns may be perfectly clear in the writer’s mind, but to a reader they may be anything but.
Consider the following sentence, which seems to make perfect sense:
He told him that he would pick up his kids and take them to his house.
But there is a big problem if the author is thinking:
John told Jake that Jack would pick up Jim’s kids and take them to Joe’s house.
There is nothing wrong with using proper nouns when pronouns would result in confusion.
In the course of copyediting, I often find it useful to nose around in (aka research) what great authors of the past did. The sorts of points I seek insights into include examples of word usage, what preposition a verb most often takes, whether to use a comma in “Yes, sir”, and other subtleties of punctuation.
To aid myself, I’ve accumulated a small library of fine literature in plain text format, currently numbering twenty-seven books, and including works by Charles Dickens, F. Scott Fitzgerald, Henry James, H.L. Mencken, Edith Wharton, P.G. Wodehouse, and a translation of the Bible in modern English. These are in the public domain, acquired from Project Gutenberg.
I’ve hoped to add more volumes and more authors to this collection, except that, masterpieces though they may be, these books are venerably old from the standpoint of contemporary publishing practice, and many styles that were current in Dickens’s day, or even Wodehouse’s more recent era, are not those of today. Newer books are more difficult to come by, at least legally. The truth is, I don’t know where to get them illegally, either. I’d love to have plain text versions of Updike, Wallace, DeLillo, and even the likes of Hemingway and Steinbeck, plus a number of non-fiction texts, but I’m unlikely to ever get them, short of scanning them and running optical character recognition myself (which I ain’t gonna do), because they are carefully guarded. (And I don’t mean to suggest that I would want them illegally, for I am a respecter of copyrights.)
Much of the same information, covering books published up to the year 2000, is available from Google Books, particularly using Ngrams, but specific examples require more digging and clicking. Sometimes the effort yields useful examples, but it can also be a pain and more trouble than it’s worth.
Plain text files are searchable using standard Unix-style commands or programs written in a programming language such as Perl (my personal favorite), which allow me to filter and format the results any way I wish. Therefore, using my skills as a former software engineer, I’ve devised a number of tools to get at the information.
To use one of the examples above, I find that in this collection “Yes, sir” (with a comma) occurs 224 times (spoken most frequently by Jeeves to Bertie Wooster in P.G. Wodehouse’s books), and only once without the comma, likewise in a conversation between Jeeves and Bertie Wooster in the midst of many others that do have the comma — so doubtless a copyediting oversight! I conclude from the data thus obtained that it’s best to use the comma in dialogue that contains words that follow the model “Yes, sir”. (Many patterns fit the model.)
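The sort of phrase-counting tool described above can be sketched in a few lines. (The author works in Perl; this is an equivalent rough cut in Python, and the directory layout of one plain text file per book is an assumption.)

```python
# Count literal occurrences of a phrase across a directory of plain
# text files, one book per .txt file. A sketch, not the author's tool.
import re
from pathlib import Path

def count_phrase(library_dir, phrase):
    """Return the total number of literal matches of `phrase`
    across all .txt files in `library_dir`."""
    pattern = re.compile(re.escape(phrase))
    total = 0
    for path in Path(library_dir).glob("*.txt"):
        text = path.read_text(encoding="utf-8", errors="replace")
        total += len(pattern.findall(text))
    return total
```

Comparing, say, `count_phrase("library", "Yes, sir")` against `count_phrase("library", "Yes sir")` gives exactly the kind of tally discussed above.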
Recently I wondered about the average word length (in letters per word) within a book. This information is easily obtained by counting the characters and dividing by the number of words. There’s a Unix tool, wc(1), to get the numbers, and a script can gather them and do the dividing. The result is not precisely accurate, because the Unix tool counts as words every group of characters separated by spaces, so punctuation and numbers and various oddities skew the number correspondingly. But as averages across a library of books with the same constraints, they’re good enough for comparison, which I imagine is why the tool was written near the dawn of the Unix era.
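The arithmetic itself is trivial; here is a minimal sketch in Python (rather than the author’s Perl) that mimics the wc(1) approach, with the same caveat that attached punctuation and numbers skew the figure slightly:

```python
# Rough letters-per-word calculator in the spirit of wc(1): every
# whitespace-separated token counts as a word, punctuation and all.
def letters_per_word(text):
    words = text.split()  # split on any whitespace, like wc's word count
    if not words:
        return 0.0
    return sum(len(w) for w in words) / len(words)
```

Rounding the result to three decimal places, as with Python’s `round(x, 3)`, gives figures of the kind quoted below.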
The range from author to author and book to book is not as broad as you might think. A calculation to several decimal places is in order. My script calculates to fifteen decimal places, but about three places seems to be adequate for discussion purposes.
So take a guess: what do you think the range would be among these highly literate authors? The shortest average among all of them is the Bible at 5.377 letters per word (lpw). The modern book with the shortest words is (believe it or not) Charles Dickens’s Great Expectations, with an average of 5.514 lpw, and the longest is 6.121 lpw by H.L. Mencken, who may have had the largest vocabulary of any English-speaking person who ever lived. Amusingly, the book with that count is titled: Damn! A Book of Calumny. Apparently the man even knew how to cuss in words of more than four letters.
From that analysis we see that the range from shortest to longest average word length is well less than a letter per word. Sounds about right to me.
Recently I edited one of the most horrendously bloated books I’ve ever laid eyes on. The author was a thesaurus diver, determined to seek out the longest and least common word in every possible case. It’s no exaggeration to say that in one out of three instances he used the more obscure words incorrectly. My task became an arduous one of consulting the dictionary, mind-reading, and replacing incorrect and rare words with ones his readers (as few as they will be, mostly his relatives) would be likely to recognize. In time it dawned on me that this guy may have used longer words on average than any author I’ve ever encountered — which made me wonder: How much longer? So I saved the document’s body text to a plain text file and made calculations as described above. (It was a very long book, too, over 500 manuscript pages.) The number I came out with was 6.821 lpw, vastly longer than H.L. Mencken’s erudite habit. (Most importantly, Mencken used and spelled all the words right, as his monumental three-volume work The American Language demonstrates conclusively.)
My favorite sentence from this editing job, said in regard to one of the author’s primary subjects of discussion, says:
He was not wont to bloviate.
Wont means inclined, and to bloviate means to speak verbosely and windily. How ironic that such was not the author’s own inclination, and that at six words in a book where sentences of thirty to sixty words are legion, it was also likely the shortest sentence in the book.
If the author was trying to impress readers with how smart he is, then bzzzt! Big mistake! No person, no matter how intellectual, actually talks like that. What he left instead was quite the opposite impression.
In contrast, the very next project I worked on was written by an author who describes himself as dyslexic and unable to read until after he left school. He has the vocabulary of a fourth-grader (through no fault of his own), and the average word length on his project came out to 4.401 lpw. It was the longest work of fiction I’ve ever edited — by about 20 percent. But it took me far less time to edit it than the previous book.
 Speaking of subtle things, did you notice that the previous sentence contains a subtlety of punctuation? And have you ever noticed that the spelling of subtle is subtle?
 The form wc(1) is standard Unix man (manual) page syntax.
 Aka the Unix epoch, which began at midnight UTC on Thursday, January 1, 1970, and from which many computer programs count time in seconds. I don’t know when wc(1) was created, but I’m rather certain that, given the nature of it and its typical use, it had to be among the first tools Ken Thompson and Dennis Ritchie provided when they first created Unix.
In the course of editing the writing of clients, I encounter much in the way of tics and bad habits, not to mention sheer ignorance, particularly in the writing of beginners and illiterati — of which I edit more than I’d like — in addition to the usual complement of routine mechanical errors.
Some booboos are laugh-out-loud hilarious. (If I didn’t allow myself to fulminate at and even ridicule (privately) some of the manuscripts I work on, I wouldn’t be able to do this work at all.) Other clinkers are illuminating in that they reveal modes of thinking on the part of authors that can be picked apart and learned from.
One of the most powerful features of the ubiquitous Microsoft Word text-whacker, one that is underutilized or even unknown by authors who work alone, is its ability to track changes from version to version. This function works well. A heavily edited manuscript ends up having a forest of marks on it that leave it looking like the image at the beginning of this article.
When an editor is done with a manuscript, the author can go through and accept or reject changes, or having had a matter brought to light, might make a different change. For instance, if I delete the word very before large (because very is such a wimpy word that I routinely uproot occurrences like dandelions), the author might still wish to intensify the idea of largeness, so will delete large and substitute huge.
Not every change is routine. Sometimes an editor offers an explanation in a comment — such as you also see in the example image — in part to assure the author that the editor has a sounder reason than whimsy for making the change. (Sometimes I write them to convince myself as well.)
But comments take time to write, which has an economic impact on productivity. More importantly, marginal comments are not the place to go into detail or to give English lessons. Most of the time the editor is obliged to move on in order to get a job done. But certain problems stick in your craw.
At least they do for me, with the result that I may scribble out a few relevant lines in one of my many electronic notebooks.
And So …
I’m planning to write an ongoing series of short articles about craw-stuck problems I’ve encountered in the wild, that is, within manuscripts I’ve worked on, and to illustrate points with sentences taken from client work, suitably anonymized so as to avoid copyright infringement, letter bombs, and other negative fallout.
About the Name
These articles will be categorized under the Track Changes Meditations category in my blog. What I really call them is Meditations from the Track Changes Column, but that’s too long a label for a menu, so I saved that name for this cover article. That title is a tribute to a book that is legendary among ultrarunners, Meditations from the Breakdown Lane: Running Across America, by James E. Shapiro, regrettably long out of print and no longer available except on Amazon at a price I don’t want to pay. It’s about things the author thought about on his transcontinental run on the highways and byways of the United States. (He was one of the first to do this.)
 The text in the image is from an article by Dan Horvath in Marathon & Beyond magazine that I edited a few years ago, used with the permission of the author. The actual content is irrelevant to this discussion.
Don’t you hate it when you see above and below used as nouns?
This lumpy construction usually occurs when the author wants to refer to material within text in a position relative to where the monstrosity occurs. (More precisely before and after, if you want to get literal about it.)
My favorite magazine, The Watchtower, has a series of study articles in the July issue that uses endnotes rather than the footnotes it has used almost, if not entirely, exclusively in my forty-three years of reading the journal. A friend, knowing I’m an editor, asked if I knew the difference between footnotes and endnotes, and why endnotes are used for this series of articles.
To state the obvious, footnotes go at the bottom (the foot) of pages, and endnotes go at the ends of articles, chapters, or a whole book. Note, too, that to call an endnote a footnote (or vice versa) is wrong. Last Sunday, when we studied the first article from this magazine, the reader kept calling them footnotes, even though the word Endnotes appears over them, grouped together at the end of each article. Bzzzt! Wrong!
Functionally, footnotes and endnotes accomplish the same purpose. Which kind to use is a publisher’s style decision. Each has advantages over the other, and each has disadvantages.
Generally, notes are a means of moving material that is parenthetical yet worthy of inclusion out of the main discourse. Often, readers who skip them will not lose anything essential to the main arguments being presented.
Footnotes are convenient. A reader can just drop his eyes down, read, and go back — or not. But footnotes are usually in a smaller type size, which makes them harder to read. Endnotes run the risk of being skipped because they require flipping to another page and back. The way we study these articles, there is zero danger of their being skipped, whichever style is used.
Sometimes notes contain nothing more than references to outside sources. At other times they add interesting supplementary material, information that is worth reading, but that would be awkward to integrate into the main narrative.
But footnotes add clutter to a page, and too many of them are annoying to some readers and even intimidating to others who may think that only scholarly works that are beyond their ability to comprehend use such apparatus.
Which style to use is up to the publisher. Most journals, textbooks, and scientific, medical, and legal publishers have meticulous requirements for their publications that must be followed without variation.
Rarely, a publication will use both footnotes and endnotes. I’m currently reading FDR, the biography of Franklin Delano Roosevelt by Jean Edward Smith, which uses both. The footnotes use asterisks as markers, are few (no more than two on any page), unobtrusive, and contain only supplementary information. I’ve been reading all of them. But there are 154 pages of numbered endnotes, most of them bibliographical references, with occasional minor commentary added. I’ve been skipping those because personally I despise endnotes. (There is also a huge bibliography.) Whereas I don’t speak for the publishers of The Watchtower, I can make an educated guess why the decision was made to switch from the customary footnote style to endnotes in the July 2013 issue.
These articles seem to have a little more than the usual amount of extra material. Also, there are two-page graphical spreads within the first two articles, which might have complicated the layout if one or more footnotes also had to be squeezed in at the bottom.
So different publishers have different requirements, based largely on aesthetic considerations. Each publisher has its own style guide that trumps the various standard style guides used as starting points or fallbacks. And given that prime decision-driving considerations of Watchtower Society publications are readability and accessibility to a worldwide readership, it should come as no surprise when we see things done a little differently once in a while, and that the result is usually delightful.
“There are all types of people in the world.” So claims an author I’ve been editing. Sounds like a truism, right?
No, there aren’t.
To say there are sounds as though there’s some master catalog of types, and that someone has checked to be sure there is at least one of each.
There are exactly as many types of people in the world at any given moment as there are people because no two are the same.
But the next time someone is born, it will be someone of a type that has never existed before, meaning that before he was born, there were not all types of people in the world.
Unless you start defining types with generalizations and grouping people into them, in which case someone could easily devise a type that no one matches, never did, and never will, once again making the statement “There are all types of people in the world” false.
I’ll define a type of people right now: People with five heads and seven arms. I’ll bet there’s never been anyone of that type, and hope there never will be. But if there hasn’t ever been, then it’s still a fact that there are not all types of people in the world.
Two obvious lessons to be derived from this facetious exercise are: (1) all is a mighty big word; (2) vague generalities are often meaningless.
Please to find in file word proces is many words of Story, Is very very funny hilarious freinds say (ha ha!) Please to choose salubrious and make nice sentinces with sound is Good English. If maybe some not gentle or Maybe I make masteak, but I did best Ican an dont know masteaks. Am thanking you for kindliest atention.
Contrary to implications from the title, and also to the customary method of presenting biography, Bob Dylan’s book Chronicles: Volume 1 is not a traditional “Born on a mountaintop in …” chronologically told tale. We learn bits of the back story throughout the book, enough to be satisfied that Dylan, famous for his penchant for privacy, has not withheld anything important. Is it any business of us, the curious, to expect more? In any case, the sort of trivia that obsessive star-stalkers seek is not hard to uncover from other sources; some of it is even true. (Apparently, but what do I know?)
I’ve been listening to Bob Dylan since early times. I used to hear him regularly in the early sixties on The Midnight Special, a Saturday night radio program dedicated to American roots music, broadcast on Chicago’s great FM radio station WFMT. The show has been running continuously since 1953, though I haven’t heard it myself since college days. I don’t know how often they continued to play Dylan after he became a breakaway star. For all I know, they still do. For all he has done in his life, he remains first and foremost a folksinger.
Chronicles: Volume 1 opens and closes around Dylan’s signing first a publishing deal with Leeds Music, which he soon got out of (he was technically underage when he signed it without the co-signature of a parent or guardian), then a recording deal with Columbia Records, having been signed by John Hammond, one of the greatest talent discoverers in music history — all before Dylan had begun to write much at all. In comparison, imagine being the record company that signed the Beatles before John and Paul had written Please Please Me — which actually happened.
To be invited to record with Columbia on the basis of Dylan’s prior experience was a happening equivalent in order of magnitude to an aspiring classical pianist being asked to present his world premiere performance as a concerto soloist with the New York Philharmonic. In those days (late 1961) you couldn’t get a better deal, although Columbia also had a reputation that if your first record didn’t sell well, they would bury you and your career would be over.
A few years later, my band was also invited to cut a demo for Columbia. They didn’t take us on. It was probably not a good match for either of us at the time. My band never went much of anywhere, and today nobody has heard of it. Obviously, it went better for Dylan.
Following the signings in October 1961, we are shifted back in time to February of that year, when nascent but already experienced folk singer Bob Zimmerman, not yet Dylan, arrived in New York City. We learn of his successful efforts to find venues in the West Village basket houses and clubs, another experience we shared with Dylan, ours about seven years later. Dylan’s repertoire was already substantial, but for a while he would do nothing but Woody Guthrie songs. He seems to have been a sponge for memorizing words. He hadn’t yet begun writing songs of his own. And we are told of the friends he’d acquired who were happy to let him stay on their living room couches for weeks at a time.
Dylan had a feeling he was going places, but even he could not possibly have anticipated what actually happened.
It should come as no surprise that Dylan, revered even more for his poetic lyrics than for the music that accompanies them (although I like the music and lyrics equally myself), is capable of writing engaging passages of prose, interrupted occasionally by quirkily casual colloquialisms, such as “Me and Clayton went [somewhere],” which a friend postulates is just “Dylan going from Proust mode, say, to Woody Guthrie mode, just because he is able to do so. Think Mark Twain.” Surely Bob Dylan knows how to use pronouns properly, so I’ll grant credence to my friend’s theory. There are nonetheless a few minor passages in the book that could have used closer attention by a copyeditor. But that’s a subject I’m prejudiced about.
There are some extraordinary passages in which Dylan describes his influences. He’s always been surrounded by music and books. Although he was determined to pursue folk music, he liked and absorbed everything: classical music, modern jazz, even a great deal of pop music, including commercial performers like Rick Nelson and the Kingston Trio. He’s always been more focused on the songs, particularly their stories and words, than the artifice used to put them across.
Dylan relates an experience in which he woke up in the apartment of friends he was staying with and explored their vast library, everything from the Greek, Latin, and Old English classics to modern times, along with history, art, and philosophy. Dylan devoured such things during his hours alone. Although he was apparently a mediocre student back in Hibbing, Minnesota, it’s clear he had by this time acquired a substantial storehouse of knowledge about many subjects. He made a special effort to memorize longer and longer passages of difficult poetry, mostly because he liked it, but also for practice. Dylan thereby demonstrated something I’ve long believed: that being a good student and getting an education does not mean getting top grades in school, but actually learning something.
His tale is suffused with enough back references that it’s not necessary for him to devote a whole chapter, section, or other discrete part to his being born, his family, growing up, school, friends, and the like. He doesn’t try to hide any of it as though he disowned his past, which he manifestly has never done. But most of these details are not important to telling the story that people who are interested in Bob Dylan the self-invented character need to hear about.
Another segment, similar to the bookshelf exploration sequence, is his telling of going regularly to the New York Public Library and reading newspapers from 1855–65 on microfilm in order to absorb the flavor of their language, and to become more familiar firsthand with what the real stories and issues were in those days, which included far more than just states’ rights and slavery. The nation was a powder keg at the time, and the conflict that came was unstoppable, a cancer that the nation had to battle to get rid of. Those times generated a lot of good music that few people today have ever heard.
Dylan seems to have been conscious from an early age of what he wanted to do in life: to be a folk singer, and to make a mark in the world that way. Fame and wealth were not objectives; in fact, he anticipated working in relative obscurity while recording for some minor folk music label, rather than becoming a mainstream artist. He never expected to become as big as his idol Woody Guthrie.
Suddenly readers are shifted forward in time ten years. Imagine an autobiography by John Lennon, in which he skips covering most of what happened to him between ages twenty and thirty. It’s kind of an important period in his life, don’tcha think?
Still, the stories Dylan tells of events that are highlights from his own perspective, particularly of recording sessions for certain key albums, are remarkably cogent and informative.
Finally, readers are time-shifted once again back to the signing of his record deal, to his discovery on the very same day of blues man Robert Johnson by means of an unreleased acetate John Hammond gave him, as Columbia had bought all Johnson’s recordings and intended to release them, and to a scene of Dylan, who had worked hard recently to manufacture himself, feeling a sense of destiny, that something big was about to happen — as indeed it did.
Will there be a Chronicles: Volume 2? I certainly hope so.
As of July 25, 2011, I have migrated over 130 articles from my Neologistics blog, where since August 2005 I have posted many unsorted articles, including items unrelated to editing, writing, or literature. The articles copied from the old site have all been labeled with the category LEGACY.
It has been a longstanding shortcoming of Google’s otherwise excellent blog service that authors cannot order the display in any way except chronologically, with the newest material on top. In contrast, WordPress allows assigning any number of categories to any post, allowing visitors variety in sifting and sorting.
In addition, it also makes sense to me not to have to support two blogs at once. This morning I posted my last article to the old site, announcing my intention to use this one exclusively from now on.
The job of migration is done. Each older article’s publication date has been revised to show the date of its original publication on the other site.
Readers may find some of these articles enjoyable. I invite you to explore and by all means provide feedback if you would like.
Recently I read a news story that referred to Osama Bin Laden as the “former leader of al-Qaeda”. Former? Ha! Perhaps so in the same way that Hitler is a former Nazi, or Ted Bundy a former serial murderer, if we may refer to them at all in the present tense. But somehow in such cases it seems that “former” is not quite le mot juste. Why are people afraid to use the word dead regarding these guys?
Bin Laden is indisputably no longer in a position to head up al-Qaeda as long as he remains in a deceased state. Presumably, having been given an honorable funeral to prevent ticking off any more terrorists, he has gone wherever good terrorists go when they die, and has been busy fooling around with the army of virgins promised to him by his spiritual advisers, who I’m sure checked their holy books at least twice to be sure they could rightly offer that reward. This strikes me as a terrible waste of virgins. I’ll bet he’s real sorry now about all the mean things he did, too. Former my foot.
Wouldn’t it be funny if that teaching turned out to be true, but when he got there the virgins all turned out to be thirteen-year-old boys? A little detail his holy men forgot to mention. Maybe the God of terrorists has a sense of humor; his worshipers certainly don’t.
What can I say? Religion often makes people stupid. But that’s a topic for another post someday, and I’ve digressed.
Some designations remain for life, even though the designee goes on to other things; and some do not.
In 2011 it would be inappropriate to call the Boston Celtics the National Basketball Association champions, even though they have won that championship seventeen times. At this writing the Dallas Mavericks hold that title. That a team has to compete for it and win it in successive years, and with different team members, is an indication that the honor, while memorable for a lifetime, is not permanently current. There is only one NBA championship team at any given time. Therefore, the Boston Celtics are presently former NBA champs. They have been champions seventeen times, and could very well be again many more times in the future.
A use I’ve often heard for “former” is in reference to various Beatles, who as a band have earned a unique station in the world of popular music. Paul McCartney is often called a former Beatle, and true enough, I’ve never heard Paul himself dispute the term. However, even though John, Paul, George, and Ringo no longer work together and never will, the ghost of the band’s business is still going strong. New Beatles-branded product is periodically released to the world, and continues to sell very well. In this no one has a greater hand of overseership than Paul McCartney himself. No item is labeled as being from the Beatles unless Paul says it can be, undoubtedly with Ringo’s agreement.
If a Beatle still exists it would be Paul, and if Paul is a Beatle, then the same reasoning would include Ringo, but the case for Paul is much stronger. In both my mind and my heart Paul and Ringo are Beatles, and will never be former Beatles as long as they live. John and George are not former Beatles. Sadly, they’re merely dead Beatles, but by the terms I’ve just described remained actual Beatles as long as they lived, regardless of the band’s inactivity.
In the United States we use “former” in cases where someone definitely changes course and does not return to it. We write of former presidents, because these men (and someday women) step down, and another person takes their place, though I’ll admit that their status is muddied somewhat in that ex-presidents by accepted convention continue to brandish the honorific Mr. President for the rest of their lives. In comparison, we do not do the same for former US senators. When their terms of office expire and they are replaced, as long as they are living, they are former senators. When Harry S. Truman was elected vice president, he became a former senator.
Therefore, I would urge authors and copyeditors alike to agree to save “former” for cases where someone still living definitely changes course and completely relinquishes all evidence of still holding claim to the title formerly bestowed on him.
In the venerable British tradition of estate naming, we call our house Haddon Hall. We named it that because we live on Haddon Road in Columbus, Ohio, and also in tribute to a beautiful English medieval castle by that name. We would love to put up a sign that says HADDON HALL — perhaps carved on a big rock or stone tablet, or engraved on a brass plate.
Our Haddon Hall is home to us, but we also identify ourselves as its residents by means of the brass knocker on our front door. It says, in mixed case and a classy serif font, Newton, because that’s our name. We see no need to add anything more. The meaning is clear.
Signage is written in an extreme form of headline style, the type of writing used in titles of newspaper articles, where the objective is to say exactly what is needed using as few words and letters as possible.
Sometimes context fills in meaning that culturally literate readers are expected to supply for themselves. For instance, if you drive anywhere in the United States and see a thirty-inch-wide octagonal sign with white letters in Helvetica Narrow Bold, and a white border on a red background that says STOP, you know what it means. “The law requires you to bring your vehicle to a full stop right here.” But heaven help us all if we had to read all that on a sign. The simple imperative without punctuation is sufficient to communicate the desired conduct.
Returning to houses, think about the variations of style we observe on signs naming their residents. I first became conscious of this thirty-three years ago by means of a painted ceramic plaque above the door at the home of some friends named Olson. They are both now deceased, so I’ll honor their memory by using them as an example, while also poking fun at them posthumously, because their sign was wrong.
The simplest and most logical identifier would have been just the family name: Olson. Because Olson is commonly recognized as a name, no further explanation is needed to explain that’s the name of the people who live there. It could also have said Eggs, because they raised chickens and sold the eggs. Few people would have mistaken a sign with that word as someone’s name, particularly inasmuch as you had to drive right by the chicken yard to get to the house.
The Olsons, with the definite article and plural family name would have been acceptable, even though it says more than is needed. It suggests the sentence, “The Olsons live here,” or “The Olson family lives here.”
Olsons in the plural is marginally acceptable, but with some names may be ambiguous. Is it really plural, or is the final s part of the name? I found numerous examples among the commonest surnames where the spellings both with and without the s are common: Meyer and Meyers; Owen and Owens; Richard and Richards; Wood and Woods; and so forth. Specifying the definite article along with the correct plural form removes all doubt.
The Olson’s Bzzt! Wrong! (That was what was on my friends’ door.) Any form using an apostrophe indicates the possessive, in this case possession by one single Olson. The implied sentence is “This is the home belonging to the Olson,” meaning the one and only Olson in the whole wide world; it also seems to present a question fragment: “The (one and only) Olson’s . . . what?” But there are many Olsons, and in this case, as in most others, more than one of them lived behind these walls.
The Olsons’ is the plural possessive, and like the previous example it seems to ask a question, but in this case the suggested meaning is “The Olsons’ house.” There’s nothing grossly incorrect about this, but it looks wrong. Why slant the identification toward the structure, when the purpose of the sign is likely to identify the residents, not the house itself?
I’m all for using the simplest form possible in such signage.
 You may not know the font, but you’ll notice if it’s something else. In fact, we sometimes see stop signs on private property (in malls, for instance) that are not provided by official sources and that look slightly different. I remember coasting through one of those once, whereupon Suzy admonished me that I had missed the sign. I told her, “Show me a real stop sign and I’ll show you a real stop.”
While I was an engineer at Motorola, I began editing the written work of others on a regular basis, and in doing so discovered my ability to tear into someone else’s writing and make it better without making the author feel bad. What I did wasn’t a customary or assigned part of my job, so it was never called anything as formally precise as copyediting. Instead, people called it “get Lynn to look it over,” meaning that I was expected to perform special favors for colleagues whenever asked.
The cycle would begin when someone in my department or a nearby cubicle dweller produced a report or proposal or some software documentation. The stuckee would wander by my office with a printed copy of first-draft-quality material and ask, “Hey Lynn, I’ve just finished writing this here massive tome that’s due this afternoon. Would you mind looking it over? Y’know, just to make sure I didn’t make no typos or nothing.” Apparently most people assumed I had nothing else to do, that the results of their labors were close to flawless, and that I could check over a seventy-five-page report in ten or fifteen minutes, maybe while eating lunch (which I never did, but that’s another story). I was always glad to help out because I enjoyed the work, and somehow I always managed to work it in with the other things I was doing.
As the process repeated itself, and those requesting my help saw their work returned with twenty or more edits per page, they discovered that I was actually pretty good at this looking-things-over business. That’s when some of them started showing up with their teenagers’ junior college term papers. I didn’t mind, especially if the students were trying their hardest to produce good work.
Rather than being insulted when I transformed their work from gobbledygook to something intelligible, authors were usually appreciative (or I wouldn’t have done it!), relieved that they hadn’t tried to submit their stuff without having another pair of eyes “look it over,” examining it critically.
That’s how doing other people favors came to be a part of my job that was never covered in performance reviews, but led to a new career in later years.
Note: This post is a duplicate of the article by the same title on my Neologistics Blog, but here is where I originally intended to put it. I decided that rather than moving it, I would just allow the duplication to exist.
(Image via Wikipedia)

One dismal February morning in 1962, near the beginning of the second semester of my freshman year at the University of Illinois, I arrived late for my early-morning English class, interrupting the proceedings as I climbed over students in the crowded classroom to make my way to my seat.
“Tedious journey, Mr. Newton?” asked the instructor, whose voice quivered with sarcasm like Paul Lynde’s.
“Not nearly so much as the destination, Mr. Prahlhans,” I replied, as I struggled to remove my wet overcoat.
At the university they offered new students two paths of study in basic academic subjects. I chose what was undoubtedly for me the wrong one, called DGS (for Division of General Studies) English. I adjudged the course to be trivial and the teacher to be loathsome. Always more concerned with spending my time on what I found interesting than with superfluous abstractions like grades, I limped by, cut class most of the time, and in the end managed to squeak out a D, despite having sufficient command of my native language to meet the university’s low standards.
The consequence for anyone getting a D or failing grade in their freshman English class, whether DGS or traditional Rhetoric, was being forced to take a class called Remedial English — a disgraceful subject to have to stand in registration lines to sign up for, and while I accept that I’d earned that humiliation for myself by my own actions, still I grumbled about it, and blamed the inferior course and teacher I’d had the previous year.
To make matters worse, no credit was given for Remedial English, attendance was mandatory (cutting twice for any reason whatsoever meant automatic failure), and no person would be permitted to graduate without having earned at least a C (I think) in that course. A person could repeat it as many times as necessary to accomplish that end. I was in academic debtor’s prison.
One relief was that there was no homework. We simply had to be present every session and listen, and we were required to write a series of six increasingly complicated essays in class, which the teacher then critiqued, graded, and returned.
For the very first exercise we had a choice of writing either about some issue of student politics on campus, about which I knew absolutely nothing, or about something having to do with Lyndon Johnson, who was then Vice President and about whom I cared equally little. Angry about the choices, on top of having to be there in the first place, and knowing that the best I could do was make something up, and so was bound to fail, I submitted an altogether stupid @#$! off-topic rant about having to write this stupid @#$! paper on this stupid @#$! topic about which I knew nothing, and having to take this stupid @#$! class. I didn’t include the expletives, but was thinking them.
To my surprise, the teacher graded my paper thoughtfully and intelligently, as if it were just another badly written assignment from a clueless student (which it was). He included some written advice on how I could cope with the rest of the semester’s work.
I no longer remember the name of the graduate student instructor, but for his calm handling of my tirade he deserves highest marks, perhaps even a meritorious service medal. He could have reprimanded me, and might have griped just as much from his own side of the divide about having to teach such a class to a roomful of morons and losers unqualified to do university-level work, who all needed to go get jobs pumping gas and stop spending their parents’ money by being in college.
He never knew that his thoughtful comments pressed a Good Attitude button in my head and triggered a permanent change in my life. Shortly thereafter my whole stance became transformed. I began to listen attentively to his carefully prepared and enthusiastically presented lectures, which constituted in toto a formal review of English, from basic grammar through advanced composition, over the course of a semester. As I listened and learned, the quality of my own writing escalated dramatically.
As a result, despite the no-credit shameful status of Remedial English, I have always looked back on taking this course as a highlight of my undergraduate experience, and in some respects a turning point in my life, because it imposed a need for me to come directly and intelligently to grips with the techniques of writing, today one of my deepest everyday concerns. What I learned then has served me well all my adult lifetime. And it’s worth noting, too, that for the rest of my academic career I never got anything but A’s on term papers.
 Note on the image I used here. By coincidence, the classroom in which this episode took place was located in the building entered through the door under the outstretched arm of the figure in the statue.
 I have since learned a great deal about Lyndon Baines Johnson, whose greatest importance came after the period of this story, and find him to be a fascinating character in US history.
What do authors Stephen King and David Foster Wallace have in common? As authors, other than having been successful, very little. Their work emanates from opposite ends of the universe, about as far apart as can be.
Their commonality from the perspective of this neologistician is that they are two writers about whom I know far more personally than I do of their written works.
David Foster Wallace I wrote about in the flippant pseudo-blog article Although of Course You End Up Becoming Yourself, which I created as filler to initialize this blog. As that piece points out, the book of the post’s title is about Wallace, not by him, and in that regard it is enlightening. To date I still have not actually read anything Wallace wrote, though doing so is high on my gotta-do-RSN list.
Neither have I ever read a single word of any novel or short story by Stephen King, although I have seen at least two movies made from his work — The Shawshank Redemption and The Green Mile — and greatly liked them both.
One reason I’ve never read any King is that for the most part his subject matter does not appeal to me. I have zero interest in horror stories, never have, and never will, other than a couple of well-crafted tales spun by Alfred Hitchcock that can be dispensed with in two hours’ viewing time. Fantasy, mystery, and science fiction generally leave me cold as well, though I’ve read isolated examples of all that I have found enjoyable. Horror, though, I find particularly objectionable for its blood and violence, so I have avoided it. Call me a moralist if you will, but I don’t see anything entertaining in reading about psychotics dismembering other human beings and the like, even though I know there are people in this world who actually do such things.
However, this morning I finished reading Stephen King’s non-fiction book On Writing: A Memoir of the Craft, in which King discusses, in a surprisingly informal tone, his life history and his substantial experience with writing, presenting just a few useful tips on getting it right for those who would follow in his path.
The first part of the book, titled C.V. (curriculum vitae), he devotes to a series of short vignettes, some less than a page, at first seemingly irrelevant tales from his early life experiences, ending with the sale of his breakthrough first novel, Carrie, written while he and his wife were still Maine-poor, living in a double-wide travel trailer. (I’ve lived there myself and know how that is.) In ways the sequence reminds me of James Joyce’s alter ego Stephen Dedalus in A Portrait of the Artist as a Young Man, in that it tells in increasingly adult-like fashion of the events that shape the subject from sponge-absorbent child to productive artist.
Having exhausted the subject to the degree King cares to discuss it, he then tells the story of his near-fatal accident on June 19, 1999, when he was hit by a van while out for a walk. Telling this is a sort of self-referential feat of a type I admire, in that King was in the process of writing On Writing when the accident happened.
In describing the drunken assailant Bryan Smith with commendable restraint, King tells of Smith’s cretinous decision, while waiting for emergency assistance to arrive, to leave the scene of the accident to buy candy from the corner store for himself and his rottweiler. King adds, “It occurs to me that I have nearly been killed by a character right out of one of my own novels. It’s almost funny.”
Correction. It is funny.
I’ll leave the details of King’s insightful views on writing as an exercise for the reader to discover. You can get the book from most any library or buy it on Amazon. At 288 pages it’s not long, and is an easy read. If you’re not willing to go to the trouble, suffice it to say that you are breaking King’s first principle, and are liable never to become a writer yourself. But that’s okay. Maybe you don’t want to become a writer.
To this little blurb I add one more postscript about David Foster Wallace that I didn’t include in my previous article about him. Wallace goes by his three-part name in writing, but is called only by his first name. For him this is certainly no problem, as no one is likely to assume he prefers to be called Foster. I also go by my three-part name in writing, and have since I was a child, but in my case, there are those who mistakenly assume I prefer to be called David or — curse those who are so presumptuous as to assume the uninvited familiarity — Dave.
I write about Wallace in the present tense, even though he is now dead, because a published writer accomplishes a form of immortality, and lives as long as his work remains.
And that’s about the only thing I have in common with David Foster Wallace, except that I also lived in central Illinois. But I don’t even play tennis.
When I hear or speak words, I see them spelled out in my head. Similarly, when I read I tend to see the letters in individual words, so that when called upon to read out loud, I rarely mispronounce words, unless I am outright unfamiliar with them.
Until recently, I have always supposed everyone does likewise. Upon inquiring of some other literate people, I was surprised to find that no one else I asked sees words.
Yesterday I heard my favorite NPR commentator Daniel Schorr use a word I have seen written but have never used myself, nor ever heard pronounced: “colloquy”, which is a conversation or a dialogue, particularly one that is formal or written down.
I was surprised to hear him say it with the first syllable accented, for until yesterday I had heard it in my head with the accent on the second syllable, as in the word “colloquialism.” Nonetheless, I saw the spelled-out word flash up in my head as if with a red flag, because I knew the word’s meaning, but its pronunciation turned out to be different from what I expected. (Many listeners probably know neither.)
A quick check of an on-line dictionary verifies the venerable Mr. Schorr’s pronunciation to be spot on.
At the same time I considered it to be a delightful coincidence that it was that particular word that would serve to demonstrate my apparently anomalous tendency, in that it gave me an opportunity to develop this blog entry on the topic, a blog itself being essentially a form of colloquy.
As I am writing this, I hear in my head Daniel Schorr’s precise and fatherly voice reading it back to me. (Dream on!) Will my quirkiness never cease?