But I learned in second-grade English …

Que Seurat, Seurat: He probably didn’t learn this in art school.

As one who writes – mostly advertising – for a living, I hear that from people. A lot.

“I learned in second-grade English that you can’t start a sentence with ‘and’.” Or, “I was taught that it’s not a complete sentence unless it has a subject and a verb.” Or, “Miss Fussbudget drilled it into my head that you need to start out with a thesis statement.” Sometimes, even, “Mister Puffbuttock always said you should never use contractions in your writing.” Well, all of those things are probably true.

We do learn, early in our education, that it’s a “rule” that we don’t begin sentences with conjunctions. It’s a “rule” that a complete sentence contains a subject and predicate. Many things you write in school should, indeed, use the “keyhole” structure that builds a first-paragraph thesis statement. In certain forms of writing, you should avoid contractions.

When these issues come up, I like to think about a kid named Dale.

While at his North Carolina high school, young Dale played football, basketball, baseball and golf. He probably took English from Miss Fussbudget.

He dreamed of becoming a professional athlete. And he took driver’s training, where he learned the “rules” that a beginning driver must learn. Signal a lane change. Stay at or under the speed limit. Allow one car length per 10 mph. Hands at 10 and 2.

In his 20s, young Dale started fooling around with friends from work – doing odd jobs around a race track. And the racing bug bit him.

Retired since 2008, Dale Jarrett is still considered one of the top 10 all-time NASCAR drivers. He didn’t get there by following the rules of the road he learned in driver’s ed. He followed the rules of the track.

If you think about it, Georges Seurat’s second-grade art teacher would probably have been horrified by the rule-breaking of pointillism. (And don’t get me started on cubism.) Considered one of the greatest works of English literature ever, James Joyce’s Ulysses would be nothing but red marks if judged by the standards of middle-school English composition. The boundary-breaking improvisation that defined be-bop would have been frowned upon by young Dizzy Gillespie’s music teachers at the Laurinburg Institute.

In our early school days, we learn rules for the “science” of writing. Sadly, this is as far as writing education goes for most people. Those who pursue careers in writing, however, go well beyond that basic training and study the “art” of writing – just as Gillespie studied the art of the trumpet, Seurat the art of visual composition and Jarrett the art of stock car driving.

And in that advanced study, we learn it’s okay to start a sentence – even a paragraph – with a conjunction. A “fragment”? It can be a sentence. Contractions aren’t forbidden. Only a thesis-style paper should begin with a thesis statement.

It’s not necessarily breaking rules. It’s more a matter of having the skill and wisdom to know which rules can be bent, and by how much. It’s knowing what works, and, when appropriate, what wins.

In the marketing communications business, it’s a matter of playing by the rules of the track, rather than the rules of the road. On this particular track, the field is extremely crowded. The competition is fierce. The prize is your customer’s attention, interest and action. And the rules of the road probably won’t win them.

Go back and re-read great sections of your favorite novel or your favorite poem or favorite song lyrics. Watch your favorite movie, and imagine the dialog as it was born – as words written on a page. Pay attention not only to specific words and their meanings, but to their rhythm and cadence. Notice how the “rules” are bent to reflect the style of the writer, the personality of the character … and the rhythm of the character’s life. That’s what good writing does.

Good advertising writing reflects the style of the brand, the personality of the customer and the rhythm of the customer’s life. Those are skills that Miss Fussbudget couldn’t teach you in second grade. Because they take many years of practice to learn – after you’ve learned the basic rules first.

And when it comes to starting a sentence with a conjunction … you don’t need to take my word for it. The first chapter of Genesis in the King James Bible – arguably the most “formal” of all English translations – consists of 34 sentences.

Thirty-two of them begin with “And.”

And if it’s good enough for God, it’s good enough for me.

This was originally written as a newsletter article for the Princing & Ewend newsletter, way, way back in August of 2004. It has been updated. 

The Story of the People Who Told Stories

Storytelling is not dead yet. In fact, it’s getting better.

Once upon a time, we grew up learning things from people who told stories. We entertained ourselves with people who told stories. We enriched our minds, constructed our faiths and built our businesses through people who told stories.

Then, one day, from deep within a dark, foreboding cave near San Jose – unearthing itself from decades’ worth of Red Bull cans, pizza boxes and Foosball tables – the big, bad Internet reared its pixelly head. It gobbled up all the storytellers. Then it pooped them all out again in pretty JPEGs, cool GIFs and 200-word chunks of bullet points.

The end.

At least, that’s the conclusion we’re offered in a recent post on Adobe’s cmo.com in which Pum Lefebure, chief creative officer at Design Army, declares that “Storytelling is dead.”

The post cites a study released by Microsoft a few months ago that has received plenty of publicity on its own. It declares that the “human attention span is 8.25 seconds,” shorter than that of a goldfish.

It’s time we all stopped using this study to declare that the human neurological, intellectual and emotional pathways that developed over 200,000 years of constantly changing and evolving media have been completely reconfigured by the Internet in just 20 years.

Without wading too deeply into the weeds of the study itself, let’s just say that the testing was measuring brainwave activity of people assigned a “gamified” task: noting the sequence of different objects, or classifying vowels and consonants and odd or even numbers. Not exactly the kind of riveting activity one plans for a big Friday night.

Sadly, the “Methodology” section of the Microsoft study does not explain how they measured the attention span of a goldfish.

And the 8.25-second figure is for only one type of attention: Transient attention, a short-term response to a new or different stimulus that distracts you from something else. This is actually a good thing, because it means we are able to more quickly transition back to selective sustained attention – the task we were focusing on. The study, curiously, doesn’t offer a duration on this … most likely because it wouldn’t offer the clickbait potential of a comparison to a member of the carp family.

There’s more than one way to skin a fish.

In all fairness, much of the advice Ms. Lefebure offers marketing communicators as a “replacement” for “storytelling” is sound advice. Eliminating unnecessary messaging is always a good thing to do – including a tagline if, indeed, it isn’t needed. Understand the value of less as more. And, by all means, don’t “dumb down” your material and spoon-feed your audience. Leave something to the imagination to draw them in.

After all, those just happen to be some of the key roles of, um, storytelling.

The examples that accompany the article – all very fine work – don’t demonstrate the death of storytelling; they simply represent another way – a more visual way – of telling a story. Sometimes this is the most effective way. Sometimes it’s not.

It depends on what story you’re trying to tell. And to whom you’re telling it. A poster for a ballet company’s performance of Alice in Wonderland isn’t exactly telling the same story, to the same audience, as a message to a medical device manufacturer suggesting using a silicone elastomer over PVC, latex or TPE. Nor is a brochure designed to get people – most of whom have no previous connection with the organization – to contribute to a local Habitat for Humanity chapter’s endowment because it was established as a memorial to a beloved former mayor.

Those might need a different kind of story.

‘I’m not dead yet!’

Of course, this is nothing new. I’ve been a marketing communications writer for 39 years. For almost all that time, I’ve heard people say: “Nobody will read anything that long.” “People don’t have the attention span for copy that long.” And, yes, “Storytelling is dead.”

By people selling the next big media thing – and, of course, art directors – storytelling has been rumored dead more than Paul McCartney.

The Internet world didn’t create the shorter-is-always-better shibboleth. But it’s determined to accelerate it – hence studies that confirm our shrinking attention spans. (These studies, of course, reinforce the need for devices and software to enable that acceleration. See! They’re telling a story!)

But are people really too impatient to read more than 100 words? Let’s find out, by looking at the top 10 on the New York Times’ best-selling fiction list for one week in June of this year:

  1. Grey: Fifty Shades of Grey as Told By Christian, 576 pages
  2. The Girl on the Train, 336 pages
  3. The Rumor, 384 pages
  4. Tom Clancy Under Fire, 512 pages
  5. Country, 336 pages
  6. The President’s Shadow, 416 pages
  7. The Martian, 387 pages
  8. All the Light We Cannot See, 531 pages
  9. Finders Keepers, 448 pages
  10. In the Unlikely Event, 416 pages

At an average of 250 words per page, those 10 novels average 108,000 words each. And a hell of a lot of people have either bought or downloaded them. It’s not that people won’t take the time to read – or listen to, or watch – a story. They just won’t waste their time with one that’s not told well. E.L. James, apparently, being an exception.
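If you want to sanity-check that arithmetic, here’s a quick back-of-the-envelope calculation in Python using the page counts listed above. (The 250-words-per-page figure is the same rough estimate used in the paragraph above, not a precise typesetting fact.)

```python
# Page counts for the 10 best-sellers listed above
pages = [576, 336, 384, 512, 336, 416, 387, 531, 448, 416]

WORDS_PER_PAGE = 250  # rough estimate, same figure used in the text

avg_pages = sum(pages) / len(pages)      # 434.2 pages
avg_words = avg_pages * WORDS_PER_PAGE   # about 108,500 words

print(f"Average length: {avg_pages:.0f} pages, roughly {avg_words:,.0f} words")
```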

People won’t pay attention to marketing communications messages that don’t tell an interesting, engaging or compelling story. But if you’re telling a good one (or a titillating one that taps into repressed S&M fantasies), people will take the time to read it.

We’re wired for it. Since language developed, we have explained our world, built our businesses, made our reputations and educated our children through the telling of stories. And stories are like products – the ones that survive are the good ones.

I think we’re all a little tired of “brand storytelling” as a puffed-up cliche to describe copywriting. But the process of storytelling is far from dead. As more and more people try to tell more and more stories – and, as they try to tell them the wrong way, or tell them poorly – the ability to tell a story well is growing more important every day.

The businesses that tell them well are the ones that will live happily ever after.

The Refractive Index of Time

The Confederate flag represents evil. That’s why we can’t hide it away completely.


Here’s a fun little experiment.

On a piece of paper, draw a line with an arrow pointing to the left. Tape this to the backsplash of a kitchen counter. Fill a clear, smooth glass with water and set it on the counter. Now look through the glass of water at your arrow. It will be pointing to the right.

Much of the discussion over the removal of the Confederate battle flag from state houses – and, apparently, everywhere else – is an excellent example of how time refracts and distorts the events and beliefs of the past, just as the water in the glass gives the arrow you drew a completely different meaning.

The removal of what most of us know, incorrectly, as “the Confederate flag” or “the stars and bars” from the South Carolina state house is a good thing, as will be its furling at other state government facilities – and its removal from state flags – throughout the South. To the State of South Carolina, there was one reason and one reason only for seceding from the U.S., and it’s best expressed in its declaration of cause:

[The] ends for which this Government was instituted have been defeated, and the Government itself has been made destructive of them by the action of the non-slaveholding States. Those States have assumed the right of deciding upon the propriety of our domestic institutions; and have denied the rights of property established in fifteen of the States and recognized by the Constitution; they have denounced as sinful the institution of slavery …

It bears noting that the “domestic institution” here is slavery, and the “property” it refers to is other human beings. While we can pretend that the casus belli of the Civil War was “states’ rights,” it’s crystal clear from that declaration – and those of the other Confederate states – that the only right the states were concerned with was slavery.

The Myth of the Lost Cause

The fiction of a higher cause arose quickly after the war. It was, as Nolan Finley notes in the Detroit News, enabled, if not directly advanced, by the US government. Lincoln and his successor, Andrew Johnson, understood the ancient strategy of allowing a vanquished opponent to save face; the myth of the “lost cause” over states’ rights helped pave the way for Reconstruction.

As we have grown more distant, though, the refractive index of time has given the “lost cause” a luster that outshines the dark reality of human bondage beneath it. The fact that the long-retired “stars and bars” were unfurled above southern state capitals in reaction to the civil rights movement a century after the Civil War reminds us that the darkness was never very far away.

But we must also consider how the refraction of time has changed our perception of other aspects of that war – most significantly, the men who fought it, and their reasons for fighting.

Only 20 percent of the CSA’s troops were conscripted. Most were there for reasons that are difficult for us, today, to understand.

The United States: Plural

Before the Civil War and the unprecedented rise of federalism it brought, the “United States” were a group of individual states in the classical sense of the word: sovereign nations, banded together like NATO or the European Union. As Shelby Foote notes in Ken Burns’ Civil War documentary, before the war, people said “the United States are.” After the war, they said “the United States is.”

A century and a half later, it’s hard for us to understand how generals in the U.S. Army would resign their commissions to fight for Virginia, or how an average citizen would put his loyalty to South Carolina above his loyalty to the nation. But at the time, nearly everyone did. It was, in large part, Lincoln’s gift for oratory that drew the northern states around the cause of “union.” After all, only 30 years earlier, the State of Ohio and the Territory of Michigan declared war on each other.

A related concept that’s also difficult for us to understand today is the premodern concept of “duty.” To us, today, it’s a rarity, something we see in the “few good men” of the Marine Corps and in a select few occupations and situations. Society has changed enough that most of us cannot imagine marching shoulder-to-shoulder into a storm of flying lead. But at that time – and, really, up through World War One – men could not only imagine it and romanticize it. They did it.

The common Confederate soldier did not own slaves, nor did most of his officers. In fact, only six percent of all the people in the South owned slaves. The South’s economy depended on slavery, so all had some financial skin in the game. But it was still, as are most wars, a rich man’s war and a poor man’s fight.

Most of the 1.6 million Confederate soldiers who fought, the 80,000 or so who died and the 137,000 who were wounded were there because their sense of duty compelled them to fight for their home states. “My country, right or wrong” – and at the time, one’s state was as much, or more, one’s country than the US.

Racism Was Not a Southern Monopoly

Secession was driven by a desire to perpetuate slavery. That slavery was based on the premise that the Negro was a subhuman class of animal; it was inherently racist.

But also lost in the refraction of time: Racism was not exclusive to the South.

The abolition movement was most emphatically not an anti-racism movement. Many white abolitionists believed the Negro was morally and intellectually inferior to the Caucasian – as did Lincoln. This belief was at the time, in fact, the subject of what we now call scientific consensus, and it was an article of faith for many mainline Christian denominations.

At a political level, the Civil War was over slavery. At the personal level of the infantry soldier, it was over duty to his sovereign state. But neither Confederacy nor Union was innocent of the stain of racism. If we cast ourselves back to 1861, the stars and stripes are almost as racist as the stars and bars; they just don’t stand for slavery. Racism didn’t magically disappear in the North after the Emancipation Proclamation or the 14th Amendment; it has not yet disappeared to this day.

The Confederate Navy Jack is an ugly remnant of a nation founded by acts of treason over the right to own other people. This is why it should not be flown over state houses.

But one thing contemporary Americans do well is take a good thing too far. And that’s what we’re doing with the wholesale removal of the flag from everywhere – not to mention the attendant madness such as the exhumation of Confederate generals, even one as loathsome as Nathan Bedford Forrest. That flag is an important part of the family histories of millions of Americans whose ancestors bravely followed a sense of duty to their government – misguided as that government may have been. This nuance – the First Amendment aside – is why it should not be banned outright, and why people should be free to display it on their property as they choose.

Most importantly, though, that flag is also an important part of the history of the United States of America. It should not be purged and hidden away as an embarrassing secret from our past.

Because it serves as an important reminder of the cruelty, inhumanity and evil we are all capable of hiding under a cloak of tradition, custom, economic expediency or political demagoguery. This is a time when we need that reminder more than ever.

It’s the End of the World (as Glenn Beck Knows It)

But I feel fine.

It’s been both fun and somewhat horrifying reading headlines and social media posts the last few days. Here’s a sampling:

“Five lawyers overruled 2.7 million Michigan voters,” says Michigan State Rep. Gary Glenn.

“This irrational, unconstitutional rejection of the expressed will of the people in over 30 states will prove to be one of the court’s most disastrous decisions,” says former Arkansas Gov. and presidential wannabe Mike Huckabee.

“The country as we know it is done,” says TV and radio host Glenn Beck.

“Now the destruction of the family begins,” says Michigan GOP committeeman David Agema.

“So sad we seem to keep going down the drain faster and faster, But God is not mocked!” says a comment on a friend’s Facebook page.

“June 26, 2015: the day the twin towers of truth and righteousness were blown up by moral jihadists,” a tweet from American Family Radio personality Bryan Fischer.

“This is indeed a rogue act by the SCOTUS which effectively ends Western Civilization as we know it,” a Facebook post by Michigan activist and reputed pastor Stacy Swimp.

And my favorite, from both the Breitbart Facebook page and from Texas Sen. Ted Cruz: “Darkest week in America’s history.”

Let’s let that one sink in a little. The “darkest week in America’s history.”

I’m relieved that we made it through the “darkest week in America’s history” without the 7,000 deaths of the first week of July, 1863. Or the 2,500 deaths of the first Sunday of December, 1941. Or the 2,900 from the second week of September, 2001. Not to mention the 8,000 deaths of Galveston in September of 1900, the millions in poverty after October, 1929 or the 3,000 dead and 225,000 homeless of San Francisco in April, 1906.

We’re now hearing from the same bloviators who complain about the excessive self-absorption of the so-called “Me Generation.” They are apoplectic: A ruling on a badly worded provision of a two-year-old, poorly written healthcare law, and the ability of four percent of the American population to have their relationships legally recognized, overshadows dozens of wars, recessions and depressions, natural disasters … even the Harding and Nixon administrations.


Now, the “overruled the voters” complaint is an odd and disingenuous expression of intellectual inconsistency. It is, of course, SCOTUS’ job to “overrule” voters when voters – and their elected representatives – pass laws that do not meet the standards of the constitution. They overruled the voters in Brown v. Board of Education and Loving v. Virginia. Of course, those cases would probably cause the same wailing and gnashing of teeth among the same people today. But these very same people were beside themselves with joy when the Court overruled the voters in Bush v. Gore and in Citizens United v. FEC.

But the other overly dramatic complaints? They would be silly, if they weren’t such an insult to the people who have suffered or died because of one of the aforementioned dark weeks, or many others, in America’s history.

Less hubris would allow them to remember that every week, the families of 47,000 Americans each face their darkest week, as they lose a father, mother, grandparent, sibling or child.

For the record, my positions on these two decisions are pretty much 180 degrees from those of Chief Justice Roberts. I am 100-percent behind the decision on marriage. And I believe the Affordable Care Act is a deeply flawed law that, while addressing a few of the symptoms of a huge disease, will ultimately only make the disease worse.

But now that we’ve had a couple of days to take a breath, I think all the conservatives who have called this “the darkest week” in our history, “the beginning of the end of our republic” and all other manner of drama-queen hyperbole, need to consider something.

A few paragraphs ago, I ticked off a handful of “dark weeks” in which tens of thousands of people lost their lives and many times that many lost their homes. America is still here.

In fact, this republic is one tough old bird.

It survived an invasion, less than 40 years after its founding, by what was then the world’s greatest superpower. It survived a war with its southern neighbor, two world wars, Korea, Vietnam and dozens of other incursions – not to mention a civil war that literally ripped the nation apart. It survived – and won – a 40-year stare-down with the Soviets.

It survived three major economic crashes – one accompanied by the nation’s most devastating drought – and dozens of smaller ones.

It survived the assassinations of four of its leaders and attempts on several others. It survived hurricanes, tornadoes, floods, earthquakes, volcanic eruptions and explosions.

Ted Cruz, Bryan Fischer, Rush Limbaugh, Bobby Jindal and other so-called patriots have declared that this republic will crumble. Because the GOP still hasn’t been able to put together a real healthcare cost-containment solution. And because all Americans can now legally marry the ones they love.

America has proven that it’s stronger than that. And true patriots believe it.

The Most Dangerous Places in America

Is Benton Harbor really the most dangerous place in Michigan?
Photo from homesnacks.net

Yet another website has published a list of “most dangerous places.” A Facebook friend tipped me off to this one, which listed the “most dangerous cities” in Michigan.

These sites – and many news organizations – do the same thing. They pore over the FBI’s Uniform Crime Reports, compile violent crime rates from them, and then rank them – usually against other cities of comparable size.

So this particular site that a friend tagged for me said the “most dangerous” city in Michigan was Benton Harbor … which had, in 2013 (the most current statistics available), three homicides and 225 violent crimes.

Detroit, on the other hand, which ranked third on the list, had 316 homicides and 14,500 violent crimes. Flint, all the way down in sixth place, had 48 homicides and 19 violent crimes.

Now, before I reveal what the most dangerous places in the U.S. really are, let me help you understand the flawed logic behind these ranking sites.

Statistics Don’t Lie. But the People Who Use Them Do.

Let’s start by naming the top five home run hitters in Major League Baseball history:

1. Mark McGwire
2. Babe Ruth
3. Barry Bonds
4. Jim Thome
5. Ralph Kiner

What’s that, you say? Everyone knows the top five are Bonds, Aaron, Ruth, Rodriguez and Mays? How can this be?

It’s the difference between incidence and rate. Mark McGwire has MLB’s highest home-run percentage vs. at-bats; Ruth is second. This logic, then, dictates that Benton Harbor is “more dangerous” than Detroit, that Camden, N.J. is “more dangerous” than Chicago.

But basing the average person’s “danger” on the rate presumes that violent crime is like the lottery, and every person in the city has bought a ticket.

Personally, I think a city in which there were 300+ murders and 14,000 violent crimes is far more “dangerous” than one where three people were killed and 225 robbed or assaulted.

But then, I think Barry Bonds hit 180 more home runs than Mark McGwire did.
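To make the incidence-versus-rate distinction concrete, here’s a rough sketch in Python using the 2013 crime counts cited above. The population figures are my own round assumptions – roughly 10,000 for Benton Harbor and 700,000 for Detroit – so treat the output as illustrative rather than official.

```python
# Crime counts are the 2013 FBI figures cited above;
# populations are rough, assumed round numbers for illustration only.
cities = {
    "Benton Harbor": {"population": 10_000,  "homicides": 3,   "violent_crimes": 225},
    "Detroit":       {"population": 700_000, "homicides": 316, "violent_crimes": 14_500},
}

for name, c in cities.items():
    # Rate per 100,000 residents -- the number the ranking sites lean on
    rate = c["violent_crimes"] / c["population"] * 100_000
    print(f"{name}: {c['violent_crimes']:,} violent crimes (incidence), "
          f"{rate:,.0f} per 100,000 (rate)")
```

Divide by a small enough population and Benton Harbor edges out Detroit on the rate – roughly 2,250 versus 2,070 per 100,000 under these assumptions – even though Detroit recorded more than 60 times as many violent crimes.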

The Most Dangerous Places

But let’s look at the ultimate danger – that is, death. Based on national death rates per 100,000, here are the most dangerous places in America:

1. Fast food restaurants (cardiovascular disease: 252)

2. Places that sell tobacco products (chronic lower respiratory disease: 47.2; malignant neoplasms of trachea, bronchus and lung: 49.4)

3. Your car (motor vehicle accidents: 41.3)

4. A workplace, or home, that presents a danger of falls or exposure to toxic materials, or a lake or swimming pool (nontransport accidents: 29.3)

5. Your pantry (diabetes: 23.9)

6. A hospital (hospital-acquired infections: 23.0)

7. The dark recesses of depression (suicide: 13.0)

Homicide, the gold standard of “danger” for place-ranking websites, comes in nationally at 5.1 deaths per 100,000 – just a little behind alcoholic liver disease (5.7).

Ponder number seven another moment, though. If you’re going to die at someone’s hand, nationally, it’s roughly 2.5 times as likely to be by your own hand as by someone else’s.

Yes, Detroit’s homicide rate per 100,000 is significantly higher than the national average, at 45. But even in Detroit, you’re 5.6 times as likely to die from heart disease as from a homicide.
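For what it’s worth, those two multipliers fall straight out of the rates quoted in this piece – here’s the arithmetic as a tiny Python check.

```python
# National death rates per 100,000, as quoted above
suicide_rate = 13.0
homicide_rate_national = 5.1

# Detroit comparison, using the rates quoted above
heart_disease_rate = 252     # cardiovascular disease, national rate per 100,000
homicide_rate_detroit = 45   # Detroit's homicide rate per 100,000

print(f"Suicide vs. homicide, nationally: "
      f"{suicide_rate / homicide_rate_national:.1f}x")        # ~2.5x
print(f"Heart disease vs. homicide, Detroit: "
      f"{heart_disease_rate / homicide_rate_detroit:.1f}x")   # 5.6x
```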

What’s the Harm?

It bears noting that on its Uniform Crime Reports website, the FBI specifically cautions against using the statistics for ranking locations against each other. There are lots of reasons, not the least of which are inconsistency in reporting methodology (because the FBI feels the number is underreported, Chicago shows zero rapes for 2013) and jurisdictional issues (the Sandy Hook shootings do not show up in Newtown, CT’s 2012 statistics – because the case was handled by the state police).

But we do it anyway. Why? I believe it to be a grotesque manifestation of Americans’ seemingly insatiable fetish for ranking, ratings and lists. Like quarterback ratings, college football polls and standardized test scores, they really tell us nothing but how something was arbitrarily measured at a given time. And, as we know, not everything that’s measured is important … and not everything that’s important can be measured.

So it could be considered harmless fun. If it were, indeed, harmless.

But it isn’t. It makes it even more difficult for cities to pull out of the decline that has led to high crime rates. You know how a lottery ad portrays everyone who plays as a winner? These ratings portray everyone who sets foot in the city as a violent crime victim. The characterizations are equally misleading.

Worse, it helps Americans do even more of what is one of their most self-defeating behaviors: being afraid of the wrong things. We have millions of preventable deaths because people are too afraid of the stranger at their door to be concerned about the cheesecake in their fridge.