A Month of “Um” Days

As writers go, I am slow and deliberate. Though I don’t often find it, I can spend hours looking for le mot juste. It’s not the ideal approach for a blogger, needless to say. So this month, as I hoard my psychic energies for a major writing and editing project (more about that later), I’ve had to make what is, for me, a difficult decision about this blog.

No, this is not a farewell to blogging, or even an announcement of a hiatus. Rather, it’s an explanation and an apology for what’s about to happen here for the next month. You see, rather than give up on writing the weekly, well-crafted post and go dark for 30 days, I’m going to do just the opposite. I will write a post a day (or more) until the end of November. But the writing of each post will be subject to a strict and, for me, highly challenging time limit—one half hour.

It won’t be pretty. I would expect that there will be more than a few grammatical gaffes, a bunch of stylistic infelicities, and the writerly equivalent of tons of “ums” (that’s “erms” for you Brits). Compared to my usual work, whatever you think of it, this month’s posts will be:

  • More personally revealing, less socially useful.
  • Suggestive rather than definitive.
  • Based on what I remember rather than what I research.
  • Written directly in WordPress rather than drafted in MacJournal.

My rules are pretty simple. I have only half an hour from start to finish to write the post. I will allow myself to mull the post topic over in advance and make a few notes, but no advance writing. And I will try to stay more or less on topic—no reflections on my misspent youth, no sports commentary, no streams of consciousness.

Well, there you have it. My time is up, and for better or worse, this post is done.


Worried That Journalist Robots Will Replace You? Say “I”

Angry Writing Robot by Brittstift/Flickr

They are not going away. After a flurry of attention last year, we hadn’t heard much since about the robots that were going to displace humans as content creators. Then last month, Steve Lohr of the New York Times revived the issue. Although the natural reaction of writers and editors might be fear, I think that’s the wrong reaction. The robots aren’t going to replace us; they’re going to free us.

Both Lohr’s article and a more recent series by Farhad Manjoo in Slate, “Will Robots Steal Your Job?” examine the efforts of IT startups to develop software that performs skilled, creative work such as writing. Two of those companies, Narrative Science and Automated Insights, are developing programs that churn through computerized data about sports and other topics and spit out news stories. Though I suspect it’s partly for entertainingly hyperbolic effect, Manjoo claims to be “terrified” that his livelihood as a writer is in peril.

In her reflections on the topic yesterday, and despite an opening feint at the “scary” job-threatening Internet, freelance writer Tam Harbert took a more optimistic approach than Manjoo. She’s skeptical of claims that software can win Pulitzers or successfully mimic the human element in journalism. Moreover, she sees some benefit in using software to replace those deadwood journalists who “don’t add any value” through their work:

“Writers, for example, who simply gather information, get a few comments from people and then regurgitate it onto the page, should probably start looking for another profession. As James W. Michaels, former editor of Forbes, was known to bellow: That is ‘not reporting, it’s stenography!’”

Though Harbert might not go this far, I’d put it this way: Computer-generated journalism is not terrifying, it’s liberating.

This is especially true in the world of trade journalism, where much of the work entry-level journalists are asked to do could be handled just as well by an algorithm. It doesn’t take long for the rewriting of new-product press releases to turn from an informative introduction to an industry into stultifying drudgery. The fact that trade publisher Hanley Wood is one of the companies working with Narrative Science is, to me at least, encouraging.

The way forward for journalists is not commodity content but uniquely personal content. You can already see this direction developing in the field. Though it wasn’t her intent, Stefanie Botelho stated as much last month in a Folio: article on “The New ‘I’ in Journalism.”

Botelho’s aim was to critique journalists who let their subjects be overshadowed by their own self-regard. But “ego preening,” as she put it, is a problem in all walks of life, not just journalism. That doesn’t mean journalism shouldn’t be conversational or personal. Why would we want to avoid the one thing that computers can’t convincingly do? That’s one reason, I’d guess, that Manjoo’s articles about robot job thieves are written so relentlessly in the first person and rely so extensively on himself and his family for examples.

As Harbert argues, what gives the journalist’s work true value is the human, personal perspective. Without the I, there’s no you. Without the I, there’s no conversation, no meaningful interaction. Without the I, journalism is just an exchange of data.

Should You Publish? A Tale of Two Melvilles

George Whyte-Melville via Wikipedia

Not Herman

Is what you write worth publishing?

Once upon a time, that wasn’t your choice to make. It used to be that the threshold to publication was as high as the transom. The only way most people could hope to cross it and break into print was through an unlikely toss over a publisher’s front door.

The Web, of course, has flung the door wide open, and there are few barriers to publishing left standing. One significant barrier, however, remains: the fear—or conviction—that your work isn’t good enough to deserve publication. Even though the power to publish is entirely in your hands, you may not use it.

Whether you call it the lizard brain, or the Resistance, or simply taste, most of us have personal quality filters that aim to eradicate anything we perceive as flawed. These filters are reinforced on a larger scale by devices like best-seller lists or books such as Andrew Keen’s The Cult of the Amateur, which either by effect or intent try to set limits on what is worth writing, publishing, or reading.

Wouldn’t we all be better off if we didn’t spend so much time writing, reading, or otherwise consuming second-rate content? Sure. But here’s the problem. While it’s obviously true that 95% of content is crap, what isn’t so obvious is which part is crap and which isn’t.

You may think you know. But statistically speaking, you probably don’t. Most of us just aren’t very good at judging the true worth of content.

This has always been the case. In the 19th century, for instance, literary judgment was often dead wrong. I realized this back in my grad student days. Once, when roaming the seventh-floor stacks of the enormous Olin Library at Cornell, I came across an impressive, luxuriously bound set of the complete works of a British writer named George Whyte-Melville.

Whyte-Melville, I found, was a popular and well-regarded 19th-century novelist. Though I was a student of that period’s literature and had been grinding through a book a day from that era for the previous year, I’d never heard of him. I sampled a few of his novels. They were, to put it charitably, unremarkable. Yet at the end of the century, some publisher had determined that there was enough interest in Whyte-Melville to justify an expensive set of his books.

At the very same historical moment, Whyte-Melville’s semi-namesake and near-contemporary, Herman Melville, had not a single book in print. Despite some popularity at the beginning of his career, he had fallen into near-complete literary oblivion by 1900. Twenty years later, the literary world finally came to its senses, and now the right Melville is justly celebrated, the other sensibly forgotten.

The fact that so many could be so wrong in their judgments of what’s worth publishing underscores for me the importance of simply publishing everything and letting circumstances and posterity sort out what’s really worth it.

Though I have mixed feelings about Dan Conover’s Xark attack last weekend on the literary establishment and its “tyranny of the smugly insignificant,” he’s right to urge creators not to worry whether they measure up:

“I’m calling on writers—professionals, amateurs, anyone who puts words together—to stop caring about what the literati think, write and say. Get over your insecure quest for ‘legitimate’ acceptance.”

Unlike Conover, I wouldn’t write off “MFA programs, book critics and humanities professors.” Nor do I think one should completely ignore Andrew Keen or one’s lizard brain. Now and then, they all have good points.

But in the end, when you’ve done the hard work and it comes to deciding whether or not to publish, the answer should almost always be, “Do it.” You might think it’s not good enough, but as history suggests, you might well be wrong.

Bloggers: Feel Free to Repeat Yourself

Hedgehog

Big ideas justify repetition

Imagine: After days of writer’s block, you’re suddenly inspired to write a long and insightful blog post. You’ve found the perfect illustration, and your headline is brilliant. You’re crushing it. Then, just before you click the publish button, a small blip of doubt appears on your radar. Somehow, what you’ve written sounds so familiar.

In a flash you remember: you’ve already covered this topic. The words are different, the examples are new, but the case you’re making is more or less the same.

So what do you do now? As I’ve suggested before, a concern that someone else has already made your point shouldn’t stop you from publishing. But what if the person who made the point was you?

Fear not: There may be very good reasons to publish anyway.

In the right circumstances, there is a strong rationale for repeating yourself. But before we leap blindly into the upside of repetition, let’s consider the downside.

1. You may be subtracting value, not adding it. Once in a while, the first time you express an idea, it’s so well put that any subsequent efforts diminish the impact of the original. If you can’t improve on it or extend it, just link to it.

2. You may be using your desire to repeat yourself as an excuse. You may have other topics or ideas that you know you need to address, but it’s hard work. Going back to your old idea is so much easier. If that’s the case, put it on hold and focus on the new ones. The old one will always be there if you need it.

3. You may lack new ideas. Maybe you need to get out more. If you aren’t actively engaging with your community by reading, asking, and listening, your ideas, old or new, won’t be relevant.

If your urge to repeat yourself survives these three arguments against it, take heart. There are at least three equally compelling arguments in favor of it:

1. If you’ve forgotten what you said before, so has your reader. So say it again. What makes ideas grow on people is repetition. One of the findings of Edelman’s 2011 “Trust Barometer” is, as Krishna De puts it, that “the more we hear something, the more likely we are to believe it—59% of respondents will believe the information they receive if they hear it 3 – 5 times.”

Last week, Ardath Albee suggested that even more repetition may be required for maximum retention: “Here’s the dirty little secret about repetition: It takes 5 – 12 repetitions of an idea to make it stick.”

2. You’re not repeating, you’re refining. Most ideas aren’t hermetically sealed packages of eternal truth. Instead, they evolve and grow. The blog format is ideal not only for documenting this growth process, but also for enhancing it through interactions with and feedback from others.

Do all those earlier iterations of an idea in a blog become disposable the moment the latest version is published? Not at all. In fact, for me, one of the glories of the blog format is the way it allows readers to go back and follow the development of an idea over time. In blogs like Joe Pulizzi’s Junta42 or Jeff Jarvis’s BuzzMachine, to cite two very different examples, reading the earliest posts and moving forward through time reveals a detail and depth in their ideas that wouldn’t exist without repetition and reworking.

3. Your idea is so important that it’s all you need. There’s nothing wrong with one-trick ponies if the trick is really good. Long ago, the philosopher Isaiah Berlin wrote a short book on Tolstoy called The Hedgehog and the Fox. The title was inspired by an ancient Greek fragment that says “The fox knows many things, but the hedgehog knows one big thing.”

The insight Berlin drew from this was that there are two kinds of thinkers. One, the fox, gets his brilliance from the many different ideas he throws out for consideration. The brilliance of the other, the hedgehog, is based on one very big, complex idea that he devotes himself to exploring and explaining. If you’re a hedgehog, repetition is an asset, not a liability.

There are probably more than these three reasons not to fear repetition in your blog posts. If you can add one in the comments here, please do. But otherwise, feel free to paraphrase Walt Whitman and repeat after me: “Do I repeat myself? Very well then I repeat myself.”


Writing, Photography, and the Art of Thinking Visually

As some of my recent posts suggest, I’m a big fan of adding visual elements to written content, whether with infographics, illustrations, or photos. For the last few weeks, though, I’ve been wondering if I’m not putting too much stress on visual media. The graphic arts are brilliant tools for communication, yes, but words are every bit their match.

Camera with "words" in lens

What started me worrying about this matter was a casual comment by Nieman Journalism Lab’s Justin Ellis. In his opening for an article on the quite different subject of photography’s potential to mislead, he made a “painful” admission. There are times, he wrote, “when photos can tell more of a story than words could ever express.”

Sometimes the urge for a good lead makes you say things you don’t quite mean. But even if Ellis believes his claim, I’m not buying it. Writing can tell a story just as powerfully as a photo. But that’s only true if the writer learns to see and write in a visual way.

One of the reasons a photograph can seem so powerful is that it captures details of an event that many news or business writers might not think pertinent or appropriate—a facial expression, the relation of people to their surroundings, the sense of place. But writers can see those same details. They just have to recognize their value and put them in their writing.

One writer who does so brilliantly is Steve Coll. Here is his opening paragraph from “The Casbah Coalition” in the April 4th issue of The New Yorker:

The office of the Prime Minister of Tunisia is situated in a three-story white-washed building with an arched Moorish entry. It faces north onto the Casbah, a plaza in the old quarter of Tunis. The view from the Prime Minister’s window is normally serene, taking in a tiled fountain and pruned ficus trees, but, by the afternoon of a day in late February, thousands of citizens had transformed the Casbah into what looked like a squatters’ camp. They had organized a round-the-clock sit-in to demand the resignation of Prime Minister Ghannouchi, and they were joined each weekend by large numbers of like-minded protesters. The fountain was completely covered by tents; ropes hoisted tarps from the trees.

This is visual writing, but it is not simply a snapshot of what the reporter saw. It sandwiches two views together—the ordinarily serene picture of the Casbah and an extraordinarily chaotic one. It shows the collision of stasis and change, a process of transformation unfolding before our eyes.

I’m not suggesting that writers don’t need or shouldn’t use photographs or other illustrations in their work. Rather, I’m arguing against two dangerous temptations for writers.

First, the simple availability of visual media should not tempt us to leave the visual element out of our writing. It’s a false choice anyway: I suspect that if you can’t write visually, you won’t be very good at choosing graphics either.

Second, one medium is not inherently superior to the other. They are not categorically different, but lie along a continuum of representational media.

In the end, the key is learning to think with your eyes. The more you do, the better both your writing and the graphics you choose will be.