Writing for the Web: The Human Algorithm and Zero-Sum SEO

[Photo: Rockhopper penguin © Samuel Blanc, CC-BY-SA-3.0 (www.creativecommons.org/licenses/by-sa/3.0), via Wikimedia Commons]

Do you write for the Penguin, or the human?

I sometimes fear that search-engine optimization (SEO) is the only aspect of new media that people have really cottoned to. Not that they’ve understood it, necessarily, but that they feel it is both justified and essential. It is something they simply accept.

But for any content creator, SEO (as most people practice it, at least) is the kiss of death. If you want your content to work, write for people, not for search engines.

I was reminded of this at last week’s SIPA meeting. In the course of a wandering and inconclusive presentation on writing for the Web, one of my fellow audience members asked the room, “Does anyone here think SEO isn’t important?” Out of perhaps 20 editors and writers in attendance, I was the only one who raised a hand.

This struck me as both worrisome and curious. No one there was particularly enthusiastic about SEO or how it aided their craft, but all glumly accepted its necessity.

In my defense, I argued that SEO is a losing game. The moment you achieve that precious optimization, Google changes its algorithm and reverses all your gains.

I might have added that it’s also frequently a zero-sum game. That is, whatever you gain from writing for search engines, your site visitors lose in the form of irrelevant or shallow content.

My cynicism about SEO doesn’t mean I’m not in favor of marketing your content. Most writers, I think, need to do more marketing to potential readers, not less. But both parties should gain from that marketing effort. You should want your visitors to find your content because it’s exactly what they need, not because you successfully gamed a search engine.

Another way to put this is that, as a writer or journalist, you should worry less about Google’s algorithm and more about the human algorithm.

That, to me, is the takeaway from Guillaume Bouchard’s recent Search Engine Watch article on Google’s Penguin update.

Bouchard argues that the “solution for not getting pummeled every time Google changes its algorithm is to focus on providing the best possible relevancy to users.” You should focus on users, not SEO, in creating your content, he says, because “people, not just machines, have to get something out of it.” The best strategy for bringing your content to the attention of your target readers, he suggests, is to make it clean, clear, and useful.

That’s good SEO advice. Just as important, it’s good writing advice, too. Take it, and both you and your readers will avoid the zero-sum game.


Signals of Quality vs. Good SEO

Last month, I wrote about a discussion on an episode of This Week in Google (TWiG) featuring Google’s Matt Cutts. I noted that Cutts seemed to say that Google was aware of the rise of so-called content farms like Demand Media and that it would adjust its search algorithm so that low-quality commodity content didn’t overwhelm better material.

The following week, TWiG host Leo Laporte cited my article at the start of an expanded discussion of Google’s intent regarding content farms. In a clip from the episode, Jeff Jarvis speculated that Google will “try to get more links to original content . . . and have signals of quality.”

What that means, he said, is that “if all you do is rewrite the 87th page about how to fix your toilet,” no matter how great your search engine optimization, you shouldn’t rise up in the search results. Instead, “Bob Vila’s original masterpiece about fixing toilets should rise up because it’s original and high quality.”

As Jarvis suggested, Google isn’t directing this effort against Demand Media or other content producers per se. Rather, it’s trying to ensure that quality content always rises to the top, regardless of who creates it and what SEO tactics are used. In other words, it’s pretty much business as usual for Google.

The entire episode can be viewed at Twit.tv.

Google to Rein In Content Farms?

[Photo: Matt Cutts on This Week in Google. Caption: “Matt Cutts: Raising the Bar”]

Is Google poised to slow the growing domination of its search results by content farms like Demand Media and Associated Content? At the end of last Saturday’s episode of the podcast This Week in Google, Matt Cutts, the head of Google’s Webspam team, suggested that it would: “If your business model is solely based on mass-generating huge amounts of nearly worthless content, that’s not going to work as well in 2010.”

Cutts’s remark came in response to a question from host Leo Laporte near the end of the episode. Though, as he glancingly noted, Laporte had only learned about Demand Media a week earlier on his This Week in Tech podcast, he left no doubt about where he stood on the merits of its approach: “it seems like a way to game Google by creating a lot of pages with . . . barely adequate content in a niche area [in order] to drive traffic.”

Though Cutts avoided taking a position on Demand Media itself, he made it clear that Google was looking to address the problem in general:

“Within Google, we have seen a lot of feedback from people saying, Yeah, there’s not as much web spam, but there is this sort of low-quality, mass-generated content . . . where it’s a bunch of people being paid a very small amount of money. So we have started projects within the search quality group to sort of spot stuff that’s higher quality and rank it higher, you know, and that’s the flip side of having stuff that’s lower-quality not rank as high.”

In response to a question from co-host Jeff Jarvis, Cutts gave some specific ideas about how Google might try to adjust for the content-farm effect:

“You definitely want to write algorithms that will find the signals of good sites. You know, the sorts of things like original content rather than just scraping someone, or rephrasing what someone else has said. And if you can find enough of those signals—and there are definitely a lot of them out there—then you can say, OK, find the people who break the story, or who produce the original content, or who produce the impact on the Web, and try to rank those a little higher. . . .”

Jarvis, it should be noted, is not a cookie-cutter critic of Demand Media. He argued that Demand’s system for determining what content readers and advertisers want is “very smart.” But he seemed to agree that its resulting product is ranked too high on Google’s results. In the link economy, he said, it becomes an “ethical matter” to support original content by linking to it “at its source.”

Jarvis took Cutts’s thoughts further by stressing the growing importance of “Twitter, Buzz, and Facebook,” or “human recommendation of content,” as a way “to get past this notion of spam and content farms.” The more Google and others can capture the value of this social-media validation, he said, “the less this content-farm chaff is going to be a problem.”

In a BuzzMachine post published on Monday, Jarvis expanded on the topic of how content will be discovered in the future. Thanks to new tools like Twitter, Facebook, and Buzz, he wrote, “human links are exploding as a means of discovery.” Earlier forms of discovery, he said, have been prone to manipulation, but in the new “content ecosystem,” where we “discover more and more content through people we trust,” quality will again rise to the top.

Well, here’s hoping.

Which Do You Prefer, Users or Readers?

A tweet from Jarvis this morning led me to a short column in the Guardian that should be a useful corrective for a B2B audience worried about SEO.

For many in B2B media, the most important Web site metric is the total number of users. As quoted in the column, Matt Kelly of the Daily Mirror riffed on the pointlessness of this metric for special-interest newspaper sites. The quest for ever-increasing numbers of unique users, he said, “values one visit from one random Google News user as highly as daily visits, for an hour at a time, from someone who treasures the content we produce.”

Instead of fretting about building up the number of random users, Kelly argued, special-interest newspaper sites should focus on building an engaged audience of appropriate readers. The SEO mantra for B2B sites should be the same: readers, not users; better, not bigger. The key metrics should be things like time spent on the site, number of pages read, and percentage of returning users.

The biggest impediment to changing the numbers-driven mindset is probably B2B advertisers. Until they can be educated to look for quality rather than quantity—something they do regularly for their print efforts—it will be difficult for publishers to avoid the numbers trap.