Search engine agnosticism

When I first heard that Rupert Murdoch was going to block search engines from indexing his news sites (Wall Street Journal, FOX News, and more), I thought it was an interesting idea. Possibly brilliant.

Blocking all search engines could very well destroy most of the traffic to his sites, or it might make fans start checking the actual sites again rather than looking for the news to show up on Yahoo! or Google. FOX and the WSJ cater to enough of a niche that I think it might just work.

But now it’s possible that Murdoch will not block all the search engines, just the ones who don’t pay him.
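Mechanically, selective blocking like this would be simple to do with the standard robots exclusion protocol. A hypothetical robots.txt that shuts out every crawler except a paying search engine's (Bing's bot is used here purely as an illustration, not because any deal exists) might look like:

```
# Hypothetical robots.txt for a news site that only lets a paying
# search engine crawl it. "bingbot" is an illustrative example.

# An empty Disallow means this agent may crawl everything.
User-agent: bingbot
Disallow:

# Everyone else is shut out of the whole site.
User-agent: *
Disallow: /
```

Worth remembering that robots.txt is an honor system: compliant crawlers respect it, but nothing technically enforces it, which is part of why a pay-for-indexing scheme could only ever be negotiated among the major engines.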

Leaving aside whether or not this would work for Murdoch, it would be awful for the internet. Search engine competition should be about which service offers the best results, not which one pays for the biggest index. It would be a minefield for users, who would now need to keep track of which services indexed the sites they most wanted to search.

3 Comments

  1. Ian

    I think it’s a pretty stupid idea. The business model for online newspapers should focus on increasing traffic, and search engines seem like a very simple way of greatly upping your visitor count.

    Say I go to the WSJ’s website: I will only see their most recent news stories on the front page. But what if I wanted to see a news story that was over a week old? I could use their search function to scour their site for the story, or I could use Google News to search tons of sites for the story, skipping the WSJ entirely (which is actually what I do).

    Oh well, I have typically avoided the WSJ since it was taken over by Murdoch. I didn’t boycott it outright, but if I saw two stories on the same subject, one from the WSJ and another from a reputable source, I would pick the alternate source. It’s all about the FOX News association for me. This just does more to isolate their viewpoints, which seems like a poor strategy if you actually want to influence people. Why make yourself more fringe and less mainstream?

    “Leaving aside whether or not this would work for Murdoch, it would be awful for the internet. Search engine competition should be about which service offers the best results, not which one pays for the biggest index.”

    Perhaps, but only if you assume that people would use a search engine because it accesses the WSJ. I don’t know if the average internet user is going to care or think about what sites a search engine accesses, since the basic assumption is that it accesses them all. It could work in the opposite way. People might continue to use their current search engine and news reading methods, and this would result in a drop in traffic for the WSJ. This might force Murdoch to reconsider. If I had to guess, I’d say that is the more likely outcome.

  2. Clint

    I can’t imagine it working. As the article points out, interesting articles will end up indexed anyway, through blogs and whatnot. And I doubt people are going to change their surfing habits for a handful of publications (which cater to less Internet-experienced demographics, I believe).

    I don’t think Murdoch understands how the Internet works.

  3. Chris

    Good comments from both of you.

    My only disagreement would be with the idea that all these places need is more traffic. The NY Times, for instance, is one of the most popular news sites on the planet in terms of pure traffic numbers. They’ve been known to get nearly 10 million unique visitors in a day. In contrast, their print circulation is somewhere around 1 million. Yet the print edition subsidizes the online edition. They’d have to cut their operations considerably if they switched to an online only format.

    The problem is that they can’t turn the visitors into dollars, so something needs to change or they’re gonna die.