Wired Says Screw It To All Search Engines After SEL Inspired Spam Attack, Disallows EVERYTHING With Robots.txt

So it looks like, after this past Friday's mishap in which SEL accidentally exposed the Wired How-To wiki to spammers, Wired has instituted its new spam deterrent measures. They seem to have gone just a tad overboard, if you ask me.

Apparently they have just decided to block all search engines from indexing anything whatsoever on the wiki now. This is what their robots.txt (cached version) now looks like:

User-agent: *
Disallow: /

While this might not actually stop the spammers from trying (since that would require them to bother looking at robots.txt), it will stop them from gaining any link credit when they do. Unfortunately, it will also eventually have the consequence of the entire wiki being deindexed from all of the major search engines. This is, of course, a slightly different solution from the one they had originally told SEL's Danny Sullivan they were going to implement:
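For illustration, you can verify just how sweeping that two-line rule is with Python's standard-library robots.txt parser (the wiki URLs below are placeholders, not Wired's actual paths):

```python
from urllib.robotparser import RobotFileParser

# The exact rules Wired is now serving: every user agent, every path blocked.
rules = "User-agent: *\nDisallow: /\n"

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A blanket "Disallow: /" denies every crawler access to every URL,
# so no page on the wiki will stay indexed once engines recrawl it.
print(parser.can_fetch("Googlebot", "https://howto.wired.com/wiki/Main_Page"))
print(parser.can_fetch("*", "https://howto.wired.com/any/other/page"))
```

Both checks come back `False`, which is exactly why the whole wiki, not just the spammed pages, will fall out of the indexes.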

NOTE FROM DANNY: We’ve talked with Wired about the situation, and they are putting a robots.txt block on links coming out of the wiki so that links won’t pass credit. – Danny Sullivan

Yep, they blocked the links all right… along with all of the rest of their content as well. 😀