Yesterday, Smackdown noted that Wired put a robots.txt in place in response to spammers trying to use their wiki for spamalicious links. Oddly enough, that robots.txt also blocked everyone and everything, including Google and all the rest of the search engines, from the site. (Caveat: When I say “blocked everyone”, I’m referring to everyone who bothers to adhere to robots.txt).
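For anyone unfamiliar, a robots.txt that shuts out every compliant crawler, which is apparently what Wired put up, looks like this (a generic example, not Wired's actual file):

```
# Applies to all crawlers that honor robots.txt
User-agent: *
# Disallow everything under the site root
Disallow: /
```

Two lines, and every well-behaved bot from Googlebot on down stops indexing the site entirely.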
So, it looks like after the mishap this past Friday, where SEL accidentally exposed the Wired How-To wiki to spammers, Wired has instituted new spam-deterrent measures. They seem to have gone just a tad overboard, if you ask me.
Danny Sullivan is actually off the hook, because I’m sick tonight and have no energy to speak of… I was going to do a deeply detailed post about growing cojones and owning up to events as they really happened…
Instead, I’ll just ask…
Sometimes, as geeks, we forget how the non-geek mind works. Things we take for granted as being obvious aren’t always so to the untrained eye. Yes, when most of us who live on the Internet perform a search on Google, we know without even thinking what is sponsored and what is not. Most of us probably won’t even see the ads in front of us, aside from the habitual scan for competitors when performing queries that might relate to our own sites.
As many of you know, Andy Beard has been out of the posting loop lately, preoccupied with moving. Luckily, he did manage to get in a much-needed review of Matt Cutts’ post on paid links.
I stumbled across the post last night, and I saw a quote by Matt Cutts that caught my attention a bit:
Way to go, Matt! 😀
Although he said that it could take a while, the changes Matt mentioned he had started the ball rolling on, which DazzlinDonna posted about back on Dec 3rd in “Admission of Guilt Will No Longer Be Required For Google Reconsideration Request”, have indeed gone through. For the most part, that is.
Donna wrote a piece today entitled, “Less PageRank Floating Around In SEO Niche“. The post was based on a comment Matt Cutts made on SEOmoz, saying that certain sites were seeing lower PageRank because “there’s less PageRank flowing around in some areas (e.g. search and SEO)” and not because those sites had been penalized. Her conclusion from Matt’s statement is that most people were wrong about the recent visible PageRank penalties being merely visible, and that the penalties were probably also behind there being “less PageRank” in the SEO niche.
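To see why penalizing links and “less PageRank flowing” amount to the same mechanics, here is a toy power-iteration sketch of the classic PageRank algorithm on a hypothetical four-line graph. This is an illustration of the published algorithm, not Google’s actual implementation, and the page names are made up: when inbound links into a page are dropped (or stop passing value), that page’s score falls even though nothing was done to the page itself.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Classic PageRank via power iteration.
    links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page gets the base "random jump" share
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # Dangling page: spread its rank evenly across all pages
                for q in pages:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Hypothetical graph: pages A and B link into SEO site C
with_links = {"A": ["C"], "B": ["C"], "C": ["A"]}
# Same pages after those inbound links stop passing value
without_links = {"A": [], "B": [], "C": ["A"]}

print(pagerank(with_links)["C"] > pagerank(without_links)["C"])  # True
```

Same site, same content; with fewer links flowing into it, C simply ends up with less PageRank, no explicit penalty flag required.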
One of the sites I own happens to involve electronic poetry. On that site, on the bottom of the pages, I incorporated a news feed. Nothing fancy, just shows a few stories, their headlines, links, and brief snippets. Occasionally, for news stories with very few results, someone will stumble across my site when researching the topic. The site doesn’t have a ton of ranking power, and it is in no way optimized around the content of the news stories. They are just there to give the readers access to more sites to browse through, should they want to.
Occasionally, I will get an email from someone relatively new to the internet, wondering why my site shows on a search for the title of a poem they wrote, or a speaking engagement they performed at, but they do not see anything about it on the page. I will write these people back, explaining that they need to look at the cache of the page, since the news feed is of course dynamic, and the stories indexed when Google went there are usually not the same ones that are there days later. Most say thank you, and wind up understanding just a wee bit more about the internet.
All of them are for the most part just curious, knowing that they don’t know that much about the internet, and all of them are generally speaking quite polite. Until, that is, this peach involved with some obscure work, “Teesway One Nine Nine”.
A quick update to the Microsoft Rogue Bot Fiasco. It looks like they have now correctly DNS’d the IP range they are sending these bogus requests from. Previously, all of the IPs (which I first mentioned were coming from the 65.55.165.* block) reverse DNS’d to names such as bl2sch1081901.phx.gbl. They have apparently changed this, so the IPs are more readily identifiable as coming from Microsoft, reverse DNS’ing to the Live.com domain, e.g. livebot-65-55-165-99.search.live.com.
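For anyone who wants to check a crawler IP themselves, the usual approach is forward-confirmed reverse DNS: resolve the IP to a hostname, check the suffix, then resolve the name back and confirm it matches (otherwise rDNS alone can be spoofed). A minimal Python sketch, assuming the .search.live.com naming described above; the resolver functions are parameters so the logic can be exercised without live DNS lookups:

```python
import socket

def is_live_bot(ip, reverse=socket.gethostbyaddr, forward=socket.gethostbyname):
    """Forward-confirmed reverse DNS check for a claimed Live Search bot IP."""
    try:
        # Step 1: reverse lookup; gethostbyaddr returns (hostname, aliases, ips)
        host = reverse(ip)[0]
    except OSError:
        return False
    # Step 2: the name must be under the expected crawler domain
    if not host.endswith(".search.live.com"):
        return False
    try:
        # Step 3: forward-confirm the name resolves back to the same IP
        return forward(host) == ip
    except OSError:
        return False
```

With the default resolvers this does real lookups, e.g. `is_live_bot("65.55.165.99")`; the same pattern works for Googlebot by swapping in its domain suffix.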
I have blogged in the past about how annoying Rand Fishkin’s tactic of avoiding direct questions through obfuscation is. It especially irks me because, in order to work, it relies on exploiting people’s short attention spans and on making the conversation too painful for most to bear. Oftentimes, pursuing an argument with someone using those tactics risks appearing obsessive, since doing so means repeating the same questions over and over, while letting it go allows the other person to appear to win.
It is a politician’s trick, not one to be used in polite conversation, and it is inherently slimy. Verbally wrestling with someone who uses it to dodge questions that would otherwise make them look bad can leave one with an unwashed, unwholesome feeling, and I personally much prefer to debate with someone who can simply show me that I am wrong.