Bambarbia Kirkudu

Independent Consultant (Toronto, Canada) specializing in Lucene, Hadoop, HBase, Nutch, SOLR, LingPipe, GATE, Data Mining, Search Engines, WebLogic, Oracle, Liferay Portal, Java, J2EE, SOA, and more. Master's in Mathematics, Moscow State University n.a. Lomonosov

Friday, June 22, 2007

 

Dynamic robots.txt

I found a lot of nonsense on this topic: first on Matt Cutts' blog (only in the comments!), then at WMW:
http://www.webmasterworld.com/robots.txt?view=producecode

First: it is cloaking. I don't like this word, because sometimes we genuinely need to serve different content for different HTTP headers, including the User-Agent string. For instance, for cell-phone users ;)
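To illustrate the legitimate case: picking a page variant based on the User-Agent header. This is a minimal sketch; the class name, method name, and the substring list are my own illustration, not any real framework's API, and real mobile-detection lists are far longer.

```java
// Hypothetical sketch of legitimate User-Agent-based content negotiation,
// e.g. serving a mobile page to cell-phone browsers.
public class UserAgentRouter {

    /** Returns which page variant to serve for a given User-Agent string. */
    public static String variantFor(String userAgent) {
        if (userAgent == null) {
            return "desktop"; // no header at all: serve the default page
        }
        String ua = userAgent.toLowerCase();
        // Crude substring-based mobile detection, for illustration only.
        if (ua.contains("iphone") || ua.contains("android")
                || ua.contains("blackberry") || ua.contains("symbian")) {
            return "mobile";
        }
        return "desktop";
    }
}
```

The point is that this varies the *page*, per a header the client sent, which is ordinary content negotiation, not an attempt to show search engines something users never see.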

Some so-called 'experts' in the SEO area advise serving a dynamic robots.txt, and even dynamically overwriting the static .htaccess file!

This is real garbage.

Scalability, security, and more. Can a dynamic service handle a HEAD request? A redirect from another site? A 304? A 403? What about a 503 'Service Unavailable'? Really stupid.
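Those status codes matter because a well-behaved crawler decides its whole crawl policy from the robots.txt fetch result, and a buggy dynamic handler that returns the wrong code can block (or fully open) a site by accident. Here is a hedged sketch of one plausible interpretation; the enum and method names are mine, not from Nutch or any other real crawler, and actual crawlers differ in details.

```java
// Sketch: how a polite crawler might map the HTTP status of the
// /robots.txt fetch to a crawl decision. Illustrative only.
public class RobotsFetchPolicy {

    public enum Policy { PARSE_BODY, USE_CACHED, FOLLOW_REDIRECT, ALLOW_ALL, DISALLOW_ALL }

    /** Maps the robots.txt response status code to a crawl decision. */
    public static Policy forStatus(int status) {
        if (status >= 200 && status < 300) {
            return Policy.PARSE_BODY;      // got rules, obey them
        }
        if (status == 304) {
            return Policy.USE_CACHED;      // Not Modified: reuse cached rules
        }
        if (status >= 300 && status < 400) {
            return Policy.FOLLOW_REDIRECT; // e.g. 301 to another host
        }
        if (status >= 400 && status < 500) {
            return Policy.ALLOW_ALL;       // 403/404: commonly read as "no rules"
        }
        return Policy.DISALLOW_ALL;        // 5xx: server broken, back off
    }
}
```

Note the asymmetry: a 404 typically means "crawl everything", while a 5xx means "crawl nothing for now". If a dynamic robots.txt script throws an exception and the server emits a 500, the whole site may drop out of the crawl, which is exactly the kind of failure a static file cannot have.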
