Independent Consultant (Toronto, Canada) specializing in Lucene, Hadoop, HBase, Nutch, SOLR, LingPipe, GATE, Data Mining, Search Engines, WebLogic, Oracle, Liferay Portal, Java, J2EE, SOA, and more.
MSc in Mathematics, Lomonosov Moscow State University
I found a lot of nonsense on this topic... first on Matt Cutts' blog (only in the comments!), then at WMW: http://www.webmasterworld.com/robots.txt?view=producecode
First: it is cloaking. I don't like this word, if only because we genuinely need to serve different content for different HTTP headers, including the User-Agent string: for instance, for cell-phone users ;)
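To illustrate legitimate per-User-Agent content selection, here is a minimal sketch in plain Java. The class name, method, and the header substrings it checks are my own illustrative assumptions, not from any particular framework; real code would consult a proper device database.

```java
// Sketch: pick a page template based on the User-Agent request header.
// The substrings checked below are illustrative assumptions only.
public class AgentRouter {
    static String pickTemplate(String userAgent) {
        if (userAgent == null) return "desktop.html";
        String ua = userAgent.toLowerCase();
        // Very rough mobile detection; production code needs a device database.
        if (ua.contains("mobile") || ua.contains("wap")) return "mobile.html";
        return "desktop.html";
    }
}
```

This is exactly the kind of header-dependent response that is perfectly normal HTTP behavior, not deceptive cloaking.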
Some so-called 'experts' in SEO advise having a dynamic robots.txt, and even dynamically overwriting the static .htaccess file!
This is really bad advice.
Think about scalability, security, and more. Can a dynamic service handle a HEAD request? A redirect from another site? A 304? A 403? What about a 503 'Service Unavailable'? Really stupid.
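To make that checklist concrete, here is a minimal, hypothetical sketch of the decision logic a dynamic robots.txt endpoint would have to get right (plain Java, no framework; the class name, method, and constants are my own assumptions, while the status codes follow standard HTTP semantics):

```java
import java.time.Instant;

// Sketch of the response logic a dynamic robots.txt endpoint must get right.
// Status codes follow HTTP semantics; everything else is an illustrative assumption.
public class RobotsHandler {
    static final Instant LAST_MODIFIED = Instant.parse("2008-01-01T00:00:00Z");
    static final String BODY = "User-agent: *\nDisallow: /private/\n";

    // Returns the HTTP status for a GET or HEAD of /robots.txt.
    static int decide(String method, Instant ifModifiedSince, boolean backendUp) {
        if (!backendUp) return 503;                    // Service Unavailable
        if (!method.equals("GET") && !method.equals("HEAD")) return 405;
        if (ifModifiedSince != null && !LAST_MODIFIED.isAfter(ifModifiedSince))
            return 304;                                // Not Modified: send no body
        return 200;                                    // GET sends BODY; HEAD sends headers only
    }
}
```

A static file gets all of this (HEAD, conditional GET, correct error codes) for free from the web server; a dynamic service has to reimplement every case, and usually gets some of them wrong.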