Strange things happen when you’ve gone too far, been up too long, worked too hard, and you’re separated from the rest of the world.
Annotation: the best work is done, and the most inspiration comes, from the only real muse: Fear. Especially the fear of what’s due in the morning.
In keeping with the notion that everything ever imagined can somehow be turned into evidence of the unquestionable perfection of Z, a new up-close docudrama casts a loving revisionist glance back at the muddled ideas of Ukraine’s leader, which were an ironic source of inspiration when a Russian horde attempted to decapitate Kyiv.
Meanwhile, I remembered the most ludicrous scheme ever to energize a dwindling internet gold rush (complete with manifestos, slogans and Guevaran imagery) that didn’t receive much warmth, or even kindness.
I’ve refashioned it into the Poem of the Day.
# Robots file for www.abdymok.net
# Contact Webmaster for changes to this file.
#
# $Id$
#
# See <URL:http://info.webcrawler.com/mak/projects/robots/robots.html>
#
#
# In addition to this file, the following meta tag
# can also be used
# <META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">
#
#
# User-agent
#
# The value of this field is the name of the robot the record is
# describing access policy for.
#
# If more than one User-agent field is present, the record describes an
# identical access policy for more than one robot. At least one field
# needs to be present per record.
#
# The robot should be liberal in interpreting this field. A case
# insensitive substring match of the name without version information is
# recommended.
#
# If the value is '*', the record describes the default access policy
# for any robot that has not matched any of the other records. It is
# not allowed to have two such records in the "/robots.txt" file.
#
# Disallow
#
# The value of this field specifies a partial URL that is not to be
# visited. This can be a full path, or a partial path; any URL that
# starts with this value will not be retrieved. For example, Disallow:
# /help disallows both /help.html and /help/index.html, whereas
# Disallow: /help/ would disallow /help/index.html but allow /help.html.
#
# An empty value indicates that all URLs can be retrieved. At least
# one Disallow field needs to be present in a record.
#
# The presence of an empty "/robots.txt" file has no explicit associated
# semantics; it will be treated as if it were not present, i.e. all
# robots will consider themselves welcome.
#
#User-agent: Ultraseek
#Disallow:
#User-agent: *
#Disallow: /
User-agent: *
Disallow: /cgi
Disallow: /include
Disallow: /isyndicate
#Disallow: /pdf
Disallow: /process
Disallow: /screamingmedia
Disallow: /search
# Rover is a bad dog <http://www.roverbot.com>
User-agent: Roverbot
Disallow: /
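
For anyone curious how a well-behaved crawler actually reads those directives, here is a minimal sketch using Python's standard-library urllib.robotparser. The active rules are copied from the file above; the test URLs are invented for illustration and are not real pages on the site.

from urllib.robotparser import RobotFileParser

# The live (uncommented) rules from the file above.
robots_txt = """\
User-agent: *
Disallow: /cgi
Disallow: /include
Disallow: /isyndicate
Disallow: /process
Disallow: /screamingmedia
Disallow: /search

User-agent: Roverbot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Disallow is a prefix match: /search blocks /search itself and everything under it.
print(rp.can_fetch("Googlebot", "http://www.abdymok.net/search"))         # False
print(rp.can_fetch("Googlebot", "http://www.abdymok.net/search/poems"))   # False

# /pdf is commented out in the file, so it falls through to the default: allowed.
print(rp.can_fetch("Googlebot", "http://www.abdymok.net/pdf/poem.pdf"))   # True

# User-agent matching is a case-insensitive substring test, so the bad dog
# matches its own record and is shut out of the whole site.
print(rp.can_fetch("roverbot/1.0", "http://www.abdymok.net/"))            # False

The same check can be pointed at the live file with rp.set_url("http://www.abdymok.net/robots.txt") followed by rp.read(), rather than pasting the rules inline.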