This is the mail archive of the
docbook-apps@lists.oasis-open.org
mailing list.
Re: Website - Invisible tocentry
- From: David Garnier <david dot garnier at eleve dot emn dot fr>
- To: Dave Pawson <dpawson at nildram dot co dot uk>
- Cc: docbook-apps at lists dot oasis-open dot org
- Date: Tue, 01 Oct 2002 19:38:09 +0200
- Subject: Re: DOCBOOK-APPS: Website - Invisible tocentry
- References: <5.1.1.6.2.20021001180551.027d6b00@pop3.nildram.co.uk>
On Tue 01-10-2002 at 19:06, Dave Pawson wrote:
> At 10:22 01/10/2002, David Garnier wrote:
> >Hello,
> >
> >I'd like to put a spambot poisoner on my Website-based homepage
> >(http://www.davidgarnier.com).
>
> I'm intrigued!
>
> What on earth is a spambot poisoner?
>
> DaveP
Check for yourself at http://dgarnier.etudier-online.com/honey/pot.php.
It won't do anything nasty to your browser. The basic idea is that there
are spiders crawling the web that are specifically designed to collect
email addresses for spam purposes. There is little that can be done
about it, because it's difficult to tell good robots, such as Google's
indexing crawlers, apart from bad ones. The idea is to trap bad robots
using their very own vice: the fact that they do not respect the
exclusions specified in robots.txt.
The main strategy goes like this:
* Create a secret page that poisons the robot,
* Forbid access to this page to good robots using robots.txt,
* Spread invisible links (that only a robot can see) to the secret page
all over your website.
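As a rough sketch of the first two steps (using the /honey/pot.php path from the URL above; the exact markup is illustrative, not what David actually deployed), the robots.txt exclusion looks like:

```
# robots.txt -- well-behaved crawlers honor this and skip the honeypot
User-agent: *
Disallow: /honey/
```

and an "invisible" link can simply be an anchor with no visible content, which browsers render as nothing but link-following robots still pick up:

```html
<!-- no anchor text: invisible to humans, followed by address harvesters -->
<a href="/honey/pot.php"></a>
```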
Now, in order to poison the robot, there are several choices:
* Trap the robot in an endless stream of dynamically generated pages,
* Feed it fake email addresses,
* Redirect it elsewhere.
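The first two options can be combined in one page: fill it with nonexistent addresses and end it with a link to another randomly named page, so the harvester never runs out. A minimal sketch in Python (the original page is PHP; the function names and the `page=` parameter here are hypothetical, and the addresses use reserved example domains so no real mailbox is ever hit):

```python
import random
import string

# Reserved example domains (RFC 2606), safe to hand out as fake addresses.
FAKE_DOMAINS = ["example.com", "example.org", "example.net"]

def fake_address(rng):
    """Return one random, nonexistent email address."""
    user = "".join(rng.choice(string.ascii_lowercase)
                   for _ in range(rng.randint(5, 10)))
    return f"{user}@{rng.choice(FAKE_DOMAINS)}"

def honeypot_page(n=50, seed=None):
    """Return an HTML fragment with n fake mailto links, plus a link to
    another dynamically generated page (the 'endless stream' option)."""
    rng = random.Random(seed)
    lines = [f'<a href="mailto:{fake_address(rng)}">contact</a>'
             for _ in range(n)]
    # Link to a "next" page so a crawler that follows links never finishes.
    lines.append(f'<a href="pot.php?page={rng.randint(0, 10**9)}">more</a>')
    return "\n".join(lines)

if __name__ == "__main__":
    print(honeypot_page(n=5, seed=42))
```

Passing a seed makes the page reproducible for testing; in production you would leave it unseeded so every crawl sees fresh addresses.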
This strategy is far from perfect, but it does something against spam.
More references can be found on the web.
Best Regards,
David Garnier