On Mon, Jan 27, 2003 at 10:53:15 -0600,
will trillich <will(at)serensoft(dot)com> wrote:
>
> maybe have a different "driver" return idocs for known spiders
> (base the content served on the user-agent); there's gotta be a
> way to have both search-engine-able idocs and enough connections
> to keep it open for the human browsers...
If the robots are well-behaved, you can control things through robots.txt.
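For example, something along these lines asks compliant crawlers to back
off (the path and the 30-second delay are only placeholders, and not every
crawler honors Crawl-delay):

    User-agent: *
    Crawl-delay: 30
    Disallow: /dynamic/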
If they are evil robots, you need something that limits request rates
per IP address (or maybe per /24).
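Here's a rough sketch of that idea in standalone Python (the window and
threshold are made-up numbers; in practice you'd hook this into your front
end or just use firewall rules):

    import time
    from collections import defaultdict, deque

    WINDOW = 60        # seconds
    MAX_REQUESTS = 30  # allowed per window, per /24 (made-up threshold)

    hits = defaultdict(deque)   # /24 prefix -> recent request timestamps

    def allow(ip):
        net = '.'.join(ip.split('.')[:3])  # collapse the address to its /24
        now = time.time()
        q = hits[net]
        while q and now - q[0] > WINDOW:   # forget requests older than the window
            q.popleft()
        if len(q) >= MAX_REQUESTS:
            return False                   # over the limit: reject or delay
        q.append(now)
        return True

A real setup would also want to expire idle entries so the table doesn't
grow without bound.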