From: Greg Stark <stark(at)mit(dot)edu>
To: PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: robots.txt on git.postgresql.org
Date: 2013-07-09 15:24:42
Message-ID: CAM-w4HOzgXu9WVOygrXRpUn4dh9PL7idD1TDTnLb4HKzwfiHdA@mail.gmail.com
Lists: pgsql-hackers
I note that git.postgresql.org's robots.txt refuses permission to crawl
the git repository:
http://git.postgresql.org/robots.txt
User-agent: *
Disallow: /
I'm curious what motivates this. It's certainly useful to be able to
search for commits. I frequently type git commit hashes into Google to
find the commit in other projects. I think I've even done it for
Postgres before and not had a problem, though maybe Google was
bringing up github or something else rather than this site.
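Incidentally, robots.txt doesn't have to be all-or-nothing either. As a
purely illustrative sketch (relying on the Allow and wildcard extensions
that Googlebot and the other major crawlers understand, not the
bare-bones standard), something like this would let the individual
commit pages be crawled while keeping robots off everything else:
User-agent: *
# allow commit/commitdiff pages through; the longest matching rule wins
Allow: /gitweb/*a=commit
Disallow: /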
Fwiw, the reason I noticed this is that I searched for "postgresql
git log" and the first hit was "see the commit that fixed the
issue, with all the gory details", which linked to
http://git.postgresql.org/gitweb/?p=postgresql.git;a=commitdiff;h=a6e0cd7b76c04acc8c8f868a3bcd0f9ff13e16c8
That page was indexed despite the robots.txt because it was linked to
from elsewhere (hence the interesting link title). There are ways to ask
Google not to index pages if that's really what we're after, but I
don't see why we would be.
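For the record, the usual mechanisms for that are a robots meta tag in
the page or an X-Robots-Tag response header, e.g.:
<meta name="robots" content="noindex">
X-Robots-Tag: noindex
That's just to illustrate what asking Google not to index would mean;
I'm not suggesting we add them.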
--
greg