From: "Greg Sabino Mullane" <greg(at)turnstep(dot)com>
To: pgsql-advocacy(at)postgresql(dot)org
Subject: Re: Getting better Google search results
Date: 2006-08-29 10:37:55
Message-ID: 64f7c031a223d74df451478d38901644@biglumber.com
Lists: pgsql-advocacy, pgsql-www
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA1
> Wouldn't this be better on -www? :-P
I considered that, but this seems more of an advocacy problem
with a technical solution. I thought the wider advocacy audience
might have some ideas about it.
> Don't we have enough silly domains already? If we want to get rid of
> those hits, why don't we just add a robots.txt and tell google not to
> index the old docs at all? (If people *need* hits in the old docs, they
> can always hit our own search engine for those docs)
I considered that, but I would not want to completely eliminate the
old docs from Google searching. There's always a chance something useful is
there. However, since 99.9% of generic non-version specific searches should
*not* hit those pages, it's best to stick those on the "e" in
Goooooooooooooooooooooooogle. :)
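
[For context, the robots.txt approach discussed above would look something like the following sketch. The directory paths are hypothetical; the actual layout of the old docs on postgresql.org may differ:]

```
# Hypothetical sketch: block crawlers from old-version docs only,
# leaving the current docs indexable.
User-agent: *
Disallow: /docs/7.1/
Disallow: /docs/7.2/
Disallow: /docs/7.3/
```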
- --
Greg Sabino Mullane greg(at)turnstep(dot)com
End Point Corporation
PGP Key: 0x14964AC8 200608290634
http://biglumber.com/x/web?pk=2529DF6AB8F79407E94445B4BC9B906714964AC8
-----BEGIN PGP SIGNATURE-----
iD8DBQFE9BhTvJuQZxSWSsgRAjMuAJ9eHXUVE1teHRNVMuH7lKOKxIgG6wCffYak
S9CKiT3zI/FitV09LeAKPHs=
=CwM6
-----END PGP SIGNATURE-----
pgsql-advocacy:
  Previous Message: Hans-Juergen Schoenig, 2006-08-29 10:32:59, Re: database contest results
  Next Message: Andreas Pflug, 2006-08-29 11:11:47, Re: database contest results

pgsql-www:
  Previous Message: Dave Page, 2006-08-29 08:11:32, Re: A counter productive conversation about search.
  Next Message: Robert Treat, 2006-08-29 11:50:37, Re: PostgreSQL rebranding