From: Marko Kreen <markokr(at)gmail(dot)com>
To: Andrew Dunstan <andrew(at)dunslane(dot)net>
Cc: Magnus Hagander <magnus(at)hagander(dot)net>, Aidan Van Dyk <aidan(at)highrise(dot)ca>, Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>, PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: Git cvsserver serious issue
Date: 2010-10-08 13:15:46
Message-ID: AANLkTikhyss_vw+nGosGPR+mtSQ_M_wKDUnV9vFoMn07@mail.gmail.com
Lists: buildfarm-members pgsql-hackers
On Fri, Oct 8, 2010 at 3:13 PM, Andrew Dunstan <andrew(at)dunslane(dot)net> wrote:
> On 10/08/2010 02:09 AM, Magnus Hagander wrote:
>>> On Fri, Oct 8, 2010 at 03:52, Andrew Dunstan <andrew(at)dunslane(dot)net> wrote:
>>>> There's a simpler solution which I have just tested. Instead of patching,
>>>> use the Pg driver instead of SQLite. Set the dbname to %m. If the database
>>>> doesn't exist the cvs checkout will fail. So we just set up databases for
>>>> the modules we want to export (master and RELn_m_STABLE for the live
>>>> branches).
Wouldn't it be simpler to generate an hourly tarball on some host and wget it?
It could be generated even more often, as no history needs to be kept.
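A minimal sketch of that tarball approach; the host name, paths and schedule
are purely hypothetical:

    # on the git server, run hourly from cron:
    git --git-dir=/srv/git/postgresql.git archive --prefix=postgresql/ master \
        | gzip > /var/www/snapshots/postgresql-master.tar.gz

    # on a buildfarm member, instead of a cvs/git checkout:
    wget -q http://snapshots.example.org/postgresql-master.tar.gz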
Considering the state of cvsserver, can you be certain that whatever
is coming from it is really the most recent code?
--
marko