From: Oliver Kohll - Mailing Lists <oliver(dot)lists(at)gtwm(dot)co(dot)uk>
To: pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: Multi master use case?
Date: 2012-01-28 19:52:02
Message-ID: 9612B2A3-CC33-403B-84E4-956EC251C30F@gtwm.co.uk
Lists: pgsql-general
On 28 Jan 2012, at 15:27, "Greg Sabino Mullane" <greg(at)turnstep(dot)com> wrote:
>> Is this a case for multi master do you think?
>> I.e. running one on the internet, one locally.
>
> Yes, could be.
>
>> b) changing schemas (new tables, fields, views etc.) as well as data
>
> That's a tall order; I don't think anything will do that automatically,
> although rubyrep claims to at least pick up new tables.
OK, I guess I could treat one as the 'schema master' and pg_dump schema + data across to the other once a night, after all activity has stopped and standard replication has completed.
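For what it's worth, a minimal sketch of what I have in mind for that nightly job (hostnames, database name and user are placeholders, and it assumes all activity and replication between the two have been quiesced first):

```shell
#!/bin/sh
# Nightly schema + data sync from the 'schema master' to the other node.
# MASTER_HOST, REPLICA_HOST, DB and PGUSER are placeholders, not real values.
set -e

MASTER_HOST=master.example.com
REPLICA_HOST=replica.example.com
DB=mydb
PGUSER=postgres

# Dump the full database (schema and data) from the schema master...
pg_dump -h "$MASTER_HOST" -U "$PGUSER" "$DB" > /tmp/nightly_sync.sql

# ...then drop and recreate the copy on the other node before restoring,
# so schema changes (new tables, fields, views etc.) are picked up too.
dropdb -h "$REPLICA_HOST" -U "$PGUSER" "$DB"
createdb -h "$REPLICA_HOST" -U "$PGUSER" "$DB"
psql -h "$REPLICA_HOST" -U "$PGUSER" "$DB" < /tmp/nightly_sync.sql
```

Obviously replication would then need restarting from a clean state afterwards, since the restored copy replaces whatever the replica held.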
>
>> Any experiences/thoughts?
>
> My experience is with Bucardo, which should do the job admirably
> (but with the data only). My advice would be to just set up a test
> system and try rubyrep and Bucardo out. For the latter, use the
> latest Bucardo5 beta, as Bucardo4 will be deprecated soon:
>
> http://bucardo.org/downloads/Bucardo-4.99.3.tar.gz
Thanks, I'll do that.
Oliver
www.agilebase.co.uk