Hello,

I have run into a complicated case of table synchronisation. The full description of the problem (with some figures) is posted here: http://stackoverflow.com/questions/26237661/postgresql-update-table-with-new-records-from-the-same-table-on-remote-server

Here is a partial repost of my Stack Overflow topic:

> We have a PostgreSQL server running in production and plenty of workstations with isolated development environments. Each workstation runs its own local PostgreSQL server (with no replication to the production server). Developers need to receive the updates stored in the production server periodically.
>
> I am trying to figure out how to dump the contents of several selected tables from the server in order to update the tables on the development workstations. The biggest challenge is that the tables I am trying to synchronise may have diverged: developers may add (but not delete) new fields to the tables through the Django ORM, while the schema of the production database remains unchanged for a long time.
>
> Therefore the updated records and the new fields of the tables stored on the workstations *must be* preserved against overwriting.
>
> I guess that direct dumps, e.g.
>
>     pg_dump -U remote_user -h remote_server -t table_to_copy source_db | psql target_db
>
> are not suitable here.
>
> UPD: If possible, I would also like to avoid the use of a third (intermediate) database while transferring the data from the production database to the workstations.

I have no idea how to work it out. Any help will be appreciated.

Sincerely,
-- 
Vitaly Isaev
software engineer
Team112.ru
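P.S. For the sake of discussion, here is a rough sketch of the direction I have been considering, written in Python with psycopg2 (which we already use through Django). The connection strings, the table name "table_to_copy" and the primary key "id" are placeholders only. The sketch pulls only the rows that are missing on the workstation and inserts just the columns that production knows about, so both the locally updated records and the extra fields added through the Django ORM are left untouched:

    import psycopg2

    # Placeholders: adjust the DSNs, the table name and the primary key.
    PROD_DSN = "host=remote_server dbname=source_db user=remote_user"
    DEV_DSN = "dbname=target_db"
    TABLE, PK = "table_to_copy", "id"

    prod = psycopg2.connect(PROD_DSN)
    dev = psycopg2.connect(DEV_DSN)

    with prod.cursor() as src, dev.cursor() as dst:
        # Column list as production knows it; the local table is assumed
        # to contain at least these columns (plus the developers' extras).
        src.execute(
            "SELECT column_name FROM information_schema.columns "
            "WHERE table_name = %s ORDER BY ordinal_position", (TABLE,))
        cols = [r[0] for r in src.fetchall()]

        # Primary keys that already exist locally; those rows are left
        # alone so that local edits and extra fields are not overwritten.
        dst.execute("SELECT %s FROM %s" % (PK, TABLE))
        existing = set(r[0] for r in dst.fetchall())

        # Fetch everything from production (fine for moderate table sizes)
        # and insert only the rows the workstation has not seen yet.
        src.execute("SELECT %s FROM %s" % (", ".join(cols), TABLE))
        insert_sql = "INSERT INTO %s (%s) VALUES (%s)" % (
            TABLE, ", ".join(cols), ", ".join(["%s"] * len(cols)))
        pk_idx = cols.index(PK)
        for row in src.fetchall():
            if row[pk_idx] not in existing:
                dst.execute(insert_sql, row)

    dev.commit()
    prod.close()
    dev.close()

I am not sure this is the right approach at all (it is row-by-row and needs direct network access to the production server from each workstation), so any better ideas are welcome.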