From: "David G(dot) Johnston" <david(dot)g(dot)johnston(at)gmail(dot)com>
To: Jeremy Finzel <finzelj(at)gmail(dot)com>
Cc: Carter Thaxton <carter(dot)thaxton(at)gmail(dot)com>, Pg Hackers <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: Add --include-table-data-where option to pg_dump, to export only a subset of table data
Date: 2018-09-06 15:47:50
Message-ID: CAKFQuwbPHH0JEcch0NcohMfdqcCBch-EpBSLA0ZyW+ecBMfEFA@mail.gmail.com
Lists: pgsql-hackers
On Thu, Sep 6, 2018 at 8:40 AM, Jeremy Finzel <finzelj(at)gmail(dot)com> wrote:
> Why not simply use \copy (select * from largetable where created_at >=
> '2018-05-01') to stdout? That is what I’ve always done when I need
> something like this and have not found it particularly bothersome but
> rather quite powerful. And here you have tons of flexibility because you
> can do joins and whatever else.
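The \copy approach described above can be sketched as follows (the table, column, and tenant names are hypothetical, carried over from the quoted example):

```sql
-- In a psql session. \copy runs COPY on the client side, so the output
-- file is written on the machine running psql, not on the server.
-- Export only rows created on or after 2018-05-01:
\copy (SELECT * FROM largetable WHERE created_at >= '2018-05-01') TO 'largetable_subset.csv' WITH (FORMAT csv, HEADER)

-- Joins work too, which a plain table-level dump cannot express
-- (the "tenants" table here is an assumed example):
\copy (SELECT l.* FROM largetable l JOIN tenants t ON t.id = l.tenant_id WHERE t.name = 'acme') TO STDOUT
```

Note that psql requires each \copy meta-command to fit on a single line, and the output is raw data only: unlike pg_dump, it carries no schema, no dependency ordering, and no restore script.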
Just skimming the thread, but I'd have to say that being able to leverage
pg_dump's dependency resolution is a major reason for adding features to it
instead of sticking to writing psql scripts. This feature, in a multi-tenant
situation, is something with, I suspect, reasonably wide appeal.
David J.