From: John R Pierce <pierce(at)hogranch(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: copy/dump database to text/csv files
Date: 2014-07-24 19:13:56
Message-ID: 53D15AF4.4090600@hogranch.com
Lists: pgsql-general
On 7/24/2014 12:04 PM, William Nolf wrote:
>
> This is probably an easy one for most sql users but I don't use it
> very often.
>
> We have a postgres database that was used for an application we no
> longer use. However, we would like to copy/dump the tables to files,
> text or csv, so we can post them to sharepoint.
>
> Copy seems to be what I want, but it only does one table at a time.
> There are 204 tables with schemaname=public. I need to copy the
> tables with data to files. I need something like a for loop which
> checks if the table is populated and, if so, copies it to a
> tablename.csv file.
>
> Any ideas?
>
what scripting/programming language are you best in? I'd probably whip
up a perl script to do that.
start with something like:

    select table_schema || '.' || table_name
    from information_schema.tables
    where table_schema not in ('pg_catalog', 'information_schema');

which will output a list of every table (and view) outside the system schemas.
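
if you want to go the perl route, something roughly like this ought to work (untested sketch -- it assumes DBD::Pg is installed, the dbname/user are placeholders you'd change, and it only grabs base tables). it also skips empty tables, since that's what you asked for:

    #!/usr/bin/perl
    # untested sketch -- assumes DBD::Pg; dbname/user below are placeholders
    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect('dbi:Pg:dbname=yourdb', 'youruser', '',
                           { RaiseError => 1 });

    # every non-system base table, schema-qualified
    my $tables = $dbh->selectcol_arrayref(q{
        select table_schema || '.' || table_name
        from information_schema.tables
        where table_schema not in ('pg_catalog', 'information_schema')
          and table_type = 'BASE TABLE'
    });

    for my $table (@$tables) {
        # skip tables with no rows
        my ($count) = $dbh->selectrow_array("select count(*) from $table");
        next unless $count;

        open my $fh, '>', "$table.csv" or die "can't write $table.csv: $!";
        $dbh->do("COPY $table TO STDOUT WITH (FORMAT csv, HEADER)");
        my $row;
        # pg_getcopydata returns -1 once the COPY is finished
        while ($dbh->pg_getcopydata($row) > -1) {
            print $fh $row;
        }
        close $fh;
    }

    $dbh->disconnect;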
I suppose even bash scripting would do it...

for table in $(psql -tc "select table_schema||'.'||table_name from
    information_schema.tables where table_schema not in
    ('pg_catalog','information_schema')" dbname); do
    psql -c "\copy $table to '$table.csv' with (format csv, header, ...)" dbname
done
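
either way, keep in mind \copy writes the files on the client side (wherever you run psql or the script from), and you can add whatever copy options sharepoint needs (delimiter, quote, encoding, etc). the bash version as written also dumps empty tables; if that matters, check the row count first like the perl sketch does.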
--
john r pierce 37N 122W
somewhere on the middle of the left coast
Next Message | Thomas Kellerer | 2014-07-24 19:23:22 | Re: copy/dump database to text/csv files
Previous Message | William Nolf | 2014-07-24 19:04:14 | copy/dump database to text/csv files