From: | Craig Ringer <ringerc(at)ringerc(dot)id(dot)au> |
---|---|
To: | AI Rumman <rummandba(at)gmail(dot)com> |
Cc: | pgsql-general General <pgsql-general(at)postgresql(dot)org> |
Subject: | Re: Is there any way to import a portion of a large database |
Date: | 2011-09-20 06:41:05 |
Message-ID: | 4E783581.5040508@ringerc.id.au |
Lists: | pgsql-general |
On 09/20/2011 02:35 PM, AI Rumman wrote:
> I have a production Postgresql 9 database of 2 TB+. For development
> purpose, I have to import this database in development server where I
> have only 1 TB of disk space. No more space can be added at present.
> Is there any way so that I might import the whole schema definition of
> the database with a portion of data in my development server?
You can do a schema-only dump and restore that, but as for the data ...
that's something you really have to figure out yourself. PostgreSQL
can't just select rows at random to export, because foreign key
relationships mean a random subset would reference rows that weren't
exported, so it wouldn't restore successfully.
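For the schema-only part, something along these lines should work
(untested sketch; "proddb" and "devdb" are just placeholders for your
actual database names):

    pg_dump --schema-only -f schema.sql proddb
    psql -d devdb -f schema.sql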
Additionally, in most databases some tables will have to be exported in
their entirety, as they'll contain lookup tables or other data the app
requires.
There are ETL tools like Talend that may be able to help simplify this.
I haven't looked into them. Personally, for simpler databases, I'd
probably just hack something together using an appropriate scripting
language and COPY (SELECT...).
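For example, something like this (a rough sketch; the table name, the
"id" column and the LIMIT are only placeholders, and you'd still have
to choose the subsets so the foreign keys stay consistent):

    psql -d proddb -c "COPY (SELECT * FROM some_big_table ORDER BY id LIMIT 100000) TO STDOUT" > some_big_table_sample.tsv
    psql -d devdb -c "COPY some_big_table FROM STDIN" < some_big_table_sample.tsv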
--
Craig Ringer