From: Alvaro Herrera <alvherre(at)2ndquadrant(dot)com>
To: tony(at)exquisiteimages(dot)com
Cc: pgsql-general(at)lists(dot)postgresql(dot)org
Subject: Re: pg_dump of database with numerous objects
Date: 2020-06-03 20:10:55
Message-ID: 20200603201055.GA30037@alvherre.pgsql
Lists: pgsql-general
On 2020-May-31, tony(at)exquisiteimages(dot)com wrote:
> I am now needing to upgrade to a new version of PostgreSQL and I am running
> into problems when pg_upgrade calls pg_dump. pg_dump stalled at: "pg_dump:
> saving database definition" for 24 hours before I killed the process.
>
> My pg_class table contains 9,000,000 entries and I have 9004 schema.
We've made a number of performance improvements to pg_dump so that it
can dump databases that are "large" in several different dimensions,
but your report makes it evident that it is not yet good enough when
it comes to dumping millions of tables spread across thousands of
schemas. It will probably take some profiling of pg_dump to figure
out where the bottleneck is, and some careful optimization work to
make it faster. Not a weekend job, I'm afraid :-(
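If someone wants to help narrow it down in the meantime, here is a
rough sketch of a first pass (assuming a Linux box with perf installed
and debug symbols for pg_dump; "mydb" below is a placeholder for the
affected database). A schema-only dump should reproduce the slow
"saving database definition" phase, since that is close to what
pg_upgrade invokes internally:

    $ pg_dump --schema-only --verbose mydb > /dev/null &
    $ perf record -F 99 -g -p $! -- sleep 60   # sample the stalled dump for a minute
    $ perf report --stdio | head -50           # shows which functions dominate

If the time turns out to be spent on the server side rather than in
pg_dump itself, setting log_min_duration_statement = 0 for the dump's
session would show which catalog queries are the slow ones.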
--
Álvaro Herrera https://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Remote DBA, Training & Services