From: | "Thomas F(dot) O'Connell" <tfo(at)sitening(dot)com> |
---|---|
To: | PgSQL General <pgsql-general(at)postgresql(dot)org> |
Subject: | pg_dump in a production environment |
Date: | 2005-05-23 19:54:46 |
Message-ID: | 82A4DF1E-C09D-4F20-B761-035F2510E6DD@sitening.com |
Lists: pgsql-general
I have a web application backed by a PostgreSQL 7.4.6 database. It's
an application with a fairly standard login process verified against
the database.
I'd like to use pg_dump to grab a live backup, and based on the
documentation this seems like a realistic possibility. When I try during
business hours, though, while people are frequently logging in and
otherwise using the application, the application becomes almost unusable
(to the point where logins take on the order of minutes).
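
The invocation itself is nothing fancy, just a straight dump of the whole
database, something along these lines (the database and user names here
are placeholders, and the exact flags may vary):

    # custom-format dump of the whole database; names are placeholders
    pg_dump -U app_user -Fc -f /backups/app_db.dump app_db
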
According to the documentation, pg_dump shouldn't block other operations
on the database, apart from those that require exclusive locks.
Ordinarily, I run pg_autovacuum on the box, so I killed that, thinking
that any substantial vacuum activity might be interfering with pg_dump,
and tried again, to no avail.
Setting aside the rest of the application, the login process should be
completely read-only and shouldn't require any exclusive locks.
Connections don't really pile up excessively, and load on the machine
doesn't get into the red zone. Is there anything else I should be
noticing?
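
In case it helps with diagnosis, here's the sort of check I could run
while the dump is going, to see whether any backend is actually waiting
on a lock; this is just a sketch against the pg_locks and pg_class
system views, with a placeholder database name:

    # list ungranted lock requests and the relations involved
    psql -d app_db <<'SQL'
    SELECT l.pid, l.mode, l.granted, c.relname
      FROM pg_locks l
      LEFT JOIN pg_class c ON c.oid = l.relation
     WHERE NOT l.granted;
    SQL
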
-tfo
--
Thomas F. O'Connell
Co-Founder, Information Architect
Sitening, LLC
Strategic Open Source: Open Your i™
http://www.sitening.com/
110 30th Avenue North, Suite 6
Nashville, TN 37203-6320
615-260-0005