From: Bill Moran <wmoran(at)potentialtech(dot)com>
To: "alan bryan" <alan(dot)bryan(at)gmail(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: How to track query execution time
Date: 2008-02-05 22:24:04
Message-ID: 20080205172404.165b379c.wmoran@potentialtech.com
Lists: pgsql-general
In response to "alan bryan" <alan(dot)bryan(at)gmail(dot)com>:
> I've got a web site (apache/php) with a postgres 8.2.5 database(s).
>
> We're now getting some periods of high load. We have a lot of dynamic
> queries so I'm not able to just tune and optimize a few known queries
> ahead of time.
>
> Is there a way that I can get a list of all the actual SQL queries
> as passed to Postgres and the start/completed times (or even just
> duration)?
> This doesn't need to be real time, log parsing after the fact would
> probably be fine too.
>
> Then, I could look at the worst performers and figure out where in the
> PHP code that query was built from and then try to tune from there.
Have a look at the log_min_duration_statement parameter in
postgresql.conf.
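Any statement that runs longer than that threshold (in milliseconds) gets
written to the server log along with its duration. As a rough sketch, the
relevant settings look something like the following -- the values are only
illustrative, and note that 8.2 still uses redirect_stderr rather than the
logging_collector name used in later releases:

    # log any statement that runs for 1000 ms or longer, with its duration
    log_min_duration_statement = 1000

    # collect the server log into files so it can be parsed later (8.2 syntax)
    redirect_stderr = on
    log_directory = 'pg_log'
    log_filename = 'postgresql-%Y-%m-%d.log'

    # a line prefix of the sort log analyzers such as pgFouine can parse
    log_line_prefix = '%t [%p]: [%l-1] '

Reload the server (pg_ctl reload, or SELECT pg_reload_conf()) and the slow
statements will start showing up in the log with their durations.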
Once you've captured data into your log files, you can use whatever
tools you want to analyze it, but I've found pgFouine to be particularly
helpful in isolating slow queries.
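pgFouine is a PHP script you run over the captured log; roughly something
like this (option names from memory, so check the pgFouine documentation
for the exact flags for your log format):

    php pgfouine.php -file /path/to/postgresql.log -logtype stderr > report.html

That produces an HTML report listing the slowest and most frequent queries,
which you can then trace back to the PHP code that built them.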
--
Bill Moran
http://www.potentialtech.com