Re: Analyze tool?

From: Rob Sargent <robjsargent(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: Analyze tool?
Date: 2010-10-01 13:43:17
Message-ID: 4CA5E575.8080703@gmail.com
Lists: pgsql-general

Then to get all statements, would one simply set log_min_duration_statement to
some arbitrarily small value?
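(For the record, a minimal postgresql.conf sketch; a threshold of 0 logs every completed statement, and the server needs a configuration reload for the change to take effect:)

```ini
# postgresql.conf -- a value of 0 logs the duration (and text) of every
# completed statement; -1, the default, disables this logging entirely
log_min_duration_statement = 0
```

Be aware that logging every statement on a busy server can produce very large log files, so a small positive threshold such as 250 (milliseconds) is usually the safer starting point.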

On 10/01/2010 04:30 AM, Thom Brown wrote:
> 2010/10/1 Bjørn T Johansen <btj(at)havleik(dot)no>:
>> We are using both DB2 and PostgreSQL at work, and DB2 has a nice tool, i5 Navigator, where one can enable logging of SQL statements and then it will
>> recommend indexes that should/could be created to increase speed...
>> Does there exist a similar tool for PostgreSQL?
>
> You can set log_min_duration_statement to log statements which take
> over a certain amount of time, and then use pgFouine to read the log
> files and identify the most frequently run queries, and the longest
> queries.
>
> You can also use the auto_explain contrib module
> (http://www.postgresql.org/docs/9.0/static/auto-explain.html) to log
> the plans of queries which take too long. However, I don't think
> pgFouine can use those outputs... at least not yet.
>
> But to find out which indexes you'll need, getting used to reading
> query plans will help, as it will show you more than just where
> sequential scans are taking place. It will also show you what the
> planner believes a query will cost compared to how much it actually
> costs, which can provide insight into tables which require vacuuming,
> indexes which might need clustering, or table stats which require
> modifying to match your data.
>
> There might be a tool out there for PostgreSQL like you describe,
> although I'm not personally aware of it.
>
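Both suggestions above can be tried from a single session; a minimal sketch, where the table and column names are made-up examples and 250 ms is an arbitrary threshold:

```sql
-- Load the auto_explain contrib module for this session (LOAD requires
-- superuser) and log the plan of any statement slower than 250 ms.
LOAD 'auto_explain';
SET auto_explain.log_min_duration = 250;

-- EXPLAIN ANALYZE actually runs the statement and shows the planner's
-- estimated row counts and costs next to the actual ones.
-- (orders and customer_id are hypothetical names.)
EXPLAIN ANALYZE
SELECT * FROM orders WHERE customer_id = 42;
```

In the resulting plan, compare each node's "rows=<estimate>" with its "actual ... rows=<n>"; a large mismatch usually points at stale statistics, which running ANALYZE on the table (or raising the column's statistics target) can correct.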
