From: bricklen <bricklen(at)gmail(dot)com>
To: jesse(dot)hietanen(at)vaisala(dot)com
Cc: "pgsql-performance(at)postgresql(dot)org list" <pgsql-performance(at)postgresql(dot)org>
Subject: Re: Inefficient max query when using group by
Date: 2017-05-04 13:21:47
Message-ID: CAGrpgQ9+n3-bzq=bXi8SW8ivUDjyLoxW_b7sXyx=qYCT4Xxmww@mail.gmail.com
Lists: pgsql-performance
On Thu, May 4, 2017 at 3:52 AM, <jesse(dot)hietanen(at)vaisala(dot)com> wrote:
> Hi,
>
> I have a performance problem with my query. As a simplified example, I
> have a table called Book, which has three columns: id, released (timestamp),
> and author_id. I need to search for the latest books released by multiple
> authors, as of a specific point in history. This could be the latest book
> between the beginning of time and now, or the latest book released last
> year, etc. In other words, only the latest book for each author, within a
> specific time window. I also have a combined index on the released and
> author_id columns.
>
As for the query itself, I suspect you are paying a penalty for the
to_timestamp() calls. Try the same query with hard-coded timestamps:
"AND released <= '2017-05-05 00:00:00' AND released >= '1970-01-01 00:00:00'"
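The original query isn't quoted in full here, so the following is only a hypothetical reconstruction of its general shape (a max() with GROUP BY, per the subject line); the point is simply replacing the to_timestamp() calls with timestamp literals, which lets the planner see the range bounds directly:

```sql
-- Hypothetical query shape; table and column names taken from the
-- description above. Literal timestamps replace to_timestamp() calls.
SELECT author_id, max(released) AS latest_release
FROM book
WHERE released <= '2017-05-05 00:00:00'
  AND released >= '1970-01-01 00:00:00'
GROUP BY author_id;
```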
If you need these queries to be lightning fast, then this looks like a good
candidate for a materialized view:
https://www.postgresql.org/docs/current/static/sql-creatematerializedview.html
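A minimal sketch of that approach (the view name and column list are assumptions, not from the thread) -- precompute the latest release per author and refresh it when the data changes:

```sql
-- Assumed view name; caches the per-author latest release over all time.
CREATE MATERIALIZED VIEW latest_book_per_author AS
SELECT author_id, max(released) AS latest_release
FROM book
GROUP BY author_id;

-- Re-run the view's query to bring the cached results up to date.
REFRESH MATERIALIZED VIEW latest_book_per_author;
```

Note that a materialized view caches one fixed result set, so this covers the "beginning of time until now" case; queries over arbitrary historical windows would still need to hit the base table.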
Next Message: David Rowley, 2017-05-04 13:51:34, Re: Inefficient max query when using group by
Previous Message: jesse.hietanen, 2017-05-04 10:52:03, Inefficient max query when using group by