From: | aditya desai <admad123(at)gmail(dot)com> |
---|---|
To: | "David G(dot) Johnston" <david(dot)g(dot)johnston(at)gmail(dot)com> |
Cc: | Pgsql Performance <pgsql-performance(at)lists(dot)postgresql(dot)org>, Michael Lewis <mlewis(at)entrata(dot)com> |
Subject: | Re: CPU Consuming query. Sequential scan despite indexing. |
Date: | 2020-10-22 05:36:08 |
Message-ID: | CAN0SRDH2A_nq1jvpz92xWtaFPq1xLxOK9V35i1Hi+M2FZc0J=w@mail.gmail.com |
Lists: | pgsql-performance |
Hi David,
Thanks for the suggestion. Let me try to implement this as well. Will get
back to you soon.
Regards,
Aditya.
On Thu, Oct 22, 2020 at 11:03 AM David G. Johnston <
david(dot)g(dot)johnston(at)gmail(dot)com> wrote:
> On Wed, Oct 21, 2020 at 10:22 PM aditya desai <admad123(at)gmail(dot)com> wrote:
>
>> As per application team, it is business requirement to show last 60 days
>>> worth data.
>>>
>>
> I didn't look deeply but it sounds like you are looking backwards into 60
> days worth of detail every single time you perform the query and computing
> an aggregate directly from the detail. Stop doing that. By way of
> example, at the end of every day compute the aggregates on the relevant
> dimensions and save them. Then query the saved aggregates from previous
> days and add them to the computed aggregate from the current day's detail.
>
> David J.
>
>
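The pre-aggregation approach David describes could be sketched in SQL roughly as follows. This is only an illustration of the pattern, assuming a hypothetical `sales_detail` table with `amount` and `created_at` columns and a rollup table `daily_sales_summary`; none of these names come from the thread.

```sql
-- Hypothetical rollup table holding one pre-computed row per day.
CREATE TABLE daily_sales_summary (
    summary_date date PRIMARY KEY,
    total_amount numeric NOT NULL,
    row_count    bigint  NOT NULL
);

-- Run once at the end of each day (e.g. via cron or pg_cron)
-- to aggregate yesterday's detail rows.
INSERT INTO daily_sales_summary (summary_date, total_amount, row_count)
SELECT current_date - 1, sum(amount), count(*)
FROM sales_detail
WHERE created_at >= current_date - 1
  AND created_at <  current_date;

-- The 60-day report then reads 59 small pre-computed rows
-- plus only the current day's detail, instead of scanning
-- 60 days of detail on every execution.
SELECT sum(total_amount) AS total_last_60_days
FROM (
    SELECT total_amount
    FROM daily_sales_summary
    WHERE summary_date >= current_date - 59
    UNION ALL
    SELECT sum(amount)
    FROM sales_detail
    WHERE created_at >= current_date
) combined;
```

A materialized view refreshed nightly could serve the same purpose; the key point is that historical days are aggregated once, not on every query.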