Re: Query takes around 15 to 20 min over 20Lakh rows

From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: "David G(dot) Johnston" <david(dot)g(dot)johnston(at)gmail(dot)com>
Cc: Shubham Mittal <mittalshubham30(at)gmail(dot)com>, PostgreSQL General <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Re: Query takes around 15 to 20 min over 20Lakh rows
Date: 2021-09-02 23:22:46
Message-ID: 2441707.1630624966@sss.pgh.pa.us
Lists: pgsql-general

"David G. Johnston" <david(dot)g(dot)johnston(at)gmail(dot)com> writes:
> On Thu, Sep 2, 2021 at 3:16 PM Shubham Mittal <mittalshubham30(at)gmail(dot)com>
> wrote:
>> Please help in optimizing this query. I need to actually generate reports
>> daily using this query. It takes almost 15 to 20 min to execute this query
>> due to the joins.

> Use jsonb_populate_recordset (or one of its siblings) to get rid of as many
> of these key-based value extraction operations as possible and build a
> table from the contents of the jsonb.
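> A minimal sketch of that approach, assuming a hypothetical table "reports"
> with an id column and a payload jsonb column holding an array of objects
> whose keys are status and created_at, might look like:
>
>   -- hypothetical composite type matching the keys inside the jsonb array
>   CREATE TYPE report_row AS (status text, created_at timestamptz);
>
>   -- expand each jsonb array into rows once, instead of repeated ->> calls
>   SELECT r.id, x.status, x.created_at
>   FROM reports r
>   CROSS JOIN LATERAL
>        jsonb_populate_recordset(NULL::report_row, r.payload) AS x;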

While those duplicative extractions sure look inefficient, it's not
clear from the (lack of) given facts whether that's the main cost,
or whether the complicated FROM clause is producing a bad plan.
I'd suggest first looking at EXPLAIN ANALYZE output to verify which
plan step(s) are slow. If it's the final output step that's expensive,
then yes the next step is to optimize the extractions.
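As an illustration only (the actual query isn't shown here), capturing the
plan with timing and buffer usage could look like:

  EXPLAIN (ANALYZE, BUFFERS)
  <the report query>;

The node(s) where most of the actual time is spent tell you whether the
joins or the per-row jsonb extraction is the real cost.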

Otherwise, see

https://wiki.postgresql.org/wiki/Slow_Query_Questions

regards, tom lane
