From: Michał Kłeczek <michal(at)kleczek(dot)org>
To: Lok P <loknath(dot)73(at)gmail(dot)com>
Cc: pgsql-general <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Re: How batch processing works
Date: 2024-09-21 04:21:12
Message-ID: 4178E73A-24F5-4E3C-92F6-1532D8102C3E@kleczek.org
Lists: pgsql-general
Hi,
> On 19 Sep 2024, at 07:30, Lok P <loknath(dot)73(at)gmail(dot)com> wrote:
>
[snip]
>
> Method-4
>
> INSERT INTO parent_table VALUES (1, 'a'), (2, 'a');
> INSERT INTO child_table VALUES (1,1, 'a'), (1,2, 'a');
> commit;
I’ve done some batch processing of JSON messages from Kafka in Java.
By far the most performant way was to:
1. Use prepared statements
2. Parse JSON messages in Postgres
3. Process messages in batches
All three can be achieved by using arrays to pass batches:
WITH parsed AS (
  SELECT msg::json FROM unnest(?) AS t(msg)
),
parents AS (
  INSERT INTO parent SELECT … FROM parsed RETURNING ...
)
INSERT INTO child SELECT … FROM parsed …
Note the single parameter, which you can bind to a String[].
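As a rough sketch of what binding that single array parameter looks like from Java with the PostgreSQL JDBC driver: the table and column names, the JSON field layout, and the batch-splitting helper below are illustrative assumptions, not from the statement above.

```java
import java.sql.Array;
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.ArrayList;
import java.util.List;

public class BatchJsonLoader {

    // One prepared statement, one parameter: the whole batch travels as a text[].
    // Column names (id, payload) and JSON keys are assumptions for the sketch.
    static final String SQL = """
        WITH parsed AS (
          SELECT msg::json AS doc FROM unnest(?::text[]) AS t(msg)
        ),
        parents AS (
          INSERT INTO parent (id, payload)
          SELECT (doc->>'id')::int, doc->>'payload' FROM parsed
          RETURNING id
        )
        INSERT INTO child (parent_id, payload)
        SELECT (doc->>'id')::int, doc->>'payload' FROM parsed
        """;

    // Bind the whole batch to the single placeholder: one round trip per batch.
    static void insertBatch(Connection conn, List<String> jsonMessages) throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(SQL)) {
            Array arr = conn.createArrayOf("text", jsonMessages.toArray(new String[0]));
            ps.setArray(1, arr);
            ps.executeUpdate();
        }
    }

    // Split the incoming Kafka message stream into fixed-size batches.
    static List<List<String>> batches(List<String> messages, int size) {
        List<List<String>> out = new ArrayList<>();
        for (int i = 0; i < messages.size(); i += size) {
            out.add(messages.subList(i, Math.min(i + size, messages.size())));
        }
        return out;
    }
}
```

With this shape a batch of N messages costs a single network round trip and a single statement execution, and the JSON parsing happens server-side in the `parsed` CTE.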
Hope that helps.
--
Michal