From: Tomas Vondra <tomas(dot)vondra(at)enterprisedb(dot)com>
To: Kostas Chasialis <koschasialis(at)gmail(dot)com>, pgsql-hackers(at)postgresql(dot)org
Subject: Re: [ERROR] Copy from CSV fails due to memory error.
Date: 2022-01-19 14:47:41
Message-ID: 0dc23243-65b6-8cb9-8925-d91a9d7de9d2@enterprisedb.com
Lists: pgsql-hackers
On 1/19/22 14:01, Kostas Chasialis wrote:
> Hey.
>
> I am facing an issue when I try to run the following command
>
> COPY <table_name> FROM <file> WITH DELIMITER E',';
>
> This file is rather large, around 178 GB.
>
> When I try to run this COPY command I get the following error:
>
> ERROR: out of memory
> DETAIL: Failed on request of size 2048 in memory context "AfterTriggerEvents".
> CONTEXT: COPY ssbm300_lineorder, line 50796791
>
> Clearly a memory allocation function is failing but I have no clue how to fix it.
>
> I have tried experimenting with the shared_buffers value in postgresql.conf, but after searching a bit I quickly realized that I do not know what I am doing there, so I left it at the default value. Same with work_mem.
>
> Did you face this issue before? Can you help me resolve it?
>
Well, it's clearly related to "after" triggers - do you have any such
triggers on the table? AFAIK it might also be caused by deferred
constraints (like unique / foreign keys). Do you have anything like that?
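One way to check is to query the catalogs directly; something like this
(a sketch, using the table name from the error context) lists the
constraints and triggers on the table and whether they are deferrable:

```sql
-- Constraints on the table: condeferrable / condeferred show whether
-- the constraint can be (or is by default) deferred.
SELECT conname, contype, condeferrable, condeferred
FROM pg_constraint
WHERE conrelid = 'ssbm300_lineorder'::regclass;

-- Triggers on the table, including the internal ones that back
-- foreign-key constraints.
SELECT tgname, tgdeferrable, tginitdeferred
FROM pg_trigger
WHERE tgrelid = 'ssbm300_lineorder'::regclass;
```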
If yes, I guess the only solution is to make the constraints not
deferred, or to split the COPY into smaller chunks.
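The chunked approach can be as simple as splitting the file and running
one COPY per chunk, so each transaction's after-trigger queue stays
bounded. A minimal sketch (a small generated sample stands in for the
real 178 GB file; file and table names are illustrative):

```shell
# Generate a small sample CSV standing in for the real export.
seq 1 100 | sed 's/.*/&,dummy/' > lineorder_sample.csv

# Split into fixed-size chunks (25 lines each here; in practice you
# would use something like -l 10000000 for a file this big).
split -l 25 lineorder_sample.csv chunk_

# Load each chunk in its own transaction. Shown as an echo here;
# the real command would be something like:
#   psql -c "\copy ssbm300_lineorder FROM '$f' WITH (FORMAT csv)"
for f in chunk_*; do
  echo "would load $f"
done
```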
regards
--
Tomas Vondra
EnterpriseDB: http://www.enterprisedb.com
The Enterprise PostgreSQL Company