From: | Adrien Nayrat <adrien(dot)nayrat(at)anayrat(dot)info> |
---|---|
To: | Anto Aravinth <anto(dot)aravinth(dot)cse(at)gmail(dot)com> |
Cc: | <pgsql-general(at)lists(dot)postgresql(dot)org> |
Subject: | Re: Using COPY to import large xml file |
Date: | 2018-06-24 16:15:33 |
Message-ID: | e9d0128c-23d7-d162-7922-cc8436c835b7@anayrat.info |
Lists: | pgsql-general |
On 06/24/2018 06:07 PM, Anto Aravinth wrote:
> Thanks for the response. I'm not sure how long this tool takes for the
> 70GB data.
From memory, it took several hours. I can't remember whether the XML conversion
or the inserts took longer.
>
> I used Node.js to stream the XML file into inserts, which was very slow.
> The XML contains 40 million records, of which 10 million took around
> 2 hours using Node.js. Hence, I thought I would use the COPY command, as suggested on the
> internet.
>
> Definitely, I will try the code and let you know. But it looks like it uses the same
> INSERT, not COPY. It would be interesting if it runs quickly on my machine.
Yes, it uses INSERT; it should not be difficult to change the code to use COPY instead.
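As a rough illustration of the COPY route, the sketch below streams `<row>` elements out of a large XML file and emits one line per record in PostgreSQL's COPY text format (tab-separated, with `\N` for NULL). This is only a sketch under assumptions: the element name `row`, the attribute names, and the target column list are hypothetical placeholders, not taken from the actual dump.

```python
# Hypothetical sketch: convert XML rows into COPY text-format lines.
# Assumes the dump stores one record per <row .../> element with
# attributes; COLUMNS is a made-up column list for illustration.
import io
import xml.etree.ElementTree as ET

COLUMNS = ["Id", "PostTypeId", "Body"]  # assumed attribute names


def esc(value):
    """Escape a value for PostgreSQL COPY text format (NULL -> \\N)."""
    if value is None:
        return r"\N"
    return (value.replace("\\", "\\\\")
                 .replace("\t", "\\t")
                 .replace("\n", "\\n")
                 .replace("\r", "\\r"))


def xml_to_copy_lines(xml_stream):
    """Stream <row> elements and yield one COPY text line per record."""
    for _, elem in ET.iterparse(xml_stream, events=("end",)):
        if elem.tag == "row":
            yield "\t".join(esc(elem.get(c)) for c in COLUMNS) + "\n"
            elem.clear()  # release the element: essential for a 70 GB file

if __name__ == "__main__":
    sample = io.BytesIO(
        b'<posts><row Id="1" PostTypeId="1" Body="a&#10;b"/></posts>')
    for line in xml_to_copy_lines(sample):
        print(line, end="")
```

The generated lines can then be fed to the server in one stream, e.g. piped into `psql -c "\copy posts FROM STDIN"`, which avoids the per-statement overhead of millions of INSERTs.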
--
Adrien NAYRAT
https://blog.anayrat.info