From: "David G(dot) Johnston" <david(dot)g(dot)johnston(at)gmail(dot)com>
To: Ravi Krishna <srkrishna(at)yahoo(dot)com>
Cc: PG mailing List <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Re: Load data from a csv file without using COPY
Date: 2018-06-19 20:49:33
Message-ID: CAKFQuwYhfbUqbevbgv=nT2HmOeeAcncUwZ=YWdyF8Y3+Nz9f_w@mail.gmail.com
Lists: pgsql-general
On Tue, Jun 19, 2018 at 1:16 PM, Ravi Krishna <srkrishna(at)yahoo(dot)com> wrote:
> In order to test a real life scenario (and use it for benchmarking) I want
> to load a large amount of data from csv files.
> The requirement is that the load should happen like an application writing
> to the database (that is, no COPY command).
> Is there a tool which can do the job? Basically parse the csv file and
> insert it into the database row by row.
>
I'm skeptical that ingesting CSV in any form, even if you intentionally
blow things up by converting it into:
BEGIN;
INSERT INTO tbl VALUES ('','','');
COMMIT;
BEGIN;
INSERT INTO tbl VALUES ('','','');
COMMIT;
(which is what auto-commit mode looks like), is going to provide a
meaningful benchmark for application-like usage patterns.
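
For reference, a minimal sketch (not from the original message) of what
that row-by-row, autocommit-style loading could look like in Python with
psycopg2; the table name, column count, and file name are hypothetical:

    import csv
    import psycopg2

    conn = psycopg2.connect("dbname=test")
    conn.autocommit = True  # each INSERT commits on its own, as above

    with open("data.csv", newline="") as f, conn.cursor() as cur:
        for row in csv.reader(f):
            # one INSERT per CSV row; parameters handle quoting safely
            cur.execute("INSERT INTO tbl VALUES (%s, %s, %s)", row)

    conn.close()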
But anyway, I'm not familiar with any tools that make doing this
particularly simple. In most situations like this I'll just import the CSV
into a spreadsheet and create a formula that builds out the individual SQL
commands. Whether that's useful depends a lot on how often the source CSV
is updated.
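
As an illustration of that approach (not from the original message), a
formula along these lines in the first data row, filled down the sheet,
emits one INSERT per CSV row; the table name and columns are hypothetical:

    ="INSERT INTO tbl VALUES ('" & A1 & "','" & B1 & "','" & C1 & "');"

The generated column can then be pasted into psql. Note that this naive
quoting breaks if the data itself contains single quotes.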
That said, I have found the following tool to be generally helpful in this
area, though I'm thinking it doesn't do what you want here.
http://csvkit.readthedocs.io/en/1.0.3/scripts/csvsql.html
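
As a sketch of how csvsql might be invoked for this (flags per the csvkit
docs; the database URL, table name, and file name are placeholders):

    csvsql --db postgresql:///testdb --tables tbl --insert data.csv

Note that csvsql loads rows through SQLAlchemy and may batch its inserts
rather than committing each row individually, which is likely why it
doesn't match the stated requirement.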
David J.