Re: create batch script to import into postgres tables

From: Pepe TD Vo <pepevo(at)yahoo(dot)com>
To: Christopher Browne <cbbrowne(at)gmail(dot)com>
Cc: Pgsql-admin <pgsql-admin(at)postgresql(dot)org>
Subject: Re: create batch script to import into postgres tables
Date: 2020-06-16 15:42:18
Message-ID: 1761277119.1163490.1592322138731@mail.yahoo.com
Lists: pgsql-admin pgsql-general

Yes, I do have PuTTY installed, but I can't connect to the AWS Postgres instance with it; it only works for the Oracle instance. I can only connect to the Postgres instance using pgAdmin. When I follow the URL, the login prompts for a username and then hangs there.
Thank you.
Bach-Nga

No one in this world is pure and perfect. If you avoid people for their mistakes you will be alone. So judge less, love, and forgive more.
To call him a dog hardly seems to do him justice, though in as much as he had four legs, a tail, and barked, I admit he was, to all outward appearances. But to those who knew him well, he was a perfect gentleman (Hermione Gingold)
**Live simply **Love generously **Care deeply **Speak kindly.*** Genuinely rich *** Faithful talent *** Sharing success

On Tuesday, June 16, 2020, 11:17:21 AM EDT, Christopher Browne <cbbrowne(at)gmail(dot)com> wrote:


On Tue, 16 Jun 2020 at 10:59, Pepe TD Vo <pepevo(at)yahoo(dot)com> wrote:

I can run \copy in Linux to import an individual CSV file into its table fine, and I can run an import into the AWS instance using pgAdmin. What I am trying to do is run \copy for all the CSV files, importing each one into its own table, both in Linux and in the AWS instance. Loading all the CSV files into one table would be simple, but here each CSV goes to its own table. Should I create one batch job for each imported table? If each batch file imports its CSV into its table via \copy table_name(col1, col2, ... coln) from '/path/tablename.csv' delimiter ',' csv header; that would be fine, right?

There is no single straightforward answer to that.
Supposing I want a batch to either be processed in full or not at all, I might write an SQL file like:
begin;
\copy table_1 (c1, c2, c3) from '/path/tabledata1.csv' csv header;
\copy table_2 (c1, c2, c3) from '/path/tabledata2.csv' csv header;
\copy table_3 (c1, c2, c3) from '/path/tabledata3.csv' csv header;
commit;
But you may be fine with having a separate SQL script for each table.
There will be conditions where one or the other is more appropriate, and that will be based on the requirements of the process.
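As a rough sketch, a driver script on the Linux side could also generate the \copy commands from the file names. This assumes each CSV is named after its target table, that the tables already exist with matching columns, and that connection settings come from the usual libpq environment variables or ~/.pgpass (all assumptions on my part; adjust to your layout):

#!/bin/sh
# Import every CSV under /path into the table matching its file name.
# Assumes the target tables exist and their columns match the CSV headers.
for f in /path/*.csv; do
    table=$(basename "$f" .csv)
    psql -v ON_ERROR_STOP=1 \
         -c "\copy ${table} from '${f}' delimiter ',' csv header" \
         || { echo "import of ${f} failed" >&2; exit 1; }
done

Note that this runs each import in its own session, so it gives per-table failure handling rather than the all-or-nothing behaviour of the single begin/commit file above.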

Also, the problem is that I can't execute psql from the Windows client to reach the AWS instance, and I don't know how to create the batch script for this run. I tried a simple \copy pulling from c:\tes.csv, and psql came back as an unknown command.

You cannot run psql without having it installed; there is a Windows installer for PostgreSQL, so you could use that to get it installed.
Hopefully there is an installer that will install just the PostgreSQL client software (like psql and pg_dump, and notably *not* the database server software); I don't use Windows, so I am not too familiar with that.
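Once psql is on the Windows PATH, a batch file along the same lines might be a starting point. This is only a sketch; the host name, user, database, and C:\data directory below are placeholders I made up, not anything from your setup:

@echo off
rem Import one CSV per table from the Windows client into the remote instance.
rem Replace the host, user, database, and directory with your own values.
set PGPASSWORD=yourpassword
for %%f in (C:\data\*.csv) do (
    psql -h your-instance.example.com -U your_user -d your_db ^
         -v ON_ERROR_STOP=1 ^
         -c "\copy %%~nf from '%%f' delimiter ',' csv header"
)

Here %%~nf expands to the file name without its extension, so each CSV lands in the table of the same name, as in the shell version above.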
 --
When confronted by a difficult problem, solve it by reducing it to the
question, "How would the Lone Ranger handle this?"
