From: | "Johnson, Shaunn" <SJohnson6(at)bcbsm(dot)com> |
---|---|
To: | "pg-general (E-mail)" <pgsql-general(at)postgresql(dot)org> |
Subject: | DBI connection to multiple database |
Date: | 2003-05-21 14:18:04 |
Message-ID: | 73309C2FDD95D11192E60008C7B1D5BB05FED33F@snt452.corp.bcbsm.com |
Lists: pgsql-general
Howdy
Running PostgreSQL 7.2.1 on RedHat Linux 7.2.
I know this may be a non-Pg question, but here goes:
Is there a way to connect to one database, extract data
(say, with some select query), and then export that data
into another database without having to dump the data
into a file of some sort?
For example, let's say I have two databases running on the
same machine. I have a script that opens a connection to
Pg (at some port number) and executes a SQL query that
gives me back a variable (maybe an array?) to review / test / QC.
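Roughly, what I'm imagining for that first step is something like
the sketch below (the database name, port, user, and query are just
placeholders; I haven't tested this):

#!/usr/bin/perl
# Sketch only: pull the extract query into an in-memory array.
use strict;
use DBI;

# Connect to the source database (name / port / user are made up).
my $src = DBI->connect('dbi:Pg:dbname=source_db;host=localhost;port=5432',
                       'someuser', 'somepass', { RaiseError => 1 });

# Fetch every row of the select into an array of array refs.
my $rows = $src->selectall_arrayref('SELECT id, name, amount FROM some_table');

print scalar(@$rows), " rows fetched for review / QC\n";

$src->disconnect;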
Then, I would need to open a connection to Pg (at some other
port), figure out how to test whether the target table
exists and has the right number of columns, and then
export / insert / copy the previously built array into
that target table.
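And for that second step, maybe something like this (again, the
target table name, columns, and expected column count are only
placeholders; the $rows value stands in for the array built from
the source query above):

#!/usr/bin/perl
# Sketch only: check the target table, then load the array into it.
use strict;
use DBI;

# In practice this would be the array fetched from the source database.
my $rows = [ [ 1, 'example', 10 ] ];

# Connect to the target database (running on another port).
my $tgt = DBI->connect('dbi:Pg:dbname=target_db;host=localhost;port=5433',
                       'someuser', 'somepass', { RaiseError => 1 });

# Does the target table exist, and does it have the right number of columns?
my ($ncols) = $tgt->selectrow_array(q{
    SELECT count(*)
    FROM pg_class c, pg_attribute a
    WHERE a.attrelid = c.oid
      AND c.relname  = 'target_table'
      AND a.attnum   > 0
});
die "target_table is missing or has the wrong number of columns\n"
    unless $ncols == 3;

# Insert the previously built array row by row.
my $sth = $tgt->prepare(
    'INSERT INTO target_table (id, name, amount) VALUES (?, ?, ?)');
$sth->execute(@$_) for @$rows;

$tgt->disconnect;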
It sounds straightforward (at least now that I'm thinking
out loud), but I've never done it and wondered if any of
you have done something like that. If so, any tips or
suggestions?
Thanks!
-X