From: John R Pierce <pierce(at)hogranch(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: Getting Mysql data into Postgres: least painful methods?
Date: 2013-01-12 03:13:25
Message-ID: 50F0D4D5.6070601@hogranch.com
Lists: pgsql-general
On 1/11/2013 3:54 PM, Ken Tanzer wrote:
> Here's the fuller description of what I'm trying to do. I've got a
> dataset (a UMLS/Metathesaurus subset) that I need to get into a
> Postgres database. It's all reference data, and so will be
> read-only. There are no functions or logic involved. I anticipate
> having to update it at least quarterly, so I'd like to get to a
> well-grooved import process.
how many tables? if it's just one or a couple of tables, can you get the
data as CSV? then it would be trivial to import into postgres using
the COPY command (or \copy from psql)...
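
if it is just CSV, something like this is all the import process needs. the table name, columns, and file path below are made up for illustration; the real Metathesaurus files have their own layout:

  -- one-time setup: a table matching the CSV columns (types guessed here)
  CREATE TABLE umls_concepts (cui text, term text, source text);

  -- quarterly refresh: wipe and reload the reference data
  TRUNCATE umls_concepts;

  -- server-side load; the path is read by the postgres server process
  COPY umls_concepts FROM '/tmp/concepts.csv' CSV HEADER;

  -- or client-side from psql, reading the file where psql runs:
  -- \copy umls_concepts from 'concepts.csv' csv header

note that COPY needs a file readable by the database server itself, while \copy streams the file from the client machine, which is usually the more convenient option.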
another alternative: investigate "ETL" tools. these are general-purpose
data manglers that can connect to a source database (usually any of
about 20 supported types), extract the data, transform it if needed, and
load it into a destination database (again from a list of 20 or so
typically supported).