RE: Converting Access tables to PostGre

From: Dave Page <dpage(at)vale-housing(dot)co(dot)uk>
To: "'Richard Holland'" <rholland(at)equitechinfo(dot)com>, pgsql-interfaces(at)postgresql(dot)org
Subject: RE: Converting Access tables to PostGre
Date: 2000-12-05 15:07:18
Message-ID: 8568FC767B4AD311AC33006097BCD3D61A27B1@woody.vale-housing.co.uk
Lists: pgsql-interfaces

> -----Original Message-----
> From: Richard Holland [mailto:rholland(at)equitechinfo(dot)com]
> Sent: 05 December 2000 14:41
> To: pgsql-interfaces(at)postgresql(dot)org
> Subject: [INTERFACES] Converting Access tables to PostGre
>
>
> I've come across tons of information on using Access with
> PostGre, and am thankful for this. Be this as it may, the
> upsizing tool I
> found to convert our access databases to postgre seems to
> puke on our 800+ meg databases (which is why we're doing this, our
> software was created by idiots with no scalability in mind).
> I downloaded pgadmin, the latest version as of a week ago. This works
> great, congrats to it's maintainers.

Thanks.

> My problem is the same
> above mentioned idiots used double and single field types to handle
> money. Being that our software handles property tax for
> government counties, we can't have any rounding errors. So I
> figured on
> using the numeric(x,y) type, so I fire up the migration tool
> built into pgadmin, edit the typemaps to throw singles and doubles to
> numeric, and it fails because it doesnt seem to give the
> numerics a length. It feeds postgre fieldname
> numeric(0,lenghtofdecimalsspecifiedinaccess)

pgAdmin attempts to work out the precision and scale of the numeric from the
source database using the Precision and NumericScale properties of the ADOX
Column object representing the source field (if I'm using the wrong
properties here, corrections would be appreciated...). As the source database
holds floats/longs (which carry no (x,y) values), you will get odd results
(e.g. zeros).
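One way around this, sketched here in Python rather than pgAdmin's actual code, is to substitute a sensible default whenever the source driver reports no usable precision/scale, instead of emitting numeric(0,y). The function name and the default of numeric(15,4) are illustrative assumptions, not anything pgAdmin does today:

```python
def map_to_numeric(precision, scale, default_precision=15, default_scale=4):
    """Return a PostgreSQL numeric(x,y) type string for a source column.

    If the source reports a zero precision (as Access floats/doubles do),
    fall back to a fixed default that is safe for money values.
    """
    if precision <= 0 or scale > precision:
        # Source gave us nothing usable; apply the fallback.
        precision, scale = default_precision, default_scale
    return "numeric(%d,%d)" % (precision, scale)

# A double with no reported precision gets the fallback:
print(map_to_numeric(0, 2))    # numeric(15,4)
# A well-described source column passes through unchanged:
print(map_to_numeric(12, 2))   # numeric(12,2)
```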

> this fails.. anybody have a better solution for migrating
> our data? I can make all 300 tables by hand if need be, but
> it's getting
> the millions of records in i'm worried about.

I suspect that pgAdmin would fail with a migration this large as well, as it
creates a recordset from the source table and uses that to fire INSERT
queries at PostgreSQL; in my experience ADO recordsets become unusable around
the 100,000-record mark.

I would suggest a different tactic along the lines of:

1) Migrate the table/index definitions 'as-is' using pgAdmin's migration
wizard.
2) Dump the PostgreSQL database using pg_dump and use vi or something to
change the floats to numerics.
3) Reload the database.
4) Write some Access/VB code to dump the source databases to ASCII text
files. Use a quote and a delimiter eg:

'fld1'|'fld2'|'fld3'|

Hint: If your data contains the ' character, either replace each one with ''
or don't use ' as the quote character.

5) Use the pgAdmin Import Wizard to load the ASCII files.
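The dump format in step 4 can be sketched as follows. Dave suggests Access/VB for the real job; this is a minimal Python illustration of the same idea (the function name is mine), quoting each field with ', appending the | delimiter, and doubling embedded quotes as the hint advises:

```python
def format_row(fields, quote="'", delim="|"):
    """Render one record as 'fld1'|'fld2'|...| with embedded quotes doubled."""
    out = []
    for f in fields:
        f = str(f).replace(quote, quote * 2)   # ' -> ''
        out.append(quote + f + quote + delim)
    return "".join(out)

print(format_row(["fld1", "fld2", "fld3"]))
# 'fld1'|'fld2'|'fld3'|
print(format_row(["O'Brien", 1234.56]))
# 'O''Brien'|'1234.56'|
```

Writing one such line per record gives a file the pgAdmin Import Wizard (or a later COPY) can consume.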

I'd be interested to know how pgAdmin copes with such large text files if
you try this method...

HTH, Regards,
Dave.
