From: "Brett W(dot) McCoy" <bmccoy(at)chapelperilous(dot)net>
To: Matt <matthewf9(at)aol(dot)com>
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: Import Database
Date: 2001-02-05 20:22:31
Message-ID: Pine.LNX.4.30.0102051516040.30791-100000@chapelperilous.net
Lists: pgsql-general
On Mon, 29 Jan 2001, Matt wrote:
> I am trying to find out if importing a very large delimited text file is
> faster with PostgreSQL or MySQL (with mysqlimport). Each night the
> transaction system we use completes a text file of the day's activities,
> which must be loaded into a database. Speed is very important: mysqlimport
> takes less than an hour, but it sometimes crashes. Is PostgreSQL likely to
> be faster or slower at importing such vast amounts of data?
How much data are you talking about? Megabytes? Gigabytes?
PostgreSQL will load fairly fast if you turn off fsync and delete your
indexes and rebuild them after the import. I haven't played with large
imports on the newer Postgres, but a couple of years ago I was importing
millions of rows into 6.5 on a lowly Pentium 200, with no indexes and with
fsync turned off. I had to load each table separately (each one was
several million rows of plain old delimited text), and they loaded
fairly quickly -- maybe 10 or 15 minutes, just using the COPY command
inside of psql. With fsync on and indexes in place, it took *hours* to
load and basically slowed the server to a crawl because of the I/O
overhead.
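For what it's worth, here is a minimal sketch of that kind of load, run
from inside psql. The table name, index name, delimiter, and file path are
made up, and the COPY syntax shown is the old USING DELIMITERS form from
the 6.5/7.0 days, so adjust it to your own schema and version:

    -- Drop the index first so COPY doesn't pay index maintenance per row.
    DROP INDEX daily_activity_idx;

    -- Bulk-load the delimited file.  The path is read by the backend
    -- process, so it has to be readable on the server, not just on
    -- your client.  Match the delimiter to whatever your export uses.
    COPY daily_activity FROM '/data/export/activity.txt'
        USING DELIMITERS '|';

    -- Rebuild the index once, after all the rows are in.
    CREATE INDEX daily_activity_idx ON daily_activity (activity_date);

Turning fsync off happens at the server level rather than in the script --
on the older versions I was using, that meant starting the postmaster with
the -o -F backend option (check the docs for your version), and you would
want it back on once the nightly load finishes.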
-- Brett