Re: COPY to table with array columns (Longish)

From: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
To: phillips(at)weatherbeeta(dot)com(dot)au
Cc: "'Aaron Bono'" <aaron(dot)bono(at)aranya(dot)com>, pgsql-sql(at)postgresql(dot)org
Subject: Re: COPY to table with array columns (Longish)
Date: 2006-06-13 00:30:14
Message-ID: 14241.1150158614@sss.pgh.pa.us
Lists: pgsql-sql

"Phillip Smith" <phillips(at)weatherbeeta(dot)com(dot)au> writes:
> The whole sys file is variable length records like this - they range
> from 1 to over 17,000 fields per record.

17000? I think you really need to rethink your schema. While you could
theoretically drop 17000 elements into a PG array column, you wouldn't
like the performance --- it'd be almost unsearchable for instance.

I'd think about two tables, one with a single row for each SYS record
from the original, and one with one row for each detail item (the
invoice numbers in this case). With suitable indexes and a foreign key
constraint, this will perform a lot better than an array-based
translation.
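
Roughly like this, say (the column names are only placeholders, since
I haven't seen your actual SYS layout):

CREATE TABLE sys_record (
    sys_id      integer PRIMARY KEY
    -- plus whatever other per-record fields the SYS file carries
);

CREATE TABLE sys_invoice (
    sys_id      integer NOT NULL REFERENCES sys_record,
    invoice_no  integer NOT NULL
);

CREATE INDEX sys_invoice_sys_id_idx ON sys_invoice (sys_id);
CREATE INDEX sys_invoice_invoice_no_idx ON sys_invoice (invoice_no);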

And no, in neither case will you be able to import that file without
massaging it first.
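
For instance, if your massaging step were to write out two tab-separated
flat files (one line per SYS record, and one line per invoice number with
the parent record's key repeated on each line), the load itself is just
two COPYs:

-- the file names are only examples of what a preprocessing script
-- might produce
COPY sys_record  FROM '/tmp/sys_records.dat';
COPY sys_invoice FROM '/tmp/sys_invoices.dat';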

regards, tom lane
