From: Indraneel Majumdar <indraneel(at)www(dot)cdfd(dot)org(dot)in>
To: pgsql-sql(at)postgresql(dot)org
Subject: dynamic object creation
Date: 2000-10-12 19:56:46
Message-ID: Pine.SGI.3.96.1001012124609.90724C-100000@www.cdfd.org.in
Lists: pgsql-sql
Hi,
I'm not sure the subject line is quite right. I have the following
problem, which I hope PostgreSQL can handle.
I'm converting a complex flatfile in which records are arranged serially.
Some fields appear as 'n' repeating blocks of multiple lines, and some
subfields within those blocks are themselves 'n' repeating blocks of
multiple lines. So in my main table I do not know, until run time, how
many fields to create (the same goes for any sub-tables). How can I do
this dynamically? I tried using arrays, but retrieval from them is causing
some problems. I have already checked the array utilities in the contrib
section and have extended the operator list to other types (I'll send the
file to its original author so that he may include it if he wishes).
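To make the array idea concrete, the kind of layout I mean is roughly the
following (table and column names are only illustrative, not the real SRS
fields): all 'n' repeating blocks of a record sit in one array column, and
individual blocks come back out by subscript.

  -- array layout: the repeating blocks of a record live in one array column
  CREATE TABLE entry_arr (
      accession  TEXT NOT NULL,    -- illustrative record identifier
      blocks     TEXT[]            -- block i of the record is blocks[i]
  );

  -- retrieval by position works directly...
  SELECT blocks[2] FROM entry_arr WHERE accession = 'X12345';

  -- ...but searching inside the array needs the contrib array iterator
  -- operators (the utilities mentioned above), which is where the
  -- extended operator list comes in.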
I think there must be some object-oriented way of doing this without
creating too many keys. Or are keys the only (and best) method? Using them
is causing a performance hit. If it's any help, what I'm trying to convert
are biological databases distributed in 'SRS' flatfile format from
ftp.ebi.ac.uk/pub/databases/
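For comparison, the key-based (normalized) layout would be something like
the following (names again purely illustrative): each repeating block
becomes a row in a child table instead of an extra column, sub-blocks nest
the same way one level down, and so 'n' is simply the number of child rows
and never has to be known when the tables are created.

  -- one row per flatfile record
  CREATE TABLE entry (
      entry_id   SERIAL PRIMARY KEY,
      accession  TEXT NOT NULL          -- illustrative identifier field
  );

  -- one row per repeating block within a record
  CREATE TABLE block (
      block_id   SERIAL PRIMARY KEY,
      entry_id   INTEGER NOT NULL REFERENCES entry,
      block_no   INTEGER NOT NULL,      -- position of the block in its record
      body       TEXT
  );

  -- one row per repeating subfield within a block
  CREATE TABLE subfield (
      subfield_id  SERIAL PRIMARY KEY,
      block_id     INTEGER NOT NULL REFERENCES block,
      subfield_no  INTEGER NOT NULL,    -- position of the subfield in its block
      value        TEXT
  );

  -- reassembling one record is a three-way join on the keys
  SELECT e.accession, b.block_no, s.subfield_no, s.value
    FROM entry e, block b, subfield s
   WHERE b.entry_id = e.entry_id
     AND s.block_id = b.block_id
     AND e.accession = 'X12345'
   ORDER BY b.block_no, s.subfield_no;

The joins are presumably where the performance hit comes from, which is
why I am wondering whether there is a cleaner object-oriented alternative.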
Thank you,
Indraneel
/************************************************************************.
# Indraneel Majumdar | E-mail: indraneel(at)123india(dot)com #
# Bioinformatics Unit (EMBNET node), | URL: http://scorpius.iwarp.com #
# Centre for DNA Fingerprinting and Diagnostics, #
# Hyderabad, India - 500076 #
`************************************************************************/