| From: | John R Pierce <pierce(at)hogranch(dot)com> |
|---|---|
| To: | pgsql-general(at)postgresql(dot)org |
| Subject: | Re: Loading different files |
| Date: | 2010-12-14 07:45:17 |
| Message-ID: | 4D07208D.9070203@hogranch.com |
| Lists: | pgsql-general |
On 12/13/10 11:25 PM, Sven Krosse wrote:
> Dear all,
>
> I am looking for a mechanism to load a specific file into an existing
> database. The file is a CTM (Compact Topic Maps syntax) file and I
> have written a CTM parser in plpgsql which works fine. The main
> problem is that I have to load the file into memory and send a query
> to the database, which causes a memory error because of the huge file
> size. Is it possible to copy data from a file into the database using
> my own functions?
>
You could use the Large Object API to stream the data into postgres,
but that just stores it as one large blob of data. If you need more
structured storage, I don't know what to suggest, except maybe parsing
your CTM in your application and sending the data to postgres a row at
a time.
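
For what it's worth, here is a rough sketch of the large-object route, assuming the file has already been loaded as a large object (via psql's \lo_import from the client, or server-side lo_import if the file is on the server and you're a superuser). The function and helper names (parse_ctm_lo, my_ctm_parse_chunk) are made up; my_ctm_parse_chunk stands in for your existing parser's entry point:

```sql
-- Sketch: read a large object in fixed-size chunks inside plpgsql,
-- so the whole file never has to travel as one huge query parameter.
CREATE OR REPLACE FUNCTION parse_ctm_lo(p_lo oid) RETURNS void AS $$
DECLARE
    fd    integer;
    chunk bytea;
BEGIN
    fd := lo_open(p_lo, 262144);          -- 262144 = INV_READ
    LOOP
        chunk := loread(fd, 65536);       -- pull 64 kB per call
        EXIT WHEN length(chunk) = 0;      -- loread returns empty bytea at EOF
        -- hand the chunk to the existing CTM parser
        -- (my_ctm_parse_chunk is a placeholder for that parser)
        PERFORM my_ctm_parse_chunk(convert_from(chunk, 'UTF8'));
    END LOOP;
    PERFORM lo_close(fd);
END;
$$ LANGUAGE plpgsql;
```

You'd call it along the lines of:

```sql
-- file readable by the server process; lo_import returns the new OID
SELECT parse_ctm_lo(lo_import('/path/on/server/file.ctm'));
```

Note the chunks are cut on byte boundaries, so the parser would have to buffer any partial statement (or multibyte character) left at the end of one chunk and prepend it to the next.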