From: Sean Davis <sdavis2(at)mail(dot)nih(dot)gov>
To: SunWuKung <Balazs(dot)Klein(at)axelero(dot)hu>, <pgsql-general(at)postgresql(dot)org>
Subject: Re: inserting many rows
Date: 2006-01-03 14:29:19
Message-ID: BFDFF46F.2A9B%sdavis2@mail.nih.gov
Lists: pgsql-general
On 1/2/06 5:34 PM, "SunWuKung" <Balazs(dot)Klein(at)axelero(dot)hu> wrote:
> I will need to insert multiple rows into a table from php.
> The data will come in 'packages' of 50-500 rows (they are responses from
> different questionnaires). As there will be many people sending their
> results in at the same time I need an effective method for this.
>
> What do you suggest is the most effective way to insert this type of
> data into the db? Issuing multiple inserts from php seems to be a waste
> of resources.
>
> I was thinking of writing the responses into a pg array field with a
> single insert and then exploding the content of that field into rows
> with a function.
>
> Could you suggest an efficient approach?
You could look at using COPY to insert many records very quickly. However,
inserting inside a transaction may be all that you need. Have you tried
simulating your application under expected loads so that you are sure that
you are making the right choice?
Sean
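[As an illustrative sketch of the COPY route Sean suggests: the example below is Python (psycopg2's `copy_from()` corresponds roughly to PHP's `pg_copy_from()`), and the `responses` table and column names are hypothetical. COPY streams all rows in one round trip; even plain INSERTs get substantially cheaper when batched inside a single transaction, since each statement no longer pays for its own commit.]

```python
import io

def build_copy_payload(rows):
    """Serialize rows into the tab-delimited text format that
    COPY ... FROM STDIN expects (one line per row)."""
    buf = io.StringIO()
    for row in rows:
        buf.write("\t".join(str(v) for v in row) + "\n")
    buf.seek(0)
    return buf

# Hypothetical questionnaire responses: (session_id, item, answer)
responses = [(101, 1, "yes"), (101, 2, "no"), (101, 3, "maybe")]
payload = build_copy_payload(responses)

# With a live connection, the whole package loads in one round trip:
#   import psycopg2
#   conn = psycopg2.connect(dsn)
#   with conn, conn.cursor() as cur:
#       cur.copy_from(payload, "responses",
#                     columns=("session_id", "item", "answer"))
print(payload.getvalue())
```

The same payload format works from PHP via `pg_copy_from()`, which takes an array of such delimited lines.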