From: Oliver Elphick <olly(at)lfix(dot)co(dot)uk>
To: postgres(at)countup(dot)de
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Select-Insert-Query
Date: 2004-03-02 11:48:06
Message-ID: 1078228086.13842.60.camel@cerberus.lfix.co.uk
Lists: pgsql-performance
On Tue, 2004-03-02 at 00:49, postgres(at)countup(dot)de wrote:
> What is the most performant way to select, for example, the first 99
> rows of a table and insert them into another table?
>
> At the moment I do this:
>
> for userrecord in select * from table where account_id = a_account_id
> and counter_id = userrecord.counter_id and visitortable_id between
> a_minid and a_maxid limit 99 loop
Using LIMIT without ORDER BY gives a selection that depends on the
physical location of rows in the table; that location changes whenever
one of the rows is UPDATEd, so repeated runs can return different rows.
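For example, against the lastusers table above (a minimal sketch; assumes a date column as in your loop):

```sql
-- Without ORDER BY, "the first 99" is whichever rows the planner
-- happens to read first -- not a stable, meaningful set:
SELECT * FROM lastusers LIMIT 99;

-- With ORDER BY, the result is well-defined: the 99 newest rows.
SELECT * FROM lastusers ORDER BY date DESC LIMIT 99;
```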
> insert into lastusers (account_id, counter_id, date, ip, hostname)
> values(a_account_id,userrecord.counter_id,userrecord.date
> ,userrecord.ip,userrecord.hostname);
> end loop;
>
> I think "limit" is a performance killer, is that right? But what to do
> instead?
I'm sure it is the loop, not LIMIT, that is the killer. Inserting the rows
one at a time does 99 separate INSERTs; a single set-based INSERT ... SELECT
does the whole job in one statement:
INSERT INTO lastusers (account_id, counter_id, date, ip, hostname)
    SELECT account_id, counter_id, date, ip, hostname
    FROM table
    WHERE account_id = a_account_id AND
          visitortable_id BETWEEN a_minid AND a_maxid
    ORDER BY date DESC
    LIMIT 99;
--
Oliver Elphick <olly(at)lfix(dot)co(dot)uk>
LFIX Ltd