Performance issues when the number of records are around 10 Million

From: venu madhav <venutaurus539(at)gmail(dot)com>
To: pgadmin-support(at)postgresql(dot)org
Subject: Performance issues when the number of records are around 10 Million
Date: 2010-05-11 06:34:16
Message-ID: AANLkTinFR92eAjVJcukjHV8Nrb4CyjKRwayUTLYTcjMv@mail.gmail.com
Lists: pgadmin-support

Hi all,
In my database application, I have a table whose record count can reach 10 million,
and inserts can arrive at a high rate, around 100 insertions per second at peak times. I
have configured PostgreSQL to run autovacuum on an hourly basis. A frontend GUI
application (a CGI program) displays the data from the database. When I try to fetch the
last twenty records from the database, the operation takes around 10-15 minutes to
complete. This is the query that is used:

  SELECT e.cid, timestamp, s.sig_class, s.sig_priority, s.sig_name,
         e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
         e.wifi_addr_2, e.view_status, bssid
  FROM event e, signature s
  WHERE s.sig_id = e.signature
    AND e.timestamp >= '1270449180'
    AND e.timestamp <  '1273473180'
  ORDER BY e.cid DESC, e.cid DESC
  LIMIT 21 OFFSET 10539780;
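
For context on why this is slow: with OFFSET 10539780, PostgreSQL has to produce and
discard more than ten million sorted rows before it can return the 21 that are kept, so
the whole matching range is processed on every page view. Below is only a sketch, not a
suggestion from the list, of how the newest 21 matching rows might be read directly,
assuming e.cid increases with insertion order and is covered by an index:

  -- Sketch only: assumes e.cid is monotonically increasing and indexed,
  -- so the newest matching rows can be read from the top of the index
  -- rather than reached through a ~10.5M-row OFFSET.
  SELECT e.cid, timestamp, s.sig_class, s.sig_priority, s.sig_name,
         e.sniff_ip, e.sniff_channel, s.sig_config, e.wifi_addr_1,
         e.wifi_addr_2, e.view_status, bssid
  FROM event e
  JOIN signature s ON s.sig_id = e.signature
  WHERE e.timestamp >= '1270449180'
    AND e.timestamp <  '1273473180'
  ORDER BY e.cid DESC
  LIMIT 21;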
Can anyone suggest a better approach to improve the performance?

Please let me know if you have any further questions.

Thank you,
Venu
