On Thu, May 21, 2009 at 3:13 PM, Alex Thurlow <alex(at)blastro(dot)com> wrote:
> I have a postgresql database that I'm using for logging of data. There's
> basically one table where each row is a line from my log files. It's
> getting to a size where it's running very slow though. There are about 10
> million log lines per day and I keep 30 days of data in it. All the columns
Are you using partitioning on this table? Your use case -- time-based log
data with a fixed retention window -- is the textbook example used to show
how to partition tables.
Since you mostly scan on date, this will speed up your queries significantly.
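
Rough sketch of what that could look like with inheritance-based
partitioning and CHECK constraints (table and column names here are made
up, adjust to your schema):

    -- parent table holds no rows; each day gets its own child
    CREATE TABLE log_lines (
        logged_at  timestamp NOT NULL,
        message    text
    );

    -- one child per day, with a CHECK constraint the planner can use
    CREATE TABLE log_lines_2009_05_21 (
        CHECK (logged_at >= '2009-05-21' AND logged_at < '2009-05-22')
    ) INHERITS (log_lines);

    CREATE INDEX log_lines_2009_05_21_logged_at
        ON log_lines_2009_05_21 (logged_at);

    -- load new rows straight into the current day's child table
    INSERT INTO log_lines_2009_05_21 VALUES (now(), 'example line');

    -- with constraint_exclusion enabled, a date-filtered query against
    -- the parent only touches the matching child tables
    SET constraint_exclusion = on;
    SELECT count(*) FROM log_lines
     WHERE logged_at >= '2009-05-21' AND logged_at < '2009-05-22';

    -- expiring an old day becomes a cheap DROP instead of a huge DELETE
    DROP TABLE log_lines_2009_04_21;

The other big win with 30 days of retention is exactly that last line:
purging old data is just dropping that day's child table, instead of a
multi-million-row DELETE followed by a vacuum.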