| From: | Henry <henry(at)zen(dot)co(dot)za> |
|---|---|
| To: | wickro <robwickert(at)gmail(dot)com> |
| Cc: | pgsql-general(at)postgresql(dot)org |
| Subject: | Re: work_mem greater than 2GB issue |
| Date: | 2009-05-14 17:31:03 |
| Message-ID: | 20090514193103.13743hjv1b5nj0ow@zenmail.co.za |
| Lists: | pgsql-general |
Quoting wickro <robwickert(at)gmail(dot)com>:
> I have a largish table (> 8GB). I'm doing a very simple single group
> by on.
This doesn't answer your question, but you might want to take
advantage of table partitioning:
http://www.postgresql.org/docs/8.3/interactive/ddl-partitioning.html
I've recently gone through this exercise (several tables were 10GB+,
some almost 30GB), and if your WHERE clauses filter on the partition
key, you can expect significant performance gains with /much/ lower
memory consumption.
You only have one large table, so partitioning it should be painless
and not take too long (unlike our scenario).
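For reference, in 8.3 partitioning is done with table inheritance plus
CHECK constraints, with constraint_exclusion letting the planner skip
child tables the WHERE clause rules out. A minimal sketch (table and
column names are made up for illustration, and you'd normally want a
trigger or rules covering every partition):

```sql
-- Parent table: holds no rows itself, just defines the schema.
CREATE TABLE measurements (
    id      bigint,
    logdate date,
    value   numeric
);

-- Child tables inherit the schema and carry CHECK constraints
-- on the partition key.
CREATE TABLE measurements_2009_q1 (
    CHECK (logdate >= DATE '2009-01-01' AND logdate < DATE '2009-04-01')
) INHERITS (measurements);

CREATE TABLE measurements_2009_q2 (
    CHECK (logdate >= DATE '2009-04-01' AND logdate < DATE '2009-07-01')
) INHERITS (measurements);

-- Route inserts on the parent to the right child (one rule shown;
-- each partition needs its own, or use a trigger instead).
CREATE RULE measurements_insert_2009_q1 AS
    ON INSERT TO measurements
    WHERE (logdate >= DATE '2009-01-01' AND logdate < DATE '2009-04-01')
    DO INSTEAD INSERT INTO measurements_2009_q1 VALUES (NEW.*);

-- Let the planner prove which partitions can't match the WHERE clause.
SET constraint_exclusion = on;

-- This GROUP BY now scans only measurements_2009_q1.
SELECT logdate, sum(value)
FROM measurements
WHERE logdate >= DATE '2009-01-01' AND logdate < DATE '2009-04-01'
GROUP BY logdate;
```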
Cheers
Henry
| From | Date | Subject | |
|---|---|---|---|
| Next Message | Sam Mason | 2009-05-14 17:34:40 | Re: postgresql on windows98 |
| Previous Message | wickro | 2009-05-14 15:47:24 | Re: work_mem greater than 2GB issue |