| From: | mlw <pgsql(at)mohawksoft(dot)com> | 
|---|---|
| To: | pgsql-hackers(at)postgresql(dot)org | 
| Subject: | Aggregate "rollup" | 
| Date: | 2003-03-05 20:47:18 | 
| Message-ID: | 3E666256.9050001@mohawksoft.com | 
| Lists: | pgsql-hackers | 
I had written a piece of code about two years ago that used PostgreSQL's aggregate feature to collect the values in a group into an array of integers, for example:
select int_array_aggregate( column ) from table group by column
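To make the behavior concrete, here is roughly what such an aggregate produces on a small made-up table (table and column names here are purely illustrative, and the element order within each array simply follows input order, which is not guaranteed):

```sql
-- Hypothetical one-to-many data: several "result" rows per "reference".
-- create table t (reference int4, result int4);
-- insert into t values (1, 10), (1, 20), (1, 30), (2, 40);

select reference, int_array_aggregate(result) from t group by reference;

--  reference | int_array_aggregate
-- -----------+---------------------
--          1 | {10,20,30}
--          2 | {40}
```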
While it seems pointless to create an array in a plain select, it has a purpose in OLAP. For instance, suppose you do this:
create table fast_lookup as select reference, int_array_aggregate(result) from table group by reference
The "fast_lookup" table now has all the result entries for each reference as an array in a single row. In the systems where I have used this, it has provided a dramatic improvement, especially when there is a high number of identical "reference" entries in a classic "one to many" table.
The question is: would a more comprehensive solution be wanted, and is it possible? Something like:
create table fast_lookup as select reference, aggregate_array( field ) from table group by reference
where the function aggregate_array accepts a field of any data type.
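As a rough sketch of what such a type-generic aggregate could look like, the polymorphic pseudo-types anyelement/anyarray together with array_append can express it at the SQL level; this is only an illustration of the interface, not how contrib/intagg is implemented:

```sql
-- Illustrative sketch: one polymorphic aggregate that collects values
-- of any type into an array of the matching element type.
create aggregate aggregate_array (anyelement) (
    sfunc    = array_append,  -- append each input value to the state array
    stype    = anyarray,      -- state array adopts the input's element type
    initcond = '{}'           -- start from an empty array
);

-- Usage, as in the example above:
-- create table fast_lookup as
--     select reference, aggregate_array(field) from "table" group by reference;
```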
Any thoughts? I think I need to fix the code in the current /contrib/intagg anyway, so is it worth doing the extra work to include multiple data types?