out of memory with large queries

From: Massimo Dal Zotto <dz(at)cs(dot)unitn(dot)it>
To: hackers(at)postgreSQL(dot)org (PostgreSQL Hackers)
Subject: out of memory with large queries
Date: 1999-06-09 20:36:00
Message-ID: 199906092036.WAA29474@fandango.cs.unitn.it
Lists: pgsql-hackers

Hi,

I have a problem with large queries: I have a table with 300000 rows,
and when I try the following query the backend runs out of memory:

select upper(name) from my_table;

The following queries, either without functions or with int4 functions,
work fine:

select name from my_table;
select max(id,0) from my_table;

so I suspect that the trouble is memory allocated by functions that
return data by address, which is not released until the end of the
transaction. With more than 300000 rows you eat a lot of memory.
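
For illustration, here is a simplified sketch of such a function
(roughly the shape of upper() in the 6.5 sources; the name and details
are mine). Every call palloc()s a fresh result in the current memory
context and nothing ever pfree()s it, so one result per tuple stays
live until end of transaction:

#include "postgres.h"   /* palloc(), text, VARSIZE, VARDATA, VARHDRSZ */
#include <ctype.h>

text *
upper_sketch(text *string)
{
    int    m = VARSIZE(string) - VARHDRSZ;           /* data bytes */
    text  *ret = (text *) palloc(VARSIZE(string));   /* one alloc per tuple */
    char  *ptr = VARDATA(string);
    char  *ptr_ret = VARDATA(ret);

    VARSIZE(ret) = VARSIZE(string);    /* 6.5-era lvalue header macro */
    while (m-- > 0)
        *ptr_ret++ = toupper((unsigned char) *ptr++);

    return ret;                        /* never freed before xact end */
}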

This means that postgres is currently unable to execute large queries
that involve functions on text fields. A pretty bad limitation IMHO.

I tried to look at the code but haven't found a way to release the
storage allocated for each tuple, and the memory-context allocation
code is not well documented.
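
For what it's worth, this is the pattern I was hoping to find in the
executor: allocate each tuple's function results in a short-lived
context and reset it between tuples. A minimal sketch, assuming a
resettable context API (the context calls here are not the 6.5 API,
and fetch_next_tuple()/eval_tuple_expressions()/emit_tuple() are
hypothetical placeholders, not actual executor entry points):

#include "postgres.h"
#include "utils/memutils.h"     /* MemoryContext operations (assumed) */

extern bool fetch_next_tuple(void);         /* hypothetical scan step */
extern void eval_tuple_expressions(void);   /* hypothetical: palloc()s land here */
extern void emit_tuple(void);               /* hypothetical output step */

void
scan_with_per_tuple_reset(void)
{
    MemoryContext per_tuple_cxt = AllocSetContextCreate(CurrentMemoryContext,
                                                        "per-tuple",
                                                        ALLOCSET_DEFAULT_SIZES);

    while (fetch_next_tuple())
    {
        MemoryContext oldcxt;

        MemoryContextReset(per_tuple_cxt);  /* free previous tuple's results */
        oldcxt = MemoryContextSwitchTo(per_tuple_cxt);
        eval_tuple_expressions();           /* upper() result lands here */
        emit_tuple();                       /* consume it before next reset */
        MemoryContextSwitchTo(oldcxt);
    }

    MemoryContextDelete(per_tuple_cxt);
}

With something like this, memory use would stay bounded by the largest
single tuple instead of growing with the row count.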

Any suggestion?

--
Massimo Dal Zotto

+----------------------------------------------------------------------+
| Massimo Dal Zotto email: dz(at)cs(dot)unitn(dot)it |
| Via Marconi, 141 phone: ++39-0461534251 |
| 38057 Pergine Valsugana (TN) www: http://www.cs.unitn.it/~dz/ |
| Italy pgp: finger dz(at)tango(dot)cs(dot)unitn(dot)it |
+----------------------------------------------------------------------+
