From: Stephan Szabo <sszabo(at)megazone23(dot)bigpanda(dot)com>
To: Wilkinson Charlie E <Charlie(dot)E(dot)Wilkinson(at)irs(dot)gov>
Cc: pgsql-sql(at)postgresql(dot)org
Subject: Re: Working with very large datasets
Date: 2003-02-12 00:02:50
Message-ID: 20030211160155.O19527-100000@megazone23.bigpanda.com
Lists: pgsql-sql
On Tue, 11 Feb 2003, Wilkinson Charlie E wrote:
> Greetings,
> Can anyone enlighten me or point me at resources concerning use of
> pgsql with very large datasets?
>
> My specific problem is this:
>
> I have two tables, one with about 100 million rows and one with about
> 22,000 rows. My plan was to inner join the two tables on an integer
> key and output the 4 significant columns, excluding the keys. (Those
> with a better understanding of pgsql internals, feel free to laugh.)
> The result was a big angry psql that grew to 800+MB before I had to
> kill it.
Was it psql that grew to 800MB or a backend? If the former, how many rows
do you expect that to return? You probably want to look into using
cursors rather than returning the entire result set at once.
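
For example, a minimal sketch of the cursor approach (the table names
big_table and small_table, the id join key, and the column names are
placeholders, not from your schema):

    BEGIN;

    DECLARE results CURSOR FOR
        SELECT b.col1, b.col2, s.col3, s.col4
        FROM big_table b
        JOIN small_table s ON b.id = s.id;

    -- Pull the result set in manageable chunks instead of all at once,
    -- so the client never holds all ~100 million joined rows in memory.
    FETCH FORWARD 10000 FROM results;
    -- ...repeat the FETCH until it returns no rows...

    CLOSE results;
    COMMIT;

Each FETCH returns at most 10000 rows, so client-side memory stays
roughly constant no matter how large the full result is. Note that a
cursor declared without WITH HOLD only lives inside the enclosing
transaction.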