"Sean Davis" <sdavis2(at)mail(dot)nih(dot)gov> writes:
> I am trying to build a full-text index (gin(to_tsvector('english',
> title || abstract))) on about 18 million abstracts and titles from
> medical literature. However, I keep getting out-of-memory errors. (I
> am on a 32GB Linux system with maintenance_work_mem set to 20GB and
> shared_buffers at 4GB; Postgres 8.3beta). Does creation of a
> full-text index require that the entire index fit into memory?
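For reference, the build being described would look roughly like the following. The table and index names here are placeholders; only the index expression is taken from the message above.

```sql
-- Session-level setting for the build; 20GB is the value from the message.
SET maintenance_work_mem = '20GB';

-- Hypothetical table/index names; expression as quoted above.
CREATE INDEX abstracts_fts_idx ON abstracts
    USING gin (to_tsvector('english', title || abstract));
```

Note that a two-argument to_tsvector call (with an explicit configuration) is required here, since index expressions must be immutable.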
I can't reproduce any memory-leak issue here. I wonder whether your
maintenance_work_mem setting is optimistically large (like, higher
than the ulimit restriction on the postmaster).
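One way to sanity-check this is to compare the configured maintenance_work_mem against the address-space limit reported by `ulimit -v` (in kB) in the environment that starts the postmaster. A minimal sketch; the helper function is an illustration, not a real utility:

```shell
#!/bin/sh
# Compare a maintenance_work_mem setting (in kB) against a ulimit value:
# either a number in kB or the string "unlimited", as `ulimit -v` reports it.
fits_under_ulimit() {
    mwm_kb=$1
    limit=$2
    if [ "$limit" = "unlimited" ] || [ "$mwm_kb" -le "$limit" ]; then
        echo yes
    else
        echo no
    fi
}

# e.g. 20GB maintenance_work_mem vs a hypothetical 4GB address-space limit:
fits_under_ulimit $((20 * 1024 * 1024)) $((4 * 1024 * 1024))   # prints "no"

# Against the live limit of the current shell:
fits_under_ulimit $((20 * 1024 * 1024)) "$(ulimit -v)"
```

If the answer is "no" against the postmaster's actual limit, allocations will fail well before maintenance_work_mem is reached, which would look exactly like an out-of-memory error during the index build.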
regards, tom lane