From: "Thomas F. O'Connell" <tfo@monsterlabs.com>
To: pgsql-general@postgresql.org
Subject: large number of files open...
Date: 2002-01-16 20:53:27
Message-ID: 3C45E847.7080303@monsterlabs.com
Lists: pgsql-general
I'm running postgres 7.1.3 in a production environment. The database
itself contains on the order of 100 tables, including some complex
triggers, functions, and views. A few frequently accessed tables (on
the order of 10) have on the order of 100,000 rows.
Every now and then, traffic on the server, which is accessed publicly
via mod_perl (Apache::DBI), causes the machine itself to hit the
kernel's hard limit on the number of open files: 8191.
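My back-of-the-envelope guess at where the descriptors go (the
per-backend figure here is a pure assumption on my part, not something
I've measured): with Apache::DBI every httpd child holds its own
persistent connection, so there's one backend per child, and each
backend keeps its own cache of open file descriptors for the data
files, indexes, and WAL segments it has touched. Hypothetically:

    100 httpd children x 1 backend each    =  100 backends
    100 backends x ~80 cached descriptors  = ~8000 open files

which lands uncomfortably close to the 8191 ceiling.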
This, unfortunately, crashes the machine. In a production environment
of this magnitude, is that a reasonable number of files to expect
postgres to need at any given time? Is there any documentation anywhere
on what the number of open files depends on?
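In case it helps anyone comparing notes: I'm assuming, from a quick
skim of src/backend/storage/file/fd.c (so treat this as a guess, not
gospel), that each backend sizes its descriptor cache from the usual
per-process limits. This little standalone C program just prints those
limits so you can see what a backend would be working with; it is not
postgres code itself:

    #include <stdio.h>
    #include <unistd.h>
    #include <sys/resource.h>

    int
    main(void)
    {
        struct rlimit rl;

        /* per-process soft/hard limits on open descriptors */
        if (getrlimit(RLIMIT_NOFILE, &rl) == 0)
            printf("RLIMIT_NOFILE: soft=%ld hard=%ld\n",
                   (long) rl.rlim_cur, (long) rl.rlim_max);

        /* what sysconf reports, which is what I believe fd.c consults */
        printf("sysconf(_SC_OPEN_MAX): %ld\n",
               (long) sysconf(_SC_OPEN_MAX));

        return 0;
    }

Multiplying whatever this prints by the number of persistent backends
gives a worst-case total to compare against the kernel-wide limit.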
-tfo