From: | "mike focosi" <mike(at)goldendome(dot)com> |
---|---|
To: | <pgsql-general(at)postgresql(dot)org> |
Subject: | Table Corruption |
Date: | 2000-11-09 16:38:39 |
Message-ID: | 00ce01c04a6b$880503c0$2a252fd0@goldendome.com |
Lists: pgsql-admin pgsql-general
I have several PHP scripts that allow my clients to add stories to a
database. I've noticed that once in a while, when they attempt to
insert/update more than 8k into a table, the table becomes corrupted.
When a select is done on the table, the database hangs, and when this happens
to one table in one database, it brings down the whole PostgreSQL server
backend. Not good.
I've created a database with the same schema/data and have only been able to
recreate the problem once; I can't seem to do it every time. PostgreSQL usually
gives the "PQsendQuery() -- query is too long..." error and just doesn't
follow through with the query, which is fine. I can deal with limiting my
clients to stories under 8k. But sometimes the table gets corrupted by
a query that shouldn't have been executed at all (i.e., one stopped at the
"too long" warning).
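For now I'm planning to just reject over-long stories in the PHP script before
the insert ever reaches PostgreSQL. A rough sketch of what I mean is below (the
table/column names, connection string, and the 8000-byte cutoff are only
placeholders for my actual setup, and the pg_* calls may need adjusting for
your PHP version):

<?php
// Placeholder connection string -- swap in the real host/db/user.
$conn = pg_connect("host=localhost dbname=stories_db user=webuser");
if (!$conn) {
    die("could not connect to PostgreSQL");
}

$max_bytes = 8000;                 // rough stand-in for the ~8k limit
$body      = $_POST['story'];      // story text submitted by the client

if (strlen($body) > $max_bytes) {
    // Refuse the insert instead of letting the backend hit its limit.
    echo "Story is too long (" . strlen($body) . " bytes, limit $max_bytes).";
} else {
    // Escape the text and run the insert; "stories"/"body" are assumed names.
    $safe   = pg_escape_string($conn, $body);
    $result = pg_query($conn, "INSERT INTO stories (body) VALUES ('$safe')");
    if (!$result) {
        echo "insert failed: " . pg_last_error($conn);
    }
}
?>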
Has anybody ever had a problem like the one I described? I've yet to
upgrade our database servers to 7.x (we're running 6.4), so I'm hoping that
might solve the problem, but I'd like to know what the exact problem is.
thanks
-mike
mike focosi
senior applications developer
golden dome media
http://www.goldendomemedia.com
http://www.surfmichiana.com
http://www.wndu.com