From: "Travis" <travis.schmidt@gmail.com>
To: pgsql-general@postgresql.org
Subject: Re: best way to kill long running query?
Date: 2007-03-21 18:31:21
Message-ID: 1174501881.147610.159330@p15g2000hsd.googlegroups.com
Lists: pgsql-general
On Mar 21, 11:36 am, E...@aeroantenna.com ("Bill Eaton") wrote:
> I want to allow some queries for my users to run for a prescribed period of
> time and kill them if they go over time. Is there a good way to do this? Or
> is this a bad idea?
>
> I've been struggling with trying to figure out the best way to allow users
> to browse through large tables. For example, I have one table with about
> 600,000 rows and growing at about 100,000/month.
>
> I want to allow users to browse through this table, but only if their
> effective SELECT statement only generates 100 or maybe 1000 rows. There are
> several fields that can be used in the WHERE clause, such as user, date,
> model, etc. It will be difficult for me to predict how large a result set is
> a priori. So I want to allow the query to run for a prescribed period of
> time, then kill it.
>
> I'll probably be using ADO --> ODBC at the client. So I could probably kill
> the Connection/Recordset. I just don't know the best way to do it. pgAdmin
> allows queries to be killed. How does it do it?
>
> Thanks in advance,
>
> Bill Eaton
> Thousand Oaks, CA
>
You could add a LIMIT clause when you put the query together, to cap
the number of rows the result set is allowed to return.
Travis
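
A minimal sketch of that approach (the table and column names here are
hypothetical, invented for illustration):

```sql
-- Cap the result set at the client's browse size; LIMIT is applied
-- after the WHERE clause filters the rows.
SELECT *
FROM work_orders                     -- hypothetical table
WHERE model = 'X100'
  AND order_date >= '2007-01-01'
ORDER BY order_date
LIMIT 1000;

-- Common refinement: ask for one row more than the cap, so the client
-- can distinguish "exactly 1000 rows" from "more than 1000 rows".
SELECT * FROM work_orders WHERE model = 'X100' LIMIT 1001;

-- For the time-based cutoff the original question asks about,
-- statement_timeout cancels any statement that exceeds the limit
-- (value in milliseconds; per-session here, or set it in postgresql.conf):
SET statement_timeout = 5000;

-- As for pgAdmin: it cancels a running query by calling
-- pg_cancel_backend(<backend pid>) from another connection.
```

LIMIT bounds the size of the result, while statement_timeout bounds the
running time; the two are complementary, since a query can be slow even
when it ultimately returns few rows.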