| From: | Roxanne Reid-Bennett <rox(at)tara-lu(dot)com> | 
|---|---|
| To: | pgsql-general(at)postgresql(dot)org | 
| Subject: | Re: controlling memory management with regard to a specific query (or groups of connections) | 
| Date: | 2015-11-19 07:26:55 | 
| Message-ID: | 564D79BF.4030407@tara-lu.com | 
| Lists: | pgsql-general | 
On 11/18/2015 5:10 PM, Jonathan Vanasco wrote:
> As a temporary fix I need to write some uploaded image files to PostgreSQL until a task server can read/process/delete them.
>
> The problem I've run into (via server load tests that model our production environment), is that these read/writes end up pushing the indexes used by other queries out of memory -- causing them to be re-read from disk.   These files can be anywhere from 200k to 5MB.
>
> has anyone dealt with situations like this before and has any suggestions?  I could use a dedicated db connection if that would introduce any options.
We have a system that loads a bunch of files to be processed - we
queue them for processing behind the scenes. We don't load them into
Postgres before processing. We put them in a temp directory and just
save the location of the file to the database. This configuration does
have limitations: post-processing cannot be load balanced across
servers unless the temp directory is shared.
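A minimal sketch of that approach: write each uploaded file to a temp directory and store only its path in the database, then have the task server pull the oldest unprocessed path. This is illustrative only - sqlite3 stands in for Postgres so the sketch is self-contained, and the `uploads` table, column names, and helper functions are all made up for the example (with Postgres you would use a driver such as psycopg2 and an equivalent table).

```python
import os
import sqlite3
import tempfile

# Temp directory holding the uploaded files; only paths go into the DB.
TEMP_DIR = tempfile.mkdtemp(prefix="uploads-")

# sqlite3 as a stand-in for Postgres; "uploads" is a hypothetical table.
db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE uploads (
        id        INTEGER PRIMARY KEY,
        path      TEXT NOT NULL,
        processed INTEGER NOT NULL DEFAULT 0
    )
""")

def save_upload(data: bytes, name: str) -> int:
    """Write the file bytes to the temp directory and queue its path."""
    path = os.path.join(TEMP_DIR, name)
    with open(path, "wb") as f:
        f.write(data)
    cur = db.execute("INSERT INTO uploads (path) VALUES (?)", (path,))
    db.commit()
    return cur.lastrowid

def next_unprocessed():
    """Task-server side: claim the oldest queued file and return its path."""
    row = db.execute(
        "SELECT id, path FROM uploads WHERE processed = 0 ORDER BY id LIMIT 1"
    ).fetchone()
    if row is None:
        return None
    upload_id, path = row
    db.execute("UPDATE uploads SET processed = 1 WHERE id = ?", (upload_id,))
    db.commit()
    return path

# Upload side writes the file; task server later reads, processes, deletes it.
upload_id = save_upload(b"\x89PNG...", "photo1.png")
claimed_path = next_unprocessed()
```

Because only a short text path passes through Postgres, the multi-megabyte file contents never compete with index pages for shared buffers.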
I'm sure you'll get more DB centric answers from others on the list.
Roxanne
-- 
[At other schools] I think the most common fault in general is to teach students how to pass exams instead of teaching them the science.
Donald Knuth
| | From | Date | Subject |
|---|---|---|---|
| Next Message | Marc Mamin | 2015-11-19 13:51:18 | Fetching from psql procedures | 
| Previous Message | Jonathan Vanasco | 2015-11-19 01:10:00 | controlling memory management with regard to a specific query (or groups of connections) |