From: | Jeff Janes <jeff(dot)janes(at)gmail(dot)com> |
---|---|
To: | Rémi Cura <remi(dot)cura(at)gmail(dot)com> |
Cc: | PostgreSQL General <pgsql-general(at)postgresql(dot)org> |
Subject: | Re: python modul pre-import to avoid importing each time |
Date: | 2014-06-25 19:46:22 |
Message-ID: | CAMkU=1z+8=j5rUhU7WD+Kv+kw4gHwoFAD1kKBcRUPSayBWmKuQ@mail.gmail.com |
Lists: | pgsql-general |
On Thu, Jun 19, 2014 at 7:50 AM, Rémi Cura <remi(dot)cura(at)gmail(dot)com> wrote:
> Hey List,
>
> I use plpython with postgis and 2 python modules (numpy and shapely).
> Sadly, importing such modules in a plpython function is very slow (several
> hundred milliseconds).
Is that mostly shapely (which I don't have)? numpy seems to be pretty
fast, like 16ms. But that is still slow for what you want, perhaps.
>
> I also don't know if this overhead is applied each time the function is
> called in the same session.
It is not. The overhead is incurred once per connection, not once per call,
so using a connection pooler could really help here.
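The once-per-connection behavior follows from Python's own module cache: the first `import` in a session loads and initializes the module, and every later import in the same interpreter just returns the entry from `sys.modules`. A minimal sketch of that mechanism, using `json` as a stand-in for numpy/shapely:

```python
import sys

import json  # first import: the module is loaded and registered in sys.modules

# A repeated import does no disk or initialization work; Python simply
# hands back the already-cached module object.
import json as j2

print(j2 is sys.modules["json"])  # the very same cached object
```

So inside one plpython backend, only the first function call that imports a module pays the load cost; a pooler amortizes that cost across many client sessions.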
> Is there a way to pre-import those modules once and for all,
> such that the python functions are accelerated?
I don't think there is. With plperl you can do this by loading the
module in plperl.on_init and putting plperl into
shared_preload_libraries, so that this happens just once at server
startup. But I don't see a way to do anything analogous for plpython,
due to the lack of a plpython.on_init. I think that is because the
infrastructure to do that is part of making a "trusted" version of the
language, which python doesn't have. (But it could just be that no one
has ever gotten around to adding it.)
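For reference, the plperl arrangement described above looks roughly like this in postgresql.conf (a sketch; the module name is just a placeholder):

```
# postgresql.conf (sketch)
shared_preload_libraries = 'plperl'        # load plperl at postmaster start
plperl.on_init = 'use Some::Module;'       # placeholder module; the on_init code
                                           # runs when the interpreter is created,
                                           # so backends inherit it already loaded
```

With plpython there is no on_init-style GUC, so the closest workaround remains keeping connections alive (pooling) so the import cost is paid once per backend rather than once per query.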
Cheers,
Jeff