From: Atri Sharma <atri(dot)jiit(at)gmail(dot)com>
To: Bruce Momjian <bruce(at)momjian(dot)us>
Cc: Merlin Moncure <mmoncure(at)gmail(dot)com>, Seref Arikan <serefarikan(at)kurumsalteknoloji(dot)com>, PG-General Mailing List <pgsql-general(at)postgresql(dot)org>
Subject: Re: What happens if I create new threads from within a postgresql function?
Date: 2013-02-18 17:03:26
Message-ID: CEF7562B-414F-4E8F-8C19-1993C0C66336@gmail.com
Lists: pgsql-general
Sent from my iPad
On 18-Feb-2013, at 22:27, Bruce Momjian <bruce(at)momjian(dot)us> wrote:
> On Mon, Feb 18, 2013 at 09:56:22AM -0600, Merlin Moncure wrote:
>> On Mon, Feb 18, 2013 at 5:10 AM, Seref Arikan
>> <serefarikan(at)kurumsalteknoloji(dot)com> wrote:
>>> Greetings,
>>> What would happen if I create multiple threads from within a postgresql
>>> function written in C?
>>> I have the opportunity to do parallel processing on binary data, and I need
>>> to create multiple threads to do that.
>>> If I can ensure that all my threads complete their work before I exit my
>>> function, would this cause any trouble ?
>>> I am aware of postgresql's single-threaded nature when executing queries,
>>> but is this a limitation for custom multi-threaded code in C-based
>>> functions?
>>> I can't see any problems other than my custom spawned threads living beyond my
>>> function's execution and memory/resource allocation issues, but if I can
>>> handle them, shouldn't I be safe?
>>>
>>> I believe I've seen someone applying a similar principle to use GPUs with
>>> postgresql, and I'm quite interested in giving this a try, unless I'm
>>> missing something.
>>
>> Some things immediately jump to mind:
>> *) backend library routines are not multi-thread safe. Notably, the
>> SPI interface and the memory allocator, but potentially anything. So
>> your spawned threads should avoid calling the backend API. I don't
>> even know if it's safe to call malloc.
>>
>> *) postgres exception handling can burn you, so I'd be stricter than
>> "before I exit my function"...really, you need to make sure threads
>> terminate before any potentially exception-throwing backend routine
>> fires, which is basically all of them, including palloc memory
>> allocation and interrupt checking. So, we must understand that:
>>
>> While your threads are executing, your query can't be cancelled --
>> only a hard kill will take the database down. If you're ok with that
>> risk, then go for it. If you're not, then I'd think about
>> sending the bytea through a protocol to a threaded processing
>> server running outside of the database. More work and slower
>> (protocol overhead), but much more robust.
>
> You can see the approach of not calling any PG-specific routines from
> threads here:
>
> http://wiki.postgresql.org/wiki/Parallel_Query_Execution#Approaches
>
Is there any way to locally synchronise the threads in my code, and send the requests to the PostgreSQL backend one at a time? Like a waiting queue in my code?
Regards,
Atri