From: "Robert Brewer" <fumanchu(at)aminus(dot)org>
To: "bricklen" <bricklen(at)gmail(dot)com>
Cc: <pgsql-bugs(at)postgresql(dot)org>
Subject: Re: SELECT '(1, nan, 3)'::cube;
Date: 2011-03-16 18:14:31
Message-ID: F1962646D3B64642B7C9A06068EE1E64110D7CB6@ex10.hostedexchange.local
Lists: pgsql-bugs
bricklen wrote:
> On Tue, Mar 15, 2011 at 9:08 AM, Robert Brewer <fumanchu(at)aminus(dot)org>
> wrote:
> > I'm working on a hypercube implementation in Postgres using
> contrib/cube
> >
> > and need to insert 80,000 rows in one go from Python. Doing so with
> > INSERT, even multiple statements in one call, is pretty slow. I've
> been
> > investigating if using COPY is faster.
>
> When you say "multiple statements", do you mean
>
> INSERT INTO foo (coords) VALUES
> (cube(ARRAY[1, 'nan', 3]::float[])),
> (cube(ARRAY[2, 'nan', 4]::float[])),
> (cube(ARRAY[3, 'nan', 5]::float[])),
> (cube(ARRAY[4, 'nan', 6]::float[]));
>
> I was going to suggest trying that method, but if you already have
> then please ignore me!
Yes, I'm using the above now, but I'm looking for something faster. I'll probably settle on using -Inf for the short term so I can use COPY.
Bob
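[For reference, a minimal sketch of the -Inf substitution approach described above: build a COPY-format text buffer of cube literals, replacing NaN coordinates with -Infinity so contrib/cube's input parser accepts them. The function name, table, and column names here are illustrative, not from the thread.]

```python
import io
import math

def cube_copy_buffer(rows):
    """Build a COPY-text buffer of cube literals for bulk loading.

    contrib/cube rejects 'nan' in its input syntax, so each NaN
    coordinate is replaced with -Infinity before streaming via COPY.
    """
    buf = io.StringIO()
    for coords in rows:
        safe = ["-Infinity" if math.isnan(c) else repr(float(c))
                for c in coords]
        buf.write("(%s)\n" % ", ".join(safe))
    buf.seek(0)
    return buf
```

The buffer could then be streamed with, e.g., psycopg2's copy_expert (assuming a table `foo` with a `coords cube` column): `cur.copy_expert("COPY foo (coords) FROM STDIN", cube_copy_buffer(rows))`.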