Re: Is it reasonable to store double[] arrays of 30K elements

From: Rob Sargent <robjsargent(at)gmail(dot)com>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: Is it reasonable to store double[] arrays of 30K elements
Date: 2014-02-04 20:59:32
Message-ID: 52F154B4.2090606@gmail.com
Lists: pgsql-general

On 02/04/2014 01:52 PM, AlexK wrote:
> Every row of my table has a double[] array of approximately 30K numbers. I
> have run a few tests, and so far everything looks good.
>
> I am not pushing the limits here, right? It should be perfectly fine to
> store arrays of 30k double numbers, correct?
>
>
>
> --
> View this message in context: http://postgresql.1045698.n5.nabble.com/Is-it-reasonable-to-store-double-arrays-of-30K-elements-tp5790562.html
> Sent from the PostgreSQL - general mailing list archive at Nabble.com.
>
>
What sorts of tests, and what sorts of results?
Each record takes something like 30000*8 bytes plus 30000*(per-cell
overhead, which could be zero), so it is definitely spilling over into
TOAST.
Have you done any large-scale deletes?
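The back-of-envelope arithmetic above can be sketched as follows. This is a minimal estimate, assuming 8 bytes per PostgreSQL float8, an approximate 1-D array header size, and the usual ~2 KB TOAST trigger (a quarter of the default 8 KB page); the exact overhead constants vary by version and build, so treat the numbers as illustrative:

```python
# Rough size estimate for one 30K-element float8[] value.
ELEMENTS = 30_000
FLOAT8_BYTES = 8          # a PostgreSQL double precision value is 8 bytes
ARRAY_HEADER_BYTES = 24   # approximate 1-D array overhead (assumption)

raw_size = ELEMENTS * FLOAT8_BYTES + ARRAY_HEADER_BYTES
print(f"raw array size: {raw_size} bytes (~{raw_size / 1024:.0f} KB)")

# TOAST normally kicks in when a tuple exceeds roughly 2 KB,
# so a value this large is stored out-of-line (and compressed by default).
TOAST_THRESHOLD = 2000    # approximation of TOAST_TUPLE_THRESHOLD
print("exceeds TOAST threshold:", raw_size > TOAST_THRESHOLD)
```

At ~240 KB per array the value is far past the threshold, which is why the question about large-scale deletes matters: the dead out-of-line data lives in the table's TOAST relation until vacuum reclaims it.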
