From: Vick Khera <vivek(at)khera(dot)org>
To: pgsql-general <pgsql-general(at)postgresql(dot)org>
Subject: Re: Indexing large table of coordinates with GiST
Date: 2015-01-15 14:04:23
Message-ID: CALd+dccOeL5qLm8jut1JxENKt7tMOFeXcnqS4MBgmu3TjGAJfg@mail.gmail.com
Lists: pgsql-general
I'd restructure the table, splitting it into perhaps 100 or more inherited
tables. In my experience, Postgres does not handle that many rows in a single
table efficiently; my target is to keep each table under about 100 million
rows. I slice them up based on the common query patterns, usually by some ID
number modulo 100. I don't really ever partition by date range, as most
tutorials you'll see suggest.
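A minimal sketch of that inheritance-plus-modulo scheme, using hypothetical names (`nodes`, `node_id`) borrowed from the question below rather than anything Vick posted; at the time of this thread, partitioning meant table inheritance with CHECK constraints:

```sql
-- One child table per remainder class; names and the modulus 100
-- are illustrative assumptions, not from the original message.
CREATE TABLE nodes_p00 (CHECK (node_id % 100 = 0)) INHERITS (nodes);
CREATE TABLE nodes_p01 (CHECK (node_id % 100 = 1)) INHERITS (nodes);
-- ... and so on, one child per remainder up to nodes_p99.

-- Each child then carries its own, much smaller GiST index:
CREATE INDEX nodes_p00_geom_idx ON nodes_p00 USING gist (geom);
CREATE INDEX nodes_p01_geom_idx ON nodes_p01 USING gist (geom);
```

Note that with modulo CHECK constraints the planner can only prune children when the query repeats the same predicate (e.g. `WHERE node_id % 100 = 42 AND node_id = ...`), so the application typically routes queries to the right child itself.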
On Thu, Jan 15, 2015 at 7:44 AM, Daniel Begin <jfd553(at)hotmail(dot)com> wrote:
> Hi, I'm trying to create an index on coordinates (geography type) over a
> large table (4.5 billion records) using GiST...
>
> CREATE INDEX nodes_geom_idx ON nodes USING gist (geom);
>
> The command ran for 5 days until my computer stopped because of a power
> outage!
> Before restarting the index creation, I am asking the community if there
> are
> ways to shorten the time it took the first time :-)
>
> Any idea?
>
> Daniel
>
>
>
> --
> Sent via pgsql-general mailing list (pgsql-general(at)postgresql(dot)org)
> To make changes to your subscription:
> http://www.postgresql.org/mailpref/pgsql-general
>
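One widely used lever for shortening large index builds is raising `maintenance_work_mem` for the session before running `CREATE INDEX`, so more of the build happens in memory. A minimal sketch (the `'2GB'` figure is an illustrative assumption; tune it to the machine's available RAM):

```sql
-- Session-local setting; does not touch postgresql.conf.
SET maintenance_work_mem = '2GB';

CREATE INDEX nodes_geom_idx ON nodes USING gist (geom);
```

The setting reverts when the session ends, so it can be raised aggressively for a one-off build without affecting normal operation.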