From: Bill Thoen <bthoen(at)gisnet(dot)com>
To: pgsql-general General <pgsql-general(at)postgresql(dot)org>
Subject: How Big is Too Big for Tables?
Date: 2010-07-28 17:09:44
Message-ID: 4C506458.7010502@gisnet.com
Lists: pgsql-general
I'm building a national database of agricultural information, and one of
the layers is a bit more than a gigabyte per state. That's 1-2 million
records per state, with a multipolygon geometry, and I've got about 40
states' worth of data. I'm trying to store everything in a single PG table.
What I'm concerned about is that if I combine every state into one big
table, will performance be terrible, even with indexes? On the other
hand, if I store the data in several smaller tables, then if a user zooms
in on a multi-state region, I've got to build or find a much more
complicated way to query multiple tables.
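(For the multi-table route, one pattern I've run across is PostgreSQL's inheritance-based partitioning, where per-state child tables can still be queried through one parent. This is just a rough sketch; the table and column names are hypothetical, and the FIPS codes are only examples.)

```sql
-- Parent table: queried as if it held all states.
CREATE TABLE parcels (
    id         bigint,
    state_fips char(2),
    geom       geometry   -- PostGIS multipolygon
);

-- One child table per state, inheriting from the parent.
-- The CHECK constraint lets the planner skip irrelevant children
-- when constraint_exclusion is enabled.
CREATE TABLE parcels_co (
    CHECK (state_fips = '08')   -- Colorado
) INHERITS (parcels);

CREATE TABLE parcels_ks (
    CHECK (state_fips = '20')   -- Kansas
) INHERITS (parcels);

-- Spatial index on each child so zoom-window queries stay fast.
CREATE INDEX parcels_co_geom_idx ON parcels_co USING gist (geom);
CREATE INDEX parcels_ks_geom_idx ON parcels_ks USING gist (geom);

-- A multi-state zoom then needs only one query against the parent:
-- SELECT * FROM parcels WHERE geom && <bounding-box geometry>;
```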
So I'm wondering: should I be concerned about building a single
national-size table (possibly 80-100 GB) for all these records, or should
I keep the tables smaller and hope there's something like ogrtindex out
there for PG tables? What do you all recommend in this case? I just moved
over to Postgres to handle big files, but I don't know its limits. With a
background working with MS Access and bitter memories of what happens
when you get near Access's two-gigabyte database size limit, I'm a
little nervous about these much bigger files. So I'd appreciate anyone's
advice here.
TIA,
- Bill Thoen