Does anyone have any experience with very large PostgreSQL tables? By
very large, I mean a table with ~38 million records, each between 80
and 128 bytes (some column sizes are still undecided) spread across
~10 columns, with probably three btree indexes. Basically the table
will hold all of the Postal Service deliverable addresses in the US in
a somewhat compressed form.
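For what it's worth, here is a rough back-of-envelope size estimate based on the numbers above. This is only a sketch: the 24-byte per-tuple overhead is an assumed approximation of PostgreSQL's row header and alignment cost, and it ignores page fill factor, TOAST, and index space.

```python
# Rough heap-size estimate for ~38M rows of 80-128 bytes each.
ROWS = 38_000_000
TUPLE_OVERHEAD = 24  # assumed per-row header/alignment cost (approximate)

def table_size_gb(row_bytes: int) -> float:
    """Raw heap size in GB, ignoring fill factor and indexes."""
    return ROWS * (row_bytes + TUPLE_OVERHEAD) / 1024**3

low, high = table_size_gb(80), table_size_gb(128)
print(f"Estimated heap size: {low:.1f}-{high:.1f} GB (before indexes)")
```

That works out to somewhere around 4-5 GB of heap before the three btree indexes are counted, which should help frame the hardware question.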
My concerns are in the area of performance and robustness.
I know I haven't been specific enough about the table layout, but I am
not sure yet exactly what it will look like. I am just trying to get a
gut-level feeling that this has been done before and that there are no
"gotchas" out there.
Thank you all,
Jeff