From: Craig James <craig_james(at)emolecules(dot)com>
To: fabio(dot)lafarcioli(at)molinoalimonti(dot)com
Cc: pgsql-performance(at)postgresql(dot)org
Subject: Re: Number of tables
Date: 2009-08-20 20:16:06
Message-ID: 4A8DAF06.7060108@emolecules.com
Lists: pgsql-performance
Fabio La Farcioli wrote:
> I am developing a web app for thousands of users (1,000-2,000).
>
> Each user has 2 working tables... so I end up with 2,000 (users) x 2
> tables = 4,000 tables!
>
> Does Postgres support such a large number of tables?
> Will I have performance problems?
We have run databases with over 100,000 tables with no problems.
However, we found that a table-per-user design is not a good idea: as the user count grows, the database becomes hard to maintain, and at any given moment only a few users are active.
So, we create a single large "archive" table, identical to the per-user table except that it also has a user-id column. When a user hasn't logged in for a few hours, a cron process copies their tables into the large archive table, and returns their personal tables to a "pool" of available tables.
When the user logs back in, a hidden part of the login process gets a table from the pool of available tables, assigns it to this user, and copies the user's data from the archive into this personal table. They are now ready to work. This whole process takes just a fraction of a second for most users.
We keep a pool of about 200 tables, which expands automatically (creating more tables) if needed, but we've never had more than 200 users active at one time.
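The checkout/return cycle above can be sketched in code. This is a minimal simulation, not Craig's actual implementation: it uses an in-memory SQLite database in place of PostgreSQL, and all table and column names (archive, pool_N, user_id, item) are assumptions made for illustration. In a real deployment the logout step would run from a cron job and the DDL/DML would go through a proper driver with locking around pool assignment.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# One shared "archive" table: same shape as a per-user table,
# plus a user_id column to distinguish owners.
cur.execute("CREATE TABLE archive (user_id INTEGER, item TEXT)")

pool_count = 0   # total pool tables ever created
free_pool = []   # names of pool tables not currently assigned
assigned = {}    # user_id -> pool table checked out to that user

def new_pool_table():
    # Expand the pool by creating one more empty per-user table.
    global pool_count
    name = f"pool_{pool_count}"
    pool_count += 1
    cur.execute(f"CREATE TABLE {name} (item TEXT)")
    return name

# Pre-create a small pool (the post describes ~200 tables).
free_pool = [new_pool_table() for _ in range(2)]

def login(user_id):
    # Check a table out of the pool (expanding if empty) and
    # copy the user's rows back from the archive.
    table = free_pool.pop() if free_pool else new_pool_table()
    assigned[user_id] = table
    cur.execute(
        f"INSERT INTO {table} SELECT item FROM archive WHERE user_id = ?",
        (user_id,))
    cur.execute("DELETE FROM archive WHERE user_id = ?", (user_id,))
    return table

def logout(user_id):
    # Copy the user's rows into the archive, empty the personal
    # table, and return it to the pool.
    table = assigned.pop(user_id)
    cur.execute(
        f"INSERT INTO archive SELECT ?, item FROM {table}", (user_id,))
    cur.execute(f"DELETE FROM {table}")
    free_pool.append(table)

# Round trip: user 7 logs in, writes a row, logs out, logs back in.
t = login(7)
cur.execute(f"INSERT INTO {t} VALUES ('hello')")
logout(7)
t2 = login(7)
rows = cur.execute(f"SELECT item FROM {t2}").fetchall()
# rows == [('hello',)] -- the data survived the archive round trip
```

The key property is that the number of tables is bounded by peak concurrent users rather than total users, which is what keeps the catalog small and maintainable.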
Craig
Next Message: Greg Stark, 2009-08-20 21:41:57, Re: Number of tables
Previous Message: Robert Haas, 2009-08-20 17:55:29, Re: number of rows estimation for bit-AND operation