I am designing a system that pulls information from several distributed Interbase (RDBMS) databases. It is a web system, and each user can run roughly 50 queries per session. I can have around 100 simultaneous users, so I can end up with about 5,000 queries running at the same time. Each query is tied to a spatial component in PostGIS, so I need to store the result of each query in PostgreSQL in order to take full advantage of PostGIS.

My idea is that, at execution time, I would build a table in PostgreSQL for each query, use it, and then drop it. Is a system designed this way efficient? Is it even possible to have 5,000 tables in PostgreSQL? What kind of performance could I expect?
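To make the create/use/drop cycle concrete, here is a minimal sketch of what I have in mind. It assumes psycopg2 for the connection, and the table name, columns, and DSN are just hypothetical examples, not my actual schema:

```python
import psycopg2

# Hypothetical connection string; the real databases are fed from Interbase.
conn = psycopg2.connect("dbname=gisdb user=gisuser")

def run_spatial_query(query_id, rows):
    """Create a scratch table for one query, run a PostGIS operation, drop it."""
    table = f"query_{query_id}"  # one table per query, as described above
    with conn, conn.cursor() as cur:
        # Using TEMP here is just one option; a regular table that is created
        # and then dropped would follow the same pattern.
        cur.execute(
            f"CREATE TEMP TABLE {table} (id integer, geom geometry(Point, 4326))"
        )
        # rows is assumed to be a list of (id, longitude, latitude) tuples
        cur.executemany(
            f"INSERT INTO {table} (id, geom) "
            "VALUES (%s, ST_SetSRID(ST_MakePoint(%s, %s), 4326))",
            rows,
        )
        # Example PostGIS usage on the per-query data
        cur.execute(f"SELECT id, ST_AsText(geom) FROM {table}")
        result = cur.fetchall()
        cur.execute(f"DROP TABLE {table}")
    return result
```

My concern is whether repeating this for every query scales to the numbers above.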
Thanks for your help!