Re: question

From: Adrian Klaver <adrian(dot)klaver(at)aklaver(dot)com>
To: anj patnaik <patna73(at)gmail(dot)com>, pgsql-general(at)postgresql(dot)org
Subject: Re: question
Date: 2015-10-15 15:04:08
Message-ID: 561FC068.4020700@aklaver.com
Lists: pgsql-general

On 10/14/2015 06:39 PM, anj patnaik wrote:
> Hello,
>
> I recently downloaded Postgres 9.4, and I have a client application
> written in Tcl that inserts into the db and fetches records.
>
> For the majority of the time, the app will connect to the server to do
> insert/fetch.
>
> For occasional use, we want to remove the requirement to have a server
> db and just have the application retrieve data from a local file.
>
> I know I can use pg_dump to export the tables. The questions are:
>
> 1) Is there an in-memory db instance or a file-based one I can create
> that is loaded with the dump file? This way the app code doesn't have to change.

No.

>
> 2) Does pg support an embedded db?

No.

> 3) Or is my best option to convert the dump to SQLite, then import the
> SQLite file and have the app read that embedded db?

SQLite tends to follow Postgres conventions, so you might be able to load
the pg_dump output directly if you dump with --inserts or --column-inserts:

http://www.postgresql.org/docs/9.4/interactive/app-pgdump.html
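
For example, something along these lines might get you most of the way
there (the table, database and file names are just placeholders, and you
may still need to strip a few Postgres-specific SET/OWNER statements
from the top of the dump by hand):

  pg_dump --data-only --column-inserts -t my_table -f my_table.sql mydb
  sqlite3 my_data.db < my_table.sql

You would need to create the table in SQLite first, since the dumped
DDL is unlikely to translate cleanly.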

>
> Finally, I am noticing pg_dump takes a lot of time to create a dump of
> my table. Right now, the table has 77K rows. Are there any ways to
> create automated batch files to create dumps overnight and do so quickly?

Define "a lot of time".

What is the pg_dump command you are using?

Sure, use a cron job.
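
For instance, a nightly crontab entry along these lines would work
(schedule, paths and database name are just placeholders; note that %
has to be escaped as \% inside crontab):

  # Dump mydb in custom format every night at 2am
  0 2 * * * pg_dump -Fc -f /var/backups/mydb_$(date +\%Y\%m\%d).dump mydb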

>
> Thanks for your inputs!

--
Adrian Klaver
adrian(dot)klaver(at)aklaver(dot)com
