Re: huge price database question..

From: Brent Wood <Brent(dot)Wood(at)niwa(dot)co(dot)nz>
To: Jim Green <student(dot)northwestern(at)gmail(dot)com>, David Kerr <dmk(at)mr-paradox(dot)net>
Cc: "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org>
Subject: Re: huge price database question..
Date: 2012-03-21 01:54:58
Message-ID: B30242D206AB9543A3406649674DB4190D0D8D03@welwexmb01.niwa.local
Lists: pgsql-general


Also look at a clustered index on the timestamp column.
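
A minimal sketch of what that could look like from Perl, assuming a hypothetical minute_bars table with a ts timestamp column (names not from this thread). Note that PostgreSQL's CLUSTER is a one-time physical reorder by an index, not a maintained clustered index, so it needs rerunning after large loads:

#!/usr/bin/perl
# Sketch only: physically order a price table by its timestamp index
# so rows for nearby times sit together on disk. Table and index
# names are hypothetical.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:Pg:dbname=prices', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do('CREATE INDEX minute_bars_ts_idx ON minute_bars (ts)');
$dbh->do('CLUSTER minute_bars USING minute_bars_ts_idx');
$dbh->do('ANALYZE minute_bars');   # refresh planner statistics

$dbh->disconnect;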

Brent Wood

GIS/DBA consultant
NIWA
+64 (4) 4 386-0300
________________________________________
From: pgsql-general-owner(at)postgresql(dot)org [pgsql-general-owner(at)postgresql(dot)org] on behalf of Jim Green [student(dot)northwestern(at)gmail(dot)com]
Sent: Wednesday, March 21, 2012 2:50 PM
To: David Kerr
Cc: pgsql-general(at)postgresql(dot)org
Subject: Re: [GENERAL] huge price database question..

On 20 March 2012 21:40, David Kerr <dmk(at)mr-paradox(dot)net> wrote:
> On 03/20/2012 04:27 PM, Jim Green wrote:
>
> Greetings list!
> I am pretty new to PostgreSQL, coming from MySQL. I did a fairly
> extensive search of the list archives and came up with a few good
> threads, but didn't find the exact situation I have now, so I am
> venturing to ask here.
>
> I have daily minute-bar stock price data from 2005 on, with columns
> timestamp, open, high, low, close, volume and a few more. Each day's
> data is about 1.2 million rows. I want to import all the data into
> PostgreSQL and analyze it using R with the help of RPostgreSQL.
>
> Right now I have about 7,000 tables, one per stock, and I use Perl to
> do inserts; it's very slow. I would like to use COPY or another bulk
> loading tool to load the daily raw gz data, but I would need to split
> the file into per-stock files first before bulk loading, which I
> consider a bit messy.
>
> I would like advice on the following idea: store everything in one
> big table, partitioned by month (this gives a reasonable number of
> partitions), and bulk load the daily file into it. My queries would
> mostly be SELECTs on a particular symbol on a particular day.
>
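A minimal sketch of that layout, using inheritance-based partitioning (the usual approach on the PostgreSQL versions current at the time); all table and column names here are hypothetical:

#!/usr/bin/perl
# Sketch only: one parent table plus monthly children via INHERITS,
# with a CHECK constraint per child so constraint_exclusion can prune
# partitions. Names are hypothetical.
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect('dbi:Pg:dbname=prices', 'user', 'pass',
                       { RaiseError => 1, AutoCommit => 1 });

$dbh->do(q{
    CREATE TABLE minute_bars (
        symbol  text        NOT NULL,
        ts      timestamptz NOT NULL,
        open    numeric,
        high    numeric,
        low     numeric,
        close   numeric,
        volume  bigint
    )
});

# One child table per month; bulk loads target the child directly.
$dbh->do(q{
    CREATE TABLE minute_bars_2012_03 (
        CHECK (ts >= '2012-03-01' AND ts < '2012-04-01')
    ) INHERITS (minute_bars)
});
$dbh->do(q{
    CREATE INDEX minute_bars_2012_03_sym_ts
        ON minute_bars_2012_03 (symbol, ts)
});

$dbh->disconnect;

With a (symbol, ts) index on each child and constraint_exclusion enabled, a SELECT for one symbol on one day only has to touch that month's partition.
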
> Also, going forward, I will import the new data into the db every day.
>
> My hardware is 16 GB RAM and a 4x5400rpm RAID 10 array with enough space.
>
> Thanks!
>
> Jim.
>
>
> Seems like you'd want to do this?
> http://search.cpan.org/~turnstep/DBD-Pg-2.19.2/Pg.pm#pg_putcopydata
> COPY support
>
> DBD::Pg allows for quick (bulk) reading and storing of data by using the
> COPY command. The basic process is to use $dbh->do to issue a COPY command,
> and then to either add rows using "pg_putcopydata", or to read them by using
> "pg_getcopydata".

Thanks! Would you comment on the table setup as well?

Jim.

--
Sent via pgsql-general mailing list (pgsql-general(at)postgresql(dot)org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-general
--
Please consider the environment before printing this email.
NIWA is the trading name of the National Institute of Water & Atmospheric Research Ltd.
