| From: | Jim Green <student(dot)northwestern(at)gmail(dot)com> |
|---|---|
| To: | pgsql-general <pgsql-general(at)postgresql(dot)org> |
| Subject: | huge price database question.. |
| Date: | 2012-03-20 23:27:16 |
| Message-ID: | CACAe89w4r9LbZCgv1p=xaZULtqBHcPTFz0XLRbH8MXGN9XLm-Q@mail.gmail.com |
| Lists: | pgsql-general |
Greetings list!
I am pretty new to PostgreSQL, coming from MySQL. I did a fairly extensive search of the list archives and found a few useful threads, but none matching my exact situation, so I am venturing to ask here.
I have minute-level stock price data going back to 2005, with columns timestamp, open, high, low, close, volume, and a few more. Each day's data is about 1.2 million rows. I want to import all of it into PostgreSQL and analyze it with R via RPostgreSQL.
Right now I have about 7,000 tables, one per stock, and I use Perl to do the inserts, which is very slow. I would like to use COPY or another bulk-loading tool to load the daily raw .gz data, but with per-stock tables I first need to split the file into per-stock files before loading, which I consider a bit messy.
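For context, a minimal sketch of what a single-table bulk load could look like (the table name, column list, and file path are hypothetical): COPY loads the whole daily file in one statement, which is typically far faster than row-at-a-time INSERTs from a Perl script.

```sql
-- Hypothetical table and path; COPY ingests the full daily file in one
-- statement instead of ~1.2 million individual INSERTs, and needs no
-- per-stock splitting because symbol is just a column.
COPY minute_prices (symbol, ts, open, high, low, close, volume)
FROM '/data/prices_2012-03-20.csv'
WITH (FORMAT csv);
```

The .gz file need not be decompressed to disk first: something like `zcat prices_2012-03-20.csv.gz | psql -d prices -c "\copy minute_prices from stdin with (format csv)"` streams it through psql's client-side `\copy`.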
I would like advice on the following idea: store everything in one big table, partitioned by month (this gives a reasonable number of partitions), and bulk-load each daily file into it. My queries would consist mostly of selecting a particular symbol on a particular day.
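A sketch of what monthly partitioning could look like. In current PostgreSQL releases, partitioning is built from table inheritance plus CHECK constraints; the column names and types below are assumptions based on the description above.

```sql
-- Parent table: queried directly, but holds no rows itself.
CREATE TABLE minute_prices (
    symbol text      NOT NULL,
    ts     timestamp NOT NULL,
    open   numeric,
    high   numeric,
    low    numeric,
    close  numeric,
    volume bigint
);

-- One child table per month; the CHECK constraint lets the planner
-- skip irrelevant partitions when constraint_exclusion is enabled.
CREATE TABLE minute_prices_2012_03 (
    CHECK (ts >= DATE '2012-03-01' AND ts < DATE '2012-04-01')
) INHERITS (minute_prices);

-- Index matching the dominant query pattern: one symbol, one day.
CREATE INDEX ON minute_prices_2012_03 (symbol, ts);

-- Typical query: with constraint_exclusion = partition (the default),
-- only the March 2012 child is scanned.
SELECT *
FROM   minute_prices
WHERE  symbol = 'XYZ'
AND    ts >= '2012-03-20' AND ts < '2012-03-21';
```

The (symbol, ts) index order fits the stated query shape: equality on symbol first, then a range on the timestamp.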
Also, going forward I will import new data into the database every day.
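With a monthly layout, the daily import routine could target the current month's child table directly (names hypothetical), which avoids any per-row routing overhead:

```sql
-- Daily import: COPY straight into the month's partition. Loading the
-- child directly is cheapest; a routing trigger on the parent would
-- fire once per row and slow the bulk load back down.
COPY minute_prices_2012_03 (symbol, ts, open, high, low, close, volume)
FROM '/data/prices_2012-03-20.csv'
WITH (FORMAT csv);

-- Create the next month's partition ahead of time as part of the routine.
CREATE TABLE minute_prices_2012_04 (
    CHECK (ts >= DATE '2012-04-01' AND ts < DATE '2012-05-01')
) INHERITS (minute_prices);
CREATE INDEX ON minute_prices_2012_04 (symbol, ts);
```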
My hardware: 16 GB of RAM and a 4x5400 rpm RAID 10 array with enough space.
Thanks!
Jim.