| From: | Paul Jones <pbj(at)cmicdo(dot)com> |
|---|---|
| To: | "pgsql-general(at)postgresql(dot)org" <pgsql-general(at)postgresql(dot)org> |
| Subject: | Can LC_TIME affect timestamp input? |
| Date: | 2013-01-25 18:24:45 |
| Message-ID: | 1359138285.56820.YahooMailNeo@web122202.mail.ne1.yahoo.com |
| Lists: | pgsql-general |
Is it possible for the LC_TIME locale setting to affect the format in which
timestamps are input?
I have DB2 CSV dumps with timestamps like '2003-10-21-22.59.44.000000'
that I want to load into Postgres with \copy. I would like to eliminate
the sed scripts that currently convert the timestamps, in order to speed
up the load. (I know I could stage the data through a temp table and use
to_timestamp(), but that requires a temp table for each real table, which
is not viable w.r.t. our project goals.)
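For reference, the conversion those sed scripts perform amounts to rewriting the DB2 separators into a form Postgres accepts by default; a minimal sketch (the pattern below is illustrative, not our exact script):

```shell
# Rewrite DB2-style 'YYYY-MM-DD-HH.MM.SS.ffffff' into
# 'YYYY-MM-DD HH:MM:SS.ffffff', which Postgres timestamp input accepts.
echo '2003-10-21-22.59.44.000000' |
  sed -E 's/([0-9]{4}-[0-9]{2}-[0-9]{2})-([0-9]{2})\.([0-9]{2})\.([0-9]{2})/\1 \2:\3:\4/'
# prints: 2003-10-21 22:59:44.000000
```

Running this over every dump before \copy is exactly the per-file pass I am hoping to avoid.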
I created a custom locale with the DB2 timestamp format defined and did
set lc_time='en_DB.UTF-8';
It had no effect on either input or output in Postgres. I know the locale
itself works, because date(1) displays the DB2 format correctly with it.
Postgres version: 9.2.2 (built from source)
OS: CentOS 6.3
Paul Jones