From: jack <jack4pg(at)a7q(dot)com>
To: "pgsql-general(at)lists(dot)postgresql(dot)org" <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: re: constant crashing
Date: 2024-04-14 20:18:30
Message-ID: RFS5d0T3zgpLa3Rd39K96mcUndlPZD7ktJmzEXkPTrcIwO3U0MDFC0StbozM0qW9tl7X1YDVVYGFQhCOMFi9r4kp3OohQCFIh198LV1HZvQ=@a7q.com
Lists: pgsql-general
The CSV files are produced by another system, a Windows app on a Windows machine. I copy them to a USB key, copy them onto the Ubuntu machine, and then import the data via the COPY command.
COPY master (field01, field02, ..., fieldX) FROM '/data/file.text' DELIMITER E'\t';
The fields are tab-delimited.
Importing the data works: I can load all the data into a single table without any problems. The issue appears only when I start updating that table. That is why I began using smaller temporary tables, one per CSV file, so I can do the updates in the smaller tables before moving everything into the single large table.
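For what it's worth, the per-file workflow described above can be sketched roughly like this. The table, column, and file names here are made up for illustration; the original message does not show the actual update statements:

```sql
-- Hypothetical names (staging, master, field01, field02) used for illustration only.
-- One staging table per CSV file, shaped like the big table.
CREATE TEMP TABLE staging (LIKE master INCLUDING DEFAULTS);

-- Load one tab-delimited file into the small staging table.
COPY staging (field01, field02) FROM '/data/file.text' DELIMITER E'\t';

-- Do the cleanup/updates on the small table first (example update only).
UPDATE staging SET field01 = upper(trim(field01));

-- Then move the rows into the large table and discard the staging table.
INSERT INTO master SELECT * FROM staging;
DROP TABLE staging;  -- temp tables are also dropped automatically at session end
```

Keeping the updates on a small per-file table like this limits how much work each UPDATE has to do before the rows reach the large table.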
After all the data is loaded and updated, I run PHP programs against the large table to generate reports. All of that works well EXCEPT for performing the updates on the data. And I do not want to use Perl or any outside tool. I want it all done in SQL, because I am required to document all my steps so that someone else can take over, so everything needs to be as simple as possible.
Next message: Adrian Klaver, 2024-04-14 20:28:31, "Re: constant crashing"
Previous message: jack, 2024-04-14 20:11:24, "re: constant crashing"