Re: constant crashing

From: Adrian Klaver <adrian(dot)klaver(at)aklaver(dot)com>
To: jack <jack4pg(at)a7q(dot)com>, "pgsql-general(at)lists(dot)postgresql(dot)org" <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Re: constant crashing
Date: 2024-04-14 20:28:31
Message-ID: 311b047d-01f1-4ce3-86de-09ff0181ba00@aklaver.com
Lists: pgsql-general

On 4/14/24 13:18, jack wrote:
> The CSV files are being produced by another system, a Windows app on a
> Windows machine. I then copy them to a USB key and copy them onto the
> ubuntu machine. The data is then imported via the COPY command.

The app?

The locale in use on the Windows machine?

The locale in use in the database?
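
For the database side, those settings can be read directly, for example:

```sql
-- Show the database's encoding and locale settings
SHOW server_encoding;
SHOW lc_collate;
SHOW lc_ctype;
```

If the Windows app writes the files in a Windows code page (e.g. WIN1252) while the database expects UTF8, string operations can hit bytes that are not valid in the database encoding.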

>
> COPY master (field01,field02..fieldX) FROM '/data/file.text' DELIMITER E'\t'
> The fields are tab delimited.
>
> But importing the data works. I can get all the data into a single table
> without any problems. The issue is only when I start to update the
> single table. And that is why I started using smaller temporary tables
> for each CSV file, to do the updates in the smaller tables before I move
> them all to a single large table.

The import just dumps the data in; my suspicion is that the problem is
related to using string functions on the data.
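
As a first check (a sketch only, using field01 from your COPY statement as an example column), you could look for rows containing non-ASCII bytes that a Windows locale may have encoded differently before running your updates:

```sql
-- Hypothetical check: sample rows whose text contains non-ASCII
-- characters, which are the usual suspects when string functions
-- misbehave after a cross-platform import.
SELECT ctid, field01
FROM master
WHERE field01 ~ '[^[:ascii:]]'
LIMIT 20;
```

If that returns rows, comparing the bytes there against what the Windows app intended would narrow things down.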

>
> After all the data is loaded and updated, I run php programs on the
> large table to generate reports. All of which works well EXCEPT for
> performing the updates on the data. And I do not want to use perl or any
> outside tool. I want it all done in SQL because I am required to document
> all my steps so that someone else can take over, so everything needs to
> be as simple as possible.
>

--
Adrian Klaver
adrian(dot)klaver(at)aklaver(dot)com
