hi Urs,
this is perhaps not what you meant, but I'd use a simple script to
filter out the offending lines before they go into the database,
something like the following, written in Tcl (though you could write
it in almost any language):
----------<snip>----------
#!/usr/bin/tclsh
# pass stdin through to stdout, diverting "bad" lines (here: empty
# lines and lines longer than 20 characters) to badlines.txt
set fp [open badlines.txt a]
while {[gets stdin line] >= 0} {
    set n [string length $line]
    if {$n < 1 || $n > 20} {
        # bad line: stash it for later instead of passing it on
        puts $fp $line
    } else {
        puts stdout $line
    }
}
close $fp
exit 0
----------<snip>----------
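save that as 'badlines', make it executable and put it somewhere on the
postgres user's PATH (the name and /usr/local/bin are just my choices,
use whatever suits you):

chmod +x badlines
cp badlines /usr/local/bin/
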
now your pipeline becomes:
su - postgres -c "cat test.csv | badlines | psql -X -q test -c \"COPY
t_test FROM stdin WITH DELIMITER AS ',' NULL AS '?' CSV QUOTE AS '\\\"'
ESCAPE AS '\\\"'\""
and you can process the lines in 'badlines.txt' later, at your leisure.
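e.g. once they're fixed up (by hand, or with another script), the same
COPY will take them back in; a sketch, assuming badlines.txt is readable
from wherever you run this:

su - postgres -c "cat badlines.txt | psql -X -q test -c \"COPY
t_test FROM stdin WITH DELIMITER AS ',' NULL AS '?' CSV QUOTE AS '\\\"'
ESCAPE AS '\\\"'\""
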
HTH
--
regards, jr. (jr@tailorware.org.uk)