Ways to "serialize" result set for later use?

From: Adam Brusselback <adambrusselback(at)gmail(dot)com>
To: pgsql-general <pgsql-general(at)lists(dot)postgresql(dot)org>
Subject: Ways to "serialize" result set for later use?
Date: 2021-04-11 02:06:08
Message-ID: CAMjNa7cRWpRykp1ZOHoXxuwzoU8OBGqAyHkp9uVha8D85WAO4g@mail.gmail.com

Hey there everyone,

I am going through the process of writing my first pgTAP tests for my
database, and I wanted to get some feedback on whether my solution seems
fine, is just dumb, or could be accomplished much more easily another way.

The main problem I was trying to work around is that my tests are written
as functions and called using runtests(), so using psql and \copy to save
test data to the filesystem isn't really part of the workflow. I still
needed an easy way to pass my "expected" query result set into results_eq [
https://pgtap.org/documentation.html#results_eq] within a function body.
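
For context, the kind of test I mean looks roughly like this (the table
name, columns, and values here are made up purely for illustration):

```sql
-- Hypothetical pgTAP test function; runtests() picks it up by name.
CREATE OR REPLACE FUNCTION test_order_totals()
RETURNS SETOF text AS $$
BEGIN
    RETURN QUERY SELECT results_eq(
        -- Query under test:
        'SELECT order_id, total FROM order_totals ORDER BY order_id',
        -- Expected result set, inlined as a VALUES list:
        $q$ SELECT * FROM (VALUES (1, 10.00), (2, 25.50))
            AS expected (order_id, total) $q$,
        'order_totals should return the expected rows'
    );
END;
$$ LANGUAGE plpgsql;
```

The pain point is producing that second argument by hand for anything
bigger than a couple of rows.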

I originally manually dumped some "known good" data from a query to csv,
and built some SELECT ... FROM VALUES (...) statements by hand to do this.
That obviously sucks.
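
Concretely, the hand-built statements looked something like this
(illustrative values only), with every value typed out and cast manually:

```sql
-- A hand-built "expected" result set; tedious and error-prone for
-- anything beyond a handful of rows and columns.
SELECT *
FROM (VALUES
    (1::integer, 'widget'::text, 9.99::numeric),
    (2::integer, 'gadget'::text, 4.50::numeric)
) AS expected (id, name, price);
```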

Then I started looking for a way to get Postgres to "serialize" a query
result set to a VALUES list, similar to how pg_dump can be configured to
dump data as INSERTs for the plain format. I couldn't find anything at all
though: no mention of anything similar on the mailing list, Stack Overflow,
etc. I feel like I must be searching incorrectly.

Anyways, since I didn't find anything, or any discussion online of
something like I wanted, I just tried building it; see attached for the
source. It's a function that takes the text of a query that produces a
result set, and returns a SELECT ... FROM VALUES (...) statement that will
produce the exact same output.

It works like this: it runs the query and stores the results in a
temporary table, queries the system catalogs for the column names and data
types of that temp table, scans the temp table to build a VALUES from
clause containing each row, dynamically builds the SELECT column list to
cast everything back to the correct data type, and finally assembles all
of that into the query text returned to the caller.
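
In rough outline (this is a sketch of the approach just described, not the
attached file itself; the function name matches the attachment but the
body here is my own reconstruction):

```sql
CREATE OR REPLACE FUNCTION query_to_values(p_query text)
RETURNS text AS $$
DECLARE
    v_cols    text;  -- comma-separated column names for the VALUES alias
    v_select  text;  -- SELECT list with casts back to each column's type
    v_rowexpr text;  -- expression rendering one row as a quoted tuple
    v_rows    text;  -- all rows rendered as (...), (...), ...
BEGIN
    -- 1. Materialize the query's result set in a temp table.
    EXECUTE format('CREATE TEMP TABLE _qtv AS %s', p_query);

    -- 2. Read column names and types from the system catalogs.
    SELECT string_agg(quote_ident(a.attname), ', ' ORDER BY a.attnum),
           string_agg(format('%I::%s', a.attname,
                             format_type(a.atttypid, a.atttypmod)),
                      ', ' ORDER BY a.attnum),
           string_agg(format('quote_nullable(%I::text)', a.attname),
                      ' || '', '' || ' ORDER BY a.attnum)
    INTO v_cols, v_select, v_rowexpr
    FROM pg_attribute a
    WHERE a.attrelid = '_qtv'::regclass
      AND a.attnum > 0
      AND NOT a.attisdropped;

    -- 3. Scan the temp table, rendering each row as a quoted tuple.
    EXECUTE format(
        'SELECT string_agg(''('' || %s || '')'', E'',\n'') FROM _qtv',
        v_rowexpr)
    INTO v_rows;

    DROP TABLE _qtv;

    -- 4. Assemble: VALUES list of text literals, cast back in the SELECT.
    RETURN format(E'SELECT %s\nFROM (VALUES\n%s\n) AS v(%s)',
                  v_select, v_rows, v_cols);
END;
$$ LANGUAGE plpgsql;
```

Rendering every value as a text literal and casting back in the outer
SELECT sidesteps per-type literal formatting, at the cost of round-tripping
everything through text.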

Not fully tested or anything, and not guaranteed to work well. Please let
me know if you see any issues. I put this together in an hour of
hacking, but it did solve my immediate problem and I have been able to
generate easy "expected" result sets for my tests.

So I really just wanted to see if there is a better way to go about what
I'm trying to do. Does Postgres already support something similar that I
can use instead of this hack? Or is this really an alright way to go?

Attachment Content-Type Size
query_to_values.sql application/octet-stream 1.4 KB
