From: Markus Schaber <schabi(at)logix-tt(dot)com>
To: PostgreSQL-development <pgsql-hackers(at)postgresql(dot)org>
Subject: Re: large object regression tests
Date: 2006-09-08 12:13:32
Message-ID: 45015E6C.2090907@logix-tt.com
Lists: pgsql-hackers
Hi, Jeremy,
Jeremy Drake wrote:
> I am considering, and I think that in order to get a real test of the
> large objects, I would need to load data into a large object which would
> be sufficient to be loaded into more than one block (large object blocks
> were 1 or 2K IIRC) so that the block boundary case could be tested. Is
> there any precedent on where to grab such a large chunk of data from?
You could generate such data on the fly, as part of the test scripts.
E.g. a blob of zero bytes, a blob of 0xff bytes, a blob of pseudo-random
data...
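
For illustration, here is a minimal sketch of how such data could be generated
directly in a SQL test script. The lo_from_bytea() function used here is an
assumption (it only exists in later PostgreSQL releases), and 4096 bytes is
chosen simply to span more than one 2 KB large-object block:

  -- Illustrative sketch only: generate large-object test data on the fly.
  -- Assumes lo_from_bytea() (later PostgreSQL releases) and a 2 KB LO block
  -- size, so 4096 bytes is enough to cross a block boundary.

  -- A blob of zero bytes.
  SELECT lo_from_bytea(0, decode(repeat('00', 4096), 'hex'));

  -- A blob of 0xff bytes.
  SELECT lo_from_bytea(0, decode(repeat('ff', 4096), 'hex'));

  -- A blob of reproducible pseudo-random data (setseed() fixes the sequence).
  SELECT setseed(0.5);
  SELECT lo_from_bytea(0,
           string_agg(decode(lpad(to_hex((random() * 255)::int), 2, '0'), 'hex'),
                      ''::bytea))
  FROM generate_series(1, 4096);

In a real regression test the returned OIDs would of course be captured and
removed again with lo_unlink() at the end, so the test database stays clean.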
Markus
--
Markus Schaber | Logical Tracking&Tracing International AG
Dipl. Inf. | Software Development GIS
Fight against software patents in EU! www.ffii.org www.nosoftwarepatents.org