| From: | "Lamar Owen" <lowen(at)pari(dot)edu> |
|---|---|
| To: | pgsql-hackers(at)postgresql(dot)org |
| Cc: | Jeremy Drake <jeremyd(at)jdrake(dot)com> |
| Subject: | Re: large object regression tests |
| Date: | 2006-09-09 20:28:48 |
| Message-ID: | 200609091628.48477.lowen@pari.edu |
| Lists: | pgsql-hackers |
On Tuesday 05 September 2006 02:59, Jeremy Drake wrote:
> I am considering, and I think that in order to get a real test of the
> large objects, I would need to load data into a large object which would
> be sufficient to be loaded into more than one block (large object blocks
> were 1 or 2K IIRC) so that the block boundary case could be tested. Is
> there any precedent on where to grab such a large chunk of data from? I
> was thinking about using an excerpt from a public domain text such as Moby
> Dick, but on second thought binary data may be better to test things with.
A 5 or 6 megapixel JPEG image. Maybe a photograph of an elephant.
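For what it's worth, the binary data doesn't have to come from an image at all: a few lines of script can generate deterministic bytes that cross several block boundaries, which keeps the regression test reproducible. A sketch in Python; the 2 KB block size (LOBLKSIZE) is taken from Jeremy's recollection above, and the helper name is made up for illustration:

```python
# Sketch: deterministic binary test data spanning several large-object
# blocks (assumed 2 KB each), with one extra byte to force a partial
# final block -- the boundary case under discussion.
import hashlib

BLOCK_SIZE = 2048   # assumed LOBLKSIZE
N_BLOCKS = 4        # more than one block, per the test requirement

def make_test_blob(seed=b"lo-regress", size=BLOCK_SIZE * N_BLOCKS + 1):
    """Return `size` pseudo-random but reproducible bytes."""
    out = bytearray()
    counter = 0
    while len(out) < size:
        # Chain SHA-256 digests of seed+counter for repeatable "random" bytes.
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(out[:size])

data = make_test_blob()
# `data` crosses four block boundaries and ends mid-block; write it to a
# file and load it with psql's \lo_import (or libpq's lo_import) to seed
# the large-object test.
```

Same idea as the JPEG, but without shipping a multi-megabyte binary in the test tree.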
--
Lamar Owen
Director of Information Technology
Pisgah Astronomical Research Institute
1 PARI Drive
Rosman, NC 28772
(828)862-5554
www.pari.edu