Hi,
What would be the most efficient way, performance-wise, of storing a lot of
rather big chunks of text in separate records in PostgreSQL? I'm dividing
huge XML documents into smaller bits and placing the bits into separate
records. Requests want all or just some of the records, and the document is
re-built based on the request. So everything is heavily IO-bound.
What would be the best way to do this? LargeObject, the binary blob feature
of PostgreSQL, or something else?
The chunks can be anything from a few lines to entire documents of
several megabytes (OK, that's the extreme case, but still...).
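To make the workflow concrete, here is a rough Python sketch of the split/reassemble
cycle I have in mind (function names, the sequence-number scheme, and the chunk size
are all made up for illustration; the actual storage would be PostgreSQL rows):

```python
# Hypothetical sketch: split a document into fixed-size chunks tagged with a
# sequence number (which would become a column in the chunk table), then
# rebuild the document -- or a sub-range of it -- by sorting on that number,
# the way an ORDER BY would on the database side.

def split_into_chunks(text: str, chunk_size: int) -> list[tuple[int, str]]:
    """Return (sequence_number, chunk) pairs covering the whole text."""
    return [(i, text[pos:pos + chunk_size])
            for i, pos in enumerate(range(0, len(text), chunk_size))]

def rebuild(chunks: list[tuple[int, str]]) -> str:
    """Reassemble a document (or a requested subset) from its chunks,
    restoring order via the sequence number."""
    return "".join(chunk for _, chunk in sorted(chunks))

doc = "<root>" + "x" * 100 + "</root>"
chunks = split_into_chunks(doc, 32)
assert rebuild(chunks) == doc  # round-trips cleanly
```

In the database the sequence number and chunk would just be two columns keyed
by a document id, so partial requests become a range query over the sequence.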
Best regards,
Thomas