From: Dominique Devienne <ddevienne(at)gmail(dot)com>
To: Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us>
Cc: Erik Wienhold <ewie(at)ewie(dot)name>, Sai Teja <saitejasaichintalapudi(at)gmail(dot)com>, pgsql-general(at)lists(dot)postgresql(dot)org
Subject: Re: Huge input lookup exception when trying to create the index for XML data type column in postgreSQL
Date: 2023-09-08 09:39:00
Message-ID: CAFCRh-83y136owqFsC6YYAbeR8utEFAur4ELMs8C2AEY_iEA2A@mail.gmail.com
Lists: pgsql-general
On Thu, Sep 7, 2023 at 10:22 PM Tom Lane <tgl(at)sss(dot)pgh(dot)pa(dot)us> wrote:
> Erik Wienhold <ewie(at)ewie(dot)name> writes:
> > Looks like "Huge input lookup" as reported in [1] (also from Sai) and
> > that error is from libxml.
>
> Ah, thanks for the pointer. It looks like for the DOCUMENT case,
> we could maybe relax this restriction by passing the XML_PARSE_HUGE
> option to xmlCtxtReadDoc(). However, there are things to worry about:
>
Just a remark from the sidelines, from someone who did a fair bit of XML
work in years past. That XPath is simple, and a streaming parser (SAX or
StAX) could handle it, while the XML_PARSE_HUGE option presumably applies
to a DOM parser. So is there a workaround to somehow force using a
streaming parser, instead of one that must produce the whole Document,
just so a few elements can be picked out of it? FWIW. --DD
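The streaming approach suggested above can be sketched with Python's stdlib
`xml.etree.ElementTree.iterparse`, which emits elements incrementally instead
of materializing a whole DOM. This is only an illustration of the technique,
not anything Postgres does internally; the `totalAmount` element name and the
tiny sample document are hypothetical stand-ins for a huge input:

```python
import io
import xml.etree.ElementTree as ET

# A small stand-in for a potentially huge XML document.
xml_data = b"""<orders>
  <order id="1"><totalAmount>10.50</totalAmount></order>
  <order id="2"><totalAmount>99.99</totalAmount></order>
</orders>"""

def stream_pick(source, tag):
    """Yield the text of matching elements while discarding everything
    else, so memory stays bounded even for very large inputs."""
    for event, elem in ET.iterparse(source, events=("end",)):
        if elem.tag == tag:
            yield elem.text
        elem.clear()  # free the subtree we no longer need

amounts = list(stream_pick(io.BytesIO(xml_data), "totalAmount"))
print(amounts)  # ['10.50', '99.99']
```

Because only one element subtree is alive at a time, this handles documents
far beyond what a DOM parser's size limits would allow, as long as the query
is simple enough to evaluate in a single forward pass.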