
From: Sam Mason <sam(at)samason(dot)me(dot)uk>
To: pgsql-general(at)postgresql(dot)org
Subject: Re: generic modelling of data models; enforcing constraints dynamically...
Date: 2009-09-24 20:25:24
Message-ID: 20090924202523.GJ22438@samason.me.uk
Lists: pgsql-general

On Thu, Sep 24, 2009 at 09:23:35PM +0200, Rob Marjot wrote:
> SELECT doesComply('relationname', keyValues.*) FROM (VALUES('col1',
> CAST(col1 AS TEXT)), VALUES('col2', CAST(col2 AS TEXT))) AS
> keyValues(the_key, the_value);
>
> The function "doesComply()" will then process the CONSTRAINTS table and
> raise an Error if the new / updated row does not fit...

I'd have a set of doesComply() functions: keep the first two parameters
as you have them, but overload the third to support the different
datatypes specifically. Something like:

CREATE FUNCTION doesComply(_rel TEXT, _key TEXT, _val INT) ...
CREATE FUNCTION doesComply(_rel TEXT, _key TEXT, _val DATE) ...
CREATE FUNCTION doesComply(_rel TEXT, _key TEXT, _val TEXT) ...
CREATE FUNCTION doesComply(_rel TEXT, _key TEXT, _val NUMERIC) ...
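
As a rough sketch of the TEXT overload, assuming a hypothetical layout
for your CONSTRAINTS table of (relname, keyname, pattern) with a regex
per column; adjust the lookup and the check to whatever your table
actually holds:

CREATE FUNCTION doesComply(_rel TEXT, _key TEXT, _val TEXT)
  RETURNS BOOLEAN LANGUAGE plpgsql STABLE AS $$
DECLARE
  _pattern TEXT;
BEGIN
  -- look up the rule for this relation/column; table layout is assumed
  SELECT pattern INTO _pattern
    FROM constraints
   WHERE relname = _rel AND keyname = _key;
  IF NOT FOUND THEN
    RETURN TRUE;  -- no rule registered, nothing to enforce
  END IF;
  IF _val !~ _pattern THEN
    RAISE EXCEPTION 'value % for %.% fails its constraint', _val, _rel, _key;
  END IF;
  RETURN TRUE;
END;
$$;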

And then have a set of "attribute" tables (one for each datatype) to
store the actual values in. At least PG can do some type checking for
you that way. Either that, or just leave them all as text to text
mappings in the database and only attempt to type things out in the
client code.
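
The attribute tables themselves could then look something like this
(names here are purely illustrative; one table per datatype, all
sharing the same shape):

CREATE TABLE int_attributes (
  obj_id  INTEGER NOT NULL,
  keyname TEXT    NOT NULL,
  val     INTEGER NOT NULL,
  PRIMARY KEY (obj_id, keyname)
);

CREATE TABLE date_attributes (
  obj_id  INTEGER NOT NULL,
  keyname TEXT    NOT NULL,
  val     DATE    NOT NULL,
  PRIMARY KEY (obj_id, keyname)
);

-- ...and likewise text_attributes and numeric_attributes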

Not sure why you're doing the VALUES contortions as well; why not just:

SELECT doesComply('relationname', 'col1', col1);

?
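
If these checks end up being fired from a BEFORE INSERT OR UPDATE
trigger (a guess on my part; the table and column names below are just
illustrative), the per-column calls could be as simple as:

CREATE FUNCTION check_relationname() RETURNS TRIGGER LANGUAGE plpgsql AS $$
BEGIN
  -- doesComply() raises an exception if a value breaks its rule
  PERFORM doesComply('relationname', 'col1', NEW.col1);
  PERFORM doesComply('relationname', 'col2', NEW.col2);
  RETURN NEW;
END;
$$;

CREATE TRIGGER relationname_comply
  BEFORE INSERT OR UPDATE ON relationname
  FOR EACH ROW EXECUTE PROCEDURE check_relationname();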

--
Sam http://samason.me.uk/
