From: Aleksander Alekseev <a.alekseev@postgrespro.ru>
To: Magnus Hagander <magnus@hagander.net>
Cc: David Steele <david@pgmasters.net>, Pg Hackers <pgsql-hackers@postgresql.org>
Subject: Re: 2018-03 CFM
Date: 2018-03-02 12:18:02
Message-ID: 20180302121802.GC29307@e733.localdomain
Lists: pgsql-hackers
Hello Magnus,
> You do realize we have the actual source database available, I hope? Since
> it's our own system... There is no need to scrape the data back out -- if
> we can just define what kind of reports we want, we can trivially run it on
> the source database. Or if we want it more often, we can easily make a
> webview for it. It's basically just a "map this URL to a SQL query"...
I don't think commitfest.cputube.org has the SQL data on whether a patch
passes the tests; it just displays SVG images from travis-ci.org. Also,
unfortunately, neither commitfest.postgresql.org nor commitfest.cputube.org
currently has any kind of public API, and neither allows exporting data,
e.g. in CSV or JSON.
I guess it would be nice if both services supported export in some
format, so anyone could build any kind of reports or automation tools
without parsing HTML with regular expressions or depending on other
people.
If I'm not mistaken, there has already been a discussion regarding public APIs.
I wonder what prevents adding one, at least a simple export of everything.
After all, it is indeed just a matter of mapping a URL to a SQL query. For
instance, this one:
select array_to_json(array_agg(row_to_json(tbl))) from tbl;
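To illustrate the idea, here is a minimal sketch of such a URL-to-query mapping. This is not the actual commitfest code; the table name, columns, and URL path are hypothetical, and SQLite stands in for the real database:

```python
import json
import sqlite3
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical in-memory stand-in for the commitfest database.
db = sqlite3.connect(":memory:", check_same_thread=False)
db.execute("CREATE TABLE patches (id INTEGER, name TEXT, status TEXT)")
db.execute("INSERT INTO patches VALUES (1, 'Better Upgrades', 'Needs review')")

def export_json(conn, table):
    """Serialize 'select * from <table>' as a JSON array of objects,
    mimicking array_to_json(array_agg(row_to_json(tbl)))."""
    cur = conn.execute("SELECT * FROM " + table)
    cols = [d[0] for d in cur.description]
    return json.dumps([dict(zip(cols, row)) for row in cur.fetchall()])

class ExportHandler(BaseHTTPRequestHandler):
    # The whole "API": a mapping from URL paths to exportable tables.
    routes = {"/export/patches.json": "patches"}

    def do_GET(self):
        table = self.routes.get(self.path)
        if table is None:
            self.send_error(404)
            return
        body = export_json(db, table).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

# To serve it:
# HTTPServer(("localhost", 8000), ExportHandler).serve_forever()
```

A whitelist of tables (the `routes` dict) rather than interpolating arbitrary client input keeps the endpoint safe, and the same handler could just as easily emit CSV.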
--
Best regards,
Aleksander Alekseev