From: | Thomas Kellerer <spam_eater(at)gmx(dot)net> |
---|---|
To: | pgsql-general(at)lists(dot)postgresql(dot)org |
Subject: | Re: PostgreSQL Volume Question |
Date: | 2018-06-20 08:16:54 |
Message-ID: | dc0bb140-3965-371c-d942-71c0d0ceb942@gmx.net |
Lists: | pgsql-general |
Data Ace schrieb am 15.06.2018 um 18:26:
> Well, I think my question strayed from my intention because of my
> poor understanding and phrasing :(
>
> Actually, I have 1TB of data and hardware specs sufficient to handle
> that amount, but the problem is that the workload requires too many
> join operations, and the analysis process is running too slowly right now.
>
> I've searched and found that a graph model fits network data, such as
> social data, well in terms of query performance.
>
> Should I change my DB (I mean my DB for analysis)? or do I need some
> other solutions or any extension?
AgensGraph is a Postgres fork implementing a graph database. It supports
Cypher as the query language while still supporting SQL (and even
queries mixing both).
I have never used it, but maybe it's worth a try.
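For a rough idea of what that looks like, here is a sketch based on
AgensGraph's documented Cypher support (I haven't run this myself, and
the graph/label names are made up; exact syntax may vary by version):

```sql
-- Hedged sketch of an AgensGraph session; 'social', 'person' and
-- 'follows' are illustrative names, not from the original post.
CREATE GRAPH social;
SET graph_path = social;

-- Create two vertices and an edge using Cypher
CREATE (:person {name: 'alice'})-[:follows]->(:person {name: 'bob'});

-- Traverse the relationship without writing an explicit join
MATCH (a:person)-[:follows]->(b:person)
RETURN a.name, b.name;
```

The appeal for join-heavy network analysis is that a MATCH pattern
expresses a traversal directly, where plain SQL would need one join
per hop.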
http://bitnine.net/agensgraph/
Thomas