Re: Initial insert

From: Laurent Cathala <lca(at)sylob(dot)com>
To: pgsql-performance <pgsql-performance(at)postgresql(dot)org>
Subject: Re: Initial insert
Date: 2015-01-23 08:54:52
Message-ID: CAE8N+H=GVW2YvraVASYtPAtdapzjvsKTwZnAd3CQuYXhyLECxw@mail.gmail.com
Lists: pgsql-performance

Hello,
I forgot to mention my PostgreSQL version: 9.2.

thanks,

Laurent CATHALA, Architecte
lca(at)sylob(dot)com

7 rue Marcel Dassault - Z.A. La Mouline - 81990 Cambon d'Albi - FRANCE
Tel : 05 63 53 08 18 - Fax : 05 63 53 07 42 - www.sylob.com <http://www.sylob.com/>
Support : 05 63 53 78 35 - support(at)sylob(dot)com
Company certified ISO 9001:2008 by Bureau Veritas.


2015-01-22 17:46 GMT+01:00 Laurent Cathala <lca(at)sylob(dot)com>:

> Hi,
> I'm trying to create data during an initial import and I'm encountering a
> performance issue.
> I have 2 tables; my process creates a record in each table and then executes
> a sum with a join over these 2 tables (there are other queries, but they are
> very fast).
>
> My 2 tables are empty before the import.
>
> My count query is:
>
> select sum(quantitest0_.quantite_valeur) as col_0_0_
> from dm5_quantitestock quantitest0_,
>      dm5_caracteristiquearticlestock caracteris1_
> where quantitest0_.id_caracteristiquearticlestock = caracteris1_.id
>   and caracteris1_.id_article = '4028804c4a311178014a346546967c59'
>
> I use a parameterized query.
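>
> (For illustration only: the same parameterized form can be exercised by hand
> with a server-side prepared statement along these lines; the application
> itself goes through a driver, so this is just a sketch reusing the id_article
> value from above, with a hypothetical statement name.)
>
> PREPARE sum_stock(varchar) AS
>   select sum(quantitest0_.quantite_valeur) as col_0_0_
>   from dm5_quantitestock quantitest0_,
>        dm5_caracteristiquearticlestock caracteris1_
>   where quantitest0_.id_caracteristiquearticlestock = caracteris1_.id
>     and caracteris1_.id_article = $1;
>
> EXECUTE sum_stock('4028804c4a311178014a346546967c59');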
>
> My process creates only 6000 records in each table.
>
> Over the course of the process, this sum query takes longer and longer.
>
> The auto_explain plan shows a seq scan:
>
> ----------
> Query Text: select sum(quantitest0_.quantite_valeur) as col_0_0_ from
> dm5_quantitestock quantitest0_, dm5_caracteristiquearticlestock
> caracteris1_ where
> quantitest0_.id_caracteristiquearticlestock=caracteris1_.id and
> caracteris1_.id_article=$1
> Aggregate (cost=2.04..2.05 rows=1 width=26) (actual
> time=862.621..862.621 rows=1 loops=1)
> Output: sum(quantitest0_.quantite_valeur)
> -> Nested Loop (cost=0.00..2.04 rows=1 width=26) (actual
> time=862.618..862.618 rows=0 loops=1)
> Output: quantitest0_.quantite_valeur
> Join Filter:
> ((quantitest0_.id_caracteristiquearticlestock)::text =
> (caracteris1_.id)::text)
> Rows Removed by Join Filter: 1869
> -> Seq Scan on public.dm5_quantitestock quantitest0_
> (cost=0.00..1.01 rows=1 width=164) (actual time=0.004..0.408 rows=1869
> loops=1)
> Output: quantitest0_.id,
> quantitest0_.datefinvalidite, quantitest0_.quantite_valeur,
> quantitest0_.id_caracteristiquearticlestock,
> quantitest0_.id_caracteristiquelieustock, quantitest0_.datecreationsysteme,
> quantitest0_.datemodificationsysteme, quantitest0_.id_creeparsysteme,
> quantitest0_.id_modifieparsysteme
> -> Seq Scan on public.dm5_caracteristiquearticlestock
> caracteris1_ (cost=0.00..1.01 rows=1 width=42) (actual time=0.456..0.456
> rows=1 loops=1869)
> Output: caracteris1_.id,
> caracteris1_.datefinvalidite, caracteris1_.id_lot, caracteris1_.id_article,
> caracteris1_.id_numeroserie, caracteris1_.datecreationsysteme,
> caracteris1_.datemodificationsysteme, caracteris1_.id_modifieparsysteme,
> caracteris1_.id_creeparsysteme
> Filter: ((caracteris1_.id_article)::text =
> ($1)::text)
> Rows Removed by Filter: 1869
> -----------
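>
> (The plan above was captured with auto_explain; a configuration along these
> lines produces this kind of output, though the exact settings used here may
> differ.)
>
> LOAD 'auto_explain';
> SET auto_explain.log_min_duration = 0;  -- log the plan of every statement
> SET auto_explain.log_analyze = on;      -- include actual times and row counts
> SET auto_explain.log_verbose = on;      -- include the "Output:" column lists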
>
> If I run ANALYZE during the process, the plan uses the indexes, but the
> execution time remains the same (see the sketch after the plan below).
>
> ---------
> Query Text: select sum(quantitest0_.quantite_valeur) as col_0_0_ from
> dm5_quantitestock quantitest0_, dm5_caracteristiquearticlestock
> caracteris1_ where
> quantitest0_.id_caracteristiquearticlestock=caracteris1_.id and
> caracteris1_.id_article=$1
> Aggregate (cost=16.55..16.56 rows=1 width=26) (actual
> time=654.998..654.998 rows=1 loops=1)
> Output: sum(quantitest0_.quantite_valeur)
> -> Nested Loop (cost=0.00..16.55 rows=1 width=26) (actual
> time=654.994..654.994 rows=0 loops=1)
> Output: quantitest0_.quantite_valeur
> Join Filter: ((quantitest0_.id_caracteristiquearticlestock)::text
> = (caracteris1_.id)::text)
> Rows Removed by Join Filter: 1651
> -> Index Scan using x_dm5_quantitestock_00 on
> public.dm5_quantitestock quantitest0_ (cost=0.00..8.27 rows=1 width=164)
> (actual time=0.011..0.579 rows=1651 loops=1)
> Output: quantitest0_.id, quantitest0_.datefinvalidite,
> quantitest0_.quantite_valeur, quantitest0_.id_caracteristiquearticlestock,
> quantitest0_.id_caracteristiquelieustock, quantitest0_.datecreationsysteme,
> quantitest0_.datemodificationsysteme, quantitest0_.id_creeparsysteme,
> quantitest0_.id_modifieparsysteme
> -> Index Scan using dm5_caracteristiquearticlestock_pkey on
> public.dm5_caracteristiquearticlestock caracteris1_ (cost=0.00..8.27
> rows=1 width=42) (actual time=0.395..0.395 rows=1 loops=1651)
> Output: caracteris1_.id, caracteris1_.datefinvalidite,
> caracteris1_.id_lot, caracteris1_.id_article, caracteris1_.id_numeroserie,
> caracteris1_.datecreationsysteme, caracteris1_.datemodificationsysteme,
> caracteris1_.id_modifieparsysteme, caracteris1_.id_creeparsysteme
> Filter: ((caracteris1_.id_article)::text =
> '4028804c4a311178014a346547307cce'::text)
> Rows Removed by Filter: 1651
>
> ----------
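>
> (A minimal sketch of the manual statistics refresh mentioned above; the
> actual run may have covered other tables as well.)
>
> ANALYZE dm5_quantitestock;
> ANALYZE dm5_caracteristiquearticlestock;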
>
> If I create the first 1000 records, then commit and end the transaction, the
> whole import is very fast.
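>
> (Purely illustrative, assuming the batches could be issued as plain SQL; the
> real import goes through the application, so this only sketches the batching
> idea.)
>
> BEGIN;
> -- first batch: roughly the first 1000 inserts into
> -- dm5_caracteristiquearticlestock and dm5_quantitestock,
> -- together with the sum queries for those rows
> COMMIT;  -- ending the transaction here is what makes the rest fast
>
> BEGIN;
> -- remaining inserts and sum queries
> COMMIT;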
>
>
> I can't change my process to split the work into smaller parts...
>
> Any ideas?
>
> Thanks.
>
>
> Laurent CATHALA, Architecte
> lca(at)sylob(dot)com
>
