From: Lucio Chiessi <lucio(dot)chiessi(at)trustly(dot)com>
To: Raj kumar <rajkumar820999(at)gmail(dot)com>
Cc: Pgsql-admin <pgsql-admin(at)lists(dot)postgresql(dot)org>
Subject: Re: Load 500 GB test data with Large objects and different types
Date: 2023-02-16 20:02:44
Message-ID: CADoTbHVCTX8YhrBeDN2KbyZm9kyTXGygP+pE9AfJA6NmGo01BQ@mail.gmail.com
Lists: pgsql-admin
Hi Raj. You can use the generate_series() function to generate millions of
rows and load them with a single INSERT ... SELECT. I regularly use it to
generate test data.
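As a rough sketch of the idea (table and column names here are just
illustrative, not from any existing schema), generate_series() can drive an
INSERT that covers timestamps, binary data, numerics, and text in one pass:

```sql
-- Hypothetical test table mixing the requested types.
CREATE TABLE test_data (
    id          bigint PRIMARY KEY,
    created_at  timestamptz,
    payload     bytea,           -- binary column; see note below on large objects
    amount      numeric(12,2),
    label       text
);

-- One million rows from a single INSERT ... SELECT.
INSERT INTO test_data (id, created_at, payload, amount, label)
SELECT g,
       now() - (g || ' seconds')::interval,     -- spread of timestamps
       decode(md5(g::text), 'hex'),             -- 16 bytes of pseudo-random binary
       round((random() * 10000)::numeric, 2),   -- numeric values
       'row-' || g                              -- text values
FROM generate_series(1, 1000000) AS g;
```

To approach 500 GB, scale the series bounds up and/or make the payload wider
(for example by concatenating several md5() digests per row). If you need true
server-side large objects rather than bytea, the same generated binary values
can be wrapped with lo_from_bytea() to create large objects and store their
OIDs in an oid column.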
*Lucio Chiessi*
Senior Database Administrator
Trustly, Inc.
M: +55 27 996360276
PayWith*MyBank®* is now part of *Trustly*
On Thu, Feb 16, 2023 at 3:46 PM Raj kumar <rajkumar820999(at)gmail(dot)com> wrote:
> Hi,
>
> What is the easiest/best way to load 500 GB of data for testing purposes,
> with flexible data types?
> 1) Timestamp / Datetime
> 2) Blob / Large Objects
> 3) Other data types.
>
> I tried pgbench and sysbench, which only generate int and varchar types.
>
> Thanks,
> Raj Kumar Narendiran.
>
--
Please read our privacy policy here
<https://www.trustly.net/about-us/privacy-policy> on how we process your
personal data in accordance with the General Data Protection Regulation
(EU) 2016/679 (the “GDPR”) and other applicable data protection legislation
Next Message: Holger Jakobs | 2023-02-16 21:03:53 | Re: Load 500 GB test data with Large objects and different types
Previous Message: pradeep pandey | 2023-02-16 19:56:47 | EOL of Pglogical replication support