
I am using a simple Perl script to populate a MySQL table with millions of rows. I'm using Perl DBI and DBD::mysql in the script. Example code below:

```perl
my $dbh = DBI->connect(<DB INFO>);
my $sth;
my $insert_com = "INSERT INTO TAB1 VALUES(?,?,?,?,?)";
for (1..50000000) {
    $sth = $dbh->prepare($insert_com);
    $sth->execute(<val1>,<val2>,<val3>,<val4>,<val5>);
}
```

As per the above code, I believe a commit is sent for each iteration of the loop. My question is whether it's possible to send a commit only every n iterations, i.e. to commit after inserting n rows into the table. If it's possible, can someone please tell me how? Thanks in advance. Cheers ...


1 Answer


You have to set AutoCommit to zero:

```perl
$dbh = DBI->connect($dsn, $user, $password,
                    { RaiseError => 1, AutoCommit => 0 });
```

and call $dbh->commit() every n rows.

See the DBI documentation for more details.
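Putting the two pieces together, a minimal sketch of the commit-every-n pattern might look like the following. It uses an in-memory SQLite database (via DBD::SQLite) so it runs without a MySQL server; the pattern is identical with DBD::mysql. The table layout, row count, and batch size of 1000 are illustrative, not taken from the question.

```perl
use strict;
use warnings;
use DBI;

# Connect with AutoCommit disabled so rows accumulate in one transaction
# until we explicitly commit. (Swap the DSN for your dbi:mysql:... one.)
my $dbh = DBI->connect("dbi:SQLite:dbname=:memory:", "", "",
                       { RaiseError => 1, AutoCommit => 0 });

$dbh->do("CREATE TABLE TAB1 (c1 INT, c2 INT, c3 INT, c4 INT, c5 INT)");

my $batch_size = 1000;    # hypothetical n; tune for your workload
my $sth = $dbh->prepare("INSERT INTO TAB1 VALUES (?,?,?,?,?)");

for my $i (1 .. 10_000) {
    $sth->execute($i, $i, $i, $i, $i);
    # Commit once per batch instead of once per row.
    $dbh->commit if $i % $batch_size == 0;
}
$dbh->commit;             # flush any partial final batch

my ($n) = $dbh->selectrow_array("SELECT COUNT(*) FROM TAB1");
print "$n rows inserted\n";
```

Note that the statement is prepared once, outside the loop; re-preparing the same SQL on every iteration, as in the question's code, adds avoidable overhead.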
