
digitalmars.D.learn - mysql-native: SQL Transaction support?

reply salvari <salvari gmail.com> writes:
I'm using the mysql-native library for a massive data load. I've 
been trying to use transactions to improve performance, but it 
doesn't seem to work.

I'm parsing a text input file; the generated SQL is about 1 
million lines. Using mysql-native, it takes about 4 hours to 
load the data.

I've tried to divide the sql in transactions by calling:

auto cmdStartTran = cdb.Command("start transaction;");
cmdStartTran.execSQL();

And some inserts later:
auto cmdCommit = cdb.Command("commit;");
cmdCommit.execSQL();

Parsing the input file to plain SQL and executing the one million 
lines against the db takes more or less the same 4 hours.

But by adding the transactions directly in the text file, the 
performance improves dramatically: 60 seconds to load all the data.

Is it possible that my mysql-native database connection has 
autocommit activated? How can I check?
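(For anyone wondering the same thing: a quick way to check is to ask 
the server itself. The sketch below follows the Command/execSQL style 
from the snippet above; `cdb` is the connection from that snippet, and 
the exact mysql-native API may differ between versions.)

```d
import std.stdio : writeln;

// Sketch: query the server's autocommit setting through mysql-native.
// `cdb` and the Command/execSQL(Result) calls are assumed from the
// snippet above; check your mysql-native version for exact signatures.
auto check = cdb.Command("SELECT @@autocommit;");
auto rs = check.execSQLResult();   // fetch the one-row result set
writeln(rs[0][0]);                 // 1 = autocommit on, 0 = off

// With autocommit on, every INSERT is its own transaction. Turning it
// off (or wrapping inserts in START TRANSACTION ... COMMIT) batches
// the work into far fewer transactions:
auto off = cdb.Command("SET autocommit = 0;");
off.execSQL();
```

Either `SET autocommit = 0` or explicit `START TRANSACTION`/`COMMIT` 
pairs should avoid a per-statement commit, which is usually what makes 
bulk inserts slow.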
Sep 17 2015
parent reply Gary Willoughby <dev nomad.so> writes:
On Thursday, 17 September 2015 at 19:47:33 UTC, salvari wrote:
 I'm parsing a text input file, the generated sql is about 1 
 million lines of SQL. By using mysql-native it takes about 4 
 hours to load data.
I've used mysql-native before to handle hundreds of millions of 
rows of data and I remember it being blazingly fast, nowhere near 
the time you suggest.

Perhaps try to split the SQL into single statements and use 
prepared statements; this is where the big wins in speed are. 
Expecting any server to deal with a million lines of SQL in a 
timely manner is asking too much.
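(The suggestion above might look something like this sketch: one 
prepared INSERT executed per parsed row, wrapped in a transaction. The 
table, columns, `cn` connection, and `parsedRows` range are all 
hypothetical, and the Command-based API shown follows the older 
mysql-native style used in this thread; newer versions expose a 
different interface.)

```d
// Sketch: prepared-statement bulk load with one surrounding transaction.
// All names here are illustrative, not from the original post.
auto cmd = Command(cn, "INSERT INTO samples (name, value) VALUES (?, ?)");
cmd.prepare();                       // parse the statement once on the server

ulong rowsAffected;
Command(cn, "START TRANSACTION").execSQL(rowsAffected);
foreach (row; parsedRows)            // parsedRows: data parsed from the input file
{
    cmd.bindParameter(row.name, 0);  // bind by parameter index
    cmd.bindParameter(row.value, 1);
    cmd.execPrepared(rowsAffected);  // server reuses the already-parsed statement
}
Command(cn, "COMMIT").execSQL(rowsAffected);
```

The win comes from two places: the server parses the INSERT only once, 
and the single COMMIT replaces a million implicit per-statement commits.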
Sep 17 2015
parent salvari <salvari gmail.com> writes:
On Thursday, 17 September 2015 at 21:19:07 UTC, Gary Willoughby 
wrote:
 On Thursday, 17 September 2015 at 19:47:33 UTC, salvari wrote:
 I'm parsing a text input file, the generated sql is about 1 
 million lines of SQL. By using mysql-native it takes about 4 
 hours to load data.
 I've used mysql-native before to handle hundreds of millions of 
 rows of data and I remember it being blazingly fast, nowhere 
 near the time you suggest. Perhaps try to split the SQL into 
 single lines and use prepared statements, this is where the big 
 wins in speed are.

 Expecting any server to deal with a million lines of SQL in a 
 timely manner is asking too much.
Thanks for the advice. I know I should have used prepared inserts, 
but the input file is a bit messy and I was in a hurry.

The MySQL server is not the problem: as I said, executing exactly 
the same SQL, divided into chunks wrapped in transactions, gets the 
job done in 1 minute instead of 4 hours.

So far my problem is solved, but I would like to know more about 
transaction support in the mysql-native library. I have to make 
more tests.

-- salvari
Sep 18 2015