Preface

This article introduces four methods of inserting large amounts of data into MySQL and shares them for your reference and study. Let's take a look at the details.

Method 1: Loop insertion

This is the most common method. It is fine when the amount of data is small, but connecting to and querying the database on every iteration wastes resources. The general idea is as follows (pseudocode; adapt it to your own business logic or framework syntax):

```php
for ($i = 1; $i <= 100; $i++) {
    $sql = 'INSERT ...';
    // querysql
}

foreach ($arr as $key => $value) {
    $sql = 'INSERT ...';
    // querysql
}

while ($i <= 100) {
    $sql = 'INSERT ...';
    // querysql
    $i++;
}
```

Because this is so common and not difficult, and it is not what I mainly want to write about today, I won't say more about it here.

Method 2: Reduce connection overhead by splicing one SQL statement

The pseudocode is as follows:

```php
// Assume the keys of $arr match the database column names;
// most PHP frameworks use this design when operating the database.
$arr_keys   = array_keys($arr);
$sql        = 'INSERT INTO tablename (' . implode(',', $arr_keys) . ') VALUES ';
$arr_values = array_values($arr);
// Repeat this line once per row being inserted:
$sql .= "('" . implode("','", $arr_values) . "'),";
$sql  = substr($sql, 0, -1); // drop the trailing comma

// After splicing, the statement looks roughly like:
// INSERT INTO tablename (username,password) VALUES ('xxx','xxx'),('xxx','xxx'),('xxx','xxx'), ...
// querysql
```

Written this way, inserting 10,000 records is normally no problem, unless the individual rows are very long. It is enough for ordinary batch inserts, such as generating card numbers or random codes in bulk.

Method 3: Use a stored procedure

I have one of these at hand, so I'll provide the SQL; combine it with your own business logic.
```sql
delimiter $$$
create procedure zqtest()
begin
    declare i int default 0;
    set i = 0;
    start transaction;
    while i < 80000 do
        -- your insert sql
        set i = i + 1;
    end while;
    commit;
end
$$$
delimiter ;
call zqtest();
```

This is just test code; define the specific parameters yourself. I insert 80,000 records at a time. That is not a huge number, but each record is large, with many varchar(4000) and text fields. It takes 6.524 seconds.

Method 4: Use MySQL LOAD DATA LOCAL INFILE

This is what I currently use, so I've copied my PDO code here for your reference:

```php
// Create a PDO connection with MYSQL_ATTR_LOCAL_INFILE enabled.
public function pdo_local_info()
{
    global $system_dbserver;
    $dbname = '[email protected]';
    $ip     = '[email protected]';
    $user   = '[email protected]';
    $pwd    = '[email protected]';
    $dsn    = 'mysql:dbname=' . $dbname . ';host=' . $ip . ';port=3306';
    $options = [PDO::MYSQL_ATTR_LOCAL_INFILE => true];
    $db = new PDO($dsn, $user, $pwd, $options);
    return $db;
}

// The pseudocode is as follows:
public function test()
{
    $arr_keys = array_keys($arr);
    $root_dir = $_SERVER["DOCUMENT_ROOT"] . '/';
    $my_file  = $root_dir . "[email protected]/sql_cache/" . $order['OrderNo'] . '.sql';
    $fhandler = fopen($my_file, 'a+');
    if ($fhandler) {
        // Write the tab-separated row 80,000 times, one row per line.
        $sql = implode("\t", $arr);
        $i = 1;
        while ($i <= 80000) {
            $i++;
            fwrite($fhandler, $sql . "\r\n");
        }
        $sql  = "LOAD DATA LOCAL INFILE '" . $my_file . "' INTO TABLE ";
        $sql .= "tablename (" . implode(',', $arr_keys) . ")";
        $pdo = $this->pdo_local_info();
        $res = $pdo->exec($sql);
        if ($res === false) {
            // TODO: handle insertion failure
        }
        @unlink($my_file);
    }
}
```

This also involves a large amount of data, with many varchar(4000) and text fields. Time taken: 2.160 seconds.

The above meets basic requirements; one million rows is not a big problem. Beyond that, if the data volume is too large, it may involve sharding databases and tables, or inserting via queues.

Summary

The above is the full content of this article.
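As a footnote to Method 2 above: splicing raw values into the SQL string is fragile if the data contains quotes, and it is open to SQL injection. A safer variant of the same one-statement idea is to build a multi-row `INSERT` with placeholders and bind the values through PDO. The sketch below uses an illustrative `build_batch_insert()` helper and table/column names that are not from the original article; only the string-building part is shown, since it runs without a database.

```php
<?php
// Sketch: build one multi-row, parameterized INSERT statement for a batch of rows.
// build_batch_insert() and the names 'tablename', 'username', etc. are illustrative.
function build_batch_insert(string $table, array $rows): array
{
    $columns = array_keys($rows[0]);
    // One "(?,?,...)" group per row, one "?" per column.
    $placeholderRow = '(' . implode(',', array_fill(0, count($columns), '?')) . ')';
    $sql = 'INSERT INTO ' . $table
         . ' (' . implode(',', $columns) . ') VALUES '
         . implode(',', array_fill(0, count($rows), $placeholderRow));
    // Flatten the row values in column order to match the placeholders.
    $params = [];
    foreach ($rows as $row) {
        foreach ($columns as $col) {
            $params[] = $row[$col];
        }
    }
    return [$sql, $params];
}

// Example: two rows produce one statement with six placeholders.
[$sql, $params] = build_batch_insert('tablename', [
    ['username' => 'a', 'password' => 'x', 'email' => 'a@x'],
    ['username' => 'b', 'password' => 'y', 'email' => 'b@y'],
]);
echo $sql, "\n";           // INSERT INTO tablename (username,password,email) VALUES (?,?,?),(?,?,?)
echo count($params), "\n"; // 6
// Executing it is then one round trip: $pdo->prepare($sql)->execute($params);
```

This keeps Method 2's benefit of a single round trip while letting the driver handle escaping. Note that MySQL's `max_allowed_packet` setting still limits the total statement size, so very large batches should be split into chunks.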
I hope the content of this article has some reference value for your study or work. Thank you for your support of 123WORDPRESS.COM.