Examples of 4 methods for inserting large amounts of data in MySQL


Preface

This article introduces four methods for inserting large amounts of data into MySQL, shared here for your reference and study. Let's take a look at the details.

Method 1: Loop Insertion

This is the most common method. It is fine when the amount of data is small, but every row is a separate query, so it spends a database round trip (and possibly a new connection) on every insert.

The general idea is as follows (this is pseudocode; adapt it to your own business logic or framework syntax):

// for loop: one INSERT per iteration
for ($i = 1; $i <= 100; $i++) {
    $sql = 'insert...............';
    // query sql
}

// foreach loop: one INSERT per array element
foreach ($arr as $key => $value) {
    $sql = 'insert...............';
    // query sql
}

// while loop
$i = 1;
while ($i <= 100) {
    $sql = 'insert...............';
    // query sql
    $i++;
}

Because it is too common and not difficult, and it is not what I am mainly writing about today, so I won’t say more here.
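
For completeness, here is a minimal sketch of Method 1 with PDO prepared statements. The DSN, credentials, table tb_user and its columns are placeholders for illustration, not part of the original code:

// Minimal sketch: loop insert with a prepared statement (assumed connection details).
$pdo = new PDO('mysql:host=127.0.0.1;dbname=test;charset=utf8mb4', 'user', 'pwd', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

// Prepare once, execute many times; each execute() is still one round trip.
$stmt = $pdo->prepare('INSERT INTO tb_user (username, password) VALUES (?, ?)');
for ($i = 1; $i <= 100; $i++) {
    $stmt->execute(['user' . $i, md5('pwd' . $i)]);
}

Preparing once at least avoids re-parsing the SQL on every iteration, but each row still costs a separate round trip, which is why the methods below are faster for large batches.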

Method 2: Reduce connection overhead by splicing a single SQL statement

The pseudo code is as follows

// Here we assume the keys of each row match the database column names;
// most PHP frameworks follow this convention when operating on the database.
$arr_keys = array_keys(reset($rows));
$sql = 'INSERT INTO tablename (' . implode(',', $arr_keys) . ') values ';
foreach ($rows as $arr) {
    $arr_values = array_values($arr);
    $sql .= "('" . implode("','", $arr_values) . "'),";
}
$sql = substr($sql, 0, -1); // strip the trailing comma
// After splicing, the statement looks roughly like:
// INSERT INTO tablename (username,password) values
// ('xxx','xxx'),('xxx','xxx'),('xxx','xxx'),.......
// query sql

This approach is basically fine for inserting around 10,000 records at a time, unless the rows are very long (a very large spliced statement can run into MySQL's max_allowed_packet limit). It is enough for ordinary batch inserts, such as generating card numbers or random codes in bulk.
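
Note that splicing raw values straight into the SQL string is only safe when the data is trusted or already escaped. The same single-statement idea can also be written with PDO placeholders so the driver handles quoting; a sketch, assuming $rows is a list of associative arrays with identical keys and $pdo is an existing connection:

// Sketch: one multi-row INSERT built from placeholders (assumed table name and $pdo connection).
$rows = [
    ['username' => 'user1', 'password' => 'xxx'],
    ['username' => 'user2', 'password' => 'yyy'],
];

$columns   = array_keys($rows[0]);
$rowHolder = '(' . implode(',', array_fill(0, count($columns), '?')) . ')';
$sql = 'INSERT INTO tablename (' . implode(',', $columns) . ') VALUES '
     . implode(',', array_fill(0, count($rows), $rowHolder));

$params = [];
foreach ($rows as $row) {
    foreach ($columns as $col) {
        $params[] = $row[$col];
    }
}

$stmt = $pdo->prepare($sql); // $pdo: an existing PDO connection
$stmt->execute($params);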

Method 3: Using stored procedures

I have one of these on hand, so the SQL is provided below; combine it with your own business logic.

delimiter $$$
create procedure zqtest()
begin
 declare i int default 0;
 set i = 0;
 start transaction;
 while i < 80000 do
  -- your insert sql
  set i = i + 1;
 end while;
 commit;
end
$$$
delimiter ;
call zqtest();

This is just test code; define the specific parameters yourself.

Here I insert 80,000 records at a time. That is not a huge number, but each record is large, with many VARCHAR(4000) and TEXT columns, and it takes 6.524 seconds.
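
If you run this from PHP instead of the mysql command-line client, the DELIMITER switch is unnecessary, because the client API sends CREATE PROCEDURE to the server as a single statement. Below is a sketch of creating, calling and timing the procedure through PDO; the table tb_test, its columns and the insert inside the loop are assumptions for illustration:

// Sketch: create and call the stored procedure via PDO (assumed $pdo connection and tb_test table).
$pdo->exec('DROP PROCEDURE IF EXISTS zqtest');

$create = <<<'SQL'
CREATE PROCEDURE zqtest()
BEGIN
    DECLARE i INT DEFAULT 0;
    START TRANSACTION;
    WHILE i < 80000 DO
        INSERT INTO tb_test (code, remark) VALUES (UUID(), REPEAT('x', 100));
        SET i = i + 1;
    END WHILE;
    COMMIT;
END
SQL;
$pdo->exec($create);

$start = microtime(true);
$pdo->exec('CALL zqtest()');
printf("Inserted in %.3f seconds\n", microtime(true) - $start);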

Method 4: Use MySQL's LOAD DATA LOCAL INFILE

This is what I am currently using, so the PDO code is copied here for your reference.

// Create a PDO connection with MYSQL_ATTR_LOCAL_INFILE enabled
public function pdo_local_info()
{
    global $system_dbserver;
    // Fill in your own connection parameters (or read them from $system_dbserver)
    $dbname = 'your_dbname';
    $ip     = 'your_db_host';
    $user   = 'your_db_user';
    $pwd    = 'your_db_password';
    $dsn = 'mysql:dbname=' . $dbname . ';host=' . $ip . ';port=3306';
    $options = [PDO::MYSQL_ATTR_LOCAL_INFILE => true];
    $db = new PDO($dsn, $user, $pwd, $options);
    return $db;
}

// The pseudo code is as follows ($arr is one row keyed by column name,
// $order['OrderNo'] is only used to name the temporary data file)
public function test()
{
    $arr_keys = array_keys($arr);
    $root_dir = $_SERVER["DOCUMENT_ROOT"] . '/';
    $my_file  = $root_dir . "sql_cache/" . $order['OrderNo'] . '.sql';
    $fhandler = fopen($my_file, 'a+');
    if ($fhandler) {
        // Tab-separated values match LOAD DATA's default field terminator
        $line = implode("\t", $arr);
        $i = 1;
        while ($i <= 80000) { // write the same test row 80,000 times
            $i++;
            fwrite($fhandler, $line . "\n");
        }
        fclose($fhandler); // flush the file to disk before loading it

        $sql  = "LOAD DATA LOCAL INFILE '" . $my_file . "' INTO TABLE ";
        $sql .= "tablename (" . implode(',', $arr_keys) . ")";
        $pdo = $this->pdo_local_info();
        $res = $pdo->exec($sql);
        if (!$res) {
            // TODO: handle insertion failure
        }
        @unlink($my_file);
    }
}

The data here is also large, with many VARCHAR(4000) and TEXT columns.

Time taken: 2.160s
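
One thing the code above glosses over: if a value may contain a tab, newline or backslash, it must be escaped before being written to the file, otherwise LOAD DATA will split the columns incorrectly. With the default FIELDS ESCAPED BY '\' behaviour, a small helper along these lines works (the function name is my own):

// Escape one value for a tab-separated LOAD DATA file using the default
// ESCAPED BY '\' rules; \N is MySQL's marker for NULL on import.
function escape_load_data_value($value)
{
    if ($value === null) {
        return '\N';
    }
    return str_replace(
        ["\\",   "\t",  "\n",  "\r"],
        ["\\\\", "\\t", "\\n", "\\r"],
        (string) $value
    );
}

// Build one line of the data file from a row keyed by column name
$line = implode("\t", array_map('escape_load_data_value', $arr));

Also note that on recent MySQL versions the server-side local_infile system variable must be enabled as well, in addition to the client-side PDO::MYSQL_ATTR_LOCAL_INFILE option.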

The above meets the basic requirements: one million rows is not a big problem. Beyond that, if the data volume is really large, you may need to shard the database and tables, or feed the inserts through a queue.

Summary

The above is the full content of this article. I hope it provides some reference value for your study or work. Thank you for your support of 123WORDPRESS.COM.

