Preface

In daily study and work, we often run into situations that require importing data: data migration, data recovery, building a new slave database, and so on. These operations can involve importing a very large amount of data, and sometimes the import crawls along while the computer fan spins wildly, which is genuinely frustrating. In fact, a few tricks can make the import much faster. In this article, the author explains how to import data quickly.

Note: this article only discusses how to quickly import SQL scripts generated by logical backups; other file formats are not covered here.

1. Try to reduce the size of the imported file

First, a piece of advice: when exporting and importing data, try to use the command-line tools that ship with MySQL, and avoid graphical tools such as Navicat and Workbench. Especially with large amounts of data, exporting and importing with MySQL's built-in command-line tools is several times faster than using a graphical tool, and tools like Navicat can easily hang when working on large data sets. Here is a brief overview of how to export and import with MySQL's own command-line tools:

# Export the entire instance
mysqldump -uroot -pxxxxxx --all-databases > all_database.sql

# Export a specified database
mysqldump -uroot -pxxxxxx --databases testdb > testdb.sql

# Export a specified table
mysqldump -uroot -pxxxxxx testdb test_tb > test_tb.sql

# Import a SQL file (into the testdb database)
mysql -uroot -pxxxxxx testdb < testdb.sql

Most exported SQL scripts create the databases and tables first and then insert the data, and inserting the data is by far the most time-consuming part. To reduce the file size, it is recommended to use extended inserts, that is, batching many rows into one statement, similar to: insert into table_name values (),(),(),...,();. Using extended inserts produces a much smaller file and inserts several times faster than inserting rows one by one. Files exported with mysqldump use extended inserts by default; when exporting, you can pass the --skip-extended-insert option to switch to one insert statement per row.

Below, a table with 10 million rows is used to compare the import speed of files exported in the two ways. In this test, the script that used extended inserts took about 10 minutes to import, while the one-statement-per-row script, a text file of more than 2 GB, had still not finished after more than an hour, so the author lost patience and cancelled it manually. Even so, it is clear that batching multiple rows per insert saves several times as much time as inserting rows one at a time.

2. Try to modify parameters to speed up the import

MySQL has a famous pair of "double one" parameters, namely innodb_flush_log_at_trx_commit and sync_binlog. For safety reasons, both default to 1. To import a script quickly, we can temporarily relax these two parameters. Briefly:

innodb_flush_log_at_trx_commit: 1 means the redo log is written and flushed to disk at every transaction commit (safest); 2 means it is written at every commit but flushed to disk only about once per second; 0 means it is written and flushed roughly once per second.

sync_binlog: 1 means the binlog is synced to disk at every commit (safest); a value N greater than 1 means it is synced once every N commit groups; 0 leaves the flush timing to the operating system.
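If you want to confirm the current settings before changing anything, a quick check from the MySQL command line looks like this (standard SHOW VARIABLES statements; nothing is assumed beyond a connected client):

show variables like 'innodb_flush_log_at_trx_commit';
show variables like 'sync_binlog';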
These two parameters can be modified online. If you want to import quickly, you can follow the steps below:

# 1. In the MySQL command line, temporarily relax the two parameters
set global innodb_flush_log_at_trx_commit = 2;
set global sync_binlog = 2000;

# 2. Execute the SQL script to import (run from the shell)
mysql -uroot -pxxxxxx testdb < testdb.sql

# 3. After the import is complete, change the parameters back
set global innodb_flush_log_at_trx_commit = 1;
set global sync_binlog = 1;

There is another scenario: you are building a new slave database, or you simply do not need the import to generate binlog. In that case, you can temporarily stop recording binlog for the import by adding set sql_log_bin=0; at the beginning of the script and then running the import, which speeds things up further. If binlog is not enabled on your MySQL instance, this statement is unnecessary.

Summary

This article mainly introduced methods for importing data quickly. There are certainly other ways, such as LOAD DATA or writing a program that inserts with multiple threads; the methods described here are only meant for manually importing SQL scripts. To summarize the methods covered in this article:

Use MySQL's own command-line tools (mysqldump / mysql) rather than graphical clients for large exports and imports.
Use extended (multi-row) inserts, which mysqldump produces by default, instead of one insert statement per row.
Temporarily set innodb_flush_log_at_trx_commit = 2 and sync_binlog = 2000 for the import, then restore them to 1 afterwards.
If binlog is not needed for the import, add set sql_log_bin=0; at the top of the script.
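Putting the pieces together, a minimal end-to-end sketch might look like the following. It is only an illustration under a few assumptions: a GNU/Linux shell with GNU sed, a privileged MySQL account, and a dump file named testdb.sql; adjust credentials, paths, and names to your environment.

# 1. Temporarily relax the "double one" parameters
mysql -uroot -pxxxxxx -e "set global innodb_flush_log_at_trx_commit = 2; set global sync_binlog = 2000;"

# 2. Optionally skip binlog for this import by prepending the statement to the script (GNU sed)
sed -i '1i set sql_log_bin=0;' testdb.sql

# 3. Run the import
mysql -uroot -pxxxxxx testdb < testdb.sql

# 4. Restore the safe defaults
mysql -uroot -pxxxxxx -e "set global innodb_flush_log_at_trx_commit = 1; set global sync_binlog = 1;"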
In fact, there are some other possible approaches, such as creating the table without its indexes first and adding the indexes after the data is inserted, or switching the table to the MyISAM or MEMORY engine first and changing it back to InnoDB after the import is complete (a rough, untested sketch of the index approach is given at the end of this article). However, these two methods are more troublesome to apply and their actual benefit is unverified. The methods above are just the author's summary based on personal experience and may not be comprehensive; additions are welcome.

The above is the details of how to quickly import data into MySQL. For more information about importing data into MySQL, please pay attention to other related articles on 123WORDPRESS.COM!
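For reference only, here is a rough sketch of the drop-index-then-rebuild idea mentioned above. The table, column, and index names (test_tb, name, idx_name) are hypothetical, and the author did not benchmark this approach, so treat it as an untested idea rather than a recommendation.

-- Hypothetical table/index names; adjust to your schema
-- 1. Drop the secondary index before the bulk insert
alter table test_tb drop index idx_name;

-- 2. Import the data, e.g. from the shell: mysql -uroot -pxxxxxx testdb < test_tb.sql

-- 3. Recreate the index once the data is in place
alter table test_tb add index idx_name (name);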