1. Create the tables

1.1 Create the test table t_user

```sql
CREATE TABLE `t_user` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `c_user_id` varchar(36) NOT NULL DEFAULT '' COMMENT 'User Id',
  `c_name` varchar(22) NOT NULL DEFAULT '' COMMENT 'User name',
  `c_province_id` int(11) NOT NULL COMMENT 'Province Id',
  `c_city_id` int(11) NOT NULL COMMENT 'City Id',
  `create_time` datetime NOT NULL COMMENT 'Creation time',
  PRIMARY KEY (`id`),
  KEY `idx_user_id` (`c_user_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4;
```

1.2 Create a temporary table

```sql
CREATE TABLE `tmp_table` (
  `id` int(11) NOT NULL,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
```

2. Generate the data

2.1 Generate a data file with 100 million records using Python (this takes a little while):

```shell
python -c "for i in range(1, 1+100000000): print(i)" > base.txt
```

2.2 Import the generated file into the temporary table tmp_table

Connect to the target database:

```
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> use test;
Database changed
mysql> show tables;
+----------------+
| Tables_in_test |
+----------------+
| student        |
| t_user         |
| tmp_table      |
+----------------+
3 rows in set (0.00 sec)
```

Execute the import command:

```
mysql> load data infile 'E:/base.txt' replace into table tmp_table;
ERROR 1290 (HY000): The MySQL server is running with the --secure-file-priv option so it cannot execute this statement
```

The import fails because secure_file_priv is not configured by default. This variable limits the effect of data import and export operations such as LOAD DATA, SELECT ... INTO OUTFILE, and the LOAD_FILE() function; these operations also require the user to hold the FILE privilege.
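The fix, shown in detail below, is to set secure_file_priv in the MySQL configuration file. A minimal sketch of the fragment for a Windows setup, assuming the data file lives on drive E: (the path is illustrative and must match your own environment):

```ini
# my.ini -- the setting belongs under the [mysqld] section
[mysqld]
secure_file_priv=E:\
```

After editing the file, the MySQL service must be restarted for the change to take effect.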
Solution: first check the current configuration with show variables like '%secure%';:

```
mysql> show variables like '%secure%';
+--------------------------+-------+
| Variable_name            | Value |
+--------------------------+-------+
| require_secure_transport | OFF   |
| secure_auth              | ON    |
| secure_file_priv         | NULL  |
+--------------------------+-------+
3 rows in set, 1 warning (0.00 sec)
```

The possible values mean:

- secure_file_priv=NULL: mysqld import and export are disabled entirely
- secure_file_priv=/var/lib/mysql-files/: mysqld import and export are restricted to the /var/lib/mysql-files/ directory
- secure_file_priv='' (empty string): mysqld import and export are not restricted

Note: the setting must be added under the [mysqld] node of the configuration file. Whether the path needs quotation marks can vary, so try both.

Restart MySQL and verify the configuration:

```
mysql> use test;
Database changed
mysql> show variables like '%secure%';
+--------------------------+-------+
| Variable_name            | Value |
+--------------------------+-------+
| require_secure_transport | OFF   |
| secure_auth              | ON    |
| secure_file_priv         | E:\   |
+--------------------------+-------+
3 rows in set, 1 warning (0.00 sec)
```

Then re-run the import:

```
mysql> load data infile 'E:/base.txt' replace into table tmp_table;
Query OK, 100000000 rows affected (3 min 53.42 sec)
Records: 100000000  Deleted: 0  Skipped: 0  Warnings: 0
```

100 million rows in 233.42 seconds, which is in line with the results others have reported.

3. Using the temporary table as the base data, insert rows into t_user

Inserting all 100 million rows takes close to half an hour (running the statement directly from the command line may be faster).
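The article does not show the actual insert statement for this step. A plausible sketch of an INSERT ... SELECT that uses the 100 million ids in tmp_table as the driving set (the uuid(), CONCAT, RAND(), and now() column values are illustrative choices, not the author's exact statement):

```sql
-- Hypothetical reconstruction: populate t_user from the ids in tmp_table.
-- The generated values are arbitrary but fit the column definitions
-- (uuid() is 36 chars for c_user_id, 'name' + id stays within varchar(22)).
INSERT INTO t_user (c_user_id, c_name, c_province_id, c_city_id, create_time)
SELECT uuid(), CONCAT('name', id), FLOOR(RAND() * 100), FLOOR(RAND() * 100), now()
FROM tmp_table;
```

Because every row initially gets now() as its creation time, the follow-up UPDATE below is what spreads the timestamps out.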
Update the create_time field so that the creation times of the inserted rows are more random:

```
mysql> UPDATE t_user SET create_time=date_add(create_time, interval FLOOR(1 + (RAND() * 7)) year);
Query OK, 100000000 rows affected (7 min 24.17 sec)
Rows matched: 100000000  Changed: 100000000  Warnings: 0

mysql> UPDATE t_user SET create_time=date_add(create_time, interval FLOOR(1 + (RAND() * 7)) year);
Query OK, 100000000 rows affected (8 min 2.49 sec)
Rows matched: 100000000  Changed: 100000000  Warnings: 0
```

At this point, the insertion of 100 million rows of test data is complete.

4. References

- How to quickly create tens of millions of test data in MySQL
- The MySQL server is running with the --secure-file-priv option