Preface

In backend development, paginated display is often needed to avoid loading too much data at once, which would incur excessive memory and disk I/O overhead. This is where MySQL's LIMIT keyword comes in. But do you think everything is fine once you paginate with LIMIT? Too young, too simple. When the data volume is large, one problem LIMIT is very likely to cause is deep paging.

Case

Here I take the display of e-commerce order details as an example. The table is created as follows:

```sql
CREATE TABLE `cps_user_order_detail` (
  `id` bigint(20) unsigned NOT NULL AUTO_INCREMENT COMMENT 'Primary key',
  `user_id` varchar(32) NOT NULL DEFAULT '' COMMENT 'User ID',
  `order_id` bigint(20) DEFAULT NULL COMMENT 'Order ID',
  `sku_id` bigint(20) unsigned NOT NULL COMMENT 'Product ID',
  `order_time` datetime DEFAULT NULL COMMENT 'Order time, format yyyy-MM-dd HH:mm:ss',
  PRIMARY KEY (`id`),
  KEY `idx_time_user` (`order_time`,`user_id`) USING BTREE
) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_bin COMMENT='User order details';
```

Then 1.2 million records were manually inserted into the table. Now let's look at the execution plan, as shown below:
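The execution plan screenshots from the original post are not reproduced here. As a rough sketch, the pair of statements whose plans were being compared would look something like this; the offset of 1,000,000 and the date filter are assumptions for illustration, chosen to match the query used later in the article:

```sql
-- Shallow paging: small offset, cheap to serve
SELECT * FROM cps_user_order_detail
WHERE order_time > '2020-8-5 00:00:00'
ORDER BY order_time
LIMIT 0, 6;

-- Deep paging: same statement, large offset (value assumed for illustration).
-- MySQL must read and discard the first 1,000,000 matching rows
-- before returning the 6 rows we actually want.
SELECT * FROM cps_user_order_detail
WHERE order_time > '2020-8-5 00:00:00'
ORDER BY order_time
LIMIT 1000000, 6;
```

Prefixing either statement with EXPLAIN shows how the optimizer's choices diverge as the offset grows.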
Looking at the plans above, the same statement produces very different execution plans just because the offsets differ (allow me to exaggerate a little). The root cause is how LIMIT works: with LIMIT offset, n, MySQL still has to read the first offset rows one by one (doing a back-to-table lookup for each of them when a secondary index is used) and then throw them away before returning the n rows you actually want, so the cost grows with the offset.

Optimization

Now that the cause has been analyzed, how can we optimize LIMIT deep paging in actual development? Here I give you two solutions.

The first is keyset (cursor-based) paging:

```sql
SELECT *
FROM cps_user_order_detail d
WHERE d.id > #{maxId} AND d.order_time > '2020-8-5 00:00:00'
ORDER BY d.order_time
LIMIT 6;
```

As the code above shows, this is still pagination, but with a maxId restriction. What does that mean? maxId is the largest primary key id on the previous page. The preconditions for using this method are therefore:

1) The primary key must be auto-incrementing, not a UUID; and besides the basic paging parameters pageNo and pageSize, the front end must also send back the largest id of the previous page on every request.
2) This method does not support jumping to an arbitrary page; you can only page forward and backward. (The following figure shows an actual page from a well-known e-commerce company.)

The second is optimization through the Elasticsearch search engine (based on the inverted index). E-commerce companies like Taobao basically put all their products into ES (it is impossible to fit such massive data into MySQL, and unrealistic to put it into Redis). But even with the ES search engine, deep paging problems can still occur. What then? The answer is the scroll cursor. We won't go into detail on that here; those who are interested can look into it.

Summary

I wrote this blog because I actually ran into this problem during development a while ago, and I did discuss it with the interviewer during my ByteDance interview. Knowing the limitations of LIMIT and how to optimize around them is a plus if you can bring it up in an interview.
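To make the keyset approach concrete, here is a minimal sketch of how two consecutive pages chain together. The maxId value of 1306 is invented for illustration, and the sketch orders by id, which assumes rows were inserted in order_time order so that id order matches it:

```sql
-- Page 1: no cursor yet, so a plain LIMIT is fine (the offset is 0)
SELECT * FROM cps_user_order_detail d
WHERE d.order_time > '2020-8-5 00:00:00'
ORDER BY d.id
LIMIT 6;

-- Page 2: the client sends back the largest id from page 1
-- (say 1306; value assumed for illustration). The primary key
-- index seeks straight to that position -- no rows are scanned
-- and discarded, no matter how deep the page is.
SELECT * FROM cps_user_order_detail d
WHERE d.id > 1306 AND d.order_time > '2020-8-5 00:00:00'
ORDER BY d.id
LIMIT 6;
```

This is why the method cannot jump to a random page: each page's query depends on a cursor value produced by the page before it.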
Don't assume that MySQL optimization is only about building indexes and tuning SQL (in real development, the effect of those two measures alone is often limited). After all, if MySQL optimization were that powerful, there wouldn't be so much middleware. This is the end of this article about MySQL deep paging (how to quickly paginate tens of millions of rows). For more on MySQL deep paging, please search previous articles on 123WORDPRESS.COM, and I hope you will support 123WORDPRESS.COM in the future!