Detailed explanation of MySQL Limit performance optimization and paging data performance optimization

MySQL's LIMIT clause reads database data in segments and is used mainly for paging. Even though many sites today hold only thousands of rows, where small optimizations make little visible difference, development should still be pushed to the extreme in pursuit of the best possible performance. Below are some LIMIT performance optimization techniques.

Limit syntax:

SELECT * FROM table LIMIT [offset,] rows | rows OFFSET offset

The LIMIT clause can be used to force a SELECT statement to return a specified number of records. LIMIT accepts one or two numeric arguments, which must be non-negative integer constants.

If two arguments are given, the first argument specifies the offset of the first returned row, and the second argument specifies the maximum number of returned rows. The initial record row has an offset of 0 (not 1).

MySQL also supports the LIMIT row_count OFFSET offset form. Some usage examples:

mysql> SELECT * FROM table LIMIT 5,10; -- Retrieve rows 6-15
-- To retrieve all rows from a given offset to the end of the result set, MySQL does not accept -1;
-- instead, pass a very large number as the second parameter.
mysql> SELECT * FROM table LIMIT 95,18446744073709551615; -- Retrieve rows 96 to the last row
-- If only one parameter is given, it is the maximum number of rows to return; LIMIT n is equivalent to LIMIT 0,n.
mysql> SELECT * FROM table LIMIT 5; -- Retrieve the first 5 rows

limit n,m selects m records starting after the first n records (that is, beginning with record n+1). Most developers like to use this kind of statement to solve the classic paging problem on the Web. For small data sets this is not a big problem, but for applications such as forums that may hold very large amounts of data, limit n,m is very inefficient, because the skipped rows still have to be read on every query. Selecting the first five records is quick and easy, but selecting five records starting from the 800,000th row of a million-row table still requires scanning all the way to that position.

In other words, limit 10000,20 scans 10020 rows that match the conditions, discards the first 10000, and returns the last 20. That is where the problem lies: limit 100000,100 has to scan 100100 rows. In a highly concurrent application, a query that scans more than 100,000 rows every time will certainly drag performance down.
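A quick way to see this for yourself is to check the session handler counters before and after such a query. The sketch below uses a hypothetical posts table with an auto-increment primary key id; the counter names are standard MySQL status variables:

FLUSH STATUS;  -- reset the session status counters
SELECT * FROM posts ORDER BY id LIMIT 10000, 20;
SHOW SESSION STATUS LIKE 'Handler_read%';
-- Handler_read_next comes back at roughly 10020: all 10020 rows were read
-- in index order, and the first 10000 were simply thrown away.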

Comparison of data reading efficiency with different data volumes:

1. When the offset is small:

select * from table limit 5,10

After multiple runs, the time stayed between 0.0004 and 0.0005 seconds.

SELECT * FROM table WHERE id >= (
  SELECT id FROM table ORDER BY id LIMIT 10,1
) LIMIT 10

After multiple runs, the time stayed between 0.0005 and 0.0006 seconds. So when the offset is small, using LIMIT directly is more efficient!

2. When the offset is large:

select * from table limit 10000,10

After multiple runs, the time remained around 0.0187 seconds.

SELECT * FROM table WHERE id >= (
  SELECT id FROM table ORDER BY id LIMIT 10000,1
) LIMIT 10

After multiple runs, the time stayed at around 0.0061 seconds, only about 1/3 of the former. So when the offset is large, the latter (subquery) form is more efficient. This is the result when id is used as an index.

If id is used as the primary key of the data table:

select id from table limit 10000,10

The query takes about 0.04 seconds, because only the id primary key index has to be read.
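Building on that observation, a common variant (a sketch, not from the original article) is the "deferred join": page through the primary-key index first, then join back to the table to fetch only the 10 full rows. Using the same placeholder table name as above:

SELECT t.*
FROM table AS t
JOIN (
  SELECT id FROM table ORDER BY id LIMIT 10000, 10
) AS tmp ON t.id = tmp.id;

The offset is still scanned, but only over the narrow id index, and only 10 complete rows are ever fetched from the table.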

Limit performance optimization:

SELECT * FROM cyclopedia WHERE ID >= (
  SELECT MAX(ID) FROM (
    SELECT ID FROM cyclopedia ORDER BY ID LIMIT 90001
  ) AS tmp
) LIMIT 100;

SELECT * FROM cyclopedia WHERE ID >= (
  SELECT MAX(ID) FROM (
    SELECT ID FROM cyclopedia ORDER BY ID LIMIT 90000,1
  ) AS tmp
) LIMIT 100;

The second statement is faster when fetching the 100 records that follow the first 90,000. The first statement reads the first 90,001 records and takes the largest ID as the starting point, then uses it to quickly locate the next 100 records; the second statement reads only one record (the 90,001st) and uses its ID as the starting point to locate the next 100 records. The second statement can be shortened to:

SELECT * FROM cyclopedia WHERE ID >= (
  SELECT ID FROM (
    SELECT ID FROM cyclopedia ORDER BY ID LIMIT 90000,1
  ) AS tmp
) LIMIT 100;

The MAX() operation is omitted; the result is the same as long as IDs are assigned in increasing order.
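As a small safeguard, an explicit ORDER BY can be added to the outer query so the 100 rows are guaranteed to come back in ID order; this is just a sketch built on the same cyclopedia example, not part of the original statements:

SELECT * FROM cyclopedia
WHERE ID >= (
  SELECT ID FROM cyclopedia ORDER BY ID LIMIT 90000,1
)
ORDER BY ID
LIMIT 100;

Without the ORDER BY, the server is not formally required to return the rows in ID order, even though an InnoDB primary-key range scan usually does.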

Paging data performance optimization:

1. For tables with large amounts of data, you can build a separate index table containing only the primary key and the indexed fields; query that index table first to get the matching primary keys, and then fetch the full rows from the large table by primary key;

2. If there is a WHERE condition and you also want LIMIT to use an index, you must design a composite index: put the WHERE column(s) first and the primary key used for LIMIT second, and select only the primary key, so the query can be satisfied from the index alone. This speeds up reads (see the sketch after this list);

3. Use IN: first obtain the matching primary key values through the WHERE condition, then use those primary key values to query the full rows for the fields you need (also shown in the sketch after this list).
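A minimal sketch of points 2 and 3, assuming a hypothetical posts table with an auto-increment primary key id and a category column used in the WHERE condition:

-- Point 2: composite index with the WHERE column first and the primary key second.
ALTER TABLE posts ADD INDEX idx_category_id (category, id);

-- Select only the primary key, so the query can be answered from the index alone.
SELECT id FROM posts WHERE category = 3 ORDER BY id LIMIT 10000, 10;

-- Point 3: fetch the full rows by primary key with IN, using the ids returned above
-- (the literal values here are placeholders).
SELECT * FROM posts WHERE id IN (100011, 100012, 100013);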

Paging using a cursor:

To get the best query performance out of MySQL, I changed the paging query to a cursor-based query:

SELECT * FROM table WHERE id > last_id ORDER BY id ASC LIMIT 20;

Here last_id is the id of the last record on the current page; this implements the "next page" query, and the "previous page" query can be implemented in the same way (see the sketch below).

Cursor paging only works for sequentially ordered data and does not support jumping to an arbitrary page. You can add an auto-increment ID or another ordered field to the table to act as the cursor. For projects with very large amounts of data, page jumps are rarely useful anyway; filter conditions can be used instead to narrow down to the rows being searched for.
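A sketch of the "previous page" direction, assuming first_id is the id of the first record on the current page (a hypothetical variable mirroring last_id above): flip the comparison and the sort direction, then restore ascending order for display.

SELECT * FROM (
  SELECT * FROM table WHERE id < first_id ORDER BY id DESC LIMIT 20
) AS prev_page
ORDER BY id ASC;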

Summary

The above is the full content of this article. I hope it offers some reference value for your study or work. Thank you for your support of 123WORDPRESS.COM. If you want to learn more about this topic, please check out the links below.

You may also be interested in:
  • MySQL query optimization: LIMIT 1 avoids full table scan and improves query efficiency
  • Why does MySQL paging become slower and slower when using limit?
  • MySQL optimization query_cache_limit parameter description
  • Detailed explanation of the pitfalls of mixing MySQL order by and limit
  • Simple example of limit parameter of mysql paging
  • Reasons and optimization solutions for slow MySQL limit paging with large offsets
  • Mysql sorting and paging (order by & limit) and existing pitfalls
  • How to use MySQL limit and solve the problem of large paging
  • Detailed explanation of the problem of mixed use of limit and sum functions in MySQL
  • How to improve MySQL Limit query performance
  • A brief discussion on the implementation of MySQL's limit paging optimization solution
  • The impact of limit on query performance in MySQL
