Paginating Results with LIMIT and OFFSET in MySQL
Jul 05, 2025 02:41 AM

MySQL pagination is commonly implemented with LIMIT and OFFSET, but performance degrades on large data sets. 1. LIMIT controls the number of rows per page and OFFSET controls the starting position; the syntax is LIMIT N OFFSET M. 2. The performance problem comes from MySQL reading OFFSET + LIMIT rows and discarding the first OFFSET of them, which is inefficient for deep pages. 3. Optimizations include cursor-based pagination, index acceleration, and lazy loading. 4. Cursor pagination locates the start of the next page using a unique value from the last record of the previous page, avoiding OFFSET entirely; it suits "next page" navigation but not random jumps to an arbitrary page.
Pagination is a common requirement when querying large amounts of data in MySQL. For example, when a website displays a user list or an article list, usually only a fixed number of rows is shown per page. LIMIT and OFFSET are used to achieve this.

Basic usage: LIMIT controls the row count, OFFSET controls the starting position
MySQL's LIMIT limits the number of rows returned by a query, and OFFSET specifies how many rows to skip before returning results. The basic syntax is as follows:

SELECT * FROM table_name ORDER BY id LIMIT N OFFSET M;
- N is the number of rows to display per page;
- M is the number of rows to skip, i.e. (current page number - 1) * N
For example, to get the data for page 3 with 10 rows per page, write:
SELECT * FROM users ORDER BY id LIMIT 10 OFFSET 20;
Note: without a deterministic sort order (ideally including a unique column), rows may be duplicated or skipped between pages.
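A minimal sketch of how to make the order deterministic (assuming users has a created_at column and an auto-increment id; the column names are illustrative):

SELECT * FROM users
ORDER BY created_at, id -- id acts as a unique tiebreaker so page boundaries stay stable
LIMIT 10 OFFSET 20;

With the primary key appended to ORDER BY, two rows with the same created_at always sort the same way, so a row cannot drift between adjacent pages.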
Pagination performance problem: The bigger the OFFSET, the slower it is?
When the data volume is very large, using OFFSET can cause serious performance degradation, because MySQL actually reads OFFSET + LIMIT rows and then discards the first OFFSET rows.
For example:
SELECT * FROM orders ORDER BY created_at DESC LIMIT 10 OFFSET 100000;
Although this statement appears to fetch only 10 rows, the database actually has to read 100,010 rows and then throw away the first 100,000, which is very inefficient.
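One way to see this cost for yourself is MySQL's EXPLAIN (or EXPLAIN ANALYZE on MySQL 8.0.18+); the exact numbers depend on your table and indexes, but the amount of work examined to return only 10 rows is telling:

-- Estimated plan: note how many rows must be examined to return only 10
EXPLAIN SELECT * FROM orders ORDER BY created_at DESC LIMIT 10 OFFSET 100000;
-- MySQL 8.0.18+: actual execution statistics, including rows read per step
EXPLAIN ANALYZE SELECT * FROM orders ORDER BY created_at DESC LIMIT 10 OFFSET 100000;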
Optimization suggestions:
- If pagination is very deep (such as more than 10,000 pages in), consider "cursor-based pagination" instead of LIMIT/OFFSET.
- Use indexes to speed up sorting and filtering, especially on columns that are frequently used as pagination conditions (see the deferred-join sketch after this list).
- For scenarios that do not require precise page jumps, a lighter approach such as lazy loading (infinite scroll) works well.
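One common way to apply the "index acceleration" suggestion to deep offsets is a deferred join (late row lookup): page through only the indexed column first, then join back for the full rows. This is a sketch rather than something from the article itself, and it assumes orders is an InnoDB table with id as its primary key and an index on created_at (InnoDB secondary indexes implicitly include the primary key, so the inner query can be answered from the index alone):

SELECT o.*
FROM orders AS o
JOIN (
    -- The skip-and-discard still happens here, but only over narrow index entries
    SELECT id FROM orders ORDER BY created_at DESC LIMIT 10 OFFSET 100000
) AS page USING (id)
ORDER BY o.created_at DESC;

Skipping 100,000 index entries touches far less data than skipping 100,000 full rows, which is why this form is usually noticeably faster for deep pages.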
Cursor-based pagination: an efficient alternative to OFFSET
So-called "cursor pagination" (also known as keyset pagination) locates the starting point of the next page using a unique value (such as an auto-increment ID or a timestamp) from the last record of the previous page, thereby avoiding OFFSET entirely.
For example, if the id of the last record on the previous page is 12345, the next page can be written as:
SELECT * FROM users WHERE id > 12345 ORDER BY id LIMIT 10;
This method does not slow down as the page number grows, because the index can seek directly to the starting value instead of scanning and discarding rows.
Applicable premises:
- There must be an ordered and unique field (such as an auto-increment primary key, or a timestamp combined with a unique column).
- It is not suitable for random jumps (such as jumping directly to page 100); it fits "next page" style navigation (a sketch for non-unique sort columns follows this list).
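When the sort column alone is not unique (timestamps can collide, for example), a common pattern is to include the primary key in the cursor condition. A minimal sketch, assuming users is sorted by created_at with id as a tiebreaker; the literal values stand in for the last row of the previous page:

SELECT * FROM users
WHERE created_at > '2025-07-01 12:00:00'
   OR (created_at = '2025-07-01 12:00:00' AND id > 12345)
ORDER BY created_at, id
LIMIT 10;

An index on (created_at, id) lets MySQL seek close to the cursor position instead of scanning from the beginning.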
In general, implementing pagination with LIMIT and OFFSET is simple and common, but watch out for performance in large-data scenarios. If pages are deep or performance requirements are high, consider switching to cursor-based pagination or a similar approach.