I remember when I was interviewing for my current job, a leader told me earnestly: today is the world of the Internet, and competition among IT companies is fierce. If your page loads and renders even 0.1 seconds faster than a competitor's page, that is already a real achievement. At that point I didn't know how to continue this opening, so I asked the advisors in my chat group, and this article is what came of it... Okay, time to get back to the topic: how do we improve the loading speed of our pages, cut waiting time, and give users the best possible experience? To address this, today we will go through a series of front-end solutions, each of which can effectively improve how fast your pages load.

1. Reduce file requests to the server

A conventional HTTP request is a short connection that follows the pattern "request, response, disconnect". For each independent resource we send a GET request to the server and then wait for it to send back the file we need, so every resource request costs a "connect, wait, receive" cycle (of course, keeping the connection alive with HTTP keep-alive can reduce the number of connections and the time spent on them). If we can effectively cut the number of file requests to the server, we save some page waiting time and lighten the load on the server. To do that, we can:

1. Use CSS sprites to merge multiple images into a single image file, then use background-position to show the right part of the sprite wherever it is needed (I believe this is the first technique that comes to mind);

2. Merge multiple CSS files into a single stylesheet and multiple scripts into a single script, then reference the merged files in the page. You can use r.js to help with this, but I personally don't recommend it, because once the files are merged, the styles and scripts shared between multiple pages can no longer be cached separately on the client;

3. Embed images with base64 encoding, just like the GitHub 404 page does. There are tools that will convert an image into a base64-encoded data URI for you, but it is generally best to do this only on pages users rarely visit: although the image no longer needs to be fetched from the server, it cannot be cached on the client, so it is re-downloaded and re-rendered with the page on every visit. On top of that, the long data URI takes up a lot of space in your page source, which makes the page painful to maintain;

4. Write small CSS and JS snippets directly in the page instead of pulling in separate style and script files. I know some of you are used to the rule of keeping structure (markup), presentation (style), and behavior (script) separate, and may object to this. All I can say is that standards are not dogma, and what suits your project is what matters. For small, low-reuse styles and scripts, the advantages of inlining still outweigh the disadvantages (a slightly larger page and slightly harder maintenance). Besides, if you look at the page source of any major portal, practically none of them load every style and script as an external file (whether or not they do it to reduce server requests, that is simply the reality);
5. Use the http-equiv="expires" meta tag to set a point in the future as the page's expiration time; before that time the browser will take the page straight from its cache. However, this approach is too rigid (even if the server changes the expiration time promptly, the client may not go back to the server for fresh resources according to the new rule), so it is generally not recommended.

2. Reduce file size

Large files (especially images) take a long time to download and are often the number one enemy of a pleasant loading experience, so keeping requested files as small as possible matters a lot. What we can do:

1. Compress style and script files. You can use gulp or grunt for this; both can minify CSS/JS files (and obfuscate variable and function names in JS).

2. Choose the image format deliberately. If you do not need a transparent background, use GIF only for images with a single color and no gradients. For JPG images, tune the "quality" setting on export to match how sharp the image actually needs to be. If you like to try new things, you can use the WebP format the way Taobao does, which gives a smaller file at the same image quality.

3. Use an icon font such as Font Awesome to replace the icons on the page. The idea is to use @font-face to have the user download a very small UI font package and render icons as characters, which removes image requests and shrinks the size of the icon assets.

3. Use CDN appropriately

Using a CDN has several benefits: if the user has already downloaded the same CDN resource on another site, our site can take it straight from the cache; it reduces the number of file requests hitting our own server (when an external CDN is used), which lightens the server load; and an extra domain raises the number of resources the browser is allowed to download in parallel. For example, if a site requests all of its resources from a single domain, Firefox only downloads at most 2 files from it at the same time; if some resources come from an external CDN, Firefox can download 2 resources from the current domain and another 2 from the CDN domain in parallel.

However, CDNs have one big drawback: they add DNS resolution overhead. If a page pulls resources from several different CDNs at once, it can get stuck in a long wait for DNS resolution, and the cost outweighs the gain. The usual recommendation is to serve all the external resources a site needs from one reliable, fast CDN; in other words, having a page request resources from just two domains (the site's own domain and the CDN domain) is considered the best option (this conclusion is said to come from Yahoo's front-end engineers).
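To make the "two domains at most" advice concrete, here is a minimal sketch of how a page head might be organized; the CDN domain, paths, and library file are placeholders, and the dns-prefetch hint is my own addition (it is not mentioned above) to soften the DNS lookup cost:

```html
<head>
  <!-- Optional extra (not from the original text): warm up the DNS lookup
       for the single CDN domain as early as possible -->
  <link rel="dns-prefetch" href="//cdn.example.com">

  <!-- Site-owned styles stay on the site's own domain -->
  <link rel="stylesheet" href="/css/site.css">

  <!-- Shared libraries all come from that one CDN domain (placeholder URL) -->
  <script src="//cdn.example.com/js/jquery.min.js"></script>
</head>
```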
4. Delay requests and load scripts asynchronously

In all major browsers, script files are normally downloaded asynchronously, just like other resources. The problem comes next: in Firefox, for example, a downloaded script is executed straight away, and while the browser is downloading and executing a script its other work is blocked, so the remaining resources on the page cannot be requested or downloaded. If the JS on your page takes a long time to execute, the user will clearly feel the page stall. There is a simple way to relieve this: place your script tags just before the </body> closing tag, so the scripts become the last resources the page requests and no longer block the requests for everything else.

In addition, although scripts are downloaded asynchronously like other resources, asynchronous download does not mean asynchronous execution. To strictly preserve logical order and dependencies, the browser executes scripts in the order in which they were requested. And here lies the problem: if the scripts on a page have few or no dependencies on one another, this rule only adds to the time the page spends blocked (like paying a lot of money for an insurance policy and then nothing happening during the whole term... okay, that metaphor is a bit off...). The fix is to make scripts execute asynchronously without blocking, for example by adding the defer or async attribute to the script tag or by injecting the script dynamically (see here), but none of these is ideal: they either have compatibility issues or are too cumbersome and cannot handle dependencies. I personally recommend requireJS (the AMD specification) or seaJS (the CMD specification) to load scripts asynchronously and manage module dependencies. The former "declares dependencies up front" (it preloads all dependent modules, for the fastest execution), while the latter "declares dependencies where they are needed" (it lazy-loads dependent modules, which is the more economical way to request scripts). Choose whichever fits the needs of your project.

5. Delay requesting files outside the first screen

First, a definition: the "first screen" is the content area visible when the page initializes, that is, what the user sees before scrolling. Sites like JD.com and Taobao apply this kind of lazy loading to images that only appear after scrolling. It is nothing more than the proxy pattern at work, but it does give users the illusion that the page loads faster, because the content on screen shows up quickly (even if the files further down the page have not actually loaded yet, as long as I have not pulled down the scroll bar). We can implement this ourselves without relying on any lazyload library. Taking an image as an example (say its real address is a.jpg), we can write the img tag of an image outside the first screen like this:
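The original snippet is not reproduced here, so the following is a minimal sketch of the tag being described: a tiny base64-encoded placeholder (a 1x1 transparent GIF) in src, with the real path kept in data-src:

```html
<!-- Placeholder in src (you could use loading.gif instead), real image in data-src -->
<img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
     data-src="a.jpg" alt="">
```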
As shown above, when the page first loads this image, it uses base64 (of course, you could use a placeholder image such as loading.gif instead) to display a very small stand-in image immediately, while the real path of the image sits in the data-src attribute. Once the page has loaded, we can request the real file from the server and swap it in:
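A minimal sketch of that swap, using plain DOM APIs (the original snippet is not shown here): wait for the load event, then copy each data-src into src:

```js
// After the first-screen resources have loaded, swap every data-src into src
window.addEventListener('load', function () {
  var imgs = document.querySelectorAll('img[data-src]');
  for (var i = 0; i < imgs.length; i++) {
    imgs[i].src = imgs[i].getAttribute('data-src');
    imgs[i].removeAttribute('data-src');
  }
});
```

A fuller implementation would usually also watch the scroll position and only swap in images as they approach the viewport; the version above simply defers them until the first-screen resources have finished loading.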
That is the delayed-loading flow for images; the same principle can be used to delay loading video and audio files, which effectively reduces the waiting time during page initialization.

6. Optimize the order of page module placement

Here is a good example. Suppose a page looks like this: on the left is a sidebar holding the user's avatar, profile information, and the site's advertisements, and on the right is the article content area. Our code might then look like this:
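The original markup is not shown here, but a minimal sketch of the structure being described (sidebar written before the article in the source; class names are made up for illustration) might be:

```html
<div class="container">
  <!-- The sidebar comes first in the source, so it is parsed and rendered first -->
  <div class="sidebar">
    <!-- avatar, user info, ads ... -->
  </div>
  <!-- The article content, which the user actually came for, comes last -->
  <div class="article">
    <!-- article content ... -->
  </div>
</div>
```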
Since the browser's UI thread is single-threaded and works through the document from top to bottom, it loads and renders the sidebar first and only then gets to our article... Obviously, that is not a user-friendly loading order. We need to figure out which module on the page matters most to the user and should be displayed first. In this example, the article content is what users come to see, and it is what the browser should request and display first. So we modify the code to:
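Again, the original snippet is missing, so here is one possible sketch under the same assumptions: the article now comes first in the source order, and CSS keeps the sidebar on the left visually:

```html
<div class="container">
  <!-- The article now comes first in the source, so it is requested and rendered first -->
  <div class="article">
    <!-- article content ... -->
  </div>
  <!-- The sidebar follows in the source; CSS (e.g. floating .article right and
       .sidebar left, or absolute positioning) can keep it on the left visually -->
  <div class="sidebar">
    <!-- avatar, user info, ads ... -->
  </div>
</div>
```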
Of course, this is just a small example to spark your imagination; the real skill is drawing inferences from it and applying the idea to your own pages.

7. Other suggestions
I believe the ways to speed up a page are certainly not limited to the ones mentioned above, and every project has its own needs and circumstances, so you can only pick the solutions that suit you. But the most important thing is to put the user experience first: whether it is page loading or interaction, think through and design the best solution from the user's point of view. Do that, and I believe you will create outstanding and well-loved work. That's all for today. Let's keep encouraging each other~