## Overview

Node's fs module exposes a large API surface — after all, it fully supports file system operations. The documentation is well organized, dividing the operations into file operations, directory operations, file information, and streams, and supporting synchronous, asynchronous (callback), and Promise-based programming styles. This article records several issues that the documentation does not describe in detail, to help tie the fs documentation together.
## File Descriptors

A file descriptor is a non-negative integer: an index value the operating system uses to find the corresponding file. Many low-level fs APIs require a file descriptor, conventionally written `fd` in the documentation. For example, `fs.read(fd, buffer, offset, length, position, callback)` is the descriptor-based counterpart of the path-based `fs.readFile(path[, options], callback)`.

Because the operating system limits the number of open file descriptors, don't forget to close the file after completing the file operation:

```js
const fs = require("fs");

fs.open("./db.json", "r", (err, fd) => {
  if (err) throw err;
  // File operations...
  // After completing the operation, close the file
  fs.close(fd, err => {
    if (err) throw err;
  });
});
```

## Synchronous, Asynchronous, and Promise

All file system APIs have both synchronous and asynchronous forms.

### Synchronous style

Synchronous APIs are not recommended, as they block the thread:

```js
try {
  const buf = fs.readFileSync("./package.json");
  console.log(buf.toString("utf8"));
} catch (error) {
  console.log(error.message);
}
```

### Asynchronous style

The asynchronous (callback) style easily falls into callback hell:

```js
fs.readFile("./package.json", (err, data) => {
  if (err) throw err;
  console.log(data.toString("utf8"));
});
```

### Promise style (recommended)

Before Node v12, you had to wrap the callback API in a Promise yourself:

```js
function readFilePromise(path, encoding = "utf8") {
  const promise = new Promise((resolve, reject) => {
    fs.readFile(path, (err, data) => {
      if (err) return reject(err);
      return resolve(data.toString(encoding));
    });
  });
  return promise;
}

readFilePromise("./package.json").then(res => console.log(res));
```

Node v12 introduced the fs Promise API, whose methods return Promise objects instead of using callbacks. It is accessible via `require("fs").promises`, which reduces development cost.
```js
const fsPromises = require("fs").promises;

fsPromises
  .readFile("./package.json", { encoding: "utf8", flag: "r" })
  .then(console.log)
  .catch(console.error);
```

## Directories and Directory Entries

- `fs.Dir` class: encapsulates operations related to file directories.
- `fs.Dirent` class: encapsulates operations related to directory entries, for example determining the entry type (character device, block device, FIFO, etc.).

The relationship between them is shown in code:

```js
const fsPromises = require("fs").promises;

async function main() {
  const dir = await fsPromises.opendir(".");
  let dirent = null;
  while ((dirent = await dir.read()) !== null) {
    console.log(dirent.name);
  }
}

main();
```

## File Information

The `fs.Stats` class encapsulates operations related to file information. An instance is passed to the `fs.stat()` callback:

```js
fs.stat("./package.json", (err, stats) => {
  if (err) throw err;
  console.log(stats);
});
```

Note, about checking whether a file exists:
## ReadStream and WriteStream

In Node.js, stream is a very important module, and many libraries' APIs are built on top of streams — including the fs `ReadStream` and `WriteStream` described below. fs itself provides `readFile` and `writeFile`, but their convenience comes at a performance price: the entire content is loaded into memory at once. For large files of several gigabytes, this is obviously a problem, so the solution for large files is naturally to read them bit by bit — which requires streams.

Taking `ReadStream` as an example:

```js
const rs = fs.createReadStream("./package.json");
let content = "";

rs.on("open", () => {
  console.log("start to read");
});

rs.on("data", chunk => {
  content += chunk.toString("utf8");
});

rs.on("close", () => {
  console.log("finish read, content is:\n", content);
});
```

With the help of stream's `pipe`, a large-file copy function can be written in one line:

```js
function copyBigFile(src, target) {
  fs.createReadStream(src).pipe(fs.createWriteStream(target));
}
```