## Overview

Node's `fs` module exposes a large API surface: it fully supports file system operations. The documentation is well organized, with the operations roughly divided into file operations, directory operations, file information, and streams, and each API is offered in synchronous, asynchronous (callback), and Promise styles. This article covers several topics that the documentation does not describe in detail, to help tie the fs documentation together.
## File Descriptors

A file descriptor is a non-negative integer: an index value the operating system uses to locate the corresponding file. Many low-level fs APIs take a file descriptor, which the documentation usually calls `fd`. For example, `fs.read(fd, buffer, offset, length, position, callback)` is the descriptor-based counterpart of the path-based `fs.readFile(path[, options], callback)`. Because the operating system limits the number of open file descriptors, don't forget to close the file once you are done with it:

```javascript
const fs = require("fs");

fs.open("./db.json", "r", (err, fd) => {
  if (err) throw err;
  // File operations...
  // After completing the operation, close the file
  fs.close(fd, err => {
    if (err) throw err;
  });
});
```

## Synchronous, Asynchronous and Promise

All file system APIs come in both synchronous and asynchronous forms.

### Synchronous style

Synchronous APIs are not recommended, as they block the thread.

```javascript
try {
  const buf = fs.readFileSync("./package.json");
  console.log(buf.toString("utf8"));
} catch (error) {
  console.log(error.message);
}
```

### Asynchronous style

The asynchronous (callback) style easily leads to callback hell.

```javascript
fs.readFile("./package.json", (err, data) => {
  if (err) throw err;
  console.log(data.toString("utf8"));
});
```

### (Recommended) Promise style

Before Node v12, you had to wrap the callback APIs in a Promise yourself:

```javascript
function readFilePromise(path, encoding = "utf8") {
  const promise = new Promise((resolve, reject) => {
    fs.readFile(path, (err, data) => {
      if (err) return reject(err);
      return resolve(data.toString(encoding));
    });
  });
  return promise;
}

readFilePromise("./package.json").then(res => console.log(res));
```

Node v12 introduced the fs Promise API. These methods return Promise objects instead of taking callbacks and are accessible via `require('fs').promises`, which cuts down on boilerplate.
```javascript
const fsPromises = require("fs").promises;

fsPromises
  .readFile("./package.json", { encoding: "utf8", flag: "r" })
  .then(console.log)
  .catch(console.error);
```

## Directories and Directory Entries

- `fs.Dir` class: encapsulates operations on file directories.
- `fs.Dirent` class: encapsulates operations on directory entries, for example determining the entry's type (character device, block device, FIFO, etc.).

Their relationship is shown in code:

```javascript
const fsPromises = require("fs").promises;

async function main() {
  const dir = await fsPromises.opendir(".");
  let dirent = null;
  while ((dirent = await dir.read()) !== null) {
    console.log(dirent.name);
  }
}

main();
```

## File Information

The `fs.Stats` class encapsulates file metadata. An instance is passed to the `fs.stat()` callback:

```javascript
fs.stat("./package.json", (err, stats) => {
  if (err) throw err;
  console.log(stats);
});
```

Note that for checking whether a file exists, the documentation recommends against calling `fs.stat()` before `fs.open()`, `fs.readFile()`, or `fs.writeFile()`: instead, perform the operation directly and handle the error raised if the file is not available.
## ReadStream and WriteStream

In Node.js, stream is a very important module; many libraries build their APIs on top of streams, including the fs `ReadStream` and `WriteStream` described below. fs itself provides `readFile` and `writeFile`, but their convenience comes at a performance cost: the entire content is loaded into memory at once. For large files of several GB this is clearly a problem, so the natural solution is to read them bit by bit, which is exactly what streams do. Taking `ReadStream` as an example:

```javascript
const rs = fs.createReadStream("./package.json");
let content = "";

rs.on("open", () => {
  console.log("start to read");
});

rs.on("data", chunk => {
  content += chunk.toString("utf8");
});

rs.on("close", () => {
  console.log("finish read, content is:\n", content);
});
```

With the help of stream's `pipe`, a large-file copy function can be written in one line:

```javascript
function copyBigFile(src, target) {
  fs.createReadStream(src).pipe(fs.createWriteStream(target));
}
```

That covers how to use the fs file system module in Node.js.