This article tests the efficiency of deleting a large number of files under Linux. First, create 500,000 files.
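The article does not show how the test files were generated; a minimal sketch, assuming a hypothetical ~/test directory of small text files, might look like this:

$ mkdir -p ~/test && cd ~/test
$ for i in $(seq 1 500000); do echo testing > test_$i.txt; done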
1. rm delete

rm does not work because of the sheer number of files.
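A sketch of what that failure typically looks like, assuming the ~/test directory from above; with hundreds of thousands of names the shell glob exceeds the kernel's argument-length limit (the exact message depends on the shell):

$ cd ~/test
$ rm *
bash: /bin/rm: Argument list too long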
2. Find and delete

About 43 minutes on my machine... I went and watched a video while it was deleting.
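The article does not print the exact command; the usual "find and delete" form, which forks one rm process per file and is therefore slow, would look something like this (the path is an assumption):

$ time find ~/test -type f -exec rm {} \;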
3. find with delete

It takes 9 minutes.
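Here find's built-in -delete action removes the files itself instead of spawning an external rm for each one, which is where the speedup comes from. A sketch, again assuming the ~/test directory:

$ time find ~/test -type f -delete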
4. rsync delete

Very good and powerful.
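The article does not show the rsync invocation; the usual trick is to sync an empty directory over the full one with --delete, so rsync removes everything that is not in the source. A sketch, assuming a hypothetical empty ~/blanktest directory:

$ mkdir -p ~/blanktest
$ time rsync -a --delete ~/blanktest/ ~/test/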
5. Python Delete

import os
import timeit

def main():
    # Walk the whole tree and remove every file it contains
    for pathname, dirnames, filenames in os.walk('/home/username/test'):
        for filename in filenames:
            file = os.path.join(pathname, filename)
            os.remove(file)

if __name__ == '__main__':
    # Time a single run of main() (Python 2 print statement)
    t = timeit.Timer('main()', 'from __main__ import main')
    print t.timeit(1)

$ python test.py
529.309022903

It takes about 9 minutes.

6. Perl Delete

This should be the fastest.
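The Perl one-liner itself is not shown in the article; a minimal sketch of the usual approach, assuming the same ~/test directory, unlinks every name the glob returns:

$ cd ~/test
$ time perl -e 'unlink($_) for glob("*")'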
7. Results:

rm: fails outright with this many files.
Find and delete: about 43 minutes.
find with delete: about 9 minutes.
Python: about 9 minutes (529 seconds).
Perl: expected to be the fastest, not timed here.
rsync: fast and convenient.

Conclusion: rsync is the fastest and most convenient way to delete a large number of small files. That is the full content of this article. I hope it is helpful for everyone's study, and I hope everyone will support 123WORDPRESS.COM.