I have a directory containing many very large files; its total size is around 285G, and even `ls -ltrh` takes a long time just to list the files. I want to delete the entire contents of that directory faster. I have tried the approach below, and it takes around 45 minutes to clear the files and the directory. Is there a faster way to do this?
[loguser@npdlogmt01 DEVW]$ du -sh 2021-03-26_TEST
285G    2021-03-26_TEST
[loguser@npmt01 DEV]$ cat Delete_Find_test_v10.out
+ date
Sun Apr 11 11:20:43 UTC 2021
+ find /op_reqs_logs/OPC/DEV/2021-03-26_TEST/ONLINE/V10 -type f -iname '*txt' -delete
+ date
Sun Apr 11 11:20:44 UTC 2021
+ find /op_reqs_logs/OPC/DEV/2021-03-26_TEST/BATCH/V10 -type f -iname '*txt' -delete
+ date
Sun Apr 11 12:03:55 UTC 2021
+ exit 0
rm -rf 2021-03-26_TEST
Rename the directory out of the way, create a fresh empty directory under the original name, and then `rm -rf` the old directory, optionally in the background. It certainly allows you to start using the new directory before all the old files have been deleted, which minimises downtime. This will also ensure that the newly created directory is of minimal size, which can improve performance when using the directory.
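The rename-then-delete idea above can be sketched as follows. The directory name `2021-03-26_TEST` is taken from the question; the `.to_delete` suffix and the small demo files are just illustrative choices:

```shell
#!/bin/sh
# Demo setup: a throwaway directory standing in for the huge one.
mkdir -p 2021-03-26_TEST
touch 2021-03-26_TEST/a.txt 2021-03-26_TEST/b.txt

# 1. Rename the full directory out of the way. This is near-instant
#    (it only rewrites directory entries), as long as the new name
#    is on the same filesystem.
mv 2021-03-26_TEST 2021-03-26_TEST.to_delete

# 2. Recreate an empty directory under the original name so that
#    applications can start using it immediately.
mkdir 2021-03-26_TEST

# 3. Delete the old tree in the background; the slow part no longer
#    blocks anything that uses the directory.
rm -rf 2021-03-26_TEST.to_delete &

# The wait is only needed in this demo so we can check the result;
# in real use you would just let the background rm finish on its own.
wait
```

After step 2 the directory is immediately usable again, and the 45-minute deletion happens off the critical path.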