Timeline for "Get list of subdirectories which contain a file whose name contains a string"
Current License: CC BY-SA 4.0
6 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Jan 11, 2024 at 15:01 | comment added | mgutt | | The different depths are (hopefully) performance tweaks to avoid traversal into unnecessary (sub)directories. For example, "depth 2" targets the parent directory of the files directly, without going deeper. Regarding `uniq`: yes, that should work, as `find` does not return the files in a random order. |
| Jan 11, 2024 at 14:23 | comment added | CervEd | | I wouldn't be surprised if this solution is slower. One of the main reasons for this approach was not having to wait for the entire tree traversal required by `sort`. I would probably opt for one of the top-rated answers if I were trying to do this today, but probably just `find / -name '*f*' -printf "%h\n" \| uniq`. Sorting shouldn't be necessary. |
| Jan 11, 2024 at 14:18 | comment added | CervEd | | @mgutt why the different depth options? |
| Jan 11, 2024 at 14:07 | comment added | mgutt | | Nice idea, but sadly slower. I compared `time find "$src_path" -mindepth 2 -maxdepth 2 -type d -path "$src_path/4???_?/archive" -print0 \| xargs -0 -I{} find {} -maxdepth 1 -type f -printf "%h\n" -quit` (1.5s) with `time find "$src_path" -mindepth 3 -maxdepth 3 -type f -path "$src_path/4???_?/archive/*" -printf "%h\0" \| sort -zu` (0.5s). In my case I have ~1000 dirs without any files and ~20 dirs with files. |
| Apr 25, 2021 at 22:25 | review (First posts) | | | completed Apr 28, 2021 at 8:15 |
| Apr 25, 2021 at 22:17 | answered | CervEd | CC BY-SA 4.0 | |
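
The commands compared in the Jan 11, 2024 comments above can be sketched as follows. This is a minimal sketch, not the answer's canonical solution: `src_path` and the `4???_?/archive` layout come from mgutt's comment, the `/path/to/data` value is a hypothetical placeholder, GNU `find` is assumed for `-printf`, and the quoted timings are mgutt's, not reproduced here.

```bash
#!/usr/bin/env bash
# Sketch of the approaches discussed in the comments above.
# Assumes a tree laid out as $src_path/4???_?/archive/<files> (mgutt's setup).
src_path=/path/to/data   # hypothetical example path

# Approach 1 (the answer's idea): locate candidate directories first, then stop
# at the first file in each via -quit, so directories full of files are not fully scanned.
find "$src_path" -mindepth 2 -maxdepth 2 -type d -path "$src_path/4???_?/archive" -print0 |
  xargs -0 -I{} find {} -maxdepth 1 -type f -printf '%h\n' -quit

# Approach 2 (mgutt's comparison): match the files directly and deduplicate their
# parent directories; faster in their case because ~1000 dirs contain no files at all.
# (tr is added here only to make the NUL-separated output readable.)
find "$src_path" -mindepth 3 -maxdepth 3 -type f -path "$src_path/4???_?/archive/*" -printf '%h\0' |
  sort -zu | tr '\0' '\n'

# CervEd's later suggestion for the general question (any file whose name contains "f"):
# print each match's parent directory and collapse consecutive duplicates.
find / -name '*f*' -printf '%h\n' | uniq
```

Note that `uniq` only collapses adjacent duplicates, which is sufficient here because `find` emits a directory's files consecutively during traversal, as mgutt's first comment points out.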