Timeline for "How to delete all duplicate hardlinks to a file?"
Current License: CC BY-SA 3.0
4 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Jun 1, 2016 at 2:04 | comment added | Lee-Man | | Okay, that was not very clear. :) But most backup programs (even tar) understand hard links and don't waste space. But I'm glad you solved your problem. |
| May 31, 2016 at 22:04 | comment added | n.st | | Thanks for the explanation, but I'm already familiar with how hardlinks work and that they don't consume any noteworthy amount of disk space — like I said, I just wanted to prune my old rsnapshot directory to only keep a single link to each inode, so I won't end up looking at the same file twice when I go through the data and sort it into my new archive. |
| May 31, 2016 at 21:54 | comment added | Gilles 'SO- stop being evil' | | This is all true, but it doesn't answer the question. I think you missed the point: this isn't about disk space, it's about processing the files without processing the same file twice under different names. |
| May 31, 2016 at 18:39 | answered | Lee-Man | CC BY-SA 3.0 | |
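The pruning that n.st describes in the comments — keeping exactly one link per inode so the same file is never visited twice — can be sketched in Python. This is only an illustrative sketch, not the answer from the thread; the helper name `prune_duplicate_hardlinks` is made up for this example. It identifies duplicate hardlinks by their `(st_dev, st_ino)` pair, since inode numbers are only unique within one device:

```python
import os
import tempfile

def prune_duplicate_hardlinks(root):
    """Delete every path under `root` whose (device, inode) pair was already
    seen, so exactly one link per inode survives.
    Returns the list of removed paths."""
    seen = set()
    removed = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in sorted(filenames):
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            key = (st.st_dev, st.st_ino)  # inode numbers are only unique per device
            if key in seen:
                # Removing a path only drops one link; the data stays
                # reachable through the link we kept earlier.
                os.remove(path)
                removed.append(path)
            else:
                seen.add(key)
    return removed

# Demo: two hardlinks to the same inode plus one unrelated file.
with tempfile.TemporaryDirectory() as d:
    a = os.path.join(d, "a.txt")
    with open(a, "w") as f:
        f.write("data")
    os.link(a, os.path.join(d, "b.txt"))  # second link to a.txt's inode
    with open(os.path.join(d, "c.txt"), "w") as f:
        f.write("other")
    removed = prune_duplicate_hardlinks(d)
    remaining = sorted(os.listdir(d))  # -> ["a.txt", "c.txt"]
```

Which link survives depends on traversal order (here, alphabetical within each directory); for an rsnapshot tree you would typically run this over a single snapshot directory rather than the whole backup root, since the cross-snapshot hardlinks are exactly what rsnapshot uses to save space.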