Timeline for Is there an easy way to replace duplicate files with hardlinks?
Current License: CC BY-SA 2.5
7 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Feb 1, 2016 at 16:56 | comment added | Charles Duffy | | Checksumming every single file, rather than only files that share a size with at least one other file, is unnecessarily inefficient (and unnecessarily prone to hash collisions). (A size-first sketch follows this timeline.) |
| Jun 26, 2015 at 6:59 | comment added | phunehehe | | @oligofren I was thinking the same, but then I hit [Errno 31] Too many links. This script seems to be the only thing that handles that. (A sketch of handling that error follows this timeline.) |
| Jan 3, 2015 at 13:42 | comment added | oligofren | | Upvoted this, but after researching some more, I kind of wish I hadn't. rdfind is available via the package managers for ALL major platforms (OS X, Linux, (cyg)win, Solaris), and works at blazing native speed. So do check out the answer below. |
| Dec 8, 2010 at 20:13 | comment added | Josh | | This did exactly what I asked for. However, I believe that ZFS with dedup will eventually be the way to go, since I found that the files had slight differences, so only a few could be hardlinked. |
| Dec 8, 2010 at 20:12 | vote accept | Josh | | |
| Oct 12, 2010 at 20:09 | comment added | Josh | | Sounds perfect, thanks!! I'll try it and accept if it works as described! |
| Oct 12, 2010 at 20:04 | history answered | fschmitt | CC BY-SA 2.5 | |
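
Charles Duffy's comment above describes the standard duplicate-finder optimization: group candidates by size first, and only checksum files whose size matches at least one other file. A minimal Python sketch of that size-first approach (a hypothetical illustration, not code from the answer; the function names are my own):

```python
#!/usr/bin/env python3
"""Sketch: find duplicate candidates by size first, then checksum.

Only files that share a byte size with at least one other file are
ever hashed, so unique-sized files cost a single stat() each.
"""
import hashlib
import os
import sys
from collections import defaultdict

def files_by_size(root):
    """Group regular files under `root` by their byte size."""
    groups = defaultdict(list)
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.isfile(path) and not os.path.islink(path):
                groups[os.path.getsize(path)].append(path)
    return groups

def sha256_of(path, bufsize=1 << 20):
    """Stream the file through SHA-256 so large files stay cheap on memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        while chunk := handle.read(bufsize):
            digest.update(chunk)
    return digest.hexdigest()

def duplicate_sets(root):
    """Yield lists of paths whose sizes *and* checksums both match."""
    for _size, paths in files_by_size(root).items():
        if len(paths) < 2:
            continue  # unique size: no possible duplicate, skip hashing
        by_hash = defaultdict(list)
        for path in paths:
            by_hash[sha256_of(path)].append(path)
        for same in by_hash.values():
            if len(same) > 1:
                yield same

if __name__ == "__main__":
    for dupes in duplicate_sets(sys.argv[1]):
        print(dupes)
```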
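
The `[Errno 31] Too many links` failure phunehehe mentions is `errno.EMLINK`: filesystems cap the number of hard links per inode (65,000 on ext4, for example). Below is a hedged sketch of one way to tolerate it, not the script the comment refers to: when the canonical copy runs out of links, the current file is kept as a fresh canonical copy and later duplicates link against that instead.

```python
#!/usr/bin/env python3
"""Sketch: replace duplicate files with hardlinks, tolerating EMLINK."""
import errno
import os

def link_duplicates(paths):
    """Replace every file in `paths` (assumed identical) with hardlinks.

    Returns the canonical paths actually kept; more than one remains
    only if a filesystem link-count limit was hit along the way.
    """
    canonicals = [paths[0]]
    for path in paths[1:]:
        target = canonicals[-1]
        tmp = path + ".hardlink-tmp"
        try:
            os.link(target, tmp)   # may raise EMLINK on the target inode
            os.replace(tmp, path)  # atomically swap the link into place
        except OSError as err:
            if err.errno != errno.EMLINK:
                raise
            # Link count exhausted: keep this copy as a new canonical file.
            canonicals.append(path)
    return canonicals
```

Linking to a temporary name and then `os.replace()`-ing it over the duplicate means the duplicate is never deleted before its replacement link exists, so a crash mid-run loses no data.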