According to multiple sources, such as Snyk and InfoQ, a "Zip Slip" vulnerability in Java can be prevented by denying writes outside the target directory:
```java
String canonicalDestDirPath = destDir.getCanonicalPath();
File destFile = new File(destDir, e.getName());
String canonicalDestFile = destFile.getCanonicalPath();
if (!canonicalDestFile.startsWith(canonicalDestDirPath + File.separator)) {
    throw new ArchiverException("Entry outside of the target dir");
}
```

This is indeed safe, but it has the non-ideal side effect that a zip archive containing a file with the path `../tmp/file` is accepted when extracted to `/tmp`, but rejected anywhere else. Wouldn't it be more consistent to reject all path traversals that navigate out of ANY destination? A destination-independent check would make it possible to reliably mark a zip file as "tainted", regardless of where it will eventually be extracted. Such consistent checks would be an advantage when validation and extraction happen at different times in a backend processing pipeline.
Two questions:
- why are consistent checks not adopted more widely?
- what would be a safe implementation of this stricter Zip Slip check? I was thinking of setting `destDir = java.util.UUID.randomUUID().toString()` in the above implementation.
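One way to sketch such a destination-independent check, without the random-UUID trick and without touching the filesystem at all, is to normalize each entry name and reject any entry that is absolute or whose normalized form still climbs above the (virtual) extraction root. The class and method names below are mine, purely for illustration, and this is a sketch rather than a hardened implementation (e.g. it does not handle Windows-style backslash separators inside entry names):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

// Hypothetical helper: decides whether a zip entry name is "tainted"
// independently of where the archive will be extracted.
public final class ZipEntryValidator {

    public static boolean isTainted(String entryName) {
        Path p = Paths.get(entryName);
        // Absolute entry names can never stay inside a relative extraction root.
        if (p.isAbsolute()) {
            return true;
        }
        Path normalized = p.normalize();
        // After normalization, any leading ".." means the entry navigates
        // above the extraction root, no matter what that root is.
        return normalized.startsWith("..");
    }

    public static void main(String[] args) {
        System.out.println(isTainted("../tmp/file"));   // escapes the root
        System.out.println(isTainted("a/../../evil"));  // escapes the root
        System.out.println(isTainted("a/b/../c.txt"));  // stays inside
    }
}
```

Because it never calls `getCanonicalPath()`, this variant needs no real destination directory, so validation can run in a pipeline stage long before extraction.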
Note that even when extracting with the built-in `java.util.zip`, you are still vulnerable to Zip Slip unless you add such a check yourself.
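To make that concrete, here is a sketch of a guarded extraction loop over `java.util.zip` that applies the canonical-path check from the snippet above to every entry. Class and method names are mine, and error handling is illustrative, not production-grade:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class SafeUnzip {

    public static void unzip(File zipFile, File destDir) throws IOException {
        String canonicalDestDirPath = destDir.getCanonicalPath();
        try (ZipInputStream zis = new ZipInputStream(new FileInputStream(zipFile))) {
            ZipEntry e;
            while ((e = zis.getNextEntry()) != null) {
                File destFile = new File(destDir, e.getName());
                // Same canonical-path check as above: refuse any entry
                // that resolves outside the destination directory.
                if (!destFile.getCanonicalPath().startsWith(canonicalDestDirPath + File.separator)) {
                    throw new IOException("Entry outside of the target dir: " + e.getName());
                }
                if (e.isDirectory()) {
                    destFile.mkdirs();
                } else {
                    destFile.getParentFile().mkdirs();
                    try (FileOutputStream out = new FileOutputStream(destFile)) {
                        zis.transferTo(out);
                    }
                }
            }
        }
    }
}
```

Without the `startsWith` guard, an archive entry named `../evil.txt` would be written one level above `destDir`, which is exactly the Zip Slip attack.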