Java: Calculate SHA-256 hash of large file efficiently

I need to calculate the SHA-256 hash of a large file (or a portion of it). My implementation works fine, but it's much slower than the C++ CryptoPP implementation (25 min vs. 10 min for a ~30 GB file). I need the execution times in C++ and Java to be similar, so that both hashes are ready at almost the same time. I also tried the Bouncy Castle implementation, but it gave the same result. Here is how I calculate the hash:

    int buff = 16384;
    try {
        RandomAccessFile file = new RandomAccessFile("T:\\someLargeFile.m2v", "r");
        long startTime = System.nanoTime();
        MessageDigest hashSum = MessageDigest.getInstance("SHA-256");
        byte[] buffer = new byte[buff];
        byte[] partialHash = null;
        long read = 0;

        // calculate the hash of the whole file for the test
        long offset = file.length();
        int unitsize;
        while (read < offset) {
            unitsize = (int) (((offset - read) >= buff) ? buff : (offset - read));
            // read() may return fewer bytes than requested; only hash what was actually read
            int n = file.read(buffer, 0, unitsize);
            if (n == -1) break;
            hashSum.update(buffer, 0, n);
            read += n;
        }
        file.close();
        partialHash = hashSum.digest();

        long endTime = System.nanoTime();
        System.out.println(endTime - startTime);
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    } catch (IOException e) {
        e.printStackTrace();
    } catch (NoSuchAlgorithmException e) {
        e.printStackTrace();
    }
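
For reference, the same computation can be expressed with DigestInputStream wrapped around a buffered stream, which is the usual idiom for hashing a file in Java. This is a minimal sketch, not a tuned solution: the file path is the same test file as above, and the buffer sizes (1 MiB stream buffer, 64 KiB read buffer) are placeholder values I chose for illustration, not measured optima.

    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.security.DigestInputStream;
    import java.security.MessageDigest;
    import java.security.NoSuchAlgorithmException;

    public class Sha256File {
        public static byte[] sha256(String path) throws IOException, NoSuchAlgorithmException {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            // Reading through the DigestInputStream feeds every byte to the digest
            // as a side effect; the BufferedInputStream batches the underlying reads.
            // Buffer sizes here are illustrative placeholders, not tuned values.
            try (InputStream in = new DigestInputStream(
                    new BufferedInputStream(new FileInputStream(path), 1 << 20), md)) {
                byte[] buf = new byte[1 << 16];
                while (in.read(buf) != -1) {
                    // no-op: the digest is updated inside DigestInputStream.read()
                }
            }
            return md.digest();
        }

        public static void main(String[] args) throws Exception {
            long t0 = System.nanoTime();
            byte[] hash = sha256("T:\\someLargeFile.m2v"); // same test file as above
            System.out.println((System.nanoTime() - t0) + " ns");
            for (byte b : hash) System.out.printf("%02x", b);
            System.out.println();
        }
    }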