
I am trying to read byte-array blocks from a file while another process is still writing to it. Specifically, I am recording video to a file and would like to create chunks from that same file while the recording is still in progress.

The following method was supposed to read blocks of bytes from the file:

    private byte[] getBytesFromFile(File file) throws IOException {
        InputStream is = new FileInputStream(file);
        long length = file.length();
        int numRead = 0;
        byte[] bytes = new byte[(int) length - mReadOffset];
        numRead = is.read(bytes, mReadOffset, bytes.length - mReadOffset);
        if (numRead != (bytes.length - mReadOffset)) {
            throw new IOException("Could not completely read file " + file.getName());
        }
        mReadOffset += numRead;
        is.close();
        return bytes;
    }

The problem is that every element of the returned array is 0; my guess is that this happens because the writing process locks the file.

I would be very thankful if anyone could show me another way to create file chunks while the file is still being written.

  • Does your own application write the video file (i.e., you wrote the code doing the writing)? Or are you trying to chunk the output of a foreign application? Commented Oct 6, 2009 at 16:04

2 Answers


Solved the problem:

    private void getBytesFromFile(File file) throws IOException {
        FileInputStream is = new FileInputStream(file); // videorecorder stores video to file
        java.nio.channels.FileChannel fc = is.getChannel();
        java.nio.ByteBuffer bb = java.nio.ByteBuffer.allocate(10000);
        int chunkCount = 0;
        byte[] bytes;

        while (fc.read(bb) >= 0) {
            bb.flip();
            // save this part of the file into a chunk
            bytes = bb.array();
            storeByteArrayToFile(bytes, mRecordingFile + "." + chunkCount); // mRecordingFile is the (String) path to the file
            chunkCount++;
            bb.clear();
        }
    }

    private void storeByteArrayToFile(byte[] bytesToSave, String path) throws IOException {
        FileOutputStream fOut = new FileOutputStream(path);
        try {
            fOut.write(bytesToSave);
        } catch (Exception ex) {
            Log.e("ERROR", ex.getMessage());
        } finally {
            fOut.close();
        }
    }
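One detail worth noting (an observation, not part of the original answer): after bb.flip(), bb.array() still returns the entire 10000-byte backing array, so the final chunk can end up padded with leftover bytes when the last read only partially fills the buffer. A minimal adjustment to the loop above, keeping the same names and using java.util.Arrays.copyOf, would copy only the bytes actually read:

    while (fc.read(bb) >= 0) {
        bb.flip();
        // bb.limit() after flip() equals the number of bytes just read,
        // so the last chunk is not padded with stale buffer contents
        byte[] chunk = Arrays.copyOf(bb.array(), bb.limit());
        storeByteArrayToFile(chunk, mRecordingFile + "." + chunkCount);
        chunkCount++;
        bb.clear();
    }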


If it were me, I would have it chunked by the process/thread writing to the file. This is how Log4j seems to do it, at any rate. It should be possible to make an OutputStream which automatically starts writing to a new file every N bytes.
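A rough sketch of that idea, assuming your own application controls the stream being written (the class name ChunkedFileOutputStream and the basePath/chunkSize parameters are made up for illustration): the stream simply closes the current chunk file and opens basePath + "." + index whenever chunkSize bytes have been written.

    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.OutputStream;

    // Hypothetical sketch: an OutputStream that rolls over to a new file
    // every chunkSize bytes, so the writer itself produces the chunks.
    public class ChunkedFileOutputStream extends OutputStream {
        private final String basePath; // e.g. "/sdcard/recording"
        private final long chunkSize;  // maximum bytes per chunk file
        private OutputStream current;
        private long written;
        private int chunkIndex;

        public ChunkedFileOutputStream(String basePath, long chunkSize) throws IOException {
            this.basePath = basePath;
            this.chunkSize = chunkSize;
            openNextChunk();
        }

        private void openNextChunk() throws IOException {
            if (current != null) {
                current.close();
            }
            current = new FileOutputStream(basePath + "." + chunkIndex++);
            written = 0;
        }

        @Override
        public void write(int b) throws IOException {
            if (written >= chunkSize) {
                openNextChunk();
            }
            current.write(b);
            written++;
        }

        @Override
        public void write(byte[] b, int off, int len) throws IOException {
            // split large writes across chunk boundaries
            while (len > 0) {
                if (written >= chunkSize) {
                    openNextChunk();
                }
                int toWrite = (int) Math.min(len, chunkSize - written);
                current.write(b, off, toWrite);
                written += toWrite;
                off += toWrite;
                len -= toWrite;
            }
        }

        @Override
        public void flush() throws IOException {
            current.flush();
        }

        @Override
        public void close() throws IOException {
            current.close();
        }
    }

This only helps if your own code writes the bytes; if the recorder writes straight to a file path or file descriptor, you are back to reading the file as in the accepted answer.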

