
I am copying data from one file to another file.

It takes more time. What's the reason?

My code is here

    public void copyData(InputStream in, OutputStream out) throws IOException {
        try {
            in = new CipherInputStream(in, dcipher);
            int numRead = 0;
            byte[] buf = new byte[512];
            while ((numRead = in.read(buf)) >= 0) {
                out.write(buf, 0, numRead);
            }
            out.close();
            in.close();
        } catch (java.io.IOException e) {
        }
    }

  • Try this: byte[] buf = new byte[1024]; Commented Jun 26, 2012 at 10:25
  • What do you mean by "more time"? In comparison to what does it take more time? How much data are you trying to copy? Where are your new and old files located? Also, the buffer size is 512 bytes; is there a reason for that? Commented Jun 26, 2012 at 10:26
  • My file size is 11.8 MB. My new and old files are stored on the SD card. Commented Jun 26, 2012 at 10:30
  • The buffer is too small; increase it to 4 KB or 8 KB. Commented Jun 26, 2012 at 10:31
  • A 512-byte buffer is WAY too small. Commented Jun 26, 2012 at 10:31

3 Answers

1

Please check the code below: I increased the buffer size and flush the data as soon as it reaches 1 MB, so that you don't run into an OutOfMemoryError.

The slowness is mainly due to the small buffer size, which means writing many small chunks of data. It's better to move a large chunk at a time.

You can modify these values according to your needs.

    public void copyData(InputStream in, OutputStream out) throws IOException {
        try {
            int numRead = 0;
            byte[] buf = new byte[102400];
            long total = 0;
            while ((numRead = in.read(buf)) >= 0) {
                total += numRead;
                out.write(buf, 0, numRead);
                // flush after 1 MB, so that heap memory doesn't fall short
                if (total > 1024 * 1024) {
                    total = 0;
                    out.flush();
                }
            }
            out.close();
            in.close();
        } catch (java.io.IOException e) {
        }
    }

1 Comment

You don't need all this. Flushing won't make it faster: slower, if anything. There is no risk of an OutOfMemoryError with this code, unless it is writing to a ByteArrayOutputStream, which would be stupid, and contrary to the terms of the question.
0

I am copying data from one file to another file.

No you aren't. You are decrypting an input stream and writing the plaintext to an output stream.

It takes more time.

More time than what?

What's the reason?

Basically your tiny buffer size. Raise it to at least 8192 bytes: more if there continues to be a benefit.

int numRead = 0; 

You don't need to initialize this variable.

byte[] buf = new byte[512]; 

See above. Change to at least 8192.

while ( ( numRead = in.read( buf ) ) >= 0 ) 

read(byte[]) can only return zero if buf.length is zero, which is a programming error you don't want to loop forever on. Change the condition to > 0.

 catch ( java.io.IOException e ) { } 

Never ignore an exception.

I am using this operation to encrypt/decrypt a file. So that's the reason I am using a 512-byte buffer.

No it isn't. There's nothing about encryption or decryption that requires a 512-byte buffer.
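Putting these points together, a minimal corrected sketch could look like the following. It is an assumption-laden illustration, not the asker's exact code: the CipherInputStream wrapping (and the `dcipher` field) is left to the caller, streams are closed by the caller, and exceptions propagate instead of being swallowed.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;

public class CopyExample {

    // Copies `in` to `out` with an 8 KB buffer. If decryption is needed,
    // the caller wraps `in` in a CipherInputStream before calling this.
    static void copyData(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];            // larger buffer: far fewer read/write calls
        int numRead;                            // no initializer needed
        while ((numRead = in.read(buf)) > 0) {  // > 0: read() returns 0 only for a zero-length buffer
            out.write(buf, 0, numRead);
        }
        // No catch block: let IOException propagate to the caller,
        // which also owns closing both streams.
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[100_000];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        copyData(new ByteArrayInputStream(data), out);
        System.out.println(Arrays.equals(data, out.toByteArray())); // prints "true"
    }
}
```

With file streams on an SD card, the jump from 512 bytes to 8 KB per call is usually where most of the speedup comes from.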

Comments

-1

2 Reasons

  1. The buffer is too small; make it 4 KB or 8 KB, and keep increasing it until your phone crashes, then step back one.
  2. Reading and writing need to be on two different threads. As each read completes, put the data on a queue; as each write completes, take the next chunk from the queue. Don't forget to synchronize access to the queue.

When writing such code, you want to use the CPU and memory to the fullest. One thread and a while loop is so college-C'ish.. :)
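For what it's worth, here is a minimal sketch of the two-thread idea using an ArrayBlockingQueue, which handles the synchronization internally. All names here are illustrative, and error handling is deliberately simplified; note also that, as a comment points out, separate threads are generally not required for this problem.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;
import java.util.Random;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class PipedCopy {

    private static final byte[] EOF = new byte[0];  // sentinel marking end of stream

    // A reader thread fills the queue; the calling thread drains it and writes.
    static void copy(InputStream in, OutputStream out)
            throws IOException, InterruptedException {
        BlockingQueue<byte[]> queue = new ArrayBlockingQueue<>(16);
        Thread reader = new Thread(() -> {
            try {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) > 0) {
                    queue.put(Arrays.copyOf(buf, n));  // copy: buf is reused next iteration
                }
            } catch (IOException | InterruptedException e) {
                // a real implementation would hand the exception over to the writer
            } finally {
                try { queue.put(EOF); } catch (InterruptedException ignored) { }
            }
        });
        reader.start();
        byte[] chunk;
        while ((chunk = queue.take()) != EOF) {  // reference comparison against the sentinel
            out.write(chunk);
        }
        reader.join();
    }

    public static void main(String[] args) throws Exception {
        byte[] data = new byte[50_000];
        new Random(42).nextBytes(data);
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream(data), out);
        System.out.println(Arrays.equals(data, out.toByteArray())); // prints "true"
    }
}
```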

2 Comments

Sorry for disturbing. Can you give me a sample piece of code? Thanks.
Reading and writing do not need to be on different threads.
