I'm using Python 3 to download a file:

    local_file = open(file_name, "w" + file_mode)
    local_file.write(f.read())
    local_file.close()

This code works, but it copies the whole file into memory first. That is a problem with very big files, because my program becomes memory hungry (memory usage goes from 17 MB to 240 MB for a 200 MB file).

I would like to know if there is a way in Python to download a small part of the file (a chunk) at a time, write it to disk, free it from memory, and repeat the process until the file is completely downloaded.

1 Answer

Try using the method described here:

Lazy Method for Reading Big File in Python?

I am referring specifically to the accepted answer, which I have copied below for clarity.

    def read_in_chunks(file_object, chunk_size=1024):
        """Lazy function (generator) to read a file piece by piece.
        Default chunk size: 1k."""
        while True:
            data = file_object.read(chunk_size)
            if not data:
                break
            yield data

    f = open('really_big_file.dat')
    for piece in read_in_chunks(f):
        process_data(piece)

This should be adaptable to your needs: it reads the file in small chunks, so you can process each piece without holding the entire file in memory. Come back if you have any further questions.
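For the download case specifically, here is a minimal sketch along the same lines; the URL, file names, and chunk size are placeholders I've made up (not taken from your code), and "wb" assumes a binary download:

    import urllib.request

    url = "http://example.com/big_file.dat"    # placeholder URL
    chunk_size = 8192                          # read 8 KB at a time

    with urllib.request.urlopen(url) as response, \
         open("big_file.dat", "wb") as local_file:
        while True:
            chunk = response.read(chunk_size)  # fetch only chunk_size bytes
            if not chunk:                      # empty result: download finished
                break
            local_file.write(chunk)            # write the chunk and move on

If you don't need to touch each piece yourself, shutil.copyfileobj(response, local_file) performs the same chunked copy in a single call.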


5 Comments

What method are you even referring to?
The one that I linked to, which I will link here again: link
You just linked to the question. Which of the answers is the method you're suggesting?
I assumed it was clear that I was referring to the accepted answer. I will edit my post to clarify further.
The accepted answer isn't fixed; it can change after you point vaguely in its direction
