
What is the most efficient way to read large files (>~10 GB) using Node.js (LTS)?

Currently, I need to read the file content, parse each line into a known data structure, perform certain validations, and push the data structure into a database (SQL Server). I do this in C# using memory-mapped files, which works well because I can read the file in chunks, in parallel.

I am planning to migrate the solution to Node.js (and MongoDB) for a business use case.

Any leads/suggestions?

Environment:

I am using 64-bit Windows on an x64-based processor with 8 GB of RAM.

1 Answer


What you're looking for is usually referred to as streams in Node.js.

You can read or write very large files with streams by processing them in small portions instead of loading the whole file into memory.
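As a rough illustration, here is a minimal sketch of stream-based, line-by-line processing with Node's built-in `fs` and `readline` modules. The file path, the batch size of 1000, and the `insertBatch()` helper are hypothetical placeholders; you would substitute your own parsing, validation, and database write (e.g. a MongoDB `insertMany`).

```js
const fs = require('node:fs');
const readline = require('node:readline');

// Hypothetical stand-in for the real database write,
// e.g. collection.insertMany(docs) when using MongoDB.
async function insertBatch(docs) {
  console.log(`would insert ${docs.length} records`);
}

async function processLargeFile(filePath) {
  const rl = readline.createInterface({
    input: fs.createReadStream(filePath), // streams the file in small chunks
    crlfDelay: Infinity,                  // treat \r\n as a single line break
  });

  let batch = [];
  for await (const line of rl) {          // async iteration, back-pressure aware
    const record = { raw: line };         // replace with your own parsing/validation
    batch.push(record);

    if (batch.length >= 1000) {           // flush in batches rather than per line
      await insertBatch(batch);
      batch = [];
    }
  }
  if (batch.length > 0) {
    await insertBatch(batch);
  }
}

processLargeFile('huge-file.txt').catch(console.error);
```

Because only one chunk of the file is held in memory at a time, this approach stays well within an 8 GB machine even for files much larger than RAM.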

Here are a few links that could help you to get started.

Parsing huge logfiles in Node.js - read in line-by-line

Using Node.js to Read Really, Really Large Datasets & Files

Read large text files in nodejs


1 Comment

Additionally... there are often standard filter streams and stream parsers for handling common file types.
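For example, one such standard filter stream is Node's built-in `zlib`; a sketch of composing it with `readline`, assuming a hypothetical gzip-compressed log file named `data.log.gz`:

```js
const fs = require('node:fs');
const zlib = require('node:zlib');
const readline = require('node:readline');

const rl = readline.createInterface({
  // decompress on the fly while reading, then split into lines
  input: fs.createReadStream('data.log.gz').pipe(zlib.createGunzip()),
  crlfDelay: Infinity,
});

rl.on('line', (line) => {
  // parse/validate each decompressed line here
});

rl.on('close', () => {
  console.log('done');
});
```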
