87

I have a little problem with my function. I would like to get all the files in many directories. Currently, I can retrieve the files in the folder passed as a parameter, but I would also like to retrieve the HTML files of each subfolder of that folder. For example, if I pass "test" as a parameter, I retrieve the files in "test", but I would also like to retrieve "test/1/*.html", "test/2/*.html", and so on:

    var srcpath2 = path.join('.', 'diapo', result);

    function getDirectories(srcpath2) {
        return fs.readdirSync(srcpath2).filter(function (file) {
            return fs.statSync(path.join(srcpath2, file)).isDirectory();
        });
    }

The result: [1, 2, 3]

thanks !


24 Answers

110

It looks like the glob npm package would help you. Here is an example of how to use it:

File hierarchy:

    test
    ├── one.html
    └── test-nested
        └── two.html

JS code:

    const glob = require("glob");

    var getDirectories = function (src, callback) {
        glob(src + '/**/*', callback);
    };

    getDirectories('test', function (err, res) {
        if (err) {
            console.log('Error', err);
        } else {
            console.log(res);
        }
    });

which displays:

[ 'test/one.html', 'test/test-nested', 'test/test-nested/two.html' ] 

5 Comments

The shortest way I found.
I was a little disappointed that glob skips dot files by default. What is the purpose of this package if we cannot get dotfiles with a simple search?
@AsifAshraf per the documentation: You can make glob treat dots as normal characters by setting dot:true in the options. -- npmjs.com/package/glob
@Paul Mougel The question asks to get all the "FILES", but you are also returning the FOLDERS. Please provide another solution to get only the list of files using glob.
With dree you can achieve something very similar
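Following up on the two comments above, here is a minimal sketch (my addition, not part of the original answer) assuming glob's documented dot and nodir options, which match dotfiles and exclude directories respectively:

    const glob = require("glob");

    // dot: true   -> also match dotfiles
    // nodir: true -> return only files, never directories
    glob('test/**/*', { dot: true, nodir: true }, function (err, files) {
        if (err) {
            console.log('Error', err);
        } else {
            console.log(files); // e.g. [ 'test/one.html', 'test/test-nested/two.html' ]
        }
    });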
86

I've seen many very long answers, and it's kind of a waste of memory space. Some also use packages like glob, but if you don't want to depend on any package, here's my solution.

    const Path = require("path");
    const FS = require("fs");
    let Files = [];

    function ThroughDirectory(Directory) {
        FS.readdirSync(Directory).forEach(File => {
            const Absolute = Path.join(Directory, File);
            if (FS.statSync(Absolute).isDirectory()) return ThroughDirectory(Absolute);
            else return Files.push(Absolute);
        });
    }

    ThroughDirectory("./input/directory/");

It's pretty self-explanatory. There's an input directory, and it iterates through that. If one of the items is also a directory, go through that and so on. If it's a file, add the absolute path to the array.

Hope this helped :]

2 Comments

    const fetchAllFilesFromGivenFolder = (fullPath) => {
        let files = [];
        fs.readdirSync(fullPath).forEach(file => {
            const absolutePath = path.join(fullPath, file);
            if (fs.statSync(absolutePath).isDirectory()) {
                const filesFromNestedFolder = fetchAllFilesFromGivenFolder(absolutePath);
                filesFromNestedFolder.forEach(file => {
                    files.push(file);
                });
            } else return files.push(absolutePath);
        });
        return files;
    }
Files is a global variable; the solution can be improved by keeping the array local and returning the results from the top-level call.
50

I really liked Smally's solution but didn't like the syntax.

Same solution but slightly easier to read:

const fs = require("fs"); const path = require("path"); let files = []; const getFilesRecursively = (directory) => { const filesInDirectory = fs.readdirSync(directory); for (const file of filesInDirectory) { const absolute = path.join(directory, file); if (fs.statSync(absolute).isDirectory()) { getFilesRecursively(absolute); } else { files.push(absolute); } } }; 

2023 Update

Node v18+ (LTS) offers a recursive flag for readdir. So you should probably use that. EOL for v16 is September 2023 btw.

    import { readdir } from 'node:fs/promises';

    try {
        const files = await readdir('./', { recursive: true });
        console.log(files);
    } catch (err) {
        console.error(err);
    }

1 Comment

I used your snippet and it works very well. Thank you and Smally ;)
47

Using ES6 yield

    const fs = require('fs');
    const path = require('path');

    function *walkSync(dir) {
        const files = fs.readdirSync(dir, { withFileTypes: true });
        for (const file of files) {
            if (file.isDirectory()) {
                yield* walkSync(path.join(dir, file.name));
            } else {
                yield path.join(dir, file.name);
            }
        }
    }

    for (const filePath of walkSync(__dirname)) {
        console.log(filePath);
    }

2 Comments

never heard of this syntax and keyword
@GorvGoyl this is a generator function; learn more here: developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/…
17

Here's mine. Like all good answers it's hard to understand:

    const { readdirSync, statSync } = require('fs');
    const { join } = require('path');

    const isDirectory = path => statSync(path).isDirectory();
    const getDirectories = path =>
        readdirSync(path).map(name => join(path, name)).filter(isDirectory);

    const isFile = path => statSync(path).isFile();
    const getFiles = path =>
        readdirSync(path).map(name => join(path, name)).filter(isFile);

    const getFilesRecursively = (path) => {
        let dirs = getDirectories(path);
        let files = dirs
            .map(dir => getFilesRecursively(dir)) // go through each directory
            .reduce((a, b) => a.concat(b), []);   // map returns a 2d array (array of file arrays) so flatten
        return files.concat(getFiles(path));
    };

3 Comments

Good answers are usually the most simple to understand
This answer is well written and not that hard to understand. It works. It is not a lot of code. It is synchronous, unlike glob.
Instead of statSync now you can load all stats in one call. const dirs = await readdir('./', { withFileTypes: true })
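Picking up the last comment's suggestion, here is a minimal sketch (my addition, not part of the answer) of the same recursion using a single readdirSync call with withFileTypes, kept synchronous to match the answer above, so no per-entry statSync is needed:

    const { readdirSync } = require('fs');
    const { join } = require('path');

    const getFilesRecursively = (dir) =>
        readdirSync(dir, { withFileTypes: true })            // one call returns names and types
            .flatMap(entry => entry.isDirectory()
                ? getFilesRecursively(join(dir, entry.name)) // recurse into subdirectories
                : join(dir, entry.name));                    // keep plain files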
10

With modern JavaScript (Node.js 10+) you can use an async generator function and loop through it using for await...of:

    // ES modules syntax that is included by default in NodeJS 14.
    // For earlier versions, use `--experimental-modules` flag
    import fs from "fs/promises"
    // or, without ES modules, use this:
    // const fs = require('fs').promises

    async function run() {
        for await (const file of getFiles()) {
            console.log(file.path)
        }
    }

    async function* getFiles(path = `./`) {
        const entries = await fs.readdir(path, { withFileTypes: true })

        for (let file of entries) {
            if (file.isDirectory()) {
                yield* getFiles(`${path}${file.name}/`)
            } else {
                yield { ...file, path: path + file.name }
            }
        }
    }

    run()

2 Comments

To make this faster, switch the first loop to await getFiles().forEach((file) => ... and the second loop to for(let i = 0; i < entries.length; i++).
Why would that be faster? It would take longer to get first result. It would also take more memory. Please back up this claim with a benchmark.
8

The accepted answer requires installing a package. If you want a native ES6 option:

    import { readdirSync } from 'fs'
    import { join } from 'path'

    function walk(dir) {
        return readdirSync(dir, { withFileTypes: true }).flatMap((file) =>
            file.isDirectory() ? walk(join(dir, file.name)) : join(dir, file.name)
        )
    }

This works for me.

  • Read root directory with readdirSync
  • Then map over but flatten as we go
  • if it's a directory, go recursive; else return the filename
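
Tying this back to the original question (only the .html files under a folder), a short usage sketch on top of walk; this is my addition and the "test" path is just an example:

    // collect only the .html files anywhere under "test"
    const htmlFiles = walk('test').filter((file) => file.endsWith('.html'))
    console.log(htmlFiles) // e.g. [ 'test/1/a.html', 'test/2/b.html' ]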

1 Comment

great solution, deserves its own npm package
6

Packed into library: https://www.npmjs.com/package/node-recursive-directory

https://github.com/vvmspace/node-recursive-directory

List of files:

    const getFiles = require('node-recursive-directory');

    (async () => {
        const files = await getFiles('/home');
        console.log(files);
    })()

List of files with parsed data:

    const getFiles = require('node-resursive-directory');

    (async () => {
        const files = await getFiles('/home', true); // add true
        console.log(files);
    })()

You will get something like this:

    [
        ...,
        {
            fullpath: '/home/vvm/Downloads/images/Some/Some Image.jpg',
            filepath: '/home/vvm/Downloads/images/Some/',
            filename: 'Some Image.jpg',
            dirname: 'Some'
        },
    ]

2 Comments

For me, just running the require crashes nodemon.
There's a typo @JCraine. It should be recursive
3

Node.js v20 added a recursive option to the readdir() method. So, assuming you have the same directory structure as in the first answer (by Paul Mougel):

    └── test
        ├── one.html
        └── test-nested
            └── two.html

You can do it succinctly as follows without any dependency:

    const { readdir } = require("node:fs/promises");

    async function getFiles(dir) {
        const files = await readdir(dir, { recursive: true });
        const entries = files.map((filename) => `${dir}/${filename}`);
        console.log(entries);
    }

    getFiles("test");

The output will look as follows:

[ 'test/one.html', 'test/test-nested', 'test/test-nested/two.html' ] 

1 Comment

This script only shows the parent folder and child folders, not the files inside each child folder; for me the answer from @FakeFootball was the right one to solve my issue.
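
If, as the comment above suggests, you want only the files and not the directories, here is a minimal sketch (my addition, not from the answer) that stats each entry returned by the recursive readdir and keeps only regular files; it assumes Node with the recursive option available:

    const { readdir, stat } = require("node:fs/promises");
    const { join } = require("node:path");

    async function getFilesOnly(dir) {
        const entries = await readdir(dir, { recursive: true }); // files and folders, relative to dir
        const files = [];
        for (const entry of entries) {
            const fullPath = join(dir, entry);
            if ((await stat(fullPath)).isFile()) { // keep regular files, drop directories
                files.push(fullPath);
            }
        }
        return files;
    }

    // getFilesOnly("test").then(console.log); // e.g. [ 'test/one.html', 'test/test-nested/two.html' ]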
2

You can also write your own code to traverse the directory, as shown below:

    var fs = require('fs');

    function traverseDirectory(dirname, callback) {
        var directory = [];
        fs.readdir(dirname, function(err, list) {
            dirname = fs.realpathSync(dirname);
            if (err) {
                return callback(err);
            }
            var listlength = list.length;
            list.forEach(function(file) {
                file = dirname + '\\' + file;
                fs.stat(file, function(err, stat) {
                    directory.push(file);
                    if (stat && stat.isDirectory()) {
                        traverseDirectory(file, function(err, parsed) {
                            directory = directory.concat(parsed);
                            if (!--listlength) {
                                callback(null, directory);
                            }
                        });
                    } else {
                        if (!--listlength) {
                            callback(null, directory);
                        }
                    }
                });
            });
        });
    }

    traverseDirectory(__dirname, function(err, result) {
        if (err) {
            console.log(err);
        }
        console.log(result);
    });

You can check more information about it here : http://www.codingdefined.com/2014/09/how-to-navigate-through-directories-in.html

2 Comments

Thanks! But how do I send the result with res.send? Please
@coco62 Once you get the result inside the function, you can pass that instead of logging it.
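
For example, a minimal sketch (my addition) assuming an Express app with a hypothetical /files route, passing the traversal result to res.send instead of console.log:

    // hypothetical Express route using the traverseDirectory helper above
    app.get('/files', function (req, res) {
        traverseDirectory(__dirname, function (err, result) {
            if (err) {
                return res.status(500).send(err.message); // report traversal errors
            }
            res.send(result); // send the collected paths to the client
        });
    });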
2

I needed to do something similar in an Electron app: get all subfolders in a given base folder, using TypeScript, and came up with this:

    import { readdirSync, statSync, existsSync } from "fs";
    import * as path from "path";

    // recursive synchronous "walk" through a folder structure, with the given base path
    getAllSubFolders = (baseFolder, folderList = []) => {
        let folders: string[] = readdirSync(baseFolder)
            .filter(file => statSync(path.join(baseFolder, file)).isDirectory());
        folders.forEach(folder => {
            folderList.push(path.join(baseFolder, folder));
            this.getAllSubFolders(path.join(baseFolder, folder), folderList);
        });
    }

Comments

2
    const fs = require('fs');
    const path = require('path');

    var filesCollection = [];
    const directoriesToSkip = ['bower_components', 'node_modules', 'www', 'platforms'];

    function readDirectorySynchronously(directory) {
        var currentDirectorypath = path.join(__dirname + directory);
        var currentDirectory = fs.readdirSync(currentDirectorypath, 'utf8');
        currentDirectory.forEach(file => {
            var fileShouldBeSkipped = directoriesToSkip.indexOf(file) > -1;
            var pathOfCurrentItem = path.join(__dirname + directory + '/' + file);
            if (!fileShouldBeSkipped && fs.statSync(pathOfCurrentItem).isFile()) {
                filesCollection.push(pathOfCurrentItem);
            } else if (!fileShouldBeSkipped) {
                var directorypath = path.join(directory + '\\' + file);
                readDirectorySynchronously(directorypath);
            }
        });
    }

    readDirectorySynchronously('');

This will fill filesCollection with all the files in the directory and its subdirectories (it's recursive). You have the option to skip some directory names in the directoriesToSkip array.

Comments

2

Speaking of npm packages - another short option is to use fs-readdir-recursive:

    const read = require("fs-readdir-recursive");

    const foundFiles = read("test");
    console.log(foundFiles);

Output:

[ 'one.html', 'test-nested/some_text.txt', 'test-nested/two.html' ] 

If you're interested only in files with specific extension (like .html mentioned in the question) you can filter them using .endsWith():

const filteredFiles = read("test").filter(item => item.endsWith(".html")); 

Comments

2

Slightly modified version of @Stephen's response (https://stackoverflow.com/a/66083078/4421370) above that returns the files' paths relative to the directory you are searching, or relative to any arbitrary base path you supply to the function call in place of the default. If you want the full path, just call it as walkSync(dir, dir).

Search path: c:\tmp; file path: c:\tmp\test\myfile.txt; result: test\myfile.txt

Hopefully helpful to some.

    const fs = require('fs');
    const path = require('path');

    function *walkSync(dir, base = "") {
        const files = fs.readdirSync(dir, { withFileTypes: true });
        for (const file of files) {
            if (file.isDirectory()) {
                yield* walkSync(path.join(dir, file.name), path.join(base, file.name));
            } else {
                yield path.join(base, file.name);
            }
        }
    }

    for (const filePath of walkSync(__dirname)) {
        console.log(filePath);
    }

1 Comment

Your link to the "slightly modified version of @Stephen's response" returns me to this post (the current post), stackoverflow.com/questions/41462606/…, without any sign of @Stephen's response.
1

If you'd rather work synchronously with glob, use the glob.sync() function as mentioned in its documentation. Here's the equivalent of the example provided by @Paul Mougel, but written synchronously:

const glob = require("glob"); var getDirectories = function (src) { return glob.sync(src + '/**/*'); }; var rest = getDirectories('test'); console.log(res); 

Comments

1

A solution with Promises based on globby:

    import { globby } from 'globby';

    (async () => {
        const path = '/path/to/dir';
        const files = await globby([`${path}/**/*`]);
        console.log(files);
        // [
        //   '/path/to/dir/file1.txt',
        //   '/path/to/dir/subdir/file2.txt',
        //   ...
        // ]
    })()

Comments

1

Synchronous method with two options; simple and efficient.

    const path = require('path');
    const fs = require('fs');

    function toHierarchie_files(pathDir, output_normalize = false, removeEmpty = true) {
        var result = {}, enqueue = [pathDir];

        // normalize the slash separator if output_normalize is true, otherwise return the value untouched
        output_normalize = output_normalize == false ? val => { return val } : val => { return path.normalize(val) };

        // allows an absolute or relative path with extended resolution; returns a normalized absolute path to work with, or the string 'none'
        const path_exist = (path_test) => {
            var tmpTab = fs.existsSync(path.normalize(path.resolve(path_test))) == true
                ? [path.normalize(path.resolve(path_test))]
                : ['', '../', '../../']
                    .map(val => path.normalize(path.resolve(__dirname, val + path_test)))
                    .filter((val, index) => fs.existsSync(path.normalize(path.resolve(__dirname, val + path_test))) == true);
            return tmpTab.length > 0 ? tmpTab[0] : 'none';
        };

        // check whether the path exists and return its type ('dir' or 'file'), or the string 'none'
        const getType = (path_test) => {
            path_test = path_exist(path_test);
            return path_test == 'none' ? 'none'
                : fs.lstatSync(path_test).isDirectory() == true ? 'dir'
                : fs.lstatSync(path_test).isFile() == true ? 'file'
                : 'none';
        };

        function recursive() {
            // init a new entry
            var parentDir = enqueue.pop();
            result[parentDir] = [];

            // read the directory
            fs.readdirSync(path_exist(parentDir)).forEach((file, index) => {
                switch (getType(parentDir + '/' + file)) {
                    // if a directory is detected, push it onto the queue
                    case 'dir':
                        enqueue.push(output_normalize(parentDir + '/' + file));
                        break;
                    // if it is a file, add it to the entry
                    case 'file':
                        result[parentDir].push(file);
                        break;
                    // otherwise done
                    default:
                        break;
                };
            });

            // if the optional removeEmpty argument is true, delete entries that contain no files
            if (result[parentDir].length == 0 && removeEmpty == true) { Reflect.deleteProperty(result, parentDir); }

            // if the queue is not empty, continue processing
            if (enqueue.length > 0) { recursive(); }
        };

        // if the given directory exists, start the recursion
        if (getType(pathDir) == 'dir') { recursive(); }

        return result;
    };

Result:

{ "public/assets": [ "favicon.ico" ], "public/assets/js": [ "dede.js", "test.js" ], "public/assets/js/css/secure": [ "config.json", "index.js" ], "public/assets/css": [ "style.css" ] 

}
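
A brief usage sketch (my addition; the directory name is just an example) showing both optional arguments of the function above:

    // build the hierarchy for ./public, normalizing separators and keeping empty folders
    const hierarchy = toHierarchie_files('./public', true, false);
    console.log(hierarchy);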

Comments

0

You can loop through all the files and directories of the root folder; if an entry is a directory, go inside it and repeat the process. Consider the code below:

    const fs = require('fs');
    const path = require('path');

    const target = './'; // choose the directory to target
    var result = [];
    var filePaths = [];
    var tempFolder = [];
    const targetPath = fs.readdirSync(target);

    function hit(mainPath = targetPath) {
        mainPath.forEach((file) => {
            let check = fs.statSync(file);
            if (!check.isDirectory()) {
                filePaths.push(file);
            } else {
                if (file[0] != '.') {
                    tempFolder.push(file);
                }
            }
        });
        // get files from folders
        if (tempFolder.length > 0) {
            tempFolder.forEach((dir) => {
                getFiles(dir);
            });
        }
        // filePaths contains the path to every file
    }

    function getFiles(dir) {
        var paths = fs.readdirSync(dir);
        var files = [];
        paths.forEach(function (file) {
            var fullPath = dir + '/' + file;
            files.push(fullPath);
        });
        files.forEach((tempFile) => {
            let check = fs.statSync(tempFile);
            if (check.isDirectory()) {
                getFiles(tempFile);
            } else {
                filePaths.push(tempFile);
            }
        });
    }

    hit(); // main function

Comments

0

Although not perfect in some scenarios, it should be helpful in many.

    const getAllFilePath = (path: string) => {
        const addData = (_paths: string[]) => {
            const newFoldersToScrape: string[] = [];
            _paths.forEach(_path => {
                fs.readdirSync(_path).forEach((file: string) => {
                    if (file.indexOf(".") === -1) {
                        newFoldersToScrape.push(`${_path}/${file}`);
                    } else {
                        filePaths.push(`${_path}/${file}`);
                    }
                });
            });
            foldersToScrape = newFoldersToScrape;
        };

        const baseDirPath = `<YOUR BASE PATH HERE>/${path}`;
        let foldersToScrape: string[] = [];
        const filePaths: string[] = [];

        addData([baseDirPath]);
        while (foldersToScrape.length !== 0) {
            addData(foldersToScrape);
        }

        return filePaths;
    };

Comments

0

This is how I did it; I think it is similar to, yet simpler than, most of the other answers here.

    const fs = require('fs')

    let files = []
    const getFiles = (path) => {
        if (fs.lstatSync(path).isDirectory()) { // is this a folder?
            fs.readdirSync(path).forEach(f => { // for everything in this folder
                getFiles(path + '/' + f)        // process it recursively
            })
        } else if (path.endsWith(".ts")) {      // is this a file we are searching for?
            files.push(path)                    // record it
        }
    }

    getFiles("src")

It fills the "files" array with every .ts file under the "src/" directory.

Comments

0

IMO, using Asynchronous Generators and recursion makes it really elegant

    import { opendir } from "fs/promises"
    import { join } from "path"

    /**
     * Recursively yields files from a child directory tree
     * @param path Starting directory path
     */
    export async function* dirGenerator(path: string): AsyncGenerator<string, void, void> {
        const dirIterator = await opendir(path)

        for await (const dirent of dirIterator) {
            if (dirent.isDirectory()) {
                yield* dirGenerator(join(path, dirent.name))
            } else {
                yield join(path, dirent.name)
            }
        }
    }
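
A short usage sketch (my addition; the starting path is just an example) consuming the generator with for await...of:

    // example consumer: print every file path under ./src
    async function printAll() {
        for await (const filePath of dirGenerator("./src")) {
            console.log(filePath)
        }
    }

    printAll()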

Comments

-1

Here is a compact pure function that returns all the (relative) paths in the directory.

    import fs from 'fs'
    import path from 'path'

    const getFilesPathsRecursively = (directory: string, origin?: string): string[] =>
        fs.readdirSync(directory).reduce((files, file) => {
            const absolute = path.join(directory, file)
            return [
                ...files,
                ...(fs.statSync(absolute).isDirectory()
                    ? getFilesPathsRecursively(absolute, origin || directory)
                    : [path.relative(origin || directory, absolute)]),
            ]
        }, [])

2 Comments

you have a ReferenceError: path is not defined
@SamSedighian You just need to import it: it is Node's path utility. I edited the answer to add the import.
-1
  • The solution is written in TypeScript.
  • modern solution with async/await
  • No external dependencies.
  • Asynchronous function (non-blocking, unlike other solutions that use readdirSync and statSync)
  • It is fast because multiple operations run in parallel (it does not wait for a response from each file in turn).
  • It also has some naive error handling (if something goes wrong with one file or folder, it will not blow up the whole process).
import path from "path"; import fs from "fs/promises" export default async function readDirectory(directory: string): Promise<string[]> { const files = await fs.readdir(directory) const filesPromises = files.map(async (file) => { try { const absolutePath = path.join(directory, file); const fileStat = await fs.stat(absolutePath) if (fileStat.isDirectory()) { return await readDirectory(absolutePath); } else { return absolutePath; } } catch (err) { // error handling return []; } }); const filesWithArrays = await Promise.all(filesPromises) const flatArray = filesWithArrays.reduce<string[]>((acc, fileOrArray) => acc.concat(fileOrArray), []); return flatArray; } 

Usage (if this is in a separate file, remember to import it):

const results = await readDirectory('some/path'); 

Comments

-2

I did mine with TypeScript. It works well and is fairly easy to understand:

    import * as fs from 'fs';
    import * as path from 'path';

    export const getAllSubFolders = (
        baseFolder: string,
        folderList: string[] = []
    ) => {
        const folders: string[] = fs
            .readdirSync(baseFolder)
            .filter(file => fs.statSync(path.join(baseFolder, file)).isDirectory());
        folders.forEach(folder => {
            folderList.push(path.join(baseFolder, folder));
            getAllSubFolders(path.join(baseFolder, folder), folderList);
        });
        return folderList;
    };

    export const getFilesInFolder = (rootPath: string) => {
        return fs
            .readdirSync(rootPath)
            .filter(
                filePath => !fs.statSync(path.join(rootPath, filePath)).isDirectory()
            )
            .map(filePath => path.normalize(path.join(rootPath, filePath)));
    };

    export const getFilesRecursively = (rootPath: string) => {
        const subFolders: string[] = getAllSubFolders(rootPath);
        const allFiles: string[][] = subFolders.map(folder =>
            getFilesInFolder(folder)
        );
        return [].concat.apply([], allFiles);
    };

1 Comment

I had some trouble with typescript+eslint and the flattening of the array in the last lines. So I replaced the last steps by array.reduce. Since we can't post multiline code in comments, here's a single-liner :) export const getFilesRecursively = (rootPath: string) => getAllSubFolders(rootPath).reduce((result, folder) => [...result, ...getFilesInFolder(folder)], [] as string[])