6

I have this simple situation where I want to filter and map to the same value, like so:

const files = results
  .filter(function (r) { return r.file; })
  .map(function (r) { return r.file; });

To save lines of code, as well as increase performance, I am looking for:

const files = results.filterAndMap(function(r){ return r.file; }); 

Does this exist, or should I write something myself? I have wanted such functionality in a few places, but never bothered to look into it before.

6 Comments
  • What is results? A multidimensional array [{file:{file:1}}, {notfile:{file:1}}]? Commented Jul 18, 2017 at 5:31
  • results is just an array of objects: [{},{file:x}, {}, {file:y}], etc. Commented Jul 18, 2017 at 5:38
  • "results is just an array of objects: [{},{file:x}, {}, {file:y}]" Well, that array does not match context of JavaScript at Question. In that case .map() is not necessary. You can use .filter() alone to return expected result Commented Jul 18, 2017 at 5:40
  • sorry I do not follow your comment Commented Jul 18, 2017 at 5:41
  • Perhaps I did not interpret the question correctly; I initially interpreted results as a nested array. Why is .map() necessary? To return an array of [x, y]? What is the expected result? Commented Jul 18, 2017 at 5:42

9 Answers

11

Transducers

In its most generic form, the answer to your question lies in transducers. But before we get too abstract, let's look at some basics first – below, we implement a few transducers: mapReduce, filterReduce, and tapReduce; you can add any others that you need.

const mapReduce = map => reduce => (acc, x) => reduce (acc, map (x))

const filterReduce = filter => reduce => (acc, x) => filter (x) ? reduce (acc, x) : acc

const tapReduce = tap => reduce => (acc, x) => (tap (x), reduce (acc, x))

const tcomp = (f,g) => k => f (g (k))

const concat = (xs,ys) => xs.concat(ys)

const transduce = (...ts) => xs =>
  xs.reduce (ts.reduce (tcomp, k => k) (concat), [])

const main =
  transduce (
    tapReduce (x => console.log('with:', x)),
    filterReduce (x => x.file),
    tapReduce (x => console.log('has file:', x.file)),
    mapReduce (x => x.file),
    tapReduce (x => console.log('final:', x)))

const data =
  [{file: 1}, {file: undefined}, {}, {file: 2}]

console.log (main (data))
// with: { file: 1 }
// has file: 1
// final: 1
// with: { file: undefined }
// with: {}
// with: { file: 2 }
// has file: 2
// final: 2
// => [ 1, 2 ]

Chainable API

Maybe you're satisfied with the simplicity of the code but unhappy with the somewhat unconventional API. If you want to preserve the ability to chain .map, .filter, and .whatever calls without adding undue iterations, we can make a generic interface for transducing and build our chainable API on top of it. This answer is adapted from the link I shared above and from other answers I have written about transducers.

// Trans Monoid
const Trans = f => ({
  runTrans: f,
  concat: ({runTrans: g}) =>
    Trans (k => f (g (k)))
})

Trans.empty = () =>
  Trans (k => k)

// transducer "primitives"
const mapper = f =>
  Trans (k => (acc, x) => k (acc, f (x)))

const filterer = f =>
  Trans (k => (acc, x) => f (x) ? k (acc, x) : acc)

const tapper = f =>
  Trans (k => (acc, x) => (f (x), k (acc, x)))

// chainable API
const Transduce = (t = Trans.empty()) => ({
  map: f =>
    Transduce (t.concat (mapper (f))),
  filter: f =>
    Transduce (t.concat (filterer (f))),
  tap: f =>
    Transduce (t.concat (tapper (f))),
  run: xs =>
    xs.reduce (t.runTrans ((xs,ys) => xs.concat(ys)), [])
})

// demo
const main = data =>
  Transduce()
    .tap (x => console.log('with:', x))
    .filter (x => x.file)
    .tap (x => console.log('has file:', x.file))
    .map (x => x.file)
    .tap (x => console.log('final:', x))
    .run (data)

const data =
  [{file: 1}, {file: undefined}, {}, {file: 2}]

console.log (main (data))
// with: { file: 1 }
// has file: 1
// final: 1
// with: { file: undefined }
// with: {}
// with: { file: 2 }
// has file: 2
// final: 2
// => [ 1, 2 ]

Chainable API, take 2

As an exercise to implement the chaining API with as little dependency ceremony as possible, I rewrote the code snippet without relying upon the Trans monoid implementation or the primitive transducers mapper, filterer, etc. – thanks to @ftor for the comment.

This is a definite downgrade in terms of overall readability. We lost the ability to just look at it and understand what is happening. We also lost the monoid interface, which made it easy to reason about our transducers in other expressions. A big gain here, though, is that the definition of Transduce is contained within 10 lines of source code, compared to 28 before – so while the expressions are more complex, you can probably finish reading the entire definition before your brain starts struggling.

// chainable API only (no external dependencies)
const Transduce = (t = k => k) => ({
  map: f =>
    Transduce (k => t ((acc, x) => k (acc, f (x)))),
  filter: f =>
    Transduce (k => t ((acc, x) => f (x) ? k (acc, x) : acc)),
  tap: f =>
    Transduce (k => t ((acc, x) => (f (x), k (acc, x)))),
  run: xs =>
    xs.reduce (t ((xs,ys) => xs.concat(ys)), [])
})

// demo (this stays the same)
const main = data =>
  Transduce()
    .tap (x => console.log('with:', x))
    .filter (x => x.file)
    .tap (x => console.log('has file:', x.file))
    .map (x => x.file)
    .tap (x => console.log('final:', x))
    .run (data)

const data =
  [{file: 1}, {file: undefined}, {}, {file: 2}]

console.log (main (data))
// with: { file: 1 }
// has file: 1
// final: 1
// with: { file: undefined }
// with: {}
// with: { file: 2 }
// has file: 2
// final: 2
// => [ 1, 2 ]

Performance

When it comes to speed, no functional variant of this is ever going to beat a static for loop that combines all of your program statements in a single loop body. However, the transducers above do have the potential to be faster than a series of .map/.filter/.whatever calls, where multiple iterations through a large data set would be expensive.
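
For comparison, here is a minimal sketch of that single-loop baseline (reusing the data array from the snippets above); it fuses the tap, filter, and map steps into one body:

// hand-fused loop: tap, filter, and map in a single pass
const fused = xs => {
  const out = []
  for (let i = 0; i < xs.length; i++) {
    const x = xs[i]
    console.log('with:', x)            // tap
    if (!x.file) continue              // filter
    console.log('has file:', x.file)   // tap
    out.push(x.file)                   // map + collect
  }
  return out
}

console.log(fused(data)) // => [ 1, 2 ]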

Coding style & implementation

The very essence of the transducer lies in mapReduce, which is why I chose to introduce it first. If you can understand how to take multiple mapReduce calls and sequence them together, you'll understand transducers.
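
To make that concrete, here is a tiny sketch – it reuses mapReduce and concat from the snippet above, while add1 and double are hypothetical transforms of my own; note that the outermost transform runs first:

// two mapReduce transforms fused into one reducing step – a single pass
const add1 = mapReduce (x => x + 1)
const double = mapReduce (x => x * 2)

// add1 wraps double's step: each element is incremented, then doubled
const step = add1 (double (concat))

console.log ([1, 2, 3].reduce (step, []))
// => [ 4, 6, 8 ]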

Of course you can implement transducers in any number of ways, but I found Brian's approach the most useful as it encodes transducers as a monoid – having a monoid allows us to make all sorts of convenient assumptions about it. And once we transduce an Array (one type of monoid), you might wonder how you can transduce any other monoid... in such a case, get reading that article!
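
As a teaser, a hedged sketch of what that generalization could look like – the Array-specific [] seed and concat get abstracted into any monoid's empty and concat (transduceM and sumFiles are illustrative names of mine, not from the article):

// generalize transduce over any monoid by supplying its empty and concat
const transduceM = (empty, concat) => (...ts) => xs =>
  xs.reduce (ts.reduce (tcomp, k => k) (concat), empty)

// Sum monoid: empty is 0, concat is addition
const sumFiles = transduceM (0, (x, y) => x + y) (
  filterReduce (x => x.file),
  mapReduce (x => x.file))

console.log (sumFiles (data)) // => 3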


3 Comments

Looks fun, will investigate :)
Brian talks about contravariant functors and combines them with boolean monoids under conjunction to create composable predicates. You actually finished his blog post by implementing a monoidal transducer – I am not sure if your composition is contravariant, though. Anyway, brilliant work (if one likes method chaining – which I do not :D). I'll try to simplify it in the coming days, if such an undertaking is possible at all.
This answer boosted my Javascript IQ by 30 or more points. Wow!
9

If you really need to do it in one function call, you'll need to use reduce, like this:

results.reduce(
  // add the file name to the accumulator if it exists
  (acc, result) => result.file ? acc.concat([result.file]) : acc,
  // pass an empty array as the initial accumulator value
  []
)

And if you need to squeeze out more performance, you can change concat to push and return the original accumulator array to avoid creating extra arrays.
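
For instance, a sketch of that push-based variant (same result, fewer intermediate arrays):

const files = results.reduce((acc, result) => {
  // mutate the accumulator instead of allocating a new array on every step
  if (result.file) acc.push(result.file)
  return acc
}, [])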

However, the fastest solution is probably a good old for loop, which avoids all the function calls and stack frames:

const files = []
for (var i = 0; i < results.length; i++) {
  var file = results[i].file
  if (file) files.push(file)
}

But I think the filter/map approach is much more expressive and readable.

Comments

3

To increase performance, you have to measure which solution is actually faster. Let's play for a moment: https://jsperf.com/filter-than-map-or-reduce/1

Any other test cases are welcome.

[screenshot: jsperf benchmark results]

If you want to run the benchmark under Node.js (remember to npm i benchmark):

var suite = new (require('benchmark')).Suite

function getSampleInput() {
  return [
    {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'},
    {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'},
    {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'},
    {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'},
    {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'},
    {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'},
    {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'},
    {file: 'foo'}, {other: 'bar'}, {file: 'baz'}, {file: 'quux'}, {other: 'quuxdoo'}, {file: 'foobar'}
  ]
}

// author https://stackoverflow.com/users/3716153/gaafar
function reduce(results) {
  return results.reduce(
    (acc, result) => result.file ? acc.concat([result.file]) : acc,
    []
  )
}

// author https://stackoverflow.com/users/1223975/alexander-mills
function filterThanMap(results) {
  return results.filter(function (r) { return r.file; })
    .map(function (r) { return r.file; });
}

// author https://stackoverflow.com/users/5361130/ponury-kostek
function forEach(results) {
  const files = [];
  results.forEach(function (r) {
    if (r.file) files.push(r.file);
  });
  return files
}

suite
  .add('filterThanMap', function () { filterThanMap(getSampleInput()) })
  .add('reduce', function () { reduce(getSampleInput()) })
  .add('forEach', function () { forEach(getSampleInput()) })
  .on('complete', function () {
    console.log('results:')
    this.forEach(function (result) {
      console.log(result.name, result.count, result.times.elapsed)
    })
    console.log('the fastest is', this.filter('fastest').map('name')[0])
  })
  .run()

4 Comments

Can you add forEach to comparison?
@ponury-kostek here you are: jsperf.com/filter-than-map-or-reduce/1 – it looks like forEach is the fastest in that challenge
Thx. I know, that's why I'm surprised they try to do this in some strange ways.
I'm guessing the performance issue with the reduce option isn't really in the fact that it uses reduce, it's in that it uses concat... I often end up using (acc.push(x), acc) instead of acc.concat(x). It mutates the acc array, but since the array is created inside your function, it shouldn't be too big of a problem.
2

Why not just forEach?

const files = [];
results.forEach(function (r) {
  if (r.file) {
    files.push(r.file);
  }
});

If this is not fast enough, you can use fast.js and make some other micro-optimizations:

const files = [];
const length = results.length;
for (var i = 0; i < length; i++) {
  if (results[i].file) {
    files[files.length] = results[i].file;
  }
}

Comments

1

You could concat either the value of o.file or, when it is missing, an empty array (which contributes nothing) to the result.

results.reduce((r, o) => r.concat(o.file || []), []); 

1 Comment

that is a lot of concatenations
0

You can use Array.prototype.reduce()

const results = [{file: {file: 1}}, {notfile: {file: 1}}];

const files = results.reduce(function (arr, r) {
  // keep r.file.file when r.file exists, otherwise pass the accumulator through
  return r.file ? [...arr, r.file.file] : arr;
}, []);

console.log(files); // [1]

Comments

0

Arrays are iterable objects, and we can apply all needed operations in just one iteration.

The example below performs such a single iteration, using the iter-ops library:

import {pipe, filter, map} from 'iter-ops';

const i = pipe(
  results,
  filter(r => !!r.file),
  map(m => m.file)
);

console.log('files:', [...i]);

Comments

0

Not faster and not in one method call, but you could avoid the repetition of r.file by swapping the map and filter calls, so that the filter only needs to check for a truthy value, for which you can use Boolean:

const files = results.map(r => r.file).filter(Boolean); 

To avoid an intermediate array (only useful when you have huge arrays and need to save on space), you could use iterator helper methods (introduced in ECMAScript 2025):

const files = results.values().map(r => r.file).filter(Boolean).toArray(); 

Here, map and filter are iterator helper methods.
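
One consequence worth noting: iterator helpers are lazy, so elements are pulled one at a time and you can stop early without visiting the rest of the array. A small sketch (take is also an iterator helper method):

// lazily pull only the first two matches; later elements are never visited
const firstTwo = results
  .values()
  .map(r => r.file)
  .filter(Boolean)
  .take(2)
  .toArray();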

Comments

-2
const file = (array) => {
  return array.reduce((acc, curr) => curr.file ? acc.concat(curr.file) : acc, [])
}

Process:

acc is initialized as [] (an empty array); see the reduce docs.
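
For illustration, a hypothetical trace of the accumulator over a small sample input:

const array = [{file: 1}, {}, {file: 2}]

// step 1: acc = [],  curr = {file: 1} → acc.concat(1) → [1]
// step 2: acc = [1], curr = {}        → curr.file is falsy, acc unchanged → [1]
// step 3: acc = [1], curr = {file: 2} → acc.concat(2) → [1, 2]
console.log(file(array)) // => [ 1, 2 ]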

4 Comments

reduce is the most expensive option here, not the cheapest one – as jsperf.com/filter-than-map-or-reduce/1 proves
Comparing filter-then-map to reduce, at least in the screenshot provided in your answer, @KrzysztofSafjanowski, shows a very minor difference between the two.
@KrzysztofSafjanowski wow, my apologies – reduce is more expensive. Thanks for mentioning me; I was thinking reduce was cheaper.
@ajilantang wait a couple more V8 iterations and it probably won't be. Logically, reduce should be cheaper as you're only iterating the array once. I would guess that one or both of two things is happening: the JIT is optimizing away one of the passes in the map/filter case and/or is optimizing away the creation of the intermediate data structures and generating less garbage.
