6

tl;dr - if you have to filter the promises (say for errored ones) don't use async functions

I'm trying to fetch a list of urls with async functions and parse them. The problem is that if there's an error with one of the urls while fetching - let's say the api endpoint doesn't exist for some reason - the program crashes on the parsing step with the obvious error:

UnhandledPromiseRejectionWarning: Unhandled promise rejection (rejection id: 1): TypeError: ext is not iterable 

I've tried checking if the res.json() is undefined, but obviously that's not it as it complains about the entire 'ext' array of promises not being iterable.

async function fetchAll() {
  let data
  let ext
  try {
    data = await Promise.all(urls.map(url => fetch(url)))
  } catch (err) {
    console.log(err)
  }
  try {
    ext = await Promise.all(data.map(res => {
      if (res.json() ==! 'undefined') { return res.json() }
    }))
  } catch (err) {
    console.log(err)
  }
  for (let item of ext) {
    console.log(ext)
  }
}

Question 1:

How do I fix the above so it won't crash on an invalid address?

Question 2:

My next step is to write the extracted data to the database. Assuming a data size of 2-5 MB of content, is my approach of using Promise.all() memory efficient? Or would it be more memory efficient (or otherwise better) to write a for loop that handles each fetch, writes to the database in the same iteration, and only then handles the next fetch?
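Roughly what I mean by the loop approach - a minimal sketch where saveToDb is just a placeholder for my database write, not code I actually have:

// Sequential sketch: one fetch, one parse, one DB write per iteration.
// saveToDb is a hypothetical helper standing in for the actual database write.
async function fetchAndStoreSequentially(urls) {
  for (const url of urls) {
    const res = await fetch(url)     // fetch one url at a time
    const data = await res.json()    // parse it
    await saveToDb(data)             // write it before starting the next fetch
  }
}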

  • Your error is happening in the for loop Commented Apr 24, 2018 at 16:30
  • Clarification: you have if (res.json()==! 'undefined') which is not the same as !== undefined. Was this intentional? Commented Apr 24, 2018 at 16:31
  • if (res.json()==! 'undefined') makes no sense. Also you cannot call .json() twice on the same response Commented Apr 24, 2018 at 16:31
  • I've edited my suggestion, where I suggest you call res.json in your own fetch function Commented Apr 24, 2018 at 16:31
  • Clarification: You have for (let item of ext) in a spot where ext might be undefined. Is this intentional? Commented Apr 24, 2018 at 16:32

8 Answers

18

You have several problems with your code on a fundamental level. We should address those in order, and the first is that you're not passing in any URLs!

async function fetchAll(urls) {
  let data
  let ext
  try {
    data = await Promise.all(urls.map(url => fetch(url)))
  } catch (err) {
    console.log(err)
  }
  try {
    ext = await Promise.all(data.map(res => {
      if (res.json() ==! 'undefined') { return res.json() }
    }))
  } catch (err) {
    console.log(err)
  }
  for (let item of ext) {
    console.log(ext)
  }
}

First, you have several try catch blocks on DEPENDENT DATA. They should all be in a single try catch block:

async function fetchAll(urls) {
  try {
    let data = await Promise.all(urls.map(url => fetch(url)))
    let ext = await Promise.all(data.map(res => {
      // also fixed the ==! 'undefined'
      if (res.json() !== undefined) { return res.json() }
    }))
    for (let item of ext) {
      console.log(ext)
    }
  } catch (err) {
    console.log(err)
  }
}

Next is the problem that res.json() returns a promise wrapped around the parsed object, if one exists:

if (res.json() !== undefined) { return res.json()} 

This is not how you should be using the .json() method. It will fail if there is no parsable JSON, so you should be putting a .catch on it:

async function fetchAll(urls) {
  try {
    let data = await Promise.all(urls.map(url => fetch(url).catch(err => err)))
    let ext = await Promise.all(data.map(res => res.json ? res.json().catch(err => err) : res))
    for (let item of ext) {
      console.log(ext)
    }
  } catch (err) {
    console.log(err)
  }
}

Now when it cannot fetch a URL or parse the JSON, you'll get the error and it will cascade down without throwing. Your try catch block will now ONLY catch a different error if one happens.

Of course this means we're putting an error handler on each promise and cascading the error, but that's not exactly a bad thing, as it allows ALL of the fetches to happen and lets you distinguish which fetches failed. That's a lot better than just having a generic handler for all fetches and not knowing which one failed.

But now we have it in a form where we can see that there are some better optimizations that can be performed on the code:

async function fetchAll(urls) {
  try {
    let ext = await Promise.all(
      urls.map(url =>
        fetch(url)
          .then(r => r.json())
          .catch(error => ({ error, url }))
      )
    )
    for (let item of ext) {
      console.log(ext)
    }
  } catch (err) {
    console.log(err)
  }
}

Now, with a much smaller footprint, better error handling, and readable, maintainable code, we can decide what we eventually want to return. The function can live wherever, be reused, and all it takes is a single array of simple GET URLs.

The next step is to do something with the results, so we probably want to return the array, which will be wrapped in a promise. Realistically we want any remaining error to bubble, since we've already handled each fetch error, so we should also remove the try catch. At that point making the function async no longer helps, and actively harms. Eventually we get a small function that pairs each URL's resolution, or its error, with that URL, and that we can easily filter over, map over, and chain!

function fetchAll(urls) {
  return Promise.all(
    urls.map(url =>
      fetch(url)
        .then(r => r.json())
        .then(data => ({ data, url }))
        .catch(error => ({ error, url }))
    )
  )
}

Now we get back an array of similar objects, each with the url it fetched, and either data or an error field! This makes chaining and inspecting SUPER easy.
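For instance, consuming it could look something like this (just a sketch with made-up URLs, not part of the original answer):

// Usage sketch, assuming the fetchAll above and two made-up URLs.
fetchAll(['https://example.com/a.json', 'https://example.com/b.json'])
  .then(results => {
    const succeeded = results.filter(r => !r.error)  // objects shaped like { data, url }
    const failed = results.filter(r => r.error)      // objects shaped like { error, url }
    failed.forEach(r => console.log('failed:', r.url, r.error))
    succeeded.forEach(r => console.log('got data for:', r.url))
  })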


11 Comments

Would it not be much simpler to do: urls.map(fetch).then(r=>r.json()).catch(error=>new Fail([error,url]) You get an array of objects and can pick out the Failed items more easily with information about the error and the url.
Yes, but I wanted to address each fundamental flaw. Something came up at work before I was able to post that EXACT line of code, believe it or not...
I believe, been there.
Whoops, forgot to have a closure over the individual URL
Amazing all of that and it brought me back to using just promises which was what I was using to begin with. I'm starting to wonder if I should drop the async functions I wrote for the db write and just go with promises as well.
4

You are getting a TypeError: ext is not iterable - because ext is still undefined when you caught an error and did not assign an array to it. Trying to loop over it will then throw an exception that you do not catch.

I guess you're looking for

async function fetchAll() {
  try {
    const data = await Promise.all(urls.map(url => fetch(url)));
    const ext = await Promise.all(data.map(res => res.json()));
    for (let item of ext) {
      console.log(item);
    }
  } catch (err) {
    console.log(err);
  }
}

7 Comments

This was pretty close to what my eventual code came out to as well, but you're missing something: if any individual promise errors, they will now all error and you will not know which one did. By handling the errors in place instead, we can allow cascading for better handling in the long run
@RobertMennell I would expect the error message to contain the url of the failed request. If it doesn't, sure we can add them, but I tried to address the fundamental problem only.
Fair and I did misspeak. They won't all error, but they will all reject
Also it would depend on the step it rejects at... Man I hate Async functions... Where is my error bubbling?
@RobertMennell Errors bubble just like in normal functions, as if await was a throw.
0

Instead of fetch(url) on line 5, make your own function, customFetch, which calls fetch but maybe returns null, or an error object, instead of throwing.

something like

async function customFetch(url) {
  try {
    let result = await fetch(url);
    if (result.json) return await result.json();
  } catch (e) {
    return e;
  }
}
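Then you could use it with Promise.all and filter out the failures, something like this (just a sketch, assuming the customFetch above and an existing urls array):

// Usage sketch, assuming customFetch above and an urls array in scope.
async function fetchAllSafe(urls) {
  const results = await Promise.all(urls.map(customFetch));
  // Successes are parsed JSON; failures come back as Error objects
  // (or undefined when the response had no .json method), so drop those.
  return results.filter(r => r !== undefined && !(r instanceof Error));
}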

12 Comments

Fetch returns a promise, awaiting a fetch is fine especially for simple GETs, and especially if you want it to fail if any of them fail
I was suggesting this because I thought his problem was that fetch was throwing an error
It's not. The problem is in his for loop. He shouldn't be using a for loop unless he knows it's an iterable.
Promise.all errors whenever one of the internal promises errors, so whatever advice anyone gives you here, it will boil down to -- Make sure that you wrap the promises in an error handler before passing them to Promise.all
This example is a mutation of async handling and promises that can actually lead to more user error. We can save dozens of characters by just adding a .catch(err => err) to each fetch and allowing us to have a more compact and readable footprint since this is the code area that would ACTUALLY be where you want to handle the error
0

if (res.json()==! 'undefined')

Makes no sense whatsoever: res.json() is asynchronous and returns a promise, so that comparison doesn't do what you think. Remove the condition and just return res.json():

try {
  ext = await Promise.all(data.map(res => res.json()))
} catch (err) {
  console.log(err)
}

Whether or not your approach is "best" or "memory efficient" is up for debate. Ask another question for that.

Comments

0

You can have fetch and json not fail by catching the error and returning a special Fail object that you can filter out later:

function Fail(reason){this.reason=reason;};
const isFail = o => (o&&o.constructor)===Fail;
const isNotFail = o => !isFail(o);

const fetchAll = () =>
  Promise.all(
    urls.map(
      url =>
        fetch(url)
          .then(response => response.json())
          .catch(error => new Fail([url, error]))
    )
  );

//how to use:
fetchAll()
  .then(
    results => {
      const successes = results.filter(isNotFail);
      const fails = results.filter(isFail);
      fails.forEach(
        e => console.log(`failed url:${e.reason[0]}, error:`, e.reason[1])
      )
    }
  )

As for question 2:

Depending on how many urls you have, you may want to throttle your requests, and if the urls come from a large file (gigabytes) you can use a stream combined with the throttling.
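One simple way to throttle is to process the urls in fixed-size batches, roughly like this (a sketch only - the batch size is an arbitrary assumption, and Fail is the helper defined above):

// Rough batching sketch: only `batchSize` requests are in flight at once.
// batchSize is an arbitrary assumption; Fail comes from the snippet above.
async function fetchAllThrottled(urls, batchSize = 5) {
  const results = [];
  for (let i = 0; i < urls.length; i += batchSize) {
    const batch = urls.slice(i, i + batchSize);
    const settled = await Promise.all(
      batch.map(url =>
        fetch(url)
          .then(response => response.json())
          .catch(error => new Fail([url, error]))
      )
    );
    results.push(...settled);
  }
  return results;
}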

Comments

0
async function fetchAll(url) {
  return Promise.all(
    url.map(
      async (n) => fetch(n).then(r => r.json())
    )
  );
}

fetchAll([...])
  .then(d => console.log(d))
  .catch(e => console.error(e));

Will this work for you?

3 Comments

Not sure why your map function is async; it would return a promise anyway since fetch returns a promise. You're not catching the error, so all urls will fail if one fails, and calling fetchAll will only result in getting the one error, not all the successful ones.
AFAIK, there should be no harm in putting the async syntax there. It serves as an indication to me that this is an asynchronous function. The same applies to regular functions: I can know immediately whether something is async or not.
Yes but it still doesn't address the problem that the OP described. If urls has 100 values and the last one fails the function only goes into catch with the error and 99 successful requests are gone. Also the .then is a pretty good indication we're dealing with promises.
0

If you don't depend on every resource being a success, I would go back to basics and skip async/await.

I would process each fetch individually so I could catch the error for just the one that fails:

function fetchAll() {
  const result = []
  const que = urls.map(url =>
    fetch(url)
      .then(res => res.json())
      .then(item => {
        result.push(item)
      })
      .catch(err => {
        // couldn't fetch the resource or the
        // response was not a json response
      })
  )
  return Promise.all(que).then(() => result)
}

Something good @TKoL said:

Promise.all errors whenever one of the internal promises errors, so whatever advice anyone gives you here, it will boil down to -- Make sure that you wrap the promises in an error handler before passing them to Promise.all

Comments

-1

Regarding question 1, please refer to this:

Handling errors in Promise.all

Promise.all is all or nothing. It resolves once all promises in the array resolve, or rejects as soon as one of them rejects. In other words, it either resolves with an array of all the resolved values, or rejects with a single error.
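A small illustration of that behavior (a sketch, not taken from the linked answer):

// Promise.all rejects as soon as any promise rejects...
Promise.all([Promise.resolve(1), Promise.reject(new Error('boom')), Promise.resolve(3)])
  .then(values => console.log(values))
  .catch(err => console.log('rejected:', err.message)); // logs "rejected: boom"

// ...unless each promise handles its own rejection first.
Promise.all([
  Promise.resolve(1),
  Promise.reject(new Error('boom')).catch(error => ({ error })),
  Promise.resolve(3)
]).then(values => console.log(values)); // logs [1, { error: Error }, 3]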

1 Comment

The error is in the for loop. He's ACTUALLY handling it with the try catch in the async await. The problem is that he's calling a for loop on something that is possibly undefined
