I have a few foreach loops, and at the end of each one I call
set_time_limit(30), which resets the timer back to zero.
The script runs for a long time (I fetch 5000-10000 articles through an API and store them in the DB), but after a while (once it has already processed a lot of data) I get "Maximum execution time of 30 seconds exceeded".
Could it be caused by a lack of memory? How can I tackle this problem?
The script fetches articles through an API, and a foreach loop is used basically like this:

    foreach ($articles as $article) {
        // do stuff with a single article using $article
        set_time_limit(30);
    }

I don't expect it to need more than 30 seconds to fetch and process a single article, yet apparently, after the script has been running for some time, it hits that limit. What am I doing wrong? I don't want to let a 5-second job run for up to 30 seconds by calling set_time_limit(9000) or something like that; that would probably get the job done, but I suppose it is not a good way to solve the issue.
set_time_limit() accounts for the entire PHP script, not just your loop, so it should not sit inside the loop. Ask yourself where your set_time_limit(30) actually applies: each call resets the timer only at the moment it executes. If the query or API call that fetches the articles takes longer than 30 seconds before the first results come back, the call at the end of the iteration is never reached, and your code won't add any time to the total execution.
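A minimal sketch of the idea above: reset the limit at the *start* of each iteration, before the expensive work, and measure how long each step really takes so you can see which one is eating the budget. Here fetchArticle() is a hypothetical stand-in for the real API call, and the five IDs stand in for the real article list. (Also note that per the PHP manual, on non-Windows systems time spent waiting on streams or database queries does not count toward the limit, so a timeout usually points at script-side work.)

```php
<?php
// Hypothetical placeholder for the real API client call.
function fetchArticle(int $id): array {
    return ['id' => $id, 'title' => "Article $id"];
}

$ids = range(1, 5);   // stand-in for the 5000-10000 real article IDs
$slowest = 0.0;

foreach ($ids as $id) {
    // Reset the 30-second budget BEFORE the work, not after it:
    // if the fetch below overruns, a reset placed at the end of the
    // loop body would never execute.
    set_time_limit(30);
    $start = microtime(true);

    $article = fetchArticle($id);   // the step most likely to overrun
    // ... store $article in the DB here ...

    $elapsed = microtime(true) - $start;
    $slowest = max($slowest, $elapsed);
}

printf("slowest iteration: %.4fs\n", $slowest);
```

Logging the slowest iteration tells you whether any single article genuinely needs more than 30 seconds, or whether the timer was simply never being reset in time.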