I am trying to create a program that will run for X minutes.
`minutes` was always set to 1 during testing.
```
var minutes = $('#minutes').val();
var runtime = minutes * 60;            // convert minutes to seconds
var secondsEpoch = new Date() / 1000;  // current epoch time in seconds
var End = secondsEpoch + runtime;      // add the minutes to the current epoch

if (secondsEpoch < End) {
    window.setInterval(RunClock, 1000 / 10);
} else {
    clearInterval(RunClock);
}

function RunClock() {
    console.log(new Date() / 1000);
    // my code
}
```

The script runs forever and I'm confused as to why.
When alerting the variables `secondsEpoch` and `End`, I always see a time difference of exactly 1 minute.
I alerted the start and finish times and got:

    Start:  1395022190.621
    Finish: 1395022250.621

a difference of 60, which equals 1 minute. But the console log at that point shows 1395022456.657, which is obviously greater than 1395022250.621, and the script is still running and not stopping.
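For reference, here is a minimal sketch of one way the clock could be made to stop. It assumes two things about the snippet above: the `if (secondsEpoch < End)` comparison runs only once, before the interval starts (so nothing ever re-checks the time), and `clearInterval` needs the ID returned by `setInterval`, not the callback function. The `shouldStop` helper and the tiny `minutes` value are mine, for illustration; in the real page `minutes` would come from `$('#minutes').val()`.

```javascript
// Tiny value so this sketch finishes in ~0.1 s; in the page this
// would be Number($('#minutes').val()) from the asker's input field.
var minutes = 0.002;
var endTime = Date.now() / 1000 + minutes * 60; // epoch seconds when we stop

// Pure helper so the stop condition is easy to reason about (and test).
function shouldStop(nowSeconds, endSeconds) {
    return nowSeconds >= endSeconds;
}

// Keep the ID returned by setInterval: clearInterval needs this ID,
// not the callback function itself.
var timerId = setInterval(function runClock() {
    var now = Date.now() / 1000;
    if (shouldStop(now, endTime)) {
        clearInterval(timerId); // check happens on EVERY tick, so this fires
        return;
    }
    console.log(now); // per-tick work goes here
}, 1000 / 10); // every 100 ms, as in the original
```

The key design point is that the time comparison lives inside the callback, so it is evaluated on every tick rather than once at setup.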