I've condensed a binary test into this function that I use:
function getStorageTotalSize(upperLimit/*in bytes*/) { var store = localStorage, testkey = "$_test"; // (NOTE: Test key is part of the storage!!! It should also be an even number of characters) var test = function (_size) { try { store.removeItem(testkey); store.setItem(testkey, new Array(_size + 1).join('0')); } catch (_ex) { return false; } return true; } var backup = {}; for (var i = 0, n = store.length; i < n; ++i) backup[store.key(i)] = store.getItem(store.key(i)); store.clear(); // (you could iterate over the items and backup first then restore later) var low = 0, high = 1, _upperLimit = (upperLimit || 1024 * 1024 * 1024) / 2, upperTest = true; while ((upperTest = test(high)) && high < _upperLimit) { low = high; high *= 2; } if (!upperTest) { var half = ~~((high - low + 1) / 2); // (~~ is a faster Math.floor()) high -= half; while (half > 0) high += (half = ~~(half / 2)) * (test(high) ? 1 : -1); high = testkey.length + high; } if (high > _upperLimit) high = _upperLimit; store.removeItem(testkey); for (var p in backup) store.setItem(p, backup[p]); return high * 2; // (*2 because of Unicode storage) }
It also backs up the contents before testing, then restores them.
How it works: It doubles the size until the limit is reached or the test fails. It then stores half the distance between low and high and subtracts/adds a half of the half each time (subtract on failure and add on success); honing into the proper value.
upperLimit is 1GB by default, and just limits how far upwards to scan exponentially before starting the binary search. I doubt this will even need to be changed, but I'm always thinking ahead. ;)
On Chrome:
> getStorageTotalSize(); > 10485762 > 10485762/2 > 5242881 > localStorage.setItem("a", new Array(5242880).join("0")) // works > localStorage.setItem("a", new Array(5242881).join("0")) // fails ('a' takes one spot [2 bytes])
IE11, Edge, and FireFox also report the same max size (10485762 bytes).