You are correct: the bit hack `~i & (i + 1)` should evaluate to an integer that is all binary 0s, except for the bit corresponding to the rightmost zero bit of `i`, which is set to 1.
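For a concrete example (a minimal C demo; the value `0xB` is arbitrary):

```c
#include <stdio.h>

int main(void)
{
    unsigned i = 0xB;             /* binary 1011: rightmost zero bit is bit 2 */
    unsigned bit = ~i & (i + 1);  /* ~1011 = ...0100, 1011 + 1 = 1100, AND = 0100 */
    printf("%#x\n", bit);         /* prints 0x4 */
    return 0;
}
```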
So at the end of each pass of the for loop, it adds this value to `i`. Since the corresponding bit in `i` is zero, this has the effect of setting that bit without affecting any other bits in `i` (no carry occurs). This strictly increases the value of `i` at each pass, until `i` overflows (or reaches -1, if you started with `i < 0`; at that point the step becomes zero). In context, you can probably expect that it is called with `i >= 0`, and that the `i < n` condition terminates the loop before your index walks off the end of the array.
The overall function should have the effect of iterating through the zero bits of the original value of `i` from least to most significant, setting them one by one and incrementing the corresponding elements of the array, something like the sketch below.
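Here is a guess at the shape of the code you are describing (a C sketch; the function name, `count`, `n`, and the signature are my assumptions, not your original source):

```c
/* Starting from index i, repeatedly set the rightmost zero bit of i,
 * incrementing the array element at each index visited along the way. */
void update(int *count, int n, int i)
{
    for (; i < n; i += ~i & (i + 1))
        count[i]++;
}
```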
Fenwick trees (binary indexed trees) are a clever way to accumulate and query running statistics efficiently; as you say, their update loop looks a lot like this one, and typically uses a comparable bit hack. There are bound to be multiple ways to accomplish this kind of bit fiddling, so it is certainly possible that your source code is updating a Fenwick tree, or something comparable.
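For comparison, here is the textbook zero-indexed Fenwick update together with the matching prefix-sum query (a sketch; `fenwick_add` and `fenwick_prefix_sum` are names I have made up):

```c
/* tree[] holds partial sums over implicit ranges; n is its length. */
void fenwick_add(int *tree, int n, int i, int delta)
{
    for (; i < n; i |= i + 1)   /* i |= i + 1 sets the rightmost zero bit: */
        tree[i] += delta;       /* the same step as i += ~i & (i + 1)      */
}

int fenwick_prefix_sum(const int *tree, int i)   /* sum of elements 0..i */
{
    int sum = 0;
    for (; i >= 0; i = (i & (i + 1)) - 1)
        sum += tree[i];
    return sum;
}
```

Note that `i |= i + 1` and `i += ~i & (i + 1)` are exactly the same step: both set the rightmost zero bit.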
It can help to write `i` out in either hex or binary. `~i & (i + 1)` will be the lowest power of two that evenly divides `i + 1`. After `i += ~i & (i + 1)`, the new `i + 1` will be divisible by at least the next power of two (the step adds the lowest set bit of the old `i + 1`, which clears that bit and carries the 1 into the next higher position).
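A small trace makes the pattern visible (a C demo; the starting value `0xB` is arbitrary):

```c
#include <stdio.h>

int main(void)
{
    unsigned i = 0xB;   /* binary 1011 */
    for (int pass = 0; pass < 3; pass++) {
        printf("i = %#4x   i+1 = %#4x   step = %#4x\n", i, i + 1, ~i & (i + 1));
        i += ~i & (i + 1);
    }
    return 0;
}
/* Output:
 * i =  0xb   i+1 =  0xc   step =  0x4    -> i+1 divisible by 4
 * i =  0xf   i+1 = 0x10   step = 0x10    -> i+1 divisible by 16
 * i = 0x1f   i+1 = 0x20   step = 0x20    -> i+1 divisible by 32
 */
```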