
It is known that BPL $\subseteq$ P, where BPL is the class of languages decided by a logarithmic-space probabilistic Turing machine that answers correctly with probability $\ge \frac{2}{3}$ and always halts, irrespective of the random bits. The inclusion follows by building the transition matrix over configurations and raising it to a suitable polynomial power.
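To make the matrix-powering argument concrete, here is a minimal Python sketch; the tiny three-state chain and all names (`acceptance_probability`, etc.) are illustrative, not part of any standard construction:

```python
from fractions import Fraction

def acceptance_probability(transitions, start, accept, steps):
    """Probability of being in `accept` after `steps` steps of the chain.

    transitions[i][j] = Pr[move from configuration i to configuration j].
    """
    n = len(transitions)
    # dist[j] = probability of currently being in configuration j;
    # we start deterministically in `start`.
    dist = [Fraction(0)] * n
    dist[start] = Fraction(1)
    for _ in range(steps):
        dist = [sum(dist[i] * transitions[i][j] for i in range(n))
                for j in range(n)]
    return dist[accept]

# Toy 3-state chain: from state 0, a fair coin moves to accept (1) or reject (2).
h = Fraction(1, 2)
T = [[0, h, h],
     [0, 1, 0],   # accept is absorbing
     [0, 0, 1]]   # reject is absorbing
print(acceptance_probability(T, start=0, accept=1, steps=10))  # 1/2
```

Repeatedly applying the transition matrix for polynomially many steps is exactly what the polynomial matrix power computes, and with a matrix of rationals the arithmetic stays exact.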

Is the same known for the version of the class where we instead only require halting with probability $1$? Equivalently, given a Markov chain $M$ in which every vertex has two outgoing edges, each taken with probability $\frac{1}{2}$, except for accept and reject, which have self-loops with probability $1$: can we approximate the probability of going from the start vertex (a vertex with no incoming edges) to accept, in polynomial time?


1 Answer


The following is an incomplete answer.

The challenge in relaxing the "always halts" condition to "halts almost surely" is that the graph of configurations is now allowed to have cycles. A consequence is that after polynomially many steps, the probability of having halted may be exponentially small. For example, consider a counter that, with probability $\frac{1}{2}$ each, either increments by one or resets to zero. That counter reaches $N$ within $N$ steps with probability $\frac{1}{2^N}$, since it must increment on every one of the $N$ flips.
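The counter example is easy to check both exactly and empirically; a small Python sketch (the function names and the Monte Carlo check are illustrative):

```python
from fractions import Fraction
import random

def prob_reach_n_in_n_steps(n):
    # The counter reaches n within n steps only if all n coin flips
    # increment (a single reset makes it impossible), so the
    # probability is exactly (1/2)^n.
    return Fraction(1, 2) ** n

def simulate(n, trials=100_000, seed=0):
    # Empirical check: fraction of trials where the counter hits n in n steps.
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        counter = 0
        for _ in range(n):
            counter = counter + 1 if rng.random() < 0.5 else 0
        hits += counter == n
    return hits / trials

print(prob_reach_n_in_n_steps(5))  # 1/32
```

For $n = 5$ the simulated frequency lands near $\frac{1}{32} \approx 0.031$, matching the exact value.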

Note that this is essentially the worst case: if there are $c$ reachable configurations, you reach a halting state within $c$ steps with probability at least $\frac{1}{2^c}$. Indeed, since the machine halts almost surely, from every reachable configuration some halting configuration is reachable by a path of at most $c$ steps, and each step of that path is taken with probability $\frac{1}{2}$.

Consequently, if we want to approximate the accept vs. reject probabilities to a reasonable degree, it suffices to raise the transition matrix $M$ to an exponentially large power $2^{\mathcal{O}(c)}$, where $c$ is the number of configurations. This takes only $\mathcal{O}(c)$ matrix multiplications thanks to fast exponentiation (repeated squaring). The remaining obstacle is that the size of the scalars grows exponentially.
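A sketch of the repeated-squaring step, on an illustrative two-state chain with exact rationals (all names here are my own, not from the question):

```python
from fractions import Fraction

def mat_mul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(M, k):
    """Compute M^k with O(log k) multiplications by repeated squaring,
    so an exponent of 2**O(c) costs only O(c) multiplications."""
    n = len(M)
    result = [[Fraction(i == j) for j in range(n)] for i in range(n)]  # identity
    while k:
        if k & 1:
            result = mat_mul(result, M)
        M = mat_mul(M, M)
        k >>= 1
    return result

# Chain with a cycle: state 0 stays put with prob 1/2 or moves to
# the absorbing state 1.
h = Fraction(1, 2)
M = [[h, h],
     [0, 1]]
P = mat_pow(M, 2**20)
print(float(P[0][1]))  # probability of absorption after 2^20 steps: ~1.0
```

Only $20$ squarings are needed for the exponent $2^{20}$, but note that the exact entries of $P$ already have denominators of size $2^{2^{20}}$, which is exactly the scalar-blowup obstacle mentioned above.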

That's where I don't know how to conclude, but one idea is that maybe we don't need to compute the probability to exact precision. If we can show that truncating the entries to a polynomial number of bits doesn't lose too much precision, then we can compute an approximation of $M^{2^{\mathcal{O}(c)}}$ in polynomial time.

  • Thanks for the answer, makes sense. Since the configuration graph is now an absorbing Markov chain with only two absorbing states, accept and reject, I was looking at the wiki here: en.wikipedia.org/wiki/…. Per the wiki, calculating the probability of reaching an absorbing state amounts to inverting a matrix derived from the transition matrix. That matrix would have entries $0$, $\pm 1$, and $\pm\frac{1}{2}$. I was hoping this could be done by some version of Gaussian elimination, but I am not sure as of now. Commented Aug 25 at 21:15
  • Ah yes, that is much simpler. I think Gaussian elimination just works, and it also makes it much easier to bound the error from fixed-point arithmetic. Commented Aug 25 at 21:36
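Following the comments' suggestion, the absorption probabilities of such a chain can be computed by solving the linear system $(I - Q)B = R$ with Gaussian elimination over exact rationals. A minimal sketch in Python; the four-state toy chain and the function name are illustrative:

```python
from fractions import Fraction

def absorption_probabilities(Q, R):
    """Solve (I - Q) B = R by Gaussian elimination over exact rationals.

    Q: one-step probabilities among the transient states.
    R: one-step probabilities from transient into absorbing states
       (here: accept and reject). Row i of the result gives
       Pr[absorbed in each absorbing state | start in transient state i].
    """
    n = len(Q)
    # Augmented matrix [I - Q | R].
    A = [[Fraction(i == j) - Q[i][j] for j in range(n)] + list(R[i])
         for i in range(n)]
    for col in range(n):
        # Find a row with a nonzero pivot in this column and swap it up.
        pivot = next(r for r in range(col, n) if A[r][col] != 0)
        A[col], A[pivot] = A[pivot], A[col]
        inv = Fraction(1) / A[col][col]
        A[col] = [x * inv for x in A[col]]
        # Eliminate the column from all other rows.
        for r in range(n):
            if r != col and A[r][col] != 0:
                factor = A[r][col]
                A[r] = [a - factor * b for a, b in zip(A[r], A[col])]
    return [row[n:] for row in A]

# Toy chain with a cycle between two transient states.
h = Fraction(1, 2)
Q = [[0, h],    # state 0 -> state 1 with prob 1/2
     [h, 0]]    # state 1 -> state 0 with prob 1/2
R = [[h, 0],    # state 0 -> accept with prob 1/2
     [0, h]]    # state 1 -> reject with prob 1/2
B = absorption_probabilities(Q, R)
print(B[0])  # [Pr[accept], Pr[reject]] from state 0: [2/3, 1/3]
```

On this toy chain, $p = \frac{1}{2} + \frac{1}{2}\cdot\frac{p}{2}$ gives acceptance probability $p = \frac{2}{3}$ from the start state, matching the output; since the entries of $I - Q$ are only $0$, $\pm 1$, $\pm\frac{1}{2}$, the intermediate rationals stay polynomially sized.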
