
Using stim, I created a circuit of distance-3, 2-round rotated surface code focusing on Z-memory:

surface_code_circuit_z = stim.Circuit.generated(
    "surface_code:rotated_memory_z",
    rounds=2,
    distance=3,
    before_round_data_depolarization=0.3,
    before_measure_flip_probability=0.01)

In the case of rounds=1, the edge corresponding to a data-qubit depolarization is error(0.2) D0 D1, meaning that with probability 2/3 the depolarization is an X or Y error (2/3 × 0.3 = 0.2), triggering the detectors.
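As a quick sanity check (plain Python, no stim needed; the 0.3 comes from before_round_data_depolarization above):

```python
# Probability that one round of depolarization (strength p) flips a
# Z-type detector pair: only X and Y anticommute with Z, so 2 of the
# 3 equally likely Pauli errors trigger the edge.
p = 0.3  # before_round_data_depolarization
print(round(2 / 3 * p, 6))  # 0.2, matching error(0.2) D0 D1
```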

However, for the case of rounds=2, this is my detector error model:

error(0.2) D0
error(0.2) D0 D1
error(0.01) D0 D8
error(0.2) D1 D2
error(0.01) D1 D5
error(0.32) D1 L0
error(0.32) D2
error(0.2) D2 D3
error(0.01) D2 D10
error(0.01) D3 D7
error(0.2) D3 L0
error(0.128039) D4
error(0.112702) D4 D6
error(0.112702) D4 D6 ^ D5 L0
error(0.112702) D4 ^ D5 L0
error(0.112702) D5 D8
error(0.112702) D5 D8 ^ D9
error(0.112702) D5 D10
error(0.112702) D5 D10 ^ D6 D9
error(0.01) D5 D13
error(0.2) D5 L0
error(0.21188) D6
error(0.112702) D6 D9
error(0.112702) D6 ^ D7 L0
error(0.112702) D7 D10
error(0.112702) D7 D10 ^ D6
error(0.01) D7 D15
error(0.112702) D7 L0
error(0.112702) D8
error(0.01) D8 D12
error(0.21188) D9
error(0.112702) D9 D11
error(0.112702) D9 D11 ^ D10
error(0.112702) D9 ^ D8
error(0.2) D10
error(0.01) D10 D14
error(0.128039) D11
error(0.112702) D11 ^ D10
error(0.01) D12
error(0.01) D12 D13
error(0.01) D13 D14
error(0.0198) D13 L0
error(0.0198) D14
error(0.01) D14 D15
error(0.01) D15 L0
detector(0, 4, 0) D0
detector(2, 2, 0) D1
detector(4, 4, 0) D2
detector(6, 2, 0) D3
shift_detectors(0, 0, 1) 0
detector(2, 0, 0) D4
detector(2, 2, 0) D5
detector(4, 2, 0) D6
detector(6, 2, 0) D7
detector(0, 4, 0) D8
detector(2, 4, 0) D9
detector(4, 4, 0) D10
detector(4, 6, 0) D11
detector(0, 4, 1) D12
detector(2, 2, 1) D13
detector(4, 4, 1) D14
detector(6, 2, 1) D15
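Several of the probabilities in this model appear to be compositions of independent mechanisms that flip the same detector set: two independent errors with probabilities p and q produce a net flip with probability p(1−q) + q(1−p). A small sketch (plain Python; which DEM line each value corresponds to is my reading of the model above):

```python
# Two independent error mechanisms that flip the same detector set
# combine into one DEM error whose probability is the chance that
# exactly one of the two fires.
def xor_combine(p, q):
    return p * (1 - q) + q * (1 - p)

print(round(xor_combine(0.2, 0.2), 6))     # 0.32     e.g. error(0.32) D2
print(round(xor_combine(0.01, 0.01), 6))   # 0.0198   e.g. error(0.0198) D14
print(round(xor_combine(0.2, 0.0198), 6))  # 0.21188  e.g. error(0.21188) D6
print(round(xor_combine(0.1127016654, 0.0198), 6))  # 0.128039, e.g. error(0.128039) D4
```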

The first round contains only detectors for Z-measurements, which makes sense to me. However, for later rounds, detectors for X-measurements also appear.

First, it seems to me that the X and Z matching graphs are two separate matching graphs (the logical observables are on the Z graph in this case), and including the X part does not help with simulating matchings on the Z graph. Are the two graphs connected when we also introduce after_clifford_depolarization?

Second, error(0.112702) D4 D6 ^ D5 L0 should correspond to a Y error, triggering both the X detectors D4 D6 and the Z detectors D5 L0. I would naively expect this error probability to be 0.1. How do we get 0.112702? This value also appears elsewhere, describing depolarization in the second round. Why is that?

Thanks!


1 Answer


including the X part does not help with the simulation of the matchings on the Z graph

This is false. Y errors leave detection events in both the X and Z subgraphs, meaning detection events in the X subgraph give hints about the locations of error chains in the Z subgraph. Decoders that use this correlated information substantially outperform decoders that don't.

There are also edges between the X and Z subgraphs at the corners of surface codes (though Stim tries to decompose those edges into boundary edges). And many operations, such as logical S gates and logical H gates, will interweave the X and Z subgraphs.

Splitting the graph into an X part and a Z part is a simplification used when teaching how to decode surface codes, but this simplification is quite costly in terms of accuracy and flexibility. You don't actually want to do that in practice.

Also, when benchmarking the speed of a decoder, it is absolutely cheating to only decode one of the subgraphs. That works for memory experiments, but wouldn't work in actual computations where the logical qubits are always highly entangled states, so it amounts to benchmarking the wrong thing.

Don't throw away half the problem.

  • Thank you for your answer! Could you please also explain why error(0.112702) D4 D6 ^ D5 L0, which should correspond to a Y error, has this error probability? I am just very curious about how this value is calculated and related to before_round_data_depolarization=0.3. Thanks! Commented Dec 31, 2024 at 6:14
  • @KL_x Because of the conversion from disjoint errors to independent errors, the 0.3/3 = 0.1 gets slightly perturbed upward to 0.112. Read algassert.com/post/2001 Commented Dec 31, 2024 at 7:36
  • Got it. Thank you! Commented Jan 4 at 18:31
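The disjoint-to-independent conversion mentioned in the comments can be made concrete. If X, Y, and Z each fire independently with probability q, the net effect is (say) X when X fires alone or when Y and Z both fire, which happens with probability q(1−q)² + q²(1−q) = q(1−q). Equating this to the disjoint per-Pauli probability p/3 gives a quadratic with a closed-form solution (a sketch of that calculation; see the linked post for the general treatment):

```python
import math

def independent_pauli_prob(p):
    # Each of X, Y, Z fires independently with probability q. The net
    # effect is X when X fires alone or when Y and Z both fire:
    #   q*(1-q)**2 + q**2*(1-q) = q*(1-q).
    # Setting q*(1-q) = p/3 and solving the quadratic gives:
    return 0.5 - 0.5 * math.sqrt(1 - 4 * p / 3)

q = independent_pauli_prob(0.3)  # before_round_data_depolarization
print(round(q, 6))  # 0.112702, matching error(0.112702) D4 D6 ^ D5 L0
```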
