Table 2 Performance (in seconds) of the algorithm of Theorem 2.27 as implemented in our Mathematica code [38].

From: Optimal two-qubit circuits for universal fault-tolerant quantum computation

CS-count    Mean time (s)    Std. Dev. (s)
10          0.0138           0.0044
100         0.0281           0.0051
1000        0.1135           0.0091
10,000      1.1883           0.0897

  1. Each run has constant overhead from computing the SO(6) representation of each unitary. Deviations from linearity are due to arithmetic operations on increasingly large integers. Each mean and standard deviation is computed from a sample of 1000 runs with pseudorandomly generated operators known to have the given minimal CS-count. Times are measured using Mathematica's built-in AbsoluteTiming function. Computations were performed on a laptop with a 6-core Intel(R) Core(TM) i7 CPU running at 2.6 GHz with 16 GB of RAM, running macOS Catalina version 10.15.7.
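The benchmarking protocol described above (timing many repeated runs and reporting the sample mean and standard deviation) can be sketched as follows. This is a minimal illustration in Python rather than the authors' Mathematica code; `dummy_algorithm` is a hypothetical placeholder standing in for the decomposition algorithm of Theorem 2.27, and `time.perf_counter` plays the role of Mathematica's AbsoluteTiming.

```python
import statistics
import time

def benchmark(fn, runs=1000):
    """Time fn over `runs` repetitions; return (mean, stdev) in seconds.

    Mirrors the protocol in the table footnote: each sample is one
    timed run, and the mean and standard deviation are reported.
    """
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()  # one run of the algorithm under test
        times.append(time.perf_counter() - start)
    return statistics.mean(times), statistics.stdev(times)

# Hypothetical stand-in for the exact-synthesis algorithm being timed.
def dummy_algorithm():
    sum(i * i for i in range(1000))

mean_t, std_t = benchmark(dummy_algorithm, runs=100)
```

In practice the number of runs (1000 in the paper) trades benchmarking time against the variance of the reported mean.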