The pygsti package provides multiple levels of abstraction over the core Gate Set Tomography (GST) algorithms. This initial tutorial shows you how to run GST on some simulated (generated) data, hopefully giving you an overall sense of what it takes (and how easy it is!) to run GST. Subsequent tutorials delve into the details of pygsti objects and algorithms, and how to use them.
To run GST, we need three inputs:

1. A "target gate set" describing the desired, or ideal, operations we want our experimental hardware to perform. In the example below we use one of pyGSTi's "standard" gate sets, std1Q_XYI, which acts on a single qubit with an idle operation and X(π/2) and Y(π/2) rotations.
2. A list of GST sequences corresponding to the target gate set; essentially a list of which experiments (i.e. gate sequences) we need to run. Using a standard gate set makes this especially straightforward, since the building blocks needed to make good GST sequences, called germ and fiducial sequences, have already been computed.
3. Data, in the form of experimental outcome counts, for each of the required sequences. In this example we'll generate "fake" or "simulated" data from a depolarized version of our ideal gate set.
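In step 3 below we'll depolarize the target gate set. As a rough picture of what that does, here is a hand-rolled sketch (plain Python lists, not pyGSTi's internal representation or its exact noise convention) of a depolarizing map acting on a single-qubit gate's Pauli transfer matrix:

```python
# Pauli transfer matrix (basis I, X, Y, Z) of an ideal X(pi/2) gate:
# it maps Y -> Z and Z -> -Y while leaving I and X alone.
Gx_ideal = [[1.0, 0.0, 0.0,  0.0],
            [0.0, 1.0, 0.0,  0.0],
            [0.0, 0.0, 0.0, -1.0],
            [0.0, 0.0, 1.0,  0.0]]

def depolarize_gate(G, p):
    """Compose G with a single-qubit depolarizing map of strength p:
    the identity component is untouched, and every other component of
    the output shrinks by a factor of (1 - p)."""
    shrink = [1.0, 1.0 - p, 1.0 - p, 1.0 - p]
    return [[shrink[i] * G[i][j] for j in range(4)] for i in range(4)]

Gx_noisy = depolarize_gate(Gx_ideal, 0.1)  # rotation entries become +/-0.9
```

This shrinking of the non-identity part of every gate is the kind of noise the simulated data below will contain.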
#Make print statements compatible with Python 2 and 3
from __future__ import print_function
#Import the pygsti module (always do this) and the standard XYI gate set
import pygsti
from pygsti.construction import std1Q_XYI
# 1) get the target GateSet
gs_target = std1Q_XYI.gs_target
# 2) get the building blocks needed to specify which gate sequences are needed
prep_fiducials, meas_fiducials = std1Q_XYI.prepStrs, std1Q_XYI.effectStrs
germs = std1Q_XYI.germs
maxLengths = [1,2,4,8,16,32] # roughly gives the length of the sequences used by GST
# 3) generate "fake" data from a depolarized version of gs_target
gs_datagen = gs_target.depolarize(gate_noise=0.1, spam_noise=0.001)
listOfExperiments = pygsti.construction.make_lsgst_experiment_list(
gs_target, prep_fiducials, meas_fiducials, germs, maxLengths)
ds = pygsti.construction.generate_fake_data(gs_datagen, listOfExperiments, nSamples=1000,
sampleError="binomial", seed=1234)
#Note: from listOfExperiments we can also create an empty dataset file
# which has columns of zeros where actual data should go.
pygsti.io.write_empty_dataset("tutorial_files/GettingStartedDataTemplate.txt", listOfExperiments,
"## Columns = 1 count, count total")
# After replacing the zeros with actual data, the data set can be
# loaded back into pyGSTi using the line below and used in the rest
# of this tutorial.
#ds = pygsti.io.load_dataset("tutorial_files/GettingStartedDataTemplate.txt")
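For a feel of what make_lsgst_experiment_list produces: each long-sequence GST experiment sandwiches a repeated germ between a preparation fiducial and a measurement fiducial, with the germ repeated as many whole times as fit within the current maximum length. A simplified sketch with plain label lists (real pyGSTi sequences are GateString objects, and the actual routine supports truncation options this toy version ignores):

```python
def lsgst_sequence(prep_fid, germ, meas_fid, max_len):
    """Sketch of the long-sequence GST construction: repeat the germ
    as many whole times as fit within max_len, then sandwich the
    result between a preparation and a measurement fiducial."""
    reps = max_len // len(germ) if germ else 0
    return prep_fid + germ * reps + meas_fid

# e.g. with toy gate labels, a length-4 experiment on the "Gy" germ:
lsgst_sequence(["Gx"], ["Gy"], ["Gx", "Gx"], 4)
# -> ['Gx', 'Gy', 'Gy', 'Gy', 'Gy', 'Gx', 'Gx']
```

Iterating this over all fiducial pairs, germs, and maxLengths is what yields the 1702 sequences reported in the output below.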
Now that we have all of the inputs, we can run GST in a standard way using the do_stdpractice_gst high-level driver function. This returns a pygsti.report.Results object, from which we can generate a report giving us a summary of the analysis.
#Run GST and create a report
results = pygsti.do_stdpractice_gst(ds, gs_target, prep_fiducials, meas_fiducials,
germs, maxLengths, verbosity=4)
pygsti.report.create_standard_report(results, filename="tutorial_files/gettingStartedReport",
title="Tutorial0 Example Report", verbosity=2)
-- Std Practice: Iter 1 of 3 (TP) --:
  --- Gate Sequence Creation ---
  1702 sequences created
  Dataset has 1702 entries: 1702 utilized, 0 requested sequences were missing
  --- LGST ---
  Singular values of I_tilde (truncating to first 4 of 6) =
    4.244089943192679  1.1594632778409208  0.9651516670737965  0.9297628363691268
    0.049256811347238104  0.025150658372136828
  [... LGST starting-point gate set elided ...]
  --- Iterative MLGST: Iter 1 of 6; 92 gate strings ---
  Sum of Chi^2 = 95.7297 (91 data params - 31 model params = expected mean of 60; p-value = 0.0023058)
  --- Iterative MLGST: Iter 2 of 6; 168 gate strings ---
  Sum of Chi^2 = 162.192 (167 data params - 31 model params = expected mean of 136; p-value = 0.0622524)
  --- Iterative MLGST: Iter 3 of 6; 450 gate strings ---
  Sum of Chi^2 = 484.676 (449 data params - 31 model params = expected mean of 418; p-value = 0.0133296)
  --- Iterative MLGST: Iter 4 of 6; 862 gate strings ---
  Sum of Chi^2 = 895.303 (861 data params - 31 model params = expected mean of 830; p-value = 0.0571847)
  --- Iterative MLGST: Iter 5 of 6; 1282 gate strings ---
  Sum of Chi^2 = 1350.86 (1281 data params - 31 model params = expected mean of 1250; p-value = 0.0239316)
  --- Iterative MLGST: Iter 6 of 6; 1702 gate strings ---
  Sum of Chi^2 = 1800.55 (1701 data params - 31 model params = expected mean of 1670; p-value = 0.0134134)
  Switching to ML objective (last iteration)
  --- MLGST ---
  Maximum log(L) = 900.852 below upper bound of -2.84686e+06
  2*Delta(log(L)) = 1801.7 (1701 data params - 31 model params = expected mean of 1670; p-value = 0.0127718)
  Iterative MLGST Total Time: 4.8s
  --- Re-optimizing logl after robust data scaling ---
  -- Performing 'single' gauge optimization on TP estimate --
  -- Conveying 'single' gauge optimization to TP.Robust+ estimate --

-- Std Practice: Iter 2 of 3 (CPTP) --:
  [... analogous iterative MLGST output for the CPTP-constrained fit, ending with
  2*Delta(log(L)) = 1801.71 (p-value = 0.0127715); Iterative MLGST Total Time: 6.4s ...]
  -- Performing 'single' gauge optimization on CPTP estimate --
  -- Conveying 'single' gauge optimization to CPTP.Robust+ estimate --

-- Std Practice: Iter 3 of 3 (Target) --:
  -- Performing 'single' gauge optimization on Target estimate --

*** Creating workspace ***
*** Generating switchboard ***
Found standard clifford compilation from std1Q_XYI
*** Generating tables ***
*** Generating plots ***
[... per-iteration optimizer traces and per-table/plot timing lines elided ...]
*** Merging into template file ***
Output written to tutorial_files/gettingStartedReport directory
*** Report Generation Complete!  Total time 304.065s ***
<pygsti.report.workspace.Workspace at 0x10388fdd8>
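As a quick sanity check on the goodness-of-fit numbers in the log, we can convert the final 2*Delta(log(L)) value into a rough "number of sigmas" of model violation by hand (back-of-the-envelope arithmetic, not a pyGSTi call):

```python
from math import sqrt

# Goodness-of-fit figures from the final MLGST stage of the log above:
two_delta_logl = 1801.7  # 2*Delta(log(L))
k = 1670                 # expected mean (1701 data params - 31 model params)

# If the model fit the data perfectly, 2*Delta(log(L)) would be a chi^2_k
# variable with mean k and standard deviation sqrt(2k), so the model
# violation in "number of sigmas" is roughly:
n_sigma = (two_delta_logl - k) / sqrt(2 * k)
# about 2.3 sigma -- a mild violation, unsurprising for binomially
# sampled simulated data
```

The report you just generated presents this same kind of comparison graphically, per sequence length.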
You can now open the file tutorial_files/gettingStartedReport/main.html in your browser to view the report. That's it! You've just run GST!
The other tutorials in this directory will explain how to use the various objects and algorithms that comprise pyGSTi. These tutorial notebooks are meant to be fairly pedagogical and include details about the inner workings of and design choices within pyGSTi. In contrast, the "FAQ" directory contains notebooks which attempt to address specific questions as quickly and directly as possible, with little or no explanation of related topics or broader context.