SLIDE 1

Parameter Estimation for Quantum Information

Christopher Granade

www.cgranade.com • cgranade@cgranade.com

Joint work with:

Christopher Ferrie, Nathan Wiebe, and D. G. Cory

Institute for Quantum Computing, University of Waterloo, Ontario, Canada

June 18, 2013

LFQIS 2013, Denali Park

#lfqis • #qhl

SLIDE 2

1. Motivation and Applications: Overview; Nitrogen Vacancy Centers; Neutron Interferometry; Superconducting Systems
2. Theory of Parameter Estimation: Bayes' Rule; Decision Theory
3. Sequential Monte Carlo: SMC Algorithm; Performance
4. Going Quantum: Weak and Strong Simulation; Quantum Hamiltonian Learning
5. Conclusions

SLIDE 5

Characterizing unknown quantum systems is critical for the design of control. Enabling adaptive measurements allows for large reductions in data collection costs. We want accurate reporting of the errors incurred by an estimate, and of the smallest credible regions.

SLIDE 6

Online adaptive characterization of quantum systems can improve and enable experimental practice, including in nitrogen vacancy centers, neutron interferometers, and superconducting qubit circuits.

SLIDE 9

Counting Statistics

Nitrogen vacancy centers in diamond are measured by optical readout. The number of photons emitted by a center in an interval $\Delta t$ depends on the $m_s$ quantum number of the center. Dark counts, quantum efficiency, and stray fluorescence all affect the statistics of photon detection.

Estimation Problem: Given that $n_d$ photons were observed, what state was the NV center in? (Manson et al. 2006)

SLIDE 13

Precise Magnetometry

$$H = \Delta S_z^2 + \gamma \mathbf{B} \cdot \mathbf{S}$$

Energy levels in an NV center are split by magnetic fields. By preparing, evolving, and measuring different states, we thus gain information about $\mathbf{B}$.

Estimation Problem: Given a set of observed photon counts, what are the strength and direction of the magnetic field $\mathbf{B}$?

By applying a magnetic field gradient such that $\mathbf{B} = \mathbf{B}(\mathbf{r})$, measurement of the NV center reveals information about its location in the diamond. (Said et al. 2011)

SLIDE 15

Neutron Interferometry Geometry

[Figure: interferometer geometry showing the sample, the phase flag, and the O-beam and H-beam detectors.]

The sample introduces a phase difference of $\phi$ between the two paths. By rotating the phase flag, an additional phase $\theta$ can be introduced, so that the ideal probability of a neutron reaching the O-beam detector is $\Pr(\text{O-beam}) = \cos^2(\phi + \theta)$. In practice, there is a limited contrast between the two beams, related to the visibility. (Pushin 2007)

SLIDE 17

Improved Contrast

Due to interaction with the environment, a phase difference $\Delta\phi(\epsilon)$ is introduced for a state $\epsilon$ of the environment. Averaging over this random phase costs contrast in the final signal.

Estimation Problem: By measuring the temperature, humidity, etc., as well as the neutron count, can we improve contrast and measure the static phase difference with better accuracy?

SLIDE 20

Spectral Density Estimation

In order to characterize a superconducting qubit circuit, we must know which decoherence mechanisms the system is subject to. As such, we would like to know the power spectral density $S(\omega)$ of the environment. By measuring the circuit with a variety of dynamical decoupling pulse trains, we gain information about $S$.

Estimation Problem: Given measurements of the superconducting circuit, what is the power spectral density of its environment? (Yan et al. 2012)

SLIDE 23

Modeling Data Collection

Model data collection as a probability distribution, called a likelihood function $\Pr(d | x; e)$, where $d$ is the data, $x$ is the model, and $e$ is the experiment.

Example: Consider a single qubit undergoing Larmor precession at an unknown frequency $\omega$, with unknown dephasing time $T_2$:
$$H(\omega) = \frac{\omega}{2} \sigma_z, \qquad |\psi_{\mathrm{in}}\rangle = |+\rangle, \qquad M = \{ |+\rangle\langle+|, |-\rangle\langle-| \},$$
$$\Pr(d = 0 \mid x = (\omega, T_2);\, e = t) = \tfrac{1}{2}\left(1 - e^{-t/T_2}\right) + e^{-t/T_2} \cos^2(\omega t / 2).$$
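For concreteness, a minimal Python sketch (not part of the original slides) of this two-outcome likelihood; the function name and parameter values are illustrative choices.

```python
import numpy as np

def pr_zero(omega, t2, t):
    """Likelihood Pr(d = 0 | omega, T2; t) for the precession model above."""
    decay = np.exp(-t / t2)
    return 0.5 * (1 - decay) + decay * np.cos(omega * t / 2) ** 2

# Example: probability of outcome 0 after evolving for t = 1.0,
# with omega = 2*pi*0.3 and T2 = 10 (arbitrary illustrative values).
print(pr_zero(2 * np.pi * 0.3, 10.0, 1.0))
```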

SLIDE 26

Updating Knowledge

Once we have a likelihood function for our model, we can reason about $\Pr(x | d; e)$: what we know about our model having seen some data. By Bayes' rule,
$$\Pr(x | d; e) = \frac{\Pr(d | x; e)}{\Pr(d | e)} \Pr(x),$$
telling us that our knowledge is intimately connected to our ability to simulate. We report as the estimate of $x$ the expectation value over the posterior,
$$\hat{x} = \mathbb{E}[x] = \int x \Pr(x)\, \mathrm{d}x.$$
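The following is a minimal sketch (my own, not from the slides) of a discretized Bayes update for the precession example: the likelihood is evaluated on a grid of $\omega$ hypotheses, reweighted by Bayes' rule, and summarized by the posterior mean. The grid, prior, and $T_2$ value are arbitrary illustrative choices.

```python
import numpy as np

# Hypothesis grid for omega and a uniform prior (T2 assumed known here).
omegas = np.linspace(0.0, 2 * np.pi, 1000)
prior = np.ones_like(omegas) / omegas.size
T2 = 10.0

def likelihood_zero(omega, t):
    """Pr(d = 0 | omega, T2; t) for the precession example."""
    decay = np.exp(-t / T2)
    return 0.5 * (1 - decay) + decay * np.cos(omega * t / 2) ** 2

def bayes_update(prior, d, t):
    """One Bayes-rule update of the grid distribution after observing datum d."""
    p0 = likelihood_zero(omegas, t)
    unnormalized = (p0 if d == 0 else 1 - p0) * prior
    return unnormalized / unnormalized.sum()

posterior = bayes_update(prior, d=0, t=1.0)
omega_hat = float(np.sum(omegas * posterior))  # posterior-mean estimate
```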
SLIDE 29

Loss

We require a figure of merit for how well we have learned a model. Thus, we assign to each estimate $\hat{x}$ of a “true” model $x$ a loss, describing how badly $\hat{x}$ does at estimating $x$.

Definition (Quadratic Loss): $L_Q(\hat{x}, x) = (\hat{x} - x)^{\mathrm{T}} Q (\hat{x} - x)$, where $Q$ is a positive semidefinite matrix that establishes the scale between the various model parameters.

The quadratic loss generalizes the mean squared error to the case of multiple parameters.

SLIDE 33

Risk and Bayes Risk

Thinking of an estimator as a function from data records $D$ to estimates $\hat{x}(D)$, we can reason about what the loss will be on average.

Definition (Risk): $R(\hat{x}, x) = \mathbb{E}_D[L(\hat{x}(D), x)]$

Since we don't know the true model a priori, we average again to obtain the Bayes risk.

Definition (Bayes Risk): $r(\hat{x}, \pi) = \mathbb{E}_{x \sim \pi}[R(\hat{x}, x)]$
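As an illustration (not from the slides), the Bayes risk of the posterior-mean estimator for the precession example can be estimated by Monte Carlo: draw a true model from the prior, simulate a data record, compute the estimate, and average the loss. All settings below are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
omegas = np.linspace(0.0, 2 * np.pi, 400)   # hypothesis grid
T2, t = 10.0, 1.0

def pr_zero(omega):
    decay = np.exp(-t / T2)
    return 0.5 * (1 - decay) + decay * np.cos(omega * t / 2) ** 2

def posterior_mean(data):
    """Posterior-mean estimate of omega from a record of 0/1 outcomes."""
    post = np.ones_like(omegas)
    for d in data:
        p0 = pr_zero(omegas)
        post *= p0 if d == 0 else 1 - p0
    post /= post.sum()
    return np.sum(omegas * post)

# Bayes risk r = E_{x ~ pi} E_D[(xhat(D) - x)^2], estimated by sampling.
n_trials, n_shots = 200, 50
losses = []
for _ in range(n_trials):
    omega_true = rng.uniform(0.0, 2 * np.pi)                    # x ~ pi
    data = (rng.random(n_shots) > pr_zero(omega_true)).astype(int)
    losses.append((posterior_mean(data) - omega_true) ** 2)
bayes_risk = float(np.mean(losses))
```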

SLIDE 36

Cramér-Rao Bound

The Fisher information
$$I(x) = \mathbb{E}_D\!\left[(\nabla_x \log \Pr(D|x))(\nabla_x \log \Pr(D|x))^{\mathrm{T}}\right]$$
describes how much information about $x$ is obtained by sampling data. The Cramér-Rao bound then tells us how well any unbiased estimator can perform: if $Q = \mathbb{1}$, then $R(\hat{x}, x) = \operatorname{Tr}(\operatorname{Cov}(\hat{x})) \geq \operatorname{Tr}(I(x)^{-1})$.

Compare to the quantum Cramér-Rao bound, which corresponds to the Heisenberg limit, and represents quantum mechanical limits rather than practical limits in specific scenarios.
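As a small numerical sketch (not from the slides), the Fisher information of $\omega$ for a single two-outcome measurement can be computed from the standard Bernoulli form $I(\omega) = (\partial p/\partial\omega)^2 / (p(1-p))$, here with a finite-difference derivative; all settings are illustrative.

```python
import numpy as np

T2, t = 10.0, 1.0   # illustrative values

def pr_zero(omega):
    decay = np.exp(-t / T2)
    return 0.5 * (1 - decay) + decay * np.cos(omega * t / 2) ** 2

def fisher_info(omega, eps=1e-6):
    """Fisher information of omega for one two-outcome measurement at time t."""
    p = pr_zero(omega)
    dp = (pr_zero(omega + eps) - pr_zero(omega - eps)) / (2 * eps)
    return dp ** 2 / (p * (1 - p))

omega, n_shots = 1.3, 100
crb = 1.0 / (n_shots * fisher_info(omega))   # variance bound for unbiased estimators
```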

SLIDE 37

Bayesian Cramér-Rao Bound

As before, integrating the Fisher information over the prior distribution $\pi$ results in a Bayesian analog, the Bayesian Cramér-Rao bound,
$$r(\pi) \geq \operatorname{Tr}\!\left(\mathbb{E}_x[I(x)]^{-1}\right).$$
The BCRB can be computed iteratively, making it useful for tracking optimality in an online fashion. (Gill and Levit 1995)

SLIDE 41

Sequential Monte Carlo

To implement our approach on a computer, we approximate distributions by a sum over weighted delta functions,
$$\Pr(x) = \sum_i w_i \, \delta(x - x_i).$$
Each term in this sum is called a particle, and is described by its weight $w_i$ and its model $x_i$. Updates to distributions now only require evaluating the model $\Pr(d | x; e)$ at a finite number of points, and integrals over distributions are represented by finite sums. (Doucet and Johansen 2011; Granade et al. 2012)
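A minimal sketch (illustrative, not the authors' implementation) of the resulting SMC weight update: reweight each particle by the likelihood of the newly observed datum and renormalize.

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles = 1000
particles = rng.uniform(0.0, 2 * np.pi, n_particles)   # omega hypotheses
weights = np.full(n_particles, 1.0 / n_particles)
T2 = 10.0                                               # assumed known here

def likelihood(d, omega, t):
    decay = np.exp(-t / T2)
    p0 = 0.5 * (1 - decay) + decay * np.cos(omega * t / 2) ** 2
    return p0 if d == 0 else 1 - p0

def smc_update(weights, d, t):
    """Bayes update of the particle weights given datum d from experiment t."""
    new_weights = weights * likelihood(d, particles, t)
    return new_weights / new_weights.sum()

weights = smc_update(weights, d=0, t=1.0)
omega_hat = float(np.sum(weights * particles))          # posterior mean
```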

SLIDE 44

Numerical Stability and Resampling

With large amounts of data, the likelihood of any particular data record becomes very small, such that the normalization of the weights introduces numerical instabilities. We can mitigate this by resampling: moving information from the weights into the density of SMC particles. Sample new particle locations from a mixture-of-normals distribution, with each normal centered near an old particle:
$$\Pr(x') \propto \sum_i w_i \exp\!\left(-\tfrac{1}{2}(x' - \mu_i)^{\mathrm{T}} \Sigma^{-1} (x' - \mu_i)\right),$$
$$\mu_i = a x_i + (1 - a)\, \mathbb{E}[x], \qquad \Sigma = (1 - a^2) \operatorname{Cov}[x].$$
Set the new weights to be uniform, hence restoring numerical stability. (Liu and West 2001)
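The sketch below (a one-dimensional, illustrative version, not the authors' code) implements this resampling step: shrink particle locations toward the mean, perturb them with the reduced covariance, and reset the weights to uniform.

```python
import numpy as np

def liu_west_resample(particles, weights, a=0.98, rng=None):
    """Resample by drawing from a mixture of normals centered at
    mu_i = a * x_i + (1 - a) * E[x], with variance (1 - a^2) Var[x]."""
    rng = rng or np.random.default_rng()
    n = len(particles)
    mean = np.sum(weights * particles)
    var = np.sum(weights * (particles - mean) ** 2)
    sigma = np.sqrt((1 - a ** 2) * var)
    # Choose which mixture component each new particle comes from ...
    idx = rng.choice(n, size=n, p=weights)
    mu = a * particles[idx] + (1 - a) * mean
    # ... then perturb around the shrunk locations and reset the weights.
    new_particles = rng.normal(mu, sigma)
    new_weights = np.full(n, 1.0 / n)
    return new_particles, new_weights
```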

SLIDE 45

Sequential Monte Carlo

With SMC and resampling, particles move towards the true model as data is collected.

SLIDE 50

Experiment Design

Given the utility of an experiment $e$, optimization algorithms can be used to find the most useful next experiment. Since utilities are often multimodal for highly periodic models, locally optimizing each of many initial guesses works well. In our work, we use the Newton conjugate-gradient (NCG) method to locally optimize each guess. The quality of experiment design depends on having good heuristics for generating guesses; for the Larmor precession model, exponentially sparse heuristics work well.
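The sketch below is illustrative: the slides use NCG, while here a derivative-free SciPy optimizer keeps the example self-contained. It locally optimizes the evolution time $t$ from several sparse initial guesses, using the expected posterior variance over the current particle cloud as the objective; all numerical settings are assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
particles = rng.uniform(0.0, 2 * np.pi, 500)    # current posterior samples of omega
weights = np.full(particles.size, 1.0 / particles.size)
T2 = 10.0

def p_zero(omega, t):
    decay = np.exp(-t / T2)
    return 0.5 * (1 - decay) + decay * np.cos(omega * t / 2) ** 2

def neg_utility(t):
    """Negative of the negative-variance utility: the expected posterior variance."""
    t = float(np.atleast_1d(t)[0])
    if t <= 0:
        return np.inf
    p0 = p_zero(particles, t)
    expected_var = 0.0
    for d, pr_d in ((0, p0), (1, 1 - p0)):
        w = weights * pr_d
        norm = w.sum()
        if norm == 0:
            continue
        w = w / norm
        mean = np.sum(w * particles)
        expected_var += norm * np.sum(w * (particles - mean) ** 2)
    return expected_var

# Exponentially sparse initial guesses (illustrative), each refined locally.
guesses = [(9 / 8) ** k for k in range(10)]
results = [minimize(neg_utility, [g], method="Nelder-Mead") for g in guesses]
best_t = min(results, key=lambda r: r.fun).x[0]
```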

SLIDE 53

Cost and Utility

In order to select experiments, we assign a cost function describing how hard it is to perform an experiment. For instance, $\$(e) = at + b$ describes that there is a cost to evolving for a time $t$, as well as a constant per-experiment cost. The utility can then be defined as the reduction in Bayes risk per unit cost,
$$U(e) = \frac{\Delta r(\pi; e)}{\$(e)}.$$
If $\$(e)$ is constant, then we can use the negative-variance utility instead, as both functions are optimized by the same experiment:
$$U_{\mathrm{NV}}(e) = -\mathbb{E}_d\!\left[\operatorname{Tr}\!\left(Q \cdot \operatorname{Cov}_{x|d;e}(x)\right)\right].$$

SLIDE 56

Numerical Results: Unknown T2

Our method allows for $\omega$ and $T_2$ to be learned with very few measurements: only 200 bits of data!

[Figure: expected MSE in $\omega$ and in $T_2$ versus the number of measurements $N$ (2 to 200), with and without local optimization of experiment designs.]

Moreover, using our best available knowledge to optimize experiment designs, we can continue to learn about $\omega$ even in the presence of an unknown $T_2$.

SLIDE 57

Numerical Results: Adaptive Experiment Design

[Figure: expected loss versus number of experiments $N$ (up to 100), using 10,000 particles; curves compare unoptimized and NCG-optimized experiment designs against their respective BCRBs.]

SLIDE 60

Simulation and learning are intimately connected: if we can simulate a model under different hypotheses, then we can determine which hypotheses describe an unknown system.

Big Idea: Use quantum simulation to learn about unknown quantum systems.

SLIDE 62

Weak and Strong Simulation

Van den Nest defines strong simulation as the calculation of a likelihood function, and weak simulation as the sampling of data from a likelihood function. Much of quantum simulation is concerned with the latter: producing data according to a desired distribution.

Problem: Sequential Monte Carlo seems to require strong simulation. (Aaronson and Arkhipov 2010; Jozsa and Miyake 2008; Van den Nest 2008, 2011; Jozsa and Van den Nest 2013)

SLIDE 65

Adaptive Likelihood Estimation

Solution: Treat estimating the likelihood as a secondary estimation problem. For a two-outcome model, a hedged binomial estimator finds the probability $p_0$ of a “0” outcome by repeatedly sampling a weak simulator. The variance of such estimators is well known, so we repeatedly collect samples until a fixed tolerance is reached. We will show later that SMC is robust to likelihood-estimation errors. (Ferrie and Blume-Kohout 2012; Ferrie and Granade 2013)
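For concreteness, a minimal sketch (my own, not the authors' code) of a hedged binomial estimate of $p_0$ from weak-simulation samples, using the standard hedged form $\hat{p}_0 = (n_0 + \beta)/(N + 2\beta)$ with $\beta = 1/2$ and a simple stopping rule on the estimated standard deviation; the sampler shown is a classical stand-in for a quantum weak simulator.

```python
import numpy as np

def hedged_estimate(samples, beta=0.5):
    """Hedged binomial estimate of p0 from an array of 0/1 outcomes."""
    n0 = np.sum(samples == 0)
    return (n0 + beta) / (len(samples) + 2 * beta)

def estimate_likelihood(weak_sim, tol=0.05, batch=10, max_samples=10_000, rng=None):
    """Draw batches from a weak simulator until the estimated standard
    deviation of the hedged estimate falls below tol."""
    rng = rng or np.random.default_rng()
    samples = np.array([], dtype=int)
    while True:
        samples = np.concatenate([samples, weak_sim(batch, rng)])
        p_hat = hedged_estimate(samples)
        std = np.sqrt(p_hat * (1 - p_hat) / len(samples))
        if std < tol or len(samples) >= max_samples:
            return p_hat

# Classical stand-in: outcome "0" occurs with some hidden probability.
true_p0 = 0.73
weak_sim = lambda n, rng: (rng.random(n) >= true_p0).astype(int)
p0_hat = estimate_likelihood(weak_sim)
```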

SLIDE 67

No Inversion Model

If we wish to compare the classical outcomes of an unknown quantum system to a simulator, then we can do so by preparing an input state $|\psi\rangle$, evolving under either the “true” model $x_0$ or a simulated model $x_e$, and then measuring.

[Circuit: the unknown system evolves $|\psi\rangle$ under $e^{-iH(x_0)t}$ and is measured, yielding $d$; the simulator evolves $|\psi\rangle$ under $e^{-iH(x_e)t}$ and is measured, yielding $d$.]

SLIDE 68

Inversion Model

On the other hand, if the unknown system is coherently coupled to a quantum simulator, then we can transfer the state produced by the quantum device and attempt to invert the evolution by simulating according to a hypothesis. We model noise in the coupling by a depolarizing channel $\Phi_{\mathrm{dep}}$.

[Circuit: $|\psi\rangle$ evolves under $e^{-iH(x_0)t}$ on the unknown system, passes through $\Phi_{\mathrm{dep}}$ into the simulator, where the evolution under the hypothesis $x_e$ is inverted before measuring $d$.]
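As a rough numerical sketch (not from the slides), the likelihood of such an inversion experiment can be computed classically for a single qubit; the Hamiltonian, depolarizing parameter, and names used here are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import expm

sz = np.array([[1, 0], [0, -1]], dtype=complex)
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)   # |+>

def pr_survival(omega_true, omega_hyp, t, p_dep=0.1):
    """Probability of recovering |psi> after forward evolution under the true
    model, depolarizing noise, and inverted evolution under the hypothesis."""
    u_true = expm(-1j * (omega_true / 2) * sz * t)
    u_inv = expm(+1j * (omega_hyp / 2) * sz * t)
    amp = psi.conj() @ (u_inv @ (u_true @ psi))
    ideal = np.abs(amp) ** 2
    # Depolarizing noise mixes in the maximally mixed state.
    return (1 - p_dep) * ideal + p_dep / 2

print(pr_survival(1.0, 1.0, t=5.0))   # near 1 - p_dep/2 when the hypothesis matches
```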

SLIDE 69

Posterior Particle Heuristic

The inversion model connects the model and experiment parameter spaces. We use this connection to come up with a heuristic for experiment designs:

• Choose $x_e, x'_e \sim \Pr(x)$, the most recent posterior.
• Choose $t = 1 / \lVert x_e - x'_e \rVert$.
• Return $e = (x_e, t)$.
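A minimal sketch (illustrative, not the authors' implementation) of this particle guess heuristic: draw two particles from the current posterior and set the evolution time to the inverse of their separation.

```python
import numpy as np

def particle_guess_heuristic(particles, weights, rng=None):
    """Return (x_e, t) with x_e, x_e2 drawn from the posterior particles and
    t = 1 / ||x_e - x_e2||.  particles has shape (n_particles, n_params)."""
    rng = rng or np.random.default_rng()
    i, j = rng.choice(len(particles), size=2, replace=False, p=weights)
    x_e, x_e2 = particles[i], particles[j]
    t = 1.0 / np.linalg.norm(x_e - x_e2)
    return x_e, t

# Example with a two-parameter model and 100 equally weighted particles.
parts = np.random.default_rng(0).normal(size=(100, 2))
w = np.full(100, 1.0 / 100)
x_e, t = particle_guess_heuristic(parts, w)
```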

SLIDE 70

Ising Model on Spin Chains

For a Hamiltonian representing nearest-neighbor Ising models on a chain of nine qubits, the inversion model allows for dramatic improvements over comparing classical data.

[Figure: quadratic loss (log scale, roughly $10^{-8}$ to $10^{-2}$) versus experiment number (up to 200), comparing no inversion with $P = 0.1$, inversion with $P = 0.1$, and inversion with $P = 0$.]

Here, $P$ is the adaptive likelihood estimation tolerance.

SLIDE 71

Ising Model on the Complete Graph

Using the inversion model, our algorithm also works when the interaction graph is complete. Here, we show the performance as a function of the depolarization strength $N$.

[Figure: quadratic loss (log scale, roughly $10^{-8}$ to $10^{-2}$) versus experiment number (up to 200), for depolarization strengths $N = 0$, $N = 0.05$, and $N = 0.5$.]

SLIDE 73

Scaling and Dimensionality

In both the spin-chain and complete-graph cases, the quadratic loss on average decays exponentially, $L_Q \propto e^{-\gamma N}$, for some rate constant $\gamma$. Consider $\gamma$ as a function of the model dimension $N_p$:

[Figure: rate constant $\gamma$ (log scale) versus model dimension $N_p$ (log scale), for the complete-graph and line (chain) interaction graphs.]

This suggests that, with access to a quantum simulator, learning may scale efficiently.

SLIDE 79

• Statistical inference can be used to characterize unknown quantum systems.
• Sequential Monte Carlo allows Bayesian updates to be efficiently implemented on a classical computer.
• Current best knowledge can be applied to adaptively design new experiments and measurements.
• Our approach is generic, treating simulation as a resource for learning.
• We can apply quantum resources to characterize quantum systems.

SLIDE 80

Special Thanks

Thanks to Ian Hincks, Osama Moussa and Dimitri Pushin for their work and discussions in developing the motivating physical examples.

SLIDE 81

Further Information

Slides, a journal reference for this work, a full bibliography and a software implementation can be found at http://www.cgranade.com/research/talks/lfqis2013/. Thank you for your kind attention!

SLIDE 84

Hyperparameter Estimation

Method of Hyperparameters

Suppose the “true” model $x$ changes between experiments according to a distribution $\Pr(x|y)$, for some vector of parameters $y$, called hyperparameters. We can then estimate the hyperparameters instead of the “true” model,
$$\Pr(d | y; e) = \int \Pr(d | x, y; e) \Pr(x | y; e)\, \mathrm{d}x.$$
In many cases, hyperparameters represent an epistemic decoherence. For instance, the unknown-$T_2$ model corresponds to $\omega$ being drawn from a Cauchy (Lorentz) distribution.
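A minimal sketch (my own, illustrative) of the marginalized likelihood: estimate $\Pr(d|y;e)$ by Monte Carlo, drawing model samples from $\Pr(x|y)$. Here $y$ parameterizes a normal distribution over $\omega$ purely as an example; the slides' unknown-$T_2$ case would instead use a Cauchy distribution.

```python
import numpy as np

T2 = 10.0   # illustrative fixed dephasing time

def pr_zero(omega, t):
    decay = np.exp(-t / T2)
    return 0.5 * (1 - decay) + decay * np.cos(omega * t / 2) ** 2

def marginal_likelihood(d, y, t, n_samples=10_000, rng=None):
    """Monte Carlo estimate of Pr(d | y; t) = ∫ Pr(d | x; t) Pr(x | y) dx,
    where y = (mean, std) parameterizes a normal distribution over omega."""
    rng = rng or np.random.default_rng()
    mean, std = y
    omegas = rng.normal(mean, std, n_samples)   # x ~ Pr(x | y)
    p0 = pr_zero(omegas, t)
    return float(np.mean(p0 if d == 0 else 1 - p0))

print(marginal_likelihood(d=0, y=(1.0, 0.1), t=2.0))
```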
