Topics in Quantum Machine Learning, Vedran Dunjko (PowerPoint presentation)

  1. Topics in Quantum Machine Learning, Vedran Dunjko, v.dunjko@liacs.leidenuniv.nl

  2. Quantum Machine Learning (QML) sits between Quantum Information Processing (QIP) and Machine Learning / AI (ML/AI): ML → QIP (quantum-applied ML) ['74]; QIP → ML (quantum-enhanced ML) ['94]; QIP ↭ ML (quantum-generalized learning) ['00]; plus ML-inspired QM/QIP and physics-inspired ML/AI.

  3. Machine learning is not one thing; AI is not even a few things. ML includes unsupervised learning, big data analysis, supervised learning, generative models, deep learning, sequential decision making, online learning, non-parametric learning, reinforcement learning, computational learning theory, control theory, parametric learning, statistical learning theory, and non-convex optimization; AI adds local search and symbolic AI.

  4. Quantum-enhanced ML is even more things: overlaid on the same map are quantum linear algebra, shallow quantum circuits, quantum oracle identification, adiabatic QC / quantum optimization, quantum COLT (computational learning theory), quantum statistical learning, and quantum walks & search.

  5. Quantum-enhanced ML is even more things (the same map, with fewer items highlighted): quantum linear algebra, shallow quantum circuits, and adiabatic QC / quantum optimization.

  6. And then there's quantum-applied ML! QKD, parameter/phase control, hybrid computation (AI), phase diagrams, quantum network optimization, order parameters, NISQ optimization (QAOA & VQE), efficient decoders, adaptive error correction, ground-state ansatz, circuit synthesis, control and optimization of qubits, metrology, experiment synthesis, quantum physics & high-energy physics.

  7. The quantum-applied-ML map again, now with the learning paradigm behind each task overlaid: supervised, unsupervised, and reinforcement-learning labels run across the tasks of slide 6 (QKD, phase diagrams, quantum network optimization, NISQ parameter optimization, QAOA & VQE, efficient decoders, adaptive error correction, ground-state ansatz, circuit synthesis, qubit control and optimization, metrology, experiment synthesis).

  8. What is machine learning?

  9. Machine Learning: the WHAT. Discriminative models: learning P(labels|data) given samples from P(data, labels) (also regression). Generative models, clustering, feature extraction: learning structure in P(data) given samples from P(data).
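Slide 9's "learning P(labels|data) given samples from P(data, labels)" can be made concrete with a deliberately tiny discriminative learner. This is a sketch only: the two-Gaussian toy data and the nearest-centroid rule are hypothetical stand-ins, not anything from the slides.

```python
import numpy as np

# Hypothetical toy data: samples from P(data, labels) with two Gaussian classes.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (50, 2)), rng.normal(2.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

# Discriminative learner: estimate one centroid per label and predict the
# label whose centroid is nearest (a crude stand-in for arg max P(label|data)).
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

print(predict(np.array([-2.0, -2.0])))  # a point near class 0
print(predict(np.array([2.0, 2.0])))    # a point near class 1
```

Any classifier fits this template; only the way the conditional distribution is approximated changes.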

  10. Machine Learning: the WHAT. Beyond data: reinforcement learning, with transition function T(s′|s, a).
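A minimal sketch of the reinforcement-learning setting with a transition function T(s′|s, a). The 2-state MDP, its reward, and the tabular Q-learning rule below are illustrative assumptions, not taken from the slides.

```python
import numpy as np

# Hypothetical 2-state, 2-action MDP. T[s, a] gives the deterministic next
# state s' (a stand-in for T(s'|s, a)); reward is 1 only for action 1 in state 1.
T = np.array([[0, 1], [0, 1]])

def reward(s, a):
    return 1.0 if (s == 1 and a == 1) else 0.0

rng = np.random.default_rng(1)
Q = np.zeros((2, 2))
s, alpha, gamma = 0, 0.5, 0.9
for _ in range(2000):
    a = int(rng.integers(2))                 # explore uniformly at random
    s_next = T[s, a]
    # Standard Q-learning update toward reward + discounted best future value.
    Q[s, a] += alpha * (reward(s, a) + gamma * Q[s_next].max() - Q[s, a])
    s = s_next

print(Q.argmax(axis=1))  # greedy policy per state
```

The learned greedy policy chooses action 1 in both states, since only state 1 with action 1 ever pays off.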

  11. Also: MIT Technology Review breakthrough technology of 2017 [AlphaGo, anyone?]

  12. [image-only slide]

  13. Using RL in real life: navigating a city… https://sites.google.com/view/streetlearn (P. Mirowski et al., Learning to Navigate in Cities Without a Map, arXiv:1804.00168)

  14. Machine Learning: the HOW. Output: a hypothesis h on Data × Labels approximating P(labels|data). In practice: a model with parameters θ, an optimizer, and an error estimate on a sample (the dataset).
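The "model with parameters θ, optimizer, error on a sample" loop can be sketched in a few lines. The 1-D linear model, the learning rate, and the synthetic dataset are all assumptions for illustration.

```python
import numpy as np

# Hypothetical noisy samples from labels = 3*x + 1.
rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, 100)
labels = 3.0 * x + 1.0 + rng.normal(0, 0.1, 100)

# Model h_theta(x) = theta[0] + theta[1]*x; optimizer = plain gradient descent
# on the empirical squared error estimated on the sample (dataset).
theta = np.zeros(2)
lr = 0.5
for _ in range(500):
    pred = theta[0] + theta[1] * x
    err = pred - labels                      # error estimate on the sample
    grad = np.array([err.mean(), (err * x).mean()])
    theta -= lr * grad                       # optimizer step on parameters theta

print(np.round(theta, 1))  # close to the generating parameters [1., 3.]
```

Swapping in a deeper model or a fancier optimizer changes nothing about the shape of this loop.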

  15. Support vector machines: a separating hyperplane…

  16. Support vector machines: a separating hyperplane… in a higher-dimensional feature space. Still (algebraic) optimization over hyperplane and feature-function parameters…
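The feature-space idea can be shown without any SVM machinery: concentric circles are not linearly separable in the plane, but a feature map to squared radius makes a single threshold (a hyperplane in feature space) separate them. The data and the feature function φ below are hypothetical.

```python
import numpy as np

# Hypothetical data: an inner circle (class -1) and an outer circle (class +1).
rng = np.random.default_rng(3)
angles = rng.uniform(0, 2 * np.pi, 100)
inner = 0.5 * np.c_[np.cos(angles[:50]), np.sin(angles[:50])]
outer = 2.5 * np.c_[np.cos(angles[50:]), np.sin(angles[50:])]

phi = lambda X: (X ** 2).sum(axis=1)   # feature map: x -> ||x||^2
threshold = 2.0                        # a "hyperplane" in 1-D feature space

print(np.all(phi(inner) < threshold), np.all(phi(outer) > threshold))
```

Kernel methods exploit exactly this: the optimization stays linear-algebraic, only the space changes.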

  17. Machine Learning: the HOW. Learning structure in P(data) given samples from P(data).

  18. Machine Learning: the HOW. Supervised output: hypothesis h on Data × Labels approximating P(labels|data). Unsupervised output: hypothesis h on Data "approximating" P(data). Reinforcement-learning output: policy π on Actions × States.

  19. Supervised learning (learning how to label datapoints, how to approximate a function, how to classify). Unsupervised learning (learning a distribution and how to generate from it, properties from samples, feature extraction & dimensionality reduction). Reinforcement learning (learning behavior, a policy, or optimal control).
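Of the three paradigms, the unsupervised one is easy to demonstrate in miniature: 2-means clustering finds the two modes of a distribution from unlabeled samples alone. The 1-D data and the initialization are hypothetical.

```python
import numpy as np

# Hypothetical unlabeled samples from a bimodal P(data).
rng = np.random.default_rng(4)
X = np.concatenate([rng.normal(-5, 1, 100), rng.normal(5, 1, 100)])

centers = np.array([-1.0, 1.0])          # rough initial guesses
for _ in range(20):
    # Assign each point to its nearest center, then move centers to the means.
    assign = np.abs(X[:, None] - centers[None, :]).argmin(axis=1)
    centers = np.array([X[assign == k].mean() for k in (0, 1)])

print(np.round(centers))  # the two modes of the data, near [-5., 5.]
```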

  20. That is all the ML we need for now. What about quantum computers?

  21. Quantum computers… and physics: they manipulate registers of 2-level systems (qubits); the full description of n qubits is a 2^n-dimensional vector; manipulation means acting locally (gates). …and computer science: they likely can efficiently compute more things than classical computers (factoring), e.g. factor numbers or generate complex distributions, even if the QC is "shallow". …and reality: ca. 50-qubit all-purpose noisy devices and special-purpose quantum annealers (banana for scale).
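The "n qubits → 2^n-dimensional vector, gates act locally" picture can be simulated directly for tiny n: a 1-qubit gate is applied to the full state vector by tensoring it with identities. This is a state-vector simulation sketch, not anything from the slides.

```python
import numpy as np

# 3 qubits -> a 2^3 = 8-dimensional state vector.
n = 3
state = np.zeros(2 ** n)
state[0] = 1.0                                 # the state |000>

# A local gate: Hadamard on qubit 0, identity on the other two qubits.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
U = np.kron(H, np.kron(I, I))

state = U @ state                              # |000> -> (|000> + |100>)/sqrt(2)
print(state.shape, np.round(state, 3))
```

The exponential cost of this classical simulation (the 2^n-length vector) is exactly what slide 21 is pointing at.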

  22. Quantum computers… and physics: registers of 2-level systems (qubits); full description: n qubits → a 2^n-dimensional vector. …and computer science: they can compute things likely beyond BPP (factoring), and can produce distributions that are hard to simulate for classical computers (unless PH collapses), even if the QC is "shallow". …and reality: ca. 50-qubit all-purpose noisy devices and special-purpose quantum annealers (banana for scale).

  23. Quantum-enhanced supervised learning: the quantum pipeline. a) The optimization bottleneck. b) Big data & computational complexity. c) Machine learning models.

  24. Quantum-enhanced supervised learning: the quantum pipeline. a) The optimization bottleneck: quantum annealers. b) Big data & computational complexity: universal QC and quantum databases; restricted (shallow) architectures. c) Machine learning models.

  25. Quantum-enhanced supervised learning: the quantum pipeline. a) The optimization bottleneck: quantum annealers. b) Big data & computational complexity: universal QC and quantum databases; restricted (shallow) architectures. c) Machine learning models.

  26. The optimization bottleneck. Finding ground states of Hamiltonians via adiabatic evolution: H(s) = s H_initial + (1 − s) H_target, with s a function of time. A very generic optimization problem: argmin_{|ψ⟩} ⟨ψ| H |ψ⟩.
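The interpolation H(s) = s·H_initial + (1 − s)·H_target (the slide's convention, s running from 1 to 0) can be sketched for a single qubit by exact diagonalization. Both Hamiltonians below are hypothetical 2×2 examples chosen only to make the idea visible.

```python
import numpy as np

# H_initial has an easy-to-prepare ground state (|0>+|1>)/sqrt(2);
# H_target encodes the "answer": its ground state is |1>.
H_initial = -np.array([[0.0, 1.0], [1.0, 0.0]])
H_target = np.diag([3.0, -1.0])

# Sweep s from 1 to 0 and track argmin_psi <psi|H(s)|psi> exactly.
for s in np.linspace(1.0, 0.0, 5):
    H = s * H_initial + (1 - s) * H_target
    evals, evecs = np.linalg.eigh(H)       # ascending eigenvalues
    ground = evecs[:, 0]                   # current ground state

print(np.round(np.abs(ground) ** 2, 3))    # ends concentrated on |1>
```

A real adiabatic computer follows this path physically, staying in the instantaneous ground state, instead of diagonalizing exponentially large matrices.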

  27. QeML is even more things (the quantum-enhanced-ML map again): quantum linear algebra, shallow quantum circuits, adiabatic QC / quantum optimization, quantum walks & search.

  28. Quantum-enhanced supervised learning: the quantum pipeline. a) The optimization bottleneck: quantum annealers. b) Big data & computational complexity: universal QC and quantum databases; restricted (shallow) architectures. c) Machine learning models.

  29. Precursors of Quantum Big Data. Exponential data? Much of data analysis is linear algebra: regression = Moore-Penrose pseudoinverse, PCA = SVD…
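The two identifications on slide 29 are one-liners in numpy; the toy design matrix and weights below are hypothetical.

```python
import numpy as np

# Hypothetical noiseless regression problem: b = A @ w_true.
rng = np.random.default_rng(5)
A = rng.normal(size=(100, 3))
w_true = np.array([2.0, -1.0, 0.5])
b = A @ w_true

# Regression = Moore-Penrose pseudoinverse.
w = np.linalg.pinv(A) @ b
print(np.round(w, 3))                      # recovers w_true

# PCA = SVD of the centered data matrix; Vt[0] is the first principal direction.
U, S, Vt = np.linalg.svd(A - A.mean(axis=0), full_matrices=False)
print(Vt[0].shape)
```

Quantum linear algebra targets exactly these primitives (pseudoinversion, SVD) on exponentially large inputs.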

  30. Enter quantum linear algebra: interpret QM as linear algebra verbatim. Amplitude encoding: ℝ^N ∋ x = (x_i)_i ↦ |ψ⟩ ∝ Σ_{i=1}^N x_i |i⟩ in n qubits (exp(n) amplitudes); state vector ↔ (data) vector. Density matrices / Hamiltonians ↔ linear maps. Block encoding: for a unitary U = [[A, B], [C, D]], U|0⟩|ψ⟩ = |0⟩A|ψ⟩ + |0^⊥⟩C|ψ⟩. Projective measurements ↔ inner products (swap tests): P_ψ(0) = |⟨0|ψ⟩|². Functions of operators: f(A)|ψ⟩ = α₀|ψ⟩ + α₁A|ψ⟩ + α₂A²|ψ⟩ + ⋯ ≈ A⁻¹|ψ⟩. Upshot: prepare states expressible as linear-algebraic manipulations of data vectors in polylog(N) time (when other quantities are well behaved). Phys. Rev. Lett. 103, 150502 (2009); arXiv:1806.01838
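The amplitude-encoding dictionary can be checked numerically: a data vector becomes a normalized state, and inner products of data vectors become measurement probabilities. The particular vectors below are hypothetical.

```python
import numpy as np

# Amplitude encoding: x in R^4 -> |psi> = sum_i x_i |i> / ||x||,
# so N = 4 values live in n = log2(4) = 2 qubits.
x = np.array([3.0, 0.0, 4.0, 0.0])
psi = x / np.linalg.norm(x)

y = np.array([1.0, 0.0, 0.0, 0.0])         # the basis state |0>
phi = y / np.linalg.norm(y)

# P(0) = |<0|psi>|^2: an inner product of data vectors, estimable by
# repeated measurement (swap-test style) rather than read out directly.
p0 = abs(psi @ phi) ** 2
print(round(p0, 2))  # 0.36
```

The catch previewed on the next slides lives exactly here: the amplitudes are not individually readable, only sampled.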

  31. If this worked literally… this would make us INFORMATION GODS. Prediction: 44 zettabytes of data by 2020. If all of it is floats, that is 5.5×10²¹ float values…

  32. If this worked literally… this would make us INFORMATION GODS. Prediction: 44 zettabytes by 2020; as floats, 5.5×10²¹ values… which can be stored in the state of 73 qubits (ions, photons…).
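The 73-qubit figure is just the arithmetic of amplitude encoding, packing N values into ⌈log₂ N⌉ amplitudes:

```python
import math

# 44 ZB at 8 bytes per float is about 5.5e21 float values; amplitude
# encoding needs ceil(log2(N)) qubits to hold N amplitudes.
n_floats = 5.5e21
qubits = math.ceil(math.log2(n_floats))
print(qubits)  # 73
```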

  33. Clearly there is a catch. Many of them.

  34. Timeline: pattern recognition on a QC; quantum database (QRAM); HHL quantum linear-system solving; QLA, regression, PCA, SVM; quantum recommender systems; first efficient end-to-end machine-learning applications & improvements; optimal QLS; data-robustness implies quantum efficiency, smoothed analysis; de-quantization of low-rank systems. We made it so efficient… that sometimes we don't need QCs!
