A Machine-Learning Approach to Analog/RF Circuit Testing



  1. A Machine-Learning Approach to Analog/RF Circuit Testing* Yiorgos Makris Departments of Electrical Engineering & Computer Science YALE UNIVERSITY * Joint work with Dr. Haralampos Stratigopoulos (TIMA Lab, Grenoble, France), Prof. Petros Drineas (RPI) and Dr. Mustapha Slamani (IBM). Partially funded by NSF, SRC/IBM, TI, and the European Commission.

  2. Test and Reliability @ YALE http://eng.yale.edu/trela

  3. Research Areas
  • Analog/RF circuits
    – Machine learning-based testing
    – Correlation mining for post-fabrication tuning and yield enhancement
    – Design of on-chip checkers and on-line test methods
    – Hardware Trojan detection in wireless cryptographic circuits
  • Digital circuits
    – Workload-driven error impact analysis in modern microprocessors
    – Logic transformations for improved soft-error immunity
    – Concurrent error detection/correction methods for FSMs
  • Asynchronous circuits
    – Fault simulation & test generation for speed-independent circuits
    – Test methods for high-speed pipelines (e.g., Mousetrap)
    – Error detection and soft-error mitigation in burst-mode controllers

  4. Presentation Outline
  • Testing analog/RF circuits
  • Machine learning-based testing
  • Testing via a non-linear neural classifier
    – Construction and training
    – Selection of measurements
    – Test quality vs. test time trade-off
  • Experimental results
    – Analog/RF specification test compaction
    – Stand-alone Built-in Self-Test (BIST)
  • Performance calibration via post-fabrication tuning
    – Yield enhancement
  • Summary

  5. Definition of Analog/RF Functionality
  (figure: symbol, specifications, and transistor-level views of an analog/RF circuit)

  6. Analog/RF IC Testing – Problem Definition
  • Flow: Design → Layout → Fabrication → Chip
  • Verification (simulation): compares pre-layout or post-layout performances against the specifications
    – Targets design errors
    – Once per design
  • Testing (measurement): compares actual silicon performances against the specifications
    – Targets manufacturing defects
    – Once per chip

  7. Analog/RF IC Test – Industrial Practice
  • Post-silicon production flow: wafer → die → interface board → Automatic Test Equipment (ATE)
  • Current practice is specification testing: the ATE applies test configurations, measures the chip's performance parameters, compares them against the design specifications, and issues a pass/fail decision

  8. Limitations
  Test cost:
  • Expensive ATE (multi-million-dollar equipment)
  • Specialized circuitry for stimuli generation and response measurement
  Test time:
  • Multiple measurements and test configurations
  • Switching and settling times
  Alternatives?
  • Fault-model-based test – never really caught on
  • Machine learning-based (a.k.a. "alternate") testing
    – Regression (Variyam et al., TCAD'02)
    – Classification (Pan et al., TCAS-II'99; Lindermeir et al., TCAD'99)

  9. Machine Learning-Based Testing
  General idea:
  • Determine whether a chip meets its specifications without explicitly computing the performance parameters and without assuming a prescribed fault model
  How does it work?
  • Infer whether the specifications are violated from a few simpler/cheaper measurements, combined with information "learned" from a set of fully tested chips
  Underlying assumption:
  • Since chips are produced by the same manufacturing process, the relation between measurements and performance parameters can be statistically learned

  10. Regression vs. Classification
  Problem definition:
  • Specification tests (T_1, …, T_k) measure the performance parameters π and compare them against the design specifications to decide pass/fail
  • Alternate tests (x_1, …, x_n) relate to the performance parameters through unknown, complex, non-linear functions with no closed form
  • Machine learning is used to approximate these functions
  Regression: explicitly learn these functions (i.e., approximate f: x → π)
  Classification: implicitly learn these functions (i.e., approximate f: x → Y, Y = {pass, fail})

  11. Overview of Classification Approach
  Learning (on a training set of chips):
  • Acquire measurement patterns via simpler measurements
  • Obtain pass/fail labels via specification tests
  • Project the labeled (nominal/faulty) patterns onto the measurement space
  • Learn the separating boundary → trained classifier
  Testing (on a new, untested chip):
  • Acquire its measurement pattern and apply the trained classifier → pass/fail label
  (figure: nominal and faulty patterns scattered in the x_1–x_2 measurement plane)
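The learning/testing flow above can be sketched in a few lines. A nearest-centroid rule stands in here for the neural classifier introduced on the next slides, and all measurement data is synthetic, purely for illustration.

```python
# Sketch of the classification approach: learn a pass/fail decision from
# cheap "alternate" measurements on a fully specification-tested training
# set, then label new chips without running the specification tests.
# A nearest-centroid rule stands in for the neural classifier; the data
# below is synthetic.

def train_centroids(patterns, labels):
    """Compute the mean measurement pattern of each class."""
    sums, counts = {}, {}
    for x, y in zip(patterns, labels):
        s = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def classify(x, centroids):
    """Label a new (untested) chip by its closest class centroid."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda y: dist2(x, centroids[y]))

# Training set: two cheap measurements per chip, labeled by full spec test
train_x = [(0.1, 0.2), (0.0, 0.1), (1.0, 1.1), (1.2, 0.9)]
train_y = ["pass", "pass", "fail", "fail"]
model = train_centroids(train_x, train_y)
print(classify((0.05, 0.15), model))  # prints "pass"
```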

  12. Using a Non-Linear Neural Classifier
  • Allocates a single boundary of arbitrary order
  • No prior knowledge of the boundary order is required
  • Constructed using linear perceptrons only
  • The topology is not fixed but grows (ontogeny) until it matches the intrinsic complexity of the separation problem

  13. Linear Perceptron
  • Connectivity: the input pattern x = (x_1, …, x_d) feeds synapses with weights w_i1, …, w_id and bias w_i0
  • Perceptron output: y_i(x) = +1 if Σ_j w_ij x_j + w_i0 ≥ 0, and −1 otherwise
  • Geometric interpretation: the threshold activation allocates the hyperplane Σ_j w_ij x_j + w_i0 = 0; training adjusts the weights w_ij to minimize the classification error
  • Target labels: y_i(x) = +1 for nominal patterns, y_i(x) = −1 for faulty patterns
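A minimal sketch of such a linear perceptron follows. It uses the classic perceptron update rule for illustration; the slides' actual training algorithm is the thermal perceptron variant, which attenuates the update step as training proceeds.

```python
# Minimal linear perceptron: output is the sign of a weighted sum plus
# bias, +1 for nominal and -1 for faulty patterns. The classic perceptron
# update is used here for illustration (not the thermal variant).

def predict(w, b, x):
    s = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1 if s >= 0 else -1

def train(patterns, labels, epochs=20, lr=0.1):
    w, b = [0.0] * len(patterns[0]), 0.0
    for _ in range(epochs):
        for x, t in zip(patterns, labels):
            err = t - predict(w, b, x)  # 0 when correct, +/-2 otherwise
            w = [wj + lr * err * xj for wj, xj in zip(w, x)]
            b += lr * err
    return w, b

# Linearly separable toy data: nominal (+1) vs. faulty (-1) patterns
X = [(0.0, 0.0), (0.0, 1.0), (2.0, 2.0), (3.0, 2.0)]
T = [1, 1, -1, -1]
w, b = train(X, T)
print([predict(w, b, x) for x in X])  # prints [1, 1, -1, -1]
```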

  14. Topology of Neural Classifier
  • Pyramid structure:
    – The first perceptron receives the pattern x ∈ R^d
    – Successive perceptrons also receive as inputs the outputs of the preceding perceptrons and a parabolic pattern x_{d+1} = Σ_{i=1…d} x_i^2
  • Every newly added layer assumes the role of the network output
  • A non-linear boundary is obtained by training a sequence of linear perceptrons
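A forward pass through this pyramid structure can be sketched as follows. The weights used in the example are illustrative placeholders, not trained values.

```python
# Sketch of the ontogenic "pyramid" forward pass: each layer's perceptron
# sees the parabolically augmented input pattern plus the outputs of all
# preceding layers, and the newest layer acts as the network output.
# Weights below are illustrative placeholders, not trained values.

def sign(s):
    return 1 if s >= 0 else -1

def augment(x):
    # Append the parabolic feature x_{d+1} = sum_i x_i^2
    return list(x) + [sum(v * v for v in x)]

def pyramid_output(x, layers):
    """layers: list of (weights, bias); layer k also weights the
    outputs y_1 .. y_{k-1} of the earlier layers."""
    z = augment(x)
    prev = []  # outputs of the preceding perceptrons
    for w, b in layers:
        inp = z + prev
        prev = prev + [sign(b + sum(wi * vi for wi, vi in zip(w, inp)))]
    return prev[-1]  # the newest layer is the network output

layers = [([1.0, 1.0, -1.0], 0.0),        # sees x_1, x_2, x_1^2 + x_2^2
          ([0.0, 0.0, 0.0, 1.0], 0.0)]    # second layer copies y_1
print(pyramid_output([0.5, 0.5], layers))  # prints 1
```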

  15. Training and Outcome
  • The weights of the newly added layer are adjusted through the thermal perceptron training algorithm
  • The weights of the preceding layers do not change
  • Each perceptron separates its input space linearly
  • The allocated boundary is non-linear in the original space x ∈ R^d
  Theorem: the sequence of boundaries allocated by the neurons converges to a boundary that perfectly separates the two populations in the training set

  16. Boundary Evolution Example (layer 0)

  17. Boundary Evolution Example (layer 1)

  18. Boundary Evolution Example (layer 2)

  19. Boundary Evolution Example (output layer)
  (each slide: scatter plot of nominal and faulty patterns in the x_1–x_2 plane, marking correctly and erroneously classified patterns as the boundary evolves layer by layer)

  20. Matching the Inherent Boundary Order
  Is higher order always better?
  • No! The goal is to generalize; both inflexible and over-fitting boundaries hurt
  (figure: classification rate (%) vs. number of layers for the training and validation sets)
  Finding the trade-off point (early stopping):
  • Monitor classification on a validation set
  • Prune the network down to the layer that achieves the best generalization on the validation set
  • Evaluate generalization on a test set
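The early-stopping rule above amounts to recording the validation rate as each layer is added and pruning back to the best one. A minimal sketch, with illustrative (not measured) rates:

```python
# Sketch of the early-stopping rule: grow the network layer by layer,
# record the classification rate on a held-out validation set, then prune
# back to the layer with the best generalization. The rates below are
# illustrative numbers, not measured data.

def best_depth(validation_rates):
    """Return the 1-based layer count with the highest validation rate
    (ties go to the shallower, i.e. simpler, network)."""
    best = max(range(len(validation_rates)),
               key=lambda i: (validation_rates[i], -i))
    return best + 1

# Validation accuracy (%) after adding each layer: improves, then over-fits
rates = [78.0, 85.0, 91.0, 93.5, 92.0, 90.5]
print(best_depth(rates))  # prints 4: prune down to 4 layers
```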

  21. Are All Measurements Useful?
  (figures: two scatter plots of nominal and faulty patterns in the x_1–x_2 plane — one where a measurement is non-discriminatory, one where the measurements are linearly dependent)

  22. Curse of Dimensionality
  • By increasing the dimensionality we may reach a point where the distributions are very sparse
  • Several possible boundaries exist – the choice among them is random
  • Result: random label assignment to new patterns
  (figure: nominal and faulty training patterns with new patterns, under several equally plausible boundaries)

  23. Genetic Measurement Selection
  • Encode the measurements in a bit string, with the k-th bit denoting inclusion (1) or exclusion (0) of the k-th measurement
  • Generation t evolves into generation t+1 through reproduction, crossover, and mutation
  • Fitness function: NSGA-II, a genetic algorithm with a multi-objective fitness function reporting the Pareto front for error rate (g_r) and number of re-tested circuits (n_r)
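The bit-string encoding and one generation of the genetic search can be sketched as below. The single-objective fitness here is a stand-in for illustration only; the slides use NSGA-II with a multi-objective fitness over error rate and retest count.

```python
# Toy sketch of genetic measurement selection: bit k of each candidate
# string selects measurement k. One generation = reproduction of the
# fitter half, then crossover and mutation to refill the population.
# The fitness below is a stand-in (the real flow uses NSGA-II).

import random

def crossover(a, b, point):
    return a[:point] + b[point:]

def mutate(bits, rate, rng):
    return [bit ^ (1 if rng.random() < rate else 0) for bit in bits]

def next_generation(pop, fitness, rng, mut_rate=0.05):
    # Reproduction: keep the fitter half, then refill via crossover+mutation
    ranked = sorted(pop, key=fitness, reverse=True)
    parents = ranked[: len(pop) // 2]
    children = []
    while len(parents) + len(children) < len(pop):
        a, b = rng.sample(parents, 2)
        child = crossover(a, b, rng.randrange(1, len(a)))
        children.append(mutate(child, mut_rate, rng))
    return parents + children

rng = random.Random(0)
# Population of 8 candidate subsets over 10 measurements
pop = [[rng.randint(0, 1) for _ in range(10)] for _ in range(8)]
fitness = lambda bits: -sum(bits)  # stand-in: prefer fewer measurements
pop = next_generation(pop, fitness, rng)
print(len(pop), len(pop[0]))       # prints 8 10
```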

  24. Two-Tier Test Strategy
  • All chips first go through the inexpensive machine-learning test: a simple low-cost tester acquires the alternative measurements, and the neural classifier issues a highly accurate pass/fail decision for most chips
  • The few chips whose measurement pattern falls in the guard band are escalated to the expensive specification test: a high-cost tester acquires the specification-test measurements, which are compared against the design specs for a highly accurate pass/fail decision
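The two-tier decision can be sketched as follows. The signed "distance to the boundary" score is a hypothetical classifier output introduced for illustration; the slides only specify that guard-band chips are retested with full specification tests.

```python
# Sketch of the two-tier test flow: every chip gets the cheap measurements
# and a classifier decision; chips whose pattern falls inside a guard band
# around the boundary are retested with the full specification tests. The
# signed boundary-distance score is a hypothetical classifier output.

def two_tier_decision(score, guard_band, spec_test):
    """score: signed distance to the learned boundary (+ = pass side).
    Chips within the guard band go to the high-cost tester."""
    if abs(score) <= guard_band:
        return spec_test()  # few chips: expensive but highly accurate
    return "pass" if score > 0 else "fail"

# Example: only the borderline chip (score 0.05) triggers a spec retest
retests = []
def spec_test():
    retests.append(1)
    return "pass"

labels = [two_tier_decision(s, 0.1, spec_test) for s in (0.9, 0.05, -0.7)]
print(labels, len(retests))  # prints ['pass', 'pass', 'fail'] 1
```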
