Test Scheduling for Modular SoCs in an Abort-on-Fail Environment - PowerPoint PPT Presentation


SLIDE 1 (1/30)

Test Scheduling for Modular SoCs in an Abort-on-Fail Environment

Urban Ingelsson¹, Sandeep Kumar Goel², Erik Larsson¹, Erik Jan Marinissen²

1) Linköpings Universitet, Department of Computer Science, Embedded Systems Laboratory, Sweden

2) Philips Research Labs, IC Design – Digital Design & Test, The Netherlands

SLIDE 2 (2/30)

  • 1. Introduction

Test Scheduling for Modular SoCs in an Abort-on-Fail Environment

  • SoC with embedded modules
  • Abort-on-Fail capable ATE

– Used in high-volume testing
– The SoC test is aborted as soon as a defect is detected
– The abort can occur at any clock cycle of the test response

  • The expected time spent on the ATE depends on

– The pass probability of each module test (yield)

  • We reduce the expected test time

– By scheduling module tests
– Prioritizing low-yielding and short tests

SLIDE 3 (3/30)

Presentation Outline

  • 1. Introduction
  • 2. Problem Definition
  • 3. Prior Work
  • 4. Model for Expected Test Time
  • 5. Scheduling Algorithm
  • 6. Experimental Results
  • 7. Conclusion
SLIDE 4 (4/30)

  • 2. Problem Definition

Test Architecture

  • Modules are tested via disjoint Test Bus TAMs
  • The test execution order per TAM can be rescheduled

[Figure: an SoC whose modules Mem 1, Mem 2, CPU, Logic 1, and Logic 2 are connected to three Test Bus TAMs; the module tests A-E can be reordered within each TAM]

SLIDE 5 (5/30)

  • 2. Problem Definition

Problem Definition

  • Reduce the expected test time per chip for a fixed test architecture with disjoint TAMs
  • Input

– Test architecture
– Yield per module

  • Output

– A test schedule such that the expected test time is minimized

  • Spend less time on faulty circuits
  • If the test fails, it is aborted early
  • Low-yielding and short tests should be performed early

SLIDE 6 (6/30)

  • 2. Problem Definition

Test Scheduling Tool Overview

[Figure: the test scheduling tool takes an SoC description, TAM width, and module test pass probabilities as input; the TAM architecture and wrapper designer produces the architecture, the test scheduler produces the schedule, and the expected test-time calculator reports the expected test time]

SLIDE 7 (7/30)

Presentation Outline

  • 1. Introduction
  • 2. Problem Definition
  • 3. Prior Work
  • 4. Model for Expected Test Time
  • 5. Scheduling Algorithm
  • 6. Experimental Results
  • 7. Conclusion
SLIDE 8 (8/30)

  • 3. Prior Work

Defect-Aware Test Scheduling - One TAM

  • One test [Lee & Krishna IEEETC´91]

– Evaluated the test in parts
– A cost for each evaluation of the test

  • A sequential ordering of tests [Huss & Gyurcsik DAC´91, Jiang & Vinnakota VTS´99]

– Test scheduling for analog devices
– Overlapping tests
– Analog tests, evaluated at completion

[Figure: timelines contrasting one evaluation per test with more than one evaluation per test]

SLIDE 9 (9/30)

  • 3. Prior Work

Defect-Aware Module-Test Scheduling - Multiple TAMs

  • Test scheduling and test-architecture design [Larsson et al. VTS´04]

– Expected test time for multiple TAMs
– Fork-and-merge TAM architecture type
– Drawbacks

  • Tests are evaluated only at the end of a module test
  • A constant pass probability per clock cycle of a test
  • Only used during chip design, not for re-scheduling

SLIDE 10 (10/30)

Presentation Outline

  • 1. Introduction
  • 2. Problem Definition
  • 3. Prior Work
  • 4. Model for Expected Test Time
  • 5. Scheduling Algorithm
  • 6. Experimental Results
  • 7. Conclusion
SLIDE 11 (11/30)

  • 4. Model for Expected Test Time

One TAM

  • The expected test time E(T) is the sum of

– Non-overlapping time intervals
– Weighted with the probability of testing in that interval
– The intervals correspond to the time between test evaluations

E(T) = ∑_{i=1}^{n} (t_i − t_{i−1}) ⋅ ∏_{j=1}^{i−1} p_j, where t_0 = 0

For three tests evaluated at times t1, t2, t3:

E(T) = t1 + p1⋅(t2 − t1) + p1⋅p2⋅(t3 − t2)

[Figure: a schedule of tests 1, 2, 3 evaluated at t1, t2, t3, with the cumulative pass probability stepping down from 1 to p1 to p1⋅p2]
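The expected-test-time formula above can be sketched in code as follows; this is a minimal illustration rather than the authors' tool, and the function name and its interval-length inputs (d_i = t_i − t_{i−1}) are my own naming:

```python
def expected_test_time(durations, pass_probs):
    """Expected abort-on-fail test time on one TAM.

    durations[i] is the interval length t_i - t_{i-1} between test
    evaluations; pass_probs[i] is the pass probability of the test
    evaluated at t_i.  Interval i is only spent on the ATE if all
    earlier evaluations passed, so it is weighted by prod_{j<i} p_j.
    """
    expected = 0.0
    reach_prob = 1.0  # probability that testing reaches this interval
    for d, p in zip(durations, pass_probs):
        expected += reach_prob * d
        reach_prob *= p  # later intervals also require this test to pass
    return expected

# Three tests evaluated at t1=4, t2=7, t3=12 -> intervals 4, 3, 5;
# mathematically E(T) = 4 + 0.8*3 + 0.8*0.9*5 = 10.0
print(round(expected_test_time([4, 3, 5], [0.8, 0.9, 0.7]), 6))  # 10.0
```

The running product plays the role of the "cumulative pass probability" on the slide: it only decreases at each evaluation point.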

SLIDE 12 (12/30)

  • 4. Model for Expected Test Time

Multiple TAMs

  • Gather evaluations in a list

– Each entry holds a time and a pass probability
– Multiple evaluations at the same time are merged into one entry by multiplying their pass probabilities

The one-TAM formula then applies to the merged list:

E(T) = ∑_{i=1}^{n} (t_i − t_{i−1}) ⋅ ∏_{j=1}^{i−1} p_j, where t_0 = 0

[Figure: two TAMs, A and B, running tests 1-5; tests 1 and 4 are both evaluated at t1, so the merged list entry at t1 has pass probability p1⋅p4]
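The merge step can be sketched as follows, under the same assumptions as before (hypothetical helper names; each per-TAM schedule is given as (evaluation time, pass probability) pairs):

```python
from collections import defaultdict

def merge_evaluations(tams):
    """Merge the evaluation lists of several TAMs into one.

    tams: one list per TAM of (evaluation_time, pass_probability) pairs.
    Evaluations at the same time are combined into a single entry by
    multiplying their pass probabilities, since all of them must pass
    for testing to continue.
    """
    merged = defaultdict(lambda: 1.0)
    for tam in tams:
        for t, p in tam:
            merged[t] *= p
    return sorted(merged.items())  # time-sorted (time, pass_prob) list

def expected_test_time(events):
    """One-TAM expected-time formula applied to the merged list."""
    expected, reach_prob, prev_t = 0.0, 1.0, 0.0
    for t, p in events:
        expected += reach_prob * (t - prev_t)
        reach_prob *= p
        prev_t = t
    return expected

# TAM A evaluates at t=4 and t=10, TAM B at t=4 and t=7;
# the two evaluations at t=4 merge into one list entry.
events = merge_evaluations([[(4, 0.9), (10, 0.8)], [(4, 0.95), (7, 0.85)]])
```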

SLIDE 13 (13/30)

  • 4. Model for Expected Test Time

Reducing the Abortable Unit Size

  • Three evaluation models: test-based, pattern-based, and cycle-based
  • Evaluation takes place when test responses are available
  • Each clock cycle of scan-out makes test responses available
  • A model with a smaller grain size gives a more accurate result

[Figure: cumulative pass probability over time under the test-based, pattern-based, and cycle-based evaluation models]

SLIDE 14 (14/30)

  • 4. Model for Expected Test Time

Pass Probability Distribution

  • The pass probability is distributed over the evaluated units

– Test patterns & clock cycles
– Derived automatically or given manually

  • The product of the distributed pass probabilities is the total pass probability of the module test

∏_{i=1}^{n(m)} p(i, m) = P(m)
SLIDE 15 (15/30)

  • 4. Model for Expected Test Time

Pattern Pass Probability Distribution

  • Our model for test patterns

– Low pass probability for the first few test patterns
– Increasing with the pattern number

p(i, m) = P(m)^((1/i) / ∑_{j=1}^{n(m)} 1/j), so that ∏_{i=1}^{n(m)} p(i, m) = P(m)

[Figure: pass probability rising with the pattern number toward 1]

SLIDE 16 (16/30)

  • 4. Model for Expected Test Time

Clock-Cycle Pass Probability Distribution

  • The pass probability of a pattern is evenly distributed over the evaluated cycles
  • The ATE evaluates the test during the clock cycles of test-response scan-out
  • No test responses → no fail

– During the scan-in of the first pattern of a test
– When the scan-in time is longer than the scan-out time

  • The pass probability of scan-in-only cycles := 1

[Figure: scan-in and scan-out timing, including scan-in-only cycles, with the resulting per-cycle and cumulative pass probabilities]
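The cycle-level distribution can be sketched as follows (hypothetical function and argument names; `scan_in_only` counts the response-free cycles at the start of a test):

```python
def cycle_pass_probs(pattern_prob, scan_in_only, scan_out):
    """Spread one pattern's pass probability over its clock cycles.

    Cycles that only scan in stimuli produce no test responses and
    therefore cannot fail: their pass probability is 1.  The pattern's
    pass probability is distributed evenly over the scan-out cycles,
    where the ATE actually evaluates responses, by taking the
    scan_out-th root so the per-cycle probabilities multiply back to
    the pattern's pass probability.
    """
    per_cycle = pattern_prob ** (1.0 / scan_out)
    return [1.0] * scan_in_only + [per_cycle] * scan_out

# A pattern with pass probability 0.72, 3 scan-in-only cycles,
# and 2 scan-out cycles:
cycles = cycle_pass_probs(0.72, 3, 2)
```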

SLIDE 17 (17/30)

Presentation Outline

  • 1. Introduction
  • 2. Problem Definition
  • 3. Prior Work
  • 4. Model for Expected Test Time
  • 5. Scheduling Algorithm
  • 6. Experimental Results
  • 7. Conclusion
SLIDE 18 (18/30)

  • 5. Scheduling

Non-Preemptive Scheduling

  • Idea

– Detect faults as early as possible
– Spend less time on faulty chips

  • Schedule early

– Tests with low pass probability
– Short tests

  • Possible action: rearrange the test sequence within a TAM

[Figure: ten module tests distributed over TAMs A, B, and C, before and after rearranging the test sequence within each TAM]

SLIDE 19 (19/30)

  • 5. Scheduling

Scheduling Heuristic

  • Sorting criterion derived from the expected test time formula for one TAM
  • Gives the optimal schedule for a Test Bus TAM
  • Heuristic step:

– Assume that optimality for each TAM separately gives a good solution for the whole SoC

Schedule A before B ⇔ E(t_A) / (1 − p_A) < E(t_B) / (1 − p_B)
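The criterion amounts to sorting tests by t / (1 − p), which falls out of a pairwise-exchange argument on the one-TAM formula. A minimal sketch (my function names, with plain test times standing in for E(t)):

```python
def expected_time(order):
    """Expected abort-on-fail time for (test_time, pass_prob) pairs run in order."""
    expected, reach = 0.0, 1.0
    for t, p in order:
        expected += reach * t
        reach *= p
    return expected

def schedule(tests):
    """Sort by t / (1 - p): short tests that are likely to fail go first.
    A test with p == 1 can never abort the session, so it goes last."""
    return sorted(tests, key=lambda tp: tp[0] / (1 - tp[1]) if tp[1] < 1 else float("inf"))

tests = [(100, 0.99), (10, 0.60), (50, 0.90)]
best = schedule(tests)  # [(10, 0.6), (50, 0.9), (100, 0.99)]
```

Swapping any two adjacent tests in `best` only increases the expected time, which is exactly the exchange argument behind the criterion.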

SLIDE 20 (20/30)

  • 5. Scheduling

Good Heuristic?

  • Not optimal for more than one TAM
  • A change in one TAM influences the expected test time

– It introduces other interference patterns with the pass probabilities of the other TAMs
SLIDE 21 (21/30)

  • 5. Scheduling

Preemptive Scheduling

  • Digital tests that fail often do so in the first few test patterns

– A fault-coverage graph is an increasing function

  • Schedule “subtests” with low pass probability early
  • Each preemption costs in increased test completion time

[Figure: tests A and B preempted into subtests; the low-pass-probability first subtests are scheduled early and the remainders A2 and B2 later, at a small cost in completion time]

SLIDE 22 (22/30)

  • 5. Scheduling

Our Approach to Preemptive Scheduling

  • Preempt each test into 2 subtests

– 1st subtest

  • 10% of the test patterns
  • Goes early in the schedule

– 2nd subtest

  • The remaining 90% of the test patterns

– One scan-load of increased test completion time per module

  • Same scheduling heuristic as for module tests
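The 10%/90% split can be sketched as below. This is an illustration rather than the authors' implementation; the per-pattern probabilities could come from a pattern-level distribution such as the one in Section 4:

```python
import math

def preempt(pattern_probs, frac=0.10):
    """Split one test's per-pattern pass probabilities into two subtests.

    The first subtest takes the first ceil(frac * n) patterns, which
    under the increasing fault-coverage assumption carry most of the
    fail probability.  Each subtest's pass probability is the product
    of its patterns' probabilities, so the two subtests together still
    multiply to the full test's pass probability.
    """
    n = len(pattern_probs)
    k = max(1, math.ceil(frac * n))
    return math.prod(pattern_probs[:k]), math.prod(pattern_probs[k:])
```

Both subtests are then ordered with the same heuristic as ordinary module tests, which typically places the low-probability first subtest early in the schedule.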
SLIDE 23 (23/30)

Presentation Outline

  • 1. Introduction
  • 2. Problem Definition
  • 3. Prior Work
  • 4. Model for Expected Test Time
  • 5. Scheduling Algorithm
  • 6. Experimental Results
  • 7. Conclusion
SLIDE 24 (24/30)

  • 6. Experimental Results

Set Up

  • The Philips in-house tool TR-Architect [Goel et al. ITC´03] supplies the test architecture

– For the ITC´02 benchmark SoCs

  • d695, p22810, p34392, p93791

– For total numbers of TAM wires

  • 16, 24, 32, 40, 48, 56, 64

  • Module test pass probabilities

– Set I, defined by us (0.23 ≤ P ≤ 0.99, average(P) ≈ 0.80)
– Set L, from [Larsson et al. VTS´04] (0.90 ≤ P ≤ 0.99, average(P) ≈ 0.95)

SLIDE 25 (25/30)

  • 6. Experimental Results

Expected Test Time

[Figure: E(T) relative to test completion time (0% to 70%), over TAM widths 16-64, for the test-based, pattern-based, and clock-cycle-based evaluation models, shown for both the I-set and the L-set pass probabilities]

SLIDE 26 (26/30)

  • 6. Experimental Results

Non-Preemptive Scheduling

  • At worst 3% worse than random; at best 95% better than random
  • L-set: on average 37% better than random
  • I-set: on average 72% better than random

[Figure: E(T) relative to test completion time, over TAM widths 16-64, comparing an estimated worst case, random schedules, and our heuristic, for both the I-set and the L-set]

SLIDE 27 (27/30)

  • 6. Experimental Results

Our Heuristic vs. Optimal (SOC d695)

TAM width | I-set optimal | I-set heuristic | L-set optimal | L-set heuristic
16        | 2507          | 2543            | 29407         | 29407
24        | 2752          | 2752            | 19399         | 19399
32        | 1367          | 1367            | 14822         | 14822
40        | 1237          | 1237            | 12065         | 12065
48        | 807           | 841             | 9670          | 9670
56        | 628           | 628             | 8477          | 8477
64        | 520           | 526             | 7414          | 7414

SLIDE 28 (28/30)

  • 6. Experimental Results

Preemptive Scheduling

[Figure: E(T) relative to the non-preemptive results, over TAM widths 16-64, for our heuristic and a lower bound, for both the I-set and the L-set; the accompanying increase in test completion time stays below 0.6%]

SLIDE 29 (29/30)

Presentation Outline

  • 1. Introduction
  • 2. Problem Definition
  • 3. Prior Work
  • 4. Model for Expected Test Time
  • 5. Scheduling Algorithm
  • 6. Experimental Results
  • 7. Conclusion
SLIDE 30 (30/30)

  • 7. Conclusion

Conclusion

  • Test scheduling for modular SoCs

– Given a fixed architecture with disjoint TAMs
– Given module yield values = test pass probabilities
– Utilizing the ATE abort-on-fail capability

  • Pass-probability-aware test scheduling can reduce the expected test time significantly
  • Preemptive scheduling

– Reduces the expected test time further
– Small cost in increased test completion time

  • Improved expected test-time model

– Gains accuracy from smaller evaluation granularity
– Pass probability distribution model