QUANTIFIABLE COMPARATIVE EVALUATION OF FIB/SEM INSTRUMENTS


  1. QUANTIFIABLE COMPARATIVE EVALUATION OF FIB/SEM INSTRUMENTS
     Valery Ray (1,**), Joshua Taillon (2,*), Lourdes Salamanca-Riba (2,***)
     (1) PBS&T, Methuen, MA; (2) University of Maryland, College Park, MD
     (*) Present address: Material Measurement Laboratory, NIST, Gaithersburg, MD
     (**) vray@partbeamsystech.com; (***) riba@umd.edu
     10th FIB SEM User Group Meeting, NIST, Gaithersburg, MD, March 2, 2017

  2. Outline
     • Challenges of FIB/SEM equipment evaluation
     • Quantifiable comparative testing approach
     • Design of tests targeting intended applications
     • Practical examples
     • Summary

  3. Challenges of Evaluating FIB/SEM Instruments
     • Complexity of translating application needs into instrumentation requirements and evaluation criteria
     • There are no “bad” instruments out there
       • OEM engineers are highly skilled with demonstrations
       • The outcome of the same operation for an average user could be very different
     • “Canned demo” approach by OEMs
       • Designed to demonstrate strong sides
     • The art of crafting specifications – “specsmanship”
       • Critical (for your application) performance parameters could be “confidential”
       • Sometimes simply because they have never been known, defined, or tested

  4. Quantifiable Comparative Testing Approach
     • Identify the range of applications for intended usage
     • Translate application goals into instrumentation requirements
     • Design comparative tests; define evaluation criteria
     • Send test descriptions and samples to all vendors as early as possible
     • Comprehensive evaluation based on intended use:
       • Quantifiable testing of critical performance parameters, against pre-defined evaluation criteria
       • Applications demo: overall performance in 3D applications, TEM lamella prep, etc.
     • A two-day evaluation is reasonable to get all the data

  5. Tests Targeting Intended Applications
     • General performance
       • Beam quality; system stability; aperture repeatability
     • Patterning
       • Beam placement; etching fidelity; beam drifts and shifts
     • TEM lamella preparation
       • Throughput; thickness uniformity; ease of use; automation; endpoint
     • FIB tomography (3D slice-n-view)
       • Unattended runtime; image quality; throughput; ease of use; drift correction; focus tracking; slice thickness uniformity; EDS integration
     • Imaging
       • SEM SE, SEM BSE, STEM BF, STEM DF, FIB SE, FIB SI, …

  6. Samples for Comparative Evaluation
     • Performance testing: SiO2 optical flat with ~24 nm of evaporated Al
       • Used for aperture repeatability, deposition profiles, and SiO2 etching profiles
     • Application testing: Solid Oxide Fuel Cell (SOFC), 2-phase ceramic – epoxy-impregnated solid electrolyte coating, silver paint around the perimeter
       • Used for TEM lift-out
     • Same sample(s) to all vendors; require return of the test sample(s) for independent analysis

  7. General Performance – Beam Quality
     • Basic test of ion beam quality: shape and homogeneity of single-spot beam burns (one way to quantify burn shape is sketched below)
     • [Figure: arrays of beam burns vs. aperture size and dose, illustrating an inhomogeneous beam, a single-aperture problem, and a systematic high-current problem]
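The slides leave the shape assessment qualitative, so the following is only a minimal sketch of how it could be made numeric with scikit-image; the synthetic elliptical mask is a stand-in for a thresholded SEM image of a real burn array.

```python
import numpy as np
from skimage import measure

# Synthetic stand-in for a thresholded SEM image of one beam burn; in
# practice, threshold the real image so each burn becomes a labeled region.
yy, xx = np.mgrid[:200, :200]
burn_mask = ((xx - 100) / 40.0) ** 2 + ((yy - 100) / 25.0) ** 2 < 1.0

for region in measure.regionprops(measure.label(burn_mask)):
    # Eccentricity near 0 = round, well-formed beam; values toward 1 flag
    # astigmatism or an aperture problem at that beam condition.
    print(f"area = {region.area} px, eccentricity = {region.eccentricity:.2f}")
```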

  8. General Performance – System Stability
     • Basic test of tool stability: 1 µm L-shaped single-line pattern with 21 beam burns on each arm, 1 µm offset along the line, and a 5-minute delay between burns (a drift-rate estimate from the burn positions is sketched below)
     • [Figures, 20 µm fields: typical system drift; an HVAC failure; a “quite not bad” run (1st pair)]
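The slides do not prescribe how to reduce this test to a number, so here is one possible analysis: fit the displacement of each burn from its commanded position against elapsed time. The `measured` array below is a random stand-in for coordinates digitized from the post-run image.

```python
import numpy as np

# One arm of the L: 21 burns commanded at 1 µm steps, one burn every 5 minutes.
commanded = np.stack([np.arange(21, dtype=float), np.zeros(21)], axis=1)  # (x, y), µm

# Stand-in for burn centers measured from the post-run image; a random walk
# mimics slow stage/beam drift. Replace with real digitized coordinates.
rng = np.random.default_rng(0)
measured = commanded + np.cumsum(rng.normal(0.0, 0.01, size=(21, 2)), axis=0)

# Displacement of each burn from where it was commanded, vs. elapsed time.
error_um = measured - commanded
t_min = 5.0 * np.arange(21)

# Slope of a linear fit = average drift rate in each axis (µm/min).
rate_x = np.polyfit(t_min, error_um[:, 0], 1)[0]
rate_y = np.polyfit(t_min, error_um[:, 1], 1)[0]
print(f"drift: {rate_x * 1e3:.1f} nm/min in x, {rate_y * 1e3:.1f} nm/min in y")
```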

  9. Critical Performance – Etch Placement & Fidelity
     • 1 µm patterns at 1, 3, and 10 pA beam currents, with doses of 0.2, 0.6, and 1.8 nC/µm² at each current, with and without gas (exposure times for these doses are sketched below)
     • Shortest dwell time, −20% pixel overlap, ×2 pattern repeats: (a) sputtering / GAE (XeF2) and (b) sputtering / deposition (C, Pt, W)
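The dose series above translates directly into exposure times, which is worth checking before the demo so each test fits the allotted slot. A minimal sketch of the arithmetic (time = dose × area / current):

```python
# Exposure time required to deliver a given areal dose.
def exposure_time_s(dose_nC_per_um2: float, area_um2: float, current_pA: float) -> float:
    """Seconds needed to deposit dose_nC_per_um2 over area_um2 at current_pA."""
    charge_nC = dose_nC_per_um2 * area_um2          # total charge, nC
    return charge_nC * 1e-9 / (current_pA * 1e-12)  # C / A = s

# 1.8 nC/µm² over a 1 µm² box takes 3 minutes at 10 pA, 30 minutes at 1 pA.
print(exposure_time_s(1.8, 1.0, 10.0))  # 180.0
print(exposure_time_s(1.8, 1.0, 1.0))   # 1800.0
```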

 10. Critical Performance – Etching Placement
     • Does the tool pattern where intended, with and without gas injection?
     • Behaviors to distinguish from the expected performance:
       • Drift after aperture change – a problem for automatic patterning
       • Shift due to gas injection – a problem for site-specific deposition and GAE
       • Aperture-change shift – a problem for multiple-current patterning
       • Drift during line exposure – visible artifacts
     • C or Pt stripe e-beam deposited across the lines; TEM lamella prepared and STEM-imaged as part of application testing

 11. Critical Performance – Etching Fidelity
     • Sidewall slope and aspect ratio of the no-gas etching profile define polishing efficiency
     • Removal of the Al layer around a cut indicates beam tails; intact vs. removed Al marks the extent of beam-tail damage to the surface
     • The narrowest achievable cut is defined by the width of the tip of the no-gas etching profile
     • The ratio of the GAE and no-gas profile areas defines the GAE enhancement (see the sketch below)
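As a sketch of the last point: if the no-gas and gas-assisted cut cross-sections are digitized as depth-vs-position profiles, the enhancement factor is the ratio of the removed areas. The Gaussian profiles here are placeholders, not measured data.

```python
import numpy as np

# Position across the cut (µm) and hypothetical depth profiles (µm) digitized
# from cross-section images; replace with real measurements.
x = np.linspace(-0.5, 0.5, 101)
dx = x[1] - x[0]
depth_no_gas = 0.4 * np.exp(-(x / 0.15) ** 2)  # placeholder no-gas profile
depth_gae = 1.6 * np.exp(-(x / 0.18) ** 2)     # placeholder XeF2-assisted profile

# Removed cross-sectional area = integral of depth across the cut width.
area_no_gas = depth_no_gas.sum() * dx
area_gae = depth_gae.sum() * dx

# Ratio of the two areas (same dose, same pattern) is the GAE enhancement.
print(f"GAE enhancement: {area_gae / area_no_gas:.1f}x")
```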

 12. Application Testing – SOFC Imaging
     • Side-by-side comparison of imaging the same sample on each instrument
     • [Figures: SE and BSE images of the SOFC sample]

 13. Application Testing – 3D Reconstruction
     • Fix experimental parameters between vendors: detector settings, slice thickness, image resolution, dwell time
     • Run overnight, if possible
     • Results to evaluate (see the sketch after this list):
       • Total running time (limited by stability)
       • Usable acquisition volume per hour
       • Acquired image quality
       • Output and ease of use of the 3D visualization software
     • [Figure: example of vendor visualization output]
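“Usable acquisition volume per hour” reduces to simple arithmetic once the per-slice milling and imaging times are logged. A sketch with placeholder numbers (not vendor results):

```python
# Rough slice-and-view throughput: volume imaged per hour from per-slice times.
def volume_per_hour_um3(field_w_um: float, field_h_um: float, slice_nm: float,
                        mill_s_per_slice: float, image_s_per_slice: float) -> float:
    slice_volume = field_w_um * field_h_um * (slice_nm / 1000.0)  # µm³ per slice
    slices_per_hour = 3600.0 / (mill_s_per_slice + image_s_per_slice)
    return slice_volume * slices_per_hour

# 20 µm × 15 µm field, 30 nm slices, 40 s milling + 20 s imaging per slice.
print(f"{volume_per_hour_um3(20, 15, 30, 40, 20):.0f} µm³/hour")  # 540 µm³/hour
```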

 14. Summary
     • A quantifiable testing approach enables comparative evaluation of FIB/SEM instruments by collecting performance data under controlled conditions
       • It requires careful sample preparation, thorough test design, and demo planning
     • Seamless integration of performance tests with the applications demo facilitates comprehensive evaluation:
       • providing OEMs an opportunity to showcase the strong features of their equipment
       • while allowing side-by-side comparison of critical performance parameters
     • There are no “bad” tools, but nobody is perfect either
       • Interpret test results in the context of realistic application requirements
