How to Extract Knowledge From an Expert so That the Expert's Effort Is Minimal

Hung T. Nguyen
Department of Mathematical Sciences
New Mexico State University
Las Cruces, New Mexico 88003, USA
Email: hunguyen@nmsu.edu

Vladik Kreinovich and Elizabeth Kamoroff
Department of Computer Science, University of Texas at El Paso
El Paso, TX 79968, USA
vladik@utep.edu

1. How Knowledge Is Extracted Now

• Knowledge acquisition: we ask experts questions, and put the answers into the computer system.
• Problem: it is a very time-consuming and therefore expensive task.
• Objective: minimize the effort of an expert.
• Related problem: how do we estimate this effort?
• Reasonable idea: the number of binary ("yes"-"no") questions.
• Resulting strategy: binary search.
• Idea: we choose a question for which the answer is "yes" for exactly half of the remaining alternatives.
• Property: we need log₂(N) questions to select one of N alternatives.
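
A minimal sketch of this binary-search strategy in Python (the function and variable names are illustrative, not from the presentation): each question asks whether the target lies in one half of the remaining alternatives, so about ⌈log₂(N)⌉ yes/no questions suffice.

```python
import math

def binary_search_questions(alternatives, is_target):
    """Identify the single target alternative by repeatedly asking
    'is the target in this half of the remaining alternatives?'."""
    remaining = list(alternatives)
    questions = 0
    while len(remaining) > 1:
        half = remaining[:len(remaining) // 2]
        questions += 1                       # one yes/no question asked
        if any(is_target(a) for a in half):  # answer "yes"
            remaining = half
        else:                                # answer "no"
            remaining = remaining[len(remaining) // 2:]
    return remaining[0], questions

# With N = 8 alternatives, ceil(log2(8)) = 3 questions are enough:
found, asked = binary_search_questions(range(8), lambda a: a == 5)
print(found, asked, math.ceil(math.log2(8)))   # -> 5 3 3
```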

2. Experts Are Usually More Comfortable with "Yes" Answers

• In practice: most people feel more comfortable answering "yes" than "no".
• Fact: the expert's time is valuable.
• Consequence: an expert is usually called in only after competent people have tried to solve the problem.
• Expected situation: the expert mostly confirms their preliminary solutions.
• Consequence: most of the expert's answers are "yes".
• Binary search case: half of the answers are "no"s.
• Meaning: half of the previous decisions were wrong.
• Expert's conclusion: no competent people tried this problem, so his/her valuable time was wasted.

3. Experts Are Usually More Comfortable with "Yes" Answers (cont-d)

• Situation: a knowledge engineer interviews the expert.
• First alternative: most answers are "yes"; meaning:
  – the knowledge engineer already has some preliminary knowledge of the area, and
  – he/she is appropriately asking these questions to improve this knowledge.
• Binary search: half of the answers are "no" (the same as for random questions); interpretation:
  – the knowledge engineer did not bother to get preliminary knowledge;
  – the highly skilled expert is inappropriately used to answer questions which could be answered by consulting a textbook.

4. Problem

• Reminder: experts prefer "yes" answers.
• Additional phenomenon:
  – the larger the number of negative answers,
  – the more discomfort the expert will experience, and
  – the larger the effort he will have to make to continue this interview.
• Previous objective: minimize the total number of questions.
• More appropriate objective: minimize the effort of an expert.
• How to describe the effort: assign more weight to "no" answers than to "yes" ones.
• What we do: find a search procedure which attains this objective.

5. How to Describe Different Search Procedures

• Let S be the set of N alternatives.
• We denote "yes" by 1 and "no" by 0, so each sequence of answers ω is a binary sequence.
• To describe a search procedure, we must have:
  – the set Ω of possible answer sequences ω, and
  – a mapping A which maps each ω ∈ Ω to the set A(ω) of all alternatives which are consistent with ω.
• Formally: A(Λ) = S, and for every ω ∈ Ω:
  – if |A(ω)| = 1, then no extension of ω belongs to Ω;
  – otherwise, ω0 ∈ Ω, ω1 ∈ Ω, and we have
    A(ω) = A(ω0) ∪ A(ω1), A(ω0) ∩ A(ω1) = ∅, A(ω0) ≠ ∅, A(ω1) ≠ ∅.
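
One possible concrete representation of such a procedure in Python (the encoding, a bare alternative for a leaf and a pair (subtree for answer 0, subtree for answer 1) for an internal node, is our own choice, not from the presentation). The helper checks the formal conditions above: leaves correspond to singleton sets, and at every internal node the two children are nonempty and disjoint.

```python
def leaves(node):
    """A(omega): the set of alternatives still consistent with the answers omega.
    A leaf is a single alternative; an internal node is a pair
    (subtree for answer 0 = "no", subtree for answer 1 = "yes")."""
    if isinstance(node, tuple):
        no_branch, yes_branch = node
        return leaves(no_branch) | leaves(yes_branch)
    return {node}

def is_valid_procedure(node):
    """Check the formal conditions: both children of every internal node are
    nonempty and disjoint (in this encoding A(omega) is, by construction,
    the union of the children's sets)."""
    if not isinstance(node, tuple):
        return True                      # |A(omega)| = 1: no further questions
    no_branch, yes_branch = node
    a0, a1 = leaves(no_branch), leaves(yes_branch)
    return (bool(a0) and bool(a1) and not (a0 & a1)
            and is_valid_procedure(no_branch) and is_valid_procedure(yes_branch))

# A four-alternative procedure (the binary search used in Example 1 below):
binary_tree = (("as", "ac"), ("ib", "va"))
print(leaves(binary_tree), is_valid_procedure(binary_tree))
```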

6. How to Gauge Different Search Procedures

• Let P = (Ω, A) be a search procedure.
• Let W₀ be the cost of a "no" answer, and let W₁ < W₀ be the cost of a "yes" answer.
• For a ∈ S, let ω(a, P) = ω_1 ω_2 ... ω_k denote the sequence of answers which leads to a.
• The cost W(ω(a, P)) of finding a is defined as
  W(ω(a, P)) = W(ω_1 ω_2 ... ω_k) = W_{ω_1} + W_{ω_2} + ... + W_{ω_k}.
• The effort of a procedure is defined as the largest of its costs:
  E(P) = max_{a ∈ S} W(ω(a, P)).
• Objective: find a procedure P_opt with the smallest possible effort:
  E(P_opt) = T(N) ≝ min_P E(P).
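
A short sketch of these definitions in Python, using the same pair-of-subtrees encoding as above (the encoding and names are ours, not from the presentation): the cost of reaching an alternative adds W₀ for every "no" answer and W₁ for every "yes" answer, and the effort is the worst such cost.

```python
def effort(node, W0, W1):
    """E(P): the largest total answer cost over all alternatives a in S,
    where every "no" answer costs W0 and every "yes" answer costs W1.
    A leaf is an alternative; an internal node is (no-subtree, yes-subtree)."""
    if not isinstance(node, tuple):
        return 0                          # the alternative is already identified
    no_branch, yes_branch = node
    return max(W0 + effort(no_branch, W0, W1),
               W1 + effort(yes_branch, W0, W1))

# With W0 = W1 = 1 the effort is just the depth of the deepest leaf:
binary_tree = (("as", "ac"), ("ib", "va"))
print(effort(binary_tree, 1, 1))          # -> 2 = log2(4) questions
```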

7. Example 1: Binary Search (Optimal for W₀ = W₁)

• Situation: a doctor chooses between N = 4 possible analgesics:
  – aspirin (as),
  – acetaminophen (ac),
  – ibuprofen (ib), and
  – valium (va).
• Binary search: A(Λ) is split into A(0) and A(1); then A(0) is split into A(00) = {as} and A(01) = {ac}, and A(1) is split into A(10) = {ib} and A(11) = {va}.

8. Example 2: A Search Procedure Which Is Better Than Binary (W₀ > W₁)

• When W₁ = 1 and W₀ = 3, the effort of the binary search is 6.
• We can decrease the effort to 5 by applying the following alternative procedure: A(Λ) is split into A(0) = {as} and A(1); then A(1) is split into A(10) = {ac} and A(11), and finally A(11) is split into A(110) = {ib} and A(111) = {va}.
• Indeed, the costliest alternative is now ib, reached by the answers 1, 1, 0 at cost 1 + 1 + 3 = 5, while under binary search the worst case (two "no" answers) costs 3 + 3 = 6.
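
A quick check of both examples, using the same effort function as in the previous sketch (repeated here so the snippet is self-contained; the tree encoding is again our own choice):

```python
def effort(node, W0=3, W1=1):
    """Worst-case cost: "no" costs W0, "yes" costs W1; a leaf is an
    alternative, an internal node is (no-subtree, yes-subtree)."""
    if not isinstance(node, tuple):
        return 0
    no_branch, yes_branch = node
    return max(W0 + effort(no_branch, W0, W1),
               W1 + effort(yes_branch, W0, W1))

# Example 1 (binary search): the worst case is two "no" answers, 3 + 3 = 6.
example1 = (("as", "ac"), ("ib", "va"))
# Example 2: "as" costs a single "no"; the worst case is ib ("yes", "yes", "no").
example2 = ("as", ("ac", ("ib", "va")))
print(effort(example1), effort(example2))  # -> 6 5
```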

9. Description of the Optimal Search Procedure

• Auxiliary result:
  T(N) = min_{0 < N⁺ < N} max{W₁ + T(N⁺), W₀ + T(N − N⁺)}.
• Conclusion: we can successively compute T(1), T(2), ..., T(N) in time N · O(N) = O(N²).
• Notation: let N⁺(N) be the value N⁺ at which the minimum is attained.
• Optimal procedure (a code sketch follows below): for each sequence ω with n ≝ |A(ω)| > 1:
  – we assign N⁺(n) values to the "yes" case A(ω1);
  – we assign the remaining n − N⁺(n) values to the "no" case A(ω0).
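
A sketch of this dynamic programming in Python (the function names, the list-based tables, and the particular choice of which concrete alternatives go to the "yes" branch are our own; the presentation only fixes the sizes N⁺(n) and n − N⁺(n)):

```python
def optimal_split_table(N, W0, W1):
    """Fill T(1..N) and the minimizers N+(1..N) using the recurrence
    T(n) = min over 0 < n_plus < n of max(W1 + T(n_plus), W0 + T(n - n_plus)).
    Each T(n) takes O(n) work, so the whole table costs O(N^2)."""
    T = [0] * (N + 1)            # T[1] = 0: one alternative needs no questions
    Nplus = [0] * (N + 1)
    for n in range(2, N + 1):
        best = None
        for n_plus in range(1, n):
            cost = max(W1 + T[n_plus], W0 + T[n - n_plus])
            if best is None or cost < best:
                best, Nplus[n] = cost, n_plus
        T[n] = best
    return T, Nplus

def build_optimal(alternatives, Nplus):
    """Build the optimal procedure: N+(n) alternatives go under "yes" (A(omega 1)),
    the remaining n - N+(n) under "no" (A(omega 0)); a leaf is one alternative."""
    n = len(alternatives)
    if n == 1:
        return alternatives[0]
    k = Nplus[n]
    yes_part, no_part = alternatives[:k], alternatives[k:]
    return (build_optimal(no_part, Nplus), build_optimal(yes_part, Nplus))

T, Nplus = optimal_split_table(4, W0=3, W1=1)
print(T[2:], Nplus[2:])                        # -> [3, 4, 5] [1, 2, 3]
print(build_optimal(["va", "ib", "ac", "as"], Nplus))
# -> ('as', ('ac', ('ib', 'va'))), i.e. the procedure from Example 2
```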

10. Example: N = 4, W₀ = 3, and W₁ = 1

• We take T(1) = 0. Then
  T(2) = min_{0 < N⁺ < 2} max{1 + T(N⁺), 3 + T(2 − N⁺)} = max{1 + T(1), 3 + T(1)} = max{1, 3} = 3, with N⁺(2) = 1.
• T(3) = 4, with the minimum attained for N⁺(3) = 2.
• T(4) = 5, with the minimum attained for N⁺(4) = 3.
• Optimal procedure:
  – since N⁺(4) = 3, we divide the 4 elements of A(Λ) into a 3-element set A(1) and a 1-element set A(0);
  – since N⁺(3) = 2, we divide the 3 elements of A(1) into a 2-element set A(11) and a 1-element set A(10);
  – since N⁺(2) = 1, we divide the 2 elements of A(11) into a 1-element set A(111) and a 1-element set A(110).
• Observation: this is exactly the procedure from Example 2.
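
To make the last step explicit, here is the evaluation of all three candidate splits for T(4) (a small self-contained check; the variable names are ours): the split with 3 "yes" alternatives is the cheapest, which is why N⁺(4) = 3.

```python
W0, W1 = 3, 1
T1, T2, T3 = 0, 3, 4                      # T(1), T(2), T(3) computed above
candidates = {
    1: max(W1 + T1, W0 + T3),             # 1 "yes" alt., 3 "no": max(1, 7) = 7
    2: max(W1 + T2, W0 + T2),             # 2 and 2:              max(4, 6) = 6
    3: max(W1 + T3, W0 + T1),             # 3 "yes", 1 "no":      max(5, 3) = 5
}
print(candidates, min(candidates.values()))  # -> {1: 7, 2: 6, 3: 5} 5
```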

11. Asymptotically Optimal Search Procedure

• We described the optimal search procedure:
  E(P_opt) = T(N) ≝ min_P E(P).
• Property: computing P_opt takes time ≈ N².
• Problem: for large N, time N² is too long.
• Alternative: an asymptotically optimal procedure P_a, with E(P_a) ≤ T(N) + C for some constant C > 0.
• Asymptotically optimal search procedure (a code sketch follows below):
  – find α such that α + α^w = 1, where w ≝ W₀/W₁;
  – for each ω with n ≝ |A(ω)| > 1, assign ⌊α · n⌋ values to the "yes" case A(ω1);
  – assign the remaining values to the "no" case A(ω0).
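
A sketch of this procedure in Python (the bisection solver, the tree encoding, and the choice of which concrete alternatives land on the "yes" side are our own; only the split sizes ⌊α · n⌋ and n − ⌊α · n⌋ come from the slide):

```python
def solve_alpha(w, tol=1e-12):
    """Find alpha in (0, 1) with alpha + alpha**w = 1 by bisection;
    the left-hand side grows from 0 to 2 on [0, 1], so the root is unique."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if mid + mid ** w < 1:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def build_asymptotic(alternatives, alpha):
    """Recursively assign floor(alpha * n) alternatives to the "yes" branch
    A(omega 1) and the rest to the "no" branch A(omega 0).
    (For W0 > W1 we have alpha > 1/2, so both parts stay nonempty.)"""
    n = len(alternatives)
    if n == 1:
        return alternatives[0]
    k = int(alpha * n)                     # floor(alpha * n)
    yes_part, no_part = alternatives[:k], alternatives[k:]
    return (build_asymptotic(no_part, alpha), build_asymptotic(yes_part, alpha))

alpha = solve_alpha(3 / 1)                 # w = W0 / W1 = 3
print(round(alpha, 4))                     # -> 0.6823
print(build_asymptotic(list("abcdefgh"), alpha))
```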
