Preparing for the Worst but Hoping for the Best: Robust (Bayesian) - - PowerPoint PPT Presentation



SLIDE 1

Preparing for the Worst but Hoping for the Best: Robust (Bayesian) Persuasion

Piotr Dworczak Alessandro Pavan February 2020

SLIDE 2

Motivation

Bayesian persuasion / information design: designer knows agents’ sources of information and trusts her ability to coordinate Receivers on the actions most favorable to her

  • Optimal information structure is sensitive to fine details of agents’ beliefs

In many problems of interest, agents’ sources of information (both before and after receiving Sender’s information) are unknown, and Sender may not trust her ability to coordinate Receivers. Hence the quest for robustness.

SLIDE 3

This Paper

Novel solution concept that accounts for such uncertainty/ambiguity. Lexicographic approach to the problem:

Step 1 (“Preparing for the worst”): designer seeks to protect herself against the possibility that Nature provides information and coordinates agents on the actions most adversarial to the designer

Step 2 (“Hoping for the best”): designer maximizes over all worst-case optimal policies, assuming Nature and Receivers play favorably

Robust solutions: best-case optimal among worst-case optimal ones (max-max over max-min). Equivalently, they maximize λ inf_{π∈Π} v̲(q, π) + (1 − λ) v̄(q, ∅) for λ large.

SLIDE 4

Results

  • Separation theorem
  • Properties of robust solutions
  • Implications for various persuasion models
  • Conditionally-independent signals (online supplement)

SLIDE 5

Literature

Bayesian persuasion: ...Calzolari and Pavan (2006), Brocas and Carrillo (2007), Rayo and Segal (2010), Kamenica and Gentzkow (2011), Ely (2017), Dworczak and Martini (2019)...
Surveys: Bergemann and Morris (2019), Kamenica (2019)
Information design with adversarial coordination: Inostroza and Pavan (2018), Mathevet, Perego, Taneva (2019), Morris et al. (2019), Ziegler (2019)
Persuasion with unknown beliefs: Kolotilin et al. (2017), Laclau and Renou (2017), Guo and Shmaya (2018), Hu and Weng (2019), Kosterina (2019)
Max-max over max-min design: Börgers (2017)

SLIDE 6

Plan

1  Introduction
2  Model
3  Robust Solutions
4  Separation Theorem
5  Corollaries
6  Applications
7  Conditionally-independent Robust Solutions (another day)

SLIDE 7

Model

SLIDE 8

Model: Environment

Payoff-relevant state: ω ∈ Ω (finite)
Prior: µ0 ∈ ∆Ω
Sender’s “signal”: q : Ω → ∆S, where S is the set of signal realizations
(Reduced-form description of) Sender’s payoff, given induced posterior µ ∈ ∆Ω:
  V̄(µ): highest payoff
  V̲(µ): lowest payoff
Difference between V̄ and V̲: strategy selection (multiple Receivers), tie-breaking (single Receiver)

SLIDE 9

Model: Sender’s uncertainty

Nature designs information structure π : Ω × S → ∆R, where R is the set of Nature’s signal realizations
Multiple Receivers: discriminatory disclosures embedded into the derivation of V̲(µ): given common posterior µ, Nature provides (possibly private) signals to the agents and coordinates them on the course of action most adversarial to Sender (among those consistent with the assumed solution concept), e.g., Bayes-correlated equilibrium given µ
Conditioning on Sender’s signal: captures information acquisition (after hearing from Sender), correlated noise, and maximal concern for robustness
Online Appendix: conditionally independent signals

SLIDE 10

Plan

1  Introduction
2  Model
3  Robust Solutions
4  Separation Theorem
5  Corollaries
6  Applications
7  Conditionally-independent Robust Solutions

SLIDE 11

Robust Solutions

SLIDE 12

Robust Solutions

Sender’s expected payoffs when Sender selects signal q and Nature selects signal π:

v̄(q, π) ≡ Σ_{ω∈Ω} ∫_S ∫_R V̄(µ^{s,r}) dπ(r|ω, s) dq(s|ω) µ0(ω)

v̲(q, π) ≡ Σ_{ω∈Ω} ∫_S ∫_R V̲(µ^{s,r}) dπ(r|ω, s) dq(s|ω) µ0(ω)

where µ^{s,r} is the common posterior obtained from (q, π) via Bayes’ rule.
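When states and signals are finite, the double integral above reduces to a finite sum. The Python sketch below (not from the deck; the prosecutor-style payoffs V_hi/V_lo and the signal structures are hypothetical) illustrates how v̄ and v̲ differ only through selection/tie-breaking at the induced common posterior:

```python
# Toy computation of v_bar(q, pi) and v_lower(q, pi) for finite states/signals.
# Hypothetical prosecutor-style payoffs: Sender wants the action taken iff the
# posterior on omega = 1 is high; ties broken for/against Sender in V_hi/V_lo.
states = [0, 1]
mu0 = {0: 0.5, 1: 0.5}

def V_hi(mu):
    return 1.0 if mu[1] >= 0.5 else 0.0  # ties broken in Sender's favor

def V_lo(mu):
    return 1.0 if mu[1] > 0.5 else 0.0   # ties broken against Sender

def expected_payoffs(q, pi):
    """q[w] and pi[(w, s)] map realizations to conditional probabilities."""
    joint = {}  # joint distribution over (state, s, r)
    for w in states:
        for s, ps in q[w].items():
            for r, pr in pi[(w, s)].items():
                joint[(w, s, r)] = joint.get((w, s, r), 0.0) + mu0[w] * ps * pr
    hi = lo = 0.0
    for (s, r) in {(s, r) for (_, s, r) in joint}:
        p_sr = sum(joint.get((w, s, r), 0.0) for w in states)
        if p_sr == 0.0:
            continue
        mu = {w: joint.get((w, s, r), 0.0) / p_sr for w in states}  # common posterior
        hi += p_sr * V_hi(mu)
        lo += p_sr * V_lo(mu)
    return hi, lo

# Sender says nothing; Nature says nothing: payoff hinges entirely on selection
q_null = {w: {'s0': 1.0} for w in states}
pi_null = {(w, 's0'): {'r0': 1.0} for w in states}
print(expected_payoffs(q_null, pi_null))   # (1.0, 0.0)

# Sender says nothing; Nature fully reveals: both payoffs collapse to V_full(mu0)
pi_full = {(w, 's0'): {w: 1.0} for w in states}
print(expected_payoffs(q_null, pi_full))   # (0.5, 0.5)
```

The gap between the two components of the first output is exactly the selection/tie-breaking wedge between V̄ and V̲; full revelation by Nature closes it.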

SLIDE 13

Worst-case optimality

Definition 1

Signal q is worst-case optimal if, for all signals q′,

inf_π v̲(q, π) ≥ inf_π v̲(q′, π)

(maximal payoff guarantee)

SLIDE 14

Worst-case optimality

Given any posterior µ ∈ ∆Ω, Sender’s (lowest) payoff if, starting from µ, the state is fully revealed:

V^full(µ) ≡ Σ_{ω∈Ω} V̲(δ_ω) µ(ω), where δ_ω is the Dirac measure assigning probability 1 to ω.

Remark 1

Since both Nature and Sender can fully reveal the state, signal q is worst-case optimal iff

inf_π v̲(q, π) = V^full(µ0)

W: set of worst-case optimal signals; non-empty (full disclosure is worst-case optimal)
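For intuition, V^full is a plain expectation over degenerate beliefs. A minimal Python sketch, assuming a hypothetical prosecutor-style V̲ (V̲(δ₁) = 1, V̲(δ₀) = 0):

```python
# Sketch of V_full: Sender's expected payoff under full disclosure from belief
# mu, given a worst-case payoff V_lo. V_lo here is hypothetical.
def V_full(mu, V_lo):
    states = list(mu)
    total = 0.0
    for w, p in mu.items():
        delta_w = {w2: 1.0 if w2 == w else 0.0 for w2 in states}  # Dirac at w
        total += p * V_lo(delta_w)
    return total

V_lo = lambda mu: 1.0 if mu[1] > 0.5 else 0.0   # prosecutor-style example
print(V_full({0: 0.5, 1: 0.5}, V_lo))  # 0.5: the Sender's payoff guarantee
```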

SLIDE 15

Robust Solutions

Definition 2

Signal q^RS is a robust solution if it maximizes v̄(q, ∅) over W
Lexicographic preferences: max-max over max-min policies
  step 1: max-min (worst-case optimal policies)
  step 2: max-max (highest payoff if Nature and Receivers play favorably)
Clearly, q^RS also maximizes sup_π v̄(q, π) over W; however, Sender prefers to provide information herself rather than counting on Nature to do it

SLIDE 16

Robust Solutions

Lemma 1

Signal q^RS is a robust solution iff the distribution over posterior beliefs ρ^RS ∈ ∆∆Ω that q^RS induces maximizes

∫ V̄(µ) dρ(µ)

over the set of distributions over posterior beliefs W ⊂ ∆∆Ω satisfying (a) Bayes plausibility,

∫ µ dρ(µ) = µ0,

and (b) “worst-case optimality” (WCO),

∫ lco(V̲)(µ) dρ(µ) = V^full(µ0)
SLIDE 17

Robust vs Bayesian Solutions

Bayesian solutions: q^BP maximizes v̄(q, ∅) over Q (feasible signals); the induced distribution over posterior beliefs ρ^BP ∈ ∆∆Ω maximizes

∫ V̄(µ) dρ(µ)

over all distributions ρ ∈ ∆∆Ω satisfying Bayes plausibility, ∫ µ dρ(µ) = µ0

Robust solutions: q^RS maximizes v̄(q, ∅) over W ⊂ Q (worst-case optimal signals); the induced distribution over posterior beliefs ρ^RS ∈ ∆∆Ω maximizes

∫ V̄(µ) dρ(µ)

over all distributions ρ ∈ ∆∆Ω satisfying, in addition to Bayes plausibility ∫ µ dρ(µ) = µ0, the WCO constraint

∫ lco(V̲)(µ) dρ(µ) = V^full(µ0)
SLIDE 18

Plan

1  Introduction
2  Model
3  Robust Solutions
4  Separation Theorem
5  Corollaries
6  Applications
7  Conditionally-independent Robust Solutions

SLIDE 19

Separation Theorem

SLIDE 20

Separation Theorem

Theorem 1

Let F ≡ {B ⊆ Ω : V̲(µ) ≥ V^full(µ) for all µ ∈ ∆B}. Then

W = {ρ ∈ ∆∆Ω : ρ satisfies BP and supp(µ) ∈ F for all µ ∈ supp(ρ)}

Therefore, ρ^RS ∈ ∆∆Ω is a robust solution iff ρ^RS maximizes

∫ V̄(µ) dρ(µ)

over all distributions over posterior beliefs ρ ∈ ∆∆Ω satisfying BP and such that, for any µ ∈ supp(ρ), supp(µ) ∈ F.
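Theorem 1 makes W checkable in small examples: a support B belongs to F iff the inequality V̲(µ) ≥ V^full(µ) holds on all of ∆B. A grid-based Python sketch (hypothetical two-state prosecutor-style V̲; the grid check is an approximation, not a proof):

```python
# Grid-based sketch of the set F from the separation theorem, two states.
# B is in F iff V_lo(mu) >= V_full(mu) for every mu supported on B.
def V_lo(mu):                      # hypothetical worst-case payoff
    return 1.0 if mu.get(1, 0.0) > 0.5 else 0.0

def V_full(mu):
    states = list(mu)
    return sum(p * V_lo({w2: float(w2 == w) for w2 in states})
               for w, p in mu.items())

def in_F(B, grid=101):
    """Check the defining inequality of F on a grid over Delta(B), |B| <= 2."""
    if len(B) == 1:
        mu = {B[0]: 1.0}
        return V_lo(mu) >= V_full(mu) - 1e-12
    lo_w, hi_w = B
    for i in range(grid):
        t = i / (grid - 1)
        mu = {lo_w: 1 - t, hi_w: t}
        if V_lo(mu) < V_full(mu) - 1e-12:
            return False
    return True

print(in_F([0]), in_F([1]), in_F([0, 1]))  # True True False
```

Here only degenerate supports survive, so every posterior in a worst-case optimal policy must be degenerate: the robust solution fully discloses, matching Corollary 2's separation logic.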

SLIDE 21

Separation Theorem

Idea: suppose Sender induces posterior µ with supp(µ) = B for which there exists η ∈ ∆B s.t. V̲(η) < V^full(η)
  starting from µ, Nature can induce η with strictly positive probability
  starting from µ, Nature can thus bring Sender’s payoff strictly below V^full(µ)
  because Nature can respond to any other posterior µ′ ∈ supp(ρ) by fully disclosing the state,

∫ lco(V̲)(µ̃) dρ(µ̃) < V^full(µ0)

  so policy ρ is not worst-case optimal

SLIDE 22

KG’s prosecutor example

Figure: Prosecutor example (Kamenica–Gentzkow): Sender’s expected payoff before vs. after Nature’s disclosure.

SLIDE 23

Plan

1  Introduction
2  Model
3  Robust Solutions
4  Separation Theorem
5  Corollaries
6  Applications
7  Conditionally-independent Robust Solutions

SLIDE 24

Corollaries

SLIDE 25

Existence

Corollary 1

A robust solution always exists. Existence is guaranteed by the possibility for Nature to condition on the realization of Sender’s signal.

SLIDE 26

State separation

Corollary 2

Suppose there exist ω, ω′ ∈ Ω and λ ∈ (0, 1) s.t. V̲(λδ_ω + (1 − λ)δ_{ω′}) < λV̲(δ_ω) + (1 − λ)V̲(δ_{ω′}). Then any robust solution must separate ω and ω′.
Assumption: there exists some belief supported on {ω, ω′} under which Sender’s payoff is below the full-disclosure payoff
Conclusion: all posterior beliefs induced by a robust solution must separate ω and ω′

SLIDE 27

Robustness of Bayesian Solutions

Corollary 3

Bayesian solution ρ^BP is robust iff for any µ ∈ supp(ρ^BP) and any η ∈ ∆Ω s.t. supp(η) ⊂ supp(µ), V̲(η) ≥ V^full(η)
Binary state: any robust solution is either full disclosure or a Bayesian solution

SLIDE 28

Worst-case optimality preserved under more disclosure

Corollary 4

W is closed under Blackwell dominance: if ρ′ ∈ W and ρ Blackwell dominates ρ′, then ρ ∈ W. The result is not true in the case of conditionally independent signals.

SLIDE 29

Informativeness of Robust vs Bayesian solutions

Corollary 5

Given any Bayesian solution ρ^BP, there exists a robust solution ρ^RS s.t. either ρ^RS and ρ^BP are not comparable in the Blackwell order, or ρ^RS Blackwell dominates ρ^BP.
If a Bayesian solution ρ^BP is Blackwell more informative than a robust solution ρ^RS, then ρ^BP is also robust.
Reason why robustness calls for more disclosure: little to do with indifference on Sender’s part; concealing information gives Nature more room for adversarial design.
If a Bayesian solution ρ^BP is not robust and is strictly Blackwell dominated by a robust solution ρ^RS, then ρ^RS separates states that are not separated under ρ^BP: robustness never calls for mean-preserving spreads (MPS) over the same supports.

SLIDE 30

Concavification

Let v_low := min_{ω∈Ω} V̲(δ_ω) − 1. Auxiliary function:

V^F(µ) = V̄(µ) if supp(µ) ∈ F and V̄(µ) ≥ v_low; V^F(µ) = v_low otherwise

Corollary 6

A feasible distribution ρ ∈ ∆∆Ω is robust iff

∫ V^F(µ) dρ(µ) = co(V^F)(µ0),

where co(V^F) is the concave closure of V^F. Furthermore, there always exists a robust solution ρ with |supp(ρ)| ≤ |Ω|.

V^F is upper semi-continuous: the results follow from arguments similar to those in the Bayesian persuasion literature.
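In small examples the concavification in Corollary 6 can be evaluated numerically. A Python sketch for a binary state (hypothetical prosecutor-style V^F with x = µ(ω = 1); the grid search over two-point splits approximates the concave closure):

```python
# Concavification sketch for a binary state: compute co(V_F)(mu0) on a grid.
# V_F is built from a hypothetical example in which F contains only the
# degenerate supports, and v_low = min V_lo(delta) - 1 = -1.
def V_F(x, v_low=-1.0):
    if x == 0.0:
        return 0.0   # V_bar at delta_0 (support {0} is in F)
    if x == 1.0:
        return 1.0   # V_bar at delta_1 (support {1} is in F)
    return v_low     # interior supports are not in F

def concave_closure_at(f, x0, grid=201):
    """co(f)(x0): best two-point randomization a <= x0 <= b over a grid."""
    xs = [i / (grid - 1) for i in range(grid)]
    best = f(x0)
    for a in xs:
        for b in xs:
            if a <= x0 <= b and a < b:
                lam = (x0 - a) / (b - a)
                best = max(best, (1 - lam) * f(a) + lam * f(b))
    return best

print(concave_closure_at(V_F, 0.5))  # 0.5
```

The optimal split randomizes between δ₀ and δ₁ (full disclosure), and the robust value 0.5 coincides with V^full(µ0), consistent with Remark 1.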

SLIDE 31

Plan

1  Introduction
2  Model
3  Robust Solutions
4  Separation Theorem
5  Corollaries
6  Applications
7  Conditionally-independent Robust Solutions

SLIDE 32

Applications

SLIDE 33

Limits to Third-degree Price Discrimination

Bergemann, Brooks, Morris (2015): designer segments the market to maximize a combination of consumer surplus (CS) and producer surplus (PS)

Proposition 1

Suppose the Pareto weight on seller’s surplus is strictly positive. Then full disclosure is the unique robust solution. When, instead, the designer cares only about consumer surplus, the optimal signal is the BBM solution.
Idea: for any posterior µ assigning positive measure to more than one buyer value, Nature can construct a posterior η inducing the seller to ask the highest price in supp(µ), thus inducing no trade at lower values. Total surplus under η is strictly below the full-information level (if the weight on PS is strictly positive).
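Nature's adversarial move can be illustrated numerically. A Python sketch with hypothetical buyer values {1, 2}: under the posterior η shown, the monopolist's revenue-maximizing price excludes the low-value buyer, destroying surplus relative to full information:

```python
# Sketch of Nature's adversarial segmentation in the BBM application
# (hypothetical values and posterior). The seller charges the
# revenue-maximizing price among the buyer values in the posterior's support.
def seller_price(eta, values):
    return max(values, key=lambda p: p * sum(q for v, q in eta.items() if v >= p))

values = [1, 2]
eta = {1: 0.4, 2: 0.6}               # posterior Nature induces
p = seller_price(eta, values)        # price 2: revenue 1.2 beats 1.0 at price 1
total_surplus = sum(q * v for v, q in eta.items() if v >= p)  # only v = 2 trades
full_info_surplus = sum(q * v for v, q in eta.items())        # everyone trades
print(p, total_surplus, full_info_surplus)  # surplus falls below full-info level
```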

SLIDE 34

Privately Informed Receiver

Guo and Shmaya (2019)
Exogenous price p ∈ (0, 1)
Seller’s payoff: 1 if trade, 0 otherwise
Buyer’s exogenous private information: signal t with density f(t|ω) satisfying MLRP

Proposition 2

Any robust solution separates states ω ≤ p from states ω′ > p
Idea: starting from any µ s.t. supp(µ) ⊃ {ω, ω′} with ω < p < ω′, Nature can induce a posterior η s.t. E_η[ω] < p, thus inducing no trade; Sender’s payoff given η is below the full-information payoff
Given such a ρ ∈ ∆∆Ω, Nature can bring Sender’s payoff below V^full(µ0), so policy ρ is not worst-case optimal

SLIDE 35

Lemons problem

Seller’s value: ω (known to seller)
Buyer’s value: ω + ∆, with ∆ > 0 (unknown to buyer)
Exogenous price p drawn from U[0, 1]
Trade occurs iff p ≥ ω (seller accepts) and E_µ[ω̃ | ω̃ ≤ p] + ∆ > p (buyer accepts)
Seller designs the information structure

Proposition 3

Under any robust solution ρRS, for any µ, µ′ ∈ supp(ρRS), diam(supp(µ)), diam(supp(µ′)) ≤ ∆ but diam(supp(µ) ∪ supp(µ′)) > ∆. Robust solutions are minimally informative among those that eliminate adverse selection!
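The trade condition is easy to check numerically. A Python sketch with a hypothetical two-point belief, illustrating why supports of diameter ≤ ∆ eliminate adverse selection:

```python
# Lemons sketch: at posted price p, seller types with omega <= p accept, so the
# buyer trades only if E_mu[omega | omega <= p] + Delta >= p. The beliefs and
# Delta below are hypothetical.
def trade(p, mu, delta):
    low = {w: q for w, q in mu.items() if w <= p}   # seller types accepting p
    mass = sum(low.values())
    if mass == 0.0:
        return False
    e_cond = sum(w * q for w, q in low.items()) / mass
    return e_cond + delta >= p                       # buyer's participation

pooled = {0.0: 0.5, 1.0: 0.5}                # support diameter 1 > Delta
print(trade(1.0, pooled, delta=0.3))         # False: adverse selection kills trade
print(trade(1.0, {1.0: 1.0}, delta=0.3))     # True: separated belief restores it
```

Under the pooled belief, conditioning on acceptance drags the buyer's valuation down (the classic lemons logic); once the posterior support has diameter at most ∆, this selection effect is too small to block trade, matching Proposition 3.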

SLIDE 36

Supermodular Games

Continuum of Receivers, a_i ∈ {0, 1}; A ∈ [0, 1]: aggregate “attack”
Payoff from not attacking: 0
Payoff from attacking: g > 0 if A ≥ ω; b < 0 if A < ω
Designer’s payoff: 1 − A
Bayesian solution: upper censorship; reveals each ω < 0 w.p. γ^BP ∈ (0, 1) (w.p. 1 − γ^BP, reveals nothing); conceals all ω > 0

Proposition 4

A robust solution reveals each ω < 0 w.p. γ∗ > γ^BP, conceals all ω ∈ [0, 1], and reveals all ω > 1 with certainty.
Revelation of the upper dominance region: otherwise Nature constructs η inducing A = 1 also for ω > 1
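The role of the upper dominance region follows directly from the attack payoffs. A minimal Python sketch with hypothetical parameter values g = 1, b = −1:

```python
# Regime-change sketch: attacking pays g if the aggregate attack reaches the
# state threshold (A >= omega), b < 0 otherwise. g and b are hypothetical.
def attack_payoff(A, omega, g=1.0, b=-1.0):
    return g if A >= omega else b

# For omega > 1, even a full-scale attack (A = 1) fails, so attacking is
# strictly dominated: revealing this region costs the designer nothing.
assert all(attack_payoff(A, 1.2) < 0 for A in (0.0, 0.5, 1.0))
assert attack_payoff(1.0, 0.8) > 0   # omega in [0, 1]: coordination matters
```

This is why concealing ω > 1 is dangerous for the designer: pooled with states in [0, 1], Nature can build a belief under which attacking looks profitable, triggering A = 1 even where the attack cannot succeed.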

SLIDE 37

Plan

1  Introduction
2  Model
3  Robust Solutions
4  Separation Theorem
5  Corollaries
6  Applications
7  Conditionally-independent Robust Solutions

SLIDE 38

Conditionally Independent Signals

SLIDE 39

Conditionally-independent Robust Solutions

Nature cannot condition on the realization of Sender’s signal: π : Ω → ∆R
(so far: π : Ω × S → ∆R)

SLIDE 40

Existence of CI-Robust Solutions

A CI-robust solution may fail to exist; it exists if V̄ is continuous

Definition

A feasible distribution ρ ∈ ∆∆Ω is a weak CI-robust solution if it maximizes

∫ V̄(µ) dρ(µ)

over cl(W_CI), where cl(W_CI) is the closure (in the weak∗ topology) of the set of CI-worst-case optimal distributions

Theorem

A weak CI-robust solution exists for any V̄.

SLIDE 41

Separation under CI-Robust Solutions

Sufficient conditions for state separation under CI-robust solutions are weaker than those for robust solutions: whenever ω and ω′ must be separated under CI-robust solutions, they must also be separated under robust solutions

SLIDE 42

Full Disclosure and CI-robustness of Bayesian solutions

Sufficient conditions for full disclosure to be the unique CI-robust solution; sufficient conditions for all distributions to be CI-worst-case optimal

SLIDE 43

CI-robust solutions: Binary state

Unlike robust solutions, CI-robust solutions with a binary state need not coincide with Bayesian solutions or full disclosure

SLIDE 44

Blackwell Informativeness of CI-robust solutions

Unlike robust solutions, CI-robust solutions need not be Blackwell more informative than Bayesian solutions
Example in which the unique Bayesian solution is Blackwell strictly more informative than all CI-robust solutions
Intuition: Nature cannot engineer a mean-preserving spread conditional on s; ρ^RS may be Blackwell less informative than ρ^BP and yet Nature may not be able to inflict on Sender the same payoff as under ρ^BP

SLIDE 45

Conclusions

Bayesian persuasion when Sender is uncertain about Receivers’ information and strategy selection
Robust solutions: best-case optimal among worst-case optimal ones (max-max over max-min criterion)
Separation theorem: any pair of states over which Nature can construct a belief yielding less than the full-information payoff must be separated
Robustness: more disclosure, but only through more separation (not mean-preserving spreads over the same supports)
Relatively simple two-step design procedure:
  step 1: Nature designs information to minimize Sender’s payoff
  step 2: designer solves a standard persuasion problem on the restricted set of worst-case optimal policies
Implications for applications

SLIDE 46

Most Important Slide

THANKS!