Preparing for the Worst but Hoping for the Best: Robust (Bayesian) Persuasion
Piotr Dworczak Alessandro Pavan February 2020
Bayesian persuasion / information design: the designer knows the agents' sources of information and trusts her ability to coordinate Receivers on the actions most favorable to her.
In many problems of interest, the agents' sources of information (both before and after receiving the Sender's information) are unknown, and the Sender may not trust her ability to coordinate Receivers. Hence the quest for robustness.
Novel solution concept that accounts for such uncertainty/ambiguity: a lexicographic approach to the problem.
Step 1: "Preparing for the worst" — the designer protects herself against the possibility that Nature provides information and coordinates agents on the actions most adversarial to the designer.
Step 2: "Hoping for the best" — the designer maximizes over all worst-case optimal policies, assuming Nature and Receivers play favorably.
Robust solutions: best-case optimal among worst-case optimal ones — max-max over max-min (equivalently, they maximize λ infπ∈Π v(q, π) + (1 − λ) v̄(q, ∅) for λ large enough).
Separation theorem
Properties of robust solutions
Implications for various persuasion models
Conditionally-independent signals (online supplement)
Bayesian persuasion: Calzolari and Pavan (2006), Brocas and Carrillo (2007), Rayo and Segal (2010), Kamenica and Gentzkow (2011), Ely (2017), Dworczak and Martini (2019), ...
Surveys: Bergemann and Morris (2019), Kamenica (2019)
Information design with adversarial coordination: Inostroza and Pavan (2018), Mathevet, Perego, and Taneva (2019), Morris et al. (2019), Ziegler (2019)
Persuasion with unknown beliefs: Kolotilin et al. (2017), Laclau and Renou (2017), Guo and Shmaya (2018), Hu and Weng (2019), Kosterina (2019)
Max-max over max-min design: Börgers (2017)
Plan
1. Introduction
2. Model
3. Robust Solutions
4. Separation Theorem
5. Corollaries
6. Applications
7. Conditionally-independent Robust Solutions (another day)
Payoff-relevant state: ω ∈ Ω (finite)
Prior: µ0 ∈ ∆Ω
Sender's "signal": q : Ω → ∆S, with S the set of signal realizations
(Reduced-form description of) Sender's payoff, given induced posterior µ ∈ ∆Ω:
V̄(µ): highest payoff
V(µ): lowest payoff
Difference between V̄ and V: strategy selection (multiple Receivers), tie-breaking (single Receiver)
Nature designs an information structure π : Ω × S → ∆R, with R the set of signal realizations.
Multiple Receivers: discriminatory disclosures embedded into the derivation of V(µ) — given common posterior µ, Nature provides (possibly private) signals to the agents and coordinates them on the course of action most adversarial to the Sender (among those consistent with the assumed solution concept), e.g., Bayes-correlated equilibrium given µ.
Conditioning on the Sender's signal: information acquisition (after hearing from the Sender), correlated noise, maximal concern for robustness.
Online Appendix: conditionally independent signals.
Robust Solutions
Sender's expected payoffs when the Sender selects signal q and Nature selects signal π:

v̄(q, π) ≡ ∫ V̄(µs,r) dπ(r|ω, s) dq(s|ω) dµ0(ω)

v(q, π) ≡ ∫ V(µs,r) dπ(r|ω, s) dq(s|ω) dµ0(ω)

where µs,r is the common posterior obtained from (q, π).
Definition 1
Signal q is worst-case optimal if, for all signals q′,
infπ v(q, π) ≥ infπ v(q′, π)
(maximal payoff guarantee).
Given any posterior µ ∈ ∆Ω, Sender's (lowest) payoff if, starting from µ, the state is fully revealed:

V full(µ) ≡ Σω V(δω) µ(ω)

where δω is the Dirac measure assigning probability 1 to ω.
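In code, V full is just an expectation of the degenerate-belief payoffs. A minimal sketch with assumed two-state prosecutor-style primitives (the payoff numbers and prior below are illustrations, not from the slides):

```python
# Hypothetical two-state prosecutor example: the judge convicts only when
# "guilty" is revealed, so V(delta_guilty) = 1 and V(delta_innocent) = 0.
def V_full(mu, V_delta):
    """Sender's payoff when the state is fully revealed, starting from belief mu."""
    return sum(V_delta[w] * p for w, p in mu.items())

V_delta = {"innocent": 0.0, "guilty": 1.0}  # payoffs at degenerate beliefs (assumed)
mu0 = {"innocent": 0.7, "guilty": 0.3}      # prior (assumed)
print(V_full(mu0, V_delta))                 # 0.3
```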
Remark 1
Since both Nature and the Sender can reveal the state, signal q is worst-case optimal iff
infπ v(q, π) = V full(µ0).
W : set of worst-case optimal signals non-empty (full disclosure is worst-case optimal)
Definition 2
Signal qRS is a robust solution if it maximizes v̄(q, ∅) over W (where ∅ denotes Nature providing no information).
Lexicographic preferences: max-max over max-min policies
step 1: max-min (worst-case optimal policies)
step 2: max-max (highest payoff if Nature and Receivers play favorably)
Clearly, qRS also maximizes supπ v̄(q, π) over W. However, the Sender prefers to provide information herself rather than counting on Nature to do it.
Lemma 1
Signal qRS is a robust solution iff the distribution over posterior beliefs ρRS ∈ ∆∆Ω that qRS induces maximizes ∫ V̄(µ) dρ(µ) subject to (a) "Bayes plausibility" (BP): ∫ µ dρ(µ) = µ0, and (b) "worst-case optimality" (WCO).

Bayesian solutions: qBP maximizes v̄(q, ∅) over Q (all feasible signals); the induced distribution over posterior beliefs ρBP ∈ ∆∆Ω maximizes ∫ V̄(µ) dρ(µ) subject to Bayes plausibility.
Robust solutions: qRS maximizes v̄(q, ∅) over W ⊂ Q (worst-case optimal signals); the induced distribution over posterior beliefs ρRS ∈ ∆∆Ω maximizes ∫ V̄(µ) dρ(µ) subject to Bayes plausibility and WCO.
Separation Theorem
Theorem 1
Let F ≡ {B ⊆ Ω : V(µ) ≥ V full(µ) for ALL µ ∈ ∆B}. Then
W = {ρ ∈ ∆∆Ω : ρ satisfies BP and supp(µ) ∈ F for ALL µ ∈ supp(ρ)}.
Therefore, ρRS ∈ ∆∆Ω is a robust solution iff ρRS maximizes ∫ V̄(µ) dρ(µ) subject to BP and, for all µ ∈ supp(ρ), supp(µ) ∈ F.

Idea: suppose the Sender induces a posterior µ with supp(µ) = B for which there exists η ∈ ∆B s.t. V(η) < V full(η). Starting from µ, Nature can induce η with strictly positive probability, thereby bringing the Sender's payoff strictly below V full(µ). Because Nature can respond to any other posterior µ′ ∈ supp(ρ) by fully disclosing the state, the Sender's overall worst-case payoff satisfies ∫ v(µ̃) dρ(µ̃) < V full(µ0), so policy ρ is not worst-case optimal.
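The support condition defining F can be brute-force checked on a two-state support by scanning a grid of beliefs. A sketch with hypothetical prosecutor-style primitives (the payoff functions below are assumptions for illustration, not taken from the paper):

```python
# Check whether a two-state support B = {w1, w2} belongs to F, i.e. whether
# V(eta) >= V_full(eta) for every belief eta on B (grid approximation).
def pair_in_F(V_low, V_delta, w1, w2, n=1000):
    for k in range(n + 1):
        lam = k / n
        eta = {w1: lam, w2: 1.0 - lam}
        v_full = lam * V_delta[w1] + (1.0 - lam) * V_delta[w2]
        if V_low(eta) < v_full - 1e-12:
            return False  # Nature profits from inducing eta: support unsafe
    return True

# Hypothetical primitives: judge convicts iff P(guilty) > 1/2 (adversarial tie-break).
V_delta = {"innocent": 0.0, "guilty": 1.0}
V_low = lambda eta: 1.0 if eta["guilty"] > 0.5 else 0.0

print(pair_in_F(V_low, V_delta, "guilty", "innocent"))  # False: pooling is unsafe
```

Since the pooled support fails the test, any robust solution in this toy example must separate the two states, as Theorem 1 requires.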
[Figure: Prosecutor example — Sender's expected payoff before vs. after Nature's disclosure]
Corollaries
Corollary 1
A robust solution always exists.
Existence is guaranteed by the possibility for Nature to condition on the realization of the Sender's signal.
Corollary 2
Suppose there exist ω, ω′ ∈ Ω and λ ∈ (0, 1) s.t.
V(λδω + (1 − λ)δω′) < λV(δω) + (1 − λ)V(δω′).
Then any robust solution must separate ω and ω′.
Assumption: there exists a belief supported on {ω, ω′} under which the Sender's payoff is below the full-disclosure payoff.
Conclusion: ALL posterior beliefs must separate ω and ω′.
Corollary 3
A Bayesian solution ρBP is robust iff, for any µ ∈ supp(ρBP) and any η ∈ ∆Ω s.t. supp(η) ⊂ supp(µ), V(η) ≥ V full(η).
Binary state: any robust solution is either full disclosure or a Bayesian solution.
Corollary 4
W is closed under Blackwell dominance: if ρ′ ∈ W and ρ Blackwell dominates ρ′, then ρ ∈ W.
This result is not true in the case of conditionally independent signals.
Corollary 5
Given any Bayesian solution ρBP, there exists a robust solution ρRS s.t. either ρRS and ρBP are not comparable in the Blackwell order, or ρRS Blackwell dominates ρBP.
If a Bayesian solution ρBP is Blackwell more informative than a robust solution ρRS, then ρBP is also robust.
The reason why robustness calls for more disclosure has little to do with indifference on the Sender's part: concealing information gives Nature more room for adversarial design.
If a Bayesian solution ρBP is not robust and is strictly Blackwell dominated by a robust solution ρRS, then ρRS separates states that are not separated under ρBP — robustness never calls for mean-preserving spreads (MPS) with the same supports.
Let vlow := minω∈Ω V(δω) − 1. Auxiliary function:

V F(µ) = V̄(µ) if supp(µ) ∈ F, and V F(µ) = vlow otherwise.

Corollary 6
A feasible distribution ρ ∈ ∆∆Ω is robust iff it maximizes ∫ V F(µ) dρ(µ) subject to Bayes plausibility.
Furthermore, there always exists a robust solution ρ with |supp(ρ)| ≤ |Ω|.
V F is upper semi-continuous: the results follow from arguments similar to those in the Bayesian persuasion literature.
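Corollary 6 turns robust design into a standard concavification with V̄ replaced by V F. A grid sketch for a hypothetical binary-state prosecutor example (all primitives below are assumptions; with two states, optimal splits use at most two posteriors, so searching over pairs of grid points suffices):

```python
# Concavify a payoff function on a grid of binary-state posteriors mu(guilty),
# evaluating the concave envelope at the prior mu0 via pairs of grid points.
def concavify(values, grid, mu0):
    best = float("-inf")
    for i, lo in enumerate(grid):
        for j, hi in enumerate(grid):
            if lo <= mu0 <= hi:
                if hi == lo:
                    val = values[i]             # only reachable when lo == mu0
                else:
                    t = (mu0 - lo) / (hi - lo)  # weight on posterior hi
                    val = (1 - t) * values[i] + t * values[j]
                best = max(best, val)
    return best

mu0 = 0.3                            # prior prob. of "guilty" (assumed)
grid = [k / 100 for k in range(101)]
V_bar = [1.0 if m >= 0.5 else 0.0 for m in grid]  # convict iff mu >= 1/2 (favorable tie-break)
v_low = -1.0                                      # below min_w V(delta_w)
V_F = [vb if m in (0.0, 1.0) else v_low           # only degenerate supports in F here
       for m, vb in zip(grid, V_bar)]

print(concavify(V_bar, grid, mu0))  # 0.6: Bayesian value (split prior onto {0, 1/2})
print(concavify(V_F, grid, mu0))    # 0.3: robust value = V_full(mu0), full disclosure
```

In this toy example the Bayesian solution pools states and is not worst-case optimal, so the robust value drops to the full-disclosure guarantee, matching Corollary 3's binary-state dichotomy.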
Applications
Bergemann, Brooks, Morris (2015): the designer segments the market to maximize a weighted combination of consumer surplus (CS) and producer surplus (PS).
Proposition 1
Suppose the Pareto weight on the seller's surplus is strictly positive. Then full disclosure is the unique robust solution. When, instead, the designer cares only about consumer surplus, an optimal signal is the BBM solution.
For any posterior µ assigning positive measure to more than one buyer value, Nature can construct a posterior η inducing the seller to ask the highest price in supp(µ), thus inducing no trade with buyers whose values are below that price. Total surplus under η is strictly below the full-information level (if the weight on PS is strictly positive).
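The price-posting logic behind Proposition 1 can be illustrated with a two-value market (all numbers assumed, not from BBM): once Nature shifts enough weight onto the high value, posting the highest value in the support becomes optimal for the seller, and total surplus falls below the full-information level.

```python
# Hypothetical two-value market: Nature moves the belief to eta under which
# posting the highest value is optimal, killing trade with low-value buyers.
values = [1.0, 2.0]  # possible buyer values (assumed)

def best_price(eta):
    """Monopoly price given belief eta (probabilities over `values`)."""
    return max(values,
               key=lambda price: price * sum(q for v, q in zip(values, eta) if v >= price))

eta = [0.4, 0.6]                 # assumed posterior Nature induces
p = best_price(eta)              # 2.0: only high-value buyers trade
surplus = sum(v * q for v, q in zip(values, eta) if v >= p)
full_info = sum(v * q for v, q in zip(values, eta))
print(p, surplus < full_info)    # 2.0 True: total surplus below full information
```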
Guo and Shmaya (2019)
Exogenous price p ∈ (0, 1)
Seller's payoff: 1 if trade, 0 otherwise
Buyer's exogenous private information: f(t|ω), satisfying MLRP
Proposition 2
Any robust solution separates states ω ≤ p from states ω′ > p.
Starting from any µ s.t. supp(µ) ⊃ {ω, ω′} with ω < p < ω′, Nature can induce a posterior η s.t. Eη[ω] < p, thus inducing no trade. The Sender's payoff given η is below the full-information payoff. Hence, given such ρ ∈ ∆∆Ω, Nature can bring the Sender's payoff below V full(µ0), so policy ρ is not worst-case optimal.
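The belief Nature uses against a straddling support is easy to make explicit (all numbers below are assumed): with states w_lo < p < w_hi, any posterior putting weight above (w_hi − p)/(w_hi − w_lo) on w_lo has mean below the price, so the buyer walks away.

```python
# Sketch of Nature's adversarial posterior against a support straddling p.
p = 0.5                  # exogenous price (assumed)
w_lo, w_hi = 0.2, 0.9    # two states in the support of mu (assumed), w_lo < p < w_hi

cut = (w_hi - p) / (w_hi - w_lo)   # threshold weight on w_lo for mean = p
eta_lo = cut + 0.05                # Nature picks any weight above the threshold
mean = eta_lo * w_lo + (1 - eta_lo) * w_hi

print(mean < p)  # True: at belief eta the expected value is below p, so no trade
```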
Seller's value: ω (known to the seller)
Buyer's value: ω + ∆, with ∆ > 0 (unknown to the buyer)
Exogenous price p drawn from U[0, 1]
Trade occurs iff p ≥ ω (seller accepts) and Eµ[ω̃ | ω̃ ≤ p] + ∆ > p (buyer accepts)
Seller designs the information structure
Proposition 3
Under any robust solution ρRS, for any µ, µ′ ∈ supp(ρRS) with µ ≠ µ′, diam(supp(µ)), diam(supp(µ′)) ≤ ∆ but diam(supp(µ) ∪ supp(µ′)) > ∆.
Robust solutions are minimally informative among those that eliminate adverse selection!
Continuum of Receivers, each choosing ai ∈ {0, 1}
A ∈ [0, 1]: aggregate "attack"
Payoff from not attacking: 0
Payoff from attacking: positive if A ≥ ω; b < 0 if A < ω
Designer's payoff: 1 − A
Bayesian solution: upper censorship
reveals each ω < 0 w.p. γBP ∈ (0, 1) (w.p. 1 − γBP, reveals nothing)
conceals all ω > 0
Proposition 4
The robust solution reveals each ω < 0 w.p. γ∗ > γBP, conceals all ω ∈ [0, 1], and reveals all ω > 1 with certainty.
Revelation of the upper dominance region: otherwise Nature can construct η inducing A = 1 also for ω > 1.
Conditionally-independent Robust Solutions
Nature cannot condition on the realization of the Sender's signal: π : Ω → ∆R (so far: π : Ω × S → ∆R).
A CI-robust solution may fail to exist; it exists if V is continuous.
Definition
A feasible distribution ρ ∈ ∆∆Ω is a weak CI-robust solution if it maximizes

Theorem
A weak CI-robust solution exists, no matter what V is.
Sufficient conditions for state separation under CI-robust solutions are weaker than those for robust solutions: whenever ω and ω′ must be separated under CI-robust solutions, they must also be separated under robust solutions.
Sufficient conditions for full disclosure to be the unique CI-robust solution, and for all distributions to be CI-worst-case optimal.
Unlike robust solutions, CI-robust solutions for binary states need not coincide with Bayesian solutions or full disclosure.
Unlike robust solutions, CI-robust solutions need not be Blackwell more informative than Bayesian solutions. Example in which the unique Bayesian solution is Blackwell strictly more informative than all CI-robust solutions: since Nature cannot engineer an MPS conditional on s, ρRS may be Blackwell less informative than ρBP and yet Nature may not be able to inflict on the Sender the same payoff as under ρBP.
Bayesian persuasion when the Sender is uncertain about Receivers' information and strategy selection
Robust solutions: best-case optimal among worst-case optimal ones (max-max over max-min criterion)
Separation theorem: any pair of states over which Nature can construct a belief yielding less than the full-information payoff are separated
Robustness: more disclosure, but only through more separation (not MPSs over the same supports)
Relatively simple two-step design procedure:
step 1: Nature designs information to minimize the Sender's payoff
step 2: the designer solves a standard persuasion problem over the restricted set of worst-case optimal policies
Implications for applications