
Optimal Auctions for Correlated Buyers with Sampling

Nima Haghpanah (Penn State)

Joint with Hu Fu (UBC), Jason Hartline (Northwestern), Robert Kleinberg (Cornell)

October 18, 2017


Overview

Auctions:

◮ Cremer-McLean ’88: Full surplus extraction “generically” possible
◮ Full knowledge of the distribution: auction heavily detail-dependent

This work:

◮ Seller does not know the distribution: observes samples (past auctions)
◮ Learning + CM approach fails
◮ Full surplus extraction with “enough” information
  1. Extension of the CM approach
  2. Characterize the number of samples required
◮ Idea: samples as a randomization device, not for learning

The Model

1 item, n buyers (quasilinear, finite type space)
Joint distribution f: (v1, . . . , vn) = v ∼ f

◮ Buyers and seller know that F = {f^1, . . . , f^m} contains f
◮ f = f^j for an unknown j ∈ {1, . . . , m}
◮ Signal s drawn from g^j, independent of v
◮ Example: s = (s1, . . . , sk), i.i.d. samples from f^j

A mechanism specifies (allocation, payment) given (v, s)

◮ DSIC: ui(vi → vi, v−i, s) ≥ ui(vi → v′i, v−i, s), ∀ vi, v′i, v−i, s
◮ Interim IR: E_{v∼f^j, s∼g^j}[ui(vi → vi, v−i, s) | vi] ≥ 0, ∀ vi, j
◮ Goal (full surplus extraction, FSE): E_{v∼f^j, s∼g^j}[revenue] = E_{v∼f^j}[maximum value], ∀ j

Special case (Cremer-McLean): F = {f} (signal redundant)
From the definitions: if FSE is possible for F, it is also possible for any {f^j}

{0, 1} Examples

Two joint distributions over (v1, v2) ∈ {0, 1}² (rows index v1, columns index v2):

f^1:          v2 = 0   v2 = 1
   v1 = 0      1/4      1/8
   v1 = 1      3/8      1/4

f^2:          v2 = 0   v2 = 1
   v1 = 0      1/4      3/8
   v1 = 1      1/8      1/4

Cremer & McLean:
◮ If F = {f^1}, ∃ a mechanism that extracts full surplus
◮ If F = {f^2}, ∃ a mechanism that extracts full surplus

If F = {f^1, f^2}, signal: i.i.d. samples from the true distribution
◮ Will see: if #samples ≥ 1, ∃ a mechanism that extracts full surplus
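As a sanity check on these numbers, the two joint distributions can be written as 2×2 arrays and their conditionals recovered by row normalization (a minimal sketch, assuming numpy; not part of the talk):

```python
import numpy as np

# Joint distributions over (v1, v2); rows index v1, columns index v2.
f1 = np.array([[1/4, 1/8],
               [3/8, 1/4]])
f2 = np.array([[1/4, 3/8],
               [1/8, 1/4]])

# Both are probability distributions.
assert np.isclose(f1.sum(), 1) and np.isclose(f2.sum(), 1)

# Conditional distribution of v2 given v1 (normalize each row).
cond1 = f1 / f1.sum(axis=1, keepdims=True)
cond2 = f2 / f2.sum(axis=1, keepdims=True)

assert np.allclose(cond1[1], [3/5, 2/5])  # Pr[v2 = 0 | v1 = 1] = 3/5 under f1
assert np.allclose(cond2[1], [1/3, 2/3])  # Pr[v2 = 0 | v1 = 1] = 1/3 under f2
```

These conditionals are exactly the quantities used later when the second-price utilities and side payments are computed.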

Main Result

◮ F = {f^1, . . . , f^m}
◮ Recall: if FSE is possible for F, then FSE is possible for all {f^j}
◮ f is a CM distribution if FSE is possible for {f}
◮ Dimension of F: the dimension of the linear space spanned by f^1, . . . , f^m
◮ Example: f^1 = (1/4, 1/8, 3/8, 1/4), f^2 = (1/4, 3/8, 1/8, 1/4); dimension of {f^1, f^2} = 2

Theorem
FSE is possible for all F of CM distributions with size m and dimension d iff #samples ≥ m − d + 1.

Implications:
◮ 2 ≤ dimension ≤ m: at most m − 1 samples; at least 1 sample
◮ With linearly independent distributions, 1 sample is enough
◮ With 2 distributions, 1 sample is enough

Learning approach: only 1 − ε approximation guarantees
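The dimension d in the theorem is just the rank of the distributions stacked as vectors, so the sample bound is easy to compute for the running example (a minimal sketch, assuming numpy):

```python
import numpy as np

# The example family: each joint distribution flattened to a vector.
F = np.array([
    [1/4, 1/8, 3/8, 1/4],   # f^1
    [1/4, 3/8, 1/8, 1/4],   # f^2
])

m = F.shape[0]
d = np.linalg.matrix_rank(F)   # dimension of span{f^1, f^2}
samples_needed = m - d + 1     # bound from the theorem

assert d == 2
assert samples_needed == 1     # 2 linearly independent distributions: 1 sample
```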

The Plan

1. “Learning” may require many samples, and is not necessary for FSE
2. Samples for randomization: FSE with #samples ≥ m − d + 1
3. Necessity of #samples ≥ m − d + 1

What is Learning?

◮ A mechanism is a map M : V × S → O
◮ For each s, M(·, s) : V → O
◮ M(·, s) is an “auction”: M(·, s) ∈ A := {h | h : V → O}
◮ So a mechanism is a map M : S → A from signals to auctions

Definition
Mechanism M learns with probability δ if ∃ disjoint A1, . . . , Am ⊆ A such that
Pr_{s∼g^j}[M(s) ∈ Aj] ≥ δ, ∀ j

# of Samples Needed to Learn?

◮ F = {f^1, f^2}:

f^1:          v2 = 0      v2 = 1
   v1 = 0      1/4        1/4 − ε
   v1 = 1      1/4 + ε    1/4

f^2:          v2 = 0      v2 = 1
   v1 = 0      1/4        1/4 + ε
   v1 = 1      1/4 − ε    1/4

Lemma
Learn with probability δ ⇒ #samples ≥ (2δ(1 − ε²)/ε) · (log(1/(1 − δ)) − 1).
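Plugging numbers into the lemma shows why learning is expensive when the two distributions are close (ε small). A sketch of the arithmetic, taking the logarithm as the natural log (an assumption about the slide's notation):

```python
import math

def learning_lower_bound(delta: float, eps: float) -> float:
    """Lower bound on #samples from the lemma:
    (2*delta*(1 - eps^2)/eps) * (log(1/(1 - delta)) - 1)."""
    return 2 * delta * (1 - eps**2) / eps * (math.log(1 / (1 - delta)) - 1)

# Learning with probability 0.9 when the distributions differ by eps = 0.01:
bound = learning_lower_bound(0.9, 0.01)
assert bound > 200   # already hundreds of samples

# ...and the bound grows as eps shrinks.
assert learning_lower_bound(0.9, 0.001) > learning_lower_bound(0.9, 0.01)
```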

Class 1: Select an FSE Auction

Select the auction that extracts full surplus on one of the distributions.
Problem: if the true distribution is f^1,
◮ With positive probability, the FSE auction for f^1 is selected: IR is tight
◮ With positive probability, the FSE auction for f^2 is selected: IR is violated

Proposition
If E_{v∼f^j, s∼g^j}[ui(vi → vi, v−i, s) | vi] ≥ −σ, ∀ vi, j, then
#samples ≥ (2(1 − 16σ)(1 − ε²)/ε) · (log(1/(8σ)) − 1).

Class 2: Select an Auction that is IR for All j

Lemma
Any auction that is IR for both distributions satisfies
E_{f^1}[revenue] + E_{f^2}[revenue] ≤ 1.8 × Full Surplus.

Proposition
If M(s) is IR for all j and s, and E_{f^j}[revenue] ≥ δ × Full Surplus for all j, then
#samples ≥ (1.8δ(1 − ε²)/ε) · (log(1/(1 − δ)) − 1).

Surplus Extraction Possible without Learning

◮ F = {f^1, . . . , f^n}
◮ f^j: draw v1, . . . , vn i.i.d. U{0, 1}
◮ With probability ε: if vj = 0, swap vj with v_{j′} for a random j′
◮ So Pr_{f^j}[vj = 1] > 1/2

Proposition
With 1 sample:
◮ ∃ an FSE mechanism
◮ ∄ a mechanism that learns with probability ε + 1/n
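A quick Monte Carlo of the construction above (a sketch, assuming numpy; taking the swap partner j′ uniform among the other agents is an assumption). Under f^j the own coordinate ends up 1 with probability 1/2 + ε/4 > 1/2:

```python
import numpy as np

rng = np.random.default_rng(0)
n, eps, N = 4, 0.4, 200_000
j = 0  # simulate f^j with j = 0

v = rng.integers(0, 2, size=(N, n))            # v1..vn i.i.d. uniform on {0, 1}
do_swap = (rng.random(N) < eps) & (v[:, j] == 0)

# Swap v_j with v_{j'} for a uniformly random other agent j'.
partner = rng.integers(1, n, size=N)           # j' in {1, ..., n-1}
rows = np.where(do_swap)[0]
cols = partner[rows]
tmp = v[rows, cols].copy()
v[rows, cols] = v[rows, j]
v[rows, j] = tmp

p = v[:, j].mean()
# Expected: 1/2 + eps/4 = 0.6 under this reading of the construction.
assert abs(p - 0.6) < 0.01
assert p > 0.5   # Pr_{f^j}[v_j = 1] > 1/2, as claimed
```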

The Plan

1. “Learning” may require many samples, and is not necessary for FSE
2. Samples for randomization: FSE with #samples ≥ m − d + 1
3. Necessity of #samples ≥ m − d + 1

Back to the Cremer-McLean {0, 1} Example

f^1:          v2 = 0   v2 = 1
   v1 = 0      1/4      1/8
   v1 = 1      3/8      1/4

Second price auction:
◮ Buyer 1 has utility 1 iff v1 = 1 and v2 = 0; Pr[v2 = 0 | v1 = 1] = 3/5
◮ So u1(0) = 0 and u1(1) = 3/5

Second price + side payment (a function of the other agent’s report):
◮ P1(v2 = 0) = −3, P1(v2 = 1) = 6
◮ E[P | v1 = 0] = (2/3) · (−3) + (1/3) · 6 = 0 ⇒ net u1(0) = 0 − 0 = 0
◮ E[P | v1 = 1] = (3/5) · (−3) + (2/5) · 6 = 3/5 ⇒ net u1(1) = 3/5 − 3/5 = 0

CM Auction: second price + side payments

CM Auction

For each bidder i, find a side payment P(v−i) such that
◮ ∀ vi: E_{v−i|vi}[P(v−i)] = u(vi) (the utility of the second-price auction)

In matrix form, with one row per vi and one column per v−i:

   [ Pr[v−i | vi] ] · [ P(v−i) ] = [ u(vi) ]

Theorem (CM’88)
Full surplus extraction is possible if the rows of the conditional probability matrix are linearly independent.
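The side payments from the example can be recovered by solving exactly this linear system for f^1 (a minimal sketch, assuming numpy): the conditional matrix has rows Pr[v2 | v1 = 0] = (2/3, 1/3) and Pr[v2 | v1 = 1] = (3/5, 2/5), and the targets are the second-price utilities (0, 3/5).

```python
import numpy as np

f1 = np.array([[1/4, 1/8],
               [3/8, 1/4]])                 # joint of (v1, v2); rows index v1
cond = f1 / f1.sum(axis=1, keepdims=True)   # rows: Pr[v2 | v1]

# CM condition: rows of the conditional matrix are linearly independent.
assert np.linalg.matrix_rank(cond) == 2

u = np.array([0, 3/5])        # second-price utilities u1(0), u1(1)
P = np.linalg.solve(cond, u)  # side payment P1(v2)

assert np.allclose(P, [-3, 6])  # matches P1(v2 = 0) = -3, P1(v2 = 1) = 6
```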

Main Result (Sufficiency)

Theorem (Sufficiency)
FSE if the CM-conditions hold ∀ f^j and #samples ≥ m − d(F) + 1.

Mechanism: second price & side payments (now also depending on the signal)
◮ ∀ vi, j: E_{s∼g^j, v−i∼f^j|vi}[P(v−i, s)] = u^j(vi) (the utility of the second-price auction)

In matrix form, with one row per pair (vi, j), for |Vi| × |F| rows in total, and one column per (v−i, s):

   [ Pr_j[v−i, s | vi] ] · [ P(v−i, s) ] = [ u^j(vi) ]

◮ Argued: FSE if the rows of the matrix are linearly independent.
◮ To argue: the conditions of the theorem ⇒ the rows are linearly independent.

Back to the {0, 1} Example

Two distributions:

f^1:          v2 = 0   v2 = 1
   v1 = 0      1/4      1/8
   v1 = 1      3/8      1/4

f^2:          v2 = 0   v2 = 1
   v1 = 0      1/4      3/8
   v1 = 1      1/8      1/4

Mechanism: second price & side payments P1(v2, s), s ∈ {1, 2}. The system, with rows indexed by (v1, j) and columns by (v2, s):

(v1, j)\(v2, s)   (0, 1)      (0, 2)      (1, 1)      (1, 2)
(0, 1)            2/3 · 3/8   2/3 · 1/8   1/3 · 3/8   1/3 · 1/8
(1, 1)            3/5 · 3/8   3/5 · 1/8   2/5 · 3/8   2/5 · 1/8
(0, 2)            2/5 · 1/8   2/5 · 3/8   3/5 · 1/8   3/5 · 3/8
(1, 2)            1/3 · 1/8   1/3 · 3/8   2/3 · 1/8   2/3 · 3/8

multiplied by the vector (P1(0, 1), P1(0, 2), P1(1, 1), P1(1, 2)), with right-hand side
(u(0, 1), u(1, 1), u(0, 2), u(1, 2)) = (0, 3/5, 0, 1/3).

◮ Solving this system: P1(0, 1) = −6, P1(0, 2) = −6, P1(1, 1) = 16, P1(1, 2) = 0
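The 4×4 system above has Kronecker-structured rows f^j(· | v1) ⊗ g^j and can be solved directly (a sketch, assuming numpy; the g^j columns are taken exactly as displayed on the slide, proportional to (3, 1) and (1, 3)):

```python
import numpy as np

# Conditional rows Pr[v2 | v1] of f^1 and f^2.
cond = {1: np.array([[2/3, 1/3], [3/5, 2/5]]),
        2: np.array([[2/5, 3/5], [1/3, 2/3]])}
# Signal rows as displayed on the slide.
g = {1: np.array([3/8, 1/8]), 2: np.array([1/8, 3/8])}

# Rows ordered (v1, j) = (0,1), (1,1), (0,2), (1,2);
# columns ordered (v2, s) = (0,1), (0,2), (1,1), (1,2).
M = np.array([np.kron(cond[j][v1], g[j])
              for j, v1 in [(1, 0), (1, 1), (2, 0), (2, 1)]])

u = np.array([0, 3/5, 0, 1/3])     # u^j(v1) in the same row order
assert np.linalg.matrix_rank(M) == 4

P = np.linalg.solve(M, u)          # side payments P1(v2, s)
assert np.allclose(M @ P, u)       # correct interim utilities under both f^j
assert np.allclose(sorted(P), [-6, -6, 0, 16])
```

The rank-4 check is the “rows linearly independent” condition; because it holds, the side payments are uniquely determined.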

Completing the Proof of the Main Theorem

Lemma
If the CM-conditions hold ∀ f^j and #samples ≥ m − d(F) + 1, then the rows are linearly independent.

Each row of the matrix factors as a tensor product: row (vi, j) = f^j(· | vi) ⊗ g^j(·). In the example above, the 4×4 matrix is the row-wise tensor product of the conditional rows (2/3, 1/3), (3/5, 2/5), (2/5, 3/5), (1/3, 2/3) with the signal rows (3/8, 1/8), (3/8, 1/8), (1/8, 3/8), (1/8, 3/8). For F = {f^H, f^L}, the rows are:

   (0, H): f^H(· | 0) ⊗ g^H(·)
   (1, H): f^H(· | 1) ⊗ g^H(·)
   (0, L): f^L(· | 0) ⊗ g^L(·)
   (1, L): f^L(· | 1) ⊗ g^L(·)

◮ Sufficient: g^1, . . . , g^m are linearly independent.
◮ Suppose a dependence among the rows: ∀ v−i, s: Σ_{vi, j} β(vi, j) f^j(v−i | vi) g^j(s) = 0.
◮ Independence of the g^j gives: ∀ v−i, j: Σ_{vi} β(vi, j) f^j(v−i | vi) = 0.
◮ The CM-condition for each f^j (its conditional rows are independent) then forces β(vi, j) = 0 for all vi, j.
◮ To argue: g^1, . . . , g^m are independent if #samples ≥ m − d(F) + 1.

The Last Step of the Proof

Claim: g^1, . . . , g^m are independent if #samples ≥ m − d(F) + 1.
The signal is k i.i.d. draws from f^j, so g^j = (f^j)^⊗k.

Lemma
∀ V = {v1, . . . , vm}: (v1)^⊗k, . . . , (vm)^⊗k are linearly independent if k ≥ m − d(V) + 1.

Example (tensor squares of 2-dimensional vectors):

   (0.7, 0.3) ⊗ (0.7, 0.3) = (0.49, 0.21, 0.21, 0.09)
   (0.4, 0.6) ⊗ (0.4, 0.6) = (0.16, 0.24, 0.24, 0.36)

Any m = 3 pairwise non-proportional vectors in 2 dimensions are linearly dependent (d = 2), yet their tensor squares, with k = 2 = m − d + 1, become linearly independent.
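The lemma can be checked numerically: three pairwise non-proportional vectors in 2 dimensions are dependent (m = 3, d = 2), but their tensor squares, k = 2 = m − d + 1, are independent (a sketch, assuming numpy; the particular trio of vectors is illustrative):

```python
import numpy as np

# Three probability vectors in 2 dimensions (any 3 such vectors are dependent).
V = np.array([[0.7, 0.3],
              [1/3, 2/3],
              [0.4, 0.6]])
assert np.linalg.matrix_rank(V) == 2          # m = 3, d(V) = 2

# Tensor squares: k = 2 = m - d(V) + 1.
K = np.array([np.kron(v, v) for v in V])
assert np.allclose(K[0], [0.49, 0.21, 0.21, 0.09])
assert np.allclose(K[2], [0.16, 0.24, 0.24, 0.36])
assert np.linalg.matrix_rank(K) == 3          # the squares are independent
```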

The Plan

1. “Learning” may require many samples, and is not necessary for FSE
2. Samples for randomization: FSE with #samples ≥ m − d + 1
3. Necessity of #samples ≥ m − d + 1

Related Work

Genericness of the CM-conditions:
◮ Generic: Barelli (2009); Chen and Xiong (2013)
◮ Non-generic: Heifetz and Neeman (2006); Chen and Xiong (2011)

Robustness / misspecification / model uncertainty:
◮ Bergemann and Schlag (2011); Madarasz and Prat (2016)

Prior independence / sampling:
◮ Hartline and Roughgarden (2011)

Ex-post IR:
◮ Papadimitriou and Pierrakos (2011)

Conclusions

Two points:
◮ Prior-free mechanisms: identify a family F instead of the distribution f
◮ Strengthening “CM as critique”
  ◮ FSE: knowledge of the distribution not required

Idea:
◮ Randomization instead of learning

Extensions:
◮ Remove the finiteness assumptions

Thanks!