It’s Not What Machines Can Learn, It’s What We Cannot Teach
ICML 2020
Gal Yehuda, Moshe Gabel, Assaf Schuster
Applications of machine learning
Prates, Avelar, Lemos, Lamb, Vardi. Learning to Solve NP-Complete Problems: A Graph Neural Network for Decision TSP. AAAI 2019.
The typical pipeline: generate data → propose architecture, features, and embedding → train model → evaluate → success.
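This loop can be sketched in a few lines. The four callables stand in for the user's own data generator, model builder, trainer, and evaluator; all names and the accuracy threshold are hypothetical, not from the talk.

```python
def supervised_pipeline(generate_data, build_model, train, evaluate,
                        target_acc=0.95):
    """One pass of the usual 'ML for hard problems' recipe:
    generate labeled instances, propose a model, train, evaluate.

    Every argument is a placeholder for a user-supplied step;
    returns True when the evaluated accuracy reaches the target.
    """
    data = generate_data()          # labeled (instance, YES/NO) pairs
    model = build_model()           # proposed architecture / embedding
    model = train(model, data)      # fit the model to the generated data
    return evaluate(model, data) >= target_acc
```

The point of the talk is that the first step, `generate_data`, is where the difficulty quietly leaks out.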
Slow (non-poly-time) data generation keeps the full NP-hard problem; fast (poly-time) data generation or augmentation restricts it to NP ∩ coNP.
[Figure: accuracy of models trained on augmented vs. sampled data, x-axis 10–100.]
[Diagram: the sampler outputs labeled instances, (x, YES) or (x, NO).]
The problem space seen by an efficient (poly-time) sampler is only a subset of the original problem space.
Even when the original problem L is NP-hard, a poly-time sampler that emits labeled pairs (x, YES/NO) for the question "is x in L?" induces a resulting problem that lies in NP ∩ coNP.
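To make the intuition concrete, here is a minimal sketch (not the paper's construction) of why an efficient sampler "knows" the answers it emits: a generator that plants a satisfying assignment produces only 3-SAT instances for which it already holds a short certificate. Function names and parameters are illustrative.

```python
import random

def planted_3sat(num_vars, num_clauses, seed=None):
    """Generate a satisfiable 3-CNF by planting a hidden assignment.

    Each clause is forced to contain at least one literal that the
    planted assignment satisfies, so the sampler itself holds a
    poly-size YES certificate for every instance it emits.
    Clauses use DIMACS-style ints: variable v or its negation -v.
    """
    rng = random.Random(seed)
    planted = {v: rng.choice([True, False]) for v in range(1, num_vars + 1)}
    clauses = []
    for _ in range(num_clauses):
        vars_ = rng.sample(range(1, num_vars + 1), 3)
        # Random polarities, then force one literal to agree with the plant.
        lits = [v if rng.random() < 0.5 else -v for v in vars_]
        i = rng.randrange(3)
        v = abs(lits[i])
        lits[i] = v if planted[v] else -v
        clauses.append(lits)
    return clauses, planted

def satisfies(clauses, assignment):
    """Check that an assignment satisfies every clause."""
    return all(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )
```

Instances drawn this way are never hard refutation cases: the sampler's own randomness certifies every label, which is the sense in which its output distribution lives in an easier class than the original problem.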
[Diagram: complexity classes ordered from easier to harder: P ⊆ NP ∩ coNP ⊆ NP, coNP; NP-complete; NP-hard.]
[Diagram: a poly-time sampler draws a sample x and outputs a YES/NO label for membership in L.]
[Diagram: the same complexity-class picture; the problem induced by a trivial sampler can even be decidable in constant time.]
Experiment: sample instances from the phase transition and label them using a solver (the Vampire theorem prover). Data augmentation then applies label-preserving transformations to each labeled instance, multiplying the number of training samples while keeping the Y/N labels unchanged.
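Label-preserving transformations for CNF formulas can be sketched as follows, using DIMACS-style integer encoding (a clause is a list of non-zero ints, negative means negated). This is an illustrative sketch, not the paper's exact augmentation code: permuting clauses, renaming variables, and flipping variable polarities all preserve satisfiability.

```python
import random

def permute_clauses(clauses, rng):
    """Reorder clauses; SAT status is order-independent."""
    out = [list(c) for c in clauses]
    rng.shuffle(out)
    return out

def rename_variables(clauses, rng):
    """Apply a random permutation to the variable names."""
    vars_ = sorted({abs(l) for c in clauses for l in c})
    target = list(vars_)
    rng.shuffle(target)
    mapping = dict(zip(vars_, target))
    return [[mapping[abs(l)] * (1 if l > 0 else -1) for l in c]
            for c in clauses]

def flip_polarity(clauses, rng):
    """Negate all occurrences of a random subset of variables.
    A satisfying assignment maps to one with those variables flipped."""
    vars_ = {abs(l) for c in clauses for l in c}
    flipped = {v for v in vars_ if rng.random() < 0.5}
    return [[-l if abs(l) in flipped else l for l in c] for c in clauses]

def augment(clauses, label, n, seed=0):
    """Produce n augmented copies that all carry the original label."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        c = permute_clauses(clauses, rng)
        c = rename_variables(c, rng)
        c = flip_polarity(c, rng)
        out.append((c, label))
    return out
```

Because each transformation is a bijection on assignments (or leaves them untouched), the solver-produced label transfers to every augmented copy for free, which is what makes augmentation a fast (poly-time) way to grow the training set.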
[Figure: training accuracy (%) and loss over 0–15 million samples; accuracy rises toward ~95% while loss falls toward ~0.2.]
Test set:   aug     all-cqc   µ(10, 8)
Accuracy:   0.942   0.804     0.647
≈30% accuracy drop from the augmented (aug) test set to the µ(10, 8) test set.
We will be happy to discuss the work and answer questions.
ygal@cs.technion.ac.il  mgabel@cs.toronto.edu