Scalable Evolutionary Search
Ke TANG
Shenzhen Key Laboratory of Computational Intelligence
Department of Computer Science and Engineering
Southern University of Science and Technology (SUSTech)
Email: tangk3@sustc.edu.cn
November 2018 @ CSBSE, BUCT
• Introduction
• General Ideas and Methodologies
• Case Studies
• Summary and Discussion
Motivating examples: railway timetabling, truss design, portfolio optimization, network optimization.
Typical characteristics: non-differentiable objective functions, non-differentiable constraints, and discrete or mixed-integer search spaces.
Real-world applications pose optimization problems of ever-growing size.
• Huge number of model parameters
• Huge volume of data
• Introduction
• General Ideas and Methodologies
• Case Studies
• Summary and Discussion
Learn the grouping of the decision variables, e.g., by clustering.
Z. Yang, K. Tang and X. Yao, "Large Scale Evolutionary Optimization Using Cooperative Coevolution," Information Sciences, 178(15): 2985-2999, August 2008.
W. Chen, T. Weise, Z. Yang and K. Tang, "Large-Scale Global Optimization Using Cooperative Coevolution with Variable Interaction Learning," in Proceedings of PPSN 2010.
[Figure: the problem is decomposed into groups of elementary variables, each optimized by its own search method.]
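As a concrete illustration, here is a minimal sketch of a cooperative-coevolution loop with random grouping, in the spirit of the divide-and-conquer idea above. The separable sphere objective, the fixed group size, and the simple (1+1)-style subcomponent optimizer are illustrative assumptions, not the exact algorithms from the cited papers.

```python
import random

def sphere(x):
    """Separable toy objective (minimization)."""
    return sum(v * v for v in x)

def cooperative_coevolution(f, dim=100, group_size=10, cycles=50, iters=20):
    # Context vector: the current best assembly of all variables.
    best = [random.uniform(-5, 5) for _ in range(dim)]
    best_f = f(best)
    for _ in range(cycles):
        # Random grouping: shuffle the indices and cut into fixed-size groups.
        idx = list(range(dim))
        random.shuffle(idx)
        groups = [idx[i:i + group_size] for i in range(0, dim, group_size)]
        for group in groups:
            # Optimize one group while the rest of the context stays frozen;
            # a (1+1) Gaussian-mutation hill climber stands in for the
            # subcomponent optimizer here.
            for _ in range(iters):
                cand = best[:]
                for j in group:
                    cand[j] += random.gauss(0, 0.5)
                cand_f = f(cand)
                if cand_f < best_f:
                    best, best_f = cand, cand_f
    return best, best_f

x, fx = cooperative_coevolution(sphere)
print(fx)
```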
Scale Multi-Objective Optimization,” IEEE Transactions on Evolutionary Computation, accepted on Oct. 30, 2018.
Idea: exploit the data generated during the search process.
[Table: search history; datum 1 … datum n, each recording the decision variables x1 … xD of an evaluated solution together with its quality.]
Build a surrogate model from this data to evaluate new candidate solutions x1 and x2 without invoking the expensive objective.
Transactions on Evolutionary Computation, 22(1): 143-156, February 2018.
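To make the idea concrete, here is a minimal sketch of surrogate-assisted pre-screening built from search-history data. The random-forest regressor (scikit-learn) and the pairwise pre-selection rule are illustrative assumptions; the cited work may use a different model and selection scheme.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Search history: rows are evaluated solutions (x1 ... xD) plus their quality.
rng = np.random.default_rng(0)
X_hist = rng.uniform(-5, 5, size=(200, 10))
y_hist = np.sum(X_hist**2, axis=1)   # placeholder for the true, expensive f

# Fit a cheap surrogate on the data generated during the search.
surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
surrogate.fit(X_hist, y_hist)

# Pre-screen two new candidates with the surrogate and spend a real
# evaluation only on the more promising one (minimization).
x1, x2 = rng.uniform(-5, 5, size=(2, 10))
pred1, pred2 = surrogate.predict(np.vstack([x1, x2]))
chosen = x1 if pred1 <= pred2 else x2
```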
[Figure: noise in fitness evaluation; regression.]
Use resampling to reduce the effect of noise; the sample size should be carefully selected.
C. Qian, Y. Yu, K. Tang, Y. Jin, X. Yao and Z.-H. Zhou, "On the Effectiveness of Sampling for Evolutionary Optimization in Noisy Environments," Evolutionary Computation, 26(2): 237-267, June 2018.
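A minimal sketch of resampling, assuming additive Gaussian noise on a toy objective: averaging k independent evaluations shrinks the noise variance by a factor of k at k times the evaluation cost, which is exactly why the sample size must be chosen carefully.

```python
import random

def noisy_f(x, sigma=1.0):
    """True objective sum(x_i^2), observed under additive Gaussian noise."""
    return sum(v * v for v in x) + random.gauss(0, sigma)

def resampled_fitness(x, k=10):
    """Average k independent noisy evaluations: variance drops by a
    factor of k, but the evaluation cost grows by a factor of k."""
    return sum(noisy_f(x) for _ in range(k)) / k

x, y = [0.1] * 5, [0.2] * 5
# With k=1 this comparison is often wrong; with larger k it stabilizes.
print(resampled_fitness(x, k=50) < resampled_fitness(y, k=50))
```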
n Introduction n General Ideas and Methodologies n Case Studies n Summary and Discussion
every 5 minutes (emerged with the availability of big data).
Decomposition,” IEEE Transactions on Cybernetics, 47(11): 3928-3940, November 2017.
Application              Item                          Objective
maximum coverage         a set of elements             size of the union
sparse regression        an observation variable       MSE of prediction
influence maximization   a social network user         influence spread
document summarization   a sentence                    summary quality
sensor placement         a place to install a sensor   entropy
POSS [Qian et al., NIPS'15] and its parallel variant PPOSS [Qian et al., IJCAI'16]
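For concreteness, a minimal sketch of the POSS framework applied to maximum coverage. The bi-objective archive and bit-wise mutation follow the published description of POSS; the toy instance, iteration budget T, and size bound k are illustrative assumptions.

```python
import random

def poss(f, n, k, T=5000):
    """Pareto Optimization for Subset Selection: treat subset selection as
    bi-objective (maximize f, minimize |s|) and keep an archive of
    mutually non-dominated subsets, encoded as 0/1 tuples."""
    def strictly_dominates(a, b):
        return (a[1] >= b[1] and a[2] <= b[2]) and (a[1] > b[1] or a[2] < b[2])

    def weakly_dominates(a, b):
        return a[1] >= b[1] and a[2] <= b[2]

    s0 = tuple([0] * n)
    archive = [(s0, f(s0), 0)]
    for _ in range(T):
        parent = random.choice(archive)[0]
        # Bit-wise mutation: flip each bit independently with prob 1/n.
        child = tuple(b ^ (random.random() < 1.0 / n) for b in parent)
        cand = (child, f(child), sum(child))
        if not any(strictly_dominates(a, cand) for a in archive):
            archive = [a for a in archive if not weakly_dominates(cand, a)]
            archive.append(cand)
    # Return the best archived subset that respects the size budget k.
    return max((a for a in archive if a[2] <= k), key=lambda a: a[1])

# Toy maximum-coverage instance: pick at most k sets covering most elements.
sets = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 5}, {6}]
def cover(s):
    covered = set()
    for i, b in enumerate(s):
        if b:
            covered |= sets[i]
    return len(covered)

print(poss(cover, n=len(sets), k=2))
```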
PPOSS (blue line): achieves a speedup of around 7 with 10 cores, and the solution quality remains stable.
PPOSS-asy (red line): achieves a better speedup (it avoids the synchronization cost), but the solution quality is slightly worse (due to the noise introduced by asynchrony).
Influence maximization: identify influential users in a social network. The influence spread of a set of users can only be estimated by Monte Carlo simulations, so the fitness evaluations are noisy (modeled as multiplicative noise).
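As an illustration, a minimal sketch of estimating influence spread by Monte Carlo simulation under the independent cascade model; the toy graph, propagation probability p, and number of simulations R are assumptions. The run-to-run variation of this estimate is precisely the noise mentioned above.

```python
import random

def influence_spread(graph, seeds, p=0.1, R=100):
    """Estimate the expected spread of `seeds` under the independent
    cascade model by averaging R Monte Carlo simulations; the estimate
    is noisy, and the noise shrinks only as R grows."""
    total = 0
    for _ in range(R):
        active, frontier = set(seeds), list(seeds)
        while frontier:
            u = frontier.pop()
            for v in graph.get(u, []):
                # Each newly active node tries once to activate each neighbor.
                if v not in active and random.random() < p:
                    active.add(v)
                    frontier.append(v)
        total += len(active)
    return total / R

graph = {0: [1, 2], 1: [2, 3], 2: [3], 3: [4]}
print(influence_spread(graph, seeds=[0], p=0.3, R=200))
```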
Approximation guarantee (in polynomial time): under multiplicative noise, PONSS achieves a constant approximation guarantee, significantly better than that of the greedy algorithm.
A significantly better bound has also been proved for additive noise.
DNN Compression
iPhone 8: 2GB RAM
• Transformer (Neural Machine Translation): 200 million parameters, 1.2GB storage size
• LSTMP RNN (Speech Recognition): 80 million parameters, 300MB storage size
• AlexNet (Image Classification): 60 million parameters, 200MB storage size
DNNs must be compressed for real-time processing and privacy concerns.
[Flowchart: iterative pruning. Start from the initial model; apply NCS to search for the thresholds for weight pruning; re-train the pruned model; if the stop condition is satisfied, output the final compressed model, otherwise continue pruning the model.]
"Optimization based Layer-wise Magnitude-based Pruning for DNN Compression," in Proc. of IJCAI'18.
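To illustrate the pruning step being tuned, here is a minimal sketch of layer-wise magnitude-based pruning on NumPy weight matrices; the `thresholds` are the quantities that NCS searches over in the flowchart above, while the re-training and stopping logic are omitted. The function name and shapes are illustrative assumptions.

```python
import numpy as np

def prune_by_magnitude(weights, thresholds):
    """Zero out, in each layer, the weights whose magnitude falls below
    that layer's threshold; return pruned weights and the pruning rate."""
    pruned = []
    kept = total = 0
    for W, t in zip(weights, thresholds):
        mask = np.abs(W) >= t          # keep only sufficiently large weights
        pruned.append(W * mask)
        kept += mask.sum()
        total += W.size
    return pruned, 1.0 - kept / total  # fraction of weights removed

rng = np.random.default_rng(0)
layers = [rng.normal(size=(300, 100)), rng.normal(size=(100, 10))]
pruned, rate = prune_by_magnitude(layers, thresholds=[0.5, 0.3])
print(f"pruned {rate:.1%} of the weights")
```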
Model           Original Size   Pruning Method    Size after Pruning   Accuracy Change (%)
LeNet-300-100   1.1MB           ITR, 2015         93.9KB               +0.05
                                DS, 2016          20.1KB               +0.29
                                SWS, 2017         49.0KB
                                Sparse VD, 2017   16.6KB
                                OLMP, 2018        10.0KB               +0.1
LeNet-5         3.3MB           ITR, 2015         281.6KB              +0.03
                                DS, 2016          31.3KB
                                SWS, 2017         16.9KB
                                Sparse VD, 2017   12KB                 +0.05
                                OLMP, 2018        11KB
AlexNet         228.0MB         OLMP, 2018        2.8MB                +0.4
• Introduction
• General Ideas and Methodologies
• Case Studies
• Summary and Discussion
[Closing slide: colleagues at SUSTech, including Hisao Ishibuchi, Adam Ghandar, G. Theodoropoulos, Elvis Sze-Yeung Liu, Shin Hwei Tan, and Luca Rossi.]