SLIDE 20 | Introduction · Nuclear power plant control design · Learning for optimization · Surrogate-assisted optimization · Conclusions
Features to learn: the multi-objective fitness landscape
Performance prediction of multi-objective algorithms based on fitness landscape features (regression model: random forest, etc.)
[Figure: Kendall's tau correlation matrix between landscape features — f_cor_rws, rho, k/n, n, m, length_aws, and the #lsupp, #lnd, #sup, #inc, #inf, hv, nhv, hvd statistics collected along random walks (rws) and adaptive walks (aws); correlation values range over [−1, 1]]
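The pairwise feature correlations in the figure are measured with Kendall's tau, a rank correlation that counts concordant versus discordant pairs. A minimal pure-Python sketch of the tau-a variant (no tie correction; library implementations such as `scipy.stats.kendalltau` handle ties):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a rank correlation between two feature vectors."""
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1   # both features order the pair the same way
        elif s < 0:
            discordant += 1   # the features disagree on the pair's order
    pairs = len(x) * (len(x) - 1) / 2
    return (concordant - discordant) / pairs

# Perfectly monotone features correlate at +1, reversed ones at -1.
print(kendall_tau([1, 2, 3, 4], [10, 20, 30, 40]))  # 1.0
print(kendall_tau([1, 2, 3, 4], [40, 30, 20, 10]))  # -1.0
```

Features with |tau| close to 1 are largely redundant, which is why correlated landscape features can be pruned before fitting the regression model.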
- Performance prediction (cross-validation)
GSEMO
feature set        MAE        MSE        R²         rank
all                0.007781   0.000118   0.951609   1
enumeration        0.008411   0.000142   0.943046   2
sampling all       0.009113   0.000161   0.932975   3
sampling rws       0.009284   0.000167   0.930728   4
sampling aws       0.010241   0.000195   0.917563   5
{r, m, n, k/n}     0.010609   0.000215   0.911350   6
{r, m, n}          0.026974   0.001123   0.518505   7
{m, n}             0.032150   0.001545   0.340715   8
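The MAE, MSE, and R² scores above come from cross-validating a regression model over feature sets. A minimal pure-Python sketch of that evaluation loop, with a stand-in mean predictor in place of the random forest (the `fit`/`predict` callables and the toy data are illustrative assumptions, not the slide's actual setup):

```python
import random

def cross_val_metrics(X, y, fit, predict, k=5, seed=0):
    """k-fold cross-validation returning mean MAE, MSE, and R^2."""
    idx = list(range(len(y)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    maes, mses, r2s = [], [], []
    for fold in folds:
        train = [i for i in idx if i not in fold]
        model = fit([X[i] for i in train], [y[i] for i in train])
        pred = [predict(model, X[i]) for i in fold]
        true = [y[i] for i in fold]
        err = [p - t for p, t in zip(pred, true)]
        mae = sum(abs(e) for e in err) / len(err)
        mse = sum(e * e for e in err) / len(err)
        mean_t = sum(true) / len(true)
        ss_tot = sum((t - mean_t) ** 2 for t in true)
        r2 = 1 - sum(e * e for e in err) / ss_tot if ss_tot else 0.0
        maes.append(mae); mses.append(mse); r2s.append(r2)
    return sum(maes) / k, sum(mses) / k, sum(r2s) / k

# Toy data; the stand-in "model" just predicts the training-set mean.
X = [[i] for i in range(20)]
y = [2.0 * i + 1.0 for i in range(20)]
fit = lambda Xs, ys: sum(ys) / len(ys)
predict = lambda model, x: model
mae, mse, r2 = cross_val_metrics(X, y, fit, predict)
```

Swapping the mean predictor for a random forest (e.g. scikit-learn's `RandomForestRegressor`) and the toy data for the landscape-feature matrix yields the kind of scores tabulated above, where richer feature sets improve R².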