Reconstruct Radio Map with Automatically Constructed Gaussian Process for Localization
C O N T E N T
01 | Background 02 | Gaussian Process 03 | Kernel Selection 04 | Model Ensemble 05 | Experiment
P A R T O N E
Data: SJTU (3G, 4G) and Yindu Road (3G, 4G)
About one million data samples, covering about 20 square kilometers
P A R T T W O
Given a training set $D = \{(\mathbf{x}_i, y_i) \mid i = 1, \dots, n\}$, how do we calculate the output $y_*$ for a new input $\mathbf{x}_*$?
Linear regression (least-squares method)? $f(\mathbf{x}) = \mathbf{w}^{T}\mathbf{x} + b$
Nonlinear regression?
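As a baseline for comparison, a minimal least-squares sketch in NumPy; the synthetic data and all variable names are illustrative assumptions, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(size=(100, 2))                 # inputs, e.g. (longitude, latitude)
y = X @ np.array([2.0, -1.0]) + 0.5 + 0.1 * rng.standard_normal(100)

A = np.hstack([X, np.ones((100, 1))])          # append a ones column for the bias b
w, *_ = np.linalg.lstsq(A, y, rcond=None)      # least squares: minimizes ||A w - y||^2
x_star = np.array([0.3, 0.7])                  # a new input
print("fitted [w1, w2, b]:", w)
print("prediction:", np.append(x_star, 1.0) @ w)
```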
Assume $\mathbf{y} = (y_1, y_2, \dots, y_n)$ obeys a multivariate Gaussian distribution:
$$\begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix} \sim \mathcal{N}(0, K) \;\Rightarrow\; \mathbf{y} \sim \mathcal{N}(0, K),$$
where $K$ is the covariance matrix with entries $K_{ij} = k(\mathbf{x}_i, \mathbf{x}_j)$.
Given the training set $D = \{(\mathbf{x}_i, y_i) \mid i = 1, \dots, n\}$: how do we calculate the output $y_*$ (RSS) for a new input $\mathbf{x}_*$ (longitude/latitude)?
For a new input $\mathbf{x}_*$, the joint distribution is defined as:
$$\begin{pmatrix} \mathbf{y} \\ y_* \end{pmatrix} \sim \mathcal{N}\!\left(0, \begin{pmatrix} K & \mathbf{k}_* \\ \mathbf{k}_*^{T} & k_{**} \end{pmatrix}\right), \qquad \mathbf{k}_* = k(X, \mathbf{x}_*), \quad k_{**} = k(\mathbf{x}_*, \mathbf{x}_*)$$
Conditioning on the observed outputs gives the predictive distribution, with mean and variance:
$$y_* \mid \mathbf{y} \sim \mathcal{N}(\mu_*, \sigma_*^{2}), \qquad \mu_* = \mathbf{k}_*^{T} K^{-1} \mathbf{y}, \qquad \sigma_*^{2} = k_{**} - \mathbf{k}_*^{T} K^{-1} \mathbf{k}_*$$
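A minimal NumPy sketch of these predictive equations, assuming the SE kernel introduced later in the deck; the hyper-parameter values and synthetic data are assumptions:

```python
import numpy as np

def se_kernel(A, B, sf2=1.0, ell=1.0):
    """Squared-exponential kernel k(x, x') = sf2 * exp(-||x - x'||^2 / (2 ell^2))."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return sf2 * np.exp(-0.5 * d2 / ell**2)

def gp_predict(X, y, X_star, sn2=1e-2):
    K = se_kernel(X, X) + sn2 * np.eye(len(X))           # K + sigma_n^2 I
    k_star = se_kernel(X, X_star)                        # cross-covariances, shape (n, m)
    mu = k_star.T @ np.linalg.solve(K, y)                # k_*^T K^{-1} y
    var = se_kernel(X_star, X_star) - k_star.T @ np.linalg.solve(K, k_star)
    return mu, np.diag(var)                              # predictive mean and variance

rng = np.random.default_rng(0)
X = rng.uniform(size=(50, 2))                            # e.g. (longitude, latitude)
y = np.sin(3 * X[:, 0]) + X[:, 1]                        # stand-in for RSS values
mu, var = gp_predict(X, y, rng.uniform(size=(5, 2)))
print(mu, var)
```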
How to choose the hyper-parameters $\theta$? Maximize the log marginal likelihood (optimized with conjugate gradients):
$$\log p(\mathbf{y} \mid X, \theta) = -\tfrac{1}{2}\,\mathbf{y}^{T} K^{-1} \mathbf{y} - \tfrac{1}{2}\log\lvert K\rvert - \tfrac{n}{2}\log 2\pi$$
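A sketch of this hyper-parameter fit, assuming an SE kernel and using SciPy's conjugate-gradient optimizer; gradients are approximated numerically here, whereas a real implementation would supply analytic ones:

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_marginal_likelihood(theta, X, y):
    sf2, ell, sn2 = np.exp(theta)            # optimize in log space to stay positive
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = sf2 * np.exp(-0.5 * d2 / ell**2) + sn2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y via Cholesky
    # -log p(y|X,theta) = 1/2 y^T K^{-1} y + 1/2 log|K| + n/2 log(2 pi)
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(y) * np.log(2 * np.pi)

rng = np.random.default_rng(0)
X = rng.uniform(size=(30, 2))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(30)
res = minimize(neg_log_marginal_likelihood, np.zeros(3), args=(X, y), method="CG")
print("fitted [sigma_f^2, l, sigma_n^2]:", np.exp(res.x))
```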
SE kernel:
$$k(\mathbf{x}_p, \mathbf{x}_q) = \sigma_f^{2} \exp\!\left(-\frac{\|\mathbf{x}_p - \mathbf{x}_q\|^{2}}{2 l^{2}}\right) + \sigma_n^{2}\,\delta_{pq}$$
$\sigma_f^{2}$: overall vertical scale of variation of the latent values
$l$: characteristic length-scale, roughly how far apart two data points must be before their latent values become uncorrelated
$\sigma_n^{2}$: noise variance
Different hyper-parameters give a bias/variance trade-off.
[Figure: GP fits with different length-scales; left: small $l$, right: large $l$]
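To make the length-scale effect concrete, a small sketch that draws samples from the GP prior at two length-scales; scikit-learn's RBF is its name for the SE kernel, and the roughness measure is only an illustrative assumption:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

X = np.linspace(0, 10, 200)[:, None]
for ell in (0.3, 3.0):                        # small l: wiggly samples; large l: smooth
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=ell))
    samples = gp.sample_y(X, n_samples=3, random_state=0)   # draws from the GP prior
    rough = np.abs(np.diff(samples, axis=0)).mean()         # crude roughness measure
    print(f"length-scale {ell}: mean |increment| = {rough:.3f}")
```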
P A R T T H R E E
Base kernels: Squared Exponential kernel, Periodic kernel, Linear kernel
Kernels are closed under summation and multiplication:
(1) $(k_a + k_b)(x, x') = k_a(x, x') + k_b(x, x')$ (summation)
(2) $(k_a \times k_b)(x, x') = k_a(x, x') \times k_b(x, x')$ (multiplication)
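These closure rules map directly onto kernel objects in, for example, scikit-learn; the hyper-parameter values below are placeholders:

```python
from sklearn.gaussian_process.kernels import RBF, ExpSineSquared, DotProduct

k_se = RBF(length_scale=1.0)                               # squared exponential
k_per = ExpSineSquared(length_scale=1.0, periodicity=3.0)  # periodic
k_lin = DotProduct(sigma_0=1.0)                            # linear

k_sum = k_se + k_per    # Sum kernel: (k_a + k_b)(x, x') = k_a(x, x') + k_b(x, x')
k_prod = k_se * k_lin   # Product kernel: (k_a * k_b)(x, x') = k_a(x, x') * k_b(x, x')
print(k_sum)
print(k_prod)
```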
Choosing the structural form of the kernel by hand is something of a black art. Instead, search over sums and products of base kernels, so that the learned kernel decomposes into interpretable parts; BIC trades off model fit and complexity when scoring candidate structures.
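A minimal sketch of BIC-based scoring, assuming the convention $\text{BIC} = \log p(D \mid \hat{\theta}) - \tfrac{1}{2}\,p \log n$ and using a few scikit-learn kernels as stand-in candidate structures; the greedy search over compositions is omitted:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, DotProduct

def bic(gp, n):
    # BIC = log marginal likelihood - (1/2) * (#hyper-parameters) * log(n)
    p = gp.kernel_.theta.size
    return gp.log_marginal_likelihood(gp.kernel_.theta) - 0.5 * p * np.log(n)

rng = np.random.default_rng(0)
X = rng.uniform(size=(40, 1))
y = np.sin(3 * X[:, 0]) + 0.3 * X[:, 0] + 0.1 * rng.standard_normal(40)

for kernel in (RBF(), RBF() + DotProduct(), RBF() * DotProduct()):
    gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-2, normalize_y=True).fit(X, y)
    print(f"{kernel!s:30s}  BIC = {bic(gp, len(X)):.2f}")  # keep the highest-BIC structure
```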
P A R T F O U R
Don't overfit! Averaging multiple different models (the green lines in the figure) should bring us closer to the true function (the black line).
Base models: Linear Regression, SVR, Gradient Tree Boosting, XGBoost, GP (RQ kernel), GP (composed kernel)
Ensemble strategies: rate averaging, blending
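A sketch of plain averaging versus blending over a subset of these models; XGBoost is swapped for scikit-learn's GradientBoostingRegressor to keep the example dependency-light, and the data and split sizes are assumptions:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RationalQuadratic
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(300, 2))                       # e.g. (longitude, latitude)
y = np.sin(3 * X[:, 0]) + X[:, 1] + 0.1 * rng.standard_normal(300)

X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.4, random_state=0)
X_blend, X_test, y_blend, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

models = [LinearRegression(), SVR(),
          GradientBoostingRegressor(random_state=0),
          GaussianProcessRegressor(kernel=RationalQuadratic(), alpha=1e-2, normalize_y=True)]
P_blend = np.column_stack([m.fit(X_tr, y_tr).predict(X_blend) for m in models])
P_test = np.column_stack([m.predict(X_test) for m in models])

meta = LinearRegression().fit(P_blend, y_blend)      # blending: meta-model on held-out preds
print("averaging MSE:", ((P_test.mean(axis=1) - y_test) ** 2).mean())
print("blending  MSE:", ((meta.predict(P_test) - y_test) ** 2).mean())
```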
P A R T F I V E
P A R T S I X