  1. Applied Machine Learning in Biomedicine Enrico Grisan enrico.grisan@dei.unipd.it

  2. From biology to models

  3. (Artificial) Neural Networks

  4. Backpropagation • Loop: – Sample a batch of data – Forward propagate to compute the loss – Backpropagate the loss function to calculate the analytical gradient – Perform parameter update
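  A minimal MATLAB sketch of this loop for a one-hidden-layer network with tanh units and squared-error loss (the sizes, learning rate lr and batch size B are illustrative choices, not from the slides):

      % X = NxD inputs, Y = Nx1 targets
      [N, D] = size(X); H = 10;                 % H hidden units
      W1 = 0.01*randn(D,H); b1 = zeros(1,H);    % small random initialization
      W2 = 0.01*randn(H,1); b2 = 0;
      lr = 0.01; B = 32;                        % learning rate, batch size
      for it = 1:1000
          idx = randperm(N, B);                 % sample a batch of data
          Xb = X(idx,:); Yb = Y(idx);
          Z  = tanh(Xb*W1 + repmat(b1,B,1));    % forward pass
          Yh = Z*W2 + b2;                       % network output
          dY = (Yh - Yb)/B;                     % backpropagate the loss ...
          dW2 = Z'*dY; db2 = sum(dY);           % ... into analytical gradients
          dZ  = (dY*W2') .* (1 - Z.^2);         % through the tanh derivative
          dW1 = Xb'*dZ; db1 = sum(dZ,1);
          W1 = W1 - lr*dW1; b1 = b1 - lr*db1;   % parameter update
          W2 = W2 - lr*dW2; b2 = b2 - lr*db2;
      end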

  5. ANN Practicalities 1) Preprocess the data!
      % X = NxD data cases: standardize each feature
      mx = mean(X);                         % per-feature mean
      sx = std(X);                          % per-feature standard deviation
      % vectorized version
      X = (X - ones(size(X,1),1)*mx) ./ (ones(size(X,1),1)*sx);
      % equivalent row-by-row version
      % for ct = 1:size(X,1)
      %     X(ct,:) = (X(ct,:) - mx) ./ sx;
      % end
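  One caveat worth making explicit (my addition, not on the slide): at test time the data must be standardized with the statistics estimated on the training set, e.g.

      % mx, sx computed on the training data only
      Xtest = (Xtest - ones(size(Xtest,1),1)*mx) ./ (ones(size(Xtest,1),1)*sx);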

  6. ANN Practicalities 1) Preprocess the data!
      % X = NxD data cases (zero-mean, from the previous slide)
      Sx = cov(X);              % DxD covariance matrix
      [E, D] = eig(Sx);         % eigenvectors E, eigenvalues D
      R = E';                   % rotation onto the principal axes
      W = sqrt(inv(D));         % per-component rescaling
      Xw = (W*(R*X'))';         % whitened data
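  A quick sanity check for the whitening above (assuming X was zero-meaned as on the previous slide): the whitened data should have an approximately identity covariance.

      disp(cov(Xw));            % should be close to eye(size(X,2))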

  7. ANN practicalities

  8. ANN practicalities • Run cross-validation across the many tips & tricks • Use visualization (training/validation error, loss curve, weight updates, weight plots …) to guide the hyperparameter ranges to cross-validate • Ensemble multiple models

  9. ANN in Matlab
      % X = NxD data cases, Y = Nx1 target values
      % train and net expect samples as columns, hence the transposes
      net = feedforwardnet(10);             % one hidden layer, 10 units
      net = train(net, X', Y');
      Yhat = net(X');
      perfs = mse(net, Y', Yhat);
      % Adding L2 regularization
      net.performParam.regularization = 0.5;
      net_reg = train(net, X', Y');

  10. Pancreatic tissue Clinical Gastroenterology and Hepatology, 2012

  11. Removing bones from X-ray images Chen et al., IEEE TMI 2014

  12. Autoencoder (diagram: Encoder → Decoder)

  13. Convolutional Neural Networks

  14. Hierarchical organization

  15. Shared weights

  16. Local connectivity RGB image: 32x32x3. Before (full connectivity): each neuron connects to all 32x32x3 inputs, i.e. 3072 weights. Now (local connectivity): each neuron connects only to, e.g., a 5x5x3 patch, and has 5x5x3 = 75 weights.

  17. Depth

  18–22. Receptive fields Replicate the column of hidden neurons across space, with some stride: 7x7 input, 3x3 connectivity (receptive field), stride 1 → 5x5 output.

  23–25. Receptive fields With a larger stride: 7x7 input, 3x3 connectivity (receptive field), stride 2 → 3x3 output.

  26. Receptive fields Output size: (N - F)/stride + 1. With N = 7, F = 3: stride 1 → (7-3)/1 + 1 = 5; stride 2 → (7-3)/2 + 1 = 3; stride 3 → (7-3)/3 + 1 = 2.33, not an integer: a stride of 3 does not fit.
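  The slide's formula wrapped as a small MATLAB helper (a hypothetical function, not from the slides) that also flags strides that do not fit:

      function out = conv_output_size(N, F, stride)
      % Spatial output size of a convolution: (N - F)/stride + 1
      out = (N - F)/stride + 1;
      if mod(N - F, stride) ~= 0
          warning('stride %d does not fit a %dx%d input', stride, N, N);
      end
      end

      % conv_output_size(7, 3, 1)  ->  5
      % conv_output_size(7, 3, 2)  ->  3
      % conv_output_size(7, 3, 3)  ->  2.33, with a warning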

  27. From columns to volume All the stacked neurons form [1x1xdepth] columns that combine into the output volume

  28. Volume dimension Input image: 32x32x3. Receptive field: 5x5, stride 1. Depth (number of filters): 5. Output size: (32-5)/1 + 1 = 28, so 28x28x5 neurons. Weights per neuron: 5x5x3. Total weights (no sharing): 28x28x5 x 5x5x3 = 294,000

  29. Padding Zero-pad the image: 7x7 input, 3x3 connectivity (receptive field), stride 1, zero-padding 1 → 7x7 output. In general, with padding P the output size is (N - F + 2P)/stride + 1.

  30. Volume dimension Input image: 32x32x3, zero-padding 2. Receptive field: 5x5, stride 1. Depth (number of filters): 30. Output size: (32-5+4)/1 + 1 = 32, so 32x32x30 neurons. Weights per neuron: 5x5x3. Total weights (no sharing): 32x32x30 x 5x5x3 = 2,304,000
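  Both counts can be checked directly with the slides' own numbers (plain arithmetic):

      28*28*5  * 5*5*3      % slide 28: 294000 weights without sharing
      32*32*30 * 5*5*3      % slide 30: 2304000 weights without sharing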

  31. Weight sharing If all neurons at the same depth in the output volume share the same weights, we get «slices»: one activation map (a depth slice) computed with a single set of weights. For the layer of slide 30, sharing cuts the weight count from 2,304,000 to 30 x 5x5x3 = 2,250.

  32. Convolution and filters When all weights at the same depth are equal, the activation map can be computed through a convolution: $g(y,z) * h(y,z) = \sum_{o_1}\sum_{o_2} g(o_1,o_2)\, h(y-o_1, z-o_2)$ The weights represent a filter! The slices (filtered images) represent feature maps!
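  In MATLAB one activation map (slice) can be computed with conv2, which implements exactly this flipped-kernel sum; the filter W, bias b and the ReLU are illustrative choices, not from the slides:

      X = rand(32,32);                  % one input channel
      W = randn(5,5);                   % the shared weights = the filter
      b = 0.1;                          % shared bias
      A = conv2(X, W, 'valid') + b;     % 28x28 activation map
      A = max(A, 0);                    % e.g. a ReLU nonlinearity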

  33. CNN

  34–35. CNN architecture

  36. Weight interpretation

  37. Max pooling In convolutional NN architectures, convolutional layers are often followed by pooling (downsampling) layers
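  A minimal 2x2, stride-2 max-pooling sketch in MATLAB (illustrative code, assuming the map size is divisible by 2):

      A = rand(28,28);                          % one activation map
      P = zeros(14,14);                         % pooled (downsampled) map
      for i = 1:14
          for j = 1:14
              blk = A(2*i-1:2*i, 2*j-1:2*j);    % 2x2 block, stride 2
              P(i,j) = max(blk(:));             % keep only the maximum
          end
      end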

  38. ICPR 2012 Mitosis detection Wang et al., JMI 2014

  39. MICCAI 2013 Mitosis detection Ciresan et al., MICCAI 2013

  40. Lymph node detection Roth et al., MICCAI 2014
