  1. CSI5180. Machine Learning for Bioinformatics Applications. Deep learning - encoding and transfer learning. By Marcel Turcotte. Version of November 12, 2019.

  2. Preamble

  3. Preamble. Deep learning - encoding and transfer learning. In this lecture, we further investigate deep learning. We review diverse methods for encoding data for artificial neural networks. We present the concept of embeddings, and specifically embeddings for biological sequences. Finally, we discuss the concept of transfer learning. General objective: explain the various ways to encode data for deep networks.

  4. Learning objectives. Explain the concept of embeddings. Describe how to implement transfer learning. Justify the application of transfer learning. Reading: E. Asgari and M. R. K. Mofrad, Continuous distributed representation of biological sequences for deep proteomics and genomics, PLoS One 10:11, e0141287, 2015. S. Wang, Z. Li, Y. Yu, and J. Xu, Folding membrane proteins by deep transfer learning, Cell Systems 5:3, 202, 2017.

  5. Plan: 1. Preamble; 2. Summary; 3. Keras; 4. Preprocessing; 5. Transfer learning; 6. Prologue.

  6. Summary

  7. Summary - threshold logic unit. Source: [3], Figure 10.4. The model computes h_w(x) = ϕ(x^T w), where ϕ is a step function. A minimal sketch follows.
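To make the notation concrete, here is a minimal NumPy sketch of a threshold logic unit; the weight values and the choice of the Heaviside step function are illustrative assumptions, not part of the slides.

import numpy as np

def heaviside(z):
    # Step activation: 1 if z >= 0, else 0
    return (z >= 0).astype(int)

def tlu(x, w):
    # h_w(x) = phi(x^T w): a weighted sum followed by the step function
    return heaviside(np.dot(x, w))

x = np.array([1.0, 0.5, -0.2])   # one instance (illustrative values)
w = np.array([0.3, -0.1, 0.8])   # weights (illustrative values)
print(tlu(x, w))                 # 1, since 0.3 - 0.05 - 0.16 = 0.09 >= 0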

  8. Summary - Perceptron. Source: [3], Figure 10.5. A Perceptron consists of a single layer of threshold logic units. It computes the following function: h_{W,b}(X) = ϕ(XW + b), with X, W, and b as defined on the next slide.

  9. Summary - Definitions. Input neuron: a special type of neuron that simply returns the value of its input. Bias neuron: a neuron that always returns 1. Fully connected layer or dense layer: all the neurons are connected to all the neurons of the previous layer. X: input matrix (rows are instances, columns are features). W: weight matrix (the number of rows corresponds to the number of inputs, the number of columns to the number of neurons in the output layer). b: bias vector (same size as the number of neurons in the output layer). Activation function: maps its input domain to a restricted set of values (Heaviside and sign are commonly used with threshold logic units). A shape-checked sketch of ϕ(XW + b) follows.
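A small NumPy shape check may help tie these definitions together; the sizes (3 instances, 4 features, 2 neurons) are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(0)

m, n, k = 3, 4, 2                # instances, features, neurons
X = rng.normal(size=(m, n))      # input matrix: rows are instances
W = rng.normal(size=(n, k))      # weight matrix: one row per input, one column per neuron
b = np.zeros(k)                  # bias vector: one entry per neuron

def phi(z):
    # sign activation, one of the functions commonly used with TLUs
    return np.sign(z)

h = phi(X @ W + b)               # broadcasting adds b to every row of XW
print(h.shape)                   # (3, 2): one output per instance per neuron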

  10. Summary - Multilayer Perceptron. Source: [3], Figure 10.7. A two-layer perceptron computes y = f_2(f_1(X)), where f_l(Z) = ϕ(Z W_l + b_l). Here ϕ is an activation function, typically the hyperbolic tangent, the rectified linear unit (ReLU), or the sigmoid; W_l is a weight matrix, X is an input matrix, and b_l is a bias vector. In the context of artificial neural networks, matrices are called tensors.

  11. Summary - Multilayer Perceptron. A k-layer perceptron computes the following function: y = f_k(... f_2(f_1(X)) ...), where f_l(Z) = ϕ(Z W_l + b_l). A forward-pass sketch follows.
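A minimal sketch of this composition in NumPy; the layer sizes and the choice of ReLU for ϕ are assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(1)

def relu(z):
    return np.maximum(z, 0.0)

# Layer sizes: 4 inputs -> 8 hidden -> 8 hidden -> 3 outputs (illustrative)
sizes = [4, 8, 8, 3]
Ws = [rng.normal(size=(a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
bs = [np.zeros(b) for b in sizes[1:]]

def forward(X):
    # y = f_k(... f_2(f_1(X)) ...), with f_l(Z) = phi(Z W_l + b_l)
    Z = X
    for W, b in zip(Ws, bs):
        Z = relu(Z @ W + b)
    return Z

X = rng.normal(size=(5, 4))      # 5 instances, 4 features
print(forward(X).shape)          # (5, 3)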

  12. Keras

  13. https://keras.io. Using Keras (François Chollet / Google / first release in 2015). Personally, I find it easier to install and maintain Keras using a package manager such as Conda (specifically, I use Anaconda). Easy to use, yet powerful and efficient (makes use of GPUs if available). Two main APIs: Sequential and Functional.

  14. Sequential API. The workflow: define the model, add layers, compile, fit, evaluate (x_train, y_train, x_test, y_test must be supplied; see the synthetic example below).

from keras.models import Sequential
from keras.layers import Dense

# Define a model as a linear stack of layers
model = Sequential()
model.add(Dense(units=64, activation='relu', input_dim=100))
model.add(Dense(units=10, activation='softmax'))

# Configure the learning process: loss, optimizer, and reported metrics
model.compile(loss='categorical_crossentropy',
              optimizer='sgd',
              metrics=['accuracy'])

# Train, then evaluate on held-out data
model.fit(x_train, y_train, epochs=5, batch_size=32)
loss_and_metrics = model.evaluate(x_test, y_test)
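To run the snippet end to end, you need arrays with matching shapes. Here is a hedged stand-in using random data; the shapes follow input_dim=100 and the 10-class softmax above, and the data itself is synthetic.

import numpy as np
from keras.utils import to_categorical

# Synthetic data, just to exercise the model: 1000 instances, 100 features, 10 classes
x_train = np.random.random((1000, 100))
y_train = to_categorical(np.random.randint(10, size=(1000,)), num_classes=10)
x_test = np.random.random((100, 100))
y_test = to_categorical(np.random.randint(10, size=(100,)), num_classes=10)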

  15. Functional API.

from keras.layers import Input, Dense
from keras.models import Model

# This returns a tensor
inputs = Input(shape=(784,))

# A layer instance is callable on a tensor, and returns a tensor
output_1 = Dense(64, activation='relu')(inputs)
output_2 = Dense(64, activation='relu')(output_1)
predictions = Dense(10, activation='softmax')(output_2)

# This creates a model that includes
# the Input layer and three Dense layers
model = Model(inputs=inputs, outputs=predictions)
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
model.fit(data, labels)  # starts training
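As written, data and labels are undefined placeholders; a hedged way to exercise the model is to define synthetic stand-ins before the fit call, for example:

import numpy as np
from keras.utils import to_categorical

# Synthetic stand-ins, shaped to match Input(shape=(784,)) and the 10-way softmax
data = np.random.random((1000, 784))
labels = to_categorical(np.random.randint(10, size=(1000,)), num_classes=10)

model.summary()  # inspect the chain: 784 -> 64 -> 64 -> 10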

  16. Preprocessing

  17. Scaling. As discussed at the beginning of the term, it is almost always a good idea to scale the input data. Options: custom code; sklearn.preprocessing.StandardScaler; keras.layers.Lambda; a custom Standardization layer. A StandardScaler sketch follows.
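As a reminder, a minimal sklearn.preprocessing.StandardScaler sketch; fitting on the training set only and then applying the same statistics to the test set is the usual pattern (X_train and X_test are assumed to be NumPy arrays).

from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)  # learn means/stds on training data only
X_test_scaled = scaler.transform(X_test)        # reuse the training statistics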

  18. keras.layers.Lambda. Source: [3] §11.

import numpy as np
import keras

means = np.mean(X_train, axis=0, keepdims=True)
stds = np.std(X_train, axis=0, keepdims=True)
eps = keras.backend.epsilon()
model = keras.models.Sequential([
    keras.layers.Lambda(lambda inputs: (inputs - means) / (stds + eps)),
    [...]  # other layers
])
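The "Standardization layer" option from the previous slide can be sketched as a custom layer with an adapt() method, in the spirit of [3] §11; the class and method names here are illustrative, not a built-in Keras API.

import numpy as np
import keras

class Standardization(keras.layers.Layer):
    """Standardizes inputs using statistics computed from a data sample."""
    def adapt(self, data_sample):
        # Compute the statistics once, from (a sample of) the training data
        self.means_ = np.mean(data_sample, axis=0, keepdims=True)
        self.stds_ = np.std(data_sample, axis=0, keepdims=True)
    def call(self, inputs):
        return (inputs - self.means_) / (self.stds_ + keras.backend.epsilon())

Usage: create the layer, call adapt(X_train) before training, then add it as the first layer of the model, which keeps the scaling statistics inside the model itself.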
