Deep Learning for Network Biology - Marinka Zitnik and Jure Leskovec


slide-1
SLIDE 1

Deep Learning for Network Biology

Marinka Zitnik and Jure Leskovec

Stanford University

slide-2
SLIDE 2

This Tutorial

snap.stanford.edu/deepnetbio-ismb

ISMB 2018 July 6, 2018, 2:00 pm - 6:00 pm

slide-3
SLIDE 3

This Tutorial


1) Node embeddings

§ Map nodes to low-dimensional embeddings
§ Applications: PPIs, Disease pathways

2) Graph neural networks

§ Deep learning approaches for graphs
§ Applications: Gene functions

3) Heterogeneous networks

§ Embedding heterogeneous networks
§ Applications: Human tissues, Drug side effects

slide-4
SLIDE 4

Part 2: Graph Neural Networks


Some materials adapted from:

  • Hamilton et al. 2018. Representation Learning on Networks. WWW.
slide-5
SLIDE 5


Embedding Nodes


Intuition: Map nodes to d-dimensional embeddings such that similar nodes in the graph are embedded close together

[Figure: disease similarity network and its 2-dimensional node embeddings]

slide-6
SLIDE 6

Embedding Nodes


Goal: Map nodes so that similarity in the embedding space (e.g., dot product) approximates similarity in the network

[Figure: input network mapped to a d-dimensional embedding space]

slide-7
SLIDE 7

Embedding Nodes


Goal: similarity(u, v) ≈ z_v^T z_u (the similarity function is what we need to define!)

[Figure: input network mapped to a d-dimensional embedding space]

slide-8
SLIDE 8

Two Key Components


§ Encoder: Map a node to a low-dimensional vector:
§ Similarity function: defines how relationships in the input network map to relationships in the embedding space:

enc(v) = z_v, where v is a node in the input graph and z_v is its d-dimensional embedding

similarity(u, v) ≈ z_v^T z_u, i.e., the similarity of u and v in the network is approximated by the dot product between their node embeddings (a minimal sketch of these two components follows below)
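A minimal Python/NumPy sketch of these two components, assuming a shallow encoder (a plain embedding lookup, as in the previous section) and a dot-product similarity; all names (Z, enc, similarity) are illustrative and not from the tutorial's code.

import numpy as np

num_nodes, d = 5, 16                    # toy graph with 5 nodes, 16-dimensional embeddings
rng = np.random.default_rng(0)
Z = rng.normal(size=(num_nodes, d))     # trainable embedding matrix, one row per node

def enc(v):
    # enc(v) = z_v: look up the embedding row of node v
    return Z[v]

def similarity(u, v):
    # decoder: similarity(u, v) is approximated by the dot product z_v^T z_u
    return enc(u) @ enc(v)

print(similarity(0, 1))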

slide-9
SLIDE 9

So Far: Shallow Encoders


Shallow encoders:

§ One layer of data transformation
§ A single hidden layer maps node v to embedding z_v via function g, e.g., z_v = g(z_u, u ∈ N(v))

slide-10
SLIDE 10

Shallow Encoders


§ Limitations of shallow encoding:

§ O(|V|) parameters are needed:

§ No sharing of parameters between nodes
§ Every node has its own unique embedding

§ Inherently “transductive”:

§ Cannot generate embeddings for nodes that are not seen during training

§ Do not incorporate node features:

§ Many graphs have features that we can and should leverage

slide-11
SLIDE 11

Deep Graph Encoders


§ Next: We will now discuss deep methods based on graph neural networks
§ Note: All these deep encoders can be combined with similarity functions from the previous section

enc(v) = multiple layers of non-linear transformation of graph structure

slide-12
SLIDE 12

Deep Graph Encoders


slide-13
SLIDE 13

Idea: Convolutional Networks

CNN on an image:


Goal: generalize convolutions beyond simple lattices and leverage node features/attributes (e.g., text, images)

slide-14
SLIDE 14

From Images to Networks

Single CNN layer with 3x3 filter:

(Animation: Vincent Dumoulin)

[Figure panels: Image | Graph]

Transform information at the neighbors and combine it

§ Transform "messages" h_i from neighbors: W_i h_i
§ Add them up: Σ_i W_i h_i

slide-15
SLIDE 15

Real-World Graphs

But what if your graphs look like this?


§ Examples:

Biological networks, Medical networks, Social networks, Information networks, Knowledge graphs, Communication networks, Web graph, …

slide-16
SLIDE 16

A Naïve Approach

§ Join adjacency matrix and features
§ Feed them into a deep neural net:
§ Issues with this idea (see the toy sketch at the end of this slide):

§ O(|V|) parameters
§ Not applicable to graphs of different sizes
§ Not invariant to node ordering

[Figure: adjacency matrix over nodes A-E concatenated with node features and fed into a deep neural network]
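A small NumPy sketch of this naive idea, with illustrative names only: each node's adjacency row is concatenated with its features and pushed through one dense layer, which also exposes why the idea breaks down.

import numpy as np

rng = np.random.default_rng(0)
num_nodes, num_feats, hidden = 5, 3, 8

A = (rng.random((num_nodes, num_nodes)) < 0.4).astype(float)   # toy adjacency matrix
A = np.maximum(A, A.T); np.fill_diagonal(A, 0)                  # symmetric, no self-loops
X = rng.normal(size=(num_nodes, num_feats))                     # toy node features

inputs = np.concatenate([A, X], axis=1)                 # shape (num_nodes, num_nodes + num_feats)
W1 = rng.normal(size=(num_nodes + num_feats, hidden))   # O(|V|) parameters: tied to the graph size
H = np.maximum(0, inputs @ W1)                          # one dense layer with ReLU

# Issues: W1 cannot be reused on a graph with a different number of nodes, and
# relabeling the nodes permutes the columns of A, which changes the output H.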
slide-17
SLIDE 17

Outline of This Section


1. Basics of deep learning for graphs
2. Graph convolutional networks
3. Biomedical applications

slide-18
SLIDE 18

Basics of Deep Learning for Graphs


Based on material from:

  • Hamilton et al. 2017. Representation Learning on Graphs: Methods and Applications. IEEE Data Engineering Bulletin on Graph Systems.
  • Scarselli et al. 2005. The Graph Neural Network Model. IEEE Transactions on Neural Networks.
  • Kipf and Welling, 2017. Semi-Supervised Classification with Graph Convolutional Networks. ICLR.
slide-19
SLIDE 19

Setup


§ Assume we have a graph G:

§ V is the vertex set
§ A is the adjacency matrix (assume binary)
§ X ∈ ℝ^(m×|V|) is a matrix of node features

§ Biologically meaningful node features:

– E.g., immunological signatures, gene expression profiles, gene functional information

§ No features:

– Indicator vectors (one-hot encoding of a node)
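A short sketch of this setup on toy data; the names A and X follow the bullets above and are illustrative. When no biologically meaningful features are available, X falls back to indicator (one-hot) vectors.

import numpy as np

# Toy graph G with 4 nodes: V = {0, 1, 2, 3}
A = np.array([[0, 1, 1, 0],           # binary adjacency matrix
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

expression_profiles = None             # pretend no measured node features exist
if expression_profiles is not None:
    X = expression_profiles            # e.g., gene expression, immunological signatures
else:
    X = np.eye(A.shape[0])             # indicator vectors: one-hot encoding of each node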

slide-20
SLIDE 20

Examples

Protein-protein interaction networks in different tissues, e.g., blood, substantia nigra


[Figure: tissue-specific PPI networks; WNT1 with node feature: association of proteins with midbrain development; RPT6 with node feature: association of proteins with angiogenesis]

slide-21
SLIDE 21

Graph Convolutional Networks

Graph Convolutional Networks:


Problem: For a given subgraph, how to come up with a canonical node ordering?

Learning Convolutional Neural Networks for Graphs. M. Niepert, M. Ahmed, K. Kutzkov. ICML 2016.

[Figure: pipeline with steps: node sequence selection, neighborhood graph construction, graph normalization, convolutional architecture]

slide-22
SLIDE 22

Our Approach

Learn how to propagate information across the graph to compute node features


[Figure: (1) determine node computation graph; (2) propagate and transform information]

Idea: Node’s neighborhood defines a computation graph

Semi-Supervised Classification with Graph Convolutional Networks. T. N. Kipf, M. Welling, ICLR 2017
slide-23
SLIDE 23

Idea: Aggregate Neighbors


Key idea: Generate node embeddings based on local network neighborhoods

[Figure: input graph with target node A and the computation graph built from A's neighbors and their neighbors]
slide-24
SLIDE 24

Idea: Aggregate Neighbors


Intuition: Nodes aggregate information from their neighbors using neural networks


slide-25
SLIDE 25

Idea: Aggregate Neighbors


Intuition: Network neighborhood defines a computation graph

Every node defines a computation graph based on its neighborhood!

slide-26
SLIDE 26

Deep Model: Many Layers


§ Model can be of arbitrary depth:

§ Nodes have embeddings at each layer
§ Layer-0 embedding of node u is its input feature, i.e., x_u

[Figure: computation graph for target node A; layer-0 features x_A, x_B, x_C, x_E, x_F feed layer-1, which feeds the layer-2 embedding of A]

slide-27
SLIDE 27

Aggregation Strategies


§ Neighborhood aggregation: Key distinctions are in how different approaches aggregate information across the layers

slide-28
SLIDE 28

Neighborhood Aggregation


§ Basic approach: Average information from neighbors and apply a neural network: 1) average messages from neighbors, 2) apply neural network

slide-29
SLIDE 29

The Math: Deep Encoder

§ Basic approach: Average neighbor messages and apply a neural network

Annotations: the initial 0-th layer embeddings are equal to the node features; the sum term is the average of the neighbors' previous-layer embeddings; σ is a non-linearity (e.g., ReLU); z_v is the embedding after K layers of neighborhood aggregation.

h_v^0 = x_v
h_v^k = σ( W_k Σ_{u ∈ N(v)} h_u^{k-1} / |N(v)| + B_k h_v^{k-1} ),  ∀k ∈ {1, ..., K}
z_v = h_v^K

(a minimal code sketch of this update follows below)
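A minimal NumPy sketch of the update above, with the rows of H holding the node embeddings (so the weight matrices multiply on the right); W_k and B_k are the trainable matrices and σ is a ReLU. Shapes and names are illustrative.

import numpy as np

def relu(x):
    return np.maximum(0, x)

def gnn_forward(A, X, Ws, Bs):
    # A: (n, n) binary adjacency, X: (n, d0) node features,
    # Ws, Bs: one pair of weight matrices per layer k = 1..K
    H = X                                              # h_v^0 = x_v
    deg = A.sum(axis=1, keepdims=True).clip(min=1)     # |N(v)|, clipped to avoid divide-by-zero
    for Wk, Bk in zip(Ws, Bs):
        neigh_avg = (A @ H) / deg                      # average of neighbors' previous-layer embeddings
        H = relu(neigh_avg @ Wk + H @ Bk)              # transform neighborhood and self, apply sigma
    return H                                           # z_v = h_v^K

# Toy run: two layers mapping 3-dim features to 4-dim embeddings.
rng = np.random.default_rng(0)
n = 5
A = (rng.random((n, n)) < 0.5).astype(float)
A = np.maximum(A, A.T); np.fill_diagonal(A, 0)
X = rng.normal(size=(n, 3))
Ws = [rng.normal(size=(3, 8)), rng.normal(size=(8, 4))]
Bs = [rng.normal(size=(3, 8)), rng.normal(size=(8, 4))]
print(gnn_forward(A, X, Ws, Bs).shape)   # (5, 4)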

slide-30
SLIDE 30

Training the Model


Need to define a loss function on the embeddings!

How do we train the model to generate embeddings?


slide-31
SLIDE 31

Model Parameters


We can feed these embeddings into any loss function and run stochastic gradient descent to train the weight parameters

W_k and B_k are the trainable weight matrices (i.e., what we learn):

h_v^0 = x_v
h_v^k = σ( W_k Σ_{u ∈ N(v)} h_u^{k-1} / |N(v)| + B_k h_v^{k-1} ),  ∀k ∈ {1, ..., K}
z_v = h_v^K

slide-32
SLIDE 32

Unsupervised Training


§ Train in an unsupervised manner:

§ Use only the graph structure
§ "Similar" nodes have similar embeddings

§ Unsupervised loss function can be anything from the last section, e.g., a loss based on

§ Random walks (node2vec, DeepWalk, struc2vec)
§ Graph factorization
§ Node proximity in the graph
(a minimal sketch of a random-walk-based loss follows below)
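A hedged sketch of one such unsupervised loss, assuming a node2vec/DeepWalk-style objective with negative sampling: node pairs that co-occur on random walks should get a large dot product, randomly sampled pairs a small one. The embeddings Z can come from any encoder in this section; all names are illustrative.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def unsupervised_loss(Z, pos_pairs, neg_pairs):
    # Z: (n, d) node embeddings; pos_pairs: (u, v) pairs that co-occur on random walks;
    # neg_pairs: randomly sampled (u, v) pairs used as negatives
    loss = 0.0
    for u, v in pos_pairs:
        loss -= np.log(sigmoid(Z[u] @ Z[v]) + 1e-10)        # pull co-occurring nodes together
    for u, v in neg_pairs:
        loss -= np.log(1.0 - sigmoid(Z[u] @ Z[v]) + 1e-10)  # push random pairs apart
    return loss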

slide-33
SLIDE 33

Unsupervised: Example


Image from: Rhee et al. 2017. Hybrid Approach of Relation Network and Localized Graph Convolutional Filtering for Breast Cancer Subtype Classification. arXiv.

slide-34
SLIDE 34

Supervised Training


Directly train the model for a supervised task (e.g., node classification)

Safe or toxic drug?

E.g., a drug-drug interaction network

slide-35
SLIDE 35

Supervised: Example


Graph neural network applied to a gene-gene interaction graph to predict gene expression levels. Single gene inference task: nodes are added based on their distance from the node whose expression we want to predict.

Image from: Dutil et al. 2018. Towards Gene Expression Convolutions using Gene Interaction Graphs. arXiv.

slide-36
SLIDE 36

Training the Model


Directly train the model for a supervised task (e.g., node classification)

L = Σ_{v ∈ V} y_v log(σ(z_v^T θ)) + (1 - y_v) log(1 - σ(z_v^T θ))

where z_v is the encoder output (node embedding), θ are the classification weights, and y_v is the node class label (e.g., safe or toxic drug); a minimal code sketch follows below.
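A small NumPy sketch of this loss, with illustrative names: Z stacks the encoder outputs z_v, theta holds the classification weights, and y holds the 0/1 node labels. The slide writes the log-likelihood; the sketch returns its negative, which is what gradient descent would minimize.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def node_classification_loss(Z, theta, y):
    # Z: (n, d) node embeddings, theta: (d,) classification weights, y: (n,) binary labels
    p = sigmoid(Z @ theta)                    # predicted probability that each node is in class 1
    log_lik = np.sum(y * np.log(p + 1e-10) + (1 - y) * np.log(1 - p + 1e-10))
    return -log_lik                           # negative log-likelihood (cross-entropy) to minimize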

slide-37
SLIDE 37

Model Design: Overview


1) Define a neighborhood aggregation function
2) Define a loss function on the embeddings


slide-38
SLIDE 38

Model Design: Overview


3) Train on a set of nodes, i.e., a batch of compute graphs

slide-39
SLIDE 39

Model Design: Overview


4) Generate embeddings for nodes


slide-40
SLIDE 40

Summary So Far


§ Recap: Generate node embeddings by aggregating neighborhood information

§ We saw a basic variant of this idea
§ Key distinctions are in how different approaches aggregate information across the layers

§ Next: Describe state-of-the-art graph neural network

slide-41
SLIDE 41

Outline of This Section


1. Basics of deep learning for graphs
2. Graph convolutional networks
3. Biomedical applications

slide-42
SLIDE 42

Graph Convolutional Networks


Based on material from:

  • Hamilton et al., 2017. Inductive Representation Learning on Large Graphs. NIPS.

slide-43
SLIDE 43

GraphSAGE



So far we have aggregated the neighbor messages by taking their (weighted) average. Can we do better?

slide-44
SLIDE 44

GraphSAGE: Idea


h_v^k = σ( [ W_k · agg({h_u^{k-1}, ∀u ∈ N(v)}), B_k h_v^{k-1} ] )

agg can be any differentiable function that maps the set of neighbor vectors {h_u^{k-1}, u ∈ N(v)} to a single vector

slide-45
SLIDE 45

GraphSAGE Aggregation

§ Simple neighborhood aggregation:

h_v^k = σ( W_k Σ_{u ∈ N(v)} h_u^{k-1} / |N(v)| + B_k h_v^{k-1} )

§ GraphSAGE (generalized aggregation; concatenate the self embedding and the aggregated neighbor embedding):

h_v^k = σ( [ W_k · agg({h_u^{k-1}, ∀u ∈ N(v)}), B_k h_v^{k-1} ] )

(a minimal code sketch of this update follows below)
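A minimal NumPy sketch of the GraphSAGE update above: aggregate the neighbors' previous-layer embeddings with a pluggable agg function, transform, concatenate with the transformed self embedding, and apply the non-linearity. Names and shapes are illustrative, not the reference implementation.

import numpy as np

def relu(x):
    return np.maximum(0, x)

def graphsage_layer(H, neighbors, Wk, Bk, agg):
    # H: (n, d_in) previous-layer embeddings; neighbors[v]: list of v's neighbor indices;
    # Wk, Bk: (d_in, d_out) trainable matrices; agg: maps a set of vectors to one vector
    out = []
    for v in range(H.shape[0]):
        h_neigh = agg(H[neighbors[v]])                     # aggregate {h_u^(k-1) : u in N(v)}
        h_cat = np.concatenate([h_neigh @ Wk, H[v] @ Bk])  # concatenate neighborhood and self parts
        out.append(relu(h_cat))
    return np.stack(out)

def mean_agg(H_neigh):
    # simple mean aggregator; returns zeros for isolated nodes
    return H_neigh.mean(axis=0) if len(H_neigh) > 0 else np.zeros(H_neigh.shape[1])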

slide-46
SLIDE 46

Variants of Aggregation


Mean: Take a weighted average of neighbors:

agg = Σ_{u ∈ N(v)} h_u^{k-1} / |N(v)|

Pool: Transform neighbor vectors and apply a symmetric vector function γ (element-wise mean/max):

agg = γ( {Q h_u^{k-1}, ∀u ∈ N(v)} )

LSTM: Apply an LSTM to a random permutation π of the neighbors:

agg = LSTM( [h_u^{k-1}, ∀u ∈ π(N(v))] )
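Sketches of the three aggregators in NumPy, applied to a matrix H_neigh whose rows are the neighbors' previous-layer embeddings. Q would be trainable and the LSTM would be a full recurrent network; here Q is a random placeholder and only the permutation step of the LSTM variant is shown, so this is illustrative only.

import numpy as np

rng = np.random.default_rng(0)
H_neigh = rng.normal(size=(4, 8))      # 4 neighbors, 8-dimensional previous-layer embeddings
Q = rng.normal(size=(8, 8))            # trainable transform used by the pooling aggregator

mean_agg = H_neigh.mean(axis=0)                        # Mean: average of the neighbor vectors
pool_agg = np.maximum(0, H_neigh @ Q).max(axis=0)      # Pool: transform, then element-wise max
pi = rng.permutation(len(H_neigh))                     # LSTM: random permutation pi(N(v)) ...
lstm_input = H_neigh[pi]                               # ... whose rows would be fed to an LSTM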

slide-47
SLIDE 47

Summary So Far


Key idea: Generate node embeddings based on local neighborhoods

§ Nodes aggregate “messages” from their neighbors using neural networks

[Figure: messages W_k h_u^{k-1} from the neighbors u and B_k h_v^{k-1} from node v itself are combined into h_v^k]

slide-48
SLIDE 48

More on Graph Neural Nets


Attention-based neighborhood aggregation:
§ Graph attention networks (Hoshen, 2017; Velickovic et al., 2018; Liu et al., 2018)
Embedding edges and entire graphs:
§ Graph neural nets with edge embeddings (Battaglia et al., 2016; Gilmer et al., 2017)
§ Embedding entire graphs (Duvenaud et al., 2015; Dai et al., 2016; Li et al., 2018)

Spectral approaches to graph neural networks:
§ Spectral graph CNN & ChebNet (Bruna et al., 2015; Defferrard et al., 2016)
Hyperbolic geometry and hierarchical embeddings:
§ Hierarchical relations (Nickel et al., 2017; Nickel et al., 2018)

slide-49
SLIDE 49

Outline of This Section


1. Basics of deep learning for graphs
2. Graph convolutional networks
3. Biomedical applications

slide-50
SLIDE 50

Application: Tissue-specific Protein Function Prediction


Material based on:

  • Zitnik and Leskovec. 2017. Predicting Multicellular Function through Multilayer Tissue Networks. ISMB.
  • Hamilton et al., 2017. Inductive Representation Learning on Large Graphs. NIPS.

slide-51
SLIDE 51

[Greene et al. 2015, Yeger & Sharan 2015, GTEx and others]


Why Protein Functions?

Knowledge of protein functions in different tissues is essential for:

§ Understanding human biology
§ Interpreting genetic variation
§ Developing disease treatments

slide-52
SLIDE 52

Biotechnological limits & rapid growth of sequence data: most proteins can only be annotated computationally


Why Predict Protein Functions?

slide-53
SLIDE 53

Protein Function Prediction

[Figure: PPI network over proteins CDC3, CDC16, CLB4, RPN3, RPT1, RPT6 and two unannotated proteins UNK1, UNK2; machine learning assigns functions such as cell proliferation and cell cycle to the unannotated proteins]

This is a multi-label node classification task

slide-54
SLIDE 54

What Does My Protein Do?

Goal: Given a protein and a tissue, predict the protein’s functions in that tissue

Proteins × Functions × Tissues → [0, 1]

WNT1 × (Midbrain development, Substantia nigra) → 0.9
RPT6 × (Angiogenesis, Blood) → 0.05

slide-55
SLIDE 55

Existing Research

§ Guilty by association: a protein's function is determined based on the proteins it interacts with
§ No tissue-specificity
§ Protein functions are assumed constant across organs and tissues:
§ Functions in the heart are the same as in the skin

Lack of methods for predicting protein functions in different biological contexts

slide-56
SLIDE 56

Challenges

§ Tissues are related to each other:

§ Proteins in biologically similar tissues have similar functions
§ Proteins are missing in some tissues

§ Little is known about tissue-specific protein functions:

§ Many tissues have no annotations

slide-57
SLIDE 57

Approach

1. Represent every tissue with a separate protein-protein interaction graph:

§ Protein function prediction is a multi-label node classification task
§ Each protein can have 0, 1, or more functions (labels) in each tissue

2. Learn protein embeddings:

§ Use PPI graphs and labels to train GraphSAGE:

§ Learn how to embed proteins in each tissue:

– Aggregate neighborhood information
– Share parameters in the encoder

§ Use inductive learning!

slide-58
SLIDE 58

Inductive Learning of Tissues


[Figure: the neural models (compute graphs) for node A and node B use the same shared aggregation parameters W_k]

§ The same aggregation parameters are shared for all nodes:

§ Can generalize to unseen nodes
§ Can make predictions on entirely unseen graphs (tissues)!


slide-59
SLIDE 59

Inductive Learning of Tissues


1. Train on a protein-protein interaction graph from one tissue
2. Generate embeddings and make predictions for newly collected data about a different tissue

[Figure: train on forebrain tissue (WNT1, midbrain development); generalize to blood tissue (RPT6, angiogenesis)]

Inductive node embeddings generalize to entirely unseen graphs

slide-60
SLIDE 60

Data and Setup

§ Data:

§ Protein-protein interaction (PPI) graphs, with each graph corresponding to a different human tissue
§ Use positional gene sets, motif gene sets, and immunological signatures from MSigDB as node features

§ Feature data is very sparse (42% of nodes have no features)
§ This makes leveraging neighborhood information critical

§ Use Gene Ontology annotations as labels

§ Setup:

§ Multi-label node classification:

§ Each protein can have 0, 1, or more functions (labels) in each tissue

§ Train GraphSAGE on 20 tissue-specific PPI graphs
§ Generate new embeddings "on the fly"
§ Make predictions on entirely unseen graphs (i.e., new tissues)
(a minimal sketch of the multi-label prediction step follows below)
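A hedged sketch of the multi-label part of this setup: an encoder such as the GraphSAGE layer sketched earlier maps every protein in a tissue's PPI graph to an embedding, and an independent sigmoid output per function scores each (protein, function) pair. All names and shapes are illustrative.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

num_proteins, d, num_functions = 100, 32, 10
rng = np.random.default_rng(0)

Z = rng.normal(size=(num_proteins, d))        # protein embeddings from the tissue's PPI graph
Theta = rng.normal(size=(d, num_functions))   # one classifier column per function (GO term)

scores = sigmoid(Z @ Theta)                   # scores[p, f]: probability protein p has function f
predictions = scores > 0.5                    # each protein can receive 0, 1, or more labels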

slide-61
SLIDE 61

Annotating New Tissues

§ Transfer protein functions to an unannotated tissue
§ Task: Predict functions in the target tissue without access to any annotation/label in that tissue


§ GraphSAGE significantly outperforms the baseline approaches
§ LSTM- and pooling-based aggregators outperform mean- and GCN-based aggregators

Unsup. = unsupervised; Sup. = fully supervised GraphSAGE
F1 scores are in [0, 1]; higher is better

slide-62
SLIDE 62

Outline of This Section


1. Basics of deep learning for graphs
2. Graph convolutional networks
3. Biomedical applications

slide-63
SLIDE 63

PhD Students, Post-Doctoral Fellows, Funding, Collaborators, Industry Partnerships

Claire Donnat, Mitchell Gordon, David Hallac, Emma Pierson, Himabindu Lakkaraju, Rex Ying, Tim Althoff, Will Hamilton, Baharan Mirzasoleiman, Marinka Zitnik, Michele Catasta, Srijan Kumar, Stephen Bach, Rok Sosic

Research Staff

Adrijan Bradaschia
Dan Jurafsky, Linguistics, Stanford University
Cristian Danescu-Niculescu-Mizil, Information Science, Cornell University
Stephen Boyd, Electrical Engineering, Stanford University
David Gleich, Computer Science, Purdue University
VS Subrahmanian, Computer Science, University of Maryland
Sarah Kunz, Medicine, Harvard University
Russ Altman, Medicine, Stanford University
Jochen Profit, Medicine, Stanford University
Eric Horvitz, Microsoft Research
Jon Kleinberg, Computer Science, Cornell University
Sendhil Mullainathan, Economics, Harvard University
Scott Delp, Bioengineering, Stanford University
Jens Ludwig, Harris Public Policy, University of Chicago
Geet Sethi
Alex Porter
slide-64
SLIDE 64

Many interesting high-impact projects in Machine Learning and Large Biomedical Data

Applications: Precision Medicine & Health, Drug Repurposing, Drug Side Effect modeling, Network Biology, and many more
