

  1. Assembly

  2. Assembly Computational Challenge: assemble individual short fragments (reads) into a single genomic sequence (“superstring”)

  3. Shortest common superstring problem: Given a set of strings, find a shortest string that contains all of them. Input: strings s_1, s_2, …, s_n. Output: a string s that contains every s_i as a substring, such that the length of s is minimized.
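As a quick illustration (not from the slides), a brute-force sketch that tries every ordering of the input strings and greedily overlaps adjacent ones:

```python
from itertools import permutations

def merge(a, b):
    """Append b to a, reusing the longest suffix of a that is a prefix of b."""
    for k in range(min(len(a), len(b)), 0, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return a + b

def shortest_common_superstring(strings):
    """Exact SCS by trying every ordering: O(n!), only for tiny inputs.
    Assumes no string is a substring of another."""
    best = None
    for order in permutations(strings):
        s = order[0]
        for t in order[1:]:
            s = merge(s, t)
        if best is None or len(s) < len(best):
            best = s
    return best

print(shortest_common_superstring(["AAG", "AGT", "GTT"]))  # -> AAGTT
```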

  4. Shortest common superstring

  5. Overlap Graph

  6. De Bruijn Graph

  7. Overlap graph vs De Bruijn graph [figure: k-mer graph with nodes CG, GT, TG, CA, AT, GC, GG; in the De Bruijn graph the path visits every EDGE once]
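A minimal sketch of De Bruijn graph construction from k-mer reads; the example reads are illustrative, not the ones in the slide's figure.

```python
from collections import defaultdict

def de_bruijn(kmers):
    """De Bruijn graph: nodes are (k-1)-mers; each k-mer contributes one edge
    from its prefix to its suffix. Assembly = a path using every edge once."""
    graph = defaultdict(list)
    for kmer in kmers:
        graph[kmer[:-1]].append(kmer[1:])
    return graph

# Hypothetical 3-mer reads; the repeated k-mer TGC gives parallel edges
print(dict(de_bruijn(["ATG", "TGC", "GCG", "CGT", "GTG", "TGC"])))
# {'AT': ['TG'], 'TG': ['GC', 'GC'], 'GC': ['CG'], 'CG': ['GT'], 'GT': ['TG']}
```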

  8. Some definitions

  9. Eulerian walk/path: a walk that traverses every edge exactly once; it exists iff the graph is connected and has zero or two semibalanced vertices.

  10. Assume all nodes are balanced. a. Start with an arbitrary vertex v and form an arbitrary cycle by following unused edges until a dead end is reached. Since the graph is Eulerian, this dead end is necessarily the starting point, i.e., vertex v.

  11. b. If the cycle from (a) is not an Eulerian cycle, it must contain a vertex w that still has untraversed edges. Perform step (a) again, using w as the starting point. Once again, we end up back at the starting vertex w.

  12. c. Combine the cycles from (a) and (b) into a single cycle and iterate step (b) until all edges are used; an iterative version is sketched below.
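An iterative Python rendering of steps (a)-(c), which grows a walk and splices in sub-cycles implicitly via a stack; the graph representation (vertex mapped to a successor list) is an assumption.

```python
def eulerian_cycle(graph, start):
    """Hierholzer's algorithm. Assumes every vertex is balanced and the
    graph is connected, so an Eulerian cycle exists."""
    adj = {v: list(ws) for v, ws in graph.items()}  # consumable edge copies
    stack, cycle = [start], []
    while stack:
        v = stack[-1]
        if adj.get(v):              # untraversed edge: extend the current walk
            stack.append(adj[v].pop())
        else:                       # dead end: this vertex closes a sub-cycle
            cycle.append(stack.pop())
    return cycle[::-1]

# Balanced toy graph: one Eulerian cycle through all five edges
g = {"A": ["B"], "B": ["C", "D"], "C": ["A"], "D": ["B"]}
print(eulerian_cycle(g, "A"))   # ['A', 'B', 'D', 'B', 'C', 'A']
```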

  13. Eulerian path • A vertex v is "semibalanced" if |in-degree(v) − out-degree(v)| = 1 • If a graph has an Eulerian path starting from s and ending at t, then all its vertices are balanced, with the possible exception of s and t • Add an edge between the two semibalanced vertices: now all vertices should be balanced (assuming there was an Eulerian path to begin with). Find the Eulerian cycle, and remove the edge you added. You now have the Eulerian path you wanted.
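A sketch of this edge-adding trick, reusing `eulerian_cycle` from the sketch above; it assumes the input really has an Eulerian path (exactly one vertex with one extra out-edge and one with one extra in-edge).

```python
def eulerian_path(graph):
    """Balance the two semibalanced vertices, find a cycle, cut the added edge."""
    out_deg = {v: len(ws) for v, ws in graph.items()}
    in_deg = {}
    for ws in graph.values():
        for w in ws:
            in_deg[w] = in_deg.get(w, 0) + 1
    verts = set(out_deg) | set(in_deg)
    t = next(v for v in verts if in_deg.get(v, 0) - out_deg.get(v, 0) == 1)
    s = next(v for v in verts if out_deg.get(v, 0) - in_deg.get(v, 0) == 1)
    g = {v: list(graph.get(v, [])) for v in verts}
    g[t].append(s)                     # balance the graph: add edge t -> s
    cycle = eulerian_cycle(g, s)       # reuse the sketch above
    cycle.pop()                        # drop the repeated start vertex
    # rotate so the walk starts at s just after the artificial t -> s edge
    for i in range(len(cycle) - 1):
        if cycle[i] == t and cycle[i + 1] == s:
            return cycle[i + 1:] + cycle[:i + 1]
    return cycle                       # artificial edge was at the wrap-around

print(eulerian_path({"A": ["B"], "B": ["C"], "C": ["A", "D"]}))
# ['C', 'A', 'B', 'C', 'D']
```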

  14. Complexity?

  15. Hidden Markov Models

  16. Markov model (a finite state machine with transition probabilities): modeling a sequence of weather observations.

  17. Hidden Markov models: assume the states of the machine are not observed; we can only observe an output emitted at certain states.

  18. Hidden Markov models, continued. Hidden states: Sunny, Rainy. Observations: Clean, Walk, Shop.

  19. Generate a sequence from an HMM [diagram: hidden chain s(i−1) → s(i) → s(i+1) with transition probabilities $p(s(i) \mid s(i-1))$ and $p(s(i+1) \mid s(i))$; each hidden state s(i) emits an observation x(i) with probability $p(x(i) \mid s(i))$]
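A minimal sampling sketch of this generative process, using the Sunny/Rainy states and Clean/Walk/Shop observations from the previous slide; all probability values are made-up illustrations.

```python
import random

# Made-up numbers for the Sunny/Rainy machine on the previous slides
TRANS = {"Sunny": {"Sunny": 0.8, "Rainy": 0.2},
         "Rainy": {"Sunny": 0.4, "Rainy": 0.6}}
EMIT  = {"Sunny": {"Walk": 0.6, "Shop": 0.3, "Clean": 0.1},
         "Rainy": {"Walk": 0.1, "Shop": 0.4, "Clean": 0.5}}

def sample_hmm(start, n, rng=random.Random(1)):
    """Alternate the slide's two steps: emit x(i) ~ p(.|s(i)), then s(i+1) ~ p(.|s(i))."""
    states, obs = [start], []
    for _ in range(n):
        s = states[-1]
        symbols, weights = zip(*EMIT[s].items())
        obs.append(rng.choices(symbols, weights=weights)[0])
        nxt, tw = zip(*TRANS[s].items())
        states.append(rng.choices(nxt, weights=tw)[0])
    return states[:n], obs

print(sample_hmm("Sunny", 5))
```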

  20. Generate a sequence from an HMM. Hidden (temperature): HHHHHHCCCCCCCHHHHHH. Observed (number of ice creams): 3323332111111233332.

  21. Hidden Markov models: applications include speech recognition and action recognition.

  22. Motif finding problem: find frequent motifs of length L in a sequence dataset [example: a set of DNA sequences containing the highlighted motif instances CAGGTAAGT, CAGGTAAGG, AAGGTAAGT, and AAGGTGAGT]. Assumption: the motif instances are very similar to each other but look very different from the rest of the sequences.

  23. Motif: a first approximation. Assumption 1: the length of the motif is fixed to L. Assumption 2: the letters at different positions of the sequence are independently distributed. $p_i(A) = \frac{N_i(A)}{N_i(A) + N_i(T) + N_i(G) + N_i(C)}$, and $p(x) = \prod_{i=1}^{L} p_i(x(i))$.
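A small sketch of this position-independent model, estimating each $p_i$ from the aligned motif instances highlighted on slide 22.

```python
def position_probs(motifs):
    """Estimate p_i(a) = N_i(a) / sum_b N_i(b) from aligned motif instances."""
    probs = []
    for i in range(len(motifs[0])):
        counts = {a: 0 for a in "ACGT"}
        for m in motifs:
            counts[m[i]] += 1
        total = sum(counts.values())
        probs.append({a: c / total for a, c in counts.items()})
    return probs

def motif_prob(x, probs):
    """p(x) = prod_i p_i(x(i)) under the independence assumption."""
    p = 1.0
    for i, a in enumerate(x):
        p *= probs[i][a]
    return p

# The four instances highlighted on slide 22
instances = ["CAGGTAAGT", "CAGGTAAGG", "AAGGTAAGT", "AAGGTGAGT"]
pw = position_probs(instances)
print(motif_prob("CAGGTAAGT", pw))   # 0.28125
```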

  24. Motif: (hidden) Markov models. Assumption 1: the length of the motif is fixed to L. Assumption 2: each letter depends only on the immediately preceding letter. $p_i(A \mid G) = \frac{N_{i-1,i}(G, A)}{N_{i-1}(G)}$, and $p(x) = p_1(x(1)) \prod_{i=2}^{L} p_i(x(i) \mid x(i-1))$.
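The corresponding sketch for the first-order model; `p1` and `cond` are stand-ins for the count-derived tables $p_1$ and $p_i(\cdot \mid \cdot)$, which the caller must supply.

```python
def markov_motif_prob(x, p1, cond):
    """p(x) = p1(x(1)) * prod_{i=2..L} p_i(x(i) | x(i-1)).

    p1: dict letter -> probability of the first position.
    cond: dict position i -> dict (prev, cur) -> p_i(cur | prev)."""
    p = p1[x[0]]
    for i in range(1, len(x)):
        p *= cond[i][(x[i - 1], x[i])]
    return p
```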

  25. Motif finding problem: we don't know the exact locations of the motifs in the sequence dataset [same example sequences as slide 22]. Assumption: the motif instances are very similar to each other but look very different from the rest of the sequences.

  26. Hidden state space [state diagram with start, null (background), and end states]

  27. Hidden Markov model (HMM) [state diagram; transition probabilities shown include 0.9, 0.99, 0.02, 0.08, 0.95, 0.01, and 0.05]

  28. How to build HMMs?

  29. Computational problems in HMMs

  30. Hidden Markov Models

  31. Hidden Markov model [diagram: hidden chain q(i−1) → q(i) → q(i+1) with observations o(i−1), o(i), o(i+1)]

  32. Conditional probability of observations: the emission probability $p(o(i) \mid q(i))$. Example:

  33. Joint and marginal probabilities. Joint: $p(o, q) = p(q(1))\, p(o(1) \mid q(1)) \prod_{i=2}^{T} p(q(i) \mid q(i-1))\, p(o(i) \mid q(i))$. Marginal: $p(o) = \sum_{q} p(o, q)$, summing over all hidden state sequences.
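For intuition (not from the slides), a brute-force sketch that computes the marginal by enumerating every hidden state path; its exponential cost is exactly what the forward algorithm below avoids.

```python
from itertools import product

def joint(q, o, init, trans, emit):
    """p(o, q) for one hidden path q, per the factorization above."""
    p = init[q[0]] * emit[q[0]][o[0]]
    for prev, cur, x in zip(q, q[1:], o[1:]):
        p *= trans[prev][cur] * emit[cur][x]
    return p

def marginal(o, states, init, trans, emit):
    """p(o) = sum over all |S|^T hidden paths -- exponential in T."""
    return sum(joint(q, o, init, trans, emit)
               for q in product(states, repeat=len(o)))
```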

  34. How to compute the probability of observations

  35. Forward algorithm: define $\alpha_t(j) = p(o(1), \ldots, o(t), q(t) = j)$, the probability of the first $t$ observations, ending in state $j$.

  36. Forward algorithm: initialization $\alpha_1(j) = p(q(1) = j)\, p(o(1) \mid q(1) = j)$.

  37. Forward algorithm: recursion $\alpha_{t+1}(j) = \big[\sum_i \alpha_t(i)\, p(q(t+1) = j \mid q(t) = i)\big]\, p(o(t+1) \mid q(t+1) = j)$; finally $p(o) = \sum_j \alpha_T(j)$.
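A compact Python sketch of the forward recursion above; `init`, `trans`, and `emit` are hypothetical probability tables keyed by state names.

```python
def forward(obs, states, init, trans, emit):
    """alpha[t][j] = p(o(1..t), q(t)=j); returns p(o) and the full trellis."""
    alpha = [{j: init[j] * emit[j][obs[0]] for j in states}]   # initialization
    for o in obs[1:]:                                          # recursion
        prev = alpha[-1]
        alpha.append({j: sum(prev[i] * trans[i][j] for i in states) * emit[j][o]
                      for j in states})
    return sum(alpha[-1].values()), alpha                      # termination
```

With the ice-cream example of slide 20, `obs` would be the digit sequence and `states` would be `("H", "C")`.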

  38. Decoding: finding the most probable state sequence. Similar to the forward algorithm, we can define $\delta_t(j) = \max_{q(1), \ldots, q(t-1)} p(o(1), \ldots, o(t), q(1), \ldots, q(t-1), q(t) = j)$, which satisfies the recursion $\delta_{t+1}(j) = \max_i \delta_t(i)\, p(q(t+1) = j \mid q(t) = i)\, p(o(t+1) \mid q(t+1) = j)$.

  39. Viterbi algorithm
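A matching sketch of the Viterbi recursion, which replaces the forward sum with a max and backtracks to recover the most probable state path (same hypothetical `init`/`trans`/`emit` tables as the forward sketch).

```python
def viterbi(obs, states, init, trans, emit):
    """delta[t][j] = best-path prob ending in j at time t; backtrack for argmax."""
    delta = [{j: init[j] * emit[j][obs[0]] for j in states}]
    back = []
    for o in obs[1:]:
        prev, row, ptr = delta[-1], {}, {}
        for j in states:
            i_best = max(states, key=lambda i: prev[i] * trans[i][j])
            ptr[j] = i_best
            row[j] = prev[i_best] * trans[i_best][j] * emit[j][o]
        delta.append(row)
        back.append(ptr)
    last = max(states, key=lambda j: delta[-1][j])
    path = [last]
    for ptr in reversed(back):          # follow back-pointers to the start
        path.append(ptr[path[-1]])
    return path[::-1], delta[-1][last]
```

Run on the ice-cream observations, this recovers a hot/cold labeling like the one shown on slide 20.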
