
Topics in Brain Computer Interfaces - PowerPoint PPT Presentation



  1. Topics in Brain Computer Interfaces. CS295-7, Spring 2005. Professor: Michael Black. TA: Frank Wood. Bayesian Inference through Particle Filtering.

  2. Homework Review
     • Results?
     • Questions?
     • Causal vs. Generative?

  3. Decoding Methods
     Direct decoding methods: $\vec{x}_k = f(\vec{z}_k, \vec{z}_{k-1}, \ldots)$
     Simple linear regression method:
     $x_k = f_1^T \vec{Z}_{k-d:k}$
     $y_k = f_2^T \vec{Z}_{k-d:k}$
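The linear regression decoder is easy to sketch in code. The following is a minimal illustration, not the course implementation: the firing rates, kinematics, history length d, and all array shapes are made-up assumptions.

```python
import numpy as np

# Hypothetical setup: T time steps, n cells, a history of d past bins.
rng = np.random.default_rng(0)
T, n_cells, d = 500, 42, 9
Z = rng.poisson(5.0, size=(T, n_cells)).astype(float)  # fake firing rates z_k
X = rng.normal(size=(T, 2))                            # fake kinematics (x_k, y_k)

# Row k of the design matrix holds the window [z_k, z_{k-1}, ..., z_{k-d}].
Z_lag = np.asarray([Z[k - d:k + 1][::-1].ravel() for k in range(d, T)])

# Least-squares fit of the coefficient vectors f_1 (for x) and f_2 (for y).
F, *_ = np.linalg.lstsq(Z_lag, X[d:], rcond=None)

X_hat = Z_lag @ F  # decoded kinematics for each time step
```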

  4. Decoding Methods
     Direct decoding methods: $\vec{x}_k = f(\vec{z}_k, \vec{z}_{k-1}, \ldots)$
     In contrast to generative encoding models: $\vec{z}_k = f(\vec{x}_k)$
     We need a sound way to exploit generative models for decoding.

  5. Today’s Strategy
     • More mathematical than the previous classes.
     • Group exploration and discovery.
     • One topic with a deeper level of understanding: Particle Filtering
     • Review and explore recursive Bayesian estimation
     • Introduce the SIS algorithm
     • Explore Monte Carlo integration
     • Examine the SIS algorithm (if time permits)

  6. Accounting for Uncertainty
     Every real process has process and measurement noise:
     $\vec{x}_k = f_{\text{process}}(\vec{x}_{0:k-1}) + \text{noise}$
     $\vec{z}_k = f_{\text{observation}}(\vec{x}_{0:k}) + \text{noise}$
     A probabilistic process model accounts for process and measurement noise probabilistically; the noise appears as modeling uncertainty:
     $\hat{\vec{x}}_k \mid \vec{x}_{0:k-1} \sim f_{\text{process}}(\vec{x}_{0:k-1}) + \text{uncertainty}$
     $\hat{\vec{z}}_k \mid \vec{x}_{0:k} \sim f_{\text{observation}}(\vec{x}_{0:k}) + \text{uncertainty}$
     Example: a missile interceptor system. The missile propulsion system is noisy and the radar observations are noisy. Even if we are given exact process and observation models, our estimate of the missile’s position may diverge if we don’t account for uncertainty.

  7. Recursive Bayesian Estimation
     • Optimally integrates subsequent observations into the process and observation models:
     $\hat{\vec{x}}_k \mid \vec{x}_{0:k-1} \sim f_{\text{model}}(\vec{x}_{0:k-1}) + \text{uncertainty}$
     $\hat{\vec{z}}_k \mid \vec{x}_{0:k} \sim f_{\text{measurement}}(\vec{x}_{0:k}) + \text{uncertainty}$
     • Example – Biased coin with a fair Bernoulli prior.
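The biased-coin example can be made concrete in a few lines. This is a hedged sketch of the standard conjugate Beta-Bernoulli update, taking the slide's "fair Bernoulli prior" to be the uniform Beta(1, 1); the flip sequence is invented.

```python
# Beta(1, 1) is the uniform prior over the coin's bias.
heads, tails = 1.0, 1.0
flips = [1, 1, 0, 1, 1, 1, 0, 1]  # hypothetical observations (1 = heads)

for z in flips:
    # Posterior is proportional to likelihood times prior; with a Beta prior
    # and Bernoulli likelihood the posterior is again Beta, so updating is
    # just counting.
    heads += z
    tails += 1 - z
    print(f"after flip {z}: posterior mean of bias = {heads / (heads + tails):.3f}")
```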

  8. Bayesian Inference
     $$p(x \mid z) = \frac{p(z \mid x)\, p(x)}{p(z)}$$
     Here $p(z \mid x)$ is the likelihood (the evidence), $p(x)$ is the prior (a priori – before the evidence), $p(x \mid z)$ is the a posteriori probability (after the evidence), and $p(z)$ is a normalization constant (independent of $x$).
     We infer the system state from uncertain observations and our prior knowledge (model) of the system state.

  9. Notation and BCI Example
     Observations: $Z_{1:k} = (\vec{z}_1, \vec{z}_2, \ldots, \vec{z}_k)$, with $\vec{z}_k = (z_{1,k}, z_{2,k}, \ldots, z_{n,k})^T$, e.g. the firing rates of all $n$ cells at time $k$.
     System state: $X_{1:k} = (\vec{x}_1, \vec{x}_2, \ldots, \vec{x}_k)$, with $\vec{x}_k = (x_k, y_k, v_{x,k}, v_{y,k}, a_{x,k}, a_{y,k})^T$, e.g. the hand kinematics (position, velocity, acceleration) at time $k$.

  10. Generative Model
      Encoding: $\vec{z}_k = f_{\text{obs}}(\vec{x}_{1:k}) + \vec{q}_k$, where $\vec{z}_k$ is the neural firing rate, $\vec{x}_k$ is the state (e.g. hand position, velocity, acceleration), and $\vec{q}_k$ is noise (e.g. Normal or Poisson).
      Temporal prior: $\vec{x}_k = f_p(\vec{x}_{1:k-1}) + \vec{w}_k$
      Questions: linear or non-linear? Are the $f()$’s Markov?

  11. Today’s Goal
      Build a probabilistic model of this real process and with it estimate the posterior distribution $p(\vec{x}_k \mid \vec{z}_{1:k})$, so that we can infer the most likely state $\arg\max_{\vec{x}_k} p(\vec{x}_k \mid \vec{z}_{1:k})$ or the expected state $\int \vec{x}_k\, p(\vec{x}_k \mid \vec{z}_{1:k})\, d\vec{x}_k$ (see the sketch after this slide).
      How? $p(\vec{x}_{k-1} \mid \vec{z}_{1:k-1}) \Rightarrow p(\vec{x}_k \mid \vec{z}_{1:k})$. Recursion!
      • How can we formulate this recursion?
      • How can we compute this recursion?
      • What assumptions must we make?
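As an illustration of the two point estimates, here is a small sketch on a made-up, discretized 1-D posterior (the grid and the density are assumptions; on a grid the integral becomes a sum):

```python
import numpy as np

x_grid = np.linspace(-3.0, 3.0, 301)
posterior = np.exp(-0.5 * (x_grid - 0.7) ** 2)  # toy unnormalized posterior
posterior /= posterior.sum()

map_state = x_grid[np.argmax(posterior)]  # most likely state: argmax_x p(x | z)
mean_state = np.sum(x_grid * posterior)   # expected state: the integral of x p(x | z)
print(map_state, mean_state)              # both close to 0.7 here
```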

  12. Modeling
      A useful aside: graphical models. Graphical models are a way of systematically diagramming the dependencies among groups of random variables. They can help elucidate assumptions and modeling choices that would otherwise be hard to visualize and understand. Using a graphical model will help us design our model!

  13. Graphical Model
      Generative model: a node $\vec{x}_k$ (the state) with an arrow to a node $\vec{z}_k$ (the observation), labeled $p(\vec{z}_k \mid \vec{x}_k)$.

  14. Graphical Model
      The temporal chain: $\vec{x}_{k-1} \rightarrow \vec{x}_k \rightarrow \vec{x}_{k+1}$, with each state emitting its observation $\vec{z}_{k-1}, \vec{z}_k, \vec{z}_{k+1}$.

  15. Graphical Model
      The same chain encodes the Markov assumptions:
      $p(\vec{x}_k \mid \vec{x}_{k-1}, \vec{x}_{k-2}, \ldots, \vec{x}_1) = p(\vec{x}_k \mid \vec{x}_{k-1})$
      $p(\vec{x}_k \mid \vec{x}_{1:k-1}, \vec{z}_{1:k-1}) = p(\vec{x}_k \mid \vec{x}_{k-1})$ and $p(\vec{z}_k \mid \vec{x}_{1:k}, \vec{z}_{1:k-1}) = p(\vec{z}_k \mid \vec{x}_k)$

  16. Summary
      From these modeling choices, all we have to choose is:
      • Likelihood (encoding) model: $p(\vec{z}_k \mid \vec{x}_k)$
      • Temporal prior model: $p(\vec{x}_k \mid \vec{x}_{k-1})$
      • Initial distributions: $p(\vec{x}_0)$, $p(\vec{z}_0)$
      How to compute the posterior (decoding):
      $$p(\vec{x}_k \mid \vec{z}_{1:k}) = \kappa\, p(\vec{z}_k \mid \vec{x}_k) \int p(\vec{x}_k \mid \vec{x}_{k-1})\, p(\vec{x}_{k-1} \mid \vec{z}_{1:k-1})\, d\vec{x}_{k-1}$$
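This recursion can be computed exactly on a discretized state space, a useful reference point before particles are introduced. Below is a minimal sketch on a 1-D grid; the random-walk temporal prior, Gaussian likelihood, and observation values are all illustrative assumptions.

```python
import numpy as np

x = np.linspace(-5.0, 5.0, 201)  # discretized 1-D state space

# Temporal prior p(x_k | x_{k-1}): assumed Gaussian random walk. Column j
# holds the distribution over x_k given x_{k-1} = x[j].
trans = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.3) ** 2)
trans /= trans.sum(axis=0, keepdims=True)

posterior = np.full_like(x, 1.0 / x.size)  # flat initial distribution p(x_0)
for z in [0.2, 0.5, 0.9, 1.1]:             # fake observations
    prior = trans @ posterior              # the integral over x_{k-1}
    likelihood = np.exp(-0.5 * ((z - x) / 0.5) ** 2)  # assumed p(z_k | x_k)
    posterior = likelihood * prior
    posterior /= posterior.sum()           # kappa, the normalization constant
```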

  17. Linear Gaussian Generative Model
      Observation equation: $\vec{z}_k = H \vec{x}_k + \vec{q}_k$, $\vec{q}_k \sim N(0, Q)$, for $k = 0, 1, 2, \ldots$
      Here $\vec{z}_k$ is the 42-dimensional firing rate vector (zero-mean, square-root transformed), $H$ is a 42 x 6 matrix, $\vec{x}_k = (x_k, y_k, v_{x,k}, v_{y,k}, a_{x,k}, a_{y,k})^T$ is the system state vector, and $Q$ is a 42 x 42 covariance matrix.
      System equation: $\vec{x}_{k+1} = A \vec{x}_k + \vec{w}_k$, $\vec{w}_k \sim N(0, W)$, for $k = 0, 1, 2, \ldots$, where $A$ is a 6 x 6 matrix and $W$ is a 6 x 6 covariance matrix.
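To make the model concrete, here is a sketch that simulates it forward. The dimensions follow the slide (42 cells, 6-D kinematic state); the particular A, H, Q, and W values are placeholders, not the matrices fitted in the course.

```python
import numpy as np

rng = np.random.default_rng(1)
n_cells, n_state, T = 42, 6, 100

A = 0.99 * np.eye(n_state)                          # assumed system matrix (6 x 6)
H = rng.normal(scale=0.1, size=(n_cells, n_state))  # assumed tuning matrix (42 x 6)
W = 0.01 * np.eye(n_state)                          # process noise covariance (6 x 6)
Q = 0.10 * np.eye(n_cells)                          # observation noise covariance (42 x 42)

x = np.zeros(n_state)
states, observations = [], []
for k in range(T):
    z = H @ x + rng.multivariate_normal(np.zeros(n_cells), Q)  # z_k = H x_k + q_k
    states.append(x)
    observations.append(z)
    x = A @ x + rng.multivariate_normal(np.zeros(n_state), W)  # x_{k+1} = A x_k + w_k
```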

  18. Gaussian Assumption Clarified
      Gaussian distribution: $\vec{z}_k \sim N(H \vec{x}_k, Q)$, i.e. $\vec{z}_k - H \vec{x}_k = \vec{q}_k \sim N(0, Q)$.
      Recall the one-dimensional case: $p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-(x - \mu)^2 / 2\sigma^2\right)$
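A quick numeric check (with made-up scalar values) confirms the equivalence: the density of z under N(hx, q) equals the density of the residual z - hx under N(0, q).

```python
from scipy.stats import norm

z, h, x, q = 1.3, 0.8, 1.0, 0.25  # made-up scalar values
print(norm.pdf(z, loc=h * x, scale=q ** 0.5))        # z ~ N(h x, q)
print(norm.pdf(z - h * x, loc=0.0, scale=q ** 0.5))  # z - h x ~ N(0, q): same value
```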

  19. Graphical Model
      The same chain with the linear Gaussian model attached: $x_k = A x_{k-1} + w_k$ and $z_k = H x_k + q_k$.
      The joint distribution factors as:
      $$p(X_M, Z_M) = p(X_M)\, p(Z_M \mid X_M) = \Big[ p(x_1) \prod_{k=2}^{M} p(x_k \mid x_{k-1}) \Big] \Big[ \prod_{k=1}^{M} p(z_k \mid x_k) \Big]$$
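This factorization turns the joint density into a sum of per-step log terms, which the following sketch evaluates for the linear Gaussian model. A, H, W, Q and the trajectories X, Z are assumed to come from a simulation like the one above; the standard-normal prior on the first state is an extra assumption.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn

def joint_logprob(X, Z, A, H, W, Q):
    """log p(X, Z) via the factorization on this slide."""
    d = len(X[0])
    lp = mvn.logpdf(X[0], mean=np.zeros(d), cov=np.eye(d))  # assumed p(x_1)
    for k in range(1, len(X)):
        lp += mvn.logpdf(X[k], mean=A @ X[k - 1], cov=W)    # p(x_k | x_{k-1})
    for k in range(len(X)):
        lp += mvn.logpdf(Z[k], mean=H @ X[k], cov=Q)        # p(z_k | x_k)
    return lp
```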

  20. Break!
      • When we come back, quickly arrange yourselves into groups of 4 or 5. Uniformly distribute the applied math people and the people who have taken CS143 (Vision) among these groups.
      • Instead of straight-up lecture, we are going to work through some derivations together to improve retention and facilitate understanding.
      • If you don’t have pencil and paper, please get some.

  21. Next Step
      • Now we have a model: how do we do recursive Bayesian inference?
      • I will present several relatively easy problems, which I expect each group to solve in 5–10 minutes. When every group is finished, I will select one group and ask the person in that group who understood the problem the least to explain the solution to the class. The group is responsible for nominating this person and for making sure that he or she can explain the solution.

  22. Recursive Bayesian Inference
      $$p(\text{kinematics}_t \mid \text{firing}_{1:t}) = \frac{p(\text{firing}_t \mid \text{kinematics}_t)\, p(\text{kinematics}_t)}{p(\text{firing}_{1:t})}$$
      Here $p(\text{firing}_t \mid \text{kinematics}_t)$ is the likelihood (the evidence), $p(\text{kinematics}_t)$ is the prior (a priori – before the evidence), the left-hand side is the a posteriori probability (after the evidence), and the denominator is a normalization constant (independent of kinematics).
      We sequentially infer hand kinematics from uncertain evidence and our prior knowledge of how hands move.

  23. Recursive Bayesian Estimation
      • Update Stage – From the prediction stage you have a prior distribution over the system state at the current time $k$. After observing the process at time $k$, you can update the posterior to reflect that new information.
      • Prediction Stage – Given the posterior from a previous update stage and your system model, you produce the next prior distribution.
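These two stages are exactly what a particle filter alternates, which is where this lecture is headed. Here is a minimal, hedged sketch of the prediction/update loop with weighted samples; the 1-D random-walk prior and Gaussian likelihood are invented for illustration, not the course's BCI model.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
particles = rng.normal(size=n)  # samples from an assumed p(x_0)
weights = np.full(n, 1.0 / n)

for z in [0.1, 0.4, 0.8]:       # fake observations
    # Prediction stage: push each particle through the temporal prior
    # p(x_k | x_{k-1}), here an assumed Gaussian random walk.
    particles = particles + rng.normal(scale=0.3, size=n)
    # Update stage: reweight by the likelihood p(z_k | x_k) and renormalize.
    weights *= np.exp(-0.5 * ((z - particles) / 0.5) ** 2)
    weights /= weights.sum()
    print("posterior mean estimate:", np.sum(weights * particles))
```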
