  1. Introduction to Machine Learning CMSC 422 Marine Carpuat marine@cs.umd.edu

  2. What is this course about? • Machine learning studies algorithms for learning to do stuff • By finding (and exploiting) patterns in data

  3. What can we do with machine learning? • Teach robots how to cook from YouTube videos • Analyze text & speech • Recognize objects in images • Analyze genomics data

  4. Sometimes machines even perform better than humans! Question Answering system beats Jeopardy champion Ken Jennings at Quiz bowl!

  5. Machine Learning • Paradigm: “Programming by example” – Replace “human writing code” with “human supplying data” • Most central issue: generalization – How to abstract from “training” examples to “test” examples?

  6. A growing and fast-moving field • Broad applicability – Finance, robotics, vision, machine translation, medicine, etc. • Close connection between theory and practice • Open field, lots of room for new work!

  7. Course Goals • By the end of the semester, you should be able to – Look at a problem – Identify if ML is an appropriate solution – If so, identify what types of algorithms might be applicable – Apply those algorithms • This course is not – A survey of ML algorithms – A tutorial on ML toolkits such as Weka, TensorFlow, …

  8. Topics
      Foundations of Supervised Learning
      • Decision trees and inductive bias
      • Geometry and nearest neighbors
      • Perceptron
      • Practical concerns: feature design, evaluation, debugging
      • Beyond binary classification
      Advanced Supervised Learning
      • Linear models and gradient descent
      • Support Vector Machines
      • Naive Bayes models and probabilistic modeling
      • Neural networks
      • Kernels
      • Ensemble learning
      Unsupervised learning
      • K-means
      • PCA
      • Expectation maximization

  9. What you can expect from the instructors
      • Teaching Assistants: Ryan Dorson, Joe Yue-Hei Ng
      • We are here to help you learn by
      – Introducing concepts from multiple perspectives
        • Theory and practice
        • Readings and class time
      – Providing opportunities to practice, and feedback to help you stay on track
        • Homeworks
        • Programming assignments

  10. What I expect from you • Work hard (this is a 3-credit class!) – Do a lot of math (calculus, linear algebra, probability) – Do a fair amount of programming • Come to class prepared – Do the required readings!

  11. Highlights from course logistics
      • HW01 is due Wed 2:59pm
      • No late homeworks
      • Read syllabus here: http://www.cs.umd.edu/class/spring2016/cmsc422/syllabus/
      Grading
      • Participation (5%)
      • Homeworks (15%), ~10, almost weekly
      • Programming projects (30%), 3 of them, in teams of two or three students
      • Midterm exam (20%), in class
      • Final exam (30%), cumulative, in class

  12. Where to… • find the readings: A Course in Machine Learning • view and submit assignments: Canvas • check your grades: Canvas • ask and answer questions, participate in discussions and surveys, contact the instructors, and everything else: Piazza – Please use Piazza instead of email

  13. Today’s topics What does it mean to “learn by example”? • Classification tasks • Inductive bias • Formalizing learning

  14. Classification tasks • How would you write a program to distinguish a picture of me from a picture of someone else? • Provide example pictures of me and pictures of other people, and let a classifier learn to distinguish the two.

  15. Classification tasks • How would you write a program to decide whether a sentence is grammatical or not? • Provide examples of grammatical and ungrammatical sentences and let a classifier learn to distinguish the two.

  16. Classification tasks • How would you write a program to distinguish cancerous cells from normal cells? • Provide examples of cancerous and normal cells and let a classifier learn to distinguish the two.
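The pattern on these slides — provide labeled examples, let a classifier learn — can be sketched in a few lines of code. This is a hypothetical illustration only: the feature values are made up, and scikit-learn’s KNeighborsClassifier simply stands in for whatever classifier you might choose.

```python
# Sketch only: tiny made-up feature vectors standing in for cell measurements.
from sklearn.neighbors import KNeighborsClassifier

# Hypothetical training examples: [cell_size, nucleus_irregularity], with labels.
X_train = [[9.1, 0.8], [8.7, 0.9], [2.1, 0.1], [2.5, 0.2]]
y_train = ["cancerous", "cancerous", "normal", "normal"]

clf = KNeighborsClassifier(n_neighbors=1)
clf.fit(X_train, y_train)          # learn from the labeled examples
print(clf.predict([[8.9, 0.7]]))   # predict the class of a new, unseen cell
```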

  18. Let’s try it out… Your task: learn a classifier to distinguish class A from class B from examples

  19. • Examples of class A:

  20. • Examples of class B

  21. Let’s try it out… learn a classifier from examples. Now: predict the class of new examples using what you’ve learned

  22. What if I told you…

  23. Key ingredients needed for learning • Training vs. test examples – Memorizing the training examples is not enough! – Need to generalize to make good predictions on test examples • Inductive bias – Many classifier hypotheses are plausible – Need assumptions about the nature of the relation between examples and classes
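To see why memorization alone fails, here is a toy sketch (not from the slides) of a “classifier” that simply stores its training examples: it is perfect on the training set but has nothing to say about unseen inputs.

```python
# A "classifier" that just memorizes its training examples.
def memorize(train_examples):
    table = dict(train_examples)          # input -> label lookup
    def classify(x):
        return table.get(x, "???")        # no idea about unseen inputs
    return classify

train = [((0, 0), "A"), ((1, 1), "B")]
f = memorize(train)
print(f((0, 0)))   # "A"   -> perfect on training data
print(f((2, 3)))   # "???" -> memorization alone does not generalize
```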

  24. Machine Learning as Function Approximation
      Problem setting
      • Set of possible instances X
      • Unknown target function f: X → Y
      • Set of function hypotheses H = { h | h: X → Y }
      Input
      • Training examples {(x_1, y_1), …, (x_N, y_N)} of unknown target function f
      Output
      • Hypothesis h ∈ H that best approximates target function f
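A minimal sketch of this setup, assuming a made-up hypothesis set H of threshold rules and a handful of toy training examples: learning amounts to picking the h ∈ H that best fits the training data.

```python
# Hypothetical hypothesis set H: threshold rules h_t(x) = 1 if x >= t else 0.
def make_threshold(t):
    return lambda x: 1 if x >= t else 0

H = {t: make_threshold(t) for t in [0.0, 0.5, 1.0, 1.5, 2.0]}

# Toy training examples (x_n, y_n) produced by some unknown target function f.
train = [(0.2, 0), (0.4, 0), (1.2, 1), (1.8, 1)]

# Pick the hypothesis in H that makes the fewest mistakes on the training data.
def mistakes(h):
    return sum(1 for x, y in train if h(x) != y)

best_t = min(H, key=lambda t: mistakes(H[t]))
print("best threshold:", best_t)   # 0.5 (1.0 also fits; min returns the first)
```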

  25. Formalizing induction: Loss function
      • ℓ(y, ŷ), where y is the truth and ŷ is the system’s prediction
      • e.g. ℓ(y, f(x)) = 0 if y = f(x), 1 otherwise
      • Captures our notion of what is important to learn
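For concreteness, the zero-one loss above can be written directly in code; the squared loss is added here only as a contrasting example of a different notion of what matters (both function names are illustrative).

```python
# Zero-one loss: 0 if the prediction matches the truth, 1 otherwise.
def zero_one_loss(y_true, y_pred):
    return 0 if y_true == y_pred else 1

# Squared loss, a common choice when predictions are numeric (shown for contrast).
def squared_loss(y_true, y_pred):
    return (y_true - y_pred) ** 2

print(zero_one_loss("cat", "cat"))   # 0
print(zero_one_loss("cat", "dog"))   # 1
```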

  26. Formalizing induction: Data generating distribution
      • Where does the data come from?
      – Data generating distribution: probability distribution D over (x, y) pairs
      – We don’t know what D is!
      – We get a random sample from it: our training data

  27. Formalizing induction: Expected loss
      • f should make good predictions
      – as measured by loss ℓ
      – on future examples that are also drawn from D
      • Formally: ε, the expected loss of f over D with respect to ℓ, should be small
      ε ≜ 𝔼_{(x,y)∼D} [ ℓ(y, f(x)) ] = Σ_{(x,y)} D(x, y) ℓ(y, f(x))
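The expectation above cannot actually be computed, since D is unknown. The following sketch simulates a made-up D we can sample from, purely to show what the expected loss means as an average over draws from D (distribution, predictor, and numbers are all illustrative).

```python
import random

# Hypothetical data-generating distribution D over (x, y) pairs. We can only
# sample from it here because this is a simulation; in practice D is unknown.
def sample_from_D():
    x = random.uniform(0, 2)
    y = 1 if x >= 1.0 else 0         # the unknown target relationship
    return x, y

def f(x):                            # some learned predictor
    return 1 if x >= 0.9 else 0

def zero_one_loss(y_true, y_pred):
    return 0 if y_true == y_pred else 1

# Approximate E_{(x,y)~D}[ l(y, f(x)) ] by averaging the loss over many
# fresh samples from D.
samples = [sample_from_D() for _ in range(100_000)]
expected_loss = sum(zero_one_loss(y, f(x)) for x, y in samples) / len(samples)
print(round(expected_loss, 3))       # ~0.05: the probability f disagrees with y
```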

  28. Formalizing induction: Training error
      • We can’t compute the expected loss because we don’t know what D is
      • We only have a sample of D
      – training examples {(x_1, y_1), …, (x_N, y_N)}
      • All we can compute is the training error
      ε̂ ≜ (1/N) Σ_{n=1}^{N} ℓ(y_n, f(x_n))
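A small sketch of the training error as the average loss over the N training examples; the toy data, predictor, and loss are illustrative.

```python
def training_error(train_examples, f, loss):
    """Average loss of f over the N training examples: (1/N) * sum loss(y_n, f(x_n))."""
    return sum(loss(y, f(x)) for x, y in train_examples) / len(train_examples)

def zero_one_loss(y_true, y_pred):
    return 0 if y_true == y_pred else 1

train = [(0.2, 0), (0.4, 0), (1.2, 1), (1.8, 1)]
f = lambda x: 1 if x >= 0.5 else 0
print(training_error(train, f, zero_one_loss))   # 0.0 on this toy set
```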

  29. Formalizing Induction
      • Given
      – a loss function ℓ
      – a sample from some unknown data distribution D
      • Our task is to compute a function f that has low expected error over D with respect to ℓ
      𝔼_{(x,y)∼D} [ ℓ(y, f(x)) ] = Σ_{(x,y)} D(x, y) ℓ(y, f(x))

  30. Recap: introducing machine learning What does it mean to “learn by example”? • Classification tasks • Learning requires examples + inductive bias • Generalization vs. memorization • Formalizing the learning problem – Function approximation – Learning as minimizing expected loss

  31. Your tasks before next class • Check out course webpage, Canvas, Piazza • Do the readings • Get started on HW01 – due Wednesday 2:59pm
