

  1. Introduction to machine learning COMS 4721

  2. Learning from data • Machine learning : the study of computational mechanisms that “learn” from data in order to make predictions and decisions.

  3. Example 1: image classification • A birdwatcher takes pictures of birds and sorts the photos by species. • Goal : automatically recognize bird species in new photos. [Photo: an indigo bunting]

  4. Example 2: matchmaking • An online matchmaking service introduces thousands of pairs of students to each other, and receives feedback about whether the pair actually goes on a date or not. • Goal : predict how likely any pair of students will go on a date if introduced to each other. [Table: observed outcomes for introduced pairs of Alice, Bob, Charlie, and Daisy; 1 = went on a date, 0 = did not, blank = pair not introduced]

  5. Example 3: machine translation • Linguists provide translations of all English language books into French, sentence-by-sentence. • Goal : automatically translate any English sentence into French.

  6. Example 4: personalized medicine • A physician attends to patients at a hospital and prescribes treatments on the basis of the patients’ symptoms, medical histories, genetic profiles, etc. The health outcome (e.g., recovery, death) for each patient is observed a day or so after the treatment. • Goal : prescribe a personalized treatment for any patient that delivers the best possible health outcome for that patient.

  7. Basic setting • Data : labeled examples (𝑦₁, 𝑧₁), (𝑦₂, 𝑧₂), …, (𝑦ₙ, 𝑧ₙ) ∈ 𝒴 × 𝒵 • Goal : “learn” a function 𝑔: 𝒴 → 𝒝 from the data that is ultimately used for prediction/decision-making. • 𝑦ⱼ ∈ 𝒴 : representation of the 𝑗-th object ( 𝒴 = input/feature space ) • 𝑧ⱼ ∈ 𝒵 : label pertinent to the 𝑗-th object ( 𝒵 = output/label space ) • 𝒝 : action space (usually 𝒵 = 𝒝 for prediction problems )
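The setting above can be sketched in code. This is a minimal illustration, not anything from the slides: the dataset, feature vectors, and the trivial "learner" (which just predicts the majority label) are all invented for exposition.

```python
# A labeled example is a pair (y, z): an input y from the input space
# and its label z from the label space.
data = [
    ([5.1, 3.5], 0),  # (feature vector, label) -- made-up values
    ([6.2, 2.9], 1),
    ([5.9, 3.0], 1),
]

# A learned function g maps inputs to actions (here, predicted labels).
# This toy "learning algorithm" ignores the inputs and always predicts
# the majority label seen in the data.
def learn_majority_predictor(examples):
    labels = [z for _, z in examples]
    majority = max(set(labels), key=labels.count)
    return lambda y: majority

g = learn_majority_predictor(data)
print(g([5.0, 3.4]))  # prints 1, the majority label in `data`
```

Even this degenerate learner fits the template: it consumes labeled examples and outputs a function 𝑔: 𝒴 → 𝒝.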

  8. Prediction problems • Goal : learn a prediction function ( predictor ) that provides “correct” labels to inputs that may be encountered in the future (i.e., new unlabeled examples ). [Diagram: collection of labeled examples → learning algorithm → learned predictor; new unlabeled example → learned predictor → prediction] Why should this be possible?

  9. Some basic issues 1. How should we represent the input objects? 2. What types of prediction functions should we consider? 3. How should data be used to select a predictor? 4. How can we evaluate whether learning was successful?

  10. Special case: binary classification • 𝒵 = {0, 1} (e.g., is it an indigo bunting or not). Why is this hard? 1. We only have labels for 𝑦₁, …, 𝑦ₙ, which together comprise a minuscule fraction of the input space 𝒴. 2. The relationship between an input 𝑦 ∈ 𝒴 and its correct label 𝑧 ∈ 𝒵 may be complicated, possibly ambiguous/non-deterministic! 3. There can be many functions that perfectly match the input/output relationship on (𝑦₁, 𝑧₁), …, (𝑦ₙ, 𝑧ₙ). How should we pick one among these?
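One very simple way to pick a predictor consistent with the labeled examples is 1-nearest-neighbor, which appears later in the course under non-parametric models. The sketch below is illustrative only; the training points and function names are made up.

```python
import math

# Made-up training set of (feature vector, binary label) pairs.
train = [([0.0, 0.0], 0), ([1.0, 1.0], 1), ([0.9, 1.1], 1)]

def predict_1nn(y):
    # Predict the label of the training input closest (in Euclidean
    # distance) to the new input y.
    nearest_input, nearest_label = min(
        train, key=lambda ex: math.dist(ex[0], y)
    )
    return nearest_label

print(predict_1nn([0.1, 0.2]))  # prints 0 (closest to [0.0, 0.0])
print(predict_1nn([1.0, 0.9]))  # prints 1 (closest to [1.0, 1.0])
```

Note that 1-NN perfectly matches the labels on the training inputs, which is exactly why issue 3 above matters: many other functions would too.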

  11. Topics for this course (tentative) 1. Non-parametric models (e.g., nearest neighbor, decision trees) 2. Parametric models (e.g., generative models, linear models) 3. Ensemble methods (e.g., boosting, hedging) 4. Regression (e.g., least squares, Lasso) 5. Representation learning (e.g., mixture models, PCA, auto-encoders) 6. Other topics as time permits (e.g., sequence models, partial feedback)
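As a preview of one topic above, least squares regression can be run in a few lines of NumPy. This is a toy illustration with made-up data, not material from the slides.

```python
import numpy as np

# Toy dataset: a column of ones (intercept) plus one feature.
# Labels follow z = 1 + 2*x exactly, so least squares should
# recover the coefficients (1, 2).
X = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
z = np.array([1.0, 3.0, 5.0])

# Solve min_w ||X w - z||^2.
w, *_ = np.linalg.lstsq(X, z, rcond=None)
print(np.round(w, 6))  # prints [1. 2.]
```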

  12. A small sample of other topics in ML… • Advanced issues : distributed learning, incomplete data, causal inference, privacy and fairness • Application areas : natural language processing, speech recognition, computer vision, computational advertising • Other models of learning : semi-supervised learning, active learning, online learning, reinforcement learning • Modes of study : mathematical analysis, cross-domain evaluations, end-to-end application study

  13. This course • Mathematical prerequisites : • Multivariable calculus • Linear algebra • Probability (and some basic statistics would be helpful) • Basic data structures and algorithms • Computational prerequisites : • You should have regular access and be able to program in MATLAB, Python, or R. • Course requirements : • Around four homework assignments (theoretical & empirical exercises): 24% • Two in-class exams (March 3, April 28): 25% each • Practical modeling project: 26% • No late assignments accepted, no make-up exams

  14. Resources • Course website : http://www.cs.columbia.edu/~djhsu/coms4721-s16 • Course staff : • Instructor : Daniel Hsu • Instructional assistants : Edward Li, Siddharth Varshney, Robert Ying • Office hours, contact information, online forum : see course website • Course materials : • Course policies : posted on the course website • Lecture slides, notes, etc. : posted on the course website • Readings : “A Course in Machine Learning” and “The Elements of Statistical Learning” (both available free online), as well as other materials posted on the course website
