

  1. Introduction to Data Science Winter Semester 2019/20 Oliver Ernst TU Chemnitz, Fakultät für Mathematik, Professur Numerische Mathematik Lecture Slides

  2. Contents I 1 What is Data Science? 2 Learning Theory 2.1 What is Statistical Learning? 2.2 Assessing Model Accuracy 3 Linear Regression 3.1 Simple Linear Regression 3.2 Multiple Linear Regression 3.3 Other Considerations in the Regression Model 3.4 Revisiting the Marketing Data Questions 3.5 Linear Regression vs. K -Nearest Neighbors 4 Classification 4.1 Overview of Classification 4.2 Why Not Linear Regression? 4.3 Logistic Regression 4.4 Linear Discriminant Analysis 4.5 A Comparison of Classification Methods 5 Resampling Methods Oliver Ernst (NM) Introduction to Data Science Winter Semester 2018/19 3 / 463

  3. Contents II 5.1 Cross Validation 5.2 The Bootstrap 6 Linear Model Selection and Regularization 6.1 Subset Selection 6.2 Shrinkage Methods 6.3 Dimension Reduction Methods 6.4 Considerations in High Dimensions 6.5 Miscellanea 7 Nonlinear Regression Models 7.1 Polynomial Regression 7.2 Step Functions 7.3 Regression Splines 7.4 Smoothing Splines 7.5 Generalized Additive Models 8 Tree-Based Methods 8.1 Decision Tree Fundamentals 8.2 Bagging, Random Forests and Boosting

  4. Contents III 9 Unsupervised Learning 9.1 Principal Components Analysis 9.2 Clustering Methods

  5. Contents 7 Nonlinear Regression Models 7.1 Polynomial Regression 7.2 Step Functions 7.3 Regression Splines 7.4 Smoothing Splines 7.5 Generalized Additive Models

  6. Nonlinear Regression Models Chapter overview • Despite the benefits of simplicity and interpretability of the standard linear model for regression, it will suffer from large bias if the model generating the data depends nonlinearly on the predictors. • In this chapter we explore methods which make the linear regression model more flexible by using linear combinations of nonlinear functions, specifically 1 polynomial and piecewise polynomial functions, 2 piecewise constant functions, 3 piecewise polynomial functions with penalty terms, and 4 generalized additive model functions of the predictors.
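The "linear combinations of nonlinear functions" idea above can be sketched with a minimal example. This is not from the slides: the synthetic data and the piecewise-constant indicator basis are assumptions chosen for illustration. The point is that the model remains linear in the coefficients β_j, so ordinary least squares still applies.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0, 10, 200)
y = np.sin(x) + rng.normal(0, 0.2, 200)

# "Linear combination of nonlinear functions": fit y ≈ Σ_j β_j b_j(x)
# by ordinary least squares, where the b_j are fixed nonlinear basis
# functions -- here piecewise-constant indicators on 5 bins.
knots = np.linspace(0, 10, 6)
B = np.column_stack(
    [(x >= knots[j]) & (x < knots[j + 1]) for j in range(5)]
).astype(float)

beta, *_ = np.linalg.lstsq(B, y, rcond=None)
# With disjoint indicator columns the normal equations decouple,
# so beta[j] is simply the mean of y within bin j.
```

The same least-squares machinery carries over unchanged when the indicator basis is swapped for the polynomial, spline, or GAM bases treated in the rest of the chapter.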

  7. Contents 7 Nonlinear Regression Models 7.1 Polynomial Regression 7.2 Step Functions 7.3 Regression Splines 7.4 Smoothing Splines 7.5 Generalized Additive Models

  8. Nonlinear Regression Models Polynomial Regression • For univariate models, polynomial regression replaces the simple linear regression model Y = β_0 + β_1 X + ε with a polynomial of degree d > 1 in the predictor variable: Y = β_0 + β_1 X + β_2 X^2 + · · · + β_d X^d + ε. • High-degree polynomials are often difficult to handle due to their oscillatory behavior and their unboundedness for large arguments, so that degrees higher than 4 can become problematic if employed naively. • Example: Wage data set: income and demographic information for males who reside in the central Atlantic region of the United States. Fit the response wage [in $1000] to the predictor age by LS using a polynomial of degree d = 4.
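A degree-4 least-squares polynomial fit like the one described for the Wage data can be sketched as follows. This is a hedged illustration, not the slides' own code: synthetic age/wage values stand in for the actual Wage data set (which ships with the ISLR materials), and the standardization of age is an added numerical precaution for conditioning, not part of the slide.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the Wage data: wage (in $1000) vs. age,
# drawn from a smooth curve plus noise.
age = rng.uniform(18, 80, 300)
wage = 50 + 4.0 * (age - 18) * np.exp(-0.04 * (age - 18)) \
       + rng.normal(0, 10, 300)

# Least-squares fit of a degree-4 polynomial, as on the slide.
# Design matrix columns: 1, x, x^2, x^3, x^4, built on the
# standardized predictor to keep the system well conditioned.
d = 4
z = (age - age.mean()) / age.std()
X = np.vander(z, d + 1, increasing=True)
beta, *_ = np.linalg.lstsq(X, wage, rcond=None)

# Fitted wage curve evaluated on a grid of ages.
grid = np.linspace(18, 80, 100)
zg = (grid - age.mean()) / age.std()
wage_hat = np.vander(zg, d + 1, increasing=True) @ beta
```

On the real data one would typically use a regression package (e.g. R's `poly()` or scikit-learn's `PolynomialFeatures`), which additionally provides standard errors and orthogonalized polynomial bases.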
