Intro to ML
March 10, 2020 Data Science CSCI 1951A Brown University Instructor: Ellie Pavlick HTAs: Josh Levin, Diane Mutako, Sol Zitter
Announcements: This class is going viral! (Funny? No? Too soon?) Not officially, but starting to … further notice.
blocks, conceptual background
(a) None at all. I have obviously heard of ML but I’ve never really dealt with it.
(b) Small amount of informal experience. I’ve read articles/blog posts and gotten the gist of how it works.
(c) Like (b), but I’ve followed along and coded some models myself.
(d) Comfortable. I’ve taken an ML class.
(e) Very comfortable. I’ve taken an ML class/classes and I’ve built models myself for research projects or internships.
(a) Mostly “conventional” ML (b) Mostly deep learning (c) Equally comfortable with both (d) Not comfortable with either
Goal/Task · Data · Model

Goal/Task: Prediction of some kind, e.g.: …
Data: Can be anything. Usually data size and/or representation is the limiting factor.
Model: Decisions about how the problem is structured AND how to estimate parameters.
https://youtu.be/bq2_wSsDwkQ?t=682
Task = Increase Consumption
Data = Reading Habits
Model = ?
The task needs to be quantifiable (and, right now, usually differentiable).
Task = Increase Consumption
Data = Reading Habits
Model:
  Objective/Loss Function = ???
  Features = ???
your clickbait farm vs. pulitzer-prize worthy publication
(difference between predicted and true value)
Task = Increase Consumption
Data = Reading Habits
Model:
  Objective/Loss Function = squared difference between predicted total number of clicks and actual total number of clicks
  Features = ???
ever-present cookies and remote control of webcam user-consented GDPR-compliant data usage agreements …
Clicks    Recency   Reading Level   Photo   Title
10        1.3       11              1       “New Tax Guidelines”
1000      1.7       3               1       “This 600lb baby…”
1000000   2.4       2               1       “18 reasons you should never look at this cat unless you…”
1         5.9       19                      “The Brothers Karamazov: a neo-post-globalist perspective”

y = Clicks (the output)
x = everything else (the inputs)

numeric features — defined for (nearly) every row
boolean features — 0 or 1 (“dummy” variables)
strings = boolean features — 0 or 1 (“dummy” variables), one column per word:

Clicks    Recency   Reading Level   Photo   Title:“new”   Title:“tax”   Title:“this”   …
10        1.3       11              1       1             1                            …
1000      1.7       3               1                                   1              …
1000000   2.4       2               1                                   1              …
1         5.9       19                                                                 …

“sparse features” — 0 for most rows
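The string-to-dummy-variable expansion can be sketched in plain Python. The titles come from the table above; the tokenizer and column names are illustrative choices, not the course's code:

```python
# Turn each title into sparse boolean "bag of words" features, one column per word.
titles = [
    "New Tax Guidelines",
    "This 600lb baby...",
    "18 reasons you should never look at this cat unless you...",
    "The Brothers Karamazov: a neo-post-globalist perspective",
]

def tokenize(title):
    # Lowercase and strip punctuation so "Tax" and "tax" share a column.
    return [w.strip(".:,") for w in title.lower().split()]

vocab = sorted({w for t in titles for w in tokenize(t)})

# Sparse representation: most entries are 0, so store only the 1s.
rows = [{f'Title:"{w}"': 1 for w in tokenize(t)} for t in titles]

print(len(vocab))  # 21 distinct words -> 21 boolean columns
print(rows[0])     # {'Title:"new"': 1, 'Title:"tax"': 1, 'Title:"guidelines"': 1}
```

In practice a library helper (e.g. a count vectorizer) does the same thing; the point is that each distinct word becomes its own 0/1 column.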
Y: happiness
X1: day of week (“monday”, “tuesday”, … “sunday”)
X2: bank account balance (real value)
X3: breakfast (yes/no)
X4: whether you have found your inner peace (yes/no)
X5: words from last week’s worth of tweets (assuming tweets are at most 15 words long and there are 100K words in the English vocabulary)

7 + 1 + 1 + 1 + 100,000 = 100,010 features
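Tallying the dimensions (the 100K-word bag-of-words dominates everything else):

```python
# Dimensionality of the happiness feature vector:
# one indicator per category for X1, one column each for X2-X4,
# and one indicator per vocabulary word for X5.
dims = {
    "X1 day of week": 7,        # 7 dummy variables
    "X2 bank balance": 1,       # one real value
    "X3 breakfast": 1,          # one boolean
    "X4 inner peace": 1,        # one boolean
    "X5 tweet words": 100_000,  # one indicator per vocab word (sparse!)
}
total = sum(dims.values())
print(total)  # 100010
```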
Task = Increase Consumption
Data = Reading Habits
Model:
  Objective/Loss Function = squared difference between predicted total number of clicks and actual total number of clicks
  Features = {Recency:float, ReadingLevel:Int, Photo:Bool, Title_New:Bool, Title_Tax:Bool, …}
ML = Function Approximation

You define inputs and outputs. (The really hard part.)
The machine will (ideally) learn the function (with a lot of help from you). (The part that gets the most attention.)
[scatter plot: clicks vs. reading level]

Regression: continuous (infinite) output. f(reading level) = # of clicks
Classification: discrete (finite) output. f(reading level) = {clicked, not clicked}

clicks = m × (reading level) + b

Linear Regression —> the specific “model” we are using here.
clicks —> output/labels/target
reading level —> the “feature” which is observed/derived from the data
m and b —> the “parameters” which need to be set (by looking at data): “setting parameters”, “learning”, “training”, “estimation”
parameter values —> “weights”, “coefficients”
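A minimal sketch of fitting clicks = m × (reading level) + b. The data points are invented for illustration; np.polyfit does the least-squares parameter estimation:

```python
import numpy as np

# Toy data (invented): harder reading levels tend to get fewer clicks.
reading_level = np.array([2.0, 3.0, 5.0, 8.0, 11.0, 19.0])
clicks = np.array([900.0, 750.0, 500.0, 300.0, 120.0, 10.0])

# np.polyfit with deg=1 returns the least-squares slope m and intercept b.
m, b = np.polyfit(reading_level, clicks, deg=1)

predicted = m * reading_level + b
print(round(m, 1), round(b, 1))  # negative slope, large positive intercept
```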
Task = Increase Consumption
Data = Reading Habits
Model = Linear Regression
Features = {Recency:float, ReadingLevel:Int, Photo:Bool, Title_New:Bool, Title_Tax:Bool, …}
Objective/Loss Function = squared difference between predicted total number of clicks and actual total number of clicks
MSE = 10
Train MSE = 6

Test: What should we expect MSE to do?
If your model isn’t “right” yet (i.e. in practice, most of the time) …
If your model is “right” … enough (i.e. can’t memorize training data).

Train MSE = 4, Test MSE = 12
Test MSE = 14

The problem gets worse as models get more powerful/flexible.
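The train/test pattern can be sketched like this (toy data; the slide’s MSE numbers are illustrative, not reproduced here). A more flexible model always fits the training set at least as well, while held-out error need not improve:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y is actually linear in x, plus noise.
x = rng.uniform(0, 10, 40)
y = 3 * x + 5 + rng.normal(0, 4, 40)

# Held-out split: fit on train, report on test.
x_train, x_test = x[:30], x[30:]
y_train, y_test = y[:30], y[30:]

def mse(deg):
    # Fit a degree-`deg` polynomial on train; return (train MSE, test MSE).
    p = np.poly1d(np.polyfit(x_train, y_train, deg))
    return (np.mean((p(x_train) - y_train) ** 2),
            np.mean((p(x_test) - y_test) ** 2))

for deg in (1, 5):
    train_mse, test_mse = mse(deg)
    print(deg, round(train_mse, 1), round(test_mse, 1))
```

The degree-5 fit’s train MSE is guaranteed to be no worse than the line’s (its hypothesis space contains every line); its test MSE typically is worse.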
Regression Analysis in Stats vs. Regression in ML

Stats: Make claims about whether there is a meaningful relationship between X and Y.
ML: Given X, predict Y; deploy a model to make predictions for new inputs.

Stats: (Often) interested in causation; focus on controls and removing collinearity.
ML: Focused on prediction accuracy; exploiting correlation is totally fine.

Stats: A “result” is typically in the form of a significant relationship and/or practically relevant effect size.
ML: A “result” is typically in the form of performance on a (held-out) test set.

Stats: Avoid overfitting by preferring simple models; adjust for “degrees of freedom” when computing p values.
ML: Avoid overfitting through regularization; use train/test splits and report test performance.

But! These are the same model. These differences are “in general”/“by convention”, not anything fundamental.

Different scientific communities with different goals. (And different software packages :)) R, statsmodels, Stata on the stats side; sklearn, MATLAB, PyTorch on the ML side.

In the limit, I think these goals are the same. Even if we care about prediction (and we want to do it using as few models as possible), shouldn’t we get the best performance by modeling the “true” underlying process? Isn’t it the case that correct explanatory/causal models necessarily make right predictions, but not vice-versa?

Counter argument: You can get perfect* predictive performance with the wrong model. We were extremely good at predicting whether objects would fall or float long before we knew about gravity. Explanatory/causal models are hard! We might never get them, so maybe prediction should lead, and theory/explanation will follow?
minimize Q = \sum_{i=1}^{n} (Y_i - b - mX_i)^2

Solution:
m = \frac{\mathrm{Cov}(X, Y)}{\mathrm{Var}(X)} \qquad b = \bar{Y} - m\bar{X}

Derivation: set the partial derivatives to zero.
\frac{\partial Q}{\partial m} = \sum_{i=1}^{n} -2X_i(Y_i - b - mX_i) = 0
https://independentseminarblog.com/2018/01/12/moving-below-the-surface-3-gradient-descent-william/
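The link above explains gradient descent, which follows the same derivatives downhill instead of solving them for zero. A minimal sketch on toy data (the learning rate and iteration count are arbitrary choices, not from the lecture):

```python
import numpy as np

# Toy data generated from y = 2x + 1 plus a little noise.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, 100)
Y = 2 * X + 1 + rng.normal(0, 0.1, 100)

m, b = 0.0, 0.0
lr = 0.1  # learning rate (arbitrary choice)
for _ in range(500):
    resid = Y - b - m * X
    # Gradients of Q = sum((Y_i - b - m X_i)^2), scaled by 1/n for stability.
    grad_m = np.mean(-2 * X * resid)
    grad_b = np.mean(-2 * resid)
    m -= lr * grad_m
    b -= lr * grad_b

print(round(m, 2), round(b, 2))  # close to the true slope 2 and intercept 1
```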
\frac{\partial Q}{\partial b} = \sum_{i=1}^{n} -2(Y_i - mX_i - b) = 0

Helpful equations for following along in the jupyter notebook:

m = \frac{\mathrm{Cov}(X, Y)}{\mathrm{Var}(X)} \qquad b = \bar{Y} - m\bar{X}
<latexit sha1_base64="hwHBCHxlXmNAb1i+1nKjhGPDZ7s=">ACAHicbZDLSgMxFIbP1Fut1EXLtwEi+DGMiOCboSiG5cV7EXaoWTSTBuaZIYkI5RhNr6KGxeKuPUx3Pk2peFtv4Q+PjPOZycP0w408bzvp3C0vLK6lpxvbSxubW94+7uNXScKkLrJOaxaoVYU84krRtmOG0limIRctoMhzfjevORKs1ieW9GCQ0E7ksWMYKNtbruQYiuUCfEKnvI0SkSU27lXbfsVbyJ0CL4MyjDTLWu+9XpxSQVBrCsdZt30tMkGFlGOE0L3VSTRNMhrhP2xYlFlQH2eSAHB1bp4eiWNknDZq4vycyLQeidB2CmwGer42Nv+rtVMTXQYZk0lqCTRVHKkYnROA3UY4oSw0cWMFHM/hWRAVaYGJtZyYbgz5+8CI2ziu9V/LvzcvV6FkcRDuEITsCHC6jCLdSgDgRyeIZXeHOenBfn3fmYthac2cw+/JHz+QOthJUt</latexit><latexit sha1_base64="hwHBCHxlXmNAb1i+1nKjhGPDZ7s=">ACAHicbZDLSgMxFIbP1Fut1EXLtwEi+DGMiOCboSiG5cV7EXaoWTSTBuaZIYkI5RhNr6KGxeKuPUx3Pk2peFtv4Q+PjPOZycP0w408bzvp3C0vLK6lpxvbSxubW94+7uNXScKkLrJOaxaoVYU84krRtmOG0limIRctoMhzfjevORKs1ieW9GCQ0E7ksWMYKNtbruQYiuUCfEKnvI0SkSU27lXbfsVbyJ0CL4MyjDTLWu+9XpxSQVBrCsdZt30tMkGFlGOE0L3VSTRNMhrhP2xYlFlQH2eSAHB1bp4eiWNknDZq4vycyLQeidB2CmwGer42Nv+rtVMTXQYZk0lqCTRVHKkYnROA3UY4oSw0cWMFHM/hWRAVaYGJtZyYbgz5+8CI2ziu9V/LvzcvV6FkcRDuEITsCHC6jCLdSgDgRyeIZXeHOenBfn3fmYthac2cw+/JHz+QOthJUt</latexit><latexit sha1_base64="hwHBCHxlXmNAb1i+1nKjhGPDZ7s=">ACAHicbZDLSgMxFIbP1Fut1EXLtwEi+DGMiOCboSiG5cV7EXaoWTSTBuaZIYkI5RhNr6KGxeKuPUx3Pk2peFtv4Q+PjPOZycP0w408bzvp3C0vLK6lpxvbSxubW94+7uNXScKkLrJOaxaoVYU84krRtmOG0limIRctoMhzfjevORKs1ieW9GCQ0E7ksWMYKNtbruQYiuUCfEKnvI0SkSU27lXbfsVbyJ0CL4MyjDTLWu+9XpxSQVBrCsdZt30tMkGFlGOE0L3VSTRNMhrhP2xYlFlQH2eSAHB1bp4eiWNknDZq4vycyLQeidB2CmwGer42Nv+rtVMTXQYZk0lqCTRVHKkYnROA3UY4oSw0cWMFHM/hWRAVaYGJtZyYbgz5+8CI2ziu9V/LvzcvV6FkcRDuEITsCHC6jCLdSgDgRyeIZXeHOenBfn3fmYthac2cw+/JHz+QOthJUt</latexit><latexit 
sha1_base64="hwHBCHxlXmNAb1i+1nKjhGPDZ7s=">ACAHicbZDLSgMxFIbP1Fut1EXLtwEi+DGMiOCboSiG5cV7EXaoWTSTBuaZIYkI5RhNr6KGxeKuPUx3Pk2peFtv4Q+PjPOZycP0w408bzvp3C0vLK6lpxvbSxubW94+7uNXScKkLrJOaxaoVYU84krRtmOG0limIRctoMhzfjevORKs1ieW9GCQ0E7ksWMYKNtbruQYiuUCfEKnvI0SkSU27lXbfsVbyJ0CL4MyjDTLWu+9XpxSQVBrCsdZt30tMkGFlGOE0L3VSTRNMhrhP2xYlFlQH2eSAHB1bp4eiWNknDZq4vycyLQeidB2CmwGer42Nv+rtVMTXQYZk0lqCTRVHKkYnROA3UY4oSw0cWMFHM/hWRAVaYGJtZyYbgz5+8CI2ziu9V/LvzcvV6FkcRDuEITsCHC6jCLdSgDgRyeIZXeHOenBfn3fmYthac2cw+/JHz+QOthJUt</latexit>129
130
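These closed-form least-squares estimates can be sketched in a few lines of Python. This is a minimal illustration (function name and sample data are my own, not from the slides): the slope m is the sum of centered cross-products over the sum of squared deviations in X, and the intercept is b = Ȳ − mX̄.

```python
def fit_line(xs, ys):
    """Closed-form simple linear regression: returns (slope m, intercept b)."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # Slope: centered cross-products over squared deviations in X.
    m = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
        / sum((x - x_bar) ** 2 for x in xs)
    # Intercept: the fitted line passes through the mean point (X̄, Ȳ).
    b = y_bar - m * x_bar
    return m, b

# Data generated from y = 2x + 1, so the fit recovers m = 2, b = 1.
m, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```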