SLIDE 1

CS 147: Computer Systems Performance Analysis
Examples Using a Distributed File System

CS147, 2015-06-15

SLIDE 2

Velilind’s Laws of Experimentation

◮ If reproducibility may be a problem, conduct the test only once
◮ If a straight-line fit is required, obtain only two data points

SLIDE 3

Overview

Overview of the Ficus File System
  Characteristics
  Performance Issues
Measured Data
  Measurement Methodology
  Raw Results
Data Analysis
  What Can Be Analyzed?
  Sample Analysis
  Quality of the Analysis
  Visual Tests
A Bad Example

SLIDE 4

Overview of the Ficus File System

What is Ficus?

◮ Distributed, replicated file system
◮ Individual computers store replicas of shared files
  ◮ Fast local access
  ◮ Shared data
◮ Designed for robustness in face of network disconnections
◮ Anyone can write any file, any time

SLIDE 5

Overview of the Ficus File System: Characteristics

Propagation

◮ Any update generates a “best-effort” propagation message
  ◮ Generated on every write system call
  ◮ Broadcast to all known replicas
  ◮ Notifies of change, not contents
◮ Receiving site can ignore or can request latest version of file from generating site
  ◮ Only when no conflict

SLIDE 6

Overview of the Ficus File System: Characteristics

Reconciliation

◮ Correctness guarantees provided by reconciliation process
  ◮ Runs periodically
  ◮ Operates between pair of replicas
    ◮ Transfers data in one direction only
◮ Complex distributed algorithm
  ◮ Proven to terminate correctly
  ◮ Data is guaranteed to eventually get everywhere

SLIDE 7

Overview of the Ficus File System: Characteristics

Garbage Collection

◮ Tricky to get deletion right
  ◮ Example: Joe deletes foo while Mary renames it to bar
◮ Need to globally agree that all names are gone
◮ Requires complex two-phase distributed algorithm

SLIDE 8

Overview of the Ficus File System: Performance Issues

Ficus Performance

◮ File access (open) performance
◮ Read/write performance
◮ Aspects of deletion
◮ Reconciliation
◮ Cross-machine interference

SLIDE 9

Overview of the Ficus File System: Performance Issues

Open Performance

◮ Opening file requires:
  ◮ Finding file
  ◮ Checking for conflicts
  ◮ Local or remote (NFS-like) open
◮ Finding file requires:
  ◮ Local or remote root access
  ◮ Tracing path, changing machines as needed
◮ Other steps are basically one remote procedure call (RPC, one message exchange) each

SLIDE 10

Overview of the Ficus File System: Performance Issues

Read/Write Performance

◮ Reading is same as local or NFS operation
◮ Write is like local or NFS, plus:
  ◮ Propagation (small outgoing packet)
  ◮ Attribute update (beyond i-node update)

SLIDE 11

Overview of the Ficus File System: Performance Issues

Deletion

◮ Initially removing a file is reasonably cheap
  ◮ Mark deleted
  ◮ Remove from visible namespace
  ◮ May actually be cheaper than UFS unlink
◮ True cost is garbage collection
  ◮ How long is space consumed?
  ◮ CPU cost?
  ◮ Still have to do unlink equivalent someday

SLIDE 12

Overview of the Ficus File System: Performance Issues

Reconciliation

◮ Runs periodically
  ◮ Mechanism to suppress under high load
◮ Must check every file
  ◮ If updated, exchange info with remote
  ◮ May also transfer data
  ◮ Special handling, but similar, for new/deleted files
◮ Primary cost is checking what’s updated

SLIDE 13

Overview of the Ficus File System: Performance Issues

Cross-Machine Interference

◮ If you store a replica, you pay some costs:
  ◮ Receiving propagation requests
  ◮ Running reconciliation as client and server
  ◮ Servicing remote access requests

SLIDE 14

Measured Data: Measurement Methodology

Ficus Measurement Methodology

◮ Two classes of measurement
  ◮ Local replica
  ◮ Interference with remote replicas
◮ Set up test volume
◮ Populate with files
◮ Run several “standard” benchmarks
◮ Destroy volume after test

SLIDE 15

Measured Data: Measurement Methodology

Benchmarks Used

◮ Eight benchmarks: cp, find, findgrep, grep, ls, mab, rcp, rm
◮ Most did single operation implied by name
  ◮ cp copied locally within volume
  ◮ rcp copied from remote machine
  ◮ findgrep essentially did recursive grep
  ◮ mab, the Modified Andrew Benchmark, did a more complex compile-edit-debug simulation

SLIDE 16

Measured Data: Measurement Methodology

Local-Replica Measurements

◮ Set up UFS, remotely-accessed NFS, or Ficus volume
  ◮ Ficus volume varies from 1 to 8 replicas
◮ Run benchmarks on machine that stores local copy (except for NFS tests)
◮ Ignore effect on machines holding other replicas

SLIDE 17

Measured Data: Measurement Methodology

Interference Measurements

◮ Set up UFS volume on “interfered” machine
◮ On 1 to 3 other machines, set up 2-replica Ficus volume
  ◮ Unique volume for each machine
  ◮ Second replica stored on “interfered” machine
◮ Run all 8 benchmarks simultaneously on all machines
◮ Compare UFS time to uninterfered version

SLIDE 18

Measured Data: Raw Results

Example of Raw Ficus Results

.../RESULTS/950531.211023/benchtimes:ficus mab 2 162.9 real 83.2 user 40.9 sys

◮ Test was run on May 31, 1995, at 21:10:23
◮ Ficus test with MAB benchmark, 2 replicas
◮ 162.9 seconds for run; 83.2 user time, 40.9 charged to system
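Records like the one above are easy to post-process. A minimal parser sketch in Python, assuming every benchtimes record follows the field layout of the single example shown (filesystem, benchmark, replica count, then real/user/sys times); the function name is hypothetical:

```python
def parse_benchline(line):
    """Parse one benchtimes record such as
    'ficus mab 2 162.9 real 83.2 user 40.9 sys'.
    Field layout is assumed from the one example on the slide."""
    fs, bench, repls, real, _, user, _, sys_, _ = line.split()
    return {"fs": fs, "bench": bench, "replicas": int(repls),
            "real": float(real), "user": float(user), "sys": float(sys_)}

rec = parse_benchline("ficus mab 2 162.9 real 83.2 user 40.9 sys")
```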

SLIDE 19

Data Analysis: What Can Be Analyzed?

The “Standard” Analysis

◮ Everybody publishes means, usually in nice tables or graphs
◮ Standard deviations are becoming fairly common
◮ Sometimes they even tell you how many runs they did
  ◮ Allows you to generate confidence intervals

SLIDE 20

Data Analysis: What Can Be Analyzed?

Earning Some Self-Respect

◮ You should always provide the reader or listener with at least:
  ◮ A mean of a specified number of runs
  ◮ A confidence interval at 90% or higher
  ◮ An analysis of whether the results are meaningful
◮ Standard deviations are nice, but not as important as confidence intervals

SLIDE 21

Data Analysis: What Can Be Analyzed?

Learning Something About the System

◮ Use confidence intervals to compare various parameters and results
◮ Consider whether regression is meaningful
◮ Can you do multivariate regression?
◮ What about ANOVA?

SLIDE 22

Data Analysis: Sample Analysis

Sample Analysis of Ficus Experiments

◮ We will consider only the “cp” benchmark
◮ Local-replica tests only
◮ Questions to ask:
  ◮ Is Ficus significantly slower than UFS?
  ◮ Is Ficus faster than NFS remote access?
  ◮ What is the cost of adding a remote replica?

SLIDE 23

Data Analysis: Sample Analysis

Some Raw “cp” Benchmark Data

Repls  Real    Repls  Real    Repls  Real
1      179.5   2      189.0   3      178.0
1      193.2   2      246.6   3      207.9
1      197.4   2      227.7   3      202.0
1      231.8   2      275.2   3      213.9
1      202.4   2      203.8   3      218.2
1      180.3   2      235.3   3      249.0
1      222.1   2      199.9   3      207.6
1      186.2   2      168.6   3      213.2
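The 1-replica column of this table is enough to practice the mean-plus-confidence-interval computation the deck calls for. A Python sketch; the t quantile is taken from a standard table (1.895 for 90% two-sided confidence with 7 degrees of freedom), and the resulting interval comes out close to, though not identical to, the (188.0, 210.2) quoted on the next slide, whose exact derivation the deck does not show:

```python
import statistics

def mean_ci(samples, t_value):
    """Mean and two-sided confidence interval for a small sample.

    t_value is the Student's t quantile t[1 - alpha/2; n - 1] looked
    up from a table (e.g. 1.895 for 90% confidence with n = 8 runs)."""
    n = len(samples)
    mean = statistics.fmean(samples)
    s = statistics.stdev(samples)        # sample standard deviation
    half = t_value * s / n ** 0.5        # half-width of the interval
    return mean, (mean - half, mean + half)

# The eight 1-replica "cp" runs from the table above
one_replica = [179.5, 193.2, 197.4, 231.8, 202.4, 180.3, 222.1, 186.2]
mean, (lo, hi) = mean_ci(one_replica, t_value=1.895)  # t[0.95; 7]
print(f"mean {mean:.1f}, 90% CI ({lo:.1f}, {hi:.1f})")
```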

SLIDE 24

Data Analysis: Sample Analysis

Is Ficus Slower Than UFS?

◮ UFS “cp” benchmark, 90% confidence interval is (167.6, 186.6), mean 177.1
◮ Fastest Ficus (1-replica) “cp” 90% confidence interval is (188.0, 210.2), mean 199.1
◮ Non-overlapping intervals ⇒ meaningful difference; Ficus is indeed slower
◮ Results might differ at higher confidence
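The non-overlap test used here is mechanical; a small sketch, with interval endpoints copied from the slide:

```python
def intervals_overlap(a, b):
    """True if two confidence intervals (lo, hi) overlap.
    Non-overlap at a given confidence level means the difference
    between the two means is statistically meaningful."""
    return a[0] <= b[1] and b[0] <= a[1]

ufs = (167.6, 186.6)     # UFS "cp", 90% CI from the slide
ficus1 = (188.0, 210.2)  # 1-replica Ficus "cp", 90% CI
print(intervals_overlap(ufs, ficus1))  # False: the gap is meaningful
```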

SLIDE 25

Data Analysis: Sample Analysis

Is Ficus Faster Than Remote NFS?

◮ NFS interval is (231.0, 259.3), mean 245.2
◮ 3-replica Ficus is (198.1, 224.4) at 90%
◮ So Ficus is definitely faster
◮ (Incidentally, result would have held even at higher confidence)

SLIDE 26

Data Analysis: Sample Analysis

What is the Cost of a New Replica?

◮ Do regression on data from 1 to 8 replicas
◮ Note that some tests have different numbers of runs
  ◮ Regression on means will give incorrect results
  ◮ Proper method: regress on raw observations, with repeated x values
◮ Time = 20.25 × replicas + 168.12
◮ So each replica slows “cp” by about 20 seconds
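The “regress on raw observations” advice can be sketched directly. Note the slide’s coefficients (20.25, 168.12) come from all 44 runs over 1 to 8 replicas; fed only the 24 runs visible on the data slide, the same code necessarily produces different coefficients:

```python
def linear_regression(xs, ys):
    """Ordinary least squares on raw (x, y) observations.

    Repeated x values are handled naturally: every run is its own
    observation, which is the correct way to fit benchmark data
    (regressing on per-x means would weight the fit incorrectly)."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
         sum((x - xbar) ** 2 for x in xs)
    b0 = ybar - b1 * xbar
    return b0, b1

# The 24 raw runs shown on the data slide (replicas 1-3 only)
repls = [1] * 8 + [2] * 8 + [3] * 8
times = ([179.5, 193.2, 197.4, 231.8, 202.4, 180.3, 222.1, 186.2] +
         [189.0, 246.6, 227.7, 275.2, 203.8, 235.3, 199.9, 168.6] +
         [178.0, 207.9, 202.0, 213.9, 218.2, 249.0, 207.6, 213.2])
b0, b1 = linear_regression(repls, times)
```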

SLIDE 27

Data Analysis: Quality of the Analysis

Allocation of Variation for “cp” Benchmark

◮ SSR = 76590.40, SSE = 61437.46
◮ R² = SSR/(SSR + SSE) = 0.55
  ◮ So regression explains only 55% of variation
◮ Standard deviation of errors:

  s_e = √(SSE/(n − 2)) = √(61437.46/42) = 38.25

◮ Relatively high value compared to mean of 240.4
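The SSR/SSE bookkeeping on this slide follows directly from the fitted line. A sketch; fed the deck’s full 44-run dataset (not reproduced here) it should yield the slide’s SSR = 76590.40 and SSE = 61437.46, and the toy call below is just a self-check:

```python
def allocation_of_variation(xs, ys):
    """SSR, SSE, R^2 = SSR/(SSR + SSE), and s_e = sqrt(SSE/(n - 2))
    for a simple least-squares fit, as on the slide."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    b0 = ybar - b1 * xbar
    ssr = sum((b0 + b1 * x - ybar) ** 2 for x in xs)
    sse = sum((y - b0 - b1 * x) ** 2 for x, y in zip(xs, ys))
    return ssr, sse, ssr / (ssr + sse), (sse / (n - 2)) ** 0.5

# Toy data for illustration only
ssr, sse, r2, se = allocation_of_variation([1, 2, 3, 4], [2, 4, 5, 7])
```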

SLIDE 28

Data Analysis: Quality of the Analysis

Confidence Intervals for “cp” Regression

◮ s_b0 = 11.53, s_b1 = 2.80
◮ Using 90% confidence level, t[0.95; 42] = 1.68
◮ This gives b0 = (148.7, 187.5), while b1 = (15.5, 25.0)
◮ Standard deviation for 9-replica prediction, single observation, is s_e × 1.09 = 41.6
◮ Using same t, interval is (308.8, 391.9)
◮ Compare to narrower 1-replica interval
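The coefficient intervals come from the usual s_b0 and s_b1 formulas. A sketch, assuming simple (one-predictor) regression and a table-supplied t quantile; the toy call is for illustration, while the slide’s (148.7, 187.5) and (15.5, 25.0) would come from the full 44-run dataset:

```python
def coef_conf_intervals(xs, ys, t_value):
    """Confidence intervals for intercept b0 and slope b1 of a simple
    linear regression: b +/- t * s_b, with
    s_b1 = s_e / sqrt(Sxx) and s_b0 = s_e * sqrt(1/n + xbar^2/Sxx)."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    b0 = ybar - b1 * xbar
    sse = sum((y - b0 - b1 * x) ** 2 for x, y in zip(xs, ys))
    se = (sse / (n - 2)) ** 0.5          # std deviation of errors
    sb1 = se / sxx ** 0.5
    sb0 = se * (1 / n + xbar ** 2 / sxx) ** 0.5
    return ((b0 - t_value * sb0, b0 + t_value * sb0),
            (b1 - t_value * sb1, b1 + t_value * sb1))

# Toy data for illustration only
ci_b0, ci_b1 = coef_conf_intervals([1, 2, 3, 4], [2, 4, 5, 7], t_value=2.0)
```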

SLIDE 29

Data Analysis: Visual Tests

Scatter Plot of “cp” Regression

[Figure: scatter plot for the “cp” regression; x-axis replicas (2–8), y-axis time in seconds (100–400).]

SLIDE 30

Data Analysis: Visual Tests

Error Scatter of “cp” Regression

[Figure: error residuals (−100 to 100) plotted against predicted response (150–350).]

SLIDE 31

Data Analysis: Visual Tests

Error Scatter by Experiment Number

[Figure: error residuals (−100 to 100) plotted against experiment number (10–40).]

SLIDE 32

Data Analysis: Visual Tests

Quantile-Quantile Plot of Errors

[Figure: quantile-quantile plot of error residuals (−100 to 100) against normal quantiles (−2 to 2).]
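A normal quantile-quantile plot like this one pairs the sorted residuals with theoretical normal quantiles; a roughly straight line suggests normally distributed errors. A sketch of computing the points (the plotting itself is left to any charting tool):

```python
from statistics import NormalDist

def qq_points(residuals):
    """(theoretical, observed) quantile pairs for a normal Q-Q plot.

    Uses quantile positions (i + 0.5)/n; if the points fall on a
    straight line, the residuals are plausibly normal."""
    n = len(residuals)
    nd = NormalDist()                      # standard normal
    obs = sorted(residuals)
    theo = [nd.inv_cdf((i + 0.5) / n) for i in range(n)]
    return list(zip(theo, obs))

pts = qq_points([5.0, -3.0, 0.0, 3.0, -5.0])
```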

SLIDE 33

Data Analysis: Visual Tests

F-test for Ficus “cp” Regression

◮ SSR = 76590.40, SSE = 61437.46
◮ MSR = SSR/k = SSR
◮ MSE = SSE/(n − k − 1) = SSE/42 = 1462.80
◮ Computed F value = MSR/MSE = 52.36
◮ Table value F[0.9; 1; 42] = 2.83
  ◮ Regression explains significant part of variation
  ◮ Would have passed F-test at 99% level as well
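The F computation above is a one-liner. A sketch reproducing the slide’s arithmetic:

```python
def f_test(ssr, sse, n, k=1):
    """Computed F statistic for a regression with k predictors.

    MSR = SSR/k and MSE = SSE/(n - k - 1); compare the result against
    the tabulated F[1 - alpha; k; n - k - 1] quantile."""
    msr = ssr / k
    mse = sse / (n - k - 1)
    return msr / mse

f = f_test(76590.40, 61437.46, n=44)  # slide's SSR and SSE, 44 runs
print(round(f, 2))  # 52.36, as on the slide
```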

SLIDE 34

Data Analysis: Visual Tests

Summary of Ficus “cp” Analysis

◮ Ficus costs something compared to UFS, but is much faster than remote access via NFS
◮ Adding one replica costs around 20 seconds
  ◮ Wide confidence intervals make this uncertain
  ◮ Removing outliers might greatly improve confidence
  ◮ Regression quality questionable (with outliers)

SLIDE 35

A Bad Example

Regression Digression: A Bad Example

The following graph appeared in the July 1996 issue of Computer Communications Review:

[Figure: time to fetch in seconds (1–5) against file size in bytes (5000–15000), with a fitted linear model overlaid.]

SLIDE 36

A Bad Example

Inappropriate Use of Regression

Just calculating R² would have shown the problem:

[Figure: the same scatter plot, annotated with the fit y = 1E-05x + 1.3641 and R² = 0.0033.]
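The lesson generalizes: compute R² before believing any fitted line. A sketch with hypothetical data shaped like the published graph (fetch times that do not actually depend on file size), where R² comes out tiny:

```python
def r_squared(xs, ys):
    """Coefficient of determination for a simple linear fit;
    a value near zero means the line explains almost nothing."""
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    sst = sum((y - ybar) ** 2 for y in ys)
    return (sxy ** 2 / sxx) / sst   # SSR / SST for the OLS line

# Hypothetical points mimicking the published graph: fetch time
# bounces around independently of file size
sizes = [5000, 7000, 9000, 11000, 13000, 15000]
times = [1.2, 3.9, 1.5, 4.1, 1.3, 3.8]
print(r_squared(sizes, times))  # about 0.085: the fit is meaningless
```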

SLIDE 37

A Bad Example

The Tale of the Residuals

Plot of residuals also shows data isn’t homoscedastic:

[Figure: error residuals in seconds (−2 to 3) against file size in bytes (5000–15000).]