Data Analytics and Machine Learning
Cheng Zihan Hor Jasrene Joshua Tan
Content Outline
1. Introduction
2. Aims & Objectives
3. Methodology
4. Results & Discussion
5. Conclusion
What is Machine Learning?
▰ Application of AI
▰ Identifies patterns & makes decisions based on past data
▰ Analyses & interprets trends to generate new insights
▰ Handles massive data without human intervention
▰ Online Fraud Detection
▰ Virtual Personal Assistants
▰ Facial Recognition
▰ Malware Filtering
Supervised Learning
▰ Past data to train algorithm
▰ Compare predicted with expected output
▰ Model optimisation to reduce error
▰ Classification & Regression

Unsupervised Learning
▰ Information used to train is not classified
Why did we work on this project?
▰ To apply supervised learning with the Perceptron Learning Algorithm & the Adaline Gradient Descent Algorithm
▰ To compare the effectiveness of the two algorithms in classifying data
How did we carry out the project?
▰ Iris.data set for analysis by the algorithms
○ 3 classes of 50 instances each
▰ 5 Attributes
○ Sepal Length
○ Sepal Width
○ Petal Length
○ Petal Width
○ Type of Iris Plant
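A minimal sketch of reading this data set, assuming the standard UCI `iris.data` CSV layout (four numeric attributes followed by the species name, one instance per row). The function name, file path, and the ±1 label mapping are illustrative choices, not the project's actual code; only the two species the slides later classify are kept.

```python
import csv

def load_iris(path="iris.data"):
    """Parse the UCI iris.data CSV: 4 numeric attributes + species name.

    Keeps only Iris-setosa and Iris-versicolor and maps them to the
    labels -1 and +1 (an assumed encoding for a binary classifier).
    """
    X, y = [], []
    with open(path) as f:
        for row in csv.reader(f):
            if not row:                      # the file ends with a blank line
                continue
            species = row[4]
            if species == "Iris-setosa":
                label = -1
            elif species == "Iris-versicolor":
                label = 1
            else:                            # skip the third class
                continue
            X.append([float(v) for v in row[:4]])
            y.append(label)
    return X, y
```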
Visual representation of data training set
Updates using individual instances of data
▰ Perceptron trained with past data classified into matrices
▰ Aggregates the input based on the weights to generate an output
▰ AIM: Generate accurate weights of the perceptron for each attribute

(a) zi = w0 + ∑(j=1 to p) wj·xi,j

with zi being the intermediate (net input), w the weights and x the input.
(b) A threshold is applied to the value of zi to obtain the output y'i:
y'i = 1 if zi ≥ 0; -1 otherwise.
(c) Update the weights of the perceptron: calculate the error ei, the difference between the target class label and the predicted output, and multiply it by the update step-size η:

ei = yi - y'i;
w0 := w0 + ηei;
w := w + ηxiei.
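Steps (a)–(c) can be sketched in Python as follows; the function name, the learning-rate value, and the epoch count are illustrative assumptions, not the project's actual implementation.

```python
def perceptron_train(X, y, eta=0.01, epochs=10):
    """Train a perceptron with the per-instance update rule above.

    X: list of feature vectors, y: list of target labels (+1/-1),
    eta: learning rate (update step-size), epochs: passes over the data.
    Returns (w0, w): bias weight and per-attribute weights.
    """
    p = len(X[0])
    w0, w = 0.0, [0.0] * p
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # (a) net input: zi = w0 + sum_j wj * xi,j
            zi = w0 + sum(wj * xij for wj, xij in zip(w, xi))
            # (b) threshold zi to get the predicted class y'i
            yi_pred = 1 if zi >= 0 else -1
            # (c) error ei = yi - y'i, then update every weight
            ei = yi - yi_pred
            w0 += eta * ei
            w = [wj + eta * xij * ei for wj, xij in zip(w, xi)]
    return w0, w
```

Because the error is computed after thresholding, the weights change only on misclassified instances, which is why the updates happen instance by instance.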
Updates using gradient of training data
▰ Classifiers
○ Iris data converted to numerical format
○ Recognisable by the algorithms
▰ Weights
○ Prediction of data points
○ Continuously updated to minimise error
▰ Decision boundary
○ Classification of data by the algorithms
1. Algorithm class created
2. Weight generation and updates
3. Comparing error graphs
4. Comparing decision boundaries
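The Adaline variant updates the weights from the gradient over the whole training set rather than per instance: the error is taken against the continuous net input zi (before thresholding), and one batch update is made per pass. A minimal sketch under the same assumptions as before (the names and hyperparameter values are illustrative):

```python
def adaline_train(X, y, eta=0.01, epochs=50):
    """Adaline trained with batch gradient descent.

    Unlike the perceptron, the error ei = yi - zi uses the raw net
    input, and the weights are updated once per epoch from the summed
    gradient of the sum-of-squared-errors cost over all instances.
    Returns (w0, w, costs); costs tracks the cost per epoch.
    """
    p = len(X[0])
    w0, w = 0.0, [0.0] * p
    costs = []
    for _ in range(epochs):
        # continuous errors over the full training set
        errors = []
        for xi, yi in zip(X, y):
            zi = w0 + sum(wj * xij for wj, xij in zip(w, xi))
            errors.append(yi - zi)
        # one batch gradient step on the summed squared-error cost
        w0 += eta * sum(errors)
        w = [wj + eta * sum(e * xi[j] for e, xi in zip(errors, X))
             for j, wj in enumerate(w)]
        costs.append(sum(e * e for e in errors) / 2.0)
    return w0, w, costs
```

Plotting `costs` against the epoch number gives the error graphs compared in step 3 above.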
What did we find out from the project?
▰ The known class labels were used in comparison with the classifications of the respective algorithms to determine their accuracy and sensitivities.
Classified species: setosa and versicolor
(Figure: error graphs at a higher and a lower rate of learning, comparing the Adaline Algorithm and the Perceptron Algorithm; blue: Perceptron Algorithm)
▰ Perceptron Algorithm is more sensitive to outliers than Adaline Algorithm
▰ The Perceptron Learning Algorithm is more sensitive and accurate in the classification of data as compared to the Adaline Algorithm.
▰ These results give important indications of the potential of such algorithms in training on and classifying data.
▰ Supervisor: Prof. Andy W.H. Khong
▰ School teacher mentors: Mr Nicholas Wong, Mr Lim Cher Chuan, Dr Goh Ker Liang