COMP24111: Machine Learning and Optimisation
- Dr. Tingting Mu
Chapter 1: Machine Learning Basics
Dr. Tingting Mu
Email: tingting.mu@manchester.ac.uk

Outline
We are going to learn the following concepts:
– Machine learning.
– Unsupervised, supervised, reinforcement learning.
– Classification.
– Regression.
Typical reasons to use machine learning:
– Automate a process.
– Automate decision making.
– Extract knowledge from data.
– Predict future events.
– Adapt systems dynamically to enable better user experiences.
– …
An example constrained optimisation problem over x∈[0,3], y∈[0,5]:

min f(x, y)
subject to 0 ≤ x ≤ 3, 0 ≤ y ≤ 5
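A constrained problem like this can be attacked numerically. The slide does not define f, so the quadratic below is a made-up example; the grid search is a deliberately naive sketch, not a recommended solver:

```python
# Minimal grid-search sketch for the constrained problem on the slide:
#   min f(x, y)  subject to 0 <= x <= 3, 0 <= y <= 5.
# The slide does not specify f; the quadratic below is a hypothetical objective.

def f(x, y):
    return (x - 1.0) ** 2 + (y - 2.0) ** 2  # hypothetical objective

def grid_search(obj, x_range=(0.0, 3.0), y_range=(0.0, 5.0), steps=301):
    """Evaluate obj on a regular grid over the feasible box; keep the best point."""
    best = None
    for i in range(steps):
        x = x_range[0] + (x_range[1] - x_range[0]) * i / (steps - 1)
        for j in range(steps):
            y = y_range[0] + (y_range[1] - y_range[0]) * j / (steps - 1)
            val = obj(x, y)
            if best is None or val < best[0]:
                best = (val, x, y)
    return best

val, x, y = grid_search(f)
print(x, y, val)  # near (1, 2): the unconstrained minimum lies inside the box
```

In practice one would use a proper solver (e.g. a gradient-based method with box constraints) rather than exhaustive search, which scales poorly with dimension.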
– Germany’s climate research centre generates 10 petabytes per year.
– Google processes 24 petabytes per day.
– PC users viewed over 300 billion videos in August 2014 alone, with an average of 202 videos and 952 minutes per viewer.
– There were 223 million credit card purchases in March 2016, with a total value of £12.6 billion, in the UK.
– Around 300 million photos are uploaded to Facebook per day.
– Approximately 2.5 million new scientific papers are published each year.
– …
– Prediction: what can we predict about this phenomenon?
– Description: how can we describe/understand this phenomenon in a new way?
Machine learning, speech recognition, speech synthesis, natural language processing, text mining, computer vision, data mining (analysis, engineering), and robotics are all subfields of Artificial Intelligence (A.I.), and learning plays a significant role in A.I.
Related course units:
– COMP14112, Fundamentals of A.I.
– COMP37212, Computer Vision
– COMP38120, Documents, Services and Data on the Web
– COMP61332, Text Mining
– COMP60711, Data Engineering
– COMP34120, AI and Games
– COMP24111, Machine Learning and Optimisation
– COMP61011, Foundations of Machine Learning
– COMP61021, Modelling and visualization
– Collecting wine samples for each grape type.
– Characterising each wine sample with 13 chemical features.

Feature extraction: 30 bottles in total, 10 bottles for each grape type; each bottle is characterised by 13 features.
x1 = [x1,1, x1,2, x1,3, …, x1,12, x1,13], y1 = grape type 1
x2 = [x2,1, x2,2, x2,3, …, x2,12, x2,13], y2 = grape type 2
x3 = [x3,1, x3,2, x3,3, …, x3,12, x3,13], y3 = grape type 2
⋮
x30 = [x30,1, x30,2, x30,3, …, x30,12, x30,13], y30 = grape type 1

The feature vectors xi and class labels yi together form the experiences (training data).
Design a mathematical model to predict the grape type. The model below is controlled by 14 parameters [w1, w2, …, w13, b]:

ŷ = g(x) = { type 1, if Σ_{i=1}^{13} w_i x_i + b ≥ 0
           { type 2, if Σ_{i=1}^{13} w_i x_i + b < 0
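The decision rule above is easy to express directly in code. This is a minimal sketch of the 14-parameter linear model; the weight values below are placeholders, not learned parameters:

```python
# Sketch of the linear decision rule from the slide:
#   type 1 if sum_i w_i * x_i + b >= 0, else type 2.
# The weights and bias below are placeholders, not learned values.

def g(x, w, b):
    """Linear decision rule over 13 wine features."""
    score = sum(w_i * x_i for w_i, x_i in zip(w, x)) + b
    return "type 1" if score >= 0 else "type 2"

w = [0.0] * 13   # 13 hypothetical weights
w[0] = 1.0       # e.g. weight only the first feature
b = -13.0        # hypothetical bias

print(g([12.25] + [0.0] * 12, w, b))  # 12.25 - 13 < 0, so "type 2"
```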
Feed each bottle’s wine features to the model to get a predicted grape type:

bottle 1: x1 ⇒ ŷ1 = g(x1)
bottle 2: x2 ⇒ ŷ2 = g(x2)
⋮
bottle 30: x30 ⇒ ŷ30 = g(x30)

Compare each prediction ŷi against the real grape type yi (✔ or ✗), and learn the parameters by minimising a loss function over the training set:

[w1*, w2*, …, w13*, b*] = argmin_{w1, w2, …, w13, b} O_loss(w1, w2, …, w13, b)
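The slides only state that the parameters come from minimising a loss O_loss; they do not say which algorithm does the minimising. One classical choice for a linear rule like g(x) is the perceptron update, sketched below on a 2-feature toy set standing in for the 13-feature wine data (this is an illustration, not necessarily the course's method):

```python
# One classic way to fit the linear rule g(x): the perceptron update.
# Toy data with 2 features instead of 13, purely for brevity.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """labels: +1 for grape type 1, -1 for grape type 2."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            pred = 1 if score >= 0 else -1
            if pred != y:  # misclassified: nudge the boundary toward x
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

# Tiny linearly separable toy set standing in for the wine bottles.
X = [[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]]
Y = [1, 1, -1, -1]
w_star, b_star = train_perceptron(X, Y)

correct = sum(
    (1 if sum(wi * xi for wi, xi in zip(w_star, x)) + b_star >= 0 else -1) == y
    for x, y in zip(X, Y)
)
print(correct)  # 4: all training points classified correctly
```

For separable data the perceptron converges to parameters with zero training error; later chapters replace this with smoother losses that gradient methods can minimise.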
A new bottle arrives, characterised by its 13 features:
x1 = 12.25, x2 = 3.88, x3 = 2.2, x4 = 18.5, x5 = 112, x6 = 1.38, x7 = 0.78, x8 = 0.29, x9 = 1.14, x10 = 8.21, x11 = 0.65, x12 = 2, x13 = 855

Predict its grape type with the learned parameters:

ŷ = g(x) = { type 1, if Σ_{i=1}^{13} w_i* x_i + b* ≥ 0
           { type 2, if Σ_{i=1}^{13} w_i* x_i + b* < 0
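Applying the learned rule to the new bottle is a single dot product plus a threshold. The slide gives the 13 measured features but not the trained values w*, b*, so the weights below are stand-ins purely to make the sketch run:

```python
# Predicting the new bottle's grape type. The 13 features are from the
# slide; the trained weights w* and bias b* are NOT given there, so the
# values below are hypothetical placeholders.

features = [12.25, 3.88, 2.2, 18.5, 112, 1.38, 0.78, 0.29,
            1.14, 8.21, 0.65, 2, 855]

w_star = [0.01] * 13   # hypothetical learned weights
b_star = -5.0          # hypothetical learned bias

score = sum(w * x for w, x in zip(w_star, features)) + b_star
label = "type 1" if score >= 0 else "type 2"
print(label)
```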
MATLAB’s example.
From https://cambridge-intelligence.com/keylines-network-clustering/
These examples are from the UCL course on RL.
Further reading on the history of machine learning:
https://en.wikipedia.org/wiki/Timeline_of_machine_learning
https://cloud.withgoogle.com/build/data-analytics/explore-history-machine-learning/