SLIDE 1

Solving Optimization Problems

Debasis Samanta

IIT Kharagpur dsamanta@sit.iitkgp.ernet.in

06.03.2018

Debasis Samanta (IIT Kharagpur) Soft Computing Applications (IT60108) 06.03.2018 1 / 22

SLIDE 2

Introduction to Solving Optimization Problems

Today's topics:

- Concept of an optimization problem
- Defining an optimization problem
- Various types of optimization problems
- Traditional approaches to solve optimization problems
- Limitations of the traditional approaches

SLIDE 3

Concept of optimization problem

Optimization seeks an optimum value, that is, either a minimum or a maximum value of some function y = f(x).

Example: consider 2x − 6y = 11, i.e. y = (2x − 11)/6. Can we determine an optimum value for y? Similarly, consider 3x + 4y ≥ 56. Neither of these is really an optimization problem: each merely relates or constrains x and y, with no objective to minimize or maximize.

SLIDE 4

Defining an optimization problem

Suppose we are to design an optimal pointer made of some material with density ρ. The pointer should be as large as possible, with no mechanical breakage, and the deflection at the pointing end should be negligible. The task is to select the best pointer out of all possible pointers.

(Figure: a conical pointer of diameter d and length l.)

Suppose s is the strength of the pointer. The mass of the (conical) stick is

M = (1/3) · π · (d/2)² · l · ρ = (π/12) · d² · l · ρ

Deflection: δ = f1(d, l, ρ)
Strength: s = f2(d, l, ρ)

SLIDE 5

Defining an optimization problem

The problem can be stated as follows.

Objective function:
Minimize M = (π/12) · d² · l · ρ

Subject to:
δ ≤ δth, where δth is the allowable deflection
s ≥ sth, where sth is the required strength
and dmin ≤ d ≤ dmax, lmin ≤ l ≤ lmax

SLIDE 6

Defining Optimization Problem

An optimization problem can be formally defined as follows:

Maximize (or Minimize) yi = fi(x1, x2, ..., xn), where i = 1, 2, ..., k and k ≥ 1

Subject to gi(x1, x2, ..., xn) ROPi ci, where i = 1, 2, ..., j and j ≥ 0. Here ROPi denotes some relational operator and each ci (i = 1, 2, ..., j) is a constant.

and xi ROPi di for all i = 1, 2, ..., n (n ≥ 1), where each xi denotes a design parameter and di is some constant.

SLIDE 7

Some Benchmark Optimization Problems

Exercises: mathematically define the following optimization problems.

- Traveling Salesman Problem
- Knapsack Problem
- Graph Coloring Problem
- Job-Machine Assignment Problem
- Coin Change Problem
- Binary Search Tree Construction Problem
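As a starting point for the first exercise, TSP can be cast as: minimize the total length of a closed tour over all permutations of the cities. A brute-force sketch in Python (the 4-city distance matrix is made-up illustrative data, not from the slides):

```python
from itertools import permutations

# Hypothetical symmetric distance matrix for 4 cities (made-up data).
D = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def tour_length(tour):
    """Objective function: total length of the closed tour visiting each city once."""
    return sum(D[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

# Exhaustive search over tours fixed to start at city 0 (feasible only for tiny n).
best = min(permutations(range(1, len(D))), key=lambda rest: tour_length((0,) + rest))
tour = (0,) + best
print(tour, tour_length(tour))  # (0, 1, 3, 2) with length 18
```

The factorial growth of the search space is exactly why such problems motivate the soft-computing methods discussed later.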

SLIDE 8

Types of Optimization Problem

Unconstrained optimization problem: the problem has no functional constraints.

Example:
Minimize y = f(x1, x2) = (x1 − 5)² + (x2 − 3)³
where x1, x2 ≥ 0

Note: here the constraint set {gj} is empty.

SLIDE 9

Types of Optimization Problem

Constrained optimization problem: an optimization problem with one or more functional constraints.

Example:
Maximize y = f(x1, x2, ..., xn)
Subject to gi(x1, x2, ..., xn) ≥ ci, where i = 1, 2, ..., k and k > 0,
and x1, x2, ..., xn are design parameters.

SLIDE 10

Types of Optimization Problem

Integer programming problem: all design variables are restricted to integer values.

Example:
Minimize y = f(x1, x2) = 2x1 + x2
Subject to
x1 + x2 ≤ 3
5x1 + 2x2 ≤ 9
and x1, x2 are integer variables.
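The integer program above can be solved by exhaustive enumeration. The sketch below additionally assumes x1, x2 are non-negative (the slide states no lower bounds; without them the minimum would be unbounded):

```python
# Enumerate non-negative integer pairs; x1 + x2 <= 3 already limits each to 0..3.
feasible = [
    (x1, x2)
    for x1 in range(4)
    for x2 in range(4)
    if x1 + x2 <= 3 and 5 * x1 + 2 * x2 <= 9
]

# Pick the feasible pair minimizing the objective 2*x1 + x2.
best = min(feasible, key=lambda p: 2 * p[0] + p[1])
print(best, 2 * best[0] + best[1])  # (0, 0) with objective value 0
```

Under the non-negativity assumption the minimum is trivially (0, 0); swapping min for max in the same sketch would instead find the binding corner of the feasible set.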

SLIDE 11

Types of Optimization Problem

Real-valued problem: all design variables take real values.

Mixed-integer programming problem: some design variables are integers, while the rest take real values.

SLIDE 12

Types of Optimization Problem

Linear optimization problem: both the objective function and all constraints are linear functions of the design variables.

Example:
Maximize y = f(x1, x2) = 2x1 + x2
Subject to
x1 + x2 ≤ 3
5x1 + 2x2 ≤ 10
and x1, x2 ≥ 0
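For a linear program, an optimum (when one exists) is attained at a vertex of the feasible polygon, so the two-variable example above can be checked by evaluating the objective at each vertex. The vertex list below was worked out by hand from the pairs of active constraints:

```python
from fractions import Fraction as F  # exact arithmetic for the vertex coordinates

def objective(x1, x2):
    return 2 * x1 + x2

def feasible(x1, x2):
    return x1 + x2 <= 3 and 5 * x1 + 2 * x2 <= 10 and x1 >= 0 and x2 >= 0

# Vertices of the feasible polygon: intersections of pairs of active constraints.
vertices = [
    (F(0), F(0)),        # x1 = 0 and x2 = 0
    (F(2), F(0)),        # x2 = 0 and 5*x1 + 2*x2 = 10
    (F(0), F(3)),        # x1 = 0 and x1 + x2 = 3
    (F(4, 3), F(5, 3)),  # x1 + x2 = 3 and 5*x1 + 2*x2 = 10
]
best = max((v for v in vertices if feasible(*v)), key=lambda v: objective(*v))
print(best, objective(*best))  # maximum 13/3 at (4/3, 5/3)
```

This vertex-checking idea is what the graphical and simplex methods mentioned later systematize.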

SLIDE 13

Types of Optimization Problem

Non-linear optimization problem: either the objective function or at least one of the functional constraints is a non-linear function of the design variables.

Example:
Maximize y = f(x1, x2) = x1² + 5x2³
Subject to
x1⁴ + 3x2² ≤ 629
2x1³ + 4x2³ ≤ 133
and x1, x2 ≥ 0

SLIDE 14

Traditional approaches to solve optimization problems

Optimization methods (a taxonomy):

- Linear programming methods: Graphical method, Simplex method
- Non-linear programming methods:
  - Single-variable:
    - Analytical method
    - Numerical methods:
      - Elimination methods: Unrestricted search, Exhaustive search, Fibonacci method, Dichotomous search, Golden section method
      - Interpolation methods: Quadratic, Cubic, Direct root
  - Multi-variable:
    - Unconstrained optimization: Random walk, Univariate method, Pattern search, Steepest descent, Conjugate gradient, Quasi-Newton, Variable metric
    - Constrained optimization: Lagrangian method
- Specialized algorithms: Dynamic programming, Branch & bound, Greedy method, Divide & conquer

SLIDE 15

Example : Analytical Method

Suppose the objective function is y = f(x), where f(x) is a polynomial of degree m (m > 0). If y′ = f′(x) = 0 at some x = x∗, then x∗ is a candidate point: y may be optimum (i.e. a minimum or a maximum may exist) at x = x∗. If y turns out to be neither a minimum nor a maximum there, then there is no optimum value at x = x∗; such an x∗ is an inflection point, also called a saddle point.

SLIDE 16

Example : Analytical Method

Note: an inflection point is a stationary point at which the function is neither a maximum nor a minimum. The following figure illustrates minimum, maximum and saddle points.

(Figure: a curve y versus x with a maximum at x1∗, a minimum at x2∗, and saddle points.)

SLIDE 17

Example : Analytical Method

Let us generalize the concept of the analytical method. If y = f(x) is a polynomial of degree m, then y′ has degree m − 1, so there are at most m − 1 candidate points to be checked for optimum or saddle points.

Suppose y^(n) denotes the n-th derivative of y. To investigate the nature of a candidate point x∗, we determine the first non-zero higher-order derivative y^(n) = f^(n)(x∗), n ≥ 2. There are two cases.

Case 1: if n is odd, then x∗ is an inflection point.
Case 2: if n is even, then there exists an optimum point at x∗.

SLIDE 18

Example : Analytical Method

In order to decide whether the point x∗ is a minimum or a maximum (Case 2 above, n even), we examine the sign of that first non-zero derivative y^(n) = f^(n)(x∗). Two sub-cases arise:

Case 2.1: if y^(n) = f^(n)(x∗) is positive, then x∗ is a local minimum point.
Case 2.2: if y^(n) = f^(n)(x∗) is negative, then x∗ is a local maximum point.

If a derivative vanishes at x∗, we repeat the test with the next higher-order derivative until a non-zero one is found.

(Figure: a curve y versus x with several local optima x1∗, …, x4∗ and z1∗, …, z4∗.)
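The higher-order derivative test above can be sketched for polynomial objectives. The helper functions below are our own illustrative assumptions, with coefficients listed in increasing powers of x:

```python
def derive(coeffs):
    """Differentiate a polynomial given as [a0, a1, a2, ...] (a0 + a1*x + a2*x^2 + ...)."""
    return [i * c for i, c in enumerate(coeffs)][1:]

def evaluate(coeffs, x):
    """Evaluate the polynomial at x."""
    return sum(c * x ** i for i, c in enumerate(coeffs))

def classify(coeffs, x_star):
    """Classify a stationary point via the first non-vanishing higher derivative."""
    d, n = derive(coeffs), 1
    assert evaluate(d, x_star) == 0, "x_star must be a stationary point"
    while True:
        d, n = derive(d), n + 1
        v = evaluate(d, x_star)
        if v != 0:
            if n % 2 == 1:          # first non-zero derivative of odd order
                return "inflection"
            return "local minimum" if v > 0 else "local maximum"

print(classify([0, 0, 0, 1], 0))     # f(x) = x^3 at x* = 0 -> inflection
print(classify([0, 0, 0, 0, 1], 0))  # f(x) = x^4 at x* = 0 -> local minimum
```

For x⁴ the second and third derivatives vanish at 0, and the fourth (n = 4, even, positive) identifies the minimum, matching Case 2.1.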

SLIDE 19

Question

y = f(x)

d²y/dx² = +ve ⇒ x = x1∗
d⁴y/dx⁴ = −ve ⇒ x = x2∗
d⁶y/dx⁶ = ±ve ⇒ x = x3∗

(Figure: y versus x with the points x = x1∗, x = x2∗, x = x3∗ and the optimal solution marked.)

Does the analytical method solve optimization problems with multiple input variables? If yes, then how? If no, then why not?

SLIDE 20

Exercise

Determine the minimum, maximum, or saddle points, if any, of the following single-variable function:

f(x) = x²/2 + 125/x

for real values of x.

SLIDE 21

Duality Principle

Principle: a minimization (maximization) problem is said to have a dual problem when it is converted into the corresponding maximization (minimization) problem. The usual conversions from maximization ⇔ minimization are:

y = f(x) ⇔ y∗ = −f(x)
y = f(x) ⇔ y∗ = 1 / f(x)   (provided f(x) ≠ 0)
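A quick numerical illustration of the first conversion, on a made-up concave function: the maximizer of f coincides with the minimizer of the dual y∗ = −f.

```python
# Hypothetical objective: f(x) = -(x - 2)^2 + 5, maximized at x = 2.
f = lambda x: -(x - 2) ** 2 + 5

xs = [i / 100 for i in range(0, 401)]        # grid on [0, 4]
x_max = max(xs, key=f)                        # solve the maximization problem
x_min = min(xs, key=lambda x: -f(x))          # solve the dual minimization problem
print(x_max, x_min)  # both 2.0: the same optimizer
```

The reciprocal form y∗ = 1/f(x) preserves the optimizer only where f keeps a constant sign, which is why the f(x) ≠ 0 caveat matters.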

(Figure: a maximization problem for y = f(x) and the equivalent minimization problem for y∗ = −f(x).)
SLIDE 22

Limitations of the traditional optimization approach

- Computationally expensive.
- May fail for a discontinuous objective function.
- May not be suitable for parallel computing.
- Discrete (integer) variables are difficult to handle.
- Not necessarily adaptive.

Soft computing techniques have evolved to address the above-mentioned limitations of solving optimization problems with traditional approaches.

SLIDE 23

Evolutionary Algorithms

These are algorithms that follow certain biological and physical behaviours.

Biological behaviours:
- Genetics and evolution → Genetic Algorithms (GA)
- Behaviour of ant colonies → Ant Colony Optimization (ACO)
- Human nervous system → Artificial Neural Networks (ANN)

In addition, there are algorithms inspired by physical behaviours:
- Annealing process → Simulated Annealing (SA)
- Swarming of particles → Particle Swarm Optimization (PSO)
- Learning → Fuzzy Logic (FL)

SLIDE 24

Genetic Algorithm

GA is one of the evolutionary algorithms, a family that also includes Ant Colony Optimization and Particle Swarm Optimization. It models the biological processes of genetics and evolution, and is used to optimize highly complex objective functions which:
- are very difficult to model mathematically;
- are NP-hard (so-called combinatorial optimization) problems, which are computationally very expensive;
- involve a large number of parameters (discrete and/or continuous).

SLIDE 25

Background of Genetic Algorithm

GA was first introduced by Prof. John Holland (University of Michigan, USA, 1965), but the first article on GA was published in 1975. The principles of GA are based on two fundamental biological processes:

- Genetics: Gregor Johann Mendel (1865)
- Evolution: Charles Darwin (1859)

SLIDE 26

A brief account on genetics

The basic building blocks of living bodies are cells. Each cell carries the basic unit of heredity, called a gene.

(Figure: a cell showing the nucleus, chromosomes and other cell bodies.)

For a particular species, the number of chromosomes is fixed. Examples:

- Mosquito: 6
- Frog: 26
- Human: 46
- Goldfish: 94

SLIDE 27

A brief account on genetics

Genetic code: the spiral double-helix molecule carrying the genetic code is called DNA. For each individual of a species, the DNA code is unique, that is, it varies from one individual to another. The DNA code, which passes characteristics from one generation to the next, is used as a biometric trait.

SLIDE 28

A brief account on genetics

Reproduction:

(Figure: two gametes x and y combine into a diploid cell.)

A gamete is haploid: a reproductive cell with half the number of chromosomes. In reproduction, the chromosomes from the two haploid gametes are combined, through cell division, so that the organism's cells have the full (diploid) number of chromosomes.

SLIDE 29

A brief account on genetics

Crossing over:

Information from cells of two different organisms' bodies is combined, so that diversity of information is possible. Random crossover points make effectively infinite diversity possible.

(Figure: crossing over of chromosome arms about the kinetochore.)

SLIDE 30

A brief account on evolution

Evolution: natural selection.

Four primary premises:

1. Information propagation: an offspring has many of the characteristics of its parents, i.e. information passes from parent to offspring. [Heredity]
2. Population diversity: there is variation in characteristics in the next generation. [Diversity]
3. Survival for existence: only a small percentage of the offspring produced survive to adulthood. [Selection]
4. Survival of the best: which offspring survive depends on their inherited characteristics. [Ranking]

SLIDE 31

A brief account on evolution

Mutation: makes the process forcefully dynamic when variation in the population is about to stabilize.

SLIDE 32

Biological process : A quick overview

(Figure: an overview of the genetic processes described above.)
SLIDE 33

Working of Genetic Algorithm

Definition of GA: a genetic algorithm is a population-based probabilistic search and optimization technique, which works based on the mechanisms of natural genetics and natural evolution.

SLIDE 34

Framework of GA

(Flowchart: Start → Initial population → Converge? If yes, Stop; if no, Selection → Reproduction → back to the convergence test.)

Note: an individual in the population corresponds to a possible solution.

SLIDE 35

Working of Genetic Algorithm

Note:

1. GA is an iterative process.
2. It is a searching technique.
3. The working cycle may or may not converge.
4. A solution is not necessarily guaranteed; usually, GA terminates with a local optimum.

SLIDE 36

Framework of GA: A detailed view

(Flowchart: Start → Define parameters, parameter representation → Create (initialize) the population → Evaluate the fitness by applying the cost function to each individual → Converge? If yes, Stop; if no → Selection (select, mate) → Reproduction (crossover, mutation, inversion) → back to the fitness evaluation.)

SLIDE 37

Optimization problem solving with GA

For the optimization problem, identify the following:

- Objective function(s)
- Constraint(s)
- Input parameters
- Fitness evaluation (it may be an algorithm or a mathematical formula)
- Encoding
- Decoding

SLIDE 38

GA Operators

In fact, a GA implementation involves the realization of the following operations.

1. Encoding: how to represent a solution to fit the GA framework.
2. Convergence: how to decide the termination criterion.
3. Mating pool: how to generate the next solutions.
4. Fitness evaluation: how to evaluate a solution.
5. Crossover: how to make a diverse set of next solutions.
6. Mutation: how to explore other solution(s).
7. Inversion: how to move from one optimum to another.
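Operation 1 (encoding) is commonly realized with fixed-length bit strings. A minimal sketch, under the assumption of a real-valued parameter quantized over an interval [lo, hi] (the helper names are our own, not from the slides):

```python
def encode(x, lo, hi, n_bits):
    """Map a real value in [lo, hi] to an n-bit binary string (uniform quantization)."""
    levels = 2 ** n_bits - 1
    k = round((x - lo) / (hi - lo) * levels)
    return format(k, "0{}b".format(n_bits))

def decode(bits, lo, hi):
    """Map an n-bit binary string back to a real value in [lo, hi]."""
    levels = 2 ** len(bits) - 1
    return lo + int(bits, 2) / levels * (hi - lo)

s = encode(3.0, 0.0, 10.0, 8)   # quantize 3.0 on [0, 10] using 8 bits
print(s, decode(s, 0.0, 10.0))  # decoding recovers 3.0 up to quantization error
```

The bit width trades precision against chromosome length, which is one reason encoding is listed as a design decision rather than a fixed recipe.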

SLIDE 39

Different GA Strategies

- Simple Genetic Algorithm (SGA)
- Steady-State Genetic Algorithm (SSGA)
- Messy Genetic Algorithm (MGA)

SLIDE 40

Simple GA

(Flowchart: Start → Create an initial population of size N → Evaluate each individual → Convergence criteria met? If yes, Stop and return the individual(s) with the best fitness value; if no → Reproduction: select Np individuals (with repetition), create the mating pool randomly (pairs of parents for generating new offspring), perform crossover to create new offspring, mutate the offspring, perform inversion on the offspring → replace all individuals of the last generation with the newly created offspring → back to evaluation.)
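The SGA loop above can be sketched minimally. This is an illustrative toy rather than the slides' exact procedure: bit-string individuals, a made-up "count the ones" fitness, binary tournament selection, single-point crossover and bit-flip mutation, with inversion omitted and a fixed generation count in place of a convergence test:

```python
import random

random.seed(42)  # reproducible run

N, BITS, GENERATIONS, MUT = 20, 16, 30, 0.05

def fitness(ind):
    """Made-up objective: number of 1-bits (the 'OneMax' toy problem)."""
    return sum(ind)

def select(pop):
    """Binary tournament selection (with repetition)."""
    a, b = random.choice(pop), random.choice(pop)
    return a if fitness(a) >= fitness(b) else b

def crossover(p1, p2):
    """Single-point crossover producing one offspring."""
    cut = random.randrange(1, BITS)
    return p1[:cut] + p2[cut:]

def mutate(ind):
    """Flip each bit independently with probability MUT."""
    return [1 - g if random.random() < MUT else g for g in ind]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(N)]
for _ in range(GENERATIONS):
    # Generational replacement: the whole population is replaced by new offspring.
    pop = [mutate(crossover(select(pop), select(pop))) for _ in range(N)]

best = max(pop, key=fitness)
print(fitness(best), "of", BITS)
```

Selection pressure from the tournament drives average fitness upward across generations, matching the salient feature noted on the next slide.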

SLIDE 41

Important parameters involved in Simple GA

SGA parameters:

- Initial population size: N
- Size of the mating pool: Np, where Np = p% of N
- Convergence threshold: δ
- Mutation rate: µ
- Inversion rate: η
- Crossover rate: ρ

SLIDE 42

Salient features in SGA

Simple GA features:

- Non-overlapping generations: the entire population is replaced by the newly created offspring.
- Computationally expensive.
- Good when the initial population size is large; in general, it gives better results.
- Selection is biased toward the more highly fit individuals; hence, the average fitness of the overall population is expected to increase in succession.
- The best individual may appear in any iteration.

SLIDE 43

Steady State Genetic Algorithm (SSGA)

(Flowchart: Start → Generate an initial population of size N → Evaluate each individual → Convergence met? If yes, Stop and return the solutions; if no → Select two individuals without repetition → Crossover → Mutation → Inversion → Reject the offspring if they duplicate existing individuals → Evaluate the offspring → If the offspring are better than the worst individuals, replace the worst individuals with the offspring → back to the convergence test.)

SLIDE 44

Salient features in Steady-state GA

SSGA features:

- The generation gap is small: only two offspring are produced in one generation.
- It is applicable when:
  - the population size is small;
  - the chromosomes are of longer length;
  - the evaluation operation is computationally less expensive (compared with duplicate checking).

SLIDE 45

Salient features in Steady-state GA

Limitations of SSGA:

- There is a chance of getting stuck at a local optimum if crossover/mutation/inversion is not strong enough to diversify the population.
- Premature convergence may result.
- It is susceptible to stagnation: inferior individuals are neglected or removed, and the algorithm keeps making more trials for a very long period of time without any gain (i.e. a long period of localized search).

SLIDE 46

***

Any Questions??
