Computer Programming II: Algorithm Analysis and Sorting Techniques (PowerPoint presentation)



SLIDE 1

Computer Programming II

Algorithm Analysis and Sorting Techniques

SLIDE 2

Algorithm

A finite sequence of well-defined instructions that solves a problem or performs a calculation in a finite number of steps. Components:

  • Sequence: a succession of instructions
  • Selection: if, switch (not in Python)
  • Iteration: Count (for) or condition (while) loops
  • Abstraction: functions
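All four components can appear in a few lines of Python. This is a hypothetical example (not from the slides); the function name and input are illustrative only:

```python
def count_evens(numbers):      # abstraction: a named, reusable function
    total = 0                  # sequence: statements executed in order
    for n in numbers:          # iteration: a count-controlled (for) loop
        if n % 2 == 0:         # selection: a conditional branch
            total += 1
    return total

print(count_evens([1, 2, 3, 4]))  # 2
```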

This week you have been working on constructing algorithms using recursion.

SLIDE 3

Algorithm: Sorting

There are many algorithms that can solve the same problem. A common problem to solve in CS is sorting lists, and there are many different ways to go about it:

  • Selection sort
  • Shell sort
  • Heap sort
  • Quick sort

Some are more efficient (and effective) than others.

SLIDE 4

Sorting: The ineffective way

from random import shuffle  # shuffle randomly permutes the list in place

def bogo_sort(a_list):
    # is_sorted is assumed to be a helper returning True
    # when a_list is in ascending order
    while not is_sorted(a_list):
        shuffle(a_list)

SLIDE 5

Bogosort: Visualisation

https://youtu.be/aSH-7EICngo

SLIDE 6

Sorting: The inefficient way

def bubble_sort(a_list):
    n = len(a_list)
    for i in range(n - 1):
        for j in range(0, n - i - 1):
            if a_list[j] > a_list[j + 1]:
                a_list[j], a_list[j + 1] = a_list[j + 1], a_list[j]

SLIDE 7

Bubble Sort: Visualisation

https://youtu.be/R-WHcRdxaEk

SLIDE 8

Sorting

Your first assignment contains two common sorting solutions

  • Insertion sort
  • Merge sort

These can also be algorithmically implemented in different ways since they are ideas of how to solve a particular problem

SLIDE 9

Sorting: Insertion - The Idea

1. Iterate from a_list[1] to a_list[n - 1].
2. Compare the current element (key) to its predecessor.
3. If the key is smaller than its predecessor, compare it to the elements before it. Move the greater elements one position up to make space for the key.

SLIDE 10

Insertion Sort: Iterative

def insertion_sort(a_list):
    length = len(a_list)
    for i in range(1, length):
        value = a_list[i]
        # check i > 0 first, so a_list[i - 1] is never read when i == 0
        while i > 0 and a_list[i - 1] > value:
            a_list[i], a_list[i - 1] = a_list[i - 1], a_list[i]
            i -= 1

SLIDE 11

Insertion Sort: Recursive

def insertion_sort_rec(a_list, n):
    if n <= 1:
        return
    insertion_sort_rec(a_list, n - 1)
    last = a_list[n - 1]
    i = n - 2
    while i >= 0 and a_list[i] > last:
        a_list[i + 1] = a_list[i]
        i = i - 1
    a_list[i + 1] = last

SLIDE 12

Insertion Sort: Visualisation

https://youtu.be/sl1PeLOzxQI

SLIDE 13

Sorting: Merge - The Idea

1. Find the middle point to divide the list into two halves.
2. Do a recursive call for the first half.
3. Do a recursive call for the second half.
4. Merge the two halves sorted in steps 2 and 3.
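The steps above can be sketched in Python. This is one possible implementation, not the assignment's reference solution:

```python
def merge_sort(a_list):
    if len(a_list) <= 1:
        return a_list
    mid = len(a_list) // 2                  # 1. find the middle point
    left = merge_sort(a_list[:mid])         # 2. recursive call, first half
    right = merge_sort(a_list[mid:])        # 3. recursive call, second half
    # 4. merge the two sorted halves
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # append any leftovers from either half
    merged.extend(right[j:])
    return merged
```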

SLIDE 14

Merge Sort: Visualisation

https://youtu.be/fuCYYoRejc0

SLIDE 15

Time Complexity of Algorithms

Visually, we can see how differently these algorithms work. How do we choose which approach to use for a particular problem? We can, of course, analyse them by running them a large number of times, taking the average time, and comparing them.
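Such an empirical comparison can be scripted with Python's standard timeit module. A minimal sketch; the list size and repeat count are arbitrary choices:

```python
import random
import timeit

def bubble_sort(a_list):
    n = len(a_list)
    for i in range(n - 1):
        for j in range(0, n - i - 1):
            if a_list[j] > a_list[j + 1]:
                a_list[j], a_list[j + 1] = a_list[j + 1], a_list[j]

data = [random.random() for _ in range(500)]
# run the sort many times on fresh copies and take the average time
avg = timeit.timeit(lambda: bubble_sort(data.copy()), number=20) / 20
print(f"average over 20 runs: {avg:.4f} s")
```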

SLIDE 16

Time Complexity of Algorithms

However, the time might vary a lot depending on

  • CPU power
  • RAM efficiency
  • Programming language
  • Amount of data to manipulate
  • The structure of the input data
  • Type of input to the algorithm
SLIDE 17

Time Complexity of Algorithms

So how do we compare algorithms if there are so many variables? We get an approximation of their efficiency by considering how the number of operations grows as a function of the number of elements we provide to the algorithm. That is, how does the algorithm perform with a larger input size compared to a smaller one, given that everything else is constant? Studying the time complexity of an algorithm in this way is called asymptotic analysis.
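The idea of counting operations as the input grows can be made concrete by instrumenting an algorithm. A hypothetical sketch (not from the slides) that counts the comparisons bubble sort makes:

```python
def count_comparisons(a_list):
    """Bubble-sort a copy of a_list and return how many comparisons were made."""
    a_list = list(a_list)
    n = len(a_list)
    comparisons = 0
    for i in range(n - 1):
        for j in range(0, n - i - 1):
            comparisons += 1
            if a_list[j] > a_list[j + 1]:
                a_list[j], a_list[j + 1] = a_list[j + 1], a_list[j]
    return comparisons

# doubling the input size roughly quadruples the work: quadratic growth
print(count_comparisons(range(100)))   # 4950
print(count_comparisons(range(200)))   # 19900
```

The counts are independent of CPU, RAM, and language: they describe the algorithm itself.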

SLIDE 18

Linear Search Algorithm

def search(x, a_list):
    for i in a_list:
        if i == x:
            return True
    return False

SLIDE 19

Asymptotic Analysis: Cases for linear search

l = [7, 2, 4, 1, 5, 3, 9]
search(7, l)

The search ends immediately: Best case scenario

l = [7, 2, 4, 1, 5, 3, 9]
search(9, l)

The search has to iterate through all indices: Worst case scenario

l = [7, 2, 4, 1, 5, 3, 9]
search(6, l)

The search has to iterate through all indices: Worst case scenario

SLIDE 20

Asymptotic Notation

  • O: Ordo or Big O (upper bound), signifies the worst-case scenario
  • Ω: Omega (lower bound), signifies the best-case scenario
  • Θ: Theta (upper and lower bound), signifies the average case

SLIDE 21

Asymptotic Notation

  • O(1): constant time
  • O(n): linear time
  • O(n²): quadratic time

SLIDE 22

[Graph: number of operations plotted against number of elements]

SLIDE 23

Asymptotic Analysis: Cases for linear search

l = [7, 2, 4, 1, 5, 3, 9]
search(7, l)

The search ends immediately: Ω(1)

l = [7, 2, 4, 1, 5, 3, 9]
search(9, l)

The search has to iterate through all indices: O(n)

l = [7, 2, 4, 1, 5, 3, 9]
search(6, l)

The search has to iterate through all indices: O(n)

Average: Θ(n)

SLIDE 24

Bubble Sort Algorithm

def bubble_sort(a_list):
    n = len(a_list)
    for i in range(n - 1):
        for j in range(0, n - i - 1):
            if a_list[j] > a_list[j + 1]:
                a_list[j], a_list[j + 1] = a_list[j + 1], a_list[j]

SLIDE 25

Asymptotic Analysis: Cases for bubble sort

In bubble sort, when the input list is already sorted, the time taken by the algorithm is linear, i.e. the best case: Ω(n). When the input list is in reverse order, the algorithm takes the maximum (quadratic) time to sort the elements, i.e. the worst case: O(n²). When the input list is neither sorted nor in reverse order, it takes the average time: Θ(n²).
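Note that the Ω(n) best case assumes a bubble-sort variant that stops as soon as a full pass makes no swaps; the version shown earlier always performs every pass. A sketch of that early-exit variant:

```python
def bubble_sort_early_exit(a_list):
    n = len(a_list)
    for i in range(n - 1):
        swapped = False
        for j in range(0, n - i - 1):
            if a_list[j] > a_list[j + 1]:
                a_list[j], a_list[j + 1] = a_list[j + 1], a_list[j]
                swapped = True
        if not swapped:
            # a full pass made no swaps: the list is sorted, so on
            # already-sorted input we stop after one linear pass
            break
```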

SLIDE 26

Time Complexity: Sorting Algorithms

Algorithm        Best         Average      Worst
Selection Sort   Ω(n²)        Θ(n²)        O(n²)
Bubble Sort      Ω(n)         Θ(n²)        O(n²)
Insertion Sort   Ω(n)         Θ(n²)        O(n²)
Heap Sort        Ω(n log n)   Θ(n log n)   O(n log n)
Quick Sort       Ω(n log n)   Θ(n log n)   O(n²)
Merge Sort       Ω(n log n)   Θ(n log n)   O(n log n)

SLIDE 27

Time Complexity

Will two algorithms with a time complexity of O(n²) always perform the same? When would you use a more inefficient algorithm over an efficient one? Is asymptotic analysis really necessary when CPUs are as fast as they are today?

SLIDE 28

Time Complexity

Will two algorithms with a time complexity of O(n²) always perform the same? No, the notation only expresses how the particular algorithm performs given an increase in its input.

SLIDE 29

Time Complexity

Will two algorithms with a time complexity of O(n²) always perform the same? No, the notation only expresses how the particular algorithm performs given an increase in its input. When would you use a more inefficient algorithm over an efficient one? When the gain in efficiency is not worth the extra complexity.

SLIDE 30

Time Complexity

It is more important that a program runs correctly than that it runs fast. Make no sacrifices for minimal gains in efficiency:

  • Do not sacrifice clarity
  • Do not sacrifice simplicity
  • Do not sacrifice modifiability

If your code runs slowly: find a better algorithm!

SLIDE 31

Time Complexity

Will two algorithms with a time complexity of O(n²) always perform the same? No, the notation only expresses how the particular algorithm performs given an increase in its input. When would you use a more inefficient algorithm over an efficient one? When the gain in efficiency is not worth the extra complexity. Is asymptotic analysis really necessary when computers are as fast as they are today?

SLIDE 32

Time Complexity

With increased computational power, we tend to process ever more data. For example:

  • Physics: aerodynamics, weather forecasting
  • Biology: bioinformatics, genome sequencing
  • Real-time systems: robots, self-driving cars, cellular base stations
  • Animation: computer games, film CGI
  • Image stabilisation of moving pictures
  • Information: search engines
SLIDE 33

Extra

For those of you interested in analysing time complexity further, I recommend this video explaining how to analyse your code: https://www.youtube.com/watch?v=D6xkbGLQesk&list=PLBZBJbE_rGRV8D7XZ08LK6z-4zPoWzu5H&index=7

I’ve also included a few more slides showing visualisations of sorting algorithms for those who enjoy watching them :)

SLIDE 34

Shell Sort: Visualisation

https://youtu.be/NfQQGVN9fI4

SLIDE 35

Selection Sort: Visualisation

https://youtu.be/EgWnJrpjpF4

SLIDE 36

Quick Sort: Visualisation

https://youtu.be/tTt3F9PwTJ8

SLIDE 37

Heap Sort: Visualisation

https://youtu.be/PlnoGtcmYXE

SLIDE 38