

  1. Functional Programming and Parallel Computing
  Björn Lisper
  School of Innovation, Design, and Engineering, Mälardalen University
  bjorn.lisper@mdh.se
  http://www.idt.mdh.se/~blr/
  Functional Programming and Parallel Computing (revised 2013-12-03)

  2. Parallel Processing
  - Multicore processors are becoming commonplace
  - They typically consist of several processor cores and a shared memory
  [Figure: four CPUs connected to a shared memory]
  - The different cores execute different pieces of code in parallel
  - If several cores do useful work at the same time, the processing becomes faster

  3. Imperative Programming and Parallel Processing
  - Imperative programming is inherently sequential: statements are executed one after another
  - However, imperative programs can be made parallel through parallel processes, or threads
  [Figure: processes P1-P4 running on four CPUs with a shared memory]

  4. Side Effects and Parallel Processing
  - Imperative programs have side effects: variables are assigned values
  - These variables may be accessed by several processes
  - This introduces the risk of race conditions
  - Since processes run asynchronously on different processors, we cannot make any assumptions about their relative speed: in one situation one process may run faster, in another situation another one runs faster

  5. A Race Condition Example
  - Two processes P1 and P2, which both can write a shared variable tmp:

        P1:            P2:
        .              tmp = 4711
        .              z = tmp + 3
        tmp = 17       .
        y = 2*tmp      .

  - The intention of the programmer was that both P1 and P2 should set tmp and then use its newly set value right away
  - This would yield the following, intended, final contents of y and z: y = 34, z = 4714

  6.
  - But the programmer missed that the processes have a race condition for tmp, since they may run at different speeds. Here is one possible situation:

        P1:            P2:
        tmp = 17       .
        .              tmp = 4711
        y = 2*tmp      .
        .              z = tmp + 3

  - Final state: y = 9422, z = 4714. Wrong value for y!

  7.
  - Here is another possible situation:

        P1:            P2:
        .              tmp = 4711
        tmp = 17       .
        .              z = tmp + 3
        y = 2*tmp      .

  - Final state: y = 34, z = 20. Wrong value for z!

  8. Parallel Programming is Difficult
  - To avoid these race conditions, the programmer must use synchronization mechanisms to make sure processes do not interfere with each other
  - But this is difficult! It is very easy to miss race conditions
  - Debugging parallel programs is also difficult! Race conditions may occur only under certain conditions that appear very seldom, so bugs can be very hard to reproduce
  - This is a very bad situation, since multicore processors are becoming commonplace. We are heading for a software crisis!
  - The heart of the problem is the side effects: they allow different processes to trash each other's data
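The synchronization fix can be sketched concretely. Below is a minimal illustration in Python rather than F# (the closing slide notes that the same principles apply in conventional languages); the functions `p1`/`p2` and the single global lock are choices made for this sketch, not taken from the slides. A `threading.Lock` makes each process's set-then-use pair atomic, so the intended values from slide 5 come out regardless of scheduling:

```python
import threading

tmp = 0   # shared variable, as in the slides
y = 0
z = 0
lock = threading.Lock()   # guards each set-then-use pair

def p1():
    global tmp, y
    with lock:            # make "tmp = 17; y = 2*tmp" atomic
        tmp = 17
        y = 2 * tmp

def p2():
    global tmp, z
    with lock:            # make "tmp = 4711; z = tmp + 3" atomic
        tmp = 4711
        z = tmp + 3

t1 = threading.Thread(target=p1)
t2 = threading.Thread(target=p2)
t1.start(); t2.start()
t1.join(); t2.join()

# Whichever process happens to run first, each pair executes
# atomically, so the final state is always y = 34, z = 4714.
print(y, z)
```

Note that the lock only restores correctness by forcing mutual exclusion; it also serializes the two processes, which is exactly the cost the deck attributes to side effects.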

  9. Pure Functional Programs and Parallel Processing
  - Pure functional programs have no side effects
  - The evaluation order does not matter
  - Thus, different parts of the same expression can always be evaluated in parallel
  [Figure: in f(x) + g(x), the subexpressions f(x) and g(x) are evaluated in parallel]
  - Sometimes called expression parallelism
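Expression parallelism can be sketched with a thread pool. This is an illustrative Python sketch, not the deck's own code; `f` and `g` are placeholder pure functions invented for the example. Because they have no side effects, submitting them to run concurrently cannot change the value of the sum:

```python
from concurrent.futures import ThreadPoolExecutor

def f(x):
    return x * x   # placeholder pure function: no side effects,

def g(x):
    return x + 1   # so evaluation order cannot matter

def h(x):
    # Evaluate f(x) and g(x) in parallel, then combine the results.
    with ThreadPoolExecutor(max_workers=2) as pool:
        r1 = pool.submit(f, x)
        r2 = pool.submit(g, x)
        return r1.result() + r2.result()

print(h(10))   # 100 + 11 = 111
```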

  10. Parallelism in Collection-Oriented Primitives
  - Data structures like lists, arrays, and sequences are sometimes called collections
  - Functions like map and fold are called collection-oriented
  - Collection-oriented functions often have a lot of inherent parallelism
  - If one can express computations with these primitives, then parallelization often becomes easy
  - This parallelism is often called data parallelism
  - In imperative programs these computations are often implemented by loops. Loops are sequential: a good parallelizing compiler might recover some of the parallelism, but there is a risk that it is lost
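The contrast between the two styles can be made concrete. A small sketch in Python, with a sum of squares as an assumed example computation: the loop fixes a sequential order, while the map/fold formulation leaves the order unspecified, which is what gives a compiler or runtime room to parallelize:

```python
from functools import reduce

xs = [1, 2, 3, 4, 5, 6, 7, 8]

# Imperative formulation: a loop that fixes a sequential order.
total_loop = 0
for x in xs:
    total_loop += x * x

# Collection-oriented formulation: map, then fold with the
# associative operator +. No evaluation order is specified.
total = reduce(lambda a, b: a + b, map(lambda x: x * x, xs), 0)

print(total_loop, total)   # both 204
```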

  11. Map on Arrays
  - Map is very parallel: it applies f independently to each element
  [Figure: an array x1 .. x16 mapped element-wise to f x1 .. f x16, with all applications of f in parallel]
  - With sufficiently many processors, map can be done in O(1) time
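A parallel map over the sixteen elements of the figure can be sketched with a worker pool. Again a Python illustration (the deck shows the F# version, `Array.Parallel.map`, on a later slide); `f` is a placeholder element-wise function chosen for the sketch:

```python
from concurrent.futures import ThreadPoolExecutor

def f(x):
    return x * x   # placeholder element-wise function

xs = list(range(1, 17))   # x1 .. x16, as in the figure

# Parallel map: every application of f is independent, so the
# pool may run them in any order, or all at once given enough
# processors; the result is the same either way.
with ThreadPoolExecutor() as pool:
    ys = list(pool.map(f, xs))

print(ys[:4])   # [1, 4, 9, 16]
```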

  12. Parallel Fold
  - Fold can be parallelized if the binary function is an associative operator
  [Figure: a balanced binary tree of op nodes combining x1 .. x8 and init]

  13.
  - If op is associative, then the expression tree can be balanced
  - With sufficiently many processors, parallel fold can be done in O(log n) time (n = no. of elements)
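The balanced tree can be sketched as a level-by-level pairwise reduction. A minimal Python sketch (sequential here, for clarity): within each level all the combinations are independent, so with enough processors every level takes one step, giving the O(log n) depth. The sketch is only correct when `op` is associative, exactly as the slide requires:

```python
def tree_fold(op, xs):
    # Reduce a non-empty list by combining adjacent pairs,
    # level by level, as in the balanced expression tree.
    assert xs, "tree_fold needs a non-empty list"
    while len(xs) > 1:
        pairs = [op(xs[i], xs[i + 1]) for i in range(0, len(xs) - 1, 2)]
        if len(xs) % 2 == 1:      # odd element carried up a level
            pairs.append(xs[-1])
        xs = pairs                # log2(n) levels in total
    return xs[0]

print(tree_fold(lambda a, b: a + b, [1, 2, 3, 4, 5, 6, 7, 8]))   # 36
```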

  14. Fault Tolerance Through Replicated Evaluation
  - If an expression has no side effects, then it can be evaluated several times without changing the result of the program
  [Figure: f(x) evaluated on three processors, with one result chosen]
  - This can be used to increase fault tolerance: if one processor fails, we can use the result from another one computing the same expression
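Replicated evaluation can be sketched by submitting the same pure expression to several workers and taking whichever result arrives first. An illustrative Python sketch, with `f` again a placeholder pure function; real fault tolerance would also need failure detection, which is omitted here:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def f(x):
    return x * x   # pure: safe to evaluate any number of times

# Replicate the evaluation of f(7) on three workers and keep
# the first result that completes; purity guarantees all three
# copies would agree, so any one of them can be chosen.
with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(f, 7) for _ in range(3)]
    result = next(as_completed(futures)).result()

print(result)   # 49
```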

  15. Parallelism in F#
  - F# has some support for parallel and concurrent processing:
  - The System.Threading library gives threads
  - A data type Async<'a> for asynchronous (concurrent) workflows (a kind of computation expression)
  - The System.Threading.Tasks library yields task parallelism
  - The Array.Parallel module provides data-parallel operations on arrays
  - More on this in Section 13 in the book

  16. Example: Expression Parallelism Using Parallel Tasks
  - Evaluating f x and g x in parallel when computing (f x) + (g x):

        open System.Threading.Tasks

        let h x =
            let result1 = Task.Factory.StartNew(fun () -> f x)
            let result2 = Task.Factory.StartNew(fun () -> g x)
            result1.Result + result2.Result

  - (StartNew returns a Task; reading its Result property blocks until that task has completed, so the sum is taken only once both results are available)

  17. Example: Data Parallel Search for Primes
  - Create a big array, then map a test for primality over it, to be done in parallel
  - Assume a predicate isPrime : int -> bool that tests whether its argument is a prime

        let bigarray = [|1 .. 500000|]
        Array.Parallel.map isPrime bigarray

  18. Summing Up
  - Freedom from side effects simplifies parallel processing a lot
  - Collection-oriented operations are also very helpful for this
  - It is the design and the way of thinking that are important, not necessarily that a functional language is used
  - The same principles can also be applied when using conventional programming languages
