Pseudo-Random Number Generators (PowerPoint PPT Presentation)

SLIDE 1

Pseudo-Random Number Generators

Functional Programming and Intelligent Algorithms Prof Hans Georg Schaathun Høgskolen i Ålesund 14th February 2017

SLIDES 2-4

Randomness

  • 1. What is randomness?
  • 2. How do we create probabilistic computer programs?
  • 3. I.e. how do we make the computer act at random?

SLIDES 5-7

Two options

True randomness uses physical sources of entropy

  • 1. /dev/random on many systems
  • 2. random-fu in Haskell

Pseudo-random number generators (PRNGs) are deterministic but random-looking

  • random, the standard package in Haskell
  • random-tf, a more recent Haskell package

SLIDE 8

Linear Congruential Generators

x_i = (a + c·x_{i-1}) mod m, where x_0 is a given seed

  • Pseudo-random sequence [x_0, x_1, x_2, ...]
  • Also known as Lehmer's algorithm
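The recurrence can be written out directly in Haskell. In the sketch below the constants a, c and m are illustrative (the well-known Numerical Recipes parameters), not values given in the lecture:

```haskell
-- One step of the linear congruential generator,
-- x_i = (a + c * x_{i-1}) `mod` m.
-- The constants are the classic Numerical Recipes parameters,
-- used here purely for illustration.
a, c, m :: Integer
a = 1013904223   -- additive constant
c = 1664525      -- multiplier
m = 2 ^ 32       -- modulus

lcgNext :: Integer -> Integer
lcgNext x = (a + c * x) `mod` m

-- The pseudo-random sequence [x0, x1, x2, ...] from a seed x0.
lcgSequence :: Integer -> [Integer]
lcgSequence = iterate lcgNext

main :: IO ()
main = print (take 5 (lcgSequence 42))
```

Note that `iterate` includes the seed x_0 itself as the first element of the sequence, matching the slide's [x_0, x_1, x_2, ...].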

SLIDE 9

Ciphers in counter mode

[Diagram: Alice sends e_k(m) to Bob while Eve eavesdrops]

x_i = e_k(i), where x_0 is a given seed

  • Pseudo-random sequence [x_0, x_1, x_2, ...]

SLIDES 10-14

The PRNG is a state machine

[Diagram: states s_1 → s_2 → ... → s_7, each transition labelled "next", emitting outputs x_1, x_2, ..., x_6]

  • next :: State -> (State, Int)
  • Lehmer: next s = (s', s') where s' = (a + c*s) `mod` m
  • Cipher: next s = ((s + 1) `mod` m, encrypt k s)
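A minimal sketch of both next variants from the slide, assuming illustrative LCG constants and a toy encrypt (a keyed affine map, since no concrete cipher is specified):

```haskell
type State = Integer

-- Illustrative LCG constants (any full-period parameters would do).
a, c, m :: Integer
a = 1013904223
c = 1664525
m = 2 ^ 32

-- Lehmer-style PRNG: the new state is also the output.
nextLehmer :: State -> (State, Integer)
nextLehmer s = (s', s')
  where s' = (a + c * s) `mod` m

-- Toy "encryption": a keyed affine map standing in for a real cipher.
encrypt :: Integer -> Integer -> Integer
encrypt k s = (k * s + a) `mod` m

-- Counter-mode PRNG: the state is a counter; the output encrypts it.
nextCipher :: Integer -> State -> (State, Integer)
nextCipher k s = ((s + 1) `mod` m, encrypt k s)

main :: IO ()
main = do
  print (nextLehmer 42)
  print (nextCipher 99 0)
```

With a real cipher, encrypt would be a proper block encryption under the key k; the state-machine shape stays the same.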

SLIDE 15

random-tf package

  • 1. next :: TFGen -> (TFGen, Word32)

Exercise

Given a TFGen object, how do you generate a random, infinite list of Word32 objects?
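One answer to the exercise is Data.List.unfoldr. The helper below is generic over any function with the slide's next shape, so it can be demonstrated with a toy generator; randomList and toyNext are hypothetical names, and with random-tf you would pass its next and a TFGen instead:

```haskell
import Data.List (unfoldr)

-- An infinite pseudo-random list from any state-machine PRNG
-- with the slide's shape next :: g -> (g, a).
randomList :: (g -> (g, a)) -> g -> [a]
randomList next = unfoldr step
  where step s = let (s', x) = next s in Just (x, s')

-- A toy next for testing: a tiny LCG where state and output coincide.
toyNext :: Integer -> (Integer, Integer)
toyNext s = (s', s') where s' = (5 * s + 3) `mod` 16

main :: IO ()
main = print (take 8 (randomList toyNext 1))
```

Laziness is what makes the infinite list safe to build: only the elements you actually take are ever computed.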

SLIDE 16

Splitting a PRNG

  • 1. split :: TFGen -> (TFGen, TFGen)
  • 2. (g', newstate) = split g
  • 3. Use g' to generate the list
  • 4. newstate is your new state
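The four steps above can be sketched as follows; Gen, next' and split' are toy stand-ins (the split rule is made up for illustration and has none of TFGen's statistical guarantees):

```haskell
import Data.List (unfoldr)

type Gen = Integer

-- A tiny LCG standing in for TFGen's next.
next' :: Gen -> (Gen, Integer)
next' s = (s', s') where s' = (5 * s + 3) `mod` 16

-- A toy split, made up for illustration: real libraries such as
-- random-tf derive the two generators cryptographically.
split' :: Gen -> (Gen, Gen)
split' s = ((2 * s + 1) `mod` 16, (2 * s + 5) `mod` 16)

-- The pattern from the slide: split, use one half for an
-- infinite list, keep the other half as the new state.
randomListAndState :: Gen -> ([Integer], Gen)
randomListAndState g = (xs, newstate)
  where
    (g', newstate) = split' g
    xs = unfoldr (\s -> let (s', x) = next' s in Just (x, s')) g'

main :: IO ()
main = print (take 3 (fst (randomListAndState 1)))
```

The point of splitting is that consuming the infinite list never touches newstate, so later random draws stay independent of how much of the list is forced.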

SLIDES 17-18

Where do you get the initial state?

  • 1. Hardcode an arbitrary seed
  • 2. Use the initialisation functions in the library
      2.1 initTFGen
  • 3. Use a library which provides true random values, such as random-fu

SLIDE 19

Tuning parameters

  • 1. Distribution of random initial weights?
  • 2. β in the sigmoid function?
  • 3. Number of iterations?

SLIDE 20

Some guidelines

  • Weights: −1/√n ≤ w ≤ 1/√n, where n is the number of inputs to the layer
  • The weights should have similar magnitude
  • Small β, say β ≤ 3
      1. β = 1 is a good starting point
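The weight guideline can be sketched with a small LCG as the random source; initWeights and uniform01 are hypothetical helpers, not library functions:

```haskell
-- Draw n initial weights uniformly from [-1/sqrt n, 1/sqrt n],
-- following the guideline above.  The LCG source and all names
-- here are illustrative, not from any library.
lcgNext :: Integer -> Integer
lcgNext x = (1013904223 + 1664525 * x) `mod` (2 ^ 32)

-- Map an LCG state to a uniform value in [0, 1).
uniform01 :: Integer -> Double
uniform01 x = fromIntegral x / 2 ^ 32

-- n weights for a layer with n inputs, plus the final PRNG state.
initWeights :: Int -> Integer -> ([Double], Integer)
initWeights n s0 = (weights, sFinal)
  where
    states  = take n (tail (iterate lcgNext s0))
    sFinal  = if null states then s0 else last states
    bound   = 1 / sqrt (fromIntegral n)
    weights = [ (2 * uniform01 x - 1) * bound | x <- states ]

main :: IO ()
main = print (fst (initWeights 4 42))
```

Returning the final state alongside the weights follows the state-machine discipline from the earlier slides: the caller threads it on to the next random computation.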

SLIDE 21

Number of epochs

SLIDE 22

Exercise

  • Random starting weights
      1. initNeuron
      2. initNetwork
  • Test your network
  • Experiment by varying
      1. magnitude of initial weights
      2. β
      3. number of epochs