SLIDE 1
Pseudo-Random Number Generators
Functional Programming and Intelligent Algorithms Prof Hans Georg Schaathun Høgskolen i Ålesund 14th February 2017
SLIDE 4 Randomness
- 1. What is randomness?
- 2. How do we create probabilistic computer programs?
- 3. That is, how do we make the computer act at random?
SLIDE 7 Two options
True randomness uses physical sources of entropy:
- 1. /dev/random on many systems
- 2. random-fu in Haskell
Pseudo-random number generators (PRNGs) are deterministic but random-looking:
- 1. random, the standard package in Haskell
- 2. random-tf, a more recent Haskell package
SLIDE 8 Linear Congruential Generators
- xi = (a + c·xi−1) mod m, where x0 is a given seed
- Pseudo-random sequence [x0, x1, x2, . . .]
- A.k.a. Lehmer's algorithm
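The recurrence can be written directly in Haskell; the parameters a, c, m and the seed below are small illustration values, not recommended generator constants:

```haskell
-- Infinite pseudo-random sequence from the Lehmer recurrence
-- x_i = (a + c * x_{i-1}) mod m, starting from the seed x0.
lehmerSeq :: Integer -> Integer -> Integer -> Integer -> [Integer]
lehmerSeq a c m x0 = iterate step x0
  where step x = (a + c * x) `mod` m

main :: IO ()
main = print (take 6 (lehmerSeq 11 6 13 1))
```

Laziness makes the whole infinite sequence available; consumers simply `take` as many values as they need.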
SLIDE 9 Ciphers in counter mode
[Diagram: Alice sends ek(m) to Bob while Eve eavesdrops]
- xi = ek(i), where x0 is a given seed
- Pseudo-random sequence [x0, x1, x2, . . .]
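A counter-mode PRNG encrypts successive counter values. The runnable sketch below substitutes a trivial keyed mixing function for a real block cipher; `toyEncrypt`, its key, and the modulus are illustrative assumptions only and offer no security:

```haskell
-- Counter-mode PRNG sketch: x_i = e_k(i).
-- toyEncrypt stands in for a real block cipher; not secure.
toyEncrypt :: Integer -> Integer -> Integer
toyEncrypt k i = (k * i + 12345) `mod` 2147483647

-- Pseudo-random sequence by encrypting the counter i = seed, seed+1, ...
counterSeq :: Integer -> Integer -> [Integer]
counterSeq k seed = map (toyEncrypt k) [seed ..]

main :: IO ()
main = print (take 4 (counterSeq 1103515245 0))
```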
SLIDE 14 The PRNG is a state machine
[Diagram: states s1 → s2 → ... → s7, each "next" transition emitting an output x1, ..., x6]
- next :: State -> (State,Int)
- Lehmer: next s = (s',s') where s' = (a + c*s) `mod` m
- Cipher: next s = ((s + 1) `mod` m, encrypt k s)
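The state-machine view can be made concrete. A minimal sketch, using the Lehmer step with illustrative parameters, threads the state through next by explicit recursion to produce the output stream x1, x2, ...:

```haskell
type State = Integer

-- Lehmer step with illustrative parameters a = 11, c = 6, m = 13.
next :: State -> (State, Integer)
next s = (s', s')
  where s' = (11 + 6 * s) `mod` 13

-- Thread the state through next; laziness gives the full output stream.
outputs :: State -> [Integer]
outputs s = x : outputs s'
  where (s', x) = next s

main :: IO ()
main = print (take 5 (outputs 1))
```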
SLIDE 15 random-tf package
TFGen -> (TFGen,Word32)
Exercise
Given a TFGen object, how do you generate a random, infinite list?
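One way to answer the exercise: unfold the generator, emitting each output and carrying the new generator state forward. The sketch below uses a toy step function in place of TFGen's so it runs without the random-tf package; the unfoldr pattern carries over unchanged:

```haskell
import Data.List (unfoldr)

-- Toy stand-in for a TFGen-style step: generator -> (generator, output).
step :: Integer -> (Integer, Integer)
step g = (g', g') where g' = (11 + 6 * g) `mod` 13

-- Infinite pseudo-random list: emit each output, thread the generator.
randomList :: Integer -> [Integer]
randomList = unfoldr (\g -> let (g', x) = step g in Just (x, g'))

main :: IO ()
main = print (take 5 (randomList 1))
```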
SLIDE 16 Splitting a PRNG
split :: TFGen -> (TFGen,TFGen)
- 1. (g',newstate) = split g
- 2. Use g' to generate the list
- 3. newstate is your new state
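The recipe above can be sketched end to end. Here `splitToy` is a crude stand-in for TFGen's split, which derives two independent generators cryptographically; this toy version merely perturbs the state two different ways, for illustration only:

```haskell
-- Toy stand-in for split :: TFGen -> (TFGen, TFGen).
splitToy :: Integer -> (Integer, Integer)
splitToy g = ((2 * g + 1) `mod` 13, (3 * g + 7) `mod` 13)

-- Toy generator step, as on the earlier slides.
step :: Integer -> (Integer, Integer)
step g = (g', g') where g' = (11 + 6 * g) `mod` 13

infiniteList :: Integer -> [Integer]
infiniteList g = x : infiniteList g' where (g', x) = step g

-- The slide's recipe: split, use one half for the infinite list,
-- keep the other half as the new state for later use.
listAndNewState :: Integer -> ([Integer], Integer)
listAndNewState g = (infiniteList g', newstate)
  where (g', newstate) = splitToy g

main :: IO ()
main = print (take 3 (fst (listAndNewState 2)))
```

Splitting matters because the infinite list consumes one generator forever; without split you would have no untouched state left for the next random computation.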
SLIDE 18 Where do you get the initial state?
- 1. Hardcode an arbitrary seed
- 2. Use initialisation functions in the library
  - 2.1 initTFGen
- 3. Use a library which provides true random values
  - e.g. random-fu
SLIDE 19 Tuning parameters
- 1. Distribution of random initial weights?
- 2. β in the sigmoid function?
- 3. Number of iterations?
SLIDE 20 Some guidelines
— Weights: −1/√n ≤ w ≤ 1/√n
- where n is the number of inputs to the layer
- the weights should have similar magnitude
— Small β: β ≤ 3
- 1. β = 1 is a good starting point
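The weight guideline ties back to the PRNG material: draw uniform values and scale them into [−1/√n, 1/√n]. The sketch below reuses a toy Lehmer generator so it runs with no extra packages; the constants are illustrative, not recommended:

```haskell
-- Toy Lehmer generator; m and the multipliers are illustration values.
m :: Integer
m = 2147483647

step :: Integer -> (Integer, Integer)
step s = (s', s') where s' = (12345 + 1103515245 * s) `mod` m

-- n random initial weights in [-1/sqrt n, 1/sqrt n] for n inputs.
initWeights :: Int -> Integer -> [Double]
initWeights n seed = take n (go seed)
  where
    bound   = 1 / sqrt (fromIntegral n)
    scale x = (2 * fromIntegral x / fromIntegral m - 1) * bound
    go s    = let (s', x) = step s in scale x : go s'

main :: IO ()
main = print (initWeights 3 42)
```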
SLIDE 21
Number of epochs
SLIDE 22 Exercise
— Random starting weights
- 1. initNeuron
- 2. initNetwork
— Test your network
— Experiment by varying
- 1. magnitude of initial weights
- 2. β
- 3. number of epochs
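One possible shape for initNeuron (the name comes from the slide; the Neuron type, the signature, and the generator threading are assumptions for illustration, with a toy Lehmer step standing in for TFGen):

```haskell
-- Hypothetical representation: a neuron is its list of weights.
type Neuron = [Double]

-- Toy generator step yielding a value in (-1, 1); illustrative constants.
step :: Integer -> (Integer, Double)
step s = (s', 2 * fromIntegral s' / fromIntegral m - 1)
  where m  = 2147483647 :: Integer
        s' = (12345 + 1103515245 * s) `mod` m

-- A neuron with n inputs: n weights in [-1/sqrt n, 1/sqrt n].
-- Returns the advanced generator so the caller can keep drawing.
initNeuron :: Int -> Integer -> (Integer, Neuron)
initNeuron n g0 = go n g0 []
  where
    bound = 1 / sqrt (fromIntegral n)
    go 0 g ws = (g, ws)
    go k g ws = let (g', x) = step g in go (k - 1) g' (bound * x : ws)

main :: IO ()
main = print (snd (initNeuron 4 1))
```

An initNetwork along these lines would thread the generator through repeated initNeuron calls, one per neuron per layer, returning the final generator alongside the network.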