Introduction: Signals, Noise, Energy and Linearity, by Erol Seke (PowerPoint presentation)


SLIDE 1

Introduction

Signals, Noise, Energy and Linearity

by Erol Seke For the course “Communications”

OSMANGAZI UNIVERSITY

SLIDE 2

The Goal

Transfer information from the source point to the destination point correctly (and, in most cases, using the least amount of resources)

[Block diagram: Information Generator (source point) → Information Channel, with Noise Sources → Information User (destination point)]

SLIDE 3

An Example Communication System

Some other examples of (electronic) source, channel and destination:

  • Microphone – twisted pair of wires – amplifier
  • Modem – twisted-pair telephone line – modem
  • Radio transmitter – air – radio receiver
  • Computer – Ethernet cable – computer
  • Computer's data storage medium – fiber-optic network – another computer's storage
  • Digital data generator – magnetic disk – digital data user
  • Fax scanner – telephone system – fax printer
  • Digital TV data from satellite – atmosphere – digital TV receiver
  • TV remote controller – air – IR sensor/receiver on TV

SLIDE 4

An Example Digital Communication System

There are only 4 messages/values to be sent

(the analog system had an infinite number of values)

SLIDE 5

Analog / Digital Electronic Communication

[Block diagram: Transmitter → Channel (+ Noise) → Receiver, with the transmitted signal shown for the analog and digital cases]

Analog communication: all of the signal's values are important at every point, and the signal cannot be completely repaired when damaged (infinite number of possible values).

Digital communication: a finite number of symbols, represented by a finite number of waveforms within a symbol period Ts, processed with analog or digital circuits.

SLIDE 6

Digital Communication (against analog communication)

Advantages:

  • Mathematical/logical processing on the data is possible
  • Therefore: higher protection against noise
  • More flexible when performed using reconfigurable / reprogrammable elements
  • ?

Disadvantages:

  • Complexity is higher
  • Higher-speed devices are required
  • Analog signals need to be converted with an ADC and converted back with a DAC
  • ?

SLIDE 7

An Advantage of Digital Communication

The signal can be restored and resent halfway between the transmitter and the receiver.

[Block diagram: transmitted signal → Channel (noise N1(t)) → Repeater → Channel (noise N2(t)) → received signal r(t)]

SLIDE 8

An Advantage of Digital Communication

So that the signal is received with minimum (or no) error

SLIDE 9

General Communication System

SLIDE 10

Various Signals

[Plots of various signals versus t: triangular pulse, rectangular periodic signal, square wave, random signal (noise), discrete semi-periodic signal, BPSK, samples]

SLIDE 11

Energy of a Signal

The energy of a signal x(t) is defined as the energy dissipated in a 1 Ohm load:

E_x = ∫_{-∞}^{∞} x²(t) dt

Its unit is Volt²·second/Ohm = Joule
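The energy definition above can be checked numerically. A minimal NumPy sketch (the amplitude A = 2 and duration T = 3 are arbitrary example values, not from the slides):

```python
import numpy as np

# Numerical check of E_x = ∫ x²(t) dt for a rectangular pulse of
# amplitude A and duration T (A = 2, T = 3 are arbitrary choices).
A, T = 2.0, 3.0
dt = 1e-4
t = np.arange(-1.0, 5.0, dt)                 # time grid covering the pulse
x = np.where((t >= 0.0) & (t <= T), A, 0.0)  # rectangular pulse

E = np.sum(x**2) * dt                        # Riemann sum, ≈ A²·T = 12 Joules
```

The Riemann sum approximates the integral; refining dt brings E arbitrarily close to A²T.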

SLIDE 12

Example

Energy of a rectangular pulse x(t) = A for t₀ ≤ t ≤ t₀ + T (0 otherwise):

E_x = ∫_{-∞}^{∞} x²(t) dt = ∫_{t₀}^{t₀+T} A² dt = A² t |_{t₀}^{t₀+T} = A²T

The energy is independent of the pulse's position on the time axis (valid for all signals).

SLIDE 13

Example

Let x(t) be a rectangular pulse of width T and height A = 1/T, so that its area is AT = 1. As T shrinks, the height 1/T grows while the area stays 1:

lim_{T→0} x(t) = δ(t), the unit impulse

SLIDE 14

Power of a Signal

If the energy is infinite, then we talk about the energy spent in unit time (the power):

P_x = lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} x²(t) dt

For periodic signals with period T:

P_x = (1/T) ∫_T x²(t) dt

Its unit is Watts

SLIDE 15

Example

Find the power of the sawtooth signal s(t) = (A/T)t for 0 ≤ t ≤ T, repeated with period T:

P_s = (1/T) ∫_T s²(t) dt = (1/T) ∫_0^T (A/T)² t² dt = (A²/T³) · t³/3 |_0^T

P_s = A²/3

The power is independent of the period / frequency.
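The independence from the period can be seen numerically. A small sketch (A = 3 and the three period values are arbitrary choices):

```python
import numpy as np

# The sawtooth s(t) = (A/T)·t over one period has power A²/3
# regardless of the period T (A = 3 and the T values are arbitrary).
A = 3.0
powers = []
for T in (0.5, 1.0, 10.0):
    dt = T / 200_000
    t = np.arange(0.0, T, dt)
    s = A * t / T
    powers.append(np.sum(s**2) * dt / T)  # (1/T)·∫ s² dt ≈ A²/3 = 3
```

All three periods yield (numerically) the same power A²/3.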

SLIDE 16

Example

Find the power and energy of the waveform y(t) = cos(8πt).

Averaging over T = 1/8 (the period of cos²(8πt)):

P = (1/T) ∫_{-T/2}^{T/2} y²(t) dt = 8 ∫_{-1/16}^{1/16} cos²(8πt) dt = 8 [ t/2 + sin(16πt)/(32π) ]_{-1/16}^{1/16} = 1/2

Since P = 1/2 < ∞, it is a power signal. Therefore it is not an energy signal, so E → ∞ (verify!).
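The "verify!" can be done numerically. A sketch for a generic sinusoid (the amplitude A = 1 and frequency f = 8 used here are example values): its power over one period is A²/2, while its energy keeps growing with the observation window, so it is a power signal and not an energy signal.

```python
import numpy as np

# A sinusoid A·cos(2πft) has power A²/2 over one period, while its
# energy grows with the observation window (A = 1, f = 8 as examples).
A, f = 1.0, 8.0
T = 1.0 / f                        # one period of the cosine
dt = 1e-5
t = np.arange(0.0, T, dt)
y = A * np.cos(2 * np.pi * f * t)

P = np.sum(y**2) * dt / T          # ≈ A²/2 = 0.5

def energy(duration):
    tt = np.arange(0.0, duration, dt)
    yy = A * np.cos(2 * np.pi * f * tt)
    return np.sum(yy**2) * dt

E_1, E_10 = energy(1.0), energy(10.0)  # energy keeps growing with the window
```

E over a 10 s window is ten times E over a 1 s window: the total energy diverges.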

SLIDE 17

Average / Expected Values

X is a random process, x is its generated values (we will define random processes later). Expected value of a random process:

E{X} = ∫_{-∞}^{∞} x f(x) dx

Average value of a continuous signal:

m_x = lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} x(t) dt

Average value of discrete samples:

x_avg = (1/N) Σ_{i=1}^{N} x_i

Average and expected values are equal when the observation duration (or number of samples) is infinite.
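The convergence of the sample average to the expected value can be illustrated with a quick sketch (a uniform random variable on [0, 1), whose expected value is 0.5, is used as an arbitrary example):

```python
import numpy as np

# The sample average approaches the expected value as the number of
# samples grows (uniform r.v. on [0, 1), expected value 0.5).
rng = np.random.default_rng(0)

avg_10 = rng.random(10).mean()         # short observation: noisy
avg_1M = rng.random(1_000_000).mean()  # long observation: close to 0.5
```

With only 10 samples the average can be far from 0.5; with a million samples it is within about a tenth of a percent.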

SLIDE 18

Example

x(t) = sin(2πt/T) and y(t) = cos(2πt/T)

E_x = ∫_{-∞}^{∞} x²(t) dt → ∞,  E_y = ∫_{-∞}^{∞} y²(t) dt → ∞

The energies are both infinite (periodic signals).

P_x = (1/T) ∫_T sin²(2πt/T) dt = 1/2,  P_y = (1/T) ∫_T cos²(2πt/T) dt = 1/2

The powers are the same.

m_x = lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} sin(2πt/T) dt = 0 = m_y

The averages are the same. They are both sinusoids; question: what is different?

SLIDE 19

Similarity / Dissimilarity Measure

The similarity of signals is measured using an inner product:

⟨x(t), y(t)⟩ = ∫_{-∞}^{∞} x(t) y(t) dt

Example:

⟨x(t), y(t)⟩ = ∫_T sin(2πt/T) cos(2πt/T) dt = 0 !

Does this mean these two are dissimilar? We need to check the similarity for shifted versions of the signals too. It is also obvious that the integral, except for finite-duration signals, will be infinite; therefore we need some kind of normalization.

SLIDE 20

Cross-Correlation

The similarity of shifted versions of signals is called cross-correlation, where τ represents the time shift:

R_xy(τ) = ∫_{-∞}^{∞} x(t) y(t - τ) dt

For periodic signals:

R_xy(τ) = ∫_T x(t) y(t - τ) dt

Normalized: ρ_xy(τ) = R_xy(τ) / R_max

[Plot: the integration interval for τ = 0 and for τ = T]
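The sin/cos example can be revisited with cross-correlation; a small sketch using a circular shift for the periodic case (T = 1 and N = 1000 samples are arbitrary choices):

```python
import numpy as np

# Circular cross-correlation of one period of sin and cos:
# zero at τ = 0, maximal at a quarter-period shift.
T, N = 1.0, 1000
dt = T / N
t = np.arange(N) * dt
x = np.sin(2 * np.pi * t / T)
y = np.cos(2 * np.pi * t / T)

# R_xy(τ) = Σ x(t)·y(t - τ)·dt with a circular shift (periodic signals)
R = np.array([np.sum(x * np.roll(y, k)) * dt for k in range(N)])

R0 = R[0]                    # ≈ 0: orthogonal at zero shift
tau_max = np.argmax(R) * dt  # ≈ T/4: quarter-period shift
R_max = R.max()              # ≈ T/2 = 0.5 here
```

At τ = 0 the signals look dissimilar (inner product 0), yet a quarter-period shift makes them identical in shape: exactly the point of checking shifted versions.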

SLIDE 21

Example

T_x = T_y = T. Since our signals are periodic, we can select an integration interval of T:

R_xy(τ) = ∫_0^T sin(2πt/T) cos(2π(t - τ)/T) dt = (T/2) sin(2πτ/T)

[Plot: R_xy(τ) for -2T ≤ τ ≤ 2T]

The signals are similar to each other on periodic intervals.

SLIDE 22

Autocorrelation

If both signals are the same, y(t) = x(t), the similarity is named autocorrelation:

R_xx(τ) = ∫_{-∞}^{∞} x(t) x(t - τ) dt

[Plot: R_xx(τ) for -2T ≤ τ ≤ 2T]

The signal x is similar to itself on periodic intervals.

SLIDE 23

Example

Let r(t) = 1 for 0 ≤ t ≤ T (0 otherwise), and x(t) = r(t). Then

R_xy(0) = ∫_0^T 1 dt = T

R_xy(τ) = T - |τ| for |τ| ≤ T, and R_xy(τ) = 0 for |τ| > T

[Plot: a triangle of height T at τ = 0, reaching zero at τ = ±T]

SLIDE 24

Example

Let r(t) = t for 0 ≤ t ≤ T (0 otherwise), and x(t) = r(t). Then

R_xy(τ) = ∫ x(t) r(t - τ) dt over the overlapping interval

R_xy(0) = ∫_0^T t² dt = T³/3

R_xy(τ) = 0 for |τ| > T

[Plot: peak value T³/3 at τ = 0, zero at τ = ±T]

SLIDE 25

Orthogonal Signals

The inner product also tells us whether the signals are orthogonal:

⟨x(t), y(t)⟩ = ∫_{-∞}^{∞} x(t) y(t) dt

If ⟨x(t), y(t)⟩ = 0, then x(t) and y(t) are orthogonal, meaning that y(t) does not have any component of x(t) within it.

Example: ⟨x(t), y(t)⟩ = ∫_T sin(2πt/T) cos(2πt/T) dt = 0

(shifted versions of the signals may not be orthogonal)

SLIDE 26

Example

x(t) = u(t) - u(t - T/2) and y(t) = u(t - T/2) - u(t - T), two non-overlapping rectangular pulses of height 1:

⟨x(t), y(t)⟩ = ∫_0^T (u(t) - u(t - T/2)) (u(t - T/2) - u(t - T)) dt = 0

Given a set of waveforms x_i(t), we can find an orthogonal waveform set φ_k(t) so that

x_i(t) = Σ_{k=1}^{M} c_{ik} φ_k(t)

i.e., each x_i(t) can be written as a weighted linear sum of the φ_k(t).

Hmw: study this subject (orthogonalization) from the referenced sources.
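One way to build such an orthogonal set is Gram-Schmidt orthogonalization; the slides leave it as homework, so this is only a minimal sketch on sampled waveforms (the random signals here are arbitrary stand-ins for a waveform set):

```python
import numpy as np

# Gram-Schmidt sketch on sampled waveforms: given linearly independent
# signals x_i, build an orthonormal set phi_k, then reconstruct each
# x_i as a weighted sum of the phi_k.
rng = np.random.default_rng(1)
X = rng.standard_normal((3, 100))                # three example waveforms

phis = []
for x in X:
    v = x - sum(np.dot(x, p) * p for p in phis)  # remove earlier components
    phis.append(v / np.linalg.norm(v))           # normalize
Phi = np.array(phis)

# Coefficients c_ik = <x_i, phi_k>; reconstruction x_i = Σ_k c_ik · phi_k
C = X @ Phi.T
X_rec = C @ Phi

err = np.abs(X - X_rec).max()                    # ≈ 0: exact reconstruction
ortho = np.abs(Phi @ Phi.T - np.eye(3)).max()    # ≈ 0: orthonormal set
```

Each original waveform is recovered exactly from its coefficients, and the φ_k are mutually orthogonal with unit norm.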

SLIDE 27

Die throwing experiment

Random variable: an event or value which is measured.

x: the value read on the die after a throw (random variable)
p_i: the probability of x, i.e. p(x = i)

p_i = 1/6 for i = 1, …, 6, so Σ_{i=1}^{6} p_i = 1

[Plot: p_i = 1/6 for x = 1, 2, 3, 4, 5, 6]

SLIDE 28

Expected Value of a Discrete Experiment

E(X) = Σ_i x_i p(x_i)

The name "expected value" does not imply that it is expected to happen: the expected value of the die throwing experiment is 3.5, which will never happen.

[Plot: p_i = 1/6 for x = 1, …, 6]

This graph is called the Probability Mass Function (pmf)
slide-29
SLIDE 29

Probability Density Function

( ) 1.0 f x dx

 

2 1

1 2

( ) ( )

x x

P x x x f x dx    

if this is a "total of something" then this is a "density" shaded area gives the probability of

1 2

( ) P x x x  

SLIDE 30

Example

Let f(x) = c(x - 1) for 1 ≤ x ≤ 5 (0 otherwise). Find c from ∫ f(x) dx = 1.0:

∫_1^5 c(x - 1) dx = c (x - 1)²/2 |_1^5 = 8c = 1  →  c = 1/8

E(X) = ∫ x f(x) dx = (1/8) ∫_1^5 x(x - 1) dx = (1/8) [x³/3 - x²/2]_1^5 = 11/3 ≈ 3.67

p(1 ≤ x ≤ 2) = (1/8) ∫_1^2 (x - 1) dx = 1/16

p(4 ≤ x ≤ 5) = (1/8) ∫_4^5 (x - 1) dx = 7/16

Q: if x is a periodic function, what are the possibilities of x(t)?
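The numbers in this example can be cross-checked numerically; this sketch assumes the pdf of the example is f(x) = (x - 1)/8 on [1, 5] (0 elsewhere), which matches the values 3.67, 1/16 and 7/16 above:

```python
import numpy as np

# Numerical check of the pdf example, assuming
# f(x) = (x - 1)/8 on [1, 5] and 0 elsewhere.
dx = 1e-5
x = np.arange(1.0, 5.0, dx)
f = (x - 1.0) / 8.0

total = np.sum(f) * dx                 # ≈ 1.0 (valid pdf)
EX = np.sum(x * f) * dx                # ≈ 11/3 ≈ 3.67
p12 = np.sum(f[x < 2.0]) * dx          # p(1 ≤ x ≤ 2) ≈ 1/16
p45 = np.sum(f[x >= 4.0]) * dx         # p(4 ≤ x ≤ 5) ≈ 7/16
```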

SLIDE 31

Cumulative Distribution Function

F(x) = ∫_{-∞}^{x} f(u) du

so that

P(x₁ ≤ x ≤ x₂) = F(x₂) - F(x₁)

[Plot: F(x) rising from 0 to 1 between x = 1 and x = 5; the step F(x₂) - F(x₁) gives P(x₁ ≤ x ≤ x₂)]

SLIDE 32

Well-Known Distributions

Uniform pdf; Gaussian (normal) pdf:

f(x) = (1 / (σ√(2π))) e^{-(x - m)² / (2σ²)}

σ² = E((x - m)²),  σ: standard deviation

SLIDE 33

Histogram

Histogram: a graph showing the number of occurrences of a random variable within each bin, for a random experiment repeated N times. The histogram shows the distribution of the measured results; the pdf is the expected distribution (the experiment may not even have been performed yet). For a large number of experiments, the histogram becomes a representation of the pdf. For the given signal, note the resemblance to a Gaussian r.v.

SLIDE 34

The pdf shows only the probabilistic distribution, not the time function itself. For example, the pdf of the periodic function below looks like a Gaussian. There may be an infinite number of functions that have the same pdf.

[Plot: x(t) versus t]

SLIDE 35

Noise

Any signal other than our structured signal that is added onto it, intentionally or unintentionally, is categorized as noise.

Noise Sources

  • Electronic / Thermal noise
  • Electrical discharges in the atmosphere / nearby devices
  • Interference / Crosstalk between channels and multipath effects
  • Solar / Cosmic effects
  • Distortion from nonlinearities of the electronics / media

(quantization noise and granular noise are evaluated in different contexts)

[Plot: channel input and channel output waveforms]

SLIDE 36

Autocorrelation of White Noise

R_xx(τ) = ∫ x(t) x(t + τ) dt = (N/2) δ(τ)

The pdf might be Gaussian. The power spectral density is flat:

S_x(f) = N/2  (the spectrum is white)

Since R_xx(τ) = (N/2) δ(τ), the noise is uncorrelated (for τ ≠ 0).
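The impulse-like autocorrelation is easy to see on generated samples (unit-variance Gaussian noise and the chosen lags are arbitrary examples):

```python
import numpy as np

# Estimated autocorrelation of white noise: a spike at lag 0
# (the variance) and nearly zero at every other lag,
# i.e. the samples are uncorrelated.
rng = np.random.default_rng(3)
n = rng.standard_normal(100_000)

R0 = np.mean(n * n)              # lag 0:  ≈ variance = 1
R1 = np.mean(n[:-1] * n[1:])     # lag 1:  ≈ 0
R10 = np.mean(n[:-10] * n[10:])  # lag 10: ≈ 0
```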

SLIDE 37

Noise is usually assumed to be AWGN (this assumption is not baseless):

  • Additive: r(t) = s(t) + N(t)
  • White: |F(N)| = c (has the same power at all frequencies)
  • Gaussian: the probability distribution function is Gaussian

[Block diagram: s(t) → Channel h(t) → + N(t) → r(t)]

SLIDE 38

Impulse Response of a System

δ(t): the unit impulse function. When the input is δ(t), the output of the system is h(t), its impulse response.

∫_{-∞}^{∞} δ(t) dt = 1

E = ∫_{-∞}^{∞} δ²(t) dt → ∞

∫_{-∞}^{∞} δ(t) x(t) dt = x(0)

∫_{-∞}^{∞} δ(t - λ) x(t) dt = x(λ), where λ is the position of the impulse

As a result:

y(t) = ∫_{-∞}^{∞} δ(λ) h(t - λ) dλ = h(t)

SLIDE 39

Convolution

y(t) = ∫_{-∞}^{∞} x(λ) h(t - λ) dλ

Think of x(t) as an infinite sum of weighted impulses:

x(t) = ∫_{-∞}^{∞} x(λ) δ(t - λ) dλ

The output of the system will be an infinite sum of responses to each weighted impulse. For this infinite summation to hold, the system must be a linear system.
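The discrete counterpart of this integral is `np.convolve`; a minimal sketch (the impulse response values are arbitrary):

```python
import numpy as np

# Discrete convolution y = x * h: feeding a unit impulse through a
# system with impulse response h reproduces h at the output.
h = np.array([0.5, 0.3, 0.2])       # example impulse response
x = np.array([1.0, 0.0, 0.0, 0.0])  # discrete unit impulse input

y = np.convolve(x, h)               # output: h followed by zeros
```

This is the discrete version of "when the input is δ(t), the output is h(t)".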

SLIDE 40

Linearity

If z(t) = x(t) + y(t), then

u(t) = ∫_{-∞}^{∞} z(λ) h(t - λ) dλ = ∫_{-∞}^{∞} x(λ) h(t - λ) dλ + ∫_{-∞}^{∞} y(λ) h(t - λ) dλ

The response to a sum of inputs is the sum of the individual responses; h(t) is then a Linear Time-Invariant (LTI) system.

Time invariant: the system h(t) does not change by time (a shifted input produces the same output, equally shifted).

[Diagrams: inputs x(t), y(t), z(t) = x(t) + y(t) and their outputs through h(t)]
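Both LTI properties can be checked numerically on a discrete convolution (the random h, x, y are arbitrary stand-ins):

```python
import numpy as np

# Numerical check of the two LTI properties for convolution with h:
# superposition and time invariance (h, x, y are random examples).
rng = np.random.default_rng(4)
h = rng.standard_normal(5)
x = rng.standard_normal(20)
y = rng.standard_normal(20)

def system(sig):
    return np.convolve(sig, h)

# Linearity: response to x + y equals the sum of the responses
sup_err = np.abs(system(x + y) - (system(x) + system(y))).max()

# Time invariance: delaying the input by 3 samples delays the output
x_delayed = np.concatenate([np.zeros(3), x])
ti_err = np.abs(system(x_delayed)[3:] - system(x)).max()
```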
slide-41
SLIDE 41

Distortion caused by limited bandwidth additive noise

Signal + Noise

SLIDE 42

[Plots: Signal, Noise, and Signal + Noise versus t, with the corresponding pmf/pdf of each]

SLIDE 43

END