Measuring Image Quality - Heath Nielson - PowerPoint PPT Presentation



SLIDE 1

Measuring Image Quality

Heath Nielson


SLIDE 2

Measuring Quality

What

  • Identify and measure attributes of an image that can be used to determine whether the perceived quality meets the expectations of the organization.


SLIDE 3

Measuring Quality

Why

  • Should be part of any document processing system
  • Guarantee consistency
  • Useful for identifying upstream process problems
    – Manual
    – Automated


SLIDE 4

Measuring Quality

How

  • Subjective
    – Easy to do
    – Not always predictable or consistent
  • Objective
    – Predictable and consistent
    – Hard to measure


SLIDE 5

Audit

Brute Force


SLIDE 6

Audit

Exception Based


SLIDE 7

Contributing Factors Affecting Quality

  • State of original document
  • Digitization
    – Resolution
    – Lighting
    – Exposure
    – Focus
  • Post-processing
    – Rotation
    – Cropping
    – Contrast enhancement
    – Lossy compression


SLIDE 8

Quality Standards

DIRT (Digital Image Research Team)

  • Composed of operational and development personnel
  • Identify image attributes affecting quality
  • Provide, where possible, metrics to measure those attributes
  • Determine acceptable ranges for attributes
  • Provide tools and training to facilitate consistent quality


SLIDE 9

Quality Standards

DIRT Specification

  • Defines image attributes and desirable values for each
    – Tonal Range
    – Tonal Resolution
    – Even Exposure
    – Spatial Resolution
    – Contrast
    – Colorspace
    – Focus
    – Blur
    – File format
    – File name
    – Dimensions
    – Size
    – Complete Capture
    – Orientation
    – Skew
    – Fixity


SLIDE 10

Subjective Evaluation

  • Direct Numerical Category Scaling
    – Subjects classify images into a number of categories
    – Usually use a numerical scale, e.g. (1=Bad, 5=Good)
    – Subjects tend to use separate internal scales
      • Different "types" of images
      • Different types of distortion
  • Functional Measurement Theory
    – Compares image qualities
    – Subjects indicate which image is preferred
    – More evaluations required
      • Each sampled image must be compared with every other sampled image
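The quadratic cost of the pairwise approach above is easy to quantify: n sampled images require n(n-1)/2 preference judgments. A minimal illustration (the function name is ours, not from the slides):

```python
from itertools import combinations

def pairwise_comparisons(n_images):
    """Number of pairwise preference judgments needed when every
    sampled image is compared with every other sampled image."""
    return len(list(combinations(range(n_images), 2)))

# 10 images already need 45 judgments; 100 images need 4950.
```

This is why the deck notes that Functional Measurement Theory needs "more evaluations" than direct category scaling, which requires only one judgment per image.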


SLIDE 11

Subjective Evaluation

JPEG Compression

  • Sample images
    – Randomly selected
    – Includes images from both scanned microfilm and camera capture
    – Each image compressed at several predetermined settings
    – The original, uncompressed image is also included
  • Images were presented randomly
  • About 10% of the time a previously evaluated image is presented for reevaluation
  • Each image was evaluated by 3 different subjects

SLIDE 12

Subjective Evaluation

JPEG Compression

  • Direct category scaling method
    – Asked to classify images on a scale of 1-5
  • Zoom image 1-100%
  • Pan around
  • No time limit
  • No calibration of monitors or ambient light

SLIDE 13

Subjective Evaluation

JPEG Compression


SLIDE 14

Subjective Evaluation

Consistency

[Chart: percent of users by rating difference (-4 to +4) between first and second evaluation of reevaluated images]


SLIDE 15

Subjective Evaluation

Raw scores

[Chart: number of images per quality rating, 1 (Worst) to 5 (Best), for JPEG settings 5, 20, 35, 80 and the original]


SLIDE 16

Objective Measures

  • No-Reference
    – No reference image available
    – "Blind" reference
  • Reduced-Reference
    – A set of features extracted from the reference image is used
  • Full-Reference
    – A complete reference image is available


SLIDE 17

MSE

  • Measures how much something changed but not how important that change is
  • Ranges from 0 (exactly the same) to infinity
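The two bullets above can be made concrete with a short sketch of mean squared error over two same-sized grayscale images, using NumPy (the function name `mse` is ours, not from the slides):

```python
import numpy as np

def mse(reference, distorted):
    """Mean squared error between two same-sized grayscale images.

    0 means the images are identical; larger values mean more change,
    with no upper bound. Note that MSE weights all pixel differences
    equally, regardless of how visible they are.
    """
    ref = np.asarray(reference, dtype=np.float64)
    dist = np.asarray(distorted, dtype=np.float64)
    return float(np.mean((ref - dist) ** 2))
```

Because every pixel difference counts the same, a visually harmless global brightness shift can score worse than a visually obvious local artifact, which is the weakness the slide points at.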

SLIDE 18

MSE

  JPEG Quality    5       35      80
  MSE             111.2   15.1    4.6


SLIDE 19

MSE

[Chart: number of images per MSE bin (0-240) for JPEG settings 5, 20, 35, 80]


SLIDE 20

MSE vs. User Evaluation

[Chart: number of images per MSE bin (0-240), grouped by subjective rating 1-5]


SLIDE 21

PSNR

  • Ratio between the maximum possible power and the power of corrupting noise introduced by compression
  • Measured using the logarithmic decibel scale
  • Higher values, better quality
  • Typical values 30-50 dB
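A minimal sketch of PSNR in terms of MSE, assuming 8-bit pixel values (the function name and the `max_value` parameter are ours, not from the slides):

```python
import numpy as np

def psnr(reference, distorted, max_value=255.0):
    """Peak signal-to-noise ratio in decibels: 10 * log10(MAX^2 / MSE).

    Higher is better; identical images give infinity. max_value is the
    maximum possible pixel value (255 for 8-bit images).
    """
    ref = np.asarray(reference, dtype=np.float64)
    dist = np.asarray(distorted, dtype=np.float64)
    mse = np.mean((ref - dist) ** 2)
    if mse == 0:
        return float("inf")  # no noise at all
    return float(10.0 * np.log10(max_value ** 2 / mse))
```

Since PSNR is a fixed logarithmic transform of MSE, it ranks images the same way MSE does; the decibel scale just makes typical values (30-50 dB) easier to compare across images.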


SLIDE 22

PSNR

  JPEG Quality    5      35     80
  PSNR (dB)       27.7   36.4   41.5


SLIDE 23

PSNR

[Chart: number of images per PSNR bin (0-60 dB) for JPEG settings 5, 20, 35, 80]


SLIDE 24

PSNR vs. User Evaluation

[Chart: number of images per PSNR bin (0-60 dB), grouped by subjective rating 1-5]


SLIDE 25

Universal Quality Index

  • Proposed by Wang and Bovik (2002)
  • Attempts to measure:
    – Loss of correlation
    – Luminance distortion
    – Contrast distortion
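A sketch of the index under a simplifying assumption: Wang and Bovik compute it over small sliding windows and average the results, while this version computes a single global value over the whole image (it is undefined for constant images, where the denominator is zero):

```python
import numpy as np

def universal_quality_index(reference, distorted):
    """Universal Quality Index of Wang and Bovik, computed globally
    rather than over the sliding windows used in the original paper.

    Q = 4 * cov(x, y) * mean(x) * mean(y)
        / ((var(x) + var(y)) * (mean(x)^2 + mean(y)^2))

    combines loss of correlation, luminance distortion and contrast
    distortion into one value in [-1, 1]; 1 means identical images.
    """
    x = np.asarray(reference, dtype=np.float64).ravel()
    y = np.asarray(distorted, dtype=np.float64).ravel()
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()           # contrast terms
    cov = np.mean((x - mx) * (y - my))  # correlation term
    return float(4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2)))
```

Unlike MSE and PSNR, the index is bounded, which makes values such as those on the next slide directly comparable across images.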


SLIDE 26

Universal Quality Index

  JPEG Quality              5       35      80
  Universal Quality Index   0.428   0.821   0.943


SLIDE 27

Universal Quality Index vs. JPEG

[Chart: number of images per Universal Quality Index bin (0.00-1.00) for JPEG settings 5, 20, 35, 80]


SLIDE 28

UQI vs. User Evaluation

[Chart: number of images per Universal Quality Index bin (0.00-1.00), grouped by subjective rating 1-5]


SLIDE 29

Conclusion

  • Refine subjective results
  • Correlate subjective results to objective measures
    – Evaluate other published quality metrics
    – Define our own