Lecture 9: Perceptual Image Quality Assessment
Lin ZHANG, PhD
School of Software Engineering, Tongji University
Fall 2016

1. Structural Similarity (SSIM)—Computation
• For two corresponding local patches x and y in two images, SSIM compares:
  • Luminance: l(x, y)
  • Contrast: c(x, y)
  • Structure: s(x, y)
  and then combines the three measures.
• Assume that x and y are vectorized as x = [x_1, x_2, ..., x_N] and y = [y_1, y_2, ..., y_N]
• μ_x = (1/N) Σ_{i=1}^{N} x_i is the mean intensity of x (similarly μ_y for y)
• σ_x = [ (1/(N−1)) Σ_{i=1}^{N} (x_i − μ_x)² ]^{1/2} is the standard deviation of x (similarly σ_y)
• σ_xy = (1/(N−1)) Σ_{i=1}^{N} (x_i − μ_x)(y_i − μ_y) is the covariance of x and y
Lin ZHANG, SSE, 2016

2. Structural Similarity (SSIM)—Computation
• l(x, y) = (2 μ_x μ_y + C_1) / (μ_x² + μ_y² + C_1)
• c(x, y) = (2 σ_x σ_y + C_2) / (σ_x² + σ_y² + C_2)
• s(x, y) = (σ_xy + C_3) / (σ_x σ_y + C_3)
• C_1, C_2, C_3 are fixed constants, and usually C_3 = C_2 / 2
• Then, the structural similarity between x and y is defined as
  SSIM(x, y) = l(x, y) · c(x, y) · s(x, y) = [(2 μ_x μ_y + C_1)(2 σ_xy + C_2)] / [(μ_x² + μ_y² + C_1)(σ_x² + σ_y² + C_2)]
• If the image contains M local patches (defined by a sliding window), the overall image quality is
  SSIM = (1/M) Σ_{i=1}^{M} SSIM(x_i, y_i)
Lin ZHANG, SSE, 2016

3. Structural Similarity (SSIM)—Computation
SSIM(x, y) = [(2 μ_x μ_y + C_1)(2 σ_xy + C_2)] / [(μ_x² + μ_y² + C_1)(σ_x² + σ_y² + C_2)]
[Wang & Bovik, IEEE Signal Proc. Letters, '02] [Wang et al., IEEE Trans. Image Proc., '04]
Pipeline: the original image and the distorted image are compared with a distortion/similarity measure within a sliding window, producing a quality map; pooling the quality map yields the image quality score.
Lin ZHANG, SSE, 2016
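The formulas above translate almost directly into code. Below is a minimal NumPy/SciPy sketch of the sliding-window SSIM computation; it assumes 8-bit grayscale inputs and approximates the paper's 11×11 Gaussian window with scipy.ndimage.gaussian_filter (σ = 1.5), with the usual constants C1 = (0.01·255)² and C2 = (0.03·255)² from Wang et al.'s convention.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ssim_map(img1, img2, sigma=1.5, C1=(0.01 * 255) ** 2, C2=(0.03 * 255) ** 2):
    """Sketch of the SSIM map for 8-bit grayscale images.
    Local statistics are taken with a Gaussian window (sigma=1.5), which
    approximates the 11x11 window of Wang et al. 2004."""
    img1 = img1.astype(np.float64)
    img2 = img2.astype(np.float64)
    mu1 = gaussian_filter(img1, sigma)
    mu2 = gaussian_filter(img2, sigma)
    var1 = gaussian_filter(img1 ** 2, sigma) - mu1 ** 2     # sigma_x^2
    var2 = gaussian_filter(img2 ** 2, sigma) - mu2 ** 2     # sigma_y^2
    cov = gaussian_filter(img1 * img2, sigma) - mu1 * mu2   # sigma_xy
    num = (2 * mu1 * mu2 + C1) * (2 * cov + C2)
    den = (mu1 ** 2 + mu2 ** 2 + C1) * (var1 + var2 + C2)
    return num / den

def ssim(img1, img2):
    """Overall quality score: mean of the SSIM map over all local windows."""
    return float(ssim_map(img1, img2).mean())
```

ssim(ref, dist) returns the pooled score; identical images give exactly 1.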

4. Structural Similarity (SSIM)—Computation
(Figure: original image, Gaussian-noise-corrupted image, the corresponding SSIM index map, and the absolute error map.)
Lin ZHANG, SSE, 2016

5. Structural Similarity (SSIM)—Computation
(Figure: original image, JPEG2000-compressed image, the corresponding SSIM index map, and the absolute error map.)
Lin ZHANG, SSE, 2016

6. Structural Similarity (SSIM)—Computation
(Figure: original image, JPEG-compressed image, the corresponding SSIM index map, and the absolute error map.)
Lin ZHANG, SSE, 2016

7. Comparison between MSE and SSIM
(Figure: the original image (MSE = 0, SSIM = 1) and five distorted versions that all have MSE = 309 but SSIM values of 0.928, 0.987, 0.580, 0.641, and 0.730; equal MSE can correspond to very different perceived quality.)
Lin ZHANG, SSE, 2016

8. Comparison between MSE and SSIM
(Figure: a reference image, an initial image, and the equal-MSE image contour; searching along this contour yields a converged image with the best SSIM and a converged image with the worst SSIM.)
Lin ZHANG, SSE, 2016

9. Summary about SSIM
• The structural similarity (SSIM) metric measures the structural distortions of images
• In implementation, SSIM measures the similarity of two local patches from three aspects: luminance, contrast, and structure
• The quality scores predicted by SSIM are much more consistent with human judgments than those of MSE
• SSIM is now widely used to gauge image processing algorithms
In the next section, you will encounter an even more powerful IQA metric, FSIM
Lin ZHANG, SSE, 2016

10. Contents
• Problem definition
• Full reference image quality assessment
  • Application scenarios
  • Problem of the classical FR‐IQA metric—MSE
  • Error visibility method
  • Structural Similarity (SSIM)
  • Feature Similarity (FSIM)
    • Phase congruency
    • Feature similarity index (FSIM)
  • Performance metrics
• No reference image quality assessment
• Summary
Lin ZHANG, SSE, 2016

11. Phase Congruency
• Why is phase important?
Fourier transform: F(u) = ∫ f(x) e^{−i2πux} dx = A(u) e^{iφ(u)}
• φ(u) is called the Fourier phase or the global phase
• Phase is defined for a specified frequency
• The Fourier phase indicates the relative position of the frequency components
• Phase is a real number between −π and π
Lin ZHANG, SSE, 2016

12. Phase Congruency
• Why is phase important?
(Figure: portraits of Fourier and Hilbert, and reconstruction results obtained from Fourier's phase + Fourier's amplitude, from Fourier's phase alone, and from Fourier's phase + Hilbert's amplitude.)
Lin ZHANG, SSE, 2016
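The classic experiment on this slide is easy to reproduce. The sketch below (NumPy, with hypothetical variable names) combines the Fourier phase of one grayscale image with the Fourier amplitude of another and inverts the transform; the result is dominated by the image that supplied the phase.

```python
import numpy as np

def reconstruct_from(phase_img, amplitude_img):
    """Combine the Fourier phase of one image with the Fourier amplitude of
    another and invert the transform (both inputs: equal-sized 2D arrays)."""
    F_phase = np.fft.fft2(phase_img.astype(np.float64))
    F_amp = np.fft.fft2(amplitude_img.astype(np.float64))
    hybrid = np.abs(F_amp) * np.exp(1j * np.angle(F_phase))
    return np.real(np.fft.ifft2(hybrid))

# e.g. reconstruct_from(fourier_portrait, hilbert_portrait) keeps the structures
# of the first image even though its amplitude spectrum comes from the second.
```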

13. Phase Congruency
• Local phase analysis
Question: what are the frequency components (and the associated phases) at a certain position in a real signal f(x)?
The Fourier transform cannot answer such questions.
Lin ZHANG, SSE, 2016

14. Phase Congruency
• Local phase analysis
An analytic signal needs to be constructed:
  f_A(x) = f(x) + i f_H(x), where f_H(x) = h(x) * f(x), h(x) = 1/(πx)
  f_H(x) is called the Hilbert transform of f(x)
Instantaneous phase: φ(x) = arctan2( f_H(x), f(x) )
Instantaneous amplitude: A(x) = sqrt( f²(x) + f_H²(x) )
φ(x) seems local, but it is not, since the Hilbert transform is a global transform
Lin ZHANG, SSE, 2016
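For a quick numerical check of these definitions, scipy.signal.hilbert returns the analytic signal f_A(x) = f(x) + i f_H(x) directly; the toy chirp below is only illustrative.

```python
import numpy as np
from scipy.signal import hilbert

# Instantaneous phase and amplitude of a 1D signal via its analytic signal.
x = np.linspace(0, 1, 1000)
f = np.cos(2 * np.pi * (5 + 20 * x) * x)   # toy chirp

f_a = hilbert(f)                 # analytic signal f_A(x) = f(x) + i*f_H(x)
inst_phase = np.angle(f_a)       # arctan2(f_H(x), f(x))
inst_amplitude = np.abs(f_a)     # sqrt(f(x)^2 + f_H(x)^2)
```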

15. Phase Congruency
• Local phase analysis
Thus, local complex filters whose responses are analytic signals themselves are used instead. That is, if g(x) = g_e(x) + i g_o(x) is a complex filter and
  g(x) * f(x) = g_e(x) * f(x) + i g_o(x) * f(x)
is an analytic signal, then the local phase (instead of the instantaneous phase) of f(x) is defined as
  φ(x) = arctan2( g_o(x) * f(x), g_e(x) * f(x) )
The local amplitude is
  A(x) = sqrt( [g_e(x) * f(x)]² + [g_o(x) * f(x)]² )
Lin ZHANG, SSE, 2016

16. Phase Congruency
• Local phase analysis
Thus, local complex filters whose responses are analytic signals themselves are used instead. That is, if g(x) = g_e(x) + i g_o(x) is a complex filter and
  g(x) * f(x) = g_e(x) * f(x) + i g_o(x) * f(x)
is an analytic signal, g_e and g_o are called a quadrature pair.
What are the commonly used quadrature pair filters? See the next sections!
Lin ZHANG, SSE, 2016

17. Phase Congruency
• Gabor filter
  G(x, y) = [1 / (2π σ_x σ_y)] exp( −( x'² / (2σ_x²) + y'² / (2σ_y²) ) ) exp( 2πi f x' )   (1)
where x' = x cos θ + y sin θ, y' = −x sin θ + y cos θ
Lin ZHANG, SSE, 2016
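A sketch of how Eq. (1) yields a quadrature pair in practice: the cosine (even) and sine (odd) parts of the Gabor filter are built on a discrete grid and convolved with the image to obtain the local phase and amplitude defined on the previous slides. All parameter values here (filter size, σ_x, σ_y, f, θ) are illustrative choices, not values from the lecture.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_pair(size=21, sigma_x=4.0, sigma_y=4.0, f=0.1, theta=0.0):
    """Even/odd (cosine/sine) parts of the Gabor filter in Eq. (1)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xp = x * np.cos(theta) + y * np.sin(theta)
    yp = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xp ** 2 / (2 * sigma_x ** 2) + yp ** 2 / (2 * sigma_y ** 2)))
    envelope /= 2 * np.pi * sigma_x * sigma_y
    g_even = envelope * np.cos(2 * np.pi * f * xp)
    g_odd = envelope * np.sin(2 * np.pi * f * xp)
    return g_even, g_odd

def local_phase_amplitude(img, g_even, g_odd):
    """Local phase and amplitude of img from the quadrature-pair responses."""
    img = np.asarray(img, dtype=np.float64)
    e = convolve2d(img, g_even, mode='same', boundary='symm')
    o = convolve2d(img, g_odd, mode='same', boundary='symm')
    return np.arctan2(o, e), np.sqrt(e ** 2 + o ** 2)
```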

18. Phase Congruency
• Gabor filter, Eq. (1), with x' = x cos θ + y sin θ, y' = −x sin θ + y cos θ
(Figure: Dennis Gabor, 1900–1979, Nobel Prize winner; John Daugman, University of Cambridge, UK.)
Lin ZHANG, SSE, 2016

19. Phase Congruency
• Gabor filter, Eq. (1)
(Figure: the primary visual cortex, whose simple-cell receptive fields are well modeled by Gabor functions.)
Lin ZHANG, SSE, 2016

20. Phase Congruency
• Gabor filter, Eq. (1)
J. G. Daugman, Uncertainty relation for resolution in space, spatial frequency, and orientation optimized by two‐dimensional visual cortical filters, Journal of the Optical Society of America A, 2(7): 1160–1169, 1985.
Lin ZHANG, SSE, 2016

21. Phase Congruency
• Log‐Gabor filter
• It is also a quadrature pair filter; it is defined in the frequency domain:
  G_j(ω, θ) = exp( −(log(ω/ω_0))² / (2σ_r²) ) · exp( −(θ − θ_j)² / (2σ_θ²) )
  (the first factor is the radial part, the second the angular part)
where θ_j = jπ/J is the orientation angle, ω_0 is the center frequency, σ_r controls the filter's radial bandwidth, and σ_θ determines the angular bandwidth
Lin ZHANG, SSE, 2016
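A frequency-domain construction of one such filter, following the radial × angular form above; the values of ω_0, σ_r, σ_θ and the orientation are illustrative assumptions. Because the angular term keeps essentially one half of the frequency plane, the real and imaginary parts of the filtered image's inverse FFT behave as the even/odd quadrature responses.

```python
import numpy as np

def log_gabor(rows, cols, f0=0.1, theta_j=0.0, sigma_r=0.6, sigma_theta=np.pi / 6):
    """One log-Gabor filter G_j(omega, theta) on the FFT frequency grid:
    radial part x angular part (all parameter values are illustrative)."""
    u = np.fft.fftfreq(cols)
    v = np.fft.fftfreq(rows)
    U, V = np.meshgrid(u, v)
    radius = np.sqrt(U ** 2 + V ** 2)
    radius[0, 0] = 1.0                      # avoid log(0) at the DC term
    angle = np.arctan2(V, U)

    radial = np.exp(-(np.log(radius / f0)) ** 2 / (2 * sigma_r ** 2))
    radial[0, 0] = 0.0                      # no DC response

    # Angular distance to the filter orientation, wrapped to (-pi, pi]
    d_theta = np.arctan2(np.sin(angle - theta_j), np.cos(angle - theta_j))
    angular = np.exp(-d_theta ** 2 / (2 * sigma_theta ** 2))
    return radial * angular

# Even/odd responses of an image `img` at this scale and orientation:
# resp = np.fft.ifft2(np.fft.fft2(img) * log_gabor(*img.shape))
# even, odd = resp.real, resp.imag
```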

22. Phase Congruency—Motivation
• Gradient‐based feature detectors (Roberts, Prewitt, Sobel, Canny, etc.)
  • Find maxima in the gradient map
  • Sensitive to illumination and contrast variations
  • Poor localization, especially with scale analysis
  • Difficult to use—threshold problem: one does not know in advance what level of edge strength corresponds to a significant feature
Lin ZHANG, SSE, 2016


24. Phase Congruency—Motivation
(Figure: Harris corner detection results on two versions of the same image.)
Lin ZHANG, SSE, 2016

25. Phase Congruency—Motivation
• Phase congruency is proposed to overcome those drawbacks
  • Totally based on the local phase information
  • A more general framework for feature definition
  • Invariant to contrast and illumination variation
  • Offers the promise of allowing one to specify universal feature thresholds
Lin ZHANG, SSE, 2016

26. Phase Congruency—Definition
• First appeared in [1]
• It is closer to how the human visual system works
• It postulates that features are perceived at points of maximum phase congruency (all the following discussions will be based on this observation)
[1] M.C. Morrone, J. Ross, D.C. Burr, and R. Owens, Mach bands are phase dependent, Nature, vol. 324, pp. 250–253, 1986
Lin ZHANG, SSE, 2016

27. Phase Congruency—Definition
• Features from the PC point of view (figure): the Fourier components are all in phase in the two cases shown
Lin ZHANG, SSE, 2016

28. Phase Congruency—Computation
• The now widely used method to compute phase congruency is the one in [1]
• In [1], Kovesi proposed a framework to compute PC using quadrature pair filters
[1] P. Kovesi, Image features from phase congruency, Videre: Journal of Computer Vision Research, vol. 1, pp. 1–26, 1999
Lin ZHANG, SSE, 2016

29. Phase Congruency—Computation
Let M_n^e and M_n^o denote the even‐symmetric and odd‐symmetric wavelets at scale n, and define
  [e_n(x), o_n(x)] = [ I(x) * M_n^e, I(x) * M_n^o ]
The amplitude and phase of the transform at a given wavelet scale are
  A_n(x) = sqrt( e_n²(x) + o_n²(x) ),  φ_n(x) = arctan( o_n(x) / e_n(x) )
F(x) and H(x) can be estimated as
  F(x) = Σ_n e_n(x),  H(x) = Σ_n o_n(x)
Then
  PC(x) = E(x) / Σ_n A_n(x),  where E(x) = sqrt( F²(x) + H²(x) )
Lin ZHANG, SSE, 2016
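To make the flow of this computation concrete, here is a 1D sketch; it is a simplification that omits Kovesi's noise compensation and frequency-spread weighting. Log-Gabor filters defined over positive frequencies only give analytic responses, so the real and imaginary parts of each filtered signal serve as e_n(x) and o_n(x).

```python
import numpy as np

def phase_congruency_1d(signal, n_scales=4, min_wavelength=4.0, mult=2.0,
                        sigma_r=0.6, eps=1e-4):
    """Simplified 1D phase congruency: PC(x) = E(x) / sum_n A_n(x)."""
    N = len(signal)
    freqs = np.fft.fftfreq(N)
    S = np.fft.fft(signal)

    sum_e = np.zeros(N)
    sum_o = np.zeros(N)
    sum_amp = np.zeros(N)
    for n in range(n_scales):
        f0 = 1.0 / (min_wavelength * mult ** n)   # center frequency at scale n
        G = np.zeros(N)
        pos = freqs > 0                           # positive frequencies only
        G[pos] = np.exp(-(np.log(freqs[pos] / f0)) ** 2 / (2 * sigma_r ** 2))
        resp = np.fft.ifft(S * G)                 # analytic response at this scale
        e_n, o_n = resp.real, resp.imag
        sum_e += e_n
        sum_o += o_n
        sum_amp += np.sqrt(e_n ** 2 + o_n ** 2)   # A_n(x)

    E = np.sqrt(sum_e ** 2 + sum_o ** 2)          # E(x) = sqrt(F^2(x) + H^2(x))
    return E / (sum_amp + eps)

# A step edge should produce a phase-congruency peak near the discontinuity:
sig = np.concatenate([np.zeros(128), np.ones(128)])
pc = phase_congruency_1d(sig)
print(np.argmax(pc[10:-10]) + 10)   # expected to be near index 128
```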

  30. Phase Congruency—Example Lin ZHANG, SSE, 2016

  31. Phase Congruency—Example Lin ZHANG, SSE, 2016

  32. Phase Congruency—Example Lin ZHANG, SSE, 2016

  33. Phase Congruency—Example Lin ZHANG, SSE, 2016

34. Contents
• Problem definition
• Full reference image quality assessment
  • Application scenarios
  • Problem of the classical FR‐IQA metric—MSE
  • Error visibility method
  • Structural Similarity (SSIM)
  • Feature Similarity (FSIM)
    • Phase congruency
    • Feature similarity index (FSIM)
  • Performance metrics
• No reference image quality assessment
• Summary
Lin ZHANG, SSE, 2016

35. Feature Similarity Index (FSIM)
• A state‐of‐the‐art method proposed in [1]
[1] Lin Zhang, Lei Zhang, Xuanqin Mou, and David Zhang, FSIM: A feature similarity index for image quality assessment, IEEE Trans. Image Processing, vol. 20, pp. 2378‐2386, 2011
Lin ZHANG, SSE, 2016

36. Feature Similarity Index (FSIM)
• A state‐of‐the‐art method proposed in [1]
• Motivations
  • Low‐level feature inspired
  • Visual information is often redundant; low‐level features convey the most crucial information
  • Image degradations lead to changes in image low‐level features
• Thus, an IQA index can be devised by comparing the low‐level features between the reference image and the distorted image
• What kinds of features?
Lin ZHANG, SSE, 2016

37. Feature Similarity Index (FSIM)
• Phase congruency
  • Physiological and psychophysical evidence
  • Measures the significance of a local structure
• Gradient magnitude
  • PC is contrast invariant; however, local contrast does affect the perceived image quality
  • Thus, we have to compensate for the contrast
  • Gradient magnitude can be used to measure the contrast similarity
Lin ZHANG, SSE, 2016

  38. Feature Similarity Index (FSIM) • Phase congruency—An example Lin ZHANG, SSE, 2016

39. Feature Similarity Index (FSIM)
• Gradient magnitude
The Scharr operator is used to extract the gradient:
  G_x = (1/16) [ 3 0 −3; 10 0 −10; 3 0 −3 ] * f(x)
  G_y = (1/16) [ 3 10 3; 0 0 0; −3 −10 −3 ] * f(x)
Gradient magnitude (GM): G = sqrt( G_x² + G_y² )
Lin ZHANG, SSE, 2016
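The same kernels in code (a SciPy sketch; the 1/16 scaling matches the slide, and the 'symm' boundary handling is an assumption):

```python
import numpy as np
from scipy.signal import convolve2d

# Scharr kernels (scaled by 1/16) and the gradient-magnitude map.
scharr_x = np.array([[3, 0, -3],
                     [10, 0, -10],
                     [3, 0, -3]]) / 16.0
scharr_y = scharr_x.T

def gradient_magnitude(img):
    img = np.asarray(img, dtype=np.float64)
    gx = convolve2d(img, scharr_x, mode='same', boundary='symm')
    gy = convolve2d(img, scharr_y, mode='same', boundary='symm')
    return np.sqrt(gx ** 2 + gy ** 2)
```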

40. Feature Similarity Index (FSIM)
• FSIM computation
Given two images f_1 and f_2, their PC maps PC_1 and PC_2, and their GM maps G_1 and G_2:
  PC similarity: S_PC(x) = ( 2 PC_1(x) PC_2(x) + T_1 ) / ( PC_1²(x) + PC_2²(x) + T_1 ),  T_1 is a constant
  GM similarity: S_G(x) = ( 2 G_1(x) G_2(x) + T_2 ) / ( G_1²(x) + G_2²(x) + T_2 ),  T_2 is a constant
  FSIM = Σ_x S_PC(x) S_G(x) PC_m(x) / Σ_x PC_m(x),  where PC_m(x) = max( PC_1(x), PC_2(x) )
Lin ZHANG, SSE, 2016
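Assuming the PC and GM maps have already been computed (e.g., with the sketches earlier in this lecture), combining them into the FSIM score takes only a few lines. T1 = 0.85 and T2 = 160 are the values reported in the FSIM paper; treat them here as defaults rather than something derived in this sketch.

```python
import numpy as np

def fsim_score(pc1, pc2, g1, g2, T1=0.85, T2=160.0):
    """Combine precomputed phase-congruency maps (pc1, pc2) and
    gradient-magnitude maps (g1, g2) into the FSIM score."""
    s_pc = (2 * pc1 * pc2 + T1) / (pc1 ** 2 + pc2 ** 2 + T1)   # S_PC(x)
    s_g = (2 * g1 * g2 + T2) / (g1 ** 2 + g2 ** 2 + T2)        # S_G(x)
    pc_m = np.maximum(pc1, pc2)                                # pooling weight
    return float(np.sum(s_pc * s_g * pc_m) / np.sum(pc_m))
```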

41. Feature Similarity Index (FSIM)
• Extension to a color IQA index
Separate the chrominance from the luminance using the YIQ transform:
  [Y; I; Q] = [ 0.299 0.587 0.114; 0.596 −0.274 −0.322; 0.211 −0.523 0.312 ] [R; G; B]
Let I_1 (I_2) and Q_1 (Q_2) be the I and Q channels of f_1 (f_2):
  S_I(x) = ( 2 I_1(x) I_2(x) + T_3 ) / ( I_1²(x) + I_2²(x) + T_3 )
  S_Q(x) = ( 2 Q_1(x) Q_2(x) + T_4 ) / ( Q_1²(x) + Q_2²(x) + T_4 )
  FSIM_C = Σ_x S_PC(x) S_G(x) S_I(x) S_Q(x) PC_m(x) / Σ_x PC_m(x)
  (in the FSIM paper, the chrominance terms S_I(x) S_Q(x) are additionally weighted by an exponent λ)
Lin ZHANG, SSE, 2016

  42. Feature Similarity Index (FSIM)—Schematic diagram Lin ZHANG, SSE, 2016

43. Summary
• FSIM is an HVS‐driven IQA index
  • The HVS perceives an image mainly based on its low‐level features
  • PC and gradient magnitude are used
  • PC is also used to weight the contribution of each point to the overall similarity of two images
• FSIM is extended to FSIM_C, a color IQA index
• FSIM (FSIM_C) outperforms all the other state‐of‐the‐art IQA indices evaluated
Lin ZHANG, SSE, 2016

44. Contents
• Problem definition
• Full reference image quality assessment
  • Application scenarios
  • Problem of the classical FR‐IQA metric—MSE
  • Error visibility method
  • Structural Similarity (SSIM)
  • Feature Similarity (FSIM)
  • Performance metrics
• No reference image quality assessment
• Summary
Lin ZHANG, SSE, 2016

45. Performance Metrics
• How to evaluate the performance of IQA indices?
• Some benchmark datasets have been created
  • Reference images (free of quality distortions) are provided
  • For each reference image, a set of distorted images is created; they suffer from various kinds of quality distortions, such as Gaussian noise, JPEG compression, blur, etc. Suppose there are altogether N distorted images
  • For each distorted image, there is an associated quality score given by human subjects; thus, altogether we have N subjective scores {s_i}_{i=1}^N
  • For the distorted images, we can compute their objective quality scores using an IQA index f, obtaining N objective scores {o_i}_{i=1}^N
  • f's performance can be reflected by the rank order correlation coefficients between {o_i}_{i=1}^N and {s_i}_{i=1}^N
Lin ZHANG, SSE, 2016

46. Performance Metrics
• How to evaluate the performance of IQA indices?
Spearman rank order correlation coefficient (SRCC):
  SRCC = 1 − ( 6 Σ_{i=1}^{N} d_i² ) / ( N(N² − 1) )
where d_i is the difference between the i-th image's ranks in the subjective and objective evaluations.
Note: in Matlab, you can compute the SRCC by using srcc = corr(vect1, vect2, 'type', 'spearman')
Lin ZHANG, SSE, 2016

47. Performance Metrics
• How to evaluate the performance of IQA indices?
Kendall rank order correlation coefficient (KRCC):
  KRCC = ( n_c − n_d ) / ( 0.5 N(N − 1) )
where n_c is the number of concordant pairs and n_d is the number of discordant pairs.
Note: in Matlab, you can compute the KRCC by using krcc = corr(vect1, vect2, 'type', 'kendall')
Lin ZHANG, SSE, 2016
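For readers working in Python rather than Matlab, SciPy provides both coefficients directly (scipy.stats.kendalltau computes the tie-corrected tau-b variant, which coincides with the formula above when there are no ties); the score vectors below are purely hypothetical.

```python
import numpy as np
from scipy.stats import spearmanr, kendalltau

# SRCC and KRCC between subjective scores s_i and objective scores o_i.
s = np.array([55, 72, 60, 81, 30])             # hypothetical subjective scores
o = np.array([0.61, 0.78, 0.66, 0.90, 0.35])   # hypothetical objective scores

srcc = spearmanr(s, o).correlation
krcc = kendalltau(s, o).correlation
print(srcc, krcc)
```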

48. Performance Metrics
• Popularly used benchmark datasets for evaluating IQA indices

  Database      Reference images   Distorted images   Observers   Distortion types
  TID2013 [1]   25                 3000               971         24
  TID2008 [2]   25                 1700               838         17
  CSIQ [3]      30                 866                35          6
  LIVE [4]      29                 779                161         5

[1] http://www.ponomarenko.info/tid2013.htm
[2] http://www.ponomarenko.info/tid2008.htm
[3] http://vision.okstate.edu/?loc=csiq
[4] http://live.ece.utexas.edu/research/Quality/
Lin ZHANG, SSE, 2016

49. Performance Metrics—Comparison of IQA Indices

                 FSIM     FSIM_C   MS‐SSIM  VIF      SSIM     IFC      VSNR     NQM
  TID2013  SRCC  0.8015   0.8510   0.7859   0.6769   0.7417   0.5389   0.6812   0.6392
           KRCC  0.6289   0.6665   0.6047   0.5147   0.5588   0.3939   0.5084   0.4740
  TID2008  SRCC  0.8805   0.8840   0.8528   0.7496   0.7749   0.5692   0.7046   0.6243
           KRCC  0.6946   0.6991   0.6543   0.5863   0.5768   0.4261   0.5340   0.4608
  CSIQ     SRCC  0.9242   0.9310   0.9138   0.9193   0.8756   0.7482   0.8106   0.7402
           KRCC  0.7567   0.7690   0.7397   0.7534   0.6907   0.5740   0.6247   0.5638
  LIVE     SRCC  0.9634   0.9645   0.9445   0.9631   0.9479   0.9234   0.9274   0.9086
           KRCC  0.8337   0.8363   0.7922   0.8270   0.7963   0.7540   0.7616   0.7413

Note: For more details about full reference IQA, you can refer to http://sse.tongji.edu.cn/linzhang/IQA/IQA.htm
Lin ZHANG, SSE, 2016

50. Contents
• Problem definition
• Full reference image quality assessment
• No reference image quality assessment
  • Background introduction
  • Our proposed method: IL‐NIQE
• Summary
Lin ZHANG, SSE, 2016

51. Background introduction—Problem definition
• No reference image quality assessment (NR‐IQA)
  • Devise computational models to estimate the quality of a given image as perceived by human beings
  • The only input an NR‐IQA algorithm receives is the image being assessed itself
Lin ZHANG, SSE, 2016

52. Background introduction—Problem definition
• No reference image quality assessment (NR‐IQA)
What do you think of the quality of these two images?
Though you are not provided the ground‐truth reference images, you may still judge the quality of these two images as poor
Lin ZHANG, SSE, 2016

53. Background introduction—Problem definition
• No reference image quality assessment (NR‐IQA)
What do you think about the quality of these images? Rank them.
Remember that you DO NOT know the ground‐truth "high quality" reference image
Lin ZHANG, SSE, 2016

54. Background introduction—Typical methods
• Opinion‐aware approaches
  • These approaches require a dataset comprising distorted images and associated subjective scores
  • At the training stage, feature vectors are extracted from the images and a regression model mapping the feature vectors to the subjective scores is learned
  • At the testing stage, a feature vector is extracted from the test image, and its quality score is predicted by feeding that feature vector to the learned regression model (a minimal sketch follows below)
Lin ZHANG, SSE, 2016
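A minimal sketch of this train/test pipeline, assuming scikit-learn is available; the feature dimensionality, score range, and the choice of a support-vector regressor are all illustrative assumptions, not the specific setup of any of the methods listed later.

```python
import numpy as np
from sklearn.svm import SVR

# X_train holds feature vectors extracted from distorted training images,
# y_train the associated subjective scores (both hypothetical here).
X_train = np.random.rand(200, 36)      # 200 images, 36-dimensional features
y_train = np.random.rand(200) * 100    # subjective scores, e.g. MOS in [0, 100]

model = SVR(kernel='rbf', C=1.0, epsilon=0.1)
model.fit(X_train, y_train)

x_test = np.random.rand(1, 36)         # feature vector of a test image
predicted_quality = model.predict(x_test)[0]
```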

55. Background introduction—Typical methods
• Opinion‐aware approaches
(Figure: schematic of an opinion‐aware approach, showing the feature vectors extracted from the images.)
Lin ZHANG, SSE, 2016

56. Background introduction—Typical methods
• Opinion‐aware approaches
  • BIQI [1]
  • BRISQUE [2]
  • BLIINDS [3]
  • BLIINDS‐II [4]
  • DIIVINE [5]
  (these were proposed by Bovik's group, Univ. Texas: http://live.ece.utexas.edu/)
  • CORNIA [6]
  • LBIQ [7]
Lin ZHANG, SSE, 2016

57. Background introduction—Typical methods
• Opinion‐aware approaches
[1] A. Moorthy and A. Bovik, A two‐step framework for constructing blind image quality indices, IEEE Sig. Process. Letters, 17: 513‐516, 2010
[2] A. Mittal, A.K. Moorthy, and A.C. Bovik, No‐reference image quality assessment in the spatial domain, IEEE Trans. Image Process., 21: 4695‐4708, 2012
[3] M.A. Saad, A.C. Bovik, and C. Charrier, A DCT statistics‐based blind image quality index, IEEE Sig. Process. Letters, 17: 583‐586, 2010
[4] M.A. Saad, A.C. Bovik, and C. Charrier, Blind image quality assessment: A natural scene statistics approach in the DCT domain, IEEE Trans. Image Process., 21: 3339‐3352, 2012
[5] A.K. Moorthy and A.C. Bovik, Blind image quality assessment: from natural scene statistics to perceptual quality, IEEE Trans. Image Process., 20: 3350‐3364, 2011
[6] P. Ye, J. Kumar, L. Kang, and D. Doermann, Unsupervised feature learning framework for no‐reference image quality assessment, CVPR, 2012
[7] H. Tang, N. Joshi, and A. Kapoor, Learning a blind measure of perceptual image quality, CVPR, 2011
Lin ZHANG, SSE, 2016

58. Background introduction—Typical methods
• Opinion‐unaware approaches
  • These approaches DO NOT require a dataset comprising distorted images and associated subjective scores
  • A typical method is NIQE [1]
    • Offline learning stage: construct a collection of quality‐aware features from pristine images and fit them with a multivariate Gaussian (MVG) model
    • Testing stage: the quality of a test image is expressed as the distance between an MVG fit of its features and the pristine MVG model
[1] A. Mittal et al., Making a "completely blind" image quality analyzer, IEEE Signal Process. Letters, 20(3): 209‐212, 2013
Lin ZHANG, SSE, 2016
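The testing-stage distance has a closed form. The sketch below fits an MVG to the rows of a feature matrix and evaluates the NIQE-style distance sqrt((μ1−μ2)ᵀ((Σ1+Σ2)/2)⁻¹(μ1−μ2)); the feature extraction and patch selection of NIQE itself are not reproduced here.

```python
import numpy as np

def fit_mvg(features):
    """Fit a multivariate Gaussian (mean vector, covariance matrix) to a set
    of quality-aware feature vectors (one vector per row of `features`)."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    return mu, cov

def niqe_like_distance(mu1, cov1, mu2, cov2):
    """Distance between two MVG models in the form used by NIQE."""
    d = mu1 - mu2
    pooled = (cov1 + cov2) / 2.0
    return float(np.sqrt(d @ np.linalg.pinv(pooled) @ d))
```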

59. Contents
• Problem definition
• Full reference image quality assessment
• No reference image quality assessment
  • Background introduction
  • Our proposed method: IL‐NIQE
    • Motivations and our contributions
    • NIS‐induced quality‐aware features
    • Pristine model learning
    • IL‐NIQE index
    • Experimental results
• Summary
Lin ZHANG, SSE, 2016

60. Motivations [1]
• Opinion‐unaware approaches seem appealing, so we want to propose an opinion‐unaware approach
• Design rationale
  • Natural images without quality distortions possess regular statistical properties that are measurably modified by the presence of distortions
  • Deviations from the regularity of natural statistics, when quantified appropriately, can be used to assess the perceptual quality of an image
  • NIS‐based features have proven powerful. Are there other NIS‐based features?
[1] Lin Zhang et al., A feature‐enriched completely blind image quality evaluator, IEEE Trans. Image Processing, 24(8): 2579‐2591, 2015
Lin ZHANG, SSE, 2016

61. Contributions
• A novel "opinion‐unaware" NR‐IQA index, IL‐NIQE (Integrated Local NIQE)
• A set of prudently designed NIS‐induced quality‐aware features
• A Bhattacharyya‐distance‐based metric to measure the quality of a local image patch
• A visual‐saliency‐based quality score pooling scheme
• A thorough evaluation of the performance of modern NR‐IQA indices
Lin ZHANG, SSE, 2016

62. Contents
• Problem definition
• Full reference image quality assessment
• No reference image quality assessment
  • Background introduction
  • Our proposed method: IL‐NIQE
    • Motivations and our contributions
    • NIS‐induced quality‐aware features
    • Pristine model learning
    • IL‐NIQE index
    • Experimental results
• Summary
Lin ZHANG, SSE, 2016

63. IL‐NIQE—NIS‐induced quality‐aware features
• Statistics of normalized luminance
  • The mean subtracted contrast normalized (MSCN) coefficients I_n(x, y) have been observed to follow a unit normal distribution when computed from natural images without quality distortions [1]
  • This model, however, is violated when images are subjected to quality distortions; the degree of violation can be indicative of distortion severity
[1] D.L. Ruderman, The statistics of natural images, Netw. Comput. Neural Syst., 5(4): 517‐548, 1994
Lin ZHANG, SSE, 2016

64. IL‐NIQE—NIS‐induced quality‐aware features
• Statistics of normalized luminance
  I_n(x, y) = ( I(x, y) − μ(x, y) ) / ( σ(x, y) + 1 )   conforms to a Gaussian,
where
  μ(x, y) = Σ_{k=−K}^{K} Σ_{l=−L}^{L} ω_{k,l} I(x+k, y+l)
  σ(x, y) = sqrt( Σ_{k=−K}^{K} Σ_{l=−L}^{L} ω_{k,l} [ I(x+k, y+l) − μ(x, y) ]² )
and ω_{k,l} is a local weighting window
Lin ZHANG, SSE, 2016
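A compact sketch of the MSCN computation: a Gaussian-weighted local mean and standard deviation stand in for the windowed sums with weights ω_{k,l} (the σ value and the use of scipy.ndimage.gaussian_filter are implementation assumptions, not the exact window of the paper).

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def mscn_coefficients(img, sigma=7 / 6, C=1.0):
    """Mean-subtracted, contrast-normalized coefficients I_n(x, y) (sketch).
    A Gaussian kernel with sigma=7/6 loosely mimics a 7x7 weighting window."""
    img = img.astype(np.float64)
    mu = gaussian_filter(img, sigma)                                   # local weighted mean
    sigma_map = np.sqrt(np.abs(gaussian_filter(img ** 2, sigma) - mu ** 2))  # local weighted std
    return (img - mu) / (sigma_map + C)
```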

65. IL‐NIQE—NIS‐induced quality‐aware features
• Statistics of normalized luminance
  • We use a generalized Gaussian distribution (GGD) to model the distribution of I_n(x, y)
Density function of the GGD:
  g(x; α, β) = α / ( 2β Γ(1/α) ) · exp( −(|x|/β)^α )
The parameters (α, β), which can be estimated from {I_n(x, y)} by MLE, are used as quality‐aware features
Lin ZHANG, SSE, 2016
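Fitting the GGD is usually done numerically. The sketch below uses moment matching over a grid of α values, a common and simpler alternative to the MLE mentioned on the slide; it relies on the identity E[x²]/E[|x|]² = Γ(1/α)Γ(3/α)/Γ(2/α)² for this parameterization.

```python
import numpy as np
from scipy.special import gamma

def fit_ggd(x, alphas=np.arange(0.2, 10.0, 0.001)):
    """Moment-matching estimate of the GGD parameters (alpha, beta)."""
    x = np.asarray(x, dtype=np.float64).ravel()
    # Theoretical ratio E[x^2]/E[|x|]^2 as a function of alpha
    rho = gamma(1.0 / alphas) * gamma(3.0 / alphas) / gamma(2.0 / alphas) ** 2
    r_hat = np.mean(x ** 2) / (np.mean(np.abs(x)) ** 2 + 1e-12)
    alpha = alphas[np.argmin(np.abs(rho - r_hat))]
    # From E[x^2] = beta^2 * Gamma(3/alpha) / Gamma(1/alpha)
    beta = np.sqrt(np.mean(x ** 2) * gamma(1.0 / alpha) / gamma(3.0 / alpha))
    return alpha, beta
```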

66. IL‐NIQE—NIS‐induced quality‐aware features
• Statistics of MSCN products
  • The distributions of products of pairs of adjacent MSCN coefficients, I_n(x, y) I_n(x, y+1), I_n(x, y) I_n(x+1, y), I_n(x, y) I_n(x+1, y+1), and I_n(x, y) I_n(x+1, y−1), can also capture the quality distortion
Lin ZHANG, SSE, 2016

67. IL‐NIQE—NIS‐induced quality‐aware features
• Statistics of MSCN products
  • They can be modeled by an asymmetric generalized Gaussian distribution (AGGD):
  g(x; γ, β_l, β_r) = γ / ( (β_l + β_r) Γ(1/γ) ) · exp( −(−x/β_l)^γ ),  x ≤ 0
  g(x; γ, β_l, β_r) = γ / ( (β_l + β_r) Γ(1/γ) ) · exp( −(x/β_r)^γ ),   x > 0
The mean of the AGGD is
  η = ( β_r − β_l ) Γ(2/γ) / Γ(1/γ)
(γ, β_l, β_r, η) are used as "quality‐aware" features
Lin ZHANG, SSE, 2016

68. IL‐NIQE—NIS‐induced quality‐aware features
• Statistics of partial derivatives and gradient magnitudes
  • We found that introducing quality distortions to an image changes the distributions of its partial derivatives and gradient magnitudes
Lin ZHANG, SSE, 2016

69. IL‐NIQE—NIS‐induced quality‐aware features
• Statistics of partial derivatives and gradient magnitudes
(Fig. 1(a)–1(e): an image and its distorted versions used in the following statistics.)
Lin ZHANG, SSE, 2016

70. IL‐NIQE—NIS‐induced quality‐aware features
• Statistics of partial derivatives and gradient magnitudes
(Plots: empirical distributions (percentage) of the normalized gradient magnitude and of the normalized partial derivative for the images in Fig. 1(a)–1(e).)
Lin ZHANG, SSE, 2016

71. IL‐NIQE—NIS‐induced quality‐aware features
• Statistics of partial derivatives and gradient magnitudes
Partial derivatives:
  I_x = I * G_x(x, y),  I_y = I * G_y(x, y)
where
  G_x(x, y) = −x / (2πσ⁴) · exp( −(x² + y²) / (2σ²) )
  G_y(x, y) = −y / (2πσ⁴) · exp( −(x² + y²) / (2σ²) )
Gradient magnitudes:
  GM(x, y) = sqrt( I_x² + I_y² )
Lin ZHANG, SSE, 2016
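The same derivative-of-Gaussian filtering is available directly in SciPy through the order argument of gaussian_filter; the σ value below is an assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_gradients(img, sigma=0.5):
    """Partial derivatives via Gaussian-derivative filtering and the
    resulting gradient-magnitude map (sketch)."""
    img = img.astype(np.float64)
    ix = gaussian_filter(img, sigma, order=(0, 1))  # derivative along columns (x)
    iy = gaussian_filter(img, sigma, order=(1, 0))  # derivative along rows (y)
    gm = np.hypot(ix, iy)                           # gradient magnitude GM(x, y)
    return ix, iy, gm
```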
