Computational Aesthetics: CS 294-69 Final Project (Armin Samii, Tim Althoff)



SLIDE 1

Computational Aesthetics

CS 294-69 Final Project Armin Samii Tim Althoff

SLIDE 2

Problem

SLIDE 3

Problem

SLIDE 4

Problem

SLIDE 5

Problem

Original → Some change → Some change → Result

SLIDE 6

Problem

Original → Exposure +2 → Some change → Result

SLIDE 7

Problem

Original → Exposure +2 → Contrast +20% → Result

SLIDE 8

Problem

Original → Exposure +2 → Contrast +20% → Saturation +25%
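The build-up above shows a fixed edit pipeline (Original → Exposure +2 → Contrast +20% → Saturation +25%). A minimal sketch of such a pipeline on a single pixel value, assuming exposure acts as a multiplicative gain in stops and contrast scales the distance from mid-gray (common conventions, not specified on the slides):

```python
# Hypothetical sketch of the adjustment pipeline on this slide: each
# step maps a pixel value in [0, 1] to a new value. The exact formulas
# are assumptions, not taken from the project.

def exposure(v, stops):
    """Exposure in stops: +1 stop doubles the linear value."""
    return min(1.0, v * (2.0 ** stops))

def contrast(v, amount):
    """Scale the distance from mid-gray (0.5) by (1 + amount)."""
    return min(1.0, max(0.0, 0.5 + (v - 0.5) * (1.0 + amount)))

def apply_sequence(v, steps):
    """Apply a list of (adjustment, parameter) pairs in order."""
    for fn, arg in steps:
        v = fn(v, arg)
    return v

# Original -> Exposure +2 -> Contrast +20% -> Result
result = apply_sequence(0.1, [(exposure, 2), (contrast, 0.20)])
```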

SLIDE 9

Roadblocks

- Training Data
  - Noisy
  - Repetitions
  - Hard to obtain
- Sequence Learning
  - Feature-dependence (avoid repeating the same sequence)
  - Training a good model
- User Interface
  - Simplicity
  - Facilitate learning
- Parameter Learning
  - Predict parameters using regression

SLIDE 10

Approach

- Feature Extraction
  - Must be done for each iteration

SLIDE 11

Approach

- Feature Extraction
  - Must be done for each iteration
  - Must be fast

SLIDE 12

Approach

- Feature Extraction
  - Must be done for each iteration
  - Must be fast: we work on small (100x100) images

SLIDE 13

Approach

- Feature Extraction
  - Must be done for each iteration
  - Must be fast: we work on small (100x100) images
  - Features must be simple enough to be detected in thumbnails

SLIDE 14

Approach

- Feature Extraction
  - Must be done for each iteration
  - Must be fast: we work on small (100x100) images
  - Features must be simple enough to be detected in thumbnails
- Features we use
  - Color-based

SLIDE 15

Approach

- Feature Extraction
  - Must be done for each iteration
  - Must be fast: we work on small (100x100) images
  - Features must be simple enough to be detected in thumbnails
- Features we use
  - Color-based (e.g. histograms, contrast, etc.)

SLIDE 16

Approach

- Feature Extraction
  - Must be done for each iteration
  - Must be fast: we work on small (100x100) images
  - Features must be simple enough to be detected in thumbnails
- Features we use
  - Color-based (e.g. histograms, contrast, etc.)
  - Simple Haar features for face detection

SLIDE 17

Approach

- Feature Extraction
  - Must be done for each iteration
  - Must be fast: we work on small (100x100) images
  - Features must be simple enough to be detected in thumbnails
- Features we use
  - Color-based (e.g. histograms, contrast, etc.)
  - Simple Haar features for face detection (distinguish between portraits, group shots, etc.)

SLIDE 18

Approach

- Feature Extraction
  - Must be done for each iteration
  - Must be fast: we work on small (100x100) images
  - Features must be simple enough to be detected in thumbnails
- Features we use (~30 total)
  - Color-based (e.g. histograms, contrast, etc.)
  - Simple Haar features for face detection (distinguish between portraits, group shots, etc.)
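The color-based features above can be sketched in pure Python. This is illustrative rather than the project's actual feature set: a thumbnail is modeled as a flat list of grayscale values in [0, 1], and the histogram and RMS-contrast definitions are standard choices assumed here:

```python
# Sketch of two cheap color-based features. The 100x100 thumbnail size
# comes from the slides; these particular definitions are assumptions.

def histogram(pixels, bins=8):
    """Normalized intensity histogram: cheap even on a 100x100 image."""
    counts = [0] * bins
    for v in pixels:
        counts[min(bins - 1, int(v * bins))] += 1
    n = len(pixels)
    return [c / n for c in counts]

def rms_contrast(pixels):
    """RMS contrast: standard deviation of intensities."""
    n = len(pixels)
    mean = sum(pixels) / n
    return (sum((v - mean) ** 2 for v in pixels) / n) ** 0.5

thumb = [i / 9999 for i in range(10000)]  # stand-in 100x100 ramp image
features = histogram(thumb) + [rms_contrast(thumb)]
```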

SLIDE 19

Approach

- Parameter Learning
  - P(adjustment strength | features, adjustment)
  - Regression techniques
- Sequence Learning
  - P(next adjustment(s) | features, previous adjustments)
  - N-grams + features

[Figure: given Exposure as the last edit, predict the next adjustment (?) and its parameter (?)]

SLIDE 20

Approach

- Parameter Learning
  - P(adjustment strength | features, adjustment)
  - Regression techniques
- Sequence Learning
  - P(next adjustment(s) | features, previous adjustments)
  - N-grams + features

[Figure: given Exposure as the last edit, the predicted next adjustment is Contrast with parameter 15%]
SLIDE 21

Approach

- Parameter Learning
  - P(adjustment strength | features, adjustment)
  - Regression techniques: Linear, Ridge, Lasso, Lars, ElasticNet, Gaussian Process
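A hedged sketch of the parameter-learning step: predicting an adjustment's strength from a single image feature with closed-form ridge regression. The slides list several regressors; only the 1-D ridge case is shown here, and the toy data is invented:

```python
# 1-D ridge regression in pure Python. For one feature the closed form
# reduces to a single division. Data below is illustrative only.

def fit_ridge_1d(xs, ys, lam=1.0):
    """Closed-form 1-D ridge weight: w = sum(x*y) / (sum(x*x) + lam)."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)

# Toy data (made up): darker images need stronger exposure boosts.
darkness = [0.8, 0.6, 0.4, 0.2]          # feature: 1 - mean intensity
exposure_stops = [2.0, 1.5, 1.0, 0.5]    # target adjustment strength
w = fit_ridge_1d(darkness, exposure_stops, lam=0.0)  # plain least squares
predict = lambda x: w * x                # predicted stops for a new image
```

With lam=0 this is ordinary least squares; a positive lam shrinks the weight toward zero, which is the point of ridge when training data is noisy.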

SLIDE 22

Approach

- Sequence Learning
  - P(next adjustment(s) | features, previous adjustments) → "feature-augmented n-grams"
  - n-gram: a sequence of n items from a given sequence
  - An n-gram model corresponds to an (n − 1)-order Markov model
  - Feature augmentation: tri-gram, modelled by a GMM
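The n-gram backbone of sequence learning can be sketched as a plain bigram model over adjustment names; the feature augmentation and GMM from the slide are omitted, and the example edit history is invented:

```python
# Bigram (n=2) model over adjustment sequences: a first-order Markov
# model of which edit tends to follow which. Illustrative only.
from collections import Counter, defaultdict

def fit_bigrams(sequences):
    """Count transitions, giving unnormalized P(next | previous)."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            counts[prev][nxt] += 1
    return counts

def most_likely_next(counts, prev):
    """Most frequent follower of the previous adjustment."""
    return counts[prev].most_common(1)[0][0]

history = [
    ["exposure", "contrast", "saturation"],
    ["exposure", "contrast"],
    ["exposure", "saturation"],
]
model = fit_bigrams(history)
```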

SLIDE 23

Approach

- User Interface
  - Show user each step in the sequence

SLIDE 24

Results: Parameter learning

SLIDE 25

Results: Sequence learning

SLIDE 26

Future Work

- More features
- Local edits
  - Treat skin separately
  - Gradients (e.g. horizon)
  - Foreground/background separation
- Style modeling
- User personalization
  - a*GeneralModel + (1-a)*UserModel
- User study
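The personalization formula above, a*GeneralModel + (1-a)*UserModel, amounts to linearly interpolating two predictors. A minimal sketch, with stand-in models whose names and values are hypothetical:

```python
# Blend a general model's prediction with a user-specific one:
# a * general + (1 - a) * user. The two "models" are placeholders.

def blend(general_model, user_model, a):
    """Return a predictor that linearly interpolates two predictors."""
    def model(features):
        return a * general_model(features) + (1 - a) * user_model(features)
    return model

general = lambda f: 1.0   # e.g. average exposure correction overall
user = lambda f: 2.0      # e.g. this user's typically stronger edits
personalized = blend(general, user, a=0.25)
```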

SLIDE 27