

  1. DNN-Assisted Parameter Space Exploration and Visualization for Large-Scale Simulations
     Han-Wei Shen
     Department of Computer Science and Engineering, The Ohio State University

  2. Traditional Visualization Pipeline
     Simulator (supercomputer) → raw data → I/O → disk → I/O → memory → post-analysis
     • Data are often very large
     • Mainly batch-mode processing; interactive exploration is not possible
     • Limited parameters to explore

  3. Parameter Space Exploration
     • Running large-scale simulations is very time- and storage-consuming
     • A physical simulation typically has a huge parameter space
     • Ensemble simulations are needed to analyze the model quality and to identify the uncertainty of the simulation
     • It is not possible to exhaust all possible input parameters: doing so is time and space prohibitive

  4. DNN-Assisted Parameter Space Exploration
     • Can we create visualizations of the simulation outputs without saving the data? Or
     • Can we predict the simulation results without running the simulation?
     • Why?
       - Identify important simulation parameters
       - Identify simulation parameter sensitivity
       - Quantify the uncertainty of the simulation models
     • Methods:
       - InSituNet (IEEE SciVis'19 Best Paper)
       - NNVA (IEEE VAST'19 Best Paper Honorable Mention)

  5. InSituNet: Deep Image Synthesis for Parameter Space Exploration of Ensemble Simulations

  6. Introduction
     • Ensemble data analysis workflow: simulation parameters → ensemble simulations → I/O → raw data on disk → I/O → post-hoc analysis
     • Issues: I/O bottleneck and storage overhead

  7. Introduction
     • In situ visualization
       - Generating visualizations at simulation time
       - Storing images for post-hoc analysis
     • Workflow: simulation, visual mapping, and view parameters → ensemble simulations with in situ visualization → I/O → image data on disk → I/O → post-hoc analysis

  8. Introduction
     • Challenges: storing only images limits the flexibility of post-hoc exploration and analysis
       - Raw data are no longer available
       - Incapable of exploring the simulation parameters: expensive simulations need to be conducted for new parameter settings
     • Our approach
       - Studying how the parameters influence the visualization results
       - Predicting visualization results for new parameter settings

  9. Approach Overview
     • In situ visualization of the ensemble simulations maps simulation parameters p_sim, visual mapping parameters p_vis, and view parameters p_view to images: ℱ(p_sim, p_vis, p_view) = image
     • InSituNet: a deep neural network that models the function ℱ

  10. Design of InSituNet
      • Three subnetworks and two losses
        - Regressor (mapping parameters to prediction images)
        - Feature comparator (computing the feature reconstruction loss)
        - Discriminator (computing the adversarial loss)
      [Diagram: input parameters → Regressor → prediction; prediction vs. ground truth → Feature comparator → feature reconstruction loss; prediction vs. ground truth → Discriminator → adversarial loss]


  12. Regressor
      • A convolutional neural network (CNN)
      • Input: parameters p_sim, p_vis, and p_view
      • Output: prediction image
      • Weights θ, updated during training
      [Architecture of the regressor: the n-, m-, and l-dimensional parameter vectors each pass through a fully connected layer to 512 dimensions, are concatenated (1536), and projected to a 4 × 4 × 16k tensor; residual blocks with batch normalization, ReLU, and upsampling grow it through 8 × 8 × 16k, 16 × 16 × 8k, 32 × 32 × 8k, 64 × 64 × 4k, and 128 × 128 × 2k to 256 × 256 × k; a final 3 × 3 convolution and tanh produce the 256 × 256 × 3 image]
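To make the shape flow on this slide concrete, here is a minimal numpy sketch of the regressor's forward pass. It is only a placeholder for the trained network: all weights are random, the residual blocks are stood in for by a single ReLU-activated channel mixing, and the function and parameter names are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fc(x, out_dim):
    """Fully connected layer with random (untrained) weights."""
    W = rng.standard_normal((out_dim, x.shape[0])) * 0.01
    return W @ x

def upsample2x(t):
    """Nearest-neighbor upsampling, doubling height and width."""
    return t.repeat(2, axis=0).repeat(2, axis=1)

def regressor_forward(p_sim, p_vis, p_view, k=16):
    # Encode each parameter vector to 512 dimensions, then concatenate (1536).
    z = np.concatenate([fc(p_sim, 512), fc(p_vis, 512), fc(p_view, 512)])
    # Project to a small spatial tensor and reshape to (4, 4, 16*k).
    t = fc(z, 4 * 4 * 16 * k).reshape(4, 4, 16 * k)
    # Six upsampling stages: 4 -> 8 -> 16 -> 32 -> 64 -> 128 -> 256,
    # reducing channels as on the slide (16k -> 8k -> 8k -> 4k -> 2k -> k).
    for c in [16 * k, 8 * k, 8 * k, 4 * k, 2 * k, k]:
        t = upsample2x(t)
        # Stand-in for a residual block: one random channel-mixing layer + ReLU.
        W = rng.standard_normal((t.shape[2], c)) * 0.01
        t = np.maximum(0.0, t @ W)
    # Final projection to 3 channels and tanh, as on the slide.
    W = rng.standard_normal((t.shape[2], 3)) * 0.01
    return np.tanh(t @ W)

image = regressor_forward(rng.standard_normal(3),
                          rng.standard_normal(2),
                          rng.standard_normal(4))
print(image.shape)  # (256, 256, 3)
```

The sketch shows only how the dimensions evolve from parameter vectors to a 256 × 256 × 3 image; the real network's layer details are in the diagram above.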

  13. Regressor
      • A convolutional neural network (CNN)
      • Input: simulation, visual mapping, and view parameters
      • Output: prediction image
      • Weights θ: weights collected from all layers
      [Regressor architecture diagram, as in slide 12]

  14. Regressor
      • Fully connected layer
        - Input: 1D vector x ∈ ℝⁿ
        - Weights: matrix W ∈ ℝ^(m×n)
        - Output: 1D vector y ∈ ℝᵐ, with y = Wx
      • Activation function
        - Rectified Linear Unit (ReLU): y′ = max(0, y)
      [Regressor architecture diagram, as in slide 12]
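The two building blocks on this slide can be written out in a few lines of numpy; the concrete matrix and vector below are made-up numbers chosen so that the ReLU has a visible effect.

```python
import numpy as np

# Fully connected layer: y = W x, with W in R^(m x n) (here m=3, n=4).
W = np.array([[ 1.0, 0.0, -1.0,  2.0],
              [ 0.5, 1.0,  0.0,  0.0],
              [-2.0, 0.0,  1.0, -1.0]])
x = np.array([1.0, 2.0, 3.0, 4.0])   # input vector in R^4
y = W @ x                            # output vector in R^3

# ReLU activation: element-wise max(0, y) clips negatives to zero.
y_relu = np.maximum(0.0, y)

print(y)       # [ 6.   2.5 -3. ]
print(y_relu)  # [ 6.   2.5  0. ]
```

Note how the third component, which is negative after the linear layer, is zeroed by the ReLU; this nonlinearity is what lets stacked layers model more than a single matrix product.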

  15. Regressor
      • 2D convolutional layer
        - Input: tensor X ∈ ℝ^(h×w×c)
        - Weights: kernel K ∈ ℝ^(k_h×k_w×c×c′)
        - Output: tensor Y ∈ ℝ^(h×w×c′), with Y = K ∗ X
      • Residual block [1]
        - Adding the input to the output of the convolutional layers
      [1] K. He, X. Zhang, S. Ren, and J. Sun. Deep residual learning for image recognition. In Proceedings of 2016 IEEE Conference on Computer Vision and Pattern Recognition, pp. 770–778, 2016.
      [Regressor architecture diagram, as in slide 12]
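A toy numpy sketch of these two pieces: a naive 3 × 3 same-padded convolution (cross-correlation, as deep learning libraries compute it) and a residual block that adds its input to the convolution output. The kernel shapes and the `residual_block` layout are illustrative, not the exact InSituNet blocks. A useful consequence of the skip connection, checked at the end, is that with all-zero kernels the block reduces exactly to the identity mapping.

```python
import numpy as np

def conv2d(x, kernel):
    """Naive 3x3 'same' 2D convolution. x: (h, w, c_in); kernel: (3, 3, c_in, c_out)."""
    h, w, _ = x.shape
    c_out = kernel.shape[3]
    xp = np.pad(x, ((1, 1), (1, 1), (0, 0)))      # zero-pad height and width
    y = np.zeros((h, w, c_out))
    for i in range(h):
        for j in range(w):
            patch = xp[i:i+3, j:j+3, :]                   # (3, 3, c_in) window
            y[i, j] = np.tensordot(patch, kernel, axes=3) # sum over 3, 3, c_in
    return y

def residual_block(x, k1, k2):
    """y = x + conv(ReLU(conv(x))): the input is added to the conv output."""
    return x + conv2d(np.maximum(0.0, conv2d(x, k1)), k2)

rng = np.random.default_rng(1)
x = rng.standard_normal((8, 8, 4))
# With zero kernels the convolution path contributes nothing,
# so the residual block is exactly the identity mapping.
zeros = np.zeros((3, 3, 4, 4))
assert np.allclose(residual_block(x, zeros, zeros), x)
```

That identity behavior is why residual blocks ease training of deep networks: each block only has to learn a correction on top of its input rather than the full mapping.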

  16. Loss Function
      • Measures the difference between the prediction and the ground truth
      • Used to update the weights θ through backpropagation
      [Diagram: input parameters → Regressor → prediction; prediction vs. ground truth → loss]
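A minimal example of "loss drives weight updates", using a linear stand-in for the regressor rather than the actual CNN: for y = Wx under a mean squared error loss, the gradient is dL/dW = (2/N)(Y − T)ᵀX, and repeated gradient descent steps shrink the loss. All data here are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 'regressor': y = W x. The target mapping is defined by W_true.
W_true = rng.standard_normal((3, 5))
W = np.zeros((3, 5))                  # weights theta, to be learned
X = rng.standard_normal((100, 5))     # 100 synthetic input parameter vectors
T = X @ W_true.T                      # ground-truth outputs

def loss(W):
    """Mean over samples of the squared error ||y_i - t_i||^2."""
    return np.mean(np.sum((X @ W.T - T) ** 2, axis=1))

loss_before = loss(W)
for _ in range(200):
    Y = X @ W.T
    # Backpropagation for this linear model: dL/dW = 2/N * (Y - T)^T X
    grad = 2.0 / len(X) * (Y - T).T @ X
    W -= 0.1 * grad                   # gradient descent step
loss_after = loss(W)

print(loss_before > loss_after)  # True: the loss decreases as W is updated
```

In InSituNet the same principle applies, except the gradient is computed through all CNN layers automatically by backpropagation rather than by a closed-form formula.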

  17. Loss Function: Straightforward Approach
      • Pixel-wise loss functions
        - Example: mean squared error loss (MSE loss, ℒ_mse)
        - Issue: blurry prediction images
      [Diagram: input parameters → Regressor → prediction; prediction vs. ground truth → MSE loss; example of a blurry image generated by a regressor trained with the MSE loss]
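A small numeric illustration of why MSE training yields blur: when two sharp images are equally plausible ground truths for similar inputs, their pixel-wise average scores a lower expected MSE than either sharp image, so the regressor is pushed toward the blurry mean. The 4 × 4 "images" below are made up for the demonstration.

```python
import numpy as np

# Two equally plausible sharp 'ground-truth images' (4x4 grayscale):
# one bright on the left half, one bright on the right half.
a = np.zeros((4, 4)); a[:, :2] = 1.0
b = np.zeros((4, 4)); b[:, 2:] = 1.0

def expected_mse(pred):
    """Expected MSE when a and b are equally likely ground truths."""
    return 0.5 * np.mean((pred - a) ** 2) + 0.5 * np.mean((pred - b) ** 2)

mean_img = 0.5 * (a + b)   # the blurry average: 0.5 everywhere

# The blurry mean beats either sharp image under expected MSE.
print(expected_mse(mean_img) < expected_mse(a))  # True
print(expected_mse(mean_img) < expected_mse(b))  # True
```

This is exactly the failure mode on the slide: the pixel-wise optimum averages over plausible outputs instead of committing to one sharp image.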

  18. Loss Function: Our Approach
      • Combining two loss functions defined by two subnetworks
        - Feature comparator → feature reconstruction loss ℒ_feat
        - Discriminator → adversarial loss ℒ_adv
      [Diagram: input parameters → Regressor → prediction; prediction vs. ground truth → Feature comparator → feature reconstruction loss; prediction vs. ground truth → Discriminator → adversarial loss]
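A sketch of how the two losses combine. The feature comparator and discriminator here are random linear stand-ins (in InSituNet they are trained subnetworks), the images are flattened random vectors, and the weighting factor λ is an assumption for illustration, not a value from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in feature comparator: a fixed random linear feature extractor.
F = rng.standard_normal((16, 64)) * 0.1

def feature_reconstruction_loss(pred, truth):
    """L_feat: MSE between comparator features of prediction and ground truth."""
    return np.mean((F @ pred - F @ truth) ** 2)

# Stand-in discriminator: sigmoid of a linear score in (0, 1).
w_d = rng.standard_normal(64) * 0.1

def adversarial_loss(pred):
    """L_adv: the regressor wants D(pred) close to 1, i.e. -log D(pred) small."""
    d = 1.0 / (1.0 + np.exp(-(w_d @ pred)))
    return -np.log(d)

pred  = rng.standard_normal(64)   # flattened prediction 'image'
truth = rng.standard_normal(64)   # flattened ground-truth 'image'

lam = 0.01                        # assumed weighting between the two losses
total = feature_reconstruction_loss(pred, truth) + lam * adversarial_loss(pred)
print(total > 0)  # True
```

The design intent is that ℒ_feat keeps the prediction close to the ground truth in a perceptual feature space, while ℒ_adv rewards predictions the discriminator finds realistic, counteracting the blur that a pixel-wise loss alone would produce.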

  19. Design of InSituNet
      • Three subnetworks and two losses
        - Regressor (mapping parameters to prediction images)
        - Feature comparator (computing the feature reconstruction loss)
        - Discriminator (computing the adversarial loss)
      [Diagram: input parameters → Regressor → prediction; prediction vs. ground truth → Feature comparator → feature reconstruction loss; prediction vs. ground truth → Discriminator → adversarial loss]

