


  1. AI and Predictive Analytics in Data-Center Environments: Neural Networks and Deep Learning. Josep Ll. Berral @BSC. Intel Academic Education Mindshare Initiative for AI.

  2. Introduction • “Neural networks attempt to imitate the brain's neuron mechanisms.” Actually, they are mathematical artifacts that learn non-linear models.

  3. Introduction • Neural networks allow non-linear models. [Figure: two panels, “Dimensions” vs. “More Dimensions”, showing data becoming separable when lifted into more dimensions]

  4. Introduction • NNs adjust input weights after passing the data through repeatedly. [Diagram: Data → Neural Net → Predict → Get the error → Adjust → back to the Neural Net] Repeat this until the error is low enough.

  5. Introduction • NNs adjust input weights after passing the data through repeatedly. [Same diagram] Repeat this until the error is low enough; then repeat the whole process with different NN tunings until the error is even lower.

  6. Introduction • NNs adjust input weights after passing the data through repeatedly. [Same diagram] Repeat until the error is low enough, then repeat with different NN tunings until the error is even lower. That means lots of experiments to RUN: parallelism is welcome, accelerators are welcome, any optimization is welcome.

  7. Introduction • Schema of the perceptron (basic linear neuron) • Example: inputs x, output g(x), with f(x) = ∑_i (x_i · w_i + b_i) and g(x) = sigm(f(x)), so y = g(f(x)) = sigm(∑_i (x_i · w_i + b_i))
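A minimal sketch of this perceptron in Python/NumPy, following the slide's formula (the per-input biases b_i match the notation above; the input and weight values are made up for illustration):

```python
import numpy as np

def sigm(z):
    # Logistic sigmoid: squashes f(x) into (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def perceptron(x, w, b):
    # f(x) = sum_i (x_i * w_i + b_i), then g(x) = sigm(f(x))
    return sigm(np.sum(x * w + b))

# Illustrative (made-up) input, weights and per-input biases
x = np.array([0.5, -1.2, 3.0])
w = np.array([0.8, 0.1, -0.4])
b = np.array([0.05, 0.05, 0.05])
print(perceptron(x, w, b))  # a single activation in (0, 1)
```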

  8. Introduction • …but simple neurons can't learn non-linear mechanisms • We can create networks! • Example: XOR (non-linear) • XOR is not linear, so it requires composing neurons: input, a “hidden layer”, and output → Multi-Layer Perceptrons (MLP), as sketched below.
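A hand-built sketch of such a network in Python: the weights below are picked by hand rather than learned, just to show that one hidden layer suffices for XOR:

```python
def step(z):
    # Threshold unit: fires when its weighted input crosses zero
    return 1.0 if z > 0 else 0.0

def xor_mlp(x1, x2):
    # Hand-picked weights (illustrative, not learned):
    # hidden neuron h1 acts as OR, hidden neuron h2 acts as AND,
    # and the output fires when OR holds but AND does not -> XOR
    h1 = step(x1 + x2 - 0.5)    # OR
    h2 = step(x1 + x2 - 1.5)    # AND
    return step(h1 - h2 - 0.5)  # OR and not AND

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", int(xor_mlp(a, b)))  # prints the XOR truth table
```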

  9. Training a regular NN • Iterative process on MLPs: Predict → ŷ (forward pass X → hidden → output); ← Check error & Adjust, with ε₂ = y − ŷ₂ at the output and ε₁ = ỹ₁ − ŷ₁ propagated back to the hidden layer • Repeat until an acceptable error or the iteration limit is reached. A training-loop sketch follows below.
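A minimal sketch of this loop in NumPy, training a small MLP on the XOR example from the previous slide (the 2-4-1 layer sizes, learning rate and epoch count are arbitrary choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

# The XOR data from the previous slide
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

def sigm(z):
    return 1.0 / (1.0 + np.exp(-z))

# A 2-4-1 MLP with random initial weights (sizes are arbitrary choices)
W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))

lr = 1.0  # learning rate, one of the hyper-parameters discussed later
for epoch in range(5000):
    # Predict (forward pass)
    H = sigm(X @ W1 + b1)       # hidden activations
    Y_hat = sigm(H @ W2 + b2)   # predictions

    # Get the error, then adjust the weights (backward pass)
    d2 = (Y_hat - Y) * Y_hat * (1 - Y_hat)  # error at the output
    d1 = (d2 @ W2.T) * H * (1 - H)          # error propagated to the hidden layer
    W2 -= lr * (H.T @ d2); b2 -= lr * d2.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d1); b1 -= lr * d1.sum(axis=0, keepdims=True)

print(np.round(Y_hat.ravel(), 2))  # approaches [0, 1, 1, 0]
```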

  10. Composition of NNs • Decisions: how many hidden layers? [Diagram: Input → h1 → Output; Input → h1 → h2 → Output; Input → h1 → h2 → h3 → Output]

  11. Composition of NNs • Decisions: how many neurons per layer? [Diagram: three Input → h1 → Output networks, each with a different number of neurons in h1]

  12. Composition of NNs • Decisions: which kind of neuron per layer? Linear? Sigmoid? Rectifier? Convolutional? … [Diagram: Input → h1 (convolutional + pooling) → h2 (sigmoid) → Output (linear)]
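The non-convolutional kinds are just different activation functions; a quick sketch of the three named on the slide:

```python
import numpy as np

def linear(z):
    return z                         # identity: no non-linearity

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))  # squashes into (0, 1)

def rectifier(z):
    return np.maximum(0.0, z)        # ReLU: zero for negative inputs

z = np.linspace(-2.0, 2.0, 5)
for f in (linear, sigmoid, rectifier):
    print(f.__name__, f(z))
```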

  13. Composition of NNs • Decisions: do we have any kind of “recurrence” (back connections)? [Diagram: Input → h1 → h2 → Output, with a connection feeding back into an earlier layer]
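A back connection means a layer's earlier output is fed in again at the next step. A minimal sketch of one recurrent update (the tanh activation and the sizes are illustrative assumptions, not from the slide):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 3))  # input -> hidden weights
U = rng.normal(size=(3, 3))  # hidden -> hidden: the back connection
b = np.zeros(3)

h = np.zeros(3)  # hidden state, initially empty
for x in [np.ones(3), np.zeros(3), np.ones(3)]:  # a toy input sequence
    h = np.tanh(W @ x + U @ h + b)  # the new state depends on the previous one
    print(h)
```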

  14. Composition of NNs • Decisions: how many iterations (“EPOCHS”)? Any “stop” criteria? [Diagram: the Predict → / ← Adjust loop over Input → h1 → h2 → Output] • Also other hyper-params! (learning rate, momentum… see the sketch below)
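For instance, momentum keeps a running velocity of past gradient steps so the updates don't zig-zag. A sketch of the classic update rule for a single weight (all values illustrative):

```python
# Classic momentum update for one weight (illustrative values)
lr, momentum = 0.1, 0.9
velocity = 0.0
for gradient in [1.0, 0.8, 0.5]:  # pretend gradients from successive epochs
    velocity = momentum * velocity - lr * gradient
    print(velocity)               # the step applied as: weight += velocity
```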

  15. Composition of NNs • Most of these issues and decisions come down to EXPERIMENTS → TRIAL/ERROR executions! A sweep sketch follows below.
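In practice that means sweeping the decisions from the previous slides. A minimal sketch of such a sweep; train_and_score is a hypothetical helper standing in for one full training run like the loop shown earlier:

```python
import itertools

def train_and_score(layers, neurons, activation, epochs, lr):
    # Hypothetical stand-in for one full training run,
    # returning the validation error of that configuration.
    return 0.0  # placeholder

# Every combination below is one experiment to execute (ideally in parallel)
grid = itertools.product(
    [1, 2, 3],             # how many hidden layers?
    [10, 30, 100],         # how many neurons per layer?
    ["sigmoid", "relu"],   # which kind of neuron?
    [100, 1000],           # how many epochs?
    [0.01, 0.1],           # learning rate
)
best = min(grid, key=lambda cfg: train_and_score(*cfg))
print(best)
```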

  16. Details of Neural Networks • Readability: NNs are hard to “read” • We can't always know what's happening inside • Sometimes the weights can be visualized, to see the weights of each neuron (a plotting sketch follows below)
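A minimal plotting sketch, assuming matplotlib and a first-layer weight matrix shaped (784, 30) like the MNIST network on the next slide (random stand-in weights here):

```python
import numpy as np
import matplotlib.pyplot as plt

# Assumed: W1 holds trained input->hidden weights, shape (784, 30),
# matching the MNIST network on the next slide. Random stand-in here.
W1 = np.random.default_rng(0).normal(size=(784, 30))

fig, axes = plt.subplots(1, 5, figsize=(10, 2))
for i, ax in enumerate(axes):
    # Each hidden neuron's 784 input weights reshape into a 28x28 image
    ax.imshow(W1[:, i].reshape(28, 28), cmap="gray")
    ax.set_axis_off()
plt.show()
```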

  17. Details of Neural Networks • Example: MNIST, hand-written digit recognition [Diagram: Input (28×28) → hidden layer (30 neurons) → Output (10 neurons), predicting “9”]
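A sketch of that architecture, assuming Keras; the slide only gives the layer sizes, so the sigmoid/softmax activations and the SGD optimizer are assumptions:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),    # 28x28 pixels -> 784 inputs
    tf.keras.layers.Dense(30, activation="sigmoid"),  # hidden layer, 30 neurons
    tf.keras.layers.Dense(10, activation="softmax"),  # one output per digit 0-9
])
model.compile(optimizer="sgd",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
model.fit(x_train / 255.0, y_train, epochs=5)
```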

  18. Deep Neural Networks • Some hard problems can be solved using “Deep” NNs: NNs with several hidden layers • Highly complex datasets require a high number of layers and neurons • Training requires High-Performance Computing • E.g., image-recognition scenarios [Diagram: Input (image) → conv + pool + relu stacks (50000, 10000 and 2000 neurons) → flatten → linear + relu (1000 neurons) → linear + sigm → Output (classes), predicting “cat”]
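A sketch of such a stack, assuming Keras; the filter counts, image size and class count below are illustrative, not the slide's exact numbers:

```python
import tensorflow as tf
from tensorflow.keras import layers

model = tf.keras.Sequential([
    layers.Input(shape=(224, 224, 3)),                               # input image
    layers.Conv2D(32, 3, activation="relu"), layers.MaxPooling2D(),  # conv + pool + relu
    layers.Conv2D(64, 3, activation="relu"), layers.MaxPooling2D(),  # conv + pool + relu
    layers.Conv2D(128, 3, activation="relu"), layers.MaxPooling2D(), # conv + pool + relu
    layers.Flatten(),
    layers.Dense(1000, activation="relu"),                           # linear + relu
    layers.Dense(10, activation="sigmoid"),                          # output, one unit per class
])
model.summary()
```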

  19. DNN & Transfer Learning • Some trained NNs can be downloaded and tuned/adapted [Diagram: “I trained this” → Trained NN → The Internet → “I have new data” → refined training with little data → refined NN]
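A sketch of the download-and-adapt step, assuming Keras and its bundled ImageNet-trained VGG19; the new 5-class head is an assumption:

```python
import tensorflow as tf

# Download a NN someone else trained (VGG19 on ImageNet)
base = tf.keras.applications.VGG19(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False  # keep the downloaded weights fixed

# Refine it for our own (little) data: train only a new head
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(5, activation="softmax"),  # e.g. 5 new classes
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(little_x, little_y, epochs=10)  # our small dataset
```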

  20. DNN & Transfer Learning • We can re-train NNs for fine-grain tuning, even just parts of a NN • E.g.: VGG19 (an image-recognition NN) + tuning the “style” layer (hl4) • Transfer Learning on images: [Diagram: forward pass through VGG19, then update the style layer and reconstruct]
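A sketch of re-training only part of a NN, again with Keras' VGG19; freezing everything except the block-4 layers stands in for the slide's “style” layer (hl4):

```python
import tensorflow as tf

vgg = tf.keras.applications.VGG19(weights="imagenet")

# Re-train only part of the NN: freeze every layer outside block 4
for layer in vgg.layers:
    layer.trainable = "block4" in layer.name

print([l.name for l in vgg.layers if l.trainable])  # only block4 layers stay tunable
```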

  21. Summary • Neural Networks: a method for non-linear modeling • Capabilities: multi-dimensional data; Transfer Learning • Difficulties: NN smithing and tinkering (deciding the architecture, deciding the hyper-params), and checking each decision → AN EXPERIMENT TO RUN!
