  1. Tensor what? An introduction to AI on mobile Luke Sleeman - Freelance Android developer http://lukesleeman.com luke.sleeman@gmail.com @LukeSleeman

  2. Agenda
     • Intro - History of AI, recent breakthroughs, why AI on mobile is important
     • Part 1 - Key AI tech - Neural networks, image recognition, voice recognition, and how Siri works
     • Part 2 - TensorFlow - What TensorFlow does, server-side use, embedding TensorFlow in a mobile app, models built with TensorFlow
     • Part 3 - Demo - We recognise a banana with my phone!
     • Closing thoughts - Strong AI, why all this is important
     • Questions

  3. Intro - A bit of history

  4. HAL 9000 is a great example of "Strong AI". HAL can:
     • Reason (use strategy, solve puzzles, and make judgments under uncertainty)
     • Represent knowledge, including commonsense knowledge
     • Plan
     • Learn
     • Communicate in natural language
     • Integrate all these skills towards common goals
     • Kill All Humans
     https://en.wikipedia.org/wiki/Artificial_general_intelligence

  5. 10 Print "Good morning Dave, I am HAL 9000"

  6. 10 Print "Good morning Dave, I am HAL 9000"
     20 Input "What can I do for you?", R$

  7. 10 Print "Good morning Dave, I am HAL 9000"
     20 Input "What can I do for you?", R$
     ???

  8. 10 Print "Good morning Dave, I am HAL 9000"
     20 Input "What can I do for you?", R$
     30 Print "I'm sorry, Dave. I'm afraid I can't do that."

  9. 10 Print "Good morning Dave, I am HAL 9000"
     20 Input "What can I do for you?", R$
     30 Print "I'm sorry, Dave. I'm afraid I can't do that."
     40 Goto 20

  10. The First AI Winter
      "Within ten years a digital computer will be the world's chess champion" and "within ten years a digital computer will discover and prove an important new mathematical theorem." - H. A. Simon and Allen Newell, 1958
      "Within a generation ... the problem of creating 'artificial intelligence' will substantially be solved." - Marvin Minsky, 1967

  11. Present Day
      • IBM's Deep Blue beats Kasparov - 1997
      • DARPA self-driving car challenges - 2005, 2007
      • IBM's Watson wins Jeopardy! - 2011
      • Google's AlphaGo beats a Go champion - 2015

  12. AI + Mobile = A great combo

  13. Why AI on mobile is important
      Constrained input + limited interaction means intelligence is important

  14. Part 1 - Key AI Tech

  15. Key AI Tech - Neural Networks

  16. Neural Networks - Classification problem
      • Two dimensions to our data: x1, x2
      • Can draw it helpfully on a graph!

      if (x1 + x2 > 0) {
          ... we are in!
      } else {
          ... we are out!
      }


  19. Neural Networks - Classification problem
      • Two dimensions to our data: x1, x2
      • Can draw it helpfully on a graph!

      if (x1 + x2 > b) {
          ... we are in!
      } else {
          ... we are out!
      }

  20. Neural Networks - Classification problem
      • Two dimensions to our data: x1, x2
      • Can draw it helpfully on a graph!

      if ((w1 * x1) + (w2 * x2) > b) {
          ... we are in!
      } else {
          ... we are out!
      }

  21. Neural Networks - Classification problem
      • Two dimensions to our data: x1, x2
      • Can draw it helpfully on a graph!

      if ((w1 * x1) + (w2 * x2) > b) {
          return 1.0;
      } else {
          return 0.0;
      }
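  The branch on slide 21 is exactly a single artificial neuron: a weighted sum compared against a threshold. A minimal Python sketch, where the weights, threshold, and inputs are hypothetical values chosen for illustration:

      def neuron(x1, x2, w1, w2, b):
          # A single artificial neuron: weighted sum compared to a threshold.
          if (w1 * x1) + (w2 * x2) > b:
              return 1.0
          return 0.0

      # Hypothetical weights and threshold; training would learn these values.
      print(neuron(0.5, 0.8, w1=1.0, w2=1.0, b=0.0))  # -> 1.0, "we are in!"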

  22. Neural Networks - Classification problem

  23. Neural Networks - Classification problem
      Let's do a demo!

  24. Neural networks - Summary
      • They learn to recognise patterns from training data! (See the sketch below.)
      • Work for complex patterns
      • Can take a long time to train
      • You have to choose the right hyper-parameters (number of nodes, how they are laid out, activation function, input transformation, etc)
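  To make "learning from training data" concrete, here is a minimal sketch (not from the talk) of the classic perceptron update rule, which nudges the weights after every wrong answer. The learning rate and epoch count are hyper-parameters of the kind the last bullet mentions, and the data is a random toy set:

      import numpy as np

      # Toy data: the label is 1.0 when x1 + x2 > 0 (an assumed ground truth).
      rng = np.random.default_rng(0)
      X = rng.uniform(-1, 1, size=(200, 2))
      y = (X[:, 0] + X[:, 1] > 0).astype(float)

      w = np.zeros(2)  # weights w1, w2
      b = 0.0          # threshold
      lr = 0.1         # learning rate: a hyper-parameter

      for epoch in range(10):          # epoch count: another hyper-parameter
          for xi, target in zip(X, y):
              prediction = 1.0 if xi @ w > b else 0.0
              error = target - prediction
              w += lr * error * xi     # nudge weights towards the right answer
              b -= lr * error          # and move the threshold the other way

      print(w, b)  # should approximate the x1 + x2 > 0 boundary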

  25. Key AI Tech - Image search and Deep belief networks

  26. Image search

  27. Deep belief networks
      • Stack a number of layers together to make a DBN
      • Early layers learn to recognise simple features
      • Later layers use those features to recognise larger objects
      • A type of 'auto-encoder' that learns from its input
      • Early unsupervised training to recognise features
      • Later supervised training to learn what they 'mean'

  28. Deep belief network layers (figure from "Unsupervised Learning of Hierarchical Representations with Convolutional Deep Belief Networks" by Honglak Lee, Roger Grosse, Rajesh Ranganath, and Andrew Y. Ng)

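  A minimal sketch of the layer-wise recipe above, using the modern Keras API rather than the talk's code, with plain autoencoders standing in for a DBN's restricted Boltzmann machines. The data is random, purely to make the example runnable:

      import numpy as np
      from tensorflow import keras

      # Random stand-ins for real data (assumed: 784-dim inputs, 10 classes).
      x_unlabelled = np.random.rand(2000, 784).astype("float32")
      x_labelled = np.random.rand(500, 784).astype("float32")
      y_labelled = np.random.randint(0, 10, size=(500,))

      # 1. Unsupervised, layer-wise pretraining: each layer learns to
      #    reconstruct the features produced by the layer below it.
      encoders = []
      features = x_unlabelled
      for units in [256, 64]:
          inp = keras.Input(shape=(features.shape[1],))
          encoded = keras.layers.Dense(units, activation="relu")(inp)
          decoded = keras.layers.Dense(features.shape[1])(encoded)
          autoencoder = keras.Model(inp, decoded)
          autoencoder.compile(optimizer="adam", loss="mse")
          autoencoder.fit(features, features, epochs=5, verbose=0)
          encoder = keras.Model(inp, encoded)
          encoders.append(encoder)
          features = encoder.predict(features, verbose=0)  # feeds the next layer

      # 2. Supervised fine-tuning: stack the pretrained encoders and add a
      #    classifier head that learns what the features 'mean'.
      model = keras.Sequential(encoders + [keras.layers.Dense(10, activation="softmax")])
      model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
      model.fit(x_labelled, y_labelled, epochs=5, verbose=0)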

  30. Key AI Tech - Speech recognition and Recurrent Neural Networks

  31. Speech Recognition

  32. Speech recognition and Recurrent Neural Networks (RNNs)
      • Everything up to now has been a feed-forward network
      • In an RNN, output is fed back in as input
      • The feedback serves as a type of short-term memory!
      • Takes a sequence of inputs
      • Supplies results over time (see the sketch below)
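  A minimal numpy sketch (not the talk's code) of that feedback loop: the hidden state h is the short-term memory, fed back in alongside every new input, and a result comes out at every step. The sizes and random weights are assumptions for illustration:

      import numpy as np

      rng = np.random.default_rng(0)
      W_x = rng.normal(size=(4, 3))  # input-to-hidden weights (4-dim inputs, 3 hidden units)
      W_h = rng.normal(size=(3, 3))  # hidden-to-hidden weights: the feedback loop
      b = np.zeros(3)

      h = np.zeros(3)                         # the short-term memory starts empty
      sequence = rng.normal(size=(5, 4))      # a sequence of five inputs
      for x in sequence:
          h = np.tanh(x @ W_x + h @ W_h + b)  # new state depends on the old state
          print(h)                            # a result is supplied at every step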

  33. Key AI Tech - Ok Google, Siri and Recursive Neural Tensor Networks

  34. Ok Google, Hey Siri

  35. Sentence parsing

  36. Alice drove down the street in her car

  37. Recursive Neural Tensor Networks (RNTNs)
      • Good for recognising hierarchical structures
      • Tree-like structure: a root node with left and right leaf children
      • Just like RNNs, the complexity is in how they are invoked
      (Diagram: a root node with two leaf nodes)

  38. Recursive Neural Tensor Networks
      (Diagram sequence, slides 38-43: the parse tree for "The car is fast" is built one node at a time. Leaf vectors for "The" and "car" are combined into a parent node, which gets a class and a score; "is" and then "fast" are folded in the same way until the whole sentence is represented at the root.)
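  A minimal numpy sketch of the recursion in those diagrams: the same small network combines two child vectors into a parent vector, applied node by node up the tree. The word vectors are random stand-ins, the two-class softmax is an assumption, and a full RNTN also adds a bilinear tensor term at each node; this sketch keeps only the matrix part:

      import numpy as np

      rng = np.random.default_rng(1)
      D = 4                              # word-vector dimension (an assumption)
      W = rng.normal(size=(D, 2 * D))    # composition weights, shared by every node
      U = rng.normal(size=(2, D))        # classifier producing the "class, score"

      def compose(left, right):
          # Combine two child vectors into one parent vector.
          return np.tanh(W @ np.concatenate([left, right]))

      # Random stand-ins for the word vectors of "The car is fast".
      the, car, is_, fast = (rng.normal(size=D) for _ in range(4))

      # Build the tree in the order the slides do.
      node = compose(the, car)       # "The car"
      node = compose(is_, node)      # "is The car"
      root = compose(fast, node)     # "fast is The car"

      scores = U @ root
      probs = np.exp(scores) / np.exp(scores).sum()  # softmax over the classes
      print(probs)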

  44. Part 2 - TensorFlow

  45. Tensor What?

  46. TensorFlow - some code

      $ python
      ...
      >>> import tensorflow as tf
      >>> hello = tf.constant('Hello, TensorFlow!')
      >>> sess = tf.Session()
      >>> print(sess.run(hello))
      Hello, TensorFlow!
      >>> a = tf.constant(10)
      >>> b = tf.constant(32)
      >>> print(sess.run(a + b))
      42

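  The REPL above highlights TensorFlow's deferred execution: each Python line only adds a node to a graph, and nothing is computed until sess.run(). A small sketch in the same TF 1.x style, making the two phases explicit with a placeholder fed at run time:

      import tensorflow as tf

      # Phase 1: build the graph. Nothing is computed yet.
      x = tf.placeholder(tf.float32)  # an input to be supplied later
      y = x * 3.0 + 1.0               # a node describing the computation

      # Phase 2: run the graph, feeding concrete values in.
      with tf.Session() as sess:
          print(sess.run(y, feed_dict={x: 2.0}))  # -> 7.0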

  51. TensorFlow - some code

      dnnc = tf.contrib.learn.DNNClassifier(
          feature_columns=feature_columns,
          hidden_units=[20, 20, 20, 20],
          n_classes=2)

      dnnc.fit(x=latlng_train, y=is_mt_train, steps=200)

      accuracy = dnnc.evaluate(x=latlng_test, y=is_mt_test)["accuracy"]
      print('Accuracy: {:.2%}'.format(accuracy))

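  For context, a hedged sketch of the inputs the code above assumes. The lat/lng arrays here are random stand-ins for the talk's labelled dataset, and tf.contrib.learn is the old TF 1.x API (removed in TensorFlow 2):

      import numpy as np
      import tensorflow as tf

      # Random stand-in data: lat/lng points with a binary label (is_mt).
      latlng = np.random.uniform(-1.0, 1.0, size=(1000, 2)).astype(np.float32)
      is_mt = (latlng[:, 0] + latlng[:, 1] > 0).astype(np.int32)

      latlng_train, latlng_test = latlng[:800], latlng[800:]
      is_mt_train, is_mt_test = is_mt[:800], is_mt[800:]

      # Describe the input to the classifier: two real-valued features.
      feature_columns = tf.contrib.learn.infer_real_valued_columns_from_input(latlng_train)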

  55. Building an app with TensorFlow

  56. Tensor No
