From Zero to AI Hero


W4 Test Analytics, AI/ML
Wednesday, October 2nd, 2019, 11:30 AM
From Zero to AI Hero
Presented by: Kevin Pyles


  1. W4 Test Analytics, AI/ML
Wednesday, October 2nd, 2019, 11:30 AM
From Zero to AI Hero
Presented by: Kevin Pyles, Domo, Inc.
Brought to you by:
888-268-8770 · 904-278-0524 · info@techwell.com · http://www.starwest.techwell.com/

  2. Kevin Pyles
Kevin Pyles has been in QA for over ten years and has led local and remote teams responsible for QA across forty-plus concurrent projects. Kevin's previous teams provided web load testing for over 600,000 connections/sec and I18N and L10N testing for over a hundred languages and countries. Kevin recently fell in love with artificial intelligence and Python and has since been promoting AI as the future of testing, whether through homegrown or commercial services. Follow Kevin on Twitter @pyleskevin.

  3. from ZERO to AI HERO, by Kevin Pyles
[Slide background: a collage of ML formulas]
Linear Regression: ŷ = a·x + b
Cost Function (J) of Linear Regression: J = (1/n) Σ_{i=1..n} (pred_i − y_i)²
Perceptron Learning Rule: w_{i,j}(next step) = w_{i,j} + η(y_j − ŷ_j)·x_i
Naïve Bayes: P(c|x) = P(x|c)·P(c) / P(x), with P(c|X) = P(x_1|c) × P(x_2|c) × …
Stochastic Gradient Descent: for i in range(m): Θ_j = Θ_j − α(ŷ_i − y_i)·x_i^j
Root Mean Square Error: RMSE(X,h) = √((1/m) Σ_{i=1..m} (h(x⁽ⁱ⁾) − y⁽ⁱ⁾)²)
Logistic Regression: F(x) = 1 / (1 + e^(−x))
Multiple Linear Regression: ŷ = b_1·X_1 + b_2·X_2 + b_3·X_3 + a
Our journey begins …
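The slide's stochastic gradient descent rule for linear regression can be checked with a short sketch. Everything here is made up for illustration: the toy data is generated from y = 3x + 2, and the learning rate and epoch count are arbitrary choices, so SGD should drive the fitted slope and intercept toward 3 and 2:

```python
import numpy as np

# Toy data from a known line y = 3x + 2 (no noise), so the
# SGD update rule from the slide should recover a = 3, b = 2.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = 3.0 * x + 2.0

a, b = 0.0, 0.0      # parameters (Theta in the slide's notation)
lr = 0.1             # learning rate (alpha in the slide's notation)
for epoch in range(200):
    for i in range(len(x)):
        pred = a * x[i] + b      # y-hat_i = a*x_i + b
        err = pred - y[i]        # (y-hat_i - y_i)
        a -= lr * err * x[i]     # Theta_j = Theta_j - alpha*(y-hat_i - y_i)*x_i^j
        b -= lr * err            # intercept update uses x = 1

# RMSE from the slide, evaluated on the training data
rmse = float(np.sqrt(np.mean((a * x + b - y) ** 2)))
```

On noiseless data like this, the per-sample updates converge to the exact line, which is why the RMSE ends up near zero.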

  4. Thoughts after this presentation:
• What is the future of testing?
• Will AI take over our jobs?
• What am I going to do about it?
• What AI can do …

  5. Jump in!

  6. ML Recommendation System

import numpy as np
import pandas as pd
import matrix_factorization_utilities

# Load user ratings
raw_dataset_df = pd.read_csv('movie_ratings_data_set.csv')

# Load movie titles
movies_df = pd.read_csv('movies.csv', index_col='movie_id')

# Convert the running list of user ratings into a matrix
ratings_df = pd.pivot_table(raw_dataset_df, index='user_id',
                            columns='movie_id', aggfunc=np.max)

# Apply matrix factorization to find the latent features
U, M = matrix_factorization_utilities.low_rank_matrix_factorization(
    ratings_df.values, num_features=15, regularization_amount=0.1)
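For intuition about what the factorization on this slide produces: once the user matrix U (users × latent features) and movie matrix M (latent features × movies) come back, the predicted rating for every user/movie pair is just their matrix product. A tiny made-up example (the numbers and the 2-user, 3-movie shape are invented, and this sketch does not use the matrix_factorization_utilities module from the slide):

```python
import numpy as np

# Hypothetical factorization output: 2 users, 3 movies, k = 2 latent features
U = np.array([[1.0, 0.5],
              [0.2, 1.0]])        # users x k
M = np.array([[4.0, 1.0, 3.0],
              [1.0, 4.0, 2.0]])   # k x movies

# Full predicted-ratings matrix: one score per (user, movie) pair
predicted = U @ M                  # shape (2, 3)

# Recommending = picking the movie with the highest predicted rating
best_for_user_0 = int(predicted[0].argmax())
```

In a real system you would mask out movies the user has already rated before taking the argmax.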

  7. Is this Python? Now that’s a Panda!

  8. Now that’s the Matrix!

  9. Issues learning AI
• Code
• Python
• AI/Machine Learning/Data Science
• Automation
• Data

  10. Never Give Up! What should we do next?

  11. Bring AI to Testing
The Learning Path: Automation → Data Science → Machine Learning → Artificial Intelligence

  12. Automation
[Slide background: the same collage of ML formulas as slide 3]
Our Path to Automation
• Python
• Demo, demo, demo
• Start simple
• Add more tests

  13. Quick Plug for Python
• Python is #1 for AI
• Used by 41.7% of developers
• Beginner friendly
How to Learn Python
• Read a book and complete the exercises
• Work on real-world projects (test automation)
• Demo to the team

  14. Why Beginning Python?
1. We are beginners
2. Someone else recommended it.
3. We wanted to be professionals :)
Automation: Keys to Success
• Start simple
• Even more simple
• Go one step at a time

  15. Selenium: First Script

from selenium import webdriver

# Open a browser
driver = webdriver.Chrome()

# Go to Google.com
driver.get('https://www.google.com')

Data Science
[Slide background: the same collage of ML formulas as slide 3]

  16. AI, ML, Data Science, Oh my!
[Venn diagram: Artificial Intelligence, Machine Learning, Data Science, Data Analytics]
How to learn Data Science
• AI for Everyone on Coursera
• Data Analysis w/ Python 3 and Pandas
• Read Data Science from Scratch
• Linear Algebra from Khan Academy

  17. Data science is more than charts
• In this book:
  – Python
  – Linear Algebra
  – Statistics & Probability
  – Data
  – Linear Regression
  – Neural Networks
  – And lots more!
Data, data, data
• Get data!
  – Test cases
  – Tickets, e.g., Jira
  – Code, e.g., Git

  18. Example:
• What test cases have NEVER failed?
• Could we remove some of them?
[Chart of test cases: pass/fail]
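The "never failed" question on this slide is a few lines of pandas once execution history is loaded. The results table below is a made-up stand-in for whatever your test-management export actually looks like (test-case names and statuses are invented):

```python
import pandas as pd

# Hypothetical execution history: one row per test-case run
results = pd.DataFrame({
    'test_case': ['login', 'login', 'search', 'search', 'export'],
    'status':    ['pass',  'pass',  'pass',   'fail',   'pass'],
})

# A test case has NEVER failed if every one of its recorded runs passed
never_failed = (
    results.groupby('test_case')['status']
           .apply(lambda s: (s == 'pass').all())
)

# These are the removal candidates the slide asks about
candidates = sorted(never_failed[never_failed].index)
```

Counting runs per test case first is worth doing too, since "never failed" means little for a test that has only run once.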

  19. [Chart of test cases: pass/fail]
Additional Questions
• What else could you do with this data?
• Where are your bugs clustered?
• What bugs aren’t getting fixed?
• What code is most buggy?
• What is your bug-to-feature rate?

  20. Share your findings!
• Charts are fun to share!
• We shared and immediately deleted over 900 test cases! Wow!
• But what about quality?
• Quality didn’t go down! More data analytics :)
Machine Learning
[Slide background: the same collage of ML formulas as slide 3]

  21. ML Learning Plan
• Complete MachineLearningMastery.com courses
• Practical Machine Learning Tutorial with Python
• Read Hands-On Machine Learning with Scikit-Learn and TensorFlow
• Complete AI for Element Selection from TestAutomationU
Hands-On Machine Learning
• In this book:
  – Classification
  – Training models
  – Support Vector Machines
  – Decision Trees
  – TensorFlow
  – Convolutional Neural Nets
  – Recurrent Neural Nets
  – Reinforcement Learning
  – Yeah … there’s more in there …

  22. How can ML help with Automation?
Fuzzy Screenshot Comparison
Problem: image comparison tools require exact matches, i.e., pixel = pixel.
Solution: fuzzy comparison using machine learning.
Generate screenshots, as many as you want. The test case will then create a screenshot and compare it against previous screenshots. We even use black and white because we don’t care about the color. If there is a 99% match, that is close enough.

  23. Code

import cv2
from skimage.measure import compare_ssim

class ImageCompare():
    def compare_images(self, image_array_1, image_array_2):
        # Convert the images to grayscale
        grayA = cv2.cvtColor(image_array_1[:], cv2.COLOR_BGR2GRAY)
        grayB = cv2.cvtColor(image_array_2[:], cv2.COLOR_BGR2GRAY)

        # Compute the Structural Similarity Index (SSIM) between the two
        # images, ensuring that the difference between the images is returned
        (score, diff) = compare_ssim(grayA, grayB, full=True)
        diff = (diff * 255).astype("uint8")
        print("SSIM: {}".format(score))
        return score

. . .
AI Bug Detected
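If OpenCV and scikit-image aren't handy, the fuzzy idea can be illustrated with a much cruder stand-in for SSIM: score two grayscale screenshots by the fraction of pixels that agree within a tolerance, and accept anything above the 99% threshold the previous slide mentions. The tolerance, image size, and pixel values here are made-up illustration numbers, not the talk's actual settings:

```python
import numpy as np

def fuzzy_match(img_a, img_b, tol=5, threshold=0.99):
    """Score two same-shaped grayscale images by the fraction of pixels
    whose 0-255 values agree within `tol`; pass if above `threshold`."""
    close = np.abs(img_a.astype(int) - img_b.astype(int)) <= tol
    score = float(close.mean())
    return score, bool(score >= threshold)

base = np.full((100, 100), 200, dtype=np.uint8)   # baseline screenshot
current = base.copy()
current[0:2, 0:10] = 0                             # a tiny 20-pixel change

score, ok = fuzzy_match(base, current)             # 0.998 agreement -> passes
```

SSIM is a better choice in practice because it compares local structure rather than raw pixel values, so it tolerates anti-aliasing and compression noise that a per-pixel check flags as differences.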

  24. AI Changes Detected
Artificial Intelligence
[Slide background: the same collage of ML formulas as slide 3]

  25. AI Learning Plan
• Coursera Deep Learning Specialization
• Twitter Sentiment Analysis – Learn Python for Data Science #2
How can AI help with testing?

  26. Sentiment Analysis
• Sentiment analysis is a field of Natural Language Processing (NLP) that builds models that try to identify and classify attributes of an expression.
Customer Feedback: Positive or Negative?
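As a toy illustration of classifying customer feedback as positive or negative (a hand-rolled lexicon scorer, not a trained NLP model and not what the courses above build; the word lists and feedback strings are invented):

```python
# Tiny made-up sentiment lexicons
POSITIVE = {'love', 'great', 'fast', 'easy'}
NEGATIVE = {'hate', 'slow', 'broken', 'crash'}

def classify(feedback):
    """Label feedback by counting positive vs. negative lexicon hits."""
    words = feedback.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return 'positive' if score > 0 else 'negative' if score < 0 else 'neutral'

labels = [classify(t) for t in
          ['I love the new dashboard',
           'The export is broken and slow']]
```

A real model learns these associations from labeled examples instead of a fixed word list, which is what makes it robust to negation, sarcasm, and vocabulary it has never seen spelled exactly this way.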
