

  1. Human-Centered Machine Learning. Saleema Amershi, Machine Teaching Group, Microsoft Research. UW CSE 510 Lecture, March 1, 2016

  2. What is Machine Teaching?

  3. Can improve learning with better learning strategies: • Note taking • Self-explanation • Practice • Mnemonic devices • ….

  4. Machine Teaching / Machine Learning. Images from http://thetomatos.com

  5. What is Machine Learning?

  6. What is Machine Learning? “Process by which a system improves performance from experience.” – Herbert Simon. “Study of algorithms that improve their performance P at some task T with experience E” – Tom Mitchell. “Field of study that gives computers the ability to learn without being explicitly programmed” – Arthur Samuel

  7. Programming

  8. Programming

  9. Programming

  10. Programming

  11. f(x) ≈ y

  12. f( ) ≈ 2
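
The idea on slides 11–12 can be made concrete in a few lines. This is a minimal sketch, not from the lecture: it learns a function f from labeled examples so that f(image) ≈ its digit, using scikit-learn's bundled digits dataset as a stand-in.

```python
# Minimal sketch of "f(x) ≈ y": learn f from labeled examples instead of
# programming it by hand. scikit-learn's digits dataset is a stand-in.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)        # x: 8x8 pixel images, y: digits 0-9
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

f = LogisticRegression(max_iter=1000).fit(X_train, y_train)  # learn f from (x, y) pairs
print(f.predict(X_test[:1]), y_test[:1])   # f(image) ≈ the digit it depicts
```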

  13. How do Machines Learn?

  14. How do Machines Learn? Data → Clean → Apply Machine Learning Algorithms → Model

  15. How do Machines Learn? Apply Machine Learning Algorithms. Images from: https://www.toptal.com/machine-learning/machine-learning-theory-an-introductory-primer

  16. How do Machines Learn? Data → Clean → Apply Machine Learning Algorithms → Model. Where do you get this data? How should it be represented? Which algorithm should you use? How do you know if it's working?
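
The questions on slide 16 map directly onto code. A hedged sketch (the dataset and estimators are placeholders, not the lecture's): the feature matrix is the representation, the estimator is the algorithm choice, and cross-validation answers "is it working?".

```python
# Sketch: slide 16's questions made concrete with scikit-learn.
from sklearn.datasets import load_digits
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = load_digits(return_X_y=True)              # "how should it be represented?"
for algo in (DecisionTreeClassifier(), SVC()):   # "which algorithm should you use?"
    scores = cross_val_score(algo, X, y, cv=5)   # "how do you know if it's working?"
    print(type(algo).__name__, round(scores.mean(), 3))
```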

  17. Investigating Statistical Machine Learning as a Tool for Software Developers. Patel, K., Fogarty, J., Landay, J., and Harrison, B. CHI 2008.

  18. Methodology: Semi-structured interviews with 11 researchers. 5-hour think-aloud study with 10 participants. Digit recognition task.

  19. Applied Machine Learning is a Process: Collect Data → Create Features → Select Model → Evaluate. Slide content from Kayur Patel

  20. Applied Machine Learning is a Process: Collect Data → Create Features → Select Model → Evaluate

  21. Applied Machine Learning is a Process: Collect Data → Create Features → Select Model → Evaluate

  22. What About Music Recommendation? Collect Data → Create Features → Select Model → Evaluate. Genre: Rock. Tempo: Fast. Drums: Yes. Time of day: Afternoon. Recently heard: No. ….
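
To ground the Create Features step, here is a sketch that turns the slide's song attributes into a numeric feature vector. The attribute names come from the slide; the one-hot encoding via DictVectorizer is an assumed choice, not one the lecture prescribes.

```python
# Sketch: encode the slide's music-recommendation attributes as features.
from sklearn.feature_extraction import DictVectorizer

songs = [
    {"genre": "Rock", "tempo": "Fast", "drums": "Yes",
     "time_of_day": "Afternoon", "recently_heard": "No"},
    {"genre": "Jazz", "tempo": "Slow", "drums": "No",
     "time_of_day": "Evening", "recently_heard": "Yes"},
]
vec = DictVectorizer(sparse=False)
X = vec.fit_transform(songs)           # one-hot feature matrix for a model
print(vec.get_feature_names_out())
print(X)
```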

  23. Problems with current tools: Don't support machine learning as an iterative and exploratory process. Image from: http://www.crowdflower.com/blog/the-data-science-ecosystem

  24. Problems with current tools: Don't support machine learning as an iterative and exploratory process. Don't support relating data to behaviors of the algorithm. LogitBoostWith8To18EvenWindow-Iter=10.model, LogitBoostWith8To18EvenWindow-Iter=20.model, SVMWith8To18EvenWindow-Iter=10.model, ….

  25. Problems with current tools: Don't support machine learning as an iterative and exploratory process. Don't support relating data to behaviors of the algorithm. Don't support evaluation in context of use.

  26. Model Performance is Important. Data → Clean → Apply Machine Learning Algorithms → Model

  27. What are other considerations? Collect Data → Create Features → Select Model → Evaluate

  28. Computational Efficiency? Collect Data → Create Features → Select Model → Evaluate

  29. Data Processing Efficiency? Collect Data → Create Features → Select Model → Evaluate. “Data scientists, according to interviews and expert estimates, spend 50 percent to 80 percent of their time mired in this more mundane labor of collecting and preparing unruly digital data.” – New York Times, 2014
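
A toy illustration of that "mundane labor": deduplicating, normalizing, and type-fixing records before any modeling can start. The records below are invented for the example.

```python
# Sketch of routine data cleaning with pandas on made-up records.
import pandas as pd

raw = pd.DataFrame({
    "tempo": ["120", "95", None, "95"],
    "genre": ["rock", "Rock", "jazz", "Rock"],
})
clean = (
    raw.drop_duplicates()                                     # remove repeats
       .assign(genre=lambda d: d["genre"].str.lower())        # normalize case
       .assign(tempo=lambda d: pd.to_numeric(d["tempo"]))     # fix types
       .dropna(subset=["tempo"])                              # handle missing values
)
print(clean)
```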

  30. Understandability? Collect Data → Create Features → Select Model → Evaluate. “TAP9 initially used a decision tree algorithm because it allowed TAP9 to easily see what features were being used… Later in the study… they transitioned to using more complex models in search of increased performance.”
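
The quote's preference for decision trees is easy to demonstrate: the learned rules can be printed and read. A sketch on a stand-in dataset, not the study's:

```python
# Sketch: a decision tree's rules can be inspected directly, which is
# the understandability the quoted participant valued.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
# Prints which features the model uses and the thresholds it learned.
print(export_text(tree, feature_names=load_iris().feature_names))
```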

  31. Considerations for Machine Learning: • Model performance • Computational efficiency • Iteration efficiency • Ease of experimentation • Understandability • …. Need to make tradeoffs! New opportunities for HCI research!

  32. Interactive Machine Learning. Fails, J.A. and Olsen, D.R. IUI 2003.

  33. Crayons: IML for Pixel Classifiers

  34. What tradeoffs did Crayons make? Collect Data → Create Features → Select Model → Evaluate

  35. What tradeoffs did Crayons make? Collect Data → Create Features → Select Model → Evaluate

  36. “Classical” ML vs. Interactive ML

  37. What tradeoffs did Crayons make? Rapid iteration over model performance: • Fast training. Simplicity over flexibility: • Integrated environment • Automatic featuring • No model selection
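
A rough sketch of the interactive loop these tradeoffs buy, under stated assumptions: the user paints a few pixel labels, a fixed feature set is computed automatically, and a fast classifier retrains on every stroke. The raw-RGB features and decision tree below are placeholders, not necessarily what Crayons used.

```python
# Sketch of a Crayons-style loop: label a little, retrain fast, show the
# result, repeat. Features and classifier are stand-ins.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
image = rng.random((64, 64, 3))              # stand-in RGB image

def features(img):
    # Placeholder per-pixel features (raw RGB); Crayons computed a larger
    # fixed set automatically, so users never hand-engineered features.
    return img.reshape(-1, 3)

X = features(image)
labels = np.full(X.shape[0], -1)             # -1 = unlabeled

def paint(pixel_ids, cls):
    labels[pixel_ids] = cls                  # user "crayons" over some pixels

def retrain_and_classify():
    mask = labels >= 0
    clf = DecisionTreeClassifier().fit(X[mask], labels[mask])  # fast retrain
    return clf.predict(X).reshape(64, 64)    # classify every pixel for display

paint(rng.choice(2048, size=20, replace=False), 0)         # "background" strokes
paint(2048 + rng.choice(2048, size=20, replace=False), 1)  # "foreground" strokes
print(retrain_and_classify().mean())         # fraction classified as foreground
```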

  38. When are these tradeoffs appropriate? Rapid iteration and simplicity fit: • Novices • Large set of available features • Data can be efficiently viewed. Model performance and flexibility fit: • Experts • Custom features needed • Data types that can't be viewed and labeled at a glance • Labels obtained from external sources

  39. Flock: Hybrid Crowd-Machine Learning Classifiers. Cheng, J. and Bernstein, M.S. CSCW 2015.

  40. Collect Data → Create Features → Select Model → Evaluate. “At the end of the day, some machine learning projects succeed and some fail. What makes the difference? Easily the most important factor is the features used.” [Domingos, CACM 2012]

  41. How do people come up with features? Look for features used in related domains. Use intuition or domain knowledge. Apply automated techniques. Feature ideation – think of and experiment with custom features.

  42. A Brainstorming Exercise

  43. How do people come up with features? Look for features used in related domains. Use intuition or domain knowledge. Apply automated techniques. Feature ideation – think of and experiment with custom features. “The novelty of generated ideas increases as participants ideate, reaching a peak after their 18th instance.” [Krynicki, F. R., 2014]

  44. Workflow: User specifies a concept and uploads some unlabeled data. Crowd views data and suggests features.

  45. What makes a cat a cat?

  46. What makes a cat a cat?

  47. Workflow: User specifies a concept and uploads some unlabeled data. Crowd compares and contrasts positive and negative examples and suggests “why” they are different. Reasons become features. Reasons are clustered. User vets, edits, and adds to features. Crowd implements features by labeling data. Features are used to build classifiers.
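
The last two workflow steps can be sketched in a few lines: crowd-nominated features, answered per example as yes/no labels, feed a standard classifier. The feature names and answers below are invented placeholders, not Flock's data.

```python
# Sketch: classifier built on crowd-implemented binary features.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Crowd-nominated features ("what makes a cat a cat?") - invented examples.
feature_names = ["has pointy ears", "has whiskers", "barks"]

# Each row: the crowd's yes/no answers for one image; last column = label.
crowd_answers = np.array([
    [1, 1, 0, 1],   # cat
    [1, 1, 0, 1],   # cat
    [0, 0, 1, 0],   # not cat
    [1, 0, 1, 0],   # not cat
])
X, y = crowd_answers[:, :-1], crowd_answers[:, -1]
clf = LogisticRegression().fit(X, y)   # features by people, model by machine
print(dict(zip(feature_names, clf.coef_[0].round(2))))
```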

  48. Crayons: Collect Data → Create Features → Select Model → Evaluate. Flock: Collect Data → Create Features → Select Model → Evaluate

  49. Collect Data → Create Features → Select Model → Evaluate. Positives / Negatives: Standard Ranked List vs. Split Technique (Best/Worst Matches) [Fogarty et al., CHI 2007]
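
A sketch of the idea behind the best/worst split: rank unlabeled items by classifier confidence and surface both ends, so the user sees confident matches and likely mistakes together. The details are assumptions, not Fogarty et al.'s implementation.

```python
# Sketch: show the user both ends of the ranked list, not just the top.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)
clf = LogisticRegression(max_iter=1000).fit(X[:500], y[:500] == 2)  # "is it a 2?"

scores = clf.predict_proba(X[500:])[:, 1]    # confidence per unlabeled item
order = np.argsort(scores)
print("best matches:", order[-5:][::-1])     # most confident positives
print("worst matches:", order[:5])           # most confident negatives
```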

  50. Collect Data → Create Features → Select Model → Evaluate. Traditional Labeling vs. Structured Labeling: Grouping and tagging surfaces decision making. Moving, merging, and splitting groups helps with revising decisions. [Kulesza et al., CHI 2014]

  51. Collect Data → Create Features → Select Model → Evaluate. Summary Stats vs. ModelTracker [Amershi et al., CHI 2015]
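
The contrast on this slide in miniature: a summary statistic collapses everything to one number, while keeping per-example scores lets individual errors be inspected. The console listing below is a stand-in for ModelTracker's visualization, not the tool itself.

```python
# Sketch: per-example error inspection vs. a single summary statistic.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X_tr, X_te, y_tr, y_te = train_test_split(*load_digits(return_X_y=True),
                                          random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

print("summary stat:", clf.score(X_te, y_te))   # one number hides the errors

proba = clf.predict_proba(X_te)
pred, conf = proba.argmax(axis=1), proba.max(axis=1)
errors = np.flatnonzero(pred != y_te)
# Per-example view: which items are wrong, and how confidently wrong.
for i in errors[np.argsort(-conf[errors])][:5]:
    print(f"item {i}: predicted {pred[i]} ({conf[i]:.2f}), actual {y_te[i]}")
```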

  52. Collect Data → Create Features → Select Model → Evaluate. [Patel et al., IJCAI 2011]

  53. Collect Data → Create Features → Select Model → Evaluate. Rule-based explanation, keyword-based explanation, similarity-based explanation [Stumpf et al., IUI 2007]
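
One of the explanation styles named here, keyword-based explanation, can be sketched for a linear text classifier by showing each word's contribution to the predicted class. The toy corpus is invented; this is not Stumpf et al.'s system.

```python
# Sketch: keyword-based explanation from a Naive Bayes text classifier.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

docs = ["meeting agenda budget", "win free prize now",
        "budget review meeting", "free prize click now"]
y = [0, 1, 0, 1]                       # 0 = work, 1 = spam (toy labels)

vec = CountVectorizer()
clf = MultinomialNB().fit(vec.fit_transform(docs), y)

msg = vec.transform(["free budget prize"])
pred = clf.predict(msg)[0]
# Explanation: per-word log-probability contribution to the predicted class.
weights = clf.feature_log_prob_[pred] - clf.feature_log_prob_[1 - pred]
words = vec.get_feature_names_out()
for j in sorted(msg.nonzero()[1], key=lambda j: -weights[j]):
    print(f"{words[j]}: {weights[j]:+.2f}")
```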

  54. Collect Data → Create Features → Select Model → Evaluate

  55. Experts Practitioners Everyday People

  56. How are these scenarios different? Experts, Practitioners, Everyday People. User experience impacts what you can expose. Interaction focus impacts attention and feedback. Accuracy requirements impact expected time and effort. …..

  57. Principles for human-centered ML? Traditional User Interfaces: • Visibility and feedback • Consistency and standards • Predictability • Actionability • Error prevention and recovery • …. Intelligent/ML-Based Interfaces: • Safety • Trust • Manage expectations • Degrade gracefully under uncertainty • ….
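
"Degrade gracefully under uncertainty" made concrete: gate automated action on prediction confidence and fall back to asking the user. The threshold and fallback below are illustrative assumptions, not a principle the lecture prescribes.

```python
# Sketch: confidence gating so the interface defers instead of acting
# on a shaky prediction.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X_tr, X_te, y_tr, y_te = train_test_split(*load_digits(return_X_y=True),
                                          random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)

THRESHOLD = 0.9   # illustrative; would be tuned per application
for proba in clf.predict_proba(X_te[:5]):
    if proba.max() >= THRESHOLD:
        print("auto-act:", proba.argmax())               # confident: act
    else:
        print(f"ask the user (confidence {proba.max():.2f})")  # uncertain: defer
```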

  58. Traditional Machine Learning: Data → Clean → Apply Machine Learning Algorithms → Model

  59. Human-Centered Machine Learning: Collect Data → Create Features → Select Model → Evaluate

  60. Human-Centered Machine Learning: Machine Learning + Machine Teaching. samershi@microsoft.com
