
Distributing Matrix Computations with Spark MLlib - Reza Zadeh (presentation transcript)



  1. Distributing Matrix Computations with Spark MLlib Reza Zadeh

  2. A General Platform: standard libraries included with Spark. Spark MLlib (machine learning), Spark SQL (structured data), GraphX (graph), Spark Streaming (real-time), all built on Spark Core.

  3. Outline: Introduction to MLlib; Example Invocations; Benefits of Iterations: Optimization; Singular Value Decomposition; All-pairs Similarity Computation; MLlib + {Streaming, GraphX, SQL}

  4. Introduction

  5. MLlib History MLlib is a Spark subproject providing machine learning primitives Initial contribution from AMPLab, UC Berkeley Shipped with Spark since Sept 2013

  6. MLlib: Available algorithms. classification: logistic regression, linear SVM, naïve Bayes, least squares, classification tree; regression: generalized linear models (GLMs), regression tree; collaborative filtering: alternating least squares (ALS), non-negative matrix factorization (NMF); clustering: k-means||; decomposition: SVD, PCA; optimization: stochastic gradient descent, L-BFGS

  7. Example Invocations

  8. Example: K-means
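The slide's code is not preserved in this transcript. As a stand-in, here is a single-machine NumPy sketch of the computation MLlib's KMeans distributes (plain Lloyd's algorithm: per-point assignment maps, per-cluster mean reduces). The data, k, and iteration count are illustrative.

```python
import numpy as np

def kmeans(points, k, iterations=20, seed=0):
    """Plain Lloyd's algorithm -- the computation MLlib's KMeans
    distributes (per-point assignment maps, per-cluster mean reduces)."""
    rng = np.random.default_rng(seed)
    # Initialize centers by sampling k distinct points.
    centers = points[rng.choice(len(points), size=k, replace=False)]
    for _ in range(iterations):
        # Assign each point to its nearest center.
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute each center as the mean of its assigned points.
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

# Two well-separated blobs; k-means should recover both centers.
pts = np.vstack([np.zeros((50, 2)), np.full((50, 2), 10.0)])
centers, labels = kmeans(pts, k=2)
```

In MLlib itself the equivalent call is roughly KMeans.train(rdd, k) from pyspark.mllib.clustering, operating on an RDD of vectors.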

  9. Example: PCA
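The PCA example's code is likewise missing from the transcript. As a hedged local sketch of what MLlib computes for a RowMatrix (computePrincipalComponents): center the data, take the SVD, and keep the top right singular vectors. The data shape and variances are illustrative.

```python
import numpy as np

def pca(data, k):
    """PCA via SVD of the mean-centered data matrix."""
    centered = data - data.mean(axis=0)
    # Right singular vectors of the centered matrix are the principal axes.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:k].T                    # one principal component per column

# Data varying almost entirely along the first axis.
rng = np.random.default_rng(0)
data = np.column_stack([rng.normal(0, 10.0, 200), rng.normal(0, 0.1, 200)])
components = pca(data, k=1)
```

The top component should point (up to sign) along the high-variance axis.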

  10. Example: ALS

  11. Benefits of fast iterations

  12. Optimization: At least two large classes of optimization problems humans can solve: convex programs, and spectral problems (SVD)

  13. Optimization - LR

      data = spark.textFile(...).map(readPoint).cache()

      w = numpy.random.rand(D)

      for i in range(iterations):
          gradient = data.map(lambda p:
              (1 / (1 + exp(-p.y * w.dot(p.x)))) * p.y * p.x
          ).reduce(lambda a, b: a + b)
          w -= gradient

      print "Final w: %s" % w

  14. Spark PageRank: Using cache(), keep neighbor lists in RAM; using partitioning (partitionBy), avoid repeated hashing. Each iteration joins Neighbors (id, edges) with Ranks (id, rank).
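The slide's diagram can be mirrored in a tiny local sketch: the neighbor lists are the static dataset (what cache()/partitionBy keep in place in Spark), and ranks are recomputed by joining contributions each iteration. The graph, damping factor 0.85, and iteration count are illustrative.

```python
# Toy PageRank following the slide's pattern. Each iteration "joins"
# the fixed neighbor lists with the current ranks, sums contributions,
# and applies the damping formula.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = {node: 1.0 for node in links}

for _ in range(30):
    contribs = {node: 0.0 for node in links}
    for node, neighbors in links.items():      # the join of links with ranks
        for nbr in neighbors:
            contribs[nbr] += ranks[node] / len(neighbors)
    ranks = {node: 0.15 + 0.85 * c for node, c in contribs.items()}
```

Total rank mass is conserved by this update, which makes it easy to sanity-check.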

  15. PageRank Results: time per iteration (s): Hadoop 171, Basic Spark 72, Spark + Controlled Partitioning 23

  16. Spark PageRank: Generalizes to Matrix Multiplication, opening many algorithms from Numerical Linear Algebra

  17. Deep Dive: Singular Value Decomposition

  18. Singular Value Decomposition: Two cases, tall-and-skinny vs. roughly square. The computeSVD function takes care of which one to call, so you don't have to.

  19. SVD selection

  20. Tall and Skinny SVD

  21. Tall and Skinny SVD: Gets us V and the singular values; gets us U by one matrix multiplication.
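The slide's formulas are not preserved in the transcript; here is a hedged NumPy sketch of the tall-and-skinny scheme it describes. When A is m x n with m much larger than n, the small n x n Gram matrix A^T A fits on one machine; its eigendecomposition gives V and the singular values, and one more (distributable) multiply recovers U. The matrix sizes are illustrative.

```python
import numpy as np

# Tall-and-skinny SVD via the Gram matrix.
rng = np.random.default_rng(0)
A = rng.normal(size=(1000, 5))                 # tall and skinny

gram = A.T @ A                                  # small: n x n
eigvals, V = np.linalg.eigh(gram)
order = eigvals.argsort()[::-1]                 # largest eigenvalue first
sigma = np.sqrt(eigvals[order])                 # singular values of A
V = V[:, order]                                 # right singular vectors
U = A @ V / sigma                               # the "one matrix multiplication"
```

The reconstruction U diag(sigma) V^T should recover A, and U should have orthonormal columns.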

  22. Square SVD via ARPACK: Very mature Fortran77 package for computing eigenvalue decompositions. JNI interface available via netlib-java. Distributed using Spark.

  23. Square SVD via ARPACK: Only needs matrix-vector multiplies to build Krylov subspaces. The result of each matrix-vector multiply is small, and the multiplication can be distributed.

  24. Deep Dive: All pairs Similarity

  25. Deep Dive: All pairs Similarity Compute via DIMSUM: “Dimension Independent Similarity Computation using MapReduce” Will be in Spark 1.2 as a method in RowMatrix

  26. All-pairs similarity computation

  27. Naïve Approach

  28. Naïve approach: analysis
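The analysis slide's content is not preserved. As context, the naïve approach is: normalize every column, then form all pairwise dot products, paying for every one of the n^2 pairs regardless of magnitudes; that cost is what DIMSUM's sampling attacks. A minimal local sketch, with illustrative data:

```python
import numpy as np

# Naive all-pairs column similarity: normalize each column to unit
# norm, then every pairwise dot product is a cosine similarity.
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 4))                  # 100 rows, 4 columns

normalized = A / np.linalg.norm(A, axis=0)     # unit-norm columns
sims = normalized.T @ normalized               # sims[i, j] = cos(col_i, col_j)
```

The result is symmetric with a unit diagonal, which makes it a useful reference for checking any sampled approximation.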

  29. DIMSUM Sampling
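The sampling slide's details are not preserved. Below is a deliberately simplified, hypothetical sketch of the DIMSUM idea, not the paper's exact emit probabilities: each row's contribution to pair (i, j) is kept with a probability that shrinks for high-magnitude columns and is rescaled to keep the estimate unbiased. The keep-probability form, gamma, and data are illustrative.

```python
import numpy as np

def dimsum_estimate(A, gamma, seed=0):
    """Simplified DIMSUM-style estimator (illustrative, not the paper's
    exact scheme): keep each row's (i, j) contribution with probability
    p_ij = min(1, gamma / (||c_i|| ||c_j||)), rescaled by 1/p_ij."""
    rng = np.random.default_rng(seed)
    n = A.shape[1]
    norms = np.linalg.norm(A, axis=0)
    p = np.minimum(1.0, gamma / np.outer(norms, norms))
    est = np.zeros((n, n))
    for row in A:
        contrib = np.outer(row, row) / np.outer(norms, norms)
        keep = rng.random((n, n)) < p          # sample which pairs to emit
        est += np.where(keep, contrib / p, 0.0)
    return est

rng = np.random.default_rng(1)
A = rng.normal(size=(200, 3))
normalized = A / np.linalg.norm(A, axis=0)
exact = normalized.T @ normalized              # true cosine similarities
```

For very large gamma every keep-probability is 1 and the estimator degenerates to the exact naive computation, which is a handy sanity check.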

  30. DIMSUM Analysis

  31. Spark implementation

  32. Ongoing Work in MLlib: stats library (e.g. stratified sampling, ScaRSR), ADMM, LDA, general convex optimization

  33. MLlib + {Streaming, GraphX, SQL}

  34. MLlib + Streaming As of Spark 1.1, you can train linear models in a streaming fashion Model weights are updated via SGD, thus amenable to streaming More work needed for decision trees
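A minimal local sketch of the streaming pattern this slide describes: model weights persist across arriving mini-batches, and each batch applies one SGD step, so no full pass over the data is needed. The batch size, learning rate, and true weights are illustrative, and labels are noiseless for clarity. (In Spark 1.1 the corresponding class is StreamingLinearRegressionWithSGD.)

```python
import numpy as np

# Streaming-style linear regression: one SGD step per arriving batch.
true_w = np.array([2.0, -1.0])
w = np.zeros(2)                        # model state carried across batches
lr = 0.1
rng = np.random.default_rng(0)

for _ in range(500):                   # 500 arriving mini-batches
    X = rng.normal(size=(10, 2))       # one mini-batch of 10 examples
    y = X @ true_w
    grad = X.T @ (X @ w - y) / len(X)  # least-squares gradient on the batch
    w -= lr * grad
```

Because each update touches only the current batch, the same loop works whether the batches come from a file or a live stream.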

  35. MLlib + SQL

      points = context.sql("select latitude, longitude from tweets")
      model = KMeans.train(points, 10)

  36. MLlib + GraphX

  37. Future of MLlib

  38. General Linear Algebra: CoordinateMatrix, RowMatrix, BlockMatrix (goal: version 1.2). Local and distributed versions, with operations in-between (goal: version 1.3).

  39. Research Goal: General Convex Optimization. Distribute CVX by backing CVXPY with PySpark. Easy-to-express distributable convex programs; need to know less math to optimize complicated objectives.

  40. Spark and ML Spark has all its roots in research, so we hope to keep incorporating new ideas!
