
Negative Dependence, Stable Polynomials etc. in ML, Part 2 (PowerPoint presentation)



  1. Negative Dependence, Stable Polynomials etc. in ML, Part 2
     SUVRIT SRA & STEFANIE JEGELKA
     Laboratory for Information and Decision Systems, Massachusetts Institute of Technology
     Neural Information Processing Systems, 2018
     ml.mit.edu

  2. Outline
     - Introduction, prominent example: Determinantal Point Processes
     - Stronger notions of negative dependence: intro & implications; sampling
     - Theory: approximating partition functions; learning a DPP (and some variants)
     - Applications (theory & applications): recommender systems, Nyström method, optimal design, regression, neural net pruning, negative mining, anomaly detection, etc.
     - Perspectives and wrap-up

  3. Theory
     - Partition functions
     - Learning DPPs

  4. Computing Partition Functions
     Aim: estimate Z_µ, i.e., the normalization constant / partition function in
       Pr(S) = µ(S) / Z_µ.
     Typically intractable, and often even hard to approximate (exponential number of terms to sum over, or evaluation of high-dimensional integrals / volumes), but…
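To make the "exponential number of terms" concrete, here is a minimal sketch (not from the slides): for a generic measure µ the only general-purpose route is to sum µ(S) over all 2^n subsets, whereas for a DPP with kernel L the sum of det(L_S) over all subsets collapses to the closed form det(L + I). The helper names (brute_force_partition, mu_dpp) and the toy kernel are illustrative.

```python
# Sketch, assuming an unnormalized DPP mu(S) = det(L_S); names and kernel are illustrative.
import itertools
import numpy as np

def brute_force_partition(mu, n):
    """Z_mu = sum of mu(S) over all 2^n subsets S of {0, ..., n-1}; exponential in n."""
    return sum(mu(S)
               for r in range(n + 1)
               for S in itertools.combinations(range(n), r))

# Toy PSD kernel L on a 5-element ground set.
rng = np.random.default_rng(0)
B = rng.standard_normal((5, 3))
L = B @ B.T + 0.1 * np.eye(5)

def mu_dpp(S):
    if not S:
        return 1.0                                  # determinant of the empty submatrix is 1
    idx = list(S)
    return float(np.linalg.det(L[np.ix_(idx, idx)]))

Z_brute = brute_force_partition(mu_dpp, 5)          # 2^5 = 32 determinant evaluations
Z_closed = float(np.linalg.det(L + np.eye(5)))      # DPP partition function: det(L + I)
print(Z_brute, Z_closed)                            # agree up to floating-point error
```

The determinantal case is one example where the partition function stays tractable despite the exponentially large sum; for a general µ one has to fall back on approximation.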
