Deep Semantic Matching for Amazon Product Search
Yiwei Song, Amazon Product Search


  1. Deep Semantic Matching for Amazon Product Search. Yiwei Song, Amazon Product Search

  2. Deep Semantic Matching for Amazon Product Search

  3. Amazon Product Search • Amazon is the 4th most popular site in the US [1] • The majority of Amazon retail revenue is attributed to search • Nearly half of US internet users start their product search on Amazon [2] [1] https://www.alexa.com/topsites/countries/US [2] https://retail.emarketer.com/article/more-product-searches-start-on-amazon/5b92c0e0ebd40005bc4dc7ae

  4. Semantic Matching in Product Search • The goal of semantic matching is to reduce customers’ effort to shop: fewer query reformulations, and a bridged vocabulary gap between customers’ queries and product descriptions

  5. What is a match for a query? Query: “health shampoo”. Lexical match: Zion Health Adama Clay Minerals Shampoo, 16 Fluid Ounce, by Zion Health

  6. What is a match for a query? Query: “health shampoo”. Lexical match: Zion Health Adama Clay Minerals Shampoo, 16 Fluid Ounce, by Zion Health. Semantic match: ArtNaturals Organic Moroccan Argan Oil Shampoo and Conditioner Set - (2 x 16 Fl Oz / 473ml) - Sulfate Free - Volumizing & Moisturizing - Gentle on Curly & Color Treated Hair - Infused with Keratin, by ArtNaturals

  7. What is a match for a query? Query: “countertop wine fridge”. Lexical match: Antarctic Star 17 Bottle Wine Cooler/Cabinet Refrigerator Small Wine Cellar Beer Counter Top Fridge Quiet Operation Compressor Freestanding Black, by Antarctic Star

  8. What is a match for a query? Query: “countertop wine fridge”. Lexical match: Antarctic Star 17 Bottle Wine Cooler/Cabinet Refrigerator Small Wine Cellar Beer Counter Top Fridge Quiet Operation Compressor Freestanding Black, by Antarctic Star. Semantic match: DELLA 048-GM-48197 Beverage Center Cool Built-in Cooler Mini Refrigerator w/ Lock - Black/Stainless Steel, by DELLA

  9. Semantic Matching augments Lexical Matching [Architecture diagram: a query such as “iphone xr case” flows down two paths. Lexical matching retrieves products whose text contains the query terms. In parallel, a neural network maps the query to an embedding, and a KNN search over precomputed product embeddings (dense vectors, e.g. P1: 0.12598, 0.058533, -0.09845, …) retrieves semantic matches. The two candidate sets are merged and passed to ranking.]
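A minimal sketch of the retrieval-side merge described above, under stated assumptions: the function names are hypothetical, embeddings are L2-normalized, and brute-force cosine KNN in numpy stands in for whatever production ANN index is actually used.

```python
import numpy as np

def knn_semantic_matches(query_emb, product_embs, product_ids, k=100):
    """Brute-force cosine KNN over precomputed product embeddings.
    Assumes query_emb and the rows of product_embs are L2-normalized,
    so a dot product equals cosine similarity."""
    scores = product_embs @ query_emb
    top = np.argsort(-scores)[:k]
    return [(product_ids[i], float(scores[i])) for i in top]

def merge_candidates(lexical, semantic):
    """Union the lexical and semantic match sets; ranking happens downstream."""
    seen, merged = set(), []
    for pid, score in lexical + semantic:
        if pid not in seen:
            seen.add(pid)
            merged.append((pid, score))
    return merged
```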

  10. Neural Network Representation Model [Diagram: query text and document text each pass through a neural network, producing a query embedding and a document embedding; a similarity function compares the two.]

  11. Data Query: “artistic iphone 6s case”. Signal: purchased products

  12. Data Query: “artistic iphone 6s case”. Signals: purchased; impressed but not purchased

  13. Data Query: “artistic iphone 6s case”. Signals: purchased; impressed but not purchased; random products

  14. Loss Function Target similarity between Query_Embed and Product_Embed for “artistic iphone 6s case”: purchased → high; impressed but not purchased → medium; random → low

  15. Loss Function Let $\hat{y}$ be the predicted query–product similarity. • For purchases: $\mathrm{loss}(\hat{y}) = \begin{cases} 0, & \hat{y} \ge 0.9 \\ (\hat{y} - 0.9)^2, & \hat{y} < 0.9 \end{cases}$ • For impressed but not purchased: $\mathrm{loss}(\hat{y}) = \begin{cases} 0, & \hat{y} \le 0.55 \\ (\hat{y} - 0.55)^2, & \hat{y} > 0.55 \end{cases}$ • For randomly sampled: $\mathrm{loss}(\hat{y}) = \begin{cases} 0, & \hat{y} \le 0.2 \\ (\hat{y} - 0.2)^2, & \hat{y} > 0.2 \end{cases}$
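A small numpy sketch of this piecewise squared hinge loss, with the thresholds taken from the slide; `y_hat` is assumed to be the model's predicted cosine similarity.

```python
import numpy as np

# Target-similarity thresholds from the slide.
PURCHASED_MIN = 0.9   # purchases should score at least 0.9
IMPRESSED_MAX = 0.55  # impressed-but-not-purchased at most 0.55
RANDOM_MAX = 0.2      # random negatives at most 0.2

def pointwise_hinge_loss(y_hat, label):
    """Squared hinge loss: zero inside the allowed region,
    quadratic penalty once the similarity crosses the threshold."""
    if label == "purchased":
        return np.maximum(0.0, PURCHASED_MIN - y_hat) ** 2
    if label == "impressed":
        return np.maximum(0.0, y_hat - IMPRESSED_MAX) ** 2
    if label == "random":
        return np.maximum(0.0, y_hat - RANDOM_MAX) ** 2
    raise ValueError(f"unknown label: {label}")
```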

  16. Loss Function

  17. N-gram Average Neural Network [Architecture diagram: the query, and the product title plus product attributes, each go through an N-gram parser; the resulting query n-grams and title n-grams are looked up in a shared embedding layer, then averaged, normalized, and passed through an activation; a dense layer yields the query embedding and the product embedding, which are compared with cosine similarity.]
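A minimal numpy sketch of one such tower, under stated assumptions: `ngram_ids` comes from the parser shown on the next slides, the activation is tanh (the slide does not say which one is used), and both towers share `embedding` and `dense_w`.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB, DIM = 50_000, 256
embedding = rng.normal(scale=0.1, size=(VOCAB, DIM))  # shared by query and title towers
dense_w = rng.normal(scale=0.1, size=(DIM, DIM))      # shared dense layer

def encode(ngram_ids):
    """Average the n-gram embeddings, normalize, activate, then project."""
    avg = embedding[ngram_ids].mean(axis=0)
    avg = avg / (np.linalg.norm(avg) + 1e-8)
    return np.tanh(avg) @ dense_w

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

# score = cosine(encode(query_ngram_ids), encode(title_ngram_ids))
```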

  18. N-gram Average Neural Network Example: “artistic iphone 6s case” is parsed into word unigrams "artistic", "iphone", "6s", "case"; word bigrams "artistic#iphone", "iphone#6s", "6s#case"; word trigrams "artistic#iphone#6s", "iphone#6s#case"; and character trigrams "#ar", "art", "rti", …, "#ca", "cas", "ase", "se#"
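A sketch of an n-gram parser matching the examples above, assuming "#" both joins word n-grams and pads word boundaries for the character trigrams, as those examples suggest.

```python
def parse_ngrams(text):
    """Word unigrams/bigrams/trigrams plus character trigrams."""
    words = text.lower().split()
    grams = list(words)  # word unigrams
    for n in (2, 3):     # word bigrams and trigrams, joined with '#'
        grams += ["#".join(words[i:i + n]) for i in range(len(words) - n + 1)]
    for w in words:      # character trigrams over '#'-padded words
        padded = f"#{w}#"
        grams += [padded[i:i + 3] for i in range(len(padded) - 2)]
    return grams

print(parse_ngrams("artistic iphone 6s case"))
# ['artistic', 'iphone', '6s', 'case', 'artistic#iphone', ..., 'se#']
```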

  19. N-gram Average Neural Network Build the n-gram vocab by frequency; hash each out-of-vocab n-gram to a bin so that low-count tokens are grouped. [Diagram: each n-gram of “artistic iphone 6s case” is checked against the vocab; in-vocab n-grams such as "artistic" and "iphone#6s" index directly into the embedding matrix (vocab size × embedding size), while out-of-vocab n-grams such as "artistic#iphone" and "artistic#iphone#6s" are routed through Hash() into an OOV bucket of fixed size.]
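A sketch of that lookup; the slides do not name the hash function, so a stable md5-based one is assumed here, and the vocab and bucket sizes are illustrative.

```python
import hashlib

VOCAB_SIZE = 50_000    # dedicated ids for the most frequent n-grams
OOV_BUCKETS = 10_000   # shared ids for everything else

def ngram_to_id(ngram, vocab):
    """Direct lookup for in-vocab n-grams; a stable hash into
    one of the OOV buckets for everything else."""
    if ngram in vocab:
        return vocab[ngram]  # ids 0 .. VOCAB_SIZE-1
    digest = hashlib.md5(ngram.encode()).hexdigest()
    return VOCAB_SIZE + int(digest, 16) % OOV_BUCKETS

vocab = {"artistic": 0, "iphone": 1, "6s": 2, "case": 3, "iphone#6s": 4}
print(ngram_to_id("iphone#6s", vocab))        # in-vocab id: 4
print(ngram_to_id("artistic#iphone", vocab))  # hashed OOV id >= VOCAB_SIZE
```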

  20. N-gram Average Neural Network • Word unigram baseline on small dataset [Plot: MAP (0.45–0.63) vs. training epoch (0–800)]

  21. N-gram Average Neural Network • Word unigram baseline on small dataset • Use more data [Plot: MAP vs. training epoch]

  22. N-gram Average Neural Network • Word unigram baseline on small dataset • Use more data • Add word bigrams [Plot: MAP vs. training epoch]

  23. N-gram Average Neural Network • Word unigram baseline on small dataset • Use more data • Add word bigrams • Add character trigrams [Plot: MAP vs. training epoch]

  24. N-gram Average Neural Network • Word unigram baseline on small dataset • Use more data • Add word bigrams • Add character trigrams • Add OOV hashing for n-grams [Plot: MAP vs. training epoch]

  25. N-gram Average Neural Network • Word unigram baseline on small dataset • Use more data • Add word bigrams • Add character trigrams • Add OOV hashing for n-grams • More tokens/parameters overfit on the small dataset [Plot: MAP vs. training epoch]

  26. Increase Vocab Size by Model Parallelism Performance increases with more parameters in the model: • 180 MM parameters • 500 MM parameters • 3000 MM parameters [Plot: MAP (0.45–0.63) vs. training epoch (0–1000), one curve per model size]
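The slides do not show how the parameters are split; one common reading of model parallelism for a very large embedding table is to shard its rows across devices, sketched below with toy sizes and plain arrays standing in for per-device memory.

```python
import numpy as np

NUM_SHARDS, DIM = 4, 64
SHARD_ROWS = 10_000  # toy size; production tables are orders of magnitude larger

# In a real system each shard lives on its own GPU/host;
# here they are just separate in-process arrays.
shards = [np.zeros((SHARD_ROWS, DIM), dtype=np.float32) for _ in range(NUM_SHARDS)]

def lookup(token_id):
    """Route a token id to the shard owning its embedding row."""
    shard = token_id % NUM_SHARDS   # round-robin row assignment
    row = token_id // NUM_SHARDS
    return shards[shard][row]
```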

  27. Structured Product Features [Diagram: the product title embedding is combined with structured product features (e.g. sales, review rating) through a dense layer to produce the final product embedding.]
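A sketch of that fusion step, assuming (the diagram does not say) that the structured features are simply concatenated with the title embedding before the dense layer; the tanh activation is likewise an assumption.

```python
import numpy as np

DIM, N_FEATS = 256, 2  # structured features: e.g. sales, review rating
rng = np.random.default_rng(0)
fuse_w = rng.normal(scale=0.1, size=(DIM + N_FEATS, DIM))

def product_embedding(title_emb, sales, review_rating):
    """Concatenate the title embedding with structured features,
    then project back to the embedding dimension."""
    x = np.concatenate([title_emb, [sales, review_rating]])
    return np.tanh(x @ fuse_w)

# emb = product_embedding(np.zeros(DIM), sales=1200.0, review_rating=4.5)
```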

  28. Still Day 1 [Recap of the architecture diagram from slide 9: lexical matches and KNN-retrieved semantic matches over neural query/product embeddings are merged and passed to ranking.]

  29. Still Day 1

  30. Thank you! Questions? Want to join us? https://www.amazon.jobs/en/teams/search.html
