FXPAL at TRECvid 2007: Collaborative Exploratory Search


  1. FXPAL at TRECvid 2007

  2. Collaborative Exploratory Search

  3. “Collaborative” search is overloaded
     • Synchronous: real-time awareness; collaborative and continual update; Exploratory Search (FXPAL); context systems (e.g. Nokia, Imity)
     • Asynchronous: Web 2.0 (Chi et al., Wisdom of Crowds, “Search Trails”); Collaborative Filtering (Xerox PARC); Personalization
     • Explicit vs. Implicit

  4. “Collaborative” search is overloaded: synchronous, explicit collaboration
     • Collaborative Exploratory Search (FXPAL): algorithmically mediated
     • Intelligent interfaces only:
       – Fischlar-DiamondTouch: Collaborative Video Searching on a Table (Smeaton et al., 2005)
       – Interfaces for Collaborative Exploratory Web Search: Motivations and Directions for Multi-User Designs (M. Morris, 2007)

  5. Collaborative Exploratory Search
     • Synchronous – collaborating users use the system at the same time
     • Explicitly shared goals – collaborating users share the information need
     • Algorithmically mediated – the system combines users’ inputs (terms, ranked lists, etc.) in various ways
       – Not just keyword pooling: the system generates results based on users’ roles

  6. Architecture diagram: User 1 and User 2 connected through an input coordinator and an output coordinator, with the algorithmic collaboration logic unit between them.

  7. (no slide text)

  8. (no slide text)

  9. System overview: MediaMagic client, Shared Display, and RSVP client connected through the input coordinator, output coordinator, and algorithmic collaboration module.
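
To make the data flow in the overview concrete, here is a minimal Python sketch of the mediation loop, assuming a simple dict-based state; the class and method names are illustrative assumptions, not FXPAL's actual implementation.

```python
# Minimal sketch (not FXPAL's code) of the mediated architecture: user actions
# arrive via an input coordinator, the collaboration module combines them, and
# an output coordinator decides what the RSVP client sees next.
from collections import defaultdict

class AlgorithmicCollaborationModule:
    def __init__(self):
        self.judgments = defaultdict(dict)   # query -> {shot_id: is_relevant}
        self.retrieved = {}                  # query -> ranked list of shot ids

    def on_query(self, query, ranked_shots):
        """Input coordinator: a new query was issued."""
        self.retrieved[query] = ranked_shots

    def on_judgment(self, query, shot_id, is_relevant):
        """Input coordinator: either user judged a shot."""
        self.judgments[query][shot_id] = is_relevant

    def rsvp_queue(self):
        """Output coordinator: unseen shots, highest fused score first."""
        scores = defaultdict(float)
        for query, ranked in self.retrieved.items():
            seen = self.judgments[query]
            for rank, shot in enumerate(ranked):
                if shot not in seen:
                    scores[shot] += len(ranked) - rank   # Borda-style contribution
        return sorted(scores, key=scores.get, reverse=True)

if __name__ == "__main__":
    module = AlgorithmicCollaborationModule()
    module.on_query("boats on water", ["shot_12", "shot_7", "shot_3"])
    module.on_judgment("boats on water", "shot_12", True)
    print(module.rsvp_queue())   # ['shot_7', 'shot_3'] -- shot_12 already judged
```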

  10. (no slide text)

  11. (no slide text)

  12. (no slide text)

  13. (no slide text)

  14. Roles: Prospector and Miner

  15. RSVP Queue Priority: weighted Borda count fusion
      score_doc = Σ_q score_doc,q · w_seen,q · w_rel,q
      where score_doc,q = N_retrieved,q − rank_doc,q
      Freshness: w_seen,q = N_seen,q / N_unseen,q
      Relevance: w_rel,q = N_rel,q / N_nonrel,q
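
A small Python sketch of this fusion, assuming dict-based inputs; the function names and data layout are illustrative assumptions made for this transcript, not FXPAL's code.

```python
# Weighted Borda count fusion for the RSVP queue priority (slide 15).

def query_weights(stats):
    """stats[q] holds the counts N_seen, N_unseen, N_rel, N_nonrel for query q."""
    w_seen = {q: s["N_seen"] / s["N_unseen"] for q, s in stats.items()}   # freshness
    w_rel = {q: s["N_rel"] / s["N_nonrel"] for q, s in stats.items()}     # relevance
    return w_seen, w_rel

def rsvp_priority(doc, retrieved, w_seen, w_rel):
    """score_doc = sum over q of (N_retrieved,q - rank_doc,q) * w_seen,q * w_rel,q."""
    score = 0.0
    for q, ranked in retrieved.items():
        if doc in ranked:
            borda = len(ranked) - ranked.index(doc)   # N_retrieved,q - rank_doc,q
            score += borda * w_seen[q] * w_rel[q]
    return score

# Example with two queries and made-up counts:
stats = {"q1": {"N_seen": 20, "N_unseen": 80, "N_rel": 5, "N_nonrel": 15},
         "q2": {"N_seen": 5, "N_unseen": 95, "N_rel": 4, "N_nonrel": 1}}
retrieved = {"q1": ["d1", "d2", "d3"], "q2": ["d2", "d3", "d1"]}
w_seen, w_rel = query_weights(stats)
print(rsvp_priority("d2", retrieved, w_seen, w_rel))
```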

  16. Shared Display: suggested query terms via weighted frequency fusion
      score_term = Σ_q TF_term,retrieved(q) · w_seen,q · w_rel,q
      Freshness: w_seen,q = N_seen,q / N_unseen,q
      Relevance: w_rel,q = N_rel,q / N_nonrel,q
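
A companion sketch for term suggestion, reusing the same per-query weights as the RSVP priority; the tf data structure is an assumption for illustration only.

```python
# Weighted frequency fusion for suggested query terms (slide 16).
from collections import defaultdict

def term_scores(tf, w_seen, w_rel):
    """score_term = sum over q of TF_term,retrieved(q) * w_seen,q * w_rel,q.
    tf[q][term] is the term's frequency in the shots retrieved for query q."""
    scores = defaultdict(float)
    for q, frequencies in tf.items():
        for term, freq in frequencies.items():
            scores[term] += freq * w_seen[q] * w_rel[q]
    return dict(scores)

def suggested_terms(tf, w_seen, w_rel, k=5):
    """Top-k terms to show on the shared display."""
    scores = term_scores(tf, w_seen, w_rel)
    return sorted(scores, key=scores.get, reverse=True)[:k]
```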

  17. Example

  18. TRECvid Experiments
      • 3½ systems, 4 users:
        a. MMA: single MediaMagic user (full capabilities)
        b. MMV: single MediaMagic user (no text)
        c. MMA+V: post hoc simulated combination of MMA and MMV, with duplicates (both rel and nonrel) removed
        d. COLL: collaborative search
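
A sketch of how the post hoc MMA+V combination could be assembled; the (shot_id, is_relevant) tuple format is an assumption for illustration, not the actual run data format.

```python
# Pool the shots marked in the MMA and MMV runs for a topic, dropping
# duplicates whether the duplicate was judged relevant or not (slide 18c).

def combine_runs(mma_shots, mmv_shots):
    combined, seen = [], set()
    for shot_id, is_relevant in mma_shots + mmv_shots:
        if shot_id not in seen:          # duplicates (both rel and nonrel) removed
            seen.add(shot_id)
            combined.append((shot_id, is_relevant))
    return combined

# e.g. combine_runs([("s1", True), ("s2", False)], [("s2", True), ("s3", True)])
# -> [("s1", True), ("s2", False), ("s3", True)]
```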

  19. TRECvid Experiments
      • Problem: learning effect?
        – All COLL runs were done first
        – All MMA runs were done second
        – All MMV runs were done third

  20. Results: Mean Average Precision. Bar chart of MAP (0.00–0.40) for the runs FXPAL_CO15, FXPAL_CO, FXPAL_MMA, FXPAL_CO11, FXPAL_MMV, and FXPAL_CO07, comparing collaborative search, the single user with text, the single user with video only, and collaborative search limited to 7 minutes.

  21. Additional Metrics
      • Examine recall and precision separately
      • Examine the manually selected shot set: what actually happened during the run?

  22. Precision = TP / (TP + FP)
      • COLL is:
        – a 1.47% relative improvement over MMA
        – a −3.42% relative improvement over MMV
        – a 15.4% relative improvement over MMA+V

  23. Recall = TP / #totalrel
      • COLL is:
        – a 101.1% relative improvement over MMA
        – a 43.3% relative improvement over MMV
        – a −10.7% relative improvement over MMA+V
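
A minimal sketch of the metrics behind slides 22 and 23; only the formulas come from the slides, and the example counts are placeholders.

```python
# Precision, recall, and relative improvement as used on slides 22-23.

def precision(tp, fp):
    """TP / (TP + FP)"""
    return tp / (tp + fp)

def recall(tp, total_rel):
    """TP / #totalrel"""
    return tp / total_rel

def relative_improvement(new, baseline):
    """Percentage change of one run's score over another's, e.g. COLL vs. MMA."""
    return 100.0 * (new - baseline) / baseline

# e.g. relative_improvement(recall(40, 100), recall(20, 100)) -> 100.0 (% improvement)
```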

  24. • COLL outperforms MMA and MMV
      • COLL is about the same as MMA+V
        – What does this suggest? Why bother working collaboratively?
        – Let’s examine more closely

  25. Chart: % improvement in precision on each of the 18 topics, COLL over MMA and COLL over MMV (y-axis −40% to 120%).

  26. Chart: % improvement in recall on each of the 18 topics, COLL over MMA and COLL over MMV (y-axis 0% to 450%).

  27. Chart: % improvement of COLL over MMA+V on each of the 18 topics, precision and recall, with kernel density smoothing (y-axis −60% to 100%).

  28. Tentative conclusion: collaborative search (at least in our current implementation) offers its best improvements when there are fewer relevant documents to be found.

  29. Normalizing by Shots Viewed
      • Our RSVP system needed another design iteration (a missed opportunity)
      • Average number of shots viewed:
        – MMA: 2,123
        – MMV: 2,601
        – MMA+V: 4,184
        – COLL: 2,614
      • Work smarter, not harder?

  30. Precision, with counts normalized by the number of seen shots, does not change:
      (TP / #seen) / ((TP + FP) / #seen) = TP / (TP + FP)

  31. Recall, normalized by seen shots: (TP / #seen) / #totalrel
      • COLL is:
        – a 73.9% relative improvement over MMA (unnormalized: 101.1%)
        – a 38.5% relative improvement over MMV (43.3%)
        – a 44.1% relative improvement over MMA+V (−10.7%)
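
A short sketch of the seen-shot normalization from slides 30–31: dividing every count by the number of shots a run actually viewed leaves precision unchanged but turns recall into a per-shot-viewed figure. The example counts below are placeholders, not the actual run data.

```python
# Recall normalized by the number of shots viewed (slide 31).

def normalized_recall(tp, shots_seen, total_rel):
    """(TP / #seen) / #totalrel"""
    return (tp / shots_seen) / total_rel

# Two hypothetical runs finding the same 40 of 100 relevant shots, one after
# viewing 2,000 shots and the other after viewing 4,000:
print(normalized_recall(40, 2000, 100))   # 0.0002
print(normalized_recall(40, 4000, 100))   # 0.0001 -- half the normalized recall
```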

  32. Chart: % improvement in normalized recall on each of the 18 topics, COLL over MMA and COLL over MMV (y-axis −100% to 600%).

  33. Chart: % improvement in normalized recall, COLL over MMA+V, with kernel density smoothing, on each of the 18 topics (y-axis −50% to 200%).

  34. Future Work
      • We still like the idea of miner vs. prospector
        – But the miner needs more ability to “steer”
        – And higher throughput
      • Investigate other collaboration roles
      • Investigate which types of queries different roles work better for. Can we know this a priori?
