
Crowdsourcing Erfurt Meeting@QoMEX 2017, 30 May 2017, 16.00-16.30



  1. Task Force “Crowdsourcing”, Erfurt Meeting@QoMEX 2017, 30 May 2017, 16.00-16.30
  Tobias Hossfeld, Babak Naderi

  2. Agenda: Crowdsourcing TF
  • 1. Overview on recent activities of the CS TF (Tobias, Babak, 10 min)
  • 2. Short research talks:
    – Uni Würzburg: Michael Seufert, Matthias Hirth (5 min)
    – Uni Konstanz: Dietmar Saupe, Vlad Hosu, Franz Hahn (5 min)
    – TU Berlin: Babak Naderi (5 min)
  • 3. Next steps: ITU-T experiments (Babak, 5 min)

  3. Joint Activities and Major Outcome
  • Collected in the Wiki: https://www3.informatik.uni-wuerzburg.de/qoewiki/qualinet:crowd:qomex2017meeting
  • CS ITU-T experiments on audio: current status
  • ITU standardization on crowdsourcing: P.CROWD, Sebastian Möller (TU Berlin)
  • Started at last QoMEX: one ITU-T CS Recommendation that contains common guidance across subjective assessment testing of different media in crowdsourcing (within P.CROWD)
  • ITU-T P.912 with an appendix focused on crowdsourcing, based on the white paper “Best Practices and Recommendations for Crowdsourced QoE - Lessons Learned from the Qualinet Task Force ‘Crowdsourcing’”, 2014, https://hal.archives-ouvertes.fr/hal-01078761

  4. Joint Events
  • Summer school on "Crowdsourcing and IoT", Würzburg, Germany, 31 July - 4 August 2017, organized by Matthias Hirth and Tobias Hossfeld, http://iotcrowd.org/
  • PQS 2016: 5th ISCA/DEGA Workshop on Perceptual Quality of Systems, Berlin, 2016
  • PQS Special Session on Crowdsourcing: Judith Redi (TU Delft), Matthias Hirth (University of Würzburg), Tim Polzehl (TU Berlin)
  • PQS 2016 publications in the ISCA Archive
  • Crowdsourcing TF meeting, co-located with PQS 2016, TU Berlin

  5. Joint Publications: Book
  • Book: “Evaluation in the Crowd: Crowdsourcing and Human-Centred Experiments”. Editors: Daniel Archambault, Helen C. Purchase, Tobias Hoßfeld
    – “Understanding the Crowd: Ethical and Practical Matters in the Academic Use of Crowdsourcing”: Sheelagh Carpendale, Neha Gupta, Tobias Hoßfeld, David Martin, Babak Naderi, Judith Redi, Ernestasia Siahaan, Ina Wechsung
    – “Crowdsourcing for QoE Experiments”: Sebastian Egger, Judith Redi, Sebastian Möller, Tobias Hossfeld, Matthias Hirth, Christian Keimel, and Babak Naderi
    – “Crowdsourcing Versus the Laboratory: Towards Human-Centered Experiments Using the Crowd”: Ujwal Gadiraju, Sebastian Möller, Martin Nöllenburg, Dietmar Saupe, Sebastian Egger, Daniel W. Archambault, and Brian Fischer
    – “Crowdsourcing Technology to Support Academic Research”: Matthias Hirth, Jason Jacques, Peter Rodgers, Ognjen Scekic, and Michael Wybrow

  6. Joint Publications: PQS 2016 / PCS 2016
  • “Worker's Cognitive Abilities and Personality Traits as Predictors of Effective Task Performance on Crowdsourcing Tasks” by Vaggelis Mourelatos; Manolis Tzagarakis
  • “Reported Attention as a Promising Alternative to Gaze in IQA Tasks” by Vlad Hosu; Franz Hahn; Igor Zingman; Dietmar Saupe
  • “One Shot Crowdtesting: Approaching the Extremes of Crowdsourced Subjective Quality Testing” by Michael Seufert; Tobias Hoßfeld
  • “Size Does Matter: Comparing the Results of a Lab and a Crowdsourcing File Download QoE Study” by Andreas Sackl; Bruno Gardlo; Raimund Schatz
  • “Saliency-Driven Image Coding Improves Overall Perceived JPEG Quality” by Vlad Hosu, Franz Hahn, Oliver Wiedemann, Sung-Hwan Jung, Dietmar Saupe. 32nd Picture Coding Symposium (PCS 2016), Berlin, 2016

  7. Joint Publications: QoMEX 2017
  • “On Use of Crowdsourcing for H.264/AVC and H.265/HEVC Video Quality Evaluation” by Ondrej Zach; Michael Seufert; Matthias Hirth; Martin Slanina; Phuoc Tran-Gia
  • “Collecting Subjective Ratings in Enterprise Environments” by Kathrin Borchert; Matthias Hirth; Thomas Zinner; Anja Göritz
  • “Unsupervised QoE Field Study for Mobile YouTube Video Streaming with YoMoApp” by Michael Seufert; Nikolas Wehner; Florian Wamser; Pedro Casas; Alessandro D'Alconzo; Phuoc Tran-Gia
  • “The Konstanz Natural Video Database (KoNViD-1k)” by Vlad Hosu; Franz Hahn; Hui Men; Tamas Szirányi; Shujun Li; Dietmar Saupe
  • “Empirical Evaluation of No-Reference VQA Methods on a Natural Video Quality Database” by Hui Men; Hanhe Lin; Dietmar Saupe
  • “Scoring Voice Likability Using Pair-Comparison: Laboratory vs. Crowdsourcing Approach” by Rafael Zequeira Jiménez, Laura Fernández Gallardo and Sebastian Möller

  8. Recent and Future Activities
  • (Joint) projects and project proposals
    – National project proposal (under submission): “Analysis of influence factors and definition of subjective methods for evaluating the quality of speech services using crowdsourcing” (Sebastian Möller, Tobias Hoßfeld)
    – National DFG project (accepted): “Design and Evaluation of new mechanisms for crowdsourcing as emerging paradigm for the organization of work in the Internet” (Tobias Hoßfeld, Phuoc Tran-Gia, Ralf Steinmetz, Christoph Rensing), http://dfg-crowdsourcing.de
  • Crowdsourcing and IoT
  • Enterprise crowdsourcing
  • Future plans and steps
    – Adaptive crowdsourcing and automatic (multi-parameter) selection
    – CS ITU-T: lab vs. crowdsourcing experiment as input for standardization
    – ITU standardization on crowdsourcing (P.CROWD)

  9. Active TF!

  10. Agenda: Crowdsourcing TF
  • 1. Overview on recent activities of the CS TF (Tobias, Babak, 10 min)
  • 2. Short research talks:
    – Uni Würzburg: Michael Seufert, Matthias Hirth (5 min)
    – Uni Konstanz: Dietmar Saupe, Vlad Hosu, Franz Hahn (5 min)
    – TU Berlin: Babak Naderi (5 min)
  • 3. Next steps: ITU-T experiments (Babak, 5 min)

  11. YoMoApp (Michael Seufert)
  • Android mobile app based on Android WebView
  • The YouTube mobile website can be displayed, and videos can be streamed via the HTML5 video player
  • Thus, in the app, YouTube is fully functional including all features, plus immersive monitoring
  • JavaScript is used to monitor the playback via the HTML5 <video> element (player state/events, video playback time, buffer filling level, video quality level); a monitoring sketch follows below
  • Device characteristics, user interactions, network statistics, and subjective feedback can be obtained through the Android app (screen size, volume, location, cell ID, RAT, throughput, etc.)
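The slide names the monitored signals but not the mechanism. The following is a minimal TypeScript sketch of such <video>-element monitoring; the event list, the 1 s sampling interval, and names like monitor and VideoSample are our assumptions for illustration, not YoMoApp's actual code.

```typescript
// Hedged sketch: observing an HTML5 <video> element, roughly as the slide
// describes (player state/events, playback time, buffer level). All names
// here (VideoSample, monitor) are illustrative, not YoMoApp's API.

interface VideoSample {
  event: string;         // player event that triggered the sample
  currentTime: number;   // video playback time in seconds
  bufferedAhead: number; // seconds of media buffered past the playhead
  timestamp: number;     // wall-clock time of the sample
}

function bufferedAhead(video: HTMLVideoElement): number {
  // Find the buffered range containing the playhead and measure how far it extends.
  for (let i = 0; i < video.buffered.length; i++) {
    if (video.buffered.start(i) <= video.currentTime && video.currentTime <= video.buffered.end(i)) {
      return video.buffered.end(i) - video.currentTime;
    }
  }
  return 0;
}

function monitor(video: HTMLVideoElement, report: (s: VideoSample) => void): void {
  const events = ["play", "playing", "pause", "waiting", "stalled", "seeking", "ended"];
  for (const ev of events) {
    video.addEventListener(ev, () => report({
      event: ev,
      currentTime: video.currentTime,
      bufferedAhead: bufferedAhead(video),
      timestamp: Date.now(),
    }));
  }
  // Periodic samples capture buffer drain between player events.
  setInterval(() => report({
    event: "sample",
    currentTime: video.currentTime,
    bufferedAhead: bufferedAhead(video),
    timestamp: Date.now(),
  }), 1000);
}

// Usage: attach to the first <video> element on the page and log samples.
const video = document.querySelector("video");
if (video) {
  monitor(video, (s) => console.log(JSON.stringify(s)));
}
```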

  12. YoMoApp Portal: http://yomoapp.de/dashboard

  13. YoMoApp Portal: http://yomoapp.de/dashboard

  14. YoMoApp Portal
  • YoMoApp portal: http://yomoapp.de/dashboard
  • Sign in with a Google account
  • Add the device ID (can be obtained in the YoMoApp statistics view)
  • Multiple devices can be added for one account
  • Select a device and browse statistics of all YoMoApp sessions
  • Download log files of single/multiple sessions (a parsing sketch follows below)
    – Playout information log (playtime, buffered playtime, ...)
    – Event log (device information, network information, ...)
    – Statistics log (streaming overview, user rating)
  • More details on log files: http://www.comnet.informatik.uni-wuerzburg.de/research/cloud_applications_and_networks/internet_applications/yomoapp/
  • Use YoMoApp for your crowdsourced QoE study!
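The slides do not specify the log file layout; the TypeScript sketch below assumes a hypothetical two-column playout log (wall-clock timestamp, accumulated playtime) to show how total stalling could be estimated from such a download. The CSV layout and column names are illustrative only; the linked page documents the real format.

```typescript
// Hedged sketch: estimating total stalling time from a YoMoApp-style playout
// log. The CSV layout (timestamp in ms, playtime in s) is a hypothetical
// format for illustration; consult the YoMoApp log documentation for the
// real columns. Ignores intentional pauses and seeks for simplicity.

interface PlayoutRow {
  timestampMs: number; // wall-clock time of the sample
  playtimeS: number;   // accumulated video playback time
}

function parsePlayoutLog(csv: string): PlayoutRow[] {
  return csv.trim().split("\n").slice(1) // skip the header row
    .map((line) => {
      const [ts, pt] = line.split(",");
      return { timestampMs: Number(ts), playtimeS: Number(pt) };
    });
}

// Total stalling = wall-clock time elapsed minus playback time gained.
function totalStallingSeconds(rows: PlayoutRow[]): number {
  if (rows.length < 2) return 0;
  const first = rows[0];
  const last = rows[rows.length - 1];
  const wallclock = (last.timestampMs - first.timestampMs) / 1000;
  const played = last.playtimeS - first.playtimeS;
  return Math.max(0, wallclock - played);
}

// Usage with a tiny inline example log: 10 s of wall-clock time, 8 s played.
const log = "timestampMs,playtimeS\n0,0\n5000,3\n10000,8";
console.log(totalStallingSeconds(parsePlayoutLog(log))); // 2 (seconds stalled)
```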

  15. QualiNet: Crowdsourcing TF
  Matthias Hirth, Chair of Communication Networks, University of Würzburg
  Institute of Computer Science, Chair of Communication Networks, Prof. Dr.-Ing. P. Tran-Gia

  16. Joint Research Project Würzburg – Duisburg/Essen
  • National DFG project: “Design and Evaluation of new mechanisms for crowdsourcing as emerging paradigm for the organization of work in the Internet”
  • Research objectives
    – Enterprise crowdsourcing: processing of sensitive data; trade-offs between internal and external crowdsourcing; integration into day-to-day business
    – Mobile crowdsourcing: trade-offs between data quality and costs; combination of crowd-based and fixed sensors; task routing in mobile settings
    – Crowdsourced QoE: (cost-)optimal selection of test stimuli; dynamic adaptation of the test setup

  17. Advertisement
  • Focus: combining objective measurements from IoT and subjective ratings from crowdsourcing users
  • When: 31 July 2017 - 4 August 2017
  • Where: Würzburg, Bavaria, Germany
  • More information available at http://iotcrowd.com

  18. 2016
  • Scalability
    – IQA/VQA
    – where to filter
    – different dimensions of quality
    – qualification/price trade-off
    – verification of algorithms
    – thousands of stimuli
  • Bias removal
    – grounding MOS
    – objective anchors
  • Improving accuracy
    – ACR vs. PC
    – crowd as a predictor
    – mathematical models

  19. ACR vs. PC (VQA)
  • ACR: 8 videos, 400 ratings
  • PC: 8 videos, 400 ratings
  • Judgment cost: 0.75 ACR = 1 PC
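The cost line is terse. Read as "one pair-comparison judgment costs as much as 0.75 ACR judgments" (our interpretation, not stated on the slide), the equal rating budgets compare as follows:

```latex
% Assumed reading of the slide's cost line: one pair-comparison (PC)
% judgment costs as much as 0.75 ACR judgments, so equal rating counts imply
\[
  c_{\mathrm{PC}} = 0.75\,c_{\mathrm{ACR}}
  \quad\Longrightarrow\quad
  400\,c_{\mathrm{PC}} = 300\,c_{\mathrm{ACR}} .
\]
% For 8 videos, a full PC design has $\binom{8}{2} = 28$ distinct pairs,
% so 400 PC ratings amount to roughly 14 judgments per pair.
```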

  20. Perceptually-Guided Image Coding
  • Experiment A: subjective scores task
    – Standard JPEG, PC (10 bitrates)
    – 30 users per pair
  • Experiment B: Δbitrate
    – Standard vs. saliency-driven JPEG
    – 3 parameters
    – 50 users per pair
  • Apparent average bitrate improvement: 10% for the best parameter settings
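The slide gives only the headline number. Under the usual definition of relative bitrate saving at matched perceived quality (our formulation; the symbols are not taken from the slide), the 10% figure corresponds to:

```latex
% Relative bitrate saving at equal perceived quality; b_std and b_sal denote
% the bitrates of standard and saliency-driven JPEG (our notation).
\[
  \Delta b \;=\; \frac{b_{\mathrm{std}} - b_{\mathrm{sal}}}{b_{\mathrm{std}}}
  \;\approx\; 10\,\% .
\]
```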

  21. Konstanz Natural Video Database
  • Conventional databases
    – small number of source sequences
    – little content diversity
    – artificial distortions
    – infeasible for blind VQA algorithms
  • KoNViD
    – YFCC100m baseline (~800,000 videos)
    – filtering methodology
    – ensure naturalness of videos
    – maximize diversity across many quality dimensions (see the sampling sketch below)
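The slide names the goal but not the algorithm. As one generic way to maximize diversity over per-video attribute vectors, the TypeScript sketch below uses greedy farthest-point sampling; this illustrates the idea only and is not KoNViD-1k's actual filtering methodology, and the attribute values are made up.

```typescript
// Hedged sketch: greedy farthest-point sampling over per-video attribute
// vectors (e.g. blur, colorfulness, contrast), one generic way to "maximize
// diversity across many quality dimensions". Illustration only, not the
// KoNViD-1k procedure.

type Attributes = number[]; // one normalized value per quality dimension

function dist(a: Attributes, b: Attributes): number {
  return Math.sqrt(a.reduce((s, ai, i) => s + (ai - b[i]) ** 2, 0));
}

// Pick k items so each new pick is as far as possible from those already chosen.
function farthestPointSample(items: Attributes[], k: number): number[] {
  const chosen = [0]; // seed with the first item
  while (chosen.length < k && chosen.length < items.length) {
    let bestIdx = -1;
    let bestDist = -Infinity;
    for (let i = 0; i < items.length; i++) {
      if (chosen.includes(i)) continue;
      // Distance to the nearest already-chosen item.
      const d = Math.min(...chosen.map((c) => dist(items[i], items[c])));
      if (d > bestDist) { bestDist = d; bestIdx = i; }
    }
    chosen.push(bestIdx);
  }
  return chosen;
}

// Usage: sample 3 diverse videos from 5 candidates described by two attributes.
const videos: Attributes[] = [[0, 0], [0.1, 0.1], [1, 1], [0.5, 0.9], [0.9, 0]];
console.log(farthestPointSample(videos, 3)); // [0, 2, 4]
```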
