CLEF and PROMISE
Nicola Ferro, Information Management Systems (IMS) Research Group, Department of Information Engineering, University of Padua, Italy


  1. CLEF and PROMISE. Nicola Ferro, Information Management Systems (IMS) Research Group, Department of Information Engineering, University of Padua, Italy

  2. Outline: CLEF “Classic”; The CLEF Initiative; From CLEF 2010 to CLEF 2012; Surrounding CLEF: PROMISE

  3. CLEF “Classic”

  4. 1997 – First CLIR system evaluation campaigns in the US and Japan: TREC and NTCIR. CLEF actually began life in 1997 as a track for Cross-Language Information Retrieval (CLIR) within TREC, with mainly English-centered tasks (EN -> X, X -> EN). 2000–2009 – CLIR evaluation in Europe: CLEF (extension of the CLIR track at TREC): fully multilingual, multimodal information retrieval systems capable of processing a query in any medium and any language, finding relevant information from a multilingual multimedia collection containing documents in any language, and presenting it in the style most likely to be useful to the user. Funding: DELOS NoE under FP5, 2000–2003 (http://delos-noe.isti.cnr.it/); DELOS NoE under FP6, 2004–2007 (http://www.delos.info/); TrebleCLEF under FP7, 2008–2009.

  5. Stimulation of research activity in new, previously unexplored areas; study and implementation of evaluation methodologies for diverse types of cross-language IR systems; creation of a large set of empirical data about multilingual information access from the user perspective; quantitative and qualitative evidence with respect to best practice in cross-language system development; creation of reusable test collections for system benchmarking; building of a strong, multidisciplinary research community.

  6. Changes in: users, who increasingly interact with content and other users; organizations, which need to manage multilingual (versioned) content and offer services/access to it. Growing dissatisfaction with currently available technology. Future evaluation campaigns must foster development of systems that better meet user needs. Multilingual issues must also be studied from a communicative perspective. The MLIA/CLIR user model must now be adapted to meet emerging trends.

  7. CLEF must offer a new evaluation cycle impacting on: Methodology definition: developing models and metrics to describe the needs and behavior of the new multicultural and multi-tasking users; System building: assessing system conformity wrt the newly identified user needs, tasks, and models; Results assessment: measuring all aspects of system and component performance, including response times, usability, and user satisfaction; Community building: involving other research domains (e.g. MT, information science, and user studies sectors) and application communities (e.g. enterprise search, legal, patent, educational, cultural heritage, and infotainment areas); Validation of technology: guaranteeing that the results obtained ...

  8. Multilingual and multimodal system testing, tuning and evaluation; investigation of the use of unstructured, semi-structured, highly-structured, and semantically enriched data in information access; creation of reusable test collections for benchmarking; exploration of new evaluation methodologies and innovative ways of using experimental data; discussion of results, comparison of approaches, exchange of ideas, and transfer of knowledge.

  9. The CLEF Initiative is structured in two main parts: a series of Evaluation Labs, i.e. laboratories to conduct evaluation of information access systems, and workshops to discuss and pilot innovative evaluation activities; and a peer-reviewed Conference on a broad range of issues, including investigation continuing the activities of the Evaluation Labs, experiments using multilingual and multimodal data (in particular, but not only, data resulting from CLEF activities), and research in evaluation methodologies and challenges.

  10. Organization

  11. Committee. Steering Committee Chair: Nicola Ferro, University of Padua, Italy. Deputy Steering Committee Chair for the Conference: Julio Gonzalo, National Distance Education University (UNED), Spain. Deputy Steering Committee Chair for the Evaluation Labs: Carol Peters, ISTI, National Council of Research (CNR), Italy. Members: Martin Braschler, Zurich University of Applied Sciences, Switzerland; Khalid Choukri, Evaluations and Language resources Distribution Agency (ELDA), France; Paul Clough, University of Sheffield, United Kingdom; Donna Harman, National Institute for Standards and Technology (NIST), USA; Jaana Kekäläinen, University of Tampere, Finland; Emanuele Pianta, Centre for the Evaluation of Language and Communication Technologies (CELCT), Italy; Maarten de Rijke, University of Amsterdam (UvA), The Netherlands.

  12. CLEF 2010

  13. What and how to innovate; change in the coordination; funding.

  14. The community is the key to success; CLEF 2010 as a bridge to the future.

  15. Scientific and Technological Advancement in Multilingual and Multimedia Information Systems [diagram: Labs and Workshops; Conference]

  16. Conference: two days; large program committee; keynote talks and panels; publication in Springer LNCS; no more LNCS post-proceedings of the CLEF working notes. Labs: two days (more space than in CLEF classic); lab selection committee; online publication (with ISBN) in time for the conference; lab organizers are responsible for individual outcomes and for post-conference publication (special issues, ...).

  17. Honorary Chair: Carol Peters, ISTI-CNR, Italy. General Chairs: Maristella Agosti, University of Padua, Italy; Maarten de Rijke, University of Amsterdam, The Netherlands. Program Chairs: Nicola Ferro, University of Padua, Italy; Alan Smeaton, Dublin City University, Ireland. Lab Chairs: Martin Braschler, Zurich University of Applied Sciences, Switzerland; Donna Harman, NIST, USA.

  18. 12 papers (8 full papers and 4 short papers) out of 21 submissions (17 full papers and 4 short papers). Two keynote talks: Norbert Fuhr, “IR Between Science and Engineering, and the Role of Experimentation”; Ricardo Baeza-Yates, “Retrieval Evaluation in Practice”. Other evaluation initiatives: Ellen Voorhees for TREC; Noriko Kando for NTCIR; Prasenjit Majumder for FIRE; Jaap Kamps for INEX; Pavel Braslavski for ROMIP. Two panels, including Donna Harman, Noriko Kando, Mounia Lalmas, and Carol Peters, “The Four Ladies of Experimental Evaluation”.

  19. Benchmarking activities: CLEF-IP, a benchmarking activity on intellectual property; ImageCLEF, a benchmarking activity on image retrieval; PAN, a benchmarking activity on plagiarism detection; RespubliQA, a benchmarking activity on question answering using multilingual political data; WePS, a benchmarking activity on web people search. Workshops: CriES, a workshop aimed at exploring the evaluation of search for expertise in social media; LogCLEF, a workshop aimed at exploring methodologies for studying search engine log files.

  20. CLEF 2011

  21. General chairs: Julio Gonzalo, National Distance Education University (UNED), Spain; Maarten de Rijke, University of Amsterdam, The Netherlands. Program chairs: Jaana Kekäläinen, University of Tampere, Finland; Mounia Lalmas, Yahoo! Research Barcelona, Spain. Lab chairs: Paul Clough, University of Sheffield, United Kingdom.
