CLEF 2024 : Conference and Labs of the Evaluation Forum - Information Access Evaluation meets Multilinguality, Multimodality, and Visualization

Grenoble, France
Event Date: September 09, 2024 - September 12, 2024
Submission Deadline: May 10, 2024
Notification of Acceptance: June 07, 2024
Camera Ready Version Due: June 21, 2024

Call for Papers

Important Dates (Time zone: Anywhere on Earth)

· Submission of Long, Short, Best of 2023 Labs Papers: 10 May, 2024
· Notification of Acceptance: 7 June, 2024
· Camera Ready Copy due: 21 June, 2024
· Conference: 9-12 September, 2024


Aim and Scope

The CLEF Conference addresses all aspects of Information Access in any modality and language. The CLEF conference includes presentation of research papers and a series of workshops presenting the results of lab-based comparative evaluation benchmarks.
CLEF 2024 is the 15th CLEF conference, continuing the popular CLEF campaigns that have run since 2000 and have contributed to the systematic evaluation of information access systems, primarily through experimentation on shared tasks. The CLEF conference has a clear focus on experimental IR as carried out within evaluation forums (e.g., CLEF Labs, TREC, NTCIR, FIRE, MediaEval, RomIP, SemEval, and TAC), with special attention to the challenges of multimodality, multilinguality, and interactive search, also considering specific classes of users such as children, students, and impaired users across different tasks (e.g., academic, professional, or everyday-life). We invite paper submissions on significant new insights demonstrated on IR test collections, on analyses of IR test collections and evaluation measures, and on concrete proposals to push the boundaries of the Cranfield-style evaluation paradigm.
All submissions to the CLEF main conference will be reviewed on the basis of relevance, originality, importance, and clarity. CLEF welcomes papers that describe rigorous hypothesis testing regardless of whether the results are positive or negative. CLEF also welcomes analyses of past runs, results, and data, as well as new data collections. Methods are expected to be described so that they are reproducible by others, and the logic of the research design should be clearly laid out in the paper. The conference proceedings will be published in the Springer Lecture Notes in Computer Science (LNCS).


Topics

Relevant topics for the CLEF 2024 Conference include but are not limited to:
· Information Access in any language or modality: information retrieval, image retrieval, question answering, search interfaces and design, infrastructures, etc.
· Analytics for Information Retrieval: theoretical and practical results in the analytics field that are specifically targeted for information access data analysis, data enrichment, etc.
· Reproducibility and replicability issues: analyses of past results and runs, both statistical and fine-grained.
· Language diversity: work on less-resourced languages.
· Models leveraging collaborative and social data and their evaluation.
· User studies either based on lab studies or crowdsourcing.
· Evaluation initiatives: conclusions, lessons learned, impact and projection of any evaluation initiative after completing their cycle.
· Evaluation: methodologies, metrics, statistical and analytical tools, component based, user groups and use cases, ground-truth creation, impact of multilingual/multicultural/multimodal differences, etc.
· Technology transfer: economic impact/sustainability of information access approaches, deployment and exploitation of systems, use cases, etc.
· Interactive and Conversational Information Retrieval evaluation: the interactive/conversational evaluation of information retrieval systems using user-centered methods, evaluation of novel search interfaces, novel interactive/conversational evaluation methods, simulation of interaction/conversation, etc.
· Specific application domains: information access and its evaluation in application domains such as cultural heritage, digital libraries, social media, health information, legal documents, patents, news, books, and in the form of text, audio and/or image data.
· New data collections: presentation of new data collections with potential high impact on future research, specific collections from companies or labs, multilingual collections.


Format

Authors are invited to electronically submit original papers, which have not been published and are not under consideration elsewhere, using the LNCS proceedings format:
http://www.springer.com/it/computer-science/lncs/conference-proceedings-guidelines
Two types of papers are solicited:
· Long papers: 12 pages max (including references). Intended to report complete research work.
· Short papers: 6 pages max (including references). Position papers, new evaluation proposals, developments and applications, etc.


Review Process

Authors of long and short papers are asked to submit the following TWO versions of their manuscript:
Methodology version: This version does NOT report anything related to the results of the study. At this stage, manuscripts will be evaluated based on the importance of the problem addressed and the soundness of the methodology. Manuscripts may include an introduction, a description of the proposed methodology, and the datasets used; however, there should be no results or discussion sections. The authors should also remove mentions of results from the remaining sections (e.g., abstract, introduction).
Experimental version: This is the full version of the manuscript that contains all the sections of the paper including the experiments and results.
Papers will be peer-reviewed by three members of the program committee in two stages. At the first stage, the members will review the methodology version of the manuscripts based on originality and methodology. At the second stage, the full version of the manuscripts that passed the first stage will be reviewed. Selection will be based on originality, clarity, and technical quality.
The deadline for the submission of both versions is 10th of May.


Paper Submission

Papers should be submitted in PDF format to the following address:
https://easychair.org/my/conference?conf=clef2024
· Submit the methodology version to the "Conference - Methodology Part" track
· Submit the experimental version to the "Conference - Experimental Part" track


Organization

General Chairs
Lorraine Goeuriot, Université Grenoble Alpes (France)
Philippe Mulhem, Université Grenoble Alpes (France)
Georges Quénot, Université Grenoble Alpes (France)
Didier Schwab, Université Grenoble Alpes (France)

Program Chairs
Laure Soulier - Sorbonne Université (France)
Giorgio Maria Di Nunzio - University of Padua (Italy)

Evaluation Lab Chairs
Petra Galuscakova, University of Stavanger (Norway)
Alba García Seco de Herrera, University of Essex (UK)

Lab Mentorship Chairs
Liana Ermakova, Université de Bretagne Occidentale (France)
Florina Piroi, TU Wien (Austria)


Summary

CLEF 2024 : Conference and Labs of the Evaluation Forum - Information Access Evaluation meets Multilinguality, Multimodality, and Visualization will take place in Grenoble, France. It is a four-day event, starting on Sep 9, 2024 (Monday) and ending on Sep 12, 2024 (Thursday).

Submissions for this conference can be made by May 10, 2024. Authors can expect the notification of acceptance by Jun 7, 2024. Upon acceptance, authors should submit the final version of the manuscript on or before Jun 21, 2024 through the official website of the conference.

Please check the official event website for possible changes before making any travel arrangements; events are generally strict about their deadlines, and all dates should be verified there.

Other Details of CLEF 2024

  • Short Name: CLEF 2024
  • Full Name: Conference and Labs of the Evaluation Forum - Information Access Evaluation meets Multilinguality, Multimodality, and Visualization
  • Timing: 09:00 AM-06:00 PM (expected)
  • Fees: Check the official website of CLEF 2024
  • Event Type: Conference
  • Website Link: https://clef2024.clef-initiative.eu/
  • Location/Address: Grenoble, France



