Call for Participation:
Evaluation Campaign on Plagiarism Detection and Wikipedia Vandalism Detection
held in conjunction with the CLEF'10 conference
in Padua, Italy, September 20-23
pan.webis.de
About the Campaign:
Plagiarism detection in text documents is a challenging retrieval task:
today's detection systems are faced with intricate situations, such as
obfuscated plagiarism or plagiarism within and across languages.
Moreover, the source of a plagiarism case may be hidden in a large
collection of documents, or it may not be available at all. Informally,
the respective CLEF-Lab task can be described as follows:
1. Plagiarism Detection.
Given a set of suspicious documents and a set
of source documents, the task is to find all plagiarized sections
in the suspicious documents and, if available, the corresponding
source sections.
Following the success of the 2009 campaign, and based on the experience
gained there, we will provide a revised evaluation corpus consisting of
artificial and simulated plagiarism.
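To give an informal impression of the task's shape (this is not an
official baseline; function names, the n-gram size, and the threshold
below are all illustrative assumptions), a toy detector might compare a
suspicious document against each source via word n-gram overlap:

```python
# Toy sketch: flag source documents whose word 5-gram overlap with a
# suspicious document exceeds a threshold. Purely illustrative; the
# real task also requires locating the plagiarized passages.

def ngrams(text, n=5):
    """Return the set of word n-grams of a text (case-folded)."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def detect(suspicious, sources, n=5, threshold=0.2):
    """Return (source_id, overlap) pairs, highest overlap first,
    for sources sharing at least `threshold` of the suspicious
    document's n-grams."""
    susp = ngrams(suspicious, n)
    if not susp:
        return []
    hits = []
    for src_id, src_text in sources.items():
        overlap = len(susp & ngrams(src_text, n)) / len(susp)
        if overlap >= threshold:
            hits.append((src_id, overlap))
    return sorted(hits, key=lambda h: -h[1])
```

A serious submission would, of course, also handle obfuscation and
cross-language cases, which plain n-gram matching does not.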
Vandalism has always been one of Wikipedia's biggest problems. However,
vandalism detection is still done mostly manually by volunteers, and
research on automatic vandalism detection is in its infancy. Hence,
solutions are needed that aid Wikipedians in their efforts. Informally,
the respective CLEF-Lab task can be described as follows:
2. Wikipedia Vandalism Detection. Given a set of edits on Wikipedia
articles, the task is to identify all edits which are vandalism,
i.e., all edits whose editors had bad intentions.
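As a rough illustration of what an automatic approach might look like
(the feature choices, word list, and threshold below are hypothetical,
not part of the task definition), one could score an edit by surface
features of the newly inserted words:

```python
# Toy sketch: score an edit by simple surface features of the words it
# inserts (shouting in all caps, a tiny profanity lexicon) and flag
# high-scoring edits. Illustrative only; real systems use many more
# features and trained classifiers.

BAD_WORDS = {"stupid", "dumb", "hate"}  # hypothetical toy lexicon

def vandalism_score(old_text, new_text):
    """Return a heuristic score in [0, inf) for an edit."""
    old_words = set(old_text.split())
    inserted = [w for w in new_text.split() if w not in old_words]
    if not inserted:
        return 0.0
    shouting = sum(1 for w in inserted if w.isupper() and len(w) > 3)
    profane = sum(1 for w in inserted if w.lower().strip(".,!?") in BAD_WORDS)
    return (shouting + 2 * profane) / len(inserted)

def is_vandalism(old_text, new_text, threshold=0.3):
    return vandalism_score(old_text, new_text) >= threshold
```

Such hand-crafted heuristics mainly serve to show the input/output
contract: a pair of revisions in, a vandalism decision out.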
Participants are invited to submit results for one or both of the tasks.
Contact:
E-mail: pan at webis.de
Campaign Web page: pan.webis.de