KDnuggets : News : 2002 : n22 : item34    (previous | next)

CFP

From: João Gama
Date: 15 Nov 2002
Subject: Spec. Issue of IDA on Learning Systems Dealing with Concept Drift, deadline Feb 1, 2003

SPECIAL ISSUE on INCREMENTAL LEARNING SYSTEMS CAPABLE OF DEALING WITH CONCEPT DRIFT

Special issue Editors:
Miroslav Kubat, University of Miami, USA
João Gama, University of Porto, Portugal
Paul Utgoff, University of Massachusetts, USA

Suppose that a concept description has been induced from a set, T, of training examples.

Suppose that later another set, T', of examples becomes available. What is the most effective way to modify the concept description so that it reflects the examples from T'? In many real-world learning problems the data flows continuously, and learning algorithms should be able to respond to this circumstance.

The first requirement of such algorithms is thus incrementality, the ability to incorporate new information. If the process is not strictly stationary, the target concept may gradually change over time, a fact that should also be reflected in the current version of the induced concept description.
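As a minimal illustration of incrementality (not part of the call itself), consider a toy estimator that absorbs each new example in O(1) time and constant memory, so a later batch T' can be folded in without revisiting T:

```python
class RunningMean:
    """Toy incremental learner: absorbs each new example in O(1),
    without storing or revisiting the original training set T."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0

    def update(self, x):
        # Standard online-mean update: pull the estimate toward x by 1/n.
        self.n += 1
        self.mean += (x - self.mean) / self.n


m = RunningMean()
for x in [1.0, 2.0, 3.0, 4.0]:   # initial training set T
    m.update(x)
for x in [10.0, 10.0]:           # later batch T'
    m.update(x)
print(m.mean)  # → 5.0, identical to the mean of T and T' pooled together
```

The same incremental-update principle, applied to richer hypothesis spaces (decision trees, linear models, etc.), is what the solicited algorithms generalize.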

The ability to react to concept drift can thus be viewed as a natural extension of incremental learning systems. These techniques can also be useful for scaling up learning algorithms to very large datasets. Other types of problems where these techniques could be potentially useful include user modelling, control in dynamic environments, web mining, time series, etc.
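One simple (and purely illustrative) way an incremental learner can react to drift is to forget old data, e.g. by estimating over a sliding window of recent examples only; the sketch below assumes a one-dimensional stream whose underlying concept shifts abruptly:

```python
from collections import deque

class WindowedMean:
    """Toy drift-aware estimator: tracks the mean of a stream using only
    the most recent `window` examples, so stale data is forgotten and
    the estimate can follow a drifting target."""

    def __init__(self, window=50):
        self.buf = deque(maxlen=window)  # old items fall off automatically

    def update(self, x):
        self.buf.append(x)

    def estimate(self):
        return sum(self.buf) / len(self.buf)


# Stream whose underlying concept shifts abruptly halfway through.
stream = [0.0] * 200 + [1.0] * 200
est = WindowedMean(window=50)
for x in stream:
    est.update(x)
print(est.estimate())  # → 1.0: the window has fully converged to the new concept
```

A cumulative (non-forgetting) mean over the same stream would still sit near 0.5, illustrating why plain incrementality alone is not enough under drift.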

Most evaluation methods for machine learning (e.g. cross-validation) assume that examples are independent and identically distributed. This assumption is clearly unrealistic in the presence of concept drift. How can we estimate the performance of learning systems under these constraints?
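One answer studied in the streaming setting is prequential ("test-then-train") evaluation: each example is first used to score the current model and only then to update it, with the error averaged over a sliding window so the estimate tracks a changing distribution. A minimal sketch (the windowed-majority learner below is a hypothetical stand-in, not something prescribed by this call):

```python
from collections import Counter, deque

def prequential_error(stream, predict, update, window=100):
    """Test-then-train evaluation: score the current model on each example
    before learning from it; report the error rate over the last `window`
    examples, so the estimate follows a drifting distribution."""
    recent = deque(maxlen=window)
    for x, y in stream:
        recent.append(0 if predict(x) == y else 1)  # test first...
        update(x, y)                                # ...then train
    return sum(recent) / len(recent)


# Hypothetical learner: predicts the majority label among the last 50 examples.
labels = deque(maxlen=50)
predict = lambda x: Counter(labels).most_common(1)[0][0] if labels else 0
update = lambda x, y: labels.append(y)

# The target label flips halfway through: an abrupt concept drift.
stream = [(None, 0)] * 300 + [(None, 1)] * 300
err = prequential_error(stream, predict, update, window=100)
print(err)  # → 0.0: the windowed error shows the learner has adapted
```

Because every example is scored before it is learned from, no held-out set is needed and the estimate remains meaningful even when the i.i.d. assumption fails.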

The objective of the special issue is to present the current status of algorithms, applications, and evaluation methods for these problems. Relevant techniques include the following (but are not limited to):
1. Incremental, online, real-time, and any-time learning algorithms
2. Algorithms that learn in the presence of concept drift
3. Evaluation methods for dynamic instance distributions
4. Real-world applications that involve online learning
5. Theory on learning under concept drift

Submission Details: We expect full papers to describe original, previously unpublished research, to be written in English, and not to be simultaneously submitted for publication elsewhere (previous publication of partial results at workshops with informal proceedings is allowed). We will also consider high-quality surveys on these topics.

Please submit a PostScript or PDF file of your paper to: jgama@liacc.up.pt

Important Dates:
Submission Deadline:  1 February 2003
Author Notification:  1 July 2003
Final Paper Deadline: 1 September 2003

