TY - GEN
T1 - A mixture model framework for class discovery and outlier detection in mixed labeled/unlabeled data sets
AU - Miller, David Jonathan
AU - Browning, John
N1 - Publisher Copyright:
© 2003 IEEE.
PY - 2003
Y1 - 2003
N2 - Several authors have addressed learning a classifier given a mixed labeled/unlabeled training set. These works assume each unlabeled sample originates from one of the (known) classes. Here, we consider the scenario in which unlabeled points may belong either to known/predefined or to heretofore undiscovered classes. There are several practical situations where such data may arise. We earlier proposed a novel statistical mixture model to fit this mixed data. Here we review this method and also introduce an alternative model. Our fundamental strategy is to view as observed data not only the feature vector and the class label, but also the fact of label presence/absence for each point. Two types of mixture components are posited to explain label presence/absence. "Predefined" components generate both labeled and unlabeled points and assume labels are missing at random. These components represent the known classes. "Non-predefined" components only generate unlabeled points; thus, in localized regions, they capture data subsets that are exclusively unlabeled. Such subsets may represent an outlier distribution, or new classes. The components' predefined/non-predefined natures are data-driven, learned along with the other parameters via an algorithm based on expectation-maximization (EM). There are three natural applications: 1) robust classifier design, given a mixed training set with outliers; 2) classification with rejections; 3) identification of the unlabeled points (and their representative components) that originate from unknown classes, i.e., new class discovery. The effectiveness of our models in discovering purely unlabeled data components (potential new classes) is evaluated both on synthetic and real data sets. Although each of our models has its own advantages, our original model is found to achieve the best class discovery results.
AB - Several authors have addressed learning a classifier given a mixed labeled/unlabeled training set. These works assume each unlabeled sample originates from one of the (known) classes. Here, we consider the scenario in which unlabeled points may belong either to known/predefined or to heretofore undiscovered classes. There are several practical situations where such data may arise. We earlier proposed a novel statistical mixture model to fit this mixed data. Here we review this method and also introduce an alternative model. Our fundamental strategy is to view as observed data not only the feature vector and the class label, but also the fact of label presence/absence for each point. Two types of mixture components are posited to explain label presence/absence. "Predefined" components generate both labeled and unlabeled points and assume labels are missing at random. These components represent the known classes. "Non-predefined" components only generate unlabeled points; thus, in localized regions, they capture data subsets that are exclusively unlabeled. Such subsets may represent an outlier distribution, or new classes. The components' predefined/non-predefined natures are data-driven, learned along with the other parameters via an algorithm based on expectation-maximization (EM). There are three natural applications: 1) robust classifier design, given a mixed training set with outliers; 2) classification with rejections; 3) identification of the unlabeled points (and their representative components) that originate from unknown classes, i.e., new class discovery. The effectiveness of our models in discovering purely unlabeled data components (potential new classes) is evaluated both on synthetic and real data sets. Although each of our models has its own advantages, our original model is found to achieve the best class discovery results.
UR - http://www.scopus.com/inward/record.url?scp=84945174346&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84945174346&partnerID=8YFLogxK
U2 - 10.1109/NNSP.2003.1318048
DO - 10.1109/NNSP.2003.1318048
M3 - Conference contribution
AN - SCOPUS:84945174346
T3 - Neural Networks for Signal Processing - Proceedings of the IEEE Workshop
SP - 489
EP - 498
BT - 2003 IEEE 13th Workshop on Neural Networks for Signal Processing, NNSP 2003
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 13th IEEE Workshop on Neural Networks for Signal Processing, NNSP 2003
Y2 - 17 September 2003 through 19 September 2003
ER -