F-Measure Maximization in Multi-Label Classification with Conditionally Independent Label Subsets
Abstract
We discuss a method to improve the exact F-measure maximization algorithm called GFM, proposed in [2] for multi-label classification, assuming the label set can be partitioned into conditionally independent subsets given the input features. If the labels were all independent, the estimation of only m parameters (m denoting the number of labels) would suffice to derive Bayes-optimal predictions in O(m^2) operations [10]. In the general case, GFM requires m^2 + 1 parameters and solves the problem in O(m^3) operations. In this work, we show that the number of parameters can be further reduced to m^2/n in the best case, assuming the label set can be partitioned into n conditionally independent subsets. As this label partition needs to be estimated from the data beforehand, we first use the procedure proposed in [4] to find such a partition, and then infer the required parameters locally in each label subset. The latter are aggregated and serve as input to GFM to form the Bayes-optimal prediction. We show on a synthetic experiment that the reduction in the number of parameters brings significant benefits in terms of performance.
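For context, the sketch below illustrates the GFM decision step referred to above, assuming the m^2 parameters P[i, s-1] = P(y_i = 1, |y| = s) and the scalar p0 = P(y = 0) have already been estimated (in the proposed method, aggregated from the label subsets). The function and variable names, and the use of NumPy, are illustrative rather than taken from the paper.

```python
import numpy as np

def gfm_predict(P: np.ndarray, p0: float) -> np.ndarray:
    """Bayes-optimal F1 prediction from the m x m matrix P and p0 = P(y = 0)."""
    m = P.shape[0]
    s = np.arange(1, m + 1)                # possible numbers of relevant labels
    k = np.arange(1, m + 1)                # possible numbers of predicted labels
    W = 2.0 / (s[:, None] + k[None, :])    # W[s-1, k-1] = 2 / (s + k) for F1
    Delta = P @ W                          # Delta[i, k-1]: gain of including label i when |h| = k

    # Predicting the empty set scores P(y = 0) in expectation.
    best_F, best_h = p0, np.zeros(m, dtype=int)
    for kk in range(1, m + 1):
        top = np.argsort(Delta[:, kk - 1])[::-1][:kk]  # kk labels with largest gain
        expected_F = Delta[top, kk - 1].sum()
        if expected_F > best_F:
            best_F = expected_F
            best_h = np.zeros(m, dtype=int)
            best_h[top] = 1
    return best_h
```

The O(m^3) cost mentioned in the abstract comes from the matrix product P @ W; the contribution discussed here concerns how many of the entries of P must actually be estimated when the labels decompose into conditionally independent subsets.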