
A non-specialized ensemble classifier using multi-objective optimization

journal contribution
posted on 29.03.2021, 01:02 by Samuel P Fletcher, Brijesh Verma, Mengjie Zhang
Ensemble classification algorithms are often designed for data with certain properties, such as imbalanced class labels, a large number of attributes, or continuous data. While high-performing, these algorithms sacrifice performance when applied to data outside the targeted domain. We propose a non-specialized ensemble classification algorithm that uses multi-objective optimization instead of relying on heuristics and fragile user-defined parameters. Only two user-defined parameters are included, with both being found to have large windows of values that produce statistically indistinguishable results, indicating the low level of expertise required from the user to achieve good results. Additionally, when given a large initial set of trained base-classifiers, we demonstrate that a multi-objective genetic algorithm aiming to optimize prediction accuracy and diversity will prefer particular types of classifiers over others. The total number of chosen classifiers is also surprisingly small: only 10.14 classifiers on average, out of an initial pool of 900. This occurs without any explicit preference for small ensembles of classifiers. Even with these small ensembles, significantly lower empirical classification error is achieved compared to the current state-of-the-art. © 2020 Elsevier B.V.
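The general idea described in the abstract, selecting a subset of base-classifiers by jointly optimizing prediction accuracy and diversity with a genetic algorithm, can be illustrated with a minimal sketch. This is not the authors' algorithm; it is a hypothetical toy in which each base-classifier is reduced to a fixed vector of validation-set predictions, candidate ensembles are binary masks over the pool, and a Pareto front is kept over two objectives (majority-vote error, to be minimized, and average pairwise disagreement, to be maximized). All names, sizes, and operator choices here are illustrative assumptions.

```python
import random

random.seed(0)

# Toy setup (assumed): each "base-classifier" is a fixed 0/1 prediction
# vector on a shared validation set, each ~70% accurate.
N_CLF, N_SAMPLES = 20, 50
labels = [random.randint(0, 1) for _ in range(N_SAMPLES)]
preds = [[y if random.random() < 0.7 else 1 - y for y in labels]
         for _ in range(N_CLF)]

def ensemble_error(mask):
    """Majority-vote classification error of the subset selected by mask."""
    chosen = [p for p, m in zip(preds, mask) if m]
    if not chosen:
        return 1.0  # empty ensemble: worst possible score
    wrong = 0
    for i, y in enumerate(labels):
        vote = sum(p[i] for p in chosen)
        pred = 1 if 2 * vote > len(chosen) else 0
        wrong += pred != y
    return wrong / N_SAMPLES

def diversity(mask):
    """Average pairwise disagreement rate among the selected classifiers."""
    chosen = [p for p, m in zip(preds, mask) if m]
    if len(chosen) < 2:
        return 0.0
    pairs, disag = 0, 0.0
    for a in range(len(chosen)):
        for b in range(a + 1, len(chosen)):
            pairs += 1
            disag += sum(x != y for x, y in zip(chosen[a], chosen[b])) / N_SAMPLES
    return disag / pairs

def fitness(mask):
    # Both objectives framed as minimization: (error, -diversity).
    return (ensemble_error(mask), -diversity(mask))

def dominates(f1, f2):
    return all(a <= b for a, b in zip(f1, f2)) and f1 != f2

def pareto_front(pop):
    fits = [fitness(m) for m in pop]
    return [m for m, f in zip(pop, fits)
            if not any(dominates(g, f) for g in fits if g != f)]

def mutate(mask, rate=0.1):
    return [1 - b if random.random() < rate else b for b in mask]

# Simple (mu + lambda)-style loop: keep the nondominated set, refill
# the population with mutated copies of front members.
pop = [[random.randint(0, 1) for _ in range(N_CLF)] for _ in range(30)]
for _ in range(25):
    front = pareto_front(pop)
    pop = front + [mutate(random.choice(front))
                   for _ in range(30 - len(front))]
front = pareto_front(pop)
```

Note the sketch has no explicit penalty on ensemble size, mirroring the abstract's observation that small ensembles can emerge without an explicit preference for them; a production implementation would use a proper multi-objective GA such as NSGA-II with crossover and crowding-distance selection.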

Funding

Category 1 - Australian Competitive Grants (this includes ARC, NHMRC)

History

Volume

409

Start Page

93

End Page

102

Number of Pages

10

eISSN

1872-8286

ISSN

0925-2312

Publisher

Elsevier

Language

en

Peer Reviewed

Yes

Open Access

No

Acceptance Date

10/05/2020

External Author Affiliations

Victoria University of Wellington, NZ

Author Research Institute

Centre for Intelligent Systems

Era Eligible

Yes

Journal

Neurocomputing