3 files | 20.94 MB

Powershap: a power-full Shapley feature selection method

Jarne Verhaeghe (UGent) , Jeroen Van Der Donckt (UGent) , Femke Ongenae (UGent) and Sofie Van Hoecke (UGent)
Abstract
Feature selection is a crucial step in developing robust and powerful machine learning models. Feature selection techniques can be divided into two categories: filter and wrapper methods. While wrapper methods commonly result in strong predictive performances, they suffer from a large computational complexity and therefore take a significant amount of time to complete, especially when dealing with high-dimensional feature sets. Alternatively, filter methods are considerably faster, but suffer from several other disadvantages, such as (i) requiring a threshold value, (ii) not taking into account intercorrelation between features, and (iii) ignoring feature interactions with the model. To this end, we present powershap, a novel wrapper feature selection method, which leverages statistical hypothesis testing and power calculations in combination with Shapley values for quick and intuitive feature selection. Powershap is built on the core assumption that an informative feature will have a larger impact on the prediction than a known random feature. Benchmarks and simulations show that powershap outperforms other filter methods with predictive performances on par with wrapper methods while being significantly faster, often requiring only half or a third of the execution time. As such, powershap provides a competitive and quick algorithm that can be used by various models in different domains. Furthermore, powershap is implemented as a plug-and-play, open-source sklearn component, enabling easy integration into conventional data science pipelines. User experience is further enhanced by an automatic mode that tunes the hyper-parameters of the powershap algorithm, allowing it to be used without any configuration.
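The core assumption in the abstract can be illustrated with a small sketch. The snippet below is a simplified, hypothetical rendition of powershap's decision rule, not the package's actual API or implementation: per-iteration impact scores (e.g. mean absolute Shapley values from repeated training runs) of each candidate feature are compared against those of a known random feature, and a feature is kept only when the random feature rarely matches or exceeds it.

```python
def powershap_select(feature_impacts, random_impacts, alpha=0.2):
    """Simplified sketch of powershap's core test (hypothetical helper,
    not the library's API). feature_impacts maps a feature name to its
    per-iteration impact scores (e.g. mean |SHAP| per training run);
    random_impacts holds the known random feature's scores for the same
    iterations."""
    selected = []
    for name, impacts in feature_impacts.items():
        # Empirical p-value: fraction of iterations in which the random
        # feature's impact matches or exceeds this feature's impact.
        p = sum(r >= f for f, r in zip(impacts, random_impacts)) / len(impacts)
        if p < alpha:  # the random feature rarely wins -> keep the feature
            selected.append(name)
    return selected

impacts = {
    "informative": [0.90, 0.85, 0.88, 0.92, 0.87],   # consistently large impact
    "noise":       [0.01, 0.02, 0.015, 0.01, 0.02],  # indistinguishable from random
}
random_feature = [0.02, 0.015, 0.02, 0.01, 0.018]
print(powershap_select(impacts, random_feature))  # -> ['informative']
```

The released package goes further than this percentile intuition: as the abstract notes, it combines the hypothesis test with statistical power calculations to decide how many iterations are needed, and wraps the whole procedure as an sklearn-compatible selector.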
Keywords
Open source, Python, Toolkit, Simulation, Benchmark, Shap, Feature selection

Downloads

  • sub 1038.pdf: full text (Author's original) | open access | PDF | 445.48 KB
  • AAM.pdf: full text (Accepted manuscript) | open access | PDF | 438.02 KB
  • (...).pdf: full text (Published version) | UGent only | PDF | 20.05 MB

Citation

Please use this URL to cite or link to this publication:

MLA
Verhaeghe, Jarne, et al. “Powershap: A Power-Full Shapley Feature Selection Method.” MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT I, edited by Massih-Reza Amini et al., vol. 13713, Springer, 2023, pp. 71–87, doi:10.1007/978-3-031-26387-3_5.
APA
Verhaeghe, J., Van Der Donckt, J., Ongenae, F., & Van Hoecke, S. (2023). Powershap: A power-full Shapley feature selection method. In M.-R. Amini, S. Canu, A. Fischer, T. Guns, P. K. Novak, & G. Tsoumakas (Eds.), MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT I (Vol. 13713, pp. 71–87). https://doi.org/10.1007/978-3-031-26387-3_5
Chicago author-date
Verhaeghe, Jarne, Jeroen Van Der Donckt, Femke Ongenae, and Sofie Van Hoecke. 2023. “Powershap: A Power-Full Shapley Feature Selection Method.” In MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT I, edited by Massih-Reza Amini, Stéphane Canu, Asja Fischer, Tias Guns, Petra Kralj Novak, and Grigorios Tsoumakas, 13713:71–87. Springer. https://doi.org/10.1007/978-3-031-26387-3_5.
Chicago author-date (all authors)
Verhaeghe, Jarne, Jeroen Van Der Donckt, Femke Ongenae, and Sofie Van Hoecke. 2023. “Powershap: A Power-Full Shapley Feature Selection Method.” In MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT I, edited by Massih-Reza Amini, Stéphane Canu, Asja Fischer, Tias Guns, Petra Kralj Novak, and Grigorios Tsoumakas, 13713:71–87. Springer. doi:10.1007/978-3-031-26387-3_5.
Vancouver
1.
Verhaeghe J, Van Der Donckt J, Ongenae F, Van Hoecke S. Powershap: a power-full Shapley feature selection method. In: Amini M-R, Canu S, Fischer A, Guns T, Novak PK, Tsoumakas G, editors. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT I. Springer; 2023. p. 71–87.
IEEE
[1]
J. Verhaeghe, J. Van Der Donckt, F. Ongenae, and S. Van Hoecke, “Powershap: a power-full Shapley feature selection method,” in MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT I, Grenoble, France, 2023, vol. 13713, pp. 71–87.
@inproceedings{01GMAZ18TNJCZV7DCSGFWRVM11,
  author       = {{Verhaeghe, Jarne and Van Der Donckt, Jeroen and Ongenae, Femke and Van Hoecke, Sofie}},
  booktitle    = {{MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2022, PT I}},
  editor       = {{Amini, Massih-Reza and Canu, Stéphane and Fischer, Asja and Guns, Tias and Novak, Petra Kralj and Tsoumakas, Grigorios}},
  isbn         = {{9783031263866}},
  issn         = {{2945-9133}},
  keywords     = {{Open source,Python,Toolkit,Simulation,Benchmark,Shap,Feature selection}},
  language     = {{eng}},
  location     = {{Grenoble, France}},
  pages        = {{71--87}},
  publisher    = {{Springer}},
  title        = {{Powershap: a power-full Shapley feature selection method}},
  url          = {{https://doi.org/10.1007/978-3-031-26387-3_5}},
  volume       = {{13713}},
  year         = {{2023}},
}
