A dilemma in assessing stability of feature selection algorithms, 13th IEEE International Conference on High Performance Computing & Communication, pp.701-707, 2011.
C. Bénard, G. Biau, S. da Veiga, E. Scornet, SIRUS: Making random forests interpretable, 2019.
A.-L. Boulesteix, M. Slawski, Stability and aggregation of ranked gene lists, Briefings in Bioinformatics, vol.10, pp.556-568, 2009.
L. Breiman, Random forests, Machine Learning, vol.45, pp.5-32, 2001.
L. Breiman, Statistical modeling: The two cultures (with comments and a rejoinder by the author), Statistical Science, vol.16, pp.199-231, 2001.
L. Breiman, J. Friedman, R. Olshen, C. Stone, Classification and Regression Trees, 1984.
A. Chao, R.L. Chazdon, R.K. Colwell, T.-J. Shen, Abundance-based similarity indices and their estimation when there are unseen species in samples, Biometrics, vol.62, pp.361-371, 2006.
P. Clark, T. Niblett, The CN2 induction algorithm, Machine Learning, vol.3, pp.261-283, 1989.
W.W. Cohen, Fast effective rule induction, Proceedings of the 12th International Conference on Machine Learning, pp.115-123, 1995.
W.W. Cohen, Y. Singer, A simple, fast, and effective rule learner, Proceedings of the 16th National Conference on Artificial Intelligence and 11th Conference on Innovative Applications of Artificial Intelligence, pp.335-342, 1999.
K. Dembczyński, W. Kotłowski, R. Słowiński, ENDER: A statistical framework for boosting decision rules, Data Mining and Knowledge Discovery, vol.21, pp.52-90, 2010.
UCI machine learning repository, 2017.
B. Efron, T. Hastie, I. Johnstone, R. Tibshirani, Least angle regression, The Annals of Statistics, vol.32, pp.407-499, 2004.
M. Fokkema, PRE: An R package for fitting prediction rule ensembles, 2017.
E. Frank, I.H. Witten, Generating accurate rule sets without global optimization, Proceedings of the 15th International Conference on Machine Learning, pp.144-151, 1998.
J. Friedman, T. Hastie, R. Tibshirani, The Elements of Statistical Learning, vol.1, 2001.
J. Friedman, T. Hastie, R. Tibshirani, Regularization paths for generalized linear models via coordinate descent, Journal of Statistical Software, vol.33, pp.1-22, 2010.
J.H. Friedman, B.E. Popescu, Importance sampled learning ensembles, Technical report, Stanford University, pp.1-32, 2003.
J.H. Friedman, B.E. Popescu, Predictive learning via rule ensembles, The Annals of Applied Statistics, vol.2, pp.916-954, 2008.
J. Fürnkranz, G. Widmer, Incremental reduced error pruning, Proceedings of the 11th International Conference on Machine Learning, pp.70-77, 1994.
Z. He, W. Yu, Stable feature selection for biomarker discovery, Computational Biology and Chemistry, vol.34, pp.215-225, 2010.
B. Letham, C. Rudin, T.H. McCormick, D. Madigan, Interpretable classifiers using rules and Bayesian analysis: Building a better stroke prediction model, The Annals of Applied Statistics, vol.9, pp.1350-1371, 2015.
Z.C. Lipton, The mythos of model interpretability, 2016.
G. Louppe, Understanding random forests: From theory to practice, 2014.
V. Margot, J.-P. Baudry, F. Guilloux, O. Wintenberger, Rule induction partitioning estimator, 2018.
V. Margot, J.-P. Baudry, F. Guilloux, O. Wintenberger, Consistent regression using data-dependent coverings, 2019. URL: https://hal.archives-ouvertes.fr/hal-02170687
N. Meinshausen, Node harvest, The Annals of Applied Statistics, vol.4, pp.2049-2072, 2010.
N. Meinshausen, Package 'nodeharvest', 2015.
L. Mentch, G. Hooker, Quantifying uncertainty in random forests via confidence intervals and hypothesis tests, Journal of Machine Learning Research, vol.17, pp.841-881, 2016.
W.J. Murdoch, C. Singh, K. Kumbier, R. Abbasi-Asl, B. Yu, Interpretable machine learning: Definitions, methods, and applications, 2019.
J.R. Quinlan, C4.5: Programs for Machine Learning, 1992.
R.L. Rivest, Learning decision lists, Machine Learning, vol.2, pp.229-246, 1987.
E. Scornet, G. Biau, J.-P. Vert, Consistency of random forests, The Annals of Statistics, vol.43, pp.1716-1741, 2015. URL: https://hal.archives-ouvertes.fr/hal-00990008
R. Tibshirani, Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, pp.267-288, 1996.
A.W. van der Vaart, Asymptotic Statistics, vol.3, 2000.
D. Wei, S. Dash, T. Gao, O. Günlük, Generalized linear rule models, 2019.
S.M. Weiss, N. Indurkhya, Lightweight rule induction, Proceedings of the 17th International Conference on Machine Learning, pp.1135-1142, 2000.
M.N. Wright, A. Ziegler, ranger: A fast implementation of random forests for high dimensional data in C++ and R, Journal of Statistical Software, vol.77, pp.1-17, 2017.
B. Yu, Stability, Bernoulli, vol.19, pp.1484-1500, 2013.
B. Yu, K. Kumbier, Three principles of data science: Predictability, computability, and stability (PCS), 2019.
Comparing the characteristics of gene expression profiles derived by univariate and multivariate classification methods, Statistical Applications in Genetics and Molecular Biology, vol.7, pp.1-34, 2008.