S. Alelyani, Z. Zhao, and H. Liu, A dilemma in assessing stability of feature selection algorithms, Proceedings of the 13th IEEE International Conference on High Performance Computing and Communications, pp.701-707, 2011.

C. Bénard, G. Biau, S. Da Veiga, and E. Scornet, SIRUS: Making random forests interpretable, 2019.

A. Boulesteix and M. Slawski, Stability and aggregation of ranked gene lists, Briefings in Bioinformatics, vol.10, pp.556-568, 2009.

L. Breiman, Random forests, Machine Learning, vol.45, pp.5-32, 2001.

L. Breiman, Statistical modeling: The two cultures (with comments and a rejoinder by the author), Statistical Science, vol.16, pp.199-231, 2001.

L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and Regression Trees, 1984.

A. Chao, R. L. Chazdon, R. K. Colwell, and T. Shen, Abundance-based similarity indices and their estimation when there are unseen species in samples, Biometrics, vol.62, pp.361-371, 2006.

P. Clark and T. Niblett, The CN2 induction algorithm, Machine Learning, vol.3, pp.261-283, 1989.

W. W. Cohen, Fast effective rule induction, Proceedings of the 12th International Conference on Machine Learning, pp.115-123, 1995.

W. W. Cohen and Y. Singer, A simple, fast, and effective rule learner, Proceedings of the 16th National Conference on Artificial Intelligence and 11th Conference on Innovative Applications of Artificial Intelligence, pp.335-342, 1999.

K. Dembczyński, W. Kotłowski, and R. Słowiński, ENDER: A statistical framework for boosting decision rules, Data Mining and Knowledge Discovery, vol.21, pp.52-90, 2010.

D. Dua and C. Graff, UCI machine learning repository, 2017.

B. Efron, T. Hastie, I. Johnstone, and R. Tibshirani, Least angle regression, The Annals of Statistics, vol.32, pp.407-499, 2004.

M. Fokkema, pre: An R package for fitting prediction rule ensembles, 2017.

E. Frank and I. H. Witten, Generating accurate rule sets without global optimization, Proceedings of the 15th International Conference on Machine Learning, pp.144-151, 1998.

J. Friedman, T. Hastie, and R. Tibshirani, The Elements of Statistical Learning, vol.1, 2001.

J. Friedman, T. Hastie, and R. Tibshirani, Regularization paths for generalized linear models via coordinate descent, Journal of Statistical Software, vol.33, issue 1, pp.1-22, 2010.

J. H. Friedman and B. E. Popescu, Importance sampled learning ensembles, Technical report, Stanford University, pp.1-32, 2003.

J. H. Friedman and B. E. Popescu, Predictive learning via rule ensembles, The Annals of Applied Statistics, vol.2, pp.916-954, 2008.

J. Fürnkranz and G. Widmer, Incremental reduced error pruning, Proceedings of the 11th International Conference on Machine Learning, pp.70-77, 1994.

Z. He and W. Yu, Stable feature selection for biomarker discovery, Computational Biology and Chemistry, vol.34, pp.215-225, 2010.

B. Letham, C. Rudin, T. H. McCormick, and D. Madigan, Interpretable classifiers using rules and Bayesian analysis: Building a better stroke prediction model, The Annals of Applied Statistics, vol.9, pp.1350-1371, 2015.

Z. C. Lipton, The mythos of model interpretability, 2016.

G. Louppe, Understanding random forests: From theory to practice, 2014.

V. Margot, J. Baudry, F. Guilloux, and O. Wintenberger, Rule induction partitioning estimator, 2018.

V. Margot, J. Baudry, F. Guilloux, and O. Wintenberger, Consistent regression using data-dependent coverings, 2019.
URL : https://hal.archives-ouvertes.fr/hal-02170687

N. Meinshausen, Node harvest, The Annals of Applied Statistics, vol.4, pp.2049-2072, 2010.

N. Meinshausen, Package 'nodeharvest', 2015.

L. Mentch and G. Hooker, Quantifying uncertainty in random forests via confidence intervals and hypothesis tests, Journal of Machine Learning Research, vol.17, pp.841-881, 2016.

W. J. Murdoch, C. Singh, K. Kumbier, R. Abbasi-asl, and B. Yu, Interpretable machine learning: Definitions, methods, and applications, 2019.

J. R. Quinlan, C4.5: Programs for Machine Learning, 1992.

R. L. Rivest, Learning decision lists, Machine Learning, vol.2, pp.229-246, 1987.

E. Scornet, G. Biau, and J. Vert, Consistency of random forests, The Annals of Statistics, vol.43, issue.4, pp.1716-1741, 2015.
URL : https://hal.archives-ouvertes.fr/hal-00990008

R. Tibshirani, Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, vol.58, pp.267-288, 1996.

A. W. van der Vaart, Asymptotic Statistics, vol.3, 2000.

D. Wei, S. Dash, T. Gao, and O. Günlük, Generalized linear rule models, 2019.

S. M. Weiss and N. Indurkhya, Lightweight rule induction, Proceedings of the 17th International Conference on Machine Learning, pp.1135-1142, 2000.

M. N. Wright and A. Ziegler, ranger: A fast implementation of random forests for high dimensional data in C++ and R, Journal of Statistical Software, vol.77, pp.1-17, 2017.

B. Yu, Stability, Bernoulli, vol.19, pp.1484-1500, 2013.

B. Yu and K. Kumbier, Three principles of data science: Predictability, computability, and stability (PCS), 2019.

M. Zucknick, S. Richardson, and E. A. Stronach, Comparing the characteristics of gene expression profiles derived by univariate and multivariate classification methods, Statistical Applications in Genetics and Molecular Biology, vol.7, pp.1-34, 2008.