Book Chapter

CountARFactuals – Generating Plausible Model-Agnostic Counterfactual Explanations with Adversarial Random Forests

Susanne Dandl, Kristin Blesch, Timo Freiesleben, Gunnar König, Jan Kapar, Bernd Bischl, Marvin N. Wright

Year: 2024   Series: Communications in Computer and Information Science   Pages: 85-107   Publisher: Springer Science+Business Media

Abstract

Counterfactual explanations elucidate algorithmic decisions by pointing to scenarios that would have led to an alternative, desired outcome. Giving insight into the model’s behavior, they hint users towards possible actions and give grounds for contesting decisions. As a crucial factor in achieving these goals, counterfactuals must be plausible, i.e., describing realistic alternative scenarios within the data manifold. This paper leverages a recently developed generative modeling technique – adversarial random forests (ARFs) – to efficiently generate plausible counterfactuals in a model-agnostic way. ARFs can serve as a plausibility measure or directly generate counterfactual explanations. Our ARF-based approach surpasses the limitations of existing methods that aim to generate plausible counterfactual explanations: It is easy to train and computationally highly efficient, handles continuous and categorical data naturally, and allows integrating additional desiderata such as sparsity in a straightforward manner.
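The abstract's core idea can be illustrated with a minimal sketch. Note this is a generic model-agnostic random-search baseline, not the paper's ARF-based method: the toy classifier `predict` and the helper `counterfactual` are hypothetical names chosen for illustration, and no plausibility constraint (the paper's main contribution) is enforced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy black-box classifier: "approve" (1) if income minus twice the debt
# is positive. Any predict function works; the search is model-agnostic.
def predict(x):
    return int(x[0] - 2 * x[1] > 0)

def counterfactual(x, predict_fn, target=1, n_samples=5000, scale=1.0):
    """Naive counterfactual search: sample random perturbations of x,
    keep those that reach the target class, and return the closest one
    in L1 distance (a simple sparsity/proximity proxy)."""
    best, best_dist = None, np.inf
    for _ in range(n_samples):
        cand = x + rng.normal(0.0, scale, size=x.shape)
        if predict_fn(cand) == target:
            dist = np.abs(cand - x).sum()
            if dist < best_dist:
                best, best_dist = cand, dist
    return best

x = np.array([1.0, 1.0])      # rejected instance: 1 - 2*1 < 0
cf = counterfactual(x, predict)
```

The ARF approach in the paper replaces this blind perturbation step with samples drawn from a generative model of the data, so candidates stay on the data manifold (plausible) rather than anywhere in feature space.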

Keywords:
Counterfactual thinking, Adversarial system, Random forest, Computer science, Artificial intelligence, Machine learning, Natural language processing, Epistemology, Philosophy

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 2.52
References: 37
Citation Normalized Percentile: 0.84


Topics

Adversarial Robustness in Machine Learning (Physical Sciences → Computer Science → Artificial Intelligence)
Explainable Artificial Intelligence (XAI) (Physical Sciences → Computer Science → Artificial Intelligence)
Anomaly Detection Techniques and Applications (Physical Sciences → Computer Science → Artificial Intelligence)