Tejaswani Verma, Christoph Lingenfelder, Dietrich Klakow
As the use of black-box models in our daily lives increases, so does the need to explain their predictions. This creates a need for a system that can generate explanations for input-prediction pairs of any given black-box model. First, we highlight the challenges that must be resolved to build such a system and offer insight into why the task is difficult. We then address some of these challenges by designing and implementing an explanation generation system, drawing inspiration from state-of-the-art techniques and models to create multiple variants of the proposed solution. Finally, we test and compare the variants on multiple datasets and briefly discuss how the unresolved challenges may be addressed. These endeavors act as stepping stones toward building the desired explanation system.