

Poster

DEPICT: Diffusion-Enabled Permutation Importance for Image Classification Tasks

Sarah Jabbour · Gregory Kondas · Ella Kazerooni · Michael Sjoding · David Fouhey · Jenna Wiens

Strong Double Blind: this paper was not made available on public preprint services during the review process.
Tue 1 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

We propose a permutation-based explanation method for image classifiers. Current explanation methods for image models, such as activation maps, are limited to instance-level explanations in pixel space, making it difficult to understand global model behavior. Permutation-based explanations for tabular data classifiers measure feature importance by comparing a model's original performance to its performance on data in which a feature has been permuted. We propose an analogous explanation method for image-based models that permutes interpretable concepts across dataset images. Given a dataset of images labeled with specific concepts, such as captions, we permute a concept across examples and then generate new images via a text-conditioned diffusion model. Concept importance is then given by the change in classifier performance relative to the unpermuted data. Applied to a set of concepts, the method yields a ranking of concept importance. We show that this approach recovers underlying model feature importance on synthetic and real-world image classification tasks.
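The core ranking logic described above can be sketched as classic permutation importance applied to concept annotations. The minimal example below uses tabular concept features and a toy classifier in place of the paper's diffusion-based image regeneration; the function and variable names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def permutation_concept_importance(model, X, y, concept_names, seed=0):
    """Rank concepts by the drop in accuracy after permuting each
    concept's values across the dataset. (Sketch: the paper permutes
    concepts and regenerates images with a diffusion model; here we
    permute concept columns directly.)"""
    rng = np.random.default_rng(seed)
    base_acc = np.mean(model(X) == y)  # performance on unpermuted data
    drops = {}
    for j, name in enumerate(concept_names):
        Xp = X.copy()
        # Permute this concept's values across examples.
        Xp[:, j] = rng.permutation(Xp[:, j])
        drops[name] = base_acc - np.mean(model(Xp) == y)
    # Larger performance drop -> more important concept.
    return sorted(drops.items(), key=lambda kv: -kv[1])

# Toy setup: the classifier depends only on concept "c0",
# so permuting "c0" should hurt accuracy while "c1" should not.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]] * 25, dtype=float)
y = X[:, 0].astype(int)
model = lambda Z: (Z[:, 0] > 0.5).astype(int)

ranking = permutation_concept_importance(model, X, y, ["c0", "c1"])
```

With this setup, the ranking places "c0" first with a positive accuracy drop, while permuting "c1" leaves performance unchanged, mirroring the concept-importance ordering the method produces.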
