EraseAnything: Enabling Concept Erasure in Rectified Flow Transformers

ICML 2025

1USTC, 2Eliza Labs, 3NTU, 4BUPT, 5A*STAR, 6Alibaba Tongyi Lab, 7TeleAI, 8Beihang University

We propose EraseAnything, a novel concept-erasure method for rectified flow transformers. Our results demonstrate that it effectively erases concepts of many kinds: abstract concepts, specific objects, qualitative concepts, and more.

Abstract

Removing unwanted concepts from large-scale text-to-image (T2I) diffusion models while maintaining their overall generative quality remains an open challenge. This difficulty is especially pronounced in emerging paradigms, such as Stable Diffusion (SD) v3 and Flux, which incorporate flow matching and transformer-based architectures. These advancements limit the transferability of existing concept-erasure techniques that were originally designed for the previous T2I paradigm (e.g., SD v1.4). In this work, we introduce EraseAnything, the first method specifically developed to address concept erasure within the latest flow-based T2I framework. We formulate concept erasure as a bi-level optimization problem, employing LoRA-based parameter tuning and an attention map regularizer to selectively suppress undesirable activations. Furthermore, we propose a self-contrastive learning strategy to ensure that removing unwanted concepts does not inadvertently harm performance on unrelated ones. Experimental results demonstrate that EraseAnything successfully fills the research gap left by earlier methods in this new T2I paradigm, achieving state-of-the-art performance across a wide range of concept erasure tasks.
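The abstract names three ingredients: an erasure objective optimized through LoRA parameters, an attention-map regularizer that suppresses activations tied to the unwanted concept, and a self-contrastive term that keeps the tuned model close to the frozen original on unrelated prompts. A minimal sketch of how such a combined training loss could look, assuming a PyTorch setup; the function name, tensor shapes, and weighting coefficients are illustrative assumptions, not the paper's actual implementation:

```python
import torch
import torch.nn.functional as F


def erasure_loss(attn_maps, concept_mask,
                 pred_erased, pred_anchor,
                 pred_erased_unrelated, pred_orig_unrelated,
                 lam_attn=0.1, lam_keep=0.1):
    """Toy combination of the three terms sketched in the abstract.

    attn_maps:     (B, heads, tokens) cross-attention weights over text tokens
    concept_mask:  (tokens,) boolean mask marking the unwanted concept's tokens
    pred_*:        flow/noise predictions from the LoRA-tuned ("erased"),
                   anchor, and frozen original models (hypothetical inputs)
    """
    # (1) Erasure term: on prompts containing the unwanted concept, pull the
    #     LoRA-tuned prediction toward an anchor (e.g., a neutral prediction).
    l_erase = F.mse_loss(pred_erased, pred_anchor)

    # (2) Attention regularizer: penalize attention mass on the concept tokens,
    #     selectively suppressing the undesirable activations.
    l_attn = attn_maps[..., concept_mask].mean()

    # (3) Self-contrastive / preservation term: on unrelated prompts, keep the
    #     tuned model close to the frozen original so other concepts survive.
    l_keep = F.mse_loss(pred_erased_unrelated, pred_orig_unrelated)

    return l_erase + lam_attn * l_attn + lam_keep * l_keep
```

Only the LoRA adapter parameters would receive gradients from this loss; the backbone transformer stays frozen, which is what keeps the erasure cheap and reversible.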

Main Results Comparison
Figure 1: Comparison with baselines.
Ablation Studies
Figure 2: EraseAnything on irrelevant concepts.
Supplementary Figures
Figure 3: Basic diagram of Flux.dev/schnell, backbone of EraseAnything.
LoRA Visualizations
Figure 4: Multi-concept erasure.
Additional Visualizations
Figure 5: More results.

BibTeX


      @inproceedings{gao2024eraseanything,
        title={EraseAnything: Enabling Concept Erasure in Rectified Flow Transformers},
        author={Gao, Daiheng and Lu, Shilin and Walters, Shaw and Zhou, Wenbo and Chu, Jiaming and Zhang, Jie and Zhang, Bang and Jia, Mengxi and Zhao, Jian and Fan, Zhaoxin and others},
        booktitle={International Conference on Machine Learning (ICML)},
        year={2025}
      }