2ndMatch: Finetuning Pruned Diffusion Models via Second-Order Jacobian Matching
Abstract
Diffusion models achieve remarkable performance across diverse generative tasks in computer vision, but their high computational cost remains a major barrier to deployment. Model pruning offers a promising way to reduce inference cost and enable lightweight diffusion models, but the reduced capacity of a pruned model degrades output quality. A key limitation of existing pruning approaches is that the pruned model is finetuned with the same objective as the dense model (denoising score matching). Because the dense model remains accessible during finetuning, a more effective strategy is to transfer its knowledge directly to the pruned model. Motivated by this, we propose 2ndMatch (2ndM), a general-purpose finetuning framework built around a second-order Jacobian matching loss inspired by Finite-Time Lyapunov Exponents. Through scalable random projections, 2ndM teaches the pruned model to mimic the sensitivity of the dense teacher, i.e., how its output responds to small perturbations over time. The framework is architecture-agnostic and applies to both U-Net- and Transformer-based diffusion models. Experiments on CIFAR-10, CelebA, LSUN, ImageNet, and MSCOCO demonstrate that 2ndM narrows the performance gap between pruned and dense models, substantially improving output quality.
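To make the core idea concrete, below is a minimal, hypothetical sketch of Jacobian matching via random projections: instead of forming full Jacobians, the teacher's and student's Jacobian-vector products (directional sensitivities) are compared along a few random directions. This is a simplified first-order illustration with toy networks, not the paper's actual second-order, FTLE-inspired loss or architecture.

```python
import torch
from torch.func import jvp

# Toy stand-ins for the dense teacher and pruned student (illustrative only;
# the paper's models are U-Net- or Transformer-based diffusion networks).
teacher = torch.nn.Sequential(torch.nn.Linear(8, 16), torch.nn.Tanh(), torch.nn.Linear(16, 8))
student = torch.nn.Sequential(torch.nn.Linear(8, 4), torch.nn.Tanh(), torch.nn.Linear(4, 8))

def jacobian_matching_loss(teacher, student, x, num_projections=4):
    """Match teacher/student input sensitivity along random directions.

    For each random unit vector v, compute the Jacobian-vector product
    J(x) @ v of both models via forward-mode AD and penalize the squared
    difference. This avoids materializing the full Jacobian.
    """
    loss = x.new_zeros(())
    for _ in range(num_projections):
        v = torch.randn_like(x)
        v = v / v.norm()                        # random unit direction
        _, jvp_teacher = jvp(teacher, (x,), (v,))  # teacher sensitivity along v
        _, jvp_student = jvp(student, (x,), (v,))  # student sensitivity along v
        loss = loss + (jvp_teacher - jvp_student).pow(2).mean()
    return loss / num_projections

x = torch.randn(2, 8)
loss = jacobian_matching_loss(teacher, student, x)
```

In practice this term would be added to the usual denoising score-matching objective during finetuning; the second-order variant proposed in the paper additionally captures how these sensitivities evolve over time.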