Balancing Forget Quality and Model Utility: A Reverse KL-Divergence Knowledge Distillation Approach for Better Unlearning in LLMs

Bichen Wang, Yuzhe Zi, Yixin Sun, Yanyan Zhao, Bing Qin

Paper Details:

Month: April
Year: 2025
Location: Albuquerque, New Mexico
Venue: NAACL
