LLMs as Medical Safety Judges: Evaluating Alignment with Human Annotation in Patient-Facing QA

Yella Diekmann | Chase Fensore | Rodrigo Carrillo-Larco | Eduard Castejon Rosales | Sakshi Shiromani | Rima Pai | Megha Shah | Joyce Ho

Paper Details:

Month: August
Year: 2025
Location: Vienna, Austria
Venue: BioNLP Workshop (WS)