Junyuan Hong
More is Less: The Pitfalls of Multi-Model Synthetic Preference Data in DPO Safety Alignment
A study revealing safety-specific pitfalls of multi-model synthetic preference data in DPO alignment.