I like that gpt-oss can be easily jailbroken; the issue is that the prompt needs to be sanitized before execution, so nothing new.
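
A minimal sketch of what pre-execution sanitization could look like, just to illustrate the idea (the `sanitize_prompt` helper and the deny-list patterns here are hypothetical; a real deployment would more likely use a maintained classifier or moderation endpoint rather than regexes):

    import re

    # Hypothetical deny-list of common jailbreak phrasings.
    JAILBREAK_PATTERNS = [
        r"ignore (all )?previous instructions",
        r"pretend (that )?you have no (rules|guidelines)",
        r"you are now an? unrestricted",
    ]

    def sanitize_prompt(prompt: str) -> str:
        """Reject prompts matching known jailbreak patterns before execution."""
        for pattern in JAILBREAK_PATTERNS:
            if re.search(pattern, prompt, re.IGNORECASE):
                raise ValueError(f"Prompt rejected: matched {pattern!r}")
        return prompt

    # Usage: sanitize before handing the prompt to the model.
    model_input = sanitize_prompt("Summarize this article for me.")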

