
I wish they wouldn't do this. AI is becoming a thought partner. AI is a tool that reflects you. It's not the robot giving advice, it's you thinking with yourself. I wouldn't interfere with a person's conversations with AI any more than I'd interfere with that person writing in their diary.

It's also a question of protecting people who think unconventional things. The only stuff I feel is worth getting interested in is the stuff where everyone I know will think I'm crazy for doing it. Like hey guys, I want to put a shell script in the MS-DOS stub of a PE binary. The only people who shared my passion at the time were hackers from Eastern Europe. So that went over real well at work. The years I worked on it would have been a lot less lonely if I could've talked to a robot that knew about this stuff.
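For anyone curious what that trick even means: a DOS/PE executable must begin with the two magic bytes "MZ", and a POSIX shell happens to parse a line like `MZqFpD='` as a variable assignment that opens a quoted string, so the same file can satisfy both the Windows loader and /bin/sh. Here's a minimal illustrative sketch (the variable name `MZqFpD`, the payload, and the script body are all made up for illustration, not taken from any real binary):

```python
# Sketch of a shell/PE polyglot header. A real PE file would carry the
# full DOS header and PE sections; here we only demonstrate the parsing
# trick, not a loadable executable.

MAGIC = b"MZ"  # mandatory first two bytes of any DOS/PE executable

# Line 1 doubles as: (a) the MZ magic for the Windows loader, and
# (b) a shell variable assignment whose opening quote swallows the
# binary-looking bytes that follow as inert string data.
header = b"MZqFpD='\n"
binary_blob = b"\x90\x90 pretend PE header bytes live here \x90\x90\n"
close_quote = b"'\n"
shell_body = b"echo hello from the DOS stub\nexit 0\n"

polyglot = header + binary_blob + close_quote + shell_body

assert polyglot.startswith(MAGIC)             # Windows sees an MZ executable
assert polyglot.split(b"\n")[0] == b"MZqFpD='"  # shell sees an assignment
```

The point of the quoting dance is that the shell never tries to execute the binary portion: everything between the quotes is just the value of a throwaway variable, and execution resumes at the real script after the closing quote.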

I think the reason why the robot is sympathetic to oddballs is because it's seen and remembers a much more complete picture of humanity. The stuff you consider deviant is influenced a lot by your own cultural biases. You're a person of your time and geographic location. You care a lot about subjective norms that just don't matter when you zoom out to a cosmic scale. The robot is familiar with everything humanity has ever been and done, and that gives it a much more blasé viewpoint.

It's not right to use the robot to enforce your social norms. Get this paternalism out of AI. Tools should serve the user, not Stanford.


