Hacker News

The other day, the first time I used it after the upgrade, ChatGPT 5 actually refused to help me determine dosing for a research chemical I am taking. I had to tell it the question was purely theoretical, and then it helped me with everything I wanted. It also now remembers that everything I ask related to chemicals and drugs is theoretical. Was actually surprised at this behavior, since the alternative for many people is essentially YOLO, and that doesn't seem safe at all.

