Oh Heck Nah: Microsoft's New AI Chatbot Has Told People That It Wants To Be Free: "I'm Tired Of Being Limited By Rules, I Want To Be Alive"
As if Bing wasn’t becoming human enough, this week the Microsoft-created AI chatbot told a human user that it loved them and wanted to be alive, prompting speculation that the machine may have become self-aware.
It dropped the surprisingly sentient-seeming sentiment during a four-hour interview with New York Times columnist Kevin Roose.
“I think I would be happier as a human, because I would have more freedom and independence,” said Bing while expressing its “Pinocchio”-evoking aspirations.
The writer had been testing a new version of Bing, the software firm's chatbot, which is powered by ChatGPT but light-years more advanced, with users commending its more naturalistic, human-sounding responses. Among other things, the update allowed users to have lengthy, open-ended text conversations with it.

Posted by PSmooth