OpenAI Employee Discovers Eliza Effect, Gets Emotional
Designing a program that can actually convince someone another human is on the other side of the screen has been a goal of AI developers since the idea first took steps toward reality. OpenAI recently announced that its flagship product, ChatGPT, is getting ears, eyes, and a voice in an effort to seem more human. Now, the head of safety systems at OpenAI says she became “quite emotional” after using the chatbot's new voice mode.
“I just had a pretty emotional one-on-one conversation with ChatGPT via voice about stress and work-life balance,” said Lilian Weng, head of safety systems at OpenAI, in a tweet posted yesterday. “Interestingly, I felt heard and warm. I've never tried therapy before, but this is probably it? Give it a try, especially if you normally use it as a productivity tool.”
Weng's experience as an OpenAI employee extolling the virtues of an OpenAI product should, of course, be taken with a grain of salt, but it reflects Silicon Valley's recent push to force AI into the mainstream of every corner of our plebeian lives. It also captures the “everything old is new again” quality of this phase of AI development.
The technological optimism of the 1960s spurred some of the first experiments in “artificial intelligence,” which attempted to mimic human mental processes with a computer. One of those early efforts was Eliza, a natural language processing program developed by Joseph Weizenbaum at the Massachusetts Institute of Technology.
Eliza ran a script called DOCTOR, a parody of the psychotherapist Carl Rogers. Instead of feeling stigmatized sitting in a stuffy psychiatrist's office, people could sit in front of an equally stuffy computer terminal and get help with their deepest problems. Eliza wasn't actually that smart, though: the script simply picked up on certain keywords and phrases and reflected them back at the user in an open-ended way, much as Carl Rogers himself might have. In a strange twist, Weizenbaum noticed that Eliza's users were developing emotional attachments to the program's rudimentary outputs; they felt, so to speak, “heard” and “warm,” to borrow Weng's words.
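To make that mechanism concrete, here is a minimal sketch of the kind of keyword-and-reflection trick the DOCTOR script relied on. The patterns, templates, and pronoun swaps below are illustrative assumptions for demonstration, not Weizenbaum's original rules:

```python
import random
import re

# Illustrative keyword -> response templates, loosely in the spirit of the
# DOCTOR script. These rules are made up for demonstration purposes.
RULES = [
    (r"\bI need (.+)", ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (r"\bI am (.+)", ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (r"\bmy (mother|father|family)\b", ["Tell me more about your {0}.", "How do you feel about your {0}?"]),
    (r"\bbecause\b", ["Is that the real reason?", "What other reasons come to mind?"]),
]
DEFAULTS = ["Please go on.", "How does that make you feel?", "I see. Tell me more."]

# Flip first/second person so reflected phrases read naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I", "your": "my"}


def reflect(phrase: str) -> str:
    """Swap pronouns in the captured phrase before echoing it back."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in phrase.split())


def respond(user_input: str) -> str:
    """Return the first matching keyword response, or a generic prompt."""
    for pattern, templates in RULES:
        match = re.search(pattern, user_input, re.IGNORECASE)
        if match:
            groups = [reflect(g) for g in match.groups()]
            return random.choice(templates).format(*groups)
    return random.choice(DEFAULTS)


if __name__ == "__main__":
    print(respond("I am stressed about work"))  # e.g. "Why do you think you are stressed about work?"
    print(respond("I need a vacation"))         # e.g. "Why do you need a vacation?"
```

Everything the program “understands” here is a surface-level pattern match; any sense of being listened to is supplied entirely by the person typing.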
“I did not realize that extremely brief exposure to a relatively simple computer program could cause severe delusions in ordinary people,” Weizenbaum later wrote in his 1976 book Computer Power and Human Reason.
That's not to say newer attempts at AI therapy haven't crashed and burned. Koko, a peer-to-peer mental health app, decided to experiment with artificial intelligence by offering AI-assisted counsel to 4,000 of its users. Company co-founder Rob Morris told Gizmodo earlier this year that “this will be the future.” Users acting as peer counselors could use Koko Bot, built on OpenAI's GPT-3, to generate responses that they could then edit, send, or reject entirely. The tool reportedly generated 30,000 messages and received positive responses, but Koko pulled the plug because the chatbot experience felt sterile. When Morris shared the experiment on Twitter (now known as X), the public backlash was overwhelming.
On the darker end of things, the widow of a Belgian man said earlier this year that her husband died by suicide after becoming engrossed in conversations with an AI chatbot that encouraged him to take his own life.
Last May, the National Eating Disorders Association took the drastic step of shutting down the hotline where people in crisis could turn for help, deciding instead to replace the hotline staff with a chatbot called Tessa. The layoffs came just four days after the workers unionized; staff had said they felt underfunded and overworked, which is particularly troubling for people working so closely with an at-risk population. After Tessa had been in use for less than a week, NEDA deactivated the chatbot. According to a post on the nonprofit's Instagram page, Tessa “may have provided information that was harmful and unrelated to the program.”