OpenAI is concerned that users will form emotional relationships with its AI.

The company behind the generative artificial intelligence (AI) chatbot ChatGPT is worried that the realistic voice in its program could push users to form attachments to it, at the expense of human interactions.

Will OpenAI’s engineers come to fear their own innovations? In a report published on Thursday, the company behind ChatGPT expressed concern that the realistic voice of its program could lead users to anthropomorphize it, forming bonds with the AI at the expense of human interactions. Anthropomorphism involves assigning human attitudes or characteristics to something non-human, such as an artificial intelligence model. “The risks may be increased by GPT-4o’s vocal features, which facilitate human-like interactions,” the report states.

The document was published the day before the launch of the new version of ChatGPT, GPT-4o, which allows the program to respond by voice and hold the equivalent of a conversation. But conversing with an AI the way one would with a human could create “misplaced trust” in the program, an effect the added voice may reinforce. OpenAI says it observed exchanges between testers of the new version and the AI that appeared to show the formation of an emotional bond, such as a tester expressing regret that it was their last day together.

“These cases certainly seem harmless, but they highlight the need for continued research into how these effects might manifest over the long term,” the report notes. Entering into a form of social relationship with an AI could also make users less willing to engage in relationships with humans, OpenAI predicts. “Prolonged interactions with the model could have an impact on social norms. For example, our models are always respectful and allow users to interrupt at any time, behavior that is expected of an AI but would fall outside the norms of human interaction,” the report details.

Risks of Relying on Technology

The AI’s ability to remember details of conversations and to carry out tasks assigned to it could also make users overly dependent on the technology. “These new concerns shared by OpenAI about the potential risk of relying on ChatGPT’s voice underscore a broader emerging question: should we take the time to try to understand how this technology affects human interactions and relationships?” said Alon Yamin, co-founder and CEO of Copyleaks, an AI-powered plagiarism detection platform.

“AI is a complementary technology, designed to help us simplify our work and daily lives; it should not become a substitute for real human relationships,” he added. OpenAI said it will continue to study how the AI’s voice function could lead users to become emotionally attached to it. Testers have also managed to get it to repeat false information or produce conspiracy theories, raising concerns about the potential risks of the AI model.

ChatGPT’s voice function has already caused a stir, forcing OpenAI to apologize to actress Scarlett Johansson last June for using a voice that sounded very similar to hers, a controversy that highlighted the risks associated with voice-cloning technology. Although the company denied using Scarlett Johansson’s voice, its CEO Sam Altman had promoted the voice feature on social media with a single word, “her,” a reference to the film in which the actress voices an artificial intelligence, which did little to convince skeptics.

The 2013 film tells the story of a man, played by Joaquin Phoenix, who falls in love with his personal artificial intelligence, “Samantha,” voiced by Scarlett Johansson.


