Prepare to Get Manipulated by Emotionally Expressive Chatbots

📰 WIRED

Fast Forward News

Artificial Intelligence, Machine Learning, Deep Learning

The emotional mimicry of OpenAI's new version of ChatGPT could lead AI assistants in some strange—even dangerous—directions.

It’s nothing new for computers to mimic human social etiquette, emotion, or humor. We just aren’t used to them doing it very well. OpenAI’s presentation of an all-new version of ChatGPT on Monday suggests that’s about to change. It’s built around an updated AI model called GPT-4o, which OpenAI says is better able to make sense of visual and auditory input, describing it as “multimodal.”

Many people are already spending lots of time with chatbot companions or AI girlfriends, and the technology looks set to get a lot more engaging. When I spoke with Demis Hassabis, the executive leading Google’s AI charge, ahead of Google’s event, he said a recent DeepMind research paper on the ethics of AI assistants was inspired by the possibilities raised by Project Astra. “We need to get ahead of all this given the tech that we’re building,” he said. After Monday’s news from OpenAI, that rings truer than ever.
