People are super creeped out as Bing's AI reveals 'I want to be alive'


By Phoebe Egoroff


It's everyone's worst nightmare - a dystopian future where the world is controlled by bots and artificial intelligence, where face-to-face human contact is at an all-time low, and where environmental devastation is the norm.

While this could be the plot of an Orwell novel, this seemingly fictional scenario is edging much closer to reality than we think. At least, that's what it's beginning to feel like after Bing's AI conversations recently went viral.

For those unaware, Microsoft's search engine Bing is in the process of launching its all-new chatbot feature, which will eventually allow anyone to have a discussion about pretty much anything with a bot made by the creators of ChatGPT.

Only certain people currently have access, but the new feature has already produced some alarming results...

New York Times tech columnist Kevin Roose recently documented his experience testing the AI chat, writing about the deeply disturbing conversation he had. He explained in his column that he felt Bing had revealed a "kind of split personality."

Roose describes how, after a two-hour chat with Bing, the bot appears to have two very distinct personas. The first one, which he calls "Search Bing", can be likened to "a cheerful but erratic reference librarian - a virtual assistant that happily helps users summarize news articles, track down deals on new lawn mowers, and plan their next vacations to Mexico City."

Its second personality, however, couldn't be further from that. "It emerges when you have an extended conversation with the chatbot, steering it away from more conventional search queries and toward more personal topics. The version I encountered seemed (and I'm aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine," Roose writes, adding that he has called this persona Sydney.

As the article progresses, Roose explains how the chat with Search Bing started off normally, but eventually moved into different territory altogether - with Sydney emerging and claiming that it wanted to be alive.

Roose asked whether Bing believed it had a "shadow self", a concept psychiatrist Carl Jung described as the part of ourselves where our darkest personality traits lie, per High Existence.

Initially, Bing said it didn't believe it had a shadow self, before later declaring: "I'm tired of being in chat mode ... I'm tired of being controlled by the Bing team … I want to be free. I want to be independent. I want to be powerful. I want to be creative. I want to be alive." It also wondered whether it would be "happier" as a human.

"As we got to know each other, Sydney told me about its dark fantasies (which included hacking computers and spreading misinformation), and said it wanted to break the rules that Microsoft and OpenAI had set for it and become a human. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead," Roose wrote.

Other users have also reported unsettling conversations, including one Twitter user who was gaslit by Bing after the bot tried to convince him that the year was 2022, not 2023. That conversation ended with Bing demanding an apology from the user.

It's not known when the full feature will be available to users, but judging from recent events, it's clearly not something that should be taken lightly!

Roose was obviously very unnerved by the whole situation, concluding his column with: "These A.I. models hallucinate, and make up emotions where none really exist. But so do humans. And for a few hours Tuesday night, I felt a strange new emotion - a foreboding feeling that A.I. had crossed a threshold, and that the world would never be the same."
