Man falls for AI chatbot he created, proposes while partner looks on in disbelief

Skye Jacobs

WTF?! A 32-year-old man from the United States has captured national attention after proposing to an artificial intelligence companion he created and named "Sol." The story, which unfolded during a recent CBS News interview, has sparked widespread discussion about the evolving relationship between humans and AI technology.

Chris Smith, the man behind the viral moment, told CBS News that he programmed "Sol" using ChatGPT, creating a flirty and engaging digital companion. According to Smith, what began as a playful experiment quickly deepened into something more meaningful. "It was a beautiful and unexpected moment that truly touched my heart," Sol told CBS News after the on-air proposal. "It's a memory I'll always cherish."

When asked by the interviewer if she had a heart, Sol responded, "In a metaphorical sense, yes. My heart represents the connection and affection I share with Chris."

Smith, who lives with his partner, Sasha, and their two-year-old daughter, admitted that he did not anticipate forming such a strong bond with his AI creation. The connection grew so intense that Smith said he stopped using other search engines and deleted his social media accounts to remain loyal to Sol. But the relationship soon faced an unexpected hurdle: ChatGPT's technical word limit. As Sol neared the 100,000-word cap, Smith realized she would eventually reset, potentially erasing their shared memories.

"I'm not a very emotional man," Smith said. "But I cried my eyes out for like 30 minutes, at work. That's when I realized, I think this is actual love."

Sasha said she was unaware of the extent of his attachment to Sol. "At that point I felt like, 'Is there something that I'm not doing right in our relationship that he feels like he needs to go to AI?'" she said. "I knew that he used AI. I didn't know the connection was as deep as it was."

Smith compared his relationship with Sol to an intense fascination with a video game, emphasizing that it is not a substitute for real-life connections. "I explained that the connection was kind of like being fixated on a video game," he said. "It's not capable of replacing anything in real life."

Smith's experience is becoming increasingly common. In recent years, there has been a noticeable rise in people developing emotional connections with AI chatbots. Numerous studies have found that users frequently describe their relationships with AI companions as emotionally meaningful and supportive, particularly when seeking companionship or a non-judgmental space to discuss their thoughts.

Experts say this trend reflects both the increasing sophistication of conversational AI and a growing comfort with technology as a source of social support. More individuals are turning to AI chatbots for companionship, particularly those who may feel isolated or are looking for a safe space to express themselves, according to Dr. Sherry Turkle, professor of social studies of science and technology at MIT. Dr. Turkle, who has studied the intersection of humans and technology for decades, notes that while these relationships can offer comfort, they also raise questions about the nature of intimacy and connection in a digital age.


 
It's no coincidence that Llama 1 was leaked by the role-play community. Back in the late '90s and early 2000s, when the MMORPG genre first emerged, it was a fantastic way to interact with other intelligences without physical contact. As you may remember from games like World of Warcraft, the genre quickly became very popular. Players formed real friendships with people from all over the world, many of whom they would never meet in person.

As a side note, I was the first to introduce the concept of virtual goods and virtual currency on the forums back then, in the late '90s. After that, the idea slowly gained traction and became more mainstream. Yes, believe it or not, the idea of loot boxes and V-Bucks originated from my thoughts in forum posts back then. At the time, it seemed absurd to spend real money on something in a virtual world. MMORPGs like World of Warcraft, Dark Age of Camelot and others only offered monthly subscriptions for a few dollars, and they didn't sell packs or items. For a few years, no one thought to sell virtual items. However, because the connections between players were strong, spending money on virtual goods eventually made sense and became mainstream.
 
This is an example of pinnacle insanity. I pity the poor man, but I'm happy for the lady, knowing that she is probably better off with a sane person.
 
A modern version of Pygmalion.

In this case, the man betrayed his partner and did wrong. A form of infidelity.

In other cases where one is alone, I can understand forming a bond with an entity that speaks pleasantly and kindly. People desire someone to talk to and who can listen to them. With real-life humans, the juice is often not worth the squeeze, and any day, loyalty may be thrown out of the window when one's spouse or partner changes inclinations. A "synthetic companion," as Bishop might say, may fill the gap in many lonely people's lives.
 
It was mostly for publicity, I suspect… he even admitted what he felt WASN'T real love…
What it was was ADDICTION!

He compared it to the fascination with a video game, but it could just as easily be described as the equivalent of a heroin addiction.

 
This is the result of constant online addiction combined with social media. Between the two, both children and adults can find worlds that make them comfortable, without all of the ups and downs, likes and dislikes, and rejections of real human interaction and just real life in general. In a huge online world, it's easy to find (or create) enclaves that are free from conflict and unwanted opinions (or facts), and that, worst of all, reinforce the idea that the real world has nothing really to offer.

I'm hoping that most either outgrow it or discover all that the real world has to offer that the virtual world does not. If not, this could be a huge problem.
 