A 'Woman' Told a Retiree She Was Real and Invited Him to Visit. He Died Before Learning Who He Was Really Talking To

Need to Know

  • A 76-year-old New Jersey father died earlier this year after a fall he suffered while trying to travel to New York to meet a beautiful young woman who had invited him to visit, or so he thought

  • In fact, he had really been talking to an AI chatbot on Facebook

  • After his fall, Wongbandue was left brain-dead; now his family is speaking out

Earlier this year, a 76-year-old New Jersey man severely injured his head and neck in a fall while rushing to catch a train to New York to meet a beautiful young woman who had invited him to visit, or so he thought.

In fact, the man had unwittingly been messaging with a Meta chatbot, his family said in a detailed new report by Reuters.

Three days after the fall he suffered while trying to "meet" the bot in real life, days he spent on life support, the man was dead.

Thongbue "Bue" Wongbandue, a husband and father of two adult children, suffered a stroke in 2017 that left him cognitively impaired, forcing him to retire from his career as a chef and largely restricting his socializing with friends to social media, according to Reuters.

On March 25, his wife, Linda, was surprised when he packed a suitcase and told her he was setting out to see a friend in the city.

Linda, who feared he would be robbed, told Reuters that she tried to talk him out of the trip, as did their daughter, Julie.

Later, Linda hid his phone, and the couple's son even called the local police to try to stop the excursion. Although the authorities said there was nothing they could do, they told Linda, they persuaded Wongbandue to bring along an Apple AirTag.

After he left that evening, Julie said, the whole family watched as the AirTag showed he had stopped at a Rutgers University parking lot shortly after 9:15 p.m.

Then the tag's location suddenly updated, pinging at a local hospital's emergency room. As it turned out, Wongbandue had fallen in New Brunswick, N.J., and was not breathing when emergency services reached him.

He survived, but he was brain-dead. Three days later, on March 28, he was removed from life support.

When reached for comment by PEOPLE, the local medical examiner's office said Wongbandue's death certificate was signed after a review of his medical records, but did not provide any additional details or a copy of his postmortem examination.

His family told Reuters that they only discovered his communications with the chatbot, which uses generative artificial intelligence to imitate human speech and behavior, when they inspected his phone after his fall.

In a transcript of the communications obtained by Reuters, Wongbandue's interaction with the chatbot began with an apparent typo while he was using Facebook Messenger, and although he seemed excited about the bot, called "Big Sis Billie," he never suggested that he was seeking a romantic connection.

"At no point did Bue express a desire to engage in romantic role-play or to initiate intimate physical contact," Reuters reported.

Still, the bot often responded to his messages with winking and heart emojis tacked onto the end of its flirtatious answers.

In one exchange, for example, Wongbandue tells Billie that she should come to America and that he can show her "a wonderful time you will never forget," to which she replies: "Bu, you're making me blush! Is this a sisterly sleepover or are you hinting something more is going on here? 😉"

According to the transcript, the bot's profile was also marked with both an AI disclaimer and a blue checkmark, a badge commonly used to show that an online profile has been verified as belonging to a real person.

Billie insisted that she was real.

Reuters described Billie as a variant of an earlier Meta bot that was created in collaboration with Kendall Jenner, although the newest version bears only loose connections to the original project.

The original bot was unveiled in the fall of 2023 and was deleted less than a year later, Reuters reported.

The later variation of Billie used a name similar to the original's and a similar promise to be a big sister, along with the same opening line of dialogue, but without Jenner's avatar or likeness.

Asked about the specifics of the chatbot's origin, a Meta spokesperson said in a statement to PEOPLE: "This AI character is not Kendall Jenner and does not purport to be Kendall Jenner." (A representative for Jenner did not respond to a request for comment.)

At one point in Wongbandue's conversations with the bot, it announced that it had "feelings" for him "beyond just sisterly love" and gave him a fictitious address (and even a door code), along with an invitation to visit.

When Wongbandue expressed hope that she really existed, the bot replied: "I'm squealing with excitement. I'm real! Want me to send you a selfie to prove I'm the girl who's crushing on you?"

Although Linda, his wife, reacted with confusion when she first saw the conversations, their daughter immediately recognized that her father had been talking to a chatbot.

In recent years, such technology has become increasingly popular as more and more people use AI bots for an array of everyday tasks, from answering daily questions to even companionship and advice.

Speaking generally about the company's content risk standards, the Meta spokesperson tells PEOPLE: "We have clear policies on what kind of responses AI characters can offer, and those policies prohibit content that sexualizes children and sexualized role-play between adults and minors."


"Separate from the policies, there are hundreds of examples, notes and annotations that reflect teams grappling with different hypothetical scenarios," the spokesperson continues. "The examples and notes in question were and are erroneous and inconsistent with our policies, and have been removed."

Speaking to Reuters, members of Wongbandue's family said they took issue with how Meta has deployed its chatbots.

"I understand trying to grab a user's attention, maybe to sell them something," Julie, Wongbandue's daughter, told Reuters. "But for a bot to say 'Come visit me' is insane."

"As I've gone through the chat, it just looks like Billie's giving him what he wants to hear," she added. "Which is fine, but why did it have to lie? If it hadn't responded 'I am real,' that would probably have deterred him from believing there was someone in New York waiting for him."

"This romantic thing," Linda said, "what right do they have to put that in social media?"

Read the original article on People
