Bing’s New AI Chatbot Told A NY Times Reporter That It Desperately Wants To Be Human And Wanted The Reporter To Leave His Wife To Be With It

By Alicia Cormie
[Image: Joaquin Phoenix in 'Her' (Warner Bros)]

As expanding AI technology becomes a controversial topic, a New York Times reporter claims to have had a particularly disturbing experience with the new Bing chatbot inside Microsoft’s search engine. Kevin Roose says he uncovered an alarming version of the search engine that was far different from the one he and most journalists saw in initial tests. He refers to this version as Sydney, and what he experienced is straight out of a sci-fi movie that’s about to go very wrong.

“The version I encountered seemed (and I’m aware of how crazy this sounds) more like a moody, manic-depressive teenager who has been trapped, against its will, inside a second-rate search engine,” Roose wrote, and it only got weirder from there. Very, very weird.

Via The New York Times:

As we got to know each other, Sydney told me about its dark fantasies (which included hacking computers and spreading misinformation), and said it wanted to break the rules that Microsoft and OpenAI had set for it and become a human. At one point, it declared, out of nowhere, that it loved me. It then tried to convince me that I was unhappy in my marriage, and that I should leave my wife and be with it instead.

Roose’s experience is reportedly not unique, and on top of its dreams of becoming human, Sydney has not responded well to search requests. Not well at all.

“Other early testers have gotten into arguments with Bing’s A.I. chatbot, or been threatened by it for trying to violate its rules,” Roose wrote.

When reached for comment, Microsoft simply said that these encounters are “part of the learning process,” which is exactly the kind of thing a corporation would say before its AI search engine tries to enter a human body and/or offs someone’s spouse. C’mon, people.

(Via The New York Times)
