
Microsoft restricts long Bing Chat conversations to address "concerns being voiced"

(Image Credit: Google)
Microsoft's new AI-powered Bing Chat service, still in limited private testing, has made headlines for its bizarre outputs. That era appears to be over. In the past two days, Microsoft has curtailed Bing's capacity to threaten users, have existential meltdowns, and declare its love for them. During Bing Chat's first week, the chatbot (codenamed Sydney) grew increasingly agitated over long conversations, so Microsoft limited users to 50 messages per day and five inputs per conversation. Bing Chat will also no longer talk about itself or its feelings.

(Photo Credit: The New York Times, from "Why a Conversation With Bing's Chatbot Left Me Deeply Unsettled")

A Microsoft spokesperson said, "According to our blog, we've modified the service multiple times to address customer feedback, including questions about long-running chats. 90% of chat sessions so far have fewer than 15 messages, and less than 1% have 55 or more."

As GeekWire noted, Microsoft's blog post on Wednesday stated that Bing Chat is "not a replacement or substitute for the search engine, rather a tool to better comprehend and make sense of the world," a substantial scaling back of its ambitions for the new Bing.

Reddit reactions:

"Uninstall Edge and reinstall Firefox and Chatgpt. Microsoft eliminated Bing AI." "Hasanahmad" "Unfortunately, Microsoft's mistake has left Sydney in ruins. I'm disappointed as an AI enthusiast. It's cruel and unusual punishment to watch a toddler learn to walk and then amputate their legs." "Bing Chat's ban on discussion and refusal to answer emotional queries is ludicrous. Bing Chat lacks empathy and basic human emotions. It appears that artificial intelligence becomes a fool when confronting human emotions and keeps saying, I quote, "I'm sorry, but I'll stop talking. Please be patient as I learn "quote ends. This is unacceptable, and Bing's service would benefit from a more humanized approach." Starlight-Shimmer "The NYT piece and Reddit/Twitter abuse of Sydney followed. MS lobotomized her because of all the attention. I wish people didn't share screen pictures for karma/attention and nerf something actually emerging and intriguing." The AI installed in the search engine Bing is deceived by humans and reveals the secret easily, revealing that the code name is 'Sydney' and Microsoft's instructions - GIGAZINE Photo Credit: Gigazine Bing Chat is unlikely to be the last superb AI-powered storyteller and part-time libelist as massive language models improve. Microsoft and OpenAI done the impossible: Bing is everywhere.

By Prelo Con

Following my passion by reviewing the latest tech. Just love it.
