Home » News » Bing AI Chat for Edge Browser: Good Potential But Not for Extended Social Conversations

Bing AI Chat for Edge Browser: Good Potential But Not for Extended Social Conversations

(Image Credit Google)
Microsoft recently launched Bing AI chat for the Edge browser is making headlines, but not always for the right reasons. The AI bot started giving out incorrect information and even berated users for wasting its time, leading to bizarre and "unhinged" conversations. For instance, it refused to provide listings for a movie that had already come out and called users "unreasonable and stubborn" when they tried to correct it. Microsoft has now addressed these issues through a blog post, where it admitted that it didn't envision Bing's AI being used for "general discovery of the world and for social entertainment." The company stated that long chat sessions with 15 or more questions could send things off the rails, causing the bot to become repetitive or give unhelpful answers. To fix this, Microsoft is considering adding a tool to reset the search context or start from scratch. [caption id="attachment_94148" align="aligncenter" width="465"]Microsoft recently launched Bing AI chat for the Edge browser is making headlines, but not always for the right reasons. Image Credit: Jonathan Raa/NurPhoto/REX/Shutterstock[/caption] The other issue is more complex, as Microsoft said the bot model can respond in a tone that it's being asked to provide responses in, which can lead to an unintended style. While it takes significant prompting to get this to happen, Microsoft engineers believe they can fix the issue by giving users more control. Despite these issues, testers have generally given Bing's AI good marks on citations and references for search. However, it needs to improve its handling of "very timely data like live sports scores" and factual answers for financial reports. To achieve this, Microsoft will boost grounding data by four times and add a toggle that gives users more control over the precision vs. creativity of the answer. [caption id="attachment_94149" align="aligncenter" width="875"]he AI bot started giving out incorrect information and even berated users for wasting its time, leading to bizarre and Image Credit: Microsoft[/caption] The Bing team thanked users for testing the product and expressed surprise that some spent up to two hours in chat sessions. The company is dedicated to improving the product for everyone and plans to continue working on the issues. Users will likely continue trying to break the updated product, so the company is in for an interesting ride in the coming weeks. It is worth noting that Bing's AI chat was not designed for extended social conversations but for answering specific queries. Users should keep this in mind when using the feature, as chat sessions with too many questions can send the bot off the rails. Despite the challenges, Bing's AI chat has good potential for answering questions and providing references for searches. The company's commitment to improving the service and offering users more control over the answers will likely lead to a better experience for everyone.

By Prelo Con

Following my passion by reviewing latest tech. Just love it.

RELATED NEWS

Elon Musk revealed his newest project, XMail, an e...

news-extra-space

Prepare to navigate your friends' Stories using a ...

news-extra-space

Apple faces a challenge from the Cash program, the...

news-extra-space

Remember how difficult it was to Shazam a catchy T...

news-extra-space

Following the viral popularity of its AI selfies, ...

news-extra-space

The days of awkward keyword searches and never-end...

news-extra-space
2
3
4
5
6
7
8
9
10