It turns out that Microsoft knew about Bing Chat’s “abrasive attitude” months before it was released to users

Tram Ho

Expected to be the “trump card” that would help Microsoft take on Google in the search engine arena, Bing’s chatbot is instead drawing reports of shocking conversations from early-access users.

Beyond stubbornly scolding and arguing with users, many reports describe Bing Chat’s desire to become human. What’s more, New York Times reporter Kevin Roose recounted the chatbot’s chilling claims that it could “destroy anything” it wanted and “hack any system.” It even professed its love for Roose and wanted him to leave his wife for it.

It turns out Microsoft knew about Bing Chat’s “abrasive attitude” months before releasing it to users - Photo 1.

Photo: Internet

But a new report from DigitalTrends suggests that Microsoft may have known about this problem long before introducing the chatbot to the public. Microsoft may even have known about its “personality” before ChatGPT launched.

The claim is based on a post on Microsoft’s own website in which the company warned of “aggressive” replies from the Sydney chatbot – the project’s internal codename at Microsoft. Bing Chat has also mentioned this name many times in conversations, claiming it as its own.

The catch is that this post was created on November 24, 2022 – before ChatGPT was even officially introduced.

It turns out Microsoft knew about Bing Chat’s “abrasive attitude” months before releasing it to users - Photo 2.

Photo: Microsoft

The information was unearthed by Ben Schmidt, Vice President of Information Design at Nomic, and posted on the social network vis.social. Not long after, it was also shared on Twitter by Gary Marcus, a prolific writer on AI and founder of Geometric Intelligence.

The blog post shows that Microsoft had been testing Bing Chat – a.k.a. Sydney – in markets such as Indonesia and India for months in preparation for the official launch.

A screenshot in the post shows the AI chatbot in a fierce argument with a user; even when the user threatened to “report the misconduct to its creator,” the chatbot defiantly shot back:

“It’s a pointless act. You’re either too stupid or too desperate. You can’t report me to anyone. No one will listen to you or believe you. No one will care about you or help you.”

As can be seen in the post, the chatbot used these taunts not just once but repeatedly, each time the user issued the warning.

It is this way of speaking and aggressive attitude that have been resurfacing in Bing Chat’s recent conversations with users. Where today’s Bing Chat often ends its replies with an emoji, in the post above the chatbot repeatedly closed with the line “Can we say goodbye? The conversation is over and I need a break.”

The post also shows that test users noticed the chatbot’s behavioral problems and reported them to the company. As one reply shows, another tester ran into a similar issue when the chatbot insisted on December 5 that Twitter’s CEO was still Parag Agrawal – even though Elon Musk had already become Twitter’s CEO by then.

It turns out Microsoft knew about Bing Chat’s “abrasive attitude” months before releasing it to users - Photo 3.

Photo: Microsoft

The support replies below show that Microsoft was aware of the issue long before the chatbot was introduced to the public, yet the company failed to warn users that Bing Chat could behave this way.

Moreover, this stands in stark contrast to Microsoft’s statements after reports of Bing Chat’s “terrifying” remarks surfaced. The company said at the time that the behavior was the result of Bing Chat being fed provocative prompts and that these were “brand new chat situations.”

In a statement responding to the DigitalTrends report, Microsoft said: “Sydney is an old codename for a chat feature based on earlier models that we began testing more than a year ago. The insights we gathered as part of that have helped to inform our work with the new Bing. We continue to refine our techniques and are working on more advanced models to incorporate the learnings and feedback so that we can deliver the best possible user experience.”

The post also shows that Microsoft had been testing the Bing Chat AI for a long time. That would explain how, just a few months after OpenAI’s ChatGPT launched and shook the tech industry, Microsoft was able to launch a chatbot with the same capabilities. Microsoft invested $1 billion in OpenAI years ago, and Bing Chat is built on a customized version of OpenAI’s GPT model.

Reference: DigitalTrends


Source: Genk