Microsoft admits long conversations with Bing’s ChatGPT mode can send it haywire

Microsoft’s new ChatGPT-powered Bing has gone haywire on several occasions during the week since it launched – and the tech giant has now explained why.

In a blog post titled “Learning from our first week”, Microsoft admits that “in long, extended chat sessions of 15 or more questions” its new Bing search engine can “become repetitive or be prompted/provoked to give responses that are not necessarily helpful or in line with our designed tone”.
