Users say Microsoft's Bing chatbot gets defensive and testy
Microsoft's fledgling Bing chatbot can go off the rails at times, denying obvious facts and chiding users, according to exchanges shared online by developers testing the AI creation. Posts in a Reddit forum devoted to the chatbot included screenshots of exchanges with the souped-up Bing, and told of stumbles such as insisting that the current year is 2022 and telling one user they had "not been a good user" for challenging its veracity.