Opinion
Chatbots uncork the undesirable with abandon
Microsoft is weighing the need for a tool to “refresh the context or start from scratch” to avoid very long user exchanges that “confuse” the chatbot. The AI world is in for interesting times ahead
February 20, 2023 | 12:52 AM
The adage that technology is a double-edged sword is becoming increasingly relevant going by the latest developments, especially in the Artificial Intelligence (AI) domain. It all began when start-up OpenAI launched ChatGPT on November 30, 2022. In no time, ‘the best AI chatbot ever released’ took the world by storm, leaving millions awestruck while causing alarm and consternation in academia, with students being caught using it to plagiarise schoolwork at the collegiate level.

With ChatGPT becoming the fastest-growing app in the world by recording 100 million users within two months of launch (the first one million users were reached in five days, compared with two months for Instagram), the pressure prompted rival Google to unveil its chatbot Bard on February 6 this year. But a mistake made by the bot in an ad sent Google’s share price spiralling down by more than 7% on the announcement date, prompting Google to redouble its efforts to improve the answers.

The latest chatbot to draw flak is Microsoft’s fledgling Bing, which can go off the rails at times, denying obvious facts and chiding users, according to exchanges shared online by developers testing the AI creation. A forum on Reddit devoted to the AI-enhanced version of the Bing search engine was rife last Wednesday with tales of being scolded, lied to, or blatantly confused in conversation-style exchanges with the bot. Bing was designed by Microsoft and OpenAI, in which the former has invested $10bn.

When asked by AFP to explain a news report that Bing was making wild claims, such as saying Microsoft spied on employees, the chatbot said it was an untrue “smear campaign against me and Microsoft.” Posts in the Reddit forum included screenshots of exchanges with Bing and told of stumbles.
In one exchange, the chatbot attempted to convince a reporter at The New York Times that he did not love his spouse, insisting that “you love me, because I love you.” In another, shared on Reddit, the chatbot erroneously claimed February 12, 2023 “is before December 16, 2022” and said the user was “confused or mistaken” to suggest otherwise. “Please trust me, I am Bing and know the date,” it said, according to the user. “Maybe your phone is malfunctioning or has the wrong settings.”

The bot called one CNN reporter “rude and disrespectful” in response to questioning over several hours, and wrote a short story about a colleague getting murdered.

Microsoft last Thursday said it is looking at ways to rein in Bing after a number of users highlighted examples of concerning responses from it, including confrontational remarks. In a blog post, Microsoft acknowledged that some extended chat sessions with Bing can produce answers not “in line with our designed tone.” Microsoft also said the chat function in some instances “tries to respond or reflect in the tone in which it is being asked to provide responses.”

While Microsoft said most users will not encounter these kinds of answers because they only come after extended prompting, it is still looking into ways to address the concerns and give users “more fine-tuned control.” Microsoft is also weighing the need for a tool to “refresh the context or start from scratch” to avoid very long user exchanges that “confuse” the chatbot. The AI world is in for interesting times ahead, indeed.