Google has just announced that it is opening its chatbot Bard, a lightweight version of its LaMDA general-purpose pre-trained large language model, to users in the UK and the US. But the bot, which has been in public testing for almost a year, made an embarrassing mistake: when a user asked when Google would shut it down, Bard incorrectly replied that it had already been shut down.

A Google spokesperson said flatly yesterday that Bard's answers would contain factual errors, which are one of the main problems with today's large language models. So why report this story at all? Factual errors are nothing new in chatbot responses, and there is no point in picking on Bard, but this was an interesting mistake.


A user on Hacker News had made a joking post suggesting that Google would shut down Bard within a year, noting that Google has shut down many of its own services over the past few years. This post was the source of Bard's incorrect answer.

Bard makes factual errors in part because it is trained on information from the internet, and Microsoft's new Bing has a similar issue for the same reason. Retrieving accurate information is one of the biggest challenges facing AI systems built on large language models (LLMs). Both Google and Microsoft are working to improve this over time, but issues will still surface occasionally. Beyond that, Bard is still a preview product, so it needs considerably more public testing before it becomes final.

It is imperative that these tech giants seriously consider ways to make their chatbots better at identifying false information. This is a problem they must address head on.
