Meta tests a new chatbot

BlenderBot 3 is a general-purpose chatbot that is presently only available in the United States.

Aug 6, 2022 - 01:42

By design, BlenderBot 3 attempts both to hold the kind of general conversation one might strike up at a bar and to answer the type of questions typically posed to digital assistants.

BlenderBot, like all large language models (LLMs), has been trained on enormous amounts of text in order to learn the statistical patterns that ultimately shape the AI's replies. Such systems have proven incredibly adaptable, with applications ranging from producing code for programmers to helping authors write their next best-seller.
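To give a sense of what "learning patterns from text" means, here is a deliberately tiny sketch: a bigram model that records which word tends to follow which, then walks those transitions to produce text. Real LLMs such as BlenderBot 3 use neural networks trained on billions of tokens, so this stdlib-only toy illustrates the principle, not the actual system.

```python
import random
from collections import defaultdict

# Toy corpus standing in for the "enormous amounts of text" an LLM sees.
corpus = (
    "the chatbot answers questions . the chatbot searches the internet . "
    "the assistant answers questions about the internet ."
)

# Record, for each word, which words follow it in the corpus.
follows = defaultdict(list)
tokens = corpus.split()
for current, nxt in zip(tokens, tokens[1:]):
    follows[current].append(nxt)

def generate(start: str, length: int = 8) -> str:
    """Walk the learned word-to-word transitions to produce text."""
    word, out = start, [start]
    for _ in range(length):
        if word not in follows:
            break
        word = random.choice(follows[word])
        out.append(word)
    return " ".join(out)

print(generate("the"))  # e.g. "the chatbot searches the internet . the ..."
```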

However, these models have serious flaws: they absorb biases from their training datasets, and when they do not know the correct answer to a question, they tend to make one up.

Here Meta deserves some credit, because the goal of BlenderBot is to test a possible solution to the problem of fabricated responses. One notable feature of this chatbot is its ability to search the Internet for information on specific topics.

Even better, users can click on its responses to see where it obtained its information. BlenderBot 3 can, in other words, cite its sources, offering welcome transparency.
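The article does not describe BlenderBot 3's internal pipeline, but the general retrieve-then-cite pattern can be sketched as follows. The search_web and generate_answer helpers are hypothetical stand-ins implemented as stubs; only the shape of the pattern, fetching sources first and attaching their URLs to the reply, is the point.

```python
from dataclasses import dataclass

@dataclass
class Document:
    url: str
    snippet: str

def search_web(query: str) -> list[Document]:
    # Hypothetical stub: a real system would query a search engine here.
    return [Document("https://example.com/blenderbot",
                     "BlenderBot 3 is Meta's conversational AI prototype.")]

def generate_answer(query: str, sources: list[Document]) -> str:
    # Hypothetical stub: a real system would condition a language model
    # on the retrieved snippets instead of echoing the first one.
    return sources[0].snippet

def answer_with_citations(query: str) -> str:
    sources = search_web(query)
    answer = generate_answer(query, sources)
    citations = ", ".join(doc.url for doc in sources)
    # Attaching the URLs is what lets users click through and verify.
    return f"{answer} [sources: {citations}]"

print(answer_with_citations("What is BlenderBot 3?"))
```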

So, at least in theory, Meta's starting point is sound. The problem is that chatbots, as we know them today, have two shortcomings. The first is that their learning is continuous, so it takes only a large enough number of users deciding to instill a malicious bias for the AI to become "infected", if it lacks the necessary safeguards to prevent it.
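The article does not say what safeguards Meta uses; purely as an illustrative assumption, one common ingredient is to filter user messages before they are allowed to feed back into training, as in this hypothetical sketch:

```python
# Purely illustrative: gate which user messages may feed back into training.
# The blocklist approach and its entries are assumptions for the sketch,
# not Meta's actual safeguards.
BLOCKED_TERMS = {"badword1", "badword2"}  # placeholder entries

def safe_for_training(message: str) -> bool:
    """Reject messages containing any blocked term."""
    return not (set(message.lower().split()) & BLOCKED_TERMS)

incoming = ["Tell me about space travel", "badword1 badword2 nonsense"]
training_pool = [m for m in incoming if safe_for_training(m)]
print(training_pool)  # only the benign message survives the filter
```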

The second issue is related: this type of algorithm behaves like a closed, opaque box, offering no view of what happens inside. As a result, the people in charge must rely solely on constant observation of the AI; they cannot "lift the hood" to see what is going on, which makes problems difficult and time-consuming to identify and, in many cases, impossible to solve.

Post by Bryan C.