When we meet a person, we form an initial opinion within a few moments. Posture and facial expression give us an impression, which is then complemented by behaviour and expression. Whether we consider someone likeable, competent, or trustworthy often depends on the little things.

And the same is true for chatbots. Since they communicate in a human way – namely through language – we judge them by the same characteristics. That is why it is important to pay attention not only to what a chatbot says (content) but also to how it says it (personality). But what kind of chatbot personality is best? What are the most important factors?

We investigated these questions in cooperation with the University of Würzburg. Liibot, a chatbot for knowledge bases in the IT environment, served as the test object. The surprising findings are summarised in this article.

Many thanks to Fiona Wiederer for the design, implementation and evaluation of the study.


How does the chatbot personality make itself felt?

In the same way as with people: through clothing, behaviour, and manner of expression.

The “clothing” of a chatbot is the chat window. Bright and colourful or subtle and understated? Square or rounded? Times New Roman or Calibri? The chatbot icon is also important. A neutral speech bubble evokes different associations than a grinning mascot.

A chatbot's behaviour is mainly a matter of timing. A bot that addresses the user of its own accord may seem more helpful – or, if the timing is bad, more intrusive – than a bot that waits until it is addressed. Timing also matters during the conversation. How long does it take to get an answer? When are follow-up questions asked, suggestions made, and tips given? When is the inquirer passed on to a human colleague?

Manner of expression is the third aspect of the chatbot personality. This was the focus of our study. Detailed or concise, casual or formal, personal or mechanical?

As an illustration, one variant of the Liibot uses phrases such as “Let’s have a look…”, incorporates smileys into its answers and speaks in the first person (“I can find pages and answer FAQs.”). This variant expresses views such as: “I suggest writing an email to the service desk.”

We contrasted this with a serious, goal-oriented bot. This bot always chooses the shortest possible wording and refers to itself as “This bot”. Its answers consist solely of facts and instructions rather than recommendations and suggestions.
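To make the contrast concrete, such manner-of-expression variants can be captured in a small persona configuration that wraps the same factual content. This is a minimal sketch under our own assumptions – the study does not publish Liibot’s implementation, and all names and fields below are illustrative:

```python
from dataclasses import dataclass

@dataclass
class Persona:
    """Hypothetical persona settings; field names are illustrative, not Liibot's."""
    self_reference: str  # how the bot refers to itself ("I" vs "This bot")
    use_emoji: bool      # whether to sprinkle smileys into answers
    opener: str          # casual lead-in phrase; empty for the terse variant

def render_answer(persona: Persona, fact: str) -> str:
    """Wrap the same factual content in the persona's manner of expression."""
    parts = []
    if persona.opener:
        parts.append(persona.opener)
    parts.append(fact.replace("{self}", persona.self_reference))
    text = " ".join(parts)
    return f"{text} 🙂" if persona.use_emoji else text

social = Persona(self_reference="I", use_emoji=True, opener="Let's have a look...")
task = Persona(self_reference="This bot", use_emoji=False, opener="")

fact = "{self} can find pages and answer FAQs."
print(render_answer(social, fact))  # Let's have a look... I can find pages and answer FAQs. 🙂
print(render_answer(task, fact))    # This bot can find pages and answer FAQs.
```

The point of this design is that both variants deliver identical facts; only the wrapper differs – which is exactly the manipulation the study relied on.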


What effect does a different chatbot personality have?

This has been investigated in a whole series of studies (Jain et al. 2018; Chaves & Gerosa 2020; Ruane et al. 2021; …). In general, the effect of a personality trait depends on the domain. This means that, depending on the purpose of the bot and the user group, the same personality can create different impressions. For example, different qualities are desired for financial advice compared to fashion advice.

Users evaluate chatbots (like all media) based on certain characteristics. Above all, these include:

  • Usefulness (To what extent am I achieving my goal?)
  • Usability (How difficult is it to use the medium?)

These factors can also be combined into one question: Does the chatbot increase my productivity? This is the most important consideration when it comes to user satisfaction. Other influencing factors are entertainment value and general social motives (e.g. seeking affirmation).

In other words, a bot that is too serious may not achieve its full potential because it comes across as boring. A bot that utters too many motivational phrases and jokes runs the risk of being used only as a toy. There is no universally optimal ratio between friendly small talk and task-oriented work ethic. Different users have different preferences. For this reason, it is important to adapt the bot not only to the task but also to the user group.


What personality should an IT chatbot have?

To find out, we gave the Liibot chatbot two different personalities. Liibot helps users from different IT sectors (DevOps, Service Desk, Management, …) to find information in an internal knowledge base. Technically, the two versions of the bot are identical. They use the same language model, access the same data and use the same interface. The course of the conversation is also the same. Apart from the differences in the manner of expression already described, there are two small differences in behaviour. Firstly, the avatar of the socially oriented bot changes its facial expression depending on the situation, while the task-oriented bot always maintains a neutral expression. Secondly, the social bot sometimes answers with a delay of up to one second, while the task-oriented bot always answers as quickly as possible.

Eighty test subjects (40 per bot variant) were then asked to solve a series of tasks with the help of the Liibot. Beforehand, they were asked about their attitude towards chatbots and afterwards about their evaluation of the Liibot.

The hypothesis

The initial assumption was that

  • the social bot would be rated as more entertaining and socially competent;
  • the task-oriented bot would be perceived as more competent, useful and easier to use.

Productivity is generally considered the most important quality of a medium. For IT companies in particular, efficiency plays a major role. Emotional aspects are considered less relevant. That’s why we also assumed that the task-oriented bot would perform slightly better overall than the social one.

The result

The first assumption turned out to be true. The social bot was rated as significantly more entertaining than the task-oriented bot. Social competence was likewise rated significantly higher for the social bot.

In terms of professional competence and usefulness, the two bot variants were rated equally. Since the bots conveyed the same content, this may not be too surprising. While it is true that personality can also influence perceived efficiency, this seems to require major adjustments, e.g. to the course of conversation or to the interface.

The other results were surprising. In terms of usability, there was a tendency – contrary to expectations – for the social bot to come out on top here as well. Users of the social variant even rated the overall quality of the Liibot service significantly better. They tended to be more likely to indicate that they would use the Liibot in the future and were generally more satisfied.

Why was this?

The fact that the social bot was rated as equally good or even better in every respect could have various causes. First, there is the test group. For the most part, the group did not consist of “real” IT workers, but of students who had to put themselves in the shoes of an IT worker by means of a scenario. This kind of methodology is often used. Nevertheless, it can distort the results. Students may have different requirements as a user group than IT staff.

The second reason could be that the social variant is the original personality of the bot. The course of the conversation, the interface and the avatar are chosen to suit this variant. This perhaps overshadows some of the strengths of the task-oriented variant.

And finally, there is of course a third explanation. It is quite possible that a serious, results-focussed bot might actually be less well received, even in the IT sector.

Conclusion

However the results of this study are interpreted, one thing is indisputable. The chatbot personality can be noticeably influenced by small changes in the manner of expression and behaviour. In addition, personality has a significant impact on how users evaluate the chatbot. This also means that the success of a chatbot does not depend only on the technology. The content and design are also important. Consequently, when planning a chatbot project, it is important to give as much consideration to the selection of the personality and the formulation of the answers as to the selection of the tools. Ensure your bot fits the company and creates the right impression with customers and partners. Just as you would in the case of a human employee.

Do you have any ideas or feedback? Let us know: marketing@avato.net

Imprint: 
Date: September 2021
Author: Isabell Bachmann
Contact: marketing@avato.net
www.avato-consulting.com
© 2021 avato consulting ag
All Rights Reserved.
