The value of service robots in embarrassing service encounters

While we tend to think of an attentive member of staff as the epitome of customer service, there are circumstances when customers would much rather not have to interact with one. Maybe they are in a hurry. Maybe they want to avoid small talk. Maybe they prefer to socially distance. Or, maybe, they are a little bit embarrassed by the specific service encounter – for instance, because the purchase is associated with social stigma (e.g., treatment for sexually transmitted diseases), or because it requires customers to admit that they made a mistake (like that time I shrank the curtains because I misread the washing instructions).

Service providers may help customers navigate those embarrassing situations by offering self-service options, like vending machines or self-checkouts. However, this type of solution may not be suitable for complex exchanges. Would customers prefer to interact with a robot enabled by artificial intelligence, rather than a member of staff, in those circumstances? And, if yes, why?

Those are the questions explored by Valentina Pitardi, Jochen Wirtz, Stefanie Paluch and Werner H. Kunz, in the paper “Service robots, agency and embarrassing service encounters”, which was recently published in the Journal of Service Management. 

Pitardi and her colleagues used a mixed methods approach to examine how customers respond to service robots in the context of embarrassing service encounters. First, through in-depth interviews, the authors found that, when experiencing an embarrassing situation, customers felt the most discomfort when:

  • Interacting with a human (vs. a robot)
  • Interacting with a robot that looked like a human (vs. one that was clearly a machine)
  • Using voice for the interactions (vs. using text)

Furthermore, they found that, in embarrassing scenarios, interacting with robots was deemed better than interacting with staff, because robots were seen as unable to make moral or social judgements, and as incapable of being indiscreet.

[Figure: expected embarrassment when booking an appointment with a robot vs. a human, by medical condition. Image source]

The authors then ran two experiments. In the first one, participants were instructed to book a medical appointment for an embarrassing vs. a non-embarrassing medical condition (namely, haemorrhoids vs. gastritis). As shown in the figure above, participants expected to feel less embarrassment when interacting with a robot than when interacting with a human, and that difference was particularly pronounced for the embarrassing medical condition.

[Figure: results of the second experiment. Image source]

In the second experiment, the scenario was going to a pharmacy to collect an anti-fungal treatment for the genitals. This experiment revealed that the main reason why participants preferred to interact with the robot was that they felt the machine wouldn’t be able to think and form opinions (i.e., it lacked agency).

I am not sure that picking up some medication (the scenario used in the second experiment) qualifies as a complex service problem. It doesn’t seem more complex to me than, say, fulfilling an ice-cream order. Personally, I would have chosen a scenario where the participant is undergoing an intimate medical examination. Still, I find the focus on the robot’s inability to form opinions (as the main reason for choosing the robot over the member of staff) very interesting.

It made me wonder whether this would be a useful intervention, for instance, to deal with student queries. Some students send you an e-mail for every single question that they have, even when the answer is clearly spelled out in the module guide or could easily be found on Google. However, there are also those students who delay coming to see you about a problem that they feel is embarrassing. For instance, I once had a student who accumulated a huge amount of debt due to an addiction. When those students eventually seek help, the problem has snowballed and is much more difficult to solve. Wouldn’t it be great if I could get a robotic teaching assistant to answer student queries? Alas, that is not likely to happen anytime soon.

Anyway, if you provide a complex service where your users are likely to feel embarrassed, these are the recommendations provided by Pitardi and her colleagues:

“Service managers should encourage the deployment of service robots with a low level of perceived agency in potentially embarrassing service encounters.”

“(T)ext-based robots (e.g. chatbots) are the customers’ preferred delivery mode in potentially embarrassing service encounters, while humanoid (e.g. Sophia) and humanlike robots (e.g. Pepper) appear as less suitable because of their appearance. Given this, service managers could adapt the type of service robot configurations to specific service contexts and deploy less humanlike robots in situations where customers may experience embarrassment (e.g. body measurements and examinations in medical clinics).”

“Finally, our study shows that consumers value and appreciate the level of interaction privacy and anonymity that service robots can offer during the actual service encounter. However, at the same time, customers display concerns about the privacy of their personal data. To mitigate data security concerns, firms should follow and communicate to customers their adoption of best practices in data privacy and corporate digital responsibility.”

And now, if it’s not too embarrassing, tell me: when do you prefer interacting with a robot rather than with a member of staff?
