I just came across a very interesting study published in the Journal of Marketing Management, which examined how smart speaker users viewed their devices, and how those views related to their experiences of using the devices.
The study was conducted by Fiona Schweitzer, Russell Belk, Werner Jordan and Melanie Ortner, and its title is “Servant, friend or master? The relationships users build with voice-controlled smart devices”. At the time of publishing this post, I am not aware of any open-access version of the paper; the paid version is available here.
Professor Schweitzer and her co-authors found that most users (20 out of 39 research participants) described their devices as servants who followed orders and helped their masters (i.e., the users themselves) to complete tasks. These users tended to have a positive assessment of smart speakers, describing them as nice, friendly or helpful. They compared their devices to a dog or an employee, and visualised them thus:
“She is a perfectionist and very civilised, dressed like a secretary – blouse and jeans. And she is very shy, stays in the background, and obeys if talked to.”
A second group of 10 participants (out of a total of 39) saw the devices as amusing and sassy partners with a life of their own. These users tended to have a neutral assessment of smart speakers, describing them as attractive and likeable. They compared their devices to a serving assistant with personality, or a companion trapped in a machine, and visualised them as:
“female and helpful and…quite attractive. Rather tall, slim, black-haired.”
The final group (9 participants out of 39) saw the smart speakers as masters with a will of their own, who were somewhat unpredictable. These users tended to have negative assessments of this technology, and described their devices as erratic and untrustworthy. They compared their devices to gender-free, disobedient agents, and visualised them as:
“a senile old man, who can’t hear me properly anymore …. When I say something and it doesn’t understand me (…), it then says something completely unrelated”.
While the overall number of respondents in the second / partner and third / master categories was very similar (10 and 9 respondents, respectively), the gender composition was dramatically different. Namely, 8 males and 2 females reported seeing the smart speaker as a partner; whereas 2 males and 7 females reported seeing the smart speaker as a master.
|  | VCSA as a servant | VCSA as a partner | VCSA as a master |
| --- | --- | --- | --- |
| Perception of nature of the relation with the VCSA |  |  |  |
| Perceived allegory of the relationship between VCSA and U | Dog (VCSA) and master (U); serving parent (VCSA) and child (U); empowering object (VCSA) and empowered user (U); obedient servant (VCSA) and master (U); subordinate (VCSA) and superordinate (U) person at work | Equal partners; adaptive child (VCSA) and parent (U); serving assistant with personality (VCSA) and me (U) | Suspicious other (VCSA) and me (U); disobedient other (VCSA) and me (U); incapable other (VCSA) and sufferer (U) |
| Extended-Self/Digital Self/Assemblage perception of VCSA-user relationship | No acknowledgement of object experience, self-extension focus with utilitarian goals; the VCSA is part of me because, like my hand, it does what I command; I command this assemblage effectively. | Acknowledgement of object experience, longing for self-extension through social interaction; here the device is like a prosthesis and really becomes a part of me; it is a symbiotic assemblage. | Acknowledgement of object experience, fear of losing control over one’s digital self, and reluctance to be constrained by object experience; here the self has lost its agency and feels diminished; the assemblage is frustrating and does not really work. |
| Anthropomorphic perspective of VCSA | Female, nice, intelligent, accurate, rigorous, dumb, helpful, reliable, competent, loyal, businesslike, obedient, frigid, factual, remote, emotionless, professional | Female, tall, sassy, perky, humorous, abrupt, shy, helpful, nice, friendly, unassertive, intelligent, factual, organised, and serious | Gender-free, intelligent, ignorant, inflexible, annoying, obstinate, and erratic |
| Experience and implications of interacting with VCSA |  |  |  |
| Perceived difficulty of interaction | Low | Medium | High |
| Interaction perceived as | Useful and efficient when successful; useless and ridiculous when unsuccessful | Engaging and fun when successful; disappointing and disillusioning when unsuccessful | Disconcerting and worrisome |
| Future usage intention | Wish for better ability to take over complex tasks; intend further usage with easy tasks with which the VCSA is currently good | Look for autotelic engagement with the VCSA; intend future usage for fun | Seek solutions that allow U to regain control; refrain from future use of VCSA |
| Informants |  |  |  |
| Male | 6*, 16*, 18*, 21*, 22*, 28*, 1**, 3**, 4**, 23**, 24**, 33**, 35**, 38** | 5*, 10*, 17*, 20*, 32*, 37*, 12**, 14** | 19*, 25** |
| Female | 8*, 9**, 15**, 29**, 34**, 36** | 2*, 11** | 13*, 27*, 31*, 7**, 26**, 30**, 39** |
VCSA = voice-controlled smart assistant; U = user; * = Google user; ** = Apple user.
Table source: Schweitzer et al.
Even more interesting than these classifications, however, are the consequences of these assessments and perceptions for device use. The research team looked at the relationship between the device characterisations (i.e., servant vs partner vs master) on the one hand, and the users’ willingness to use their devices on the other. They found that, in general:
“those respondents who saw the [smart speakers] as a servant were more ready to use the[m] in the future than those who regarded it as a partner or a master.”
Moreover, the researchers found that:
“[Smart speakers] as partners seem attractive at first, but that users are likely to emotionally abandon them if they do not live up to their expectations emotionally.”
Faced with these findings, the manufacturers of smart speakers, and of AI interfaces more generally, would be well advised to present their technology as an effective, emotionless, subservient, female-like employee; but one that is not too competent, too knowledgeable, or too influential. The issue, though, is that AI can be more competent than the user, has access to numerous knowledge repositories, and is extremely influential by virtue of its algorithmic decision-making, which determines what answers or product suggestions we are served and which ones are left out.
This gap between what users are comfortable with and what the technology can actually do reminded me of the case of profiling technology and targeted marketing. As the (in)famous case of Target’s personalised offers to a teenager who was in the very early stages of her pregnancy has shown, we tend to react badly to evidence of just how revealing our personal data can be. As a result, marketers working in the field of personalisation often refrain from showing just how much they know about their customers, meaning that we are lulled into a false sense of security about how much data specific companies have about us, and how insightful such data are.
Likewise, could presenting the smart speakers as helpful servants lead us to a false sense of security about how much in control we are, and how much choice we have?
Did you read the most recent revelation about humans actually listening in on conversations?
M x
Yes, yes. I saw it.
Artificial intelligence is currently performing well below the level that many users believe it is. As a result, most companies are using people to review, or even entirely handle, the interactions with AI, to try to improve its accuracy (e.g., https://anacanhoto.com/2019/01/24/the-human-backstage-of-tech-businesses/).
What I found distressing in this news was that, when some staff at Amazon raised concerns about possible cases of sexual assault captured in the recordings, they were reportedly told that it was not their job to intervene (I suspect in order not to make it too obvious that they are listening!).