Research has consequences. Sometimes, negative ones.

I have been working on a research proposal looking at the impact of artificial intelligence on a particular profession. It’s a multidisciplinary project with various angles on the issue, and mine is what that means for the customer experience.

[Image: the impact of AI on jobs]

It is possible that the results from our project could end up showing that a particular group of people are very likely to lose their jobs to the technology, in the not so distant future. Hence, as part of our discussion and preparation for this project, we talked about our responsibilities to that group. For instance, we have to be extremely careful in terms of reporting the findings in order to avoid creating unnecessary anxiety; and we have to direct them to sources of emotional and practical support.


However, at one point in the conversation, one of the team members asked a very simple, yet very important (and uncomfortable) question:

Is it possible that our research project will actually add to this group’s problems? For instance, will service providers end up discriminating against them because our research brought their job vulnerability to light?


What a dilemma. Do you go ahead with a research project, knowing that it could disadvantage a group of people? Or do you give up on the project, knowing that the problem is not going to go away if we don’t research it? The former could be callous; the latter, cowardly.


While I was pondering this problem, someone mentioned the speech made by particle physicist Yangyang Cheng at the Bulletin of the Atomic Scientists’ 2018 Annual Dinner and Meeting, where she received her 2018 Rieser Award. In her short but very interesting speech, which I encourage you to watch, Dr Yangyang Cheng talks about the need for researchers to consider the consequences of their work for society.

Yangyang Cheng, 2018 Rieser Award recipient from http://www.thebulletin.org on Vimeo.


Around minute seven, Dr Yangyang Cheng says that “Scientists have an ethical and moral obligation to be aware of the social cost of their work and strive for the peaceful and responsible use of their technology” because “the ethical use of technology can’t stop at water’s edge, just as a nuclear fallout or climate change recognises no national borders.” She goes on to give the example of scientists in China who worked on biometric technology which is now being used to surveil and discriminate against a particular ethnic group in the north-western region of the country.


Around minute nine, Dr Yangyang Cheng argues that “the inherent risks of new scientific advancements or technological breakthroughs, are not derivatives of, and cannot be masked by, the nature of a political system. Democracy is fragile. Freedom is not free. And nothing is to be taken for granted.” And around minute 11, she concludes that: “A scientist can’t retreat behind the notion of being a neutral explorer of nature.”


So, that’s where I am now: facing the moral responsibility to untangle this particular knot. Ideas on steps forward are welcome.


* Incidentally, hooray for multidisciplinary research teams.
