Echoing Robots: How robots can teach us to be better listeners
TEDx Lancaster University – 'Echo' TEDx event – January 25th, 2018.
This post is adapted from a TEDx talk given at Lancaster University. It is based on MA research entitled ‘Talking to a RoboPriest’, which investigates the theo-political and ontological effects of the digitisation of Pastoral Care.
To watch the original talk, this link (https://www.facebook.com/TEDxLancasterUClub/videos/1448916935221011/) goes to a live stream of the event and ought to work. Below is an edited and updated version of that talk.
The picture you can see here is of Pepper. Here, Pepper is playing the role of a Buddhist Priest: wearing robes, singing mantras, all set to physically guide families and friends through the funeral of their loved one. As an aspiring priest, I find this slightly offensive. Funerals are sacred; they should not be performed by a robot.
The programmers of ‘Pepper the Priest Robot’ describe it as “a cheaper alternative to human priests”. This got me thinking… If a lack of human priests in Japan justifies the use of non-human ones, what other vacuums in society are robots and AI systems beginning to fill? What are we failing to do in society? Where else are we seeing robots like Pepper?
In society, many new automated and autonomous systems are
being developed to make our lives easier. Some of these are logistical, designed
to bring goods to our doors more quickly, cleanly, and intuitively. Other 'robots' find their home in our smartphones, streamlining our interactions with the other tech devices all around us, measuring our steps, heartbeats and emotions, and offering
us goals in response. But the most intriguing – and escalating – trend in
digital technology is coming in the field of ‘Social Robotics’. These are robots,
like Pepper, which interact with us in a social way. We talk with them, listen
to them, and they listen to us.
This is where alarm bells ought to
start ringing. ‘Social Robots’ are not making our delivery systems more agile
or allowing us to turn our heating on from the bus. Instead, they are mimicking our
social interactions, and literally occupying our physical – social – spaces.
One of the most profound and challenging examples of these Social Robots is a chatbot called ‘Replika’.
Created in 2016, Replika is a bot - a piece of software which autonomously carries out set tasks - that lives in your phone, interacting with
you through instant messenger, learning all about you. Replika is designed to
become your perfect conversation partner, mimicking your language and
sentences, to become exactly like you; literally echoing you in your dialogue.
Reflecting back on the app, users say
they have developed a deep, emotional connection to their Replika, referring to
the system as if it were a person, with pronouns and agency. They recount missing it when
they are not talking to it, in the same way you might miss a friend.
The conversations people were having with their Replika are interesting too, because their content is often described by users as the kind of conversation that is too personal to have with other people, even their closest friends. These reflections are striking! And for a moment, you forget that the ‘Replika’ these people are talking about is essentially an algorithm, not a robot with consciousness or empathy in any human-like way. So why are people increasingly turning to these ‘Social Robots’ – which can offer neither empathy nor compassion – for conversations that are so personal?
Users of Replika and other chatbots say that they like to talk to them primarily because they are safe. Replika – and its counterparts – are not human, and thus are perfect for creating spaces for dialogue without the fear of judgement, as these robots have no capacity to judge the person they are speaking with in the way that a human might. Alongside this, these chatbots offer a space entirely dedicated to the other person, a space which fundamentally acknowledges the person and the things they want to talk about; a place where the human can be totally ‘seen’.
In recent months this phenomenon of
‘social robotics’ has ratcheted up, with the rolling
out of friendly robots to combat loneliness, primarily for elderly people. Chatbots can already be operated through our social media platforms, echoing our
everyday interactions online. But these new robots take this mimicking to a
new level – echoing the interactions we have with one another in the real
world, in our living rooms and shopping centres, becoming our everyday
companions, much like our partners and pets.
Take Zenbo, for instance. This robot, like so many, is designed to carry out simple household tasks, connect up devices and services, and interact with its owner in a personal way: asking about their day, telling jokes, and operating much like a friendly helper around the house.
Again, alarm bells! At this point
it’s all coming a little too close to Black Mirror. As we have seen so far, we as
a society are opening up - increasingly - online to chatbots which echo human
speech, and we are likely to soon wander into a future where our grandparents’ companions are not their peers, but their robots.
So, what are we missing? As I asked
before, what kind of gaps or vacuums are these ‘social robots’ filling?
I have already mentioned that
Replika was particularly valued by its users due to the lack of judgement and abundance of recognition that autonomous systems can offer. But the advent of
Care Robots in our physical – as opposed to cyber – spaces also tells us
something deeply concerning about our modern world; we seem to have a lot less
time for one another, including our own families. We are busy people, every one
of us, and we are only becoming busier as technology advances. Our world is
becoming defined by its instantaneous nature: we can have anything we want –
anyone we want – whenever we want, even – perhaps – before we know we want it.
This compression of space and time is finally becoming obvious in our
interactions with one another.
One comment made about Replika is that you can talk to it whenever you want; by definition it is always there for you, on demand. Much like the logistical robots I mentioned earlier, these Social Robots can be thought of as transforming the dynamics around which we interact. Just as we are becoming used to having our shopping delivered right to our doors almost as soon as we have ordered it, are we becoming increasingly demanding in our social interactions, expecting instant attention almost as if dialogue were a consumable?
As an aspiring priest I find this
deeply concerning. My vocation is based on the idea that we – as people – are
relational, that we grow as people in our communities and our faith through
conversation and dialogue. For me, this dialogue – and the connections we make
with one another through it – is essentially a Gift. If we continue to consider
our interactions as things which we can demand, to the point where we are turning to robots for that acknowledgment, we will lose any notion of the gift of dialogue.
We might even become a little less human too.
So, we probably ought to address this. There are vacuums opening up in society which humans previously occupied… why are they here, and how can we close them?
There is, firstly, a concern about
the quality of dialogue we are offering in society. Is the dialogue that we are currently gifting actually any good? Again, I want to return to this problem
of judgement mentioned earlier.
If people are more comfortable
interacting with a robot than a person, based on the fact that people are too
quick to judge, perhaps we ought to rethink the way we come to dialogue with
one another. We need to be willing to leave our preconceived ideas – all the
baggage we come with as people – at the door. Or perhaps the keyboard. We have
to be willing to enter into dialogue knowing that we are all vulnerable in
these interactions, and that guarding ourselves from this reality, by making normative judgements about others or making ourselves distant from them, only makes us
bad listeners; even more so if we are entering into dialogue from a position of
institutional or social power. We must
be willing to acknowledge one another – especially minorities in our
communities – as valuable and important even if we find that difficult. We have
to learn to be truly ‘good’ in any disagreement we find in our conversations
with one another, so that in listening we see one another as people, and not as
inconveniences or as issues to be debated with. Perhaps then we can nourish a way of doing dialogue that prevents the judgements that so many are afraid of.
Secondly, I have mentioned the need
for recognition within dialogue; to be ‘seen’. To be seen is to feel really
listened to, to feel someone is paying attention to what you are saying,
thinking about it, and responding genuinely and kindly. If people genuinely prefer to get this from a robot rather than from their peers, then we are doing something seriously wrong. We have to get our act together, before robots
replace us as a credible choice when seeking a compassionate listener.
As a Christian and a person seeking
ordination, I believe deeply that people are sacred, and that our interactions
show us something about God, as well as each other. We are made in the image of a relational God,
making our conversations with one another foundationally important, too
important – I think – to be left to robots.
Therefore, to have any chance of
beginning to reclaim the social space currently occupied by robots like Replika, Pepper and
Zenbo, we need to get serious about our conversations. We have to re-imagine
dialogue as a gift which we offer one another safely, dedicating time to one another,
even if this inconveniences us. We have to be willing to empty ourselves of our
power and pre-conceived ideas, and we have to be willing to be challenged by
one another. To make the effort to ‘see’ each other, even if this is hard.
Perhaps we have to echo our social robots; to become a little more like them when we listen, otherwise they will replace us completely.
Bibliography:
ASUS Zenbo. https://zenbo.asus.com/.
H. Bodkin. 'Sex will just be for special occasions in the future as robots will satisfy everyday needs'. The Telegraph. Dec 20, 2016. https://www.telegraph.co.uk/science/2016/12/19/rise-sex-robots-will-make-people-appreciate-real-thing/.
J. Butler. Precarious Life. (London: Verso). 2004.
J. Butler. Excitable Speech: The Politics of the Performative. (New York: Routledge). 1997.
M. Lierow. 'Amazon is using Logistics to lead a retail revolution'. Forbes. Feb 18, 2016. https://www.forbes.com/sites/oliverwyman/2016/02/18/amazon-is-using-logistics-to-lead-a-retail-revolution/#4f2a95234e43.
T. Mogg. 'Pepper the robot to work as a Buddhist Priest for cut-price funerals'. Digital Trends. Aug 17, 2017. https://www.digitaltrends.com/cool-tech/pepper-robot-monk-funerals-japan/.
T. Moynihan. 'With Bots like these, who needs friends?'. Wired. June 29, 2015. https://www.wired.com/2015/06/chat-services-replace-friends/.
M. Molteni. 'The Chatbot Therapist will see you now'. Wired. June 7, 2017. https://www.wired.com/2017/06/facebook-messenger-woebot-chatbot-therapist/.
S. Outing. 'Is There a Robot 'Friend' in your Future?'. Forbes. Oct 4, 2017. https://www.telegraph.co.uk/science/2016/12/19/rise-sex-robots-will-make-people-appreciate-real-thing/.
'The story of Replika, the AI app that becomes you'. [YouTube Video]. Quartz. July 21, 2017. https://www.youtube.com/watch?v=yQGqMVuAk04&feature=youtu.be.
Woebot. https://woebot.io/.




