Dancing with computers
By: Jacqui Bealing
Last updated: Friday, 20 July 2018
Elizabeth Churchill, currently Director of User Experience at Google, first became fascinated by the interaction between humans and computers while studying for an MSc in Knowledge Based Systems at Sussex.
What came first – your interest in psychology or in technology?
Psychology. Most definitely. I got into AI (artificial intelligence) and cognitive science in large part because I wanted to think about human reasoning and learning, and cognitive models seemed like a reasonable approach to me then.
I became interested in augmentation: can we develop smart tools that augment human reasoning rather than model or replace it, tools built for a perfectly choreographed human-technology dance? Think of how two dancers can develop and then perform an elegantly executed piece.
This fascination really developed when I worked on my MSc thesis at Sussex with Ben du Boulay on intelligent tutoring systems.
I was also deeply inspired by Professor Maggie Boden, who encouraged me to think of technologies and computational techniques as reflecting our values as people: we can look at what we build and see ourselves inscribed therein. It follows that if we don’t reflect on our moral values, we can inadvertently create truly destructive experiences.
We could start by just acknowledging we are not singular identities, but always shaped socially. That is what psychology teaches us.
What do you believe has been the greatest breakthrough in terms of Human-Computer Interaction (HCI)?
What HCI has really done is put the human at the centre of the design and development process. HCI is a broad area of interlocking disciplines, from Human Factors to applied computer science to psychology. It initiated a series of innovations, from design methods, to user experience research and usability, to graphical and voice user interfaces, to adaptive layouts and interfaces, and to input devices designed for people.
Through the gathering and application of big data, are we on the threshold of a major change for good? Or will we drown in it?
It is going in both directions. When I was doing my MSc at Sussex, one of my lecturers introduced me to the concept/phrase “Garbage in - Garbage out” - that is, if you don’t understand the data and the quality of the data you put into a “black box” process, you will not truly understand the results you get out the other end.
So, the main problem I see is overconfidence in the data without a deep, reflective perspective on its provenance and quality. Too often, people only see what they want to see, and they bring very little rigour and ‘hygiene’ to their analysis. We need better tools for evaluating and analysing data quality, and for interrogating what is going on in the “black box(es)”.
AI seems to be steaming ahead, with fears that it will decimate jobs for humans. What is your view on this?
Technologies automate the easiest things to automate and leave the hard tasks to people. There is a great paper, Ironies of Automation, which discusses the ways in which automating industrial processes may expand rather than eliminate problems with the human operator. We all know the issues with the automation of domestic tasks: we simply shifted our expectations of what “needs” to be done in the home. Instead of saving time, people continued to spend the same amount of time doing different domestic chores. So, I think the nature of work is changing; some jobs will go away and others will emerge. A good example is the Data Scientist - that career did not exist a few years ago.
It’s frequently said that technology is causing us to become more isolated, less trusting of our own instincts, and more reliant on computers making choices for us (e.g. matchmaking). Is this an overstatement?
Yes, I think it is a little overstated. Technology allows us to connect with others in all kinds of ways, but it is a tool, and it is only as good as our ability to use it critically and judiciously.
Someone who does one search query and is satisfied with the top result and does not dig deeper either has a very simple question or is not doing their due diligence. Someone who believes that a dating site will solve their intimacy issues and find them the perfect partner without them doing some serious emotional work is probably being naive. And someone who thinks that a video conference once a week is equivalent to actually seeing someone and spending time with them sharing meals and other experiences is kidding themselves.
One of the interesting things that has happened recently is the move from FOMO (fear of missing out) to JOMO (the joy of missing out), a call to turn off the technology and make some time for ourselves away from social media scrutiny. More and more people are becoming interested in technology ethics, and in disconnection rather than connection.
Is technology causing us to lose our humanity and to be less kind?
Rather than less kind, I think we may be losing our experience of developing empathy for perspectives that do not align with our own. As many scholars have noted, we live in our own “bubbles” too much. We walk down the street and stare at our phones and don’t look at each other.
There is a really interesting book called Selfie which goes into the idea of “social perfectionism” and the unrealistic expectations it can burden people with. I worry that we are less kind to ourselves as a result of this - the feeling that one has to package oneself is really hurting a lot of teenagers in particular who feel extreme pressure to be a certain way, live a certain life, showcase a certain success, and hide feelings of vulnerability and fragility for fear of being ostracised or, even worse, bullied. People are so hard on themselves now and worried about their status updates. More self-compassion, more kindness to self, will I believe lead to more compassion and care for others. Affix your own mask before helping others, as they say.
What are you most optimistic about in terms of tech development in the future?
Medicine and new techniques for diagnosis. Also smart prosthetics. And if AI is useful, I’d like to see better educational tools that mould to different learning styles and abilities to help people who are not “normal” learners (whatever normal means): better support for those who are dyslexic, assistive tools for people with memory issues, and assistive technologies for those with cognitive impairments. I’d also like to see assistive technologies that help older adults live independently, while ensuring they are not isolated.
What could universities be focussing on in terms of tech development?
The focus should be on long-term value - not just technology for technology’s sake, or cool consumer devices that are obsolete within a short time frame.
The big problems, like developing truly smart urban environments that focus on long-term sustainability, cannot be tackled by small groups, and may not be tackled comprehensively by businesses that are under pressure to meet their quarterly accounting numbers. Those incentives drive toward quick solutions that leave little time for evaluation focused on long-term value, and no time for broader-scale ethical review. Universities are able to take a longer and broader perspective. Technology is part of the solution, not the whole solution. The real work is in asking the right questions and focusing on the right scale of impact at all levels.

There is a useful framework called STEEPLE that is worth keeping in mind. STEEPLE stands for the range of concerns to address, and it applies nicely to technology: Social, Technological, Economic, Environmental, Political, Legal and Ethical. I believe universities, and especially universities like Sussex, which have always invested in cross-disciplinary thinking, are where such bigger-scope analyses can happen.
Do you ever doubt your own existence?
What an interesting and philosophical question! No, not really. I feel very solidly on this earth and of this earth.
What I do try to question are my firmly held beliefs. I think it is critical that we don’t take ourselves too seriously. “Walk a mile in someone else’s shoes”, as they say. Someone else likely has different, if not more, information than I do. So I doubt the existence of stable or perfect knowledge. And I embrace the possibility that someone else’s perspective may be better than mine.