In a single day I go from looking at how algorithms are used to produce news, to exploring the social implications of designing wearable technology for people who are blind, to how we can help people with HIV manage and share their personal health data, to creating design probes that help people understand complex ideas
Tell us about your background and journey to your current role
I came late to academia – in my prior career I worked as a researcher in the third sector for a non-profit organisation that advocated on behalf of adult learners. The specific research agenda I worked to was framed as digital inclusion, and it was during that time that I realised not only how important and empowering technology could be, but also how open it was to manipulation and subversion. It could be as much a barrier as an enabler. Many of the people I worked with, and on behalf of, struggled with basic ICT skills and relied on intermediaries to help them complete fundamental tasks online – the film ‘I, Daniel Blake’ was, and still is, a reality for a large tranche of our society, and in our data-driven world it’s all too easy to forget that.
I soon came to realise that, rather than working to an organisational agenda, I wanted to explore my own research, and so I applied for an EPSRC PhD scholarship; that was the beginning of my academic career. My undergraduate and Master’s degrees are in Politics and International Relations, so the idea of multi-method, multidisciplinary research is pretty much hard-coded into my practice – the boundaries and barriers to this which are often present in academia are not really present in non-academic research. I felt strongly that I wanted a career that not only focused on social good, but was also agnostic when it came to methods, and so Human Computer Interaction was the obvious choice for me. After getting my PhD, where I focused on ethics and the problem of human consent in the context of pervasive systems, I worked as a research fellow at Horizon Digital Economy Research Hub (University of Nottingham), and then moved to Microsoft Research, where I worked in the Human Experience and Design group in parallel to a fellowship at Corpus Christi College (University of Cambridge). There I was able to explore new methods and really develop an understanding of the product development process and all of the factors that come together during the flow of tech innovation. Despite the excitement of industry, I again felt that what I needed was to be able to drive forward my own research agenda, and so, when I was successful in my application to become a Chancellor’s Fellow here at Edinburgh, I grabbed the opportunity.
Can you tell us about a typical day at work – what projects are you working on at the moment?
I’m not sure there is a typical day at work for me – part of the benefit of having a fellowship position is that you get to decide what you work on, when, how and with whom. Having said that, I’d say that I always start by reading my emails over breakfast and flagging the ones I need to respond to. I tend to end the day by responding to them, once I’m home. When I’m in the office I usually have a to-do list to work through! Since I started at the university, I have developed a portfolio of projects, all around my own research agenda, which falls within the remit of moral and understandable data-driven systems. At the moment, people tend to frame this work as data or AI ethics, but it’s basically about exploring, understanding and supporting the design of systems that work for people in ways that are fair, aren’t harmful, and allow us to flourish. Algorithms underpin almost any contemporary critical system you can think of. So in a day I can go from looking at how algorithms are used to produce news, to exploring the social implications of designing wearable technology for people who are blind, to how we can help people with HIV manage and share their personal health data, to creating games and design probes that help people understand complex ideas.
In addition to my academic work I am also a consulting researcher for Microsoft Research, in the area of AI and ethics, and I am a fellow at the Alan Turing Institute, which means I tend to travel to London quite a lot.
At the moment I am working on a few projects; they are quite broad, but with a common core of ethics. One (INTUIT) is about how we might create systems to support people with HIV in sharing broader health data with clinicians and third parties. Another is a Network Plus in Human Data Interaction, where I allocate funding to other projects to create a wide network of practitioners who all, in some way, are working towards helping us to better understand the issues around human-data relationships. I also work on the consent issues that emerge from wearable smart systems, and my most recent project – part of the PETRAS centre – is just starting and looks at the use of algorithms in news production and how the resulting systems might be better designed to support human understanding, from both the journalistic and end-user perspectives.
What is your vision for data innovation and the City Region Deal, if you have one?
My vision for data innovation is one where we put people first in order to create a future that is inclusive and sustainable. So, human-centred, data-driven innovation! There’s so much potential inherent in the City Region Deal, but the biggest challenge is how to make it work for everyone in ways that support and sustain human dignity and enable human flourishing.
The point is not to innovate for its own sake, but to ensure that our lives and future are substantively better as a result.
What are you particularly passionate about in your work? What do you look forward to in your field?
My core passion is working out how we can take all of this academic research and make it work for people. Because really, what is the point of us if we’re not making a difference, even if it’s a little one? On a more practical level, analysing the data that comes out of my research is pretty exciting. That moment where you realise you’ve found something new can be really satisfying. The moment I feel most proud is when someone who is not an academic makes use of my work – that’s the moment I know I’ve done something right.
In terms of looking forward, I guess the obvious thing is I look forward to a point when data-driven innovation has progressed to the extent that it’s no longer necessary to have ethics specialists. I think when you have a social agenda, the thing you’re aiming for is to do yourself out of a job, because that shows you’ve succeeded. In reality, as an academic, you’ve often moved your agenda on by that point, but it’s still the end goal.
Do you work with any interesting data sets, technologies or analysis techniques?
I mostly work with qualitative data, though I do sometimes do quantitative analysis. At the moment I have a few projects starting, so it’s less about the data and more about planning the research. I guess my work is innovative because I’m always looking at how you can design things better, make things more understandable and make that human-data connection.
What do you think are the biggest challenges for women and girls in your field? What would you like to see change?
I think the biggest challenge for women and girls is finding their pathway into a male-dominated field. We’re often told that we ought to include more women in STEM, but often with little consideration as to whether a male-dominated and male-defined pathway is also right for women and girls. There is a great book called ‘Invisible Women: Exposing Data Bias in a World Designed for Men’. It’s truly eye-opening. It takes what we think we know about what is ‘normal’ in society and shows us how it’s actually constructed in ways that frame male needs and the male perspective as universal. Once you understand the state of play, you can start to change the rules of the game. That’s our biggest challenge – how do we re-design the rules of the game so that gender equality is hardcoded into our daily systems?
What would you recommend to women and girls who’d like to do what you’re doing?
I think there are a few things that have helped in my career, and I think they are pretty much helpful to everyone. So here they are… Firstly, find a strong mentor. Once you see someone else doing it, it’s easier to see yourself in the same place. Academic life can be pretty rough, and women often suffer from imposter syndrome – sometimes you need someone to tell you that you’re doing OK. Secondly, practise resilience. There are lots of times when things go badly, and you have to try not to take that personally – some of my greatest successes have come out of what, on the face of it, seemed like personal failures. Lastly, the best advice I have ever been given is ‘let them say no’. Don’t ever hold back because you think you’re not smart enough, or not ready for the opportunity, or you might fail, because if you don’t throw your hat in then you’ll never know how brilliant you could have been.
Do you have a fun fact about yourself?
I’m not sure it was fun, but my first proper job was running a pub – it might seem a bit random now, but it certainly helped me to develop people skills!
Do you have any heroes or heroines that have influenced you?
I’m not really sure I have any heroes or heroines in the classic sense. The person I looked up to most, though, was my grandmother – she was a Polish immigrant during the Second World War. She had to endure things I can’t even imagine and still emerged a decent and compassionate human being.
Dr Ewa Luger: Chancellor’s Fellow in Digital Arts and Humanities & Fellow of the Alan Turing Institute