Authored by Dr Caitlin Bentley, Project Lead Contact
Whether autonomous systems should carry out tasks or functions previously done by humans is a core question our project considers. If they should, how can we make them safe and inclusive, so that people from all walks of life can interact with them, benefit from them, and work with them effectively? And how do we ensure that people have a say in these matters, or that they can choose whether or not to use or interact with such systems?
In the maritime sector, the safety of uncrewed and remotely operated ships is a hot topic of debate. A major argument in favour of rolling out uncrewed ship operations is that they are expected to reduce human-related accidents on board. In 2021, there were 3,000 shipping incidents, with 668 occurring in the British Isles, North Sea, English Channel and Bay of Biscay region alone. But there are other reasons why remote operation could be beneficial. Perhaps it would enable people who have historically been excluded from becoming seafarers to work in remote control centres. In 2020, the majority of UK seafarers were men (83%), which means that in this sector, rules, conventions and systems are likely to be designed without the participation of women or other underrepresented groups. To us, making autonomous systems trustworthy means thinking about how new maritime autonomous technology can be inclusively designed and deployed, as well as identifying and addressing the hidden biases and embedded discrimination in existing systems.
Working through the nitty-gritty of maritime autonomous technology design is tough enough; asking a team to figure out how to address historical inequalities at the same time is a big ask. But it is a challenge that our project team is working to meet. One thing that helps us in this endeavour is having a multi-disciplinary team that shares a passion for positive change. We each bring something unique to the design and development of Trustworthy Autonomous Systems (TAS) because we have different perspectives: we are social scientists, feminists, information scientists, health informatics researchers, AI experts, human geographers, and cyberneticists. We are working across two sectors, maritime and health, which also helps us to bring diverse expertise into the project. Working with partners who share our passion for change, who have a responsibility to serve the public, and who are in a position to influence change is another key factor that we hope is setting us up for success.
Leaning into Complexity
Back to our main challenge: how will we make autonomous systems equitable in their design, deployment and social impact? We are exploring how intersectionality can frame our research and practice from start to finish. By the end of this project, we hope to be able to share our lessons learnt and good practices for what it means to become intersectional TAS practitioners. Intersectionality is a field that grew out of Black feminist thought, activism and intellectual leadership. It is commonly described as an analytical framework for understanding how lived experience is shaped by multiple aspects of a person's identity coming together in a time and place to influence the advantage or disadvantage, privilege or discrimination, they face. Intersectionality offers several unique benefits: it centres the voices of underrepresented groups; it establishes methods to generate individual and collective meaning around diverse identities; and it prioritises social and political change through actively listening to intersectional experiences that frequently go unheard.
But to us, intersectionality does not just offer benefits for understanding perspectives and systems or engaging with social issues. We want to show how it can also help us design and deploy safer and more trustworthy autonomous systems. Our approach is to lean into complexity, internalising intersectionality as a way of embedding people and human experience in TAS design. At a recent TAS workshop, we were told that to design TAS, a detailed functional analysis needs to take place first. This would prompt designers to reduce TAS to a set of technical features and functions, which would then be deconstructed or debated for their contributions to, or assurances of, trustworthiness. But this approach prioritises simplified and technical thinking ahead of complexity and human experience. Instead, we are working on flipping this script.
We frequently talk about including users in design, or about the need for designers to understand the broader social, legal or political context. We are also increasingly talking about the effects of biased or discriminatory AI, and how we need to build better processes and models to protect against these biases. Yet we rarely consider the connections between these aspects, nor the roles or positionality of designers in influencing systemic change. How can we make these connections explicit in TAS design? How can we encourage designers to examine how their own identities and positions affect their approach to design? How can we identify whether TAS design will be an incremental change or part of a bigger process of transformational change? Indeed, how can we debate the purpose of our creations rather than thinking only about what, technically, we can do? And more than anything, how can we bring in the voices of the people who have been most marginalised in and through TAS design? Not only to consult them, but to give them increasing control and support to take up their place within a new generation of TAS designers: designers who use their own lived experience and identities to influence change.
Stay tuned
So far, we have begun our initial scoping study across two sectors: maritime and healthcare. We've held our inception workshop, and we're conducting interviews with our research partners. We're also scanning the TAS field for examples of good practice, success stories, and common challenges in this area. In the autumn, we'll be running participatory design workshops with members of underrepresented communities within each sector. If you'd like to get involved, or would like to stay in touch, please do send us an email at Caitlin.bentley@kcl.ac.uk.