What does it mean to trust an autonomous vehicle?
This research investigates how trust works with autonomous vehicles. My existing work on conscious and unconscious control aligns with some of the challenges of designing cars that are fully in control, partially in control, and that hand control from the car to the driver and vice versa. To start exploring this space I have found it useful to look at depictions of autonomous cars in popular culture. These examples have been useful in two ways: first, as material to group, theme, and meta-analyse; and second, as a tool to foster discussion amongst researchers. The end goal of my research is to write an interactive script that can be used to explore this space and encourage critical engagement with the question: what does it mean to trust an autonomous vehicle?
Design fiction is a type of speculative design that can be used to explore people’s attitudes towards future technologies and the implications of those technologies. Fictional designs of autonomous cars give us an insight into what our desires and fears are and how they have changed over the years. By analysing these design fictions, alongside current autonomous car design and research, I hope to discover gaps in the narratives around future self-driving cars and to inspire new questions. From these questions I plan to extrapolate a design for an interactive creative product.
Self-driving vehicles in TV and film:
- The Safest Place (1935). A public service film with the message that a car that could drive itself would be the safest car.
- The Love Bug (1968). A car with a big heart and strong emotions.
- Crash (1977). A B-movie in which a driverless car causes accidents to dark synthesiser music.
- Knight Rider (1982). Michael’s friendly companion.
- Christine (1983). A demonic car goes on a murder rampage.
- Total Recall (1990). Johnny Cab, an obtuse taxi driver with a kamikaze streak.
- Robocop 2 (1990). Magna Volt TV ad that promises a “Lethal response” for car thieves.
- Demolition Man (1993). A police car goes haywire.
- Minority Report (2002). The occupant of a self-driving car is rerouted when they are suspected of potentially committing a crime.
- Monolith (2016). Safety features trap a son inside the car; the car gets them home in the end.
- Logan (2017). Unflinching, unstoppable juggernauts.
- The Fate of the Furious (2017). An entire city’s worth of autonomous cars is hacked.
- NEXUS (2018). A car witnesses a hit and run, then kills the driver when they refuse to hand themselves in.
Talking about self-driving cars
The following insights have been collated from transcripts made during workshops where researchers discussed the above movie and TV depictions of self-driving cars.
It was noted that many of the challenges in these examples are also current challenges.
There is a contradiction: we want to be free, we want to be powerful, and we want to have control, and that is why we offload activities to machines. But at the same time, we want to use that freedom for what we want, not what our devices are trying to tell us. That contradiction, wanting to offload while still keeping control, spans the various narratives.
We are giving over not only the control but also the skillset. Giving up control is one thing, but giving up the skillset makes a lot of people uneasy.
A few of these clips show people overriding the autonomous car. They don’t trust it to do what they want, or they feel challenged by the possibility that the car might be better than them.
Thinking about the trucks in Logan, a lot of the issues we envisage come back to the clashes between humans trying to drive in a space alongside autonomous vehicles. The clip from the 1930s claimed that if all cars were driven by autonomous systems there would be no accidents. That would probably be true, but when you start bringing humans into the mix there are going to be problems. The autonomous systems can’t tell what the humans on the same roads are going to do.
If all cars were to change to autonomous vehicles at once, it would be easy to design for that space, but Tesla and others are talking about the challenges of stepping up to automation, and that is where it becomes difficult.
Alexa will listen for inflections of frustration in the user’s voice and try a different tack; it has learned over the years. There is also the fact that it has been given a female voice, and how that fits into emotion and how we react to things: do we react differently to something with a female voice, and why?
When the topic of emotion comes up, the hurdles of all sorts of dependence on these technologies come with it. The users’ interactions become almost involuntary as they get used to the technology. Then, if that technology has set out to create an emotional connection, what happens when it breaks down or becomes unsupported?
The Love Bug clip had the car being jealous, which is projecting a complex emotion. When we talk about fictional depictions of autonomous vehicles, as we see in Knight Rider as well, there is often a type of general AI that is very unlikely to be present in real autonomous vehicles any time soon.
In the examples where the cars have the most agency, such as Christine or Herbie, they are not autonomous cars; they are magic cars rather than technological ones. Maybe that says something: just how much agency do we want from the technological?
What can we learn from these conversations and fictions of the future?
There seem to be four distinct types of fictional autonomous car.
- Cars that have a personality, general AI, or are magically alive: Knight Rider, Christine, The Love Bug, Transformers, Crash. These are sometimes companions, sometimes nemeses.
- Cars that are doing a job and have little to no interest in anything outside it: Johnny Cab, Logan, Monolith, Westworld, Robocop 2, I, Robot, Batman, Black Mirror: Hated in the Nation. These are often in conflict with the situations the characters find themselves in; a reversal of this appears in Westworld season 3, where an AI character is the one who effectively utilises a self-driving car in a car chase.
- Cars that can be hacked: Demolition Man, The Fate of the Furious. In these cases the narrative purpose of the self-driving car is to be hacked, which then justifies the characters’ distrust of autonomous cars.
- Cars that act on behalf of a perceived injustice: NEXUS, Minority Report, Johnny Cab. Reminiscent of Čapek’s play, the autonomous car makes a moral judgement and sets out to punish the driver or passenger for their infractions.
How autonomous cars are seen as impacting us
There is often reluctance to use the car when it is first introduced, which is usually justified in the narrative either by the larger dialogue around AI or by the car being, in fact, a threat.
There are often conflicts that indirectly threaten a character’s masculinity, or that threaten human skillsets. The majority of depictions are American, where the car is an essential part of life. This may also be why the drivers are exclusively white or male, mostly both, never neither.
Also conspicuous by its absence in these visions of the future is any environmental concern: not a tree in sight in Minority Report, only concrete and highways. There is always the underlying desire for technology to make life easier; when that means automation, conflicts develop with the desire to be in control.
Humans take pride in developing a skillset such as driving. Ideologically there is a self-determinism in being a driver, even a romanticism: the open road, the opportunities laid out in front of you. This probably predates the car, a deeply ingrained metaphor of determining your own path in life.
Drivers could get overly attached to their cars, as cars are designed to engage them emotionally. Drivers will likely be locked into the manufacturer’s value chain rather than into a circular economy, with some manufacturers not allowing owners to repair their own property. Cars could also be designed to act in the company’s best interest, which may not be the same as the user’s, as we saw with the emissions-cheating scandal and with calculations of when to implement a factory recall.
Self-driving cars can be, and have been, hacked. They are at risk of both low-tech attacks, such as trapping them by painting lines on the road, and high-tech ones, such as remotely unlocking, starting, and taking control of a vehicle.
Magical and AI cars are probably the most far-fetched, but they remain an influential idea in research and design. In contrast, the depictions of autonomous vehicles doing their jobs with unintended consequences for humans are more likely to come true. Vehicles may behave unpredictably in novel situations on roads for which they were not trained. Different countries, and even regions, have their own driving styles and rules of the road, written and unwritten. Integrating with human traffic is going to be difficult because people are so unpredictable. If all traffic were automated, it would be easier to design how the cars behave, but that is a long way off.
Their design also assumes a stable society. What unique and life-threatening situations will arise with our changing climate? How will an autonomous car fare escaping forest fires? Storms? A warzone? Cars are for commuting, yes, but they are used for much more; they can save lives. How will they understand our changing world?
The end goal of my research is to write an interactive script that can be used to explore this space and encourage critical engagement with the question: what does it mean to trust an autonomous vehicle? A complicated interplay of human agency versus automation comes into play as I grapple with this question, aiming to create an experience that engages thought and conversation and may ultimately challenge preconceptions.
The Fate of the Furious: Copyright Universal Pictures
Minority Report: Copyright 20th Century Fox
The Love Bug: Copyright Disney
The Safest Place: Copyright Chevrolet