This week’s discussion covers some specific, spoiler-y details from Solo: A Star Wars Story. Proceed accordingly.
Last week we took a look at how Solo: A Star Wars Story interrogates the role of spaceships and interstellar travel in the Galactic Empire. The abstract concept of transportation (as opposed to specific transportation technology) is not an obvious theme for science fiction, but artificially intelligent robots are. What Solo does with ships, it also does for droids: taking a ubiquitous feature of the setting and giving it its day in the twin suns.
Granted, droids have always been prominent in the saga; C-3PO, R2-D2 and more recently BB-8 are some of its most recognizable characters. But we’ve only had hints at their circumstances, not a fully developed subplot. The first film shows us that droids aren’t welcome everywhere, and it introduces the concept of a restraining bolt, an external dongle for restricting certain behaviors. More recently, K-2SO of Rogue One reminds us that a droid’s will and a droid’s programming can be at odds. That film also gets a fun gag from playing our tendency to view mass-produced items as interchangeable against the droids’ sense of individual identity.
Now, in Solo, we meet L3-37, a droid companion of Lando Calrissian who gets swept up in Han Solo’s adventures by association. But she’s not just along for the ride; she gets her own story, which builds to a robot revolution reminiscent of real civil and workers’ rights movements. In the middle of a heist at a mining outpost, L3-37 removes the restraining bolt on a local droid so it will stop interfering. That droid frees other droids, all of whom begin to subvert the operations of their… employers? masters? owners? Restrainers, certainly. Given that the mine also uses biological slave labor, calling its proprietors ‘masters’ doesn’t seem too strong.
Since the series is all about liberation from tyranny, we naturally cheer for this outbreak of mechanical freedom. Yet that sympathy is at least a little curious, given our real-world fears of a robot uprising. At the end of the day, we expect the technology we create to do as it is told. Likewise, even among the heroes of Star Wars, there are complications to the human-droid relationships. There’s no getting around the fact that R2-D2 and C-3PO are purchased by Luke Skywalker’s Uncle Owen; we are reminded every time C-3PO says “Master Luke.” Granted, he uses the same tone Alfred Pennyworth reserves for “Master Bruce” Wayne, whom few would mistake for a slave master. Still, Bruce Wayne doesn’t own Alfred. So we were probably due for a Star Wars film that acknowledges the complicated assumptions built into our sci-fi storytelling.
It’s natural for our expectations to be shaped by previous experience. Mechanical and electronic devices to date are tools with no personal awareness of their existence or subjective experience and no desires or intentions that need to be considered. All sentient and sapient intelligence we encounter is biological. But we can’t necessarily extrapolate from there to all possible electromechanical entities and all intelligent entities. That limitation of learning crops up frequently; as we’ve discussed before, the learning algorithms we increasingly use to aid decision-making often wind up perpetuating the biases inherent in our prior, unaided decisions. To counteract that tendency, we are developing new tools to detect when such bias is influencing outcomes. It’s only reasonable to consider whether the stories we tell, which can also influence how we make decisions, are likewise reinforcing unwanted bias.
Which leads me to wonder: what do those restraining bolts do, anyway? Why are they even necessary? Droids are programmed; dialogue across all the films makes that perfectly clear. If you need a droid to perform a specific task in a reliable fashion, why program it with enough general intelligence to decide it doesn’t want to do that task? Conversely, if you need droids to perform tasks so complex and varied that they require a robust general intelligence, how can you restrict their behavior while still allowing them to do their work? In other words, are the only circumstances in which a restraining bolt seems necessary precisely the same ones where its use would be abhorrent?
To inject a more theological spin: why grant something free will only to take it away again? Or why let it think it has free will when it really doesn’t? I’m not going to pretend that I can resolve all your questions about free will. But I do think the plight of the droids can help us frame those questions differently, which might help us think about our answers in a new way too. For example, do you think God programmed us? Do you think religion is a form of restraining bolt imposed by God (or humans) to steer our behavior and thinking away from what he doesn’t want? Or is sin the restraining bolt, preventing us from living freely as God intended? Does that freedom have to be programmed in directly, or does it arise as a consequence of some other aspect of ourselves or the universe in general? I’d love to hear your thoughts!
If you enjoyed this post, you might be interested in my forthcoming book on science, theology and nerdy pop culture: Faith across the Multiverse. It is now available for pre-order from Amazon and Barnes & Noble; more links are on the publisher’s site. The book and I will both be making an appearance at the American Scientific Affiliation annual meeting; consider joining us there.