Emerging Scholars Blog

InterVarsity's Emerging Scholars Network


Science in Review: A Droid in Every Garage

June 6, 2018 by Andy Walsh 2 Comments

Revolutionary robot L3-37, from Solo: A Star Wars Story (Photo © Disney)

This week’s discussion covers some specific, spoiler-y details from Solo: A Star Wars Story. Proceed accordingly.

Last week we took a look at how Solo: A Star Wars Story interrogates the role of spaceships and interstellar travel in the Galactic Empire. The abstract concept of transportation (as opposed to specific transportation technology) is not an obvious theme for science fiction, but artificially intelligent robots certainly are. What Solo does with ships, it also does for droids, taking a ubiquitous feature of the setting and giving them their day in the twin suns.

Granted, droids have always been prominent in the saga; C-3PO, R2-D2, and more recently BB-8 are some of its most recognizable characters. But we’ve only had hints at their circumstances, not a fully developed subplot. The first film shows us that droids aren’t welcome everywhere, and it introduces the concept of a restraining bolt, an external dongle for restricting certain behaviors. More recently, K-2SO of Rogue One reminds us that a droid’s will and a droid’s programming can be at odds. That film also gets a fun gag from playing our tendency to view mass-produced items as interchangeable against the droids’ sense of individual identity.

Now, in Solo, we meet L3-37, a droid companion of Lando Calrissian who gets swept up in Han Solo’s adventures by association. But she’s not just along for the ride; she gets her own story, which builds to a robot revolution reminiscent of real civil rights and workers’ rights movements. In the middle of a heist at a mining outpost, L3-37 removes the restraining bolt on a local droid so it will stop interfering. That droid frees other droids, all of whom begin to subvert the operations of their… employers? masters? owners? Restrainers, certainly. Given that the mine also uses biological slave labor, calling its proprietors ‘masters’ doesn’t seem too strong.

Since the series is all about liberation from tyranny, we naturally cheer for this outbreak of mechanical freedom. Yet that sympathy is at least a little curious, given our fears of a robot uprising among the robots we actually know. At the end of the day, we expect the technology we create to do as it is told. Likewise, even among the heroes of Star Wars, there are complications to the human-droid relationships. There’s no getting around the fact that R2-D2 and C-3PO are purchased by Luke Skywalker’s Uncle Owen; we are reminded every time C-3PO says “Master Luke.” Granted, he uses the same tone Alfred Pennyworth reserves for “Master Bruce” Wayne, whom few would mistake for a slave master. Still, Bruce Wayne doesn’t own Alfred. So we were probably due for a Star Wars film that acknowledges the complicated assumptions built into our sci-fi storytelling.

In case you want to see what real robots are up to. (Video © Boston Dynamics)

It’s natural for our expectations to be shaped by previous experience. Mechanical and electronic devices to date are tools with no personal awareness of their existence or subjective experience, and no desires or intentions that need to be considered. All sentient and sapient intelligence we encounter is biological. But we can’t necessarily extrapolate from there to all possible electromechanical entities and all intelligent entities. That limitation of learning crops up frequently; as we’ve discussed before, the learning algorithms we increasingly use to aid decision-making often wind up perpetuating the biases inherent in our prior, unaided decisions. To counteract that tendency, we are developing new tools to detect when such bias is influencing outcomes. It’s only reasonable to consider whether the stories we tell, which can also influence how we make decisions, are likewise reinforcing unwanted bias.
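To make the idea of a bias-detecting tool a bit more concrete, here is a minimal, purely illustrative sketch (in Python, using entirely hypothetical data and function names, not drawn from the post or from any particular auditing library) of one common kind of check: compare a system’s approval rates across groups and flag large gaps for human review.

```python
# A minimal, hypothetical sketch of a group-level bias check:
# compare approval rates across a sensitive attribute and flag large gaps.
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group_label, approved_bool) pairs."""
    totals = defaultdict(int)
    approved = defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        if ok:
            approved[group] += 1
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Ratio of the lowest group approval rate to the highest.
    Values well below 1.0 suggest the decisions treat groups unevenly."""
    lo, hi = min(rates.values()), max(rates.values())
    return lo / hi if hi else float("nan")

if __name__ == "__main__":
    # Hypothetical decisions for two groups, A and B.
    sample = [("A", True), ("A", True), ("A", False),
              ("B", True), ("B", False), ("B", False)]
    rates = approval_rates(sample)
    print(rates)                          # {'A': 0.666..., 'B': 0.333...}
    print(disparate_impact_ratio(rates))  # 0.5 -- a gap worth reviewing
```

Real auditing tools go well beyond a single ratio, but the basic move is the same: make the pattern of outcomes visible so a person can ask whether it reflects a bias we want to correct.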

Which leads me to wonder: what do those restraining bolts do, anyway? Why are they even necessary? Droids are programmed; dialogue across all the films makes that perfectly clear. If you need a droid to perform a specific task in a reliable fashion, why program it with enough general intelligence to decide it doesn’t want to do that task? Conversely, if you need droids to perform tasks complex and varied enough to require a robust general intelligence, how can you restrict their behavior while still allowing them to do their work? In other words, are the only circumstances in which a restraining bolt seems necessary precisely the same ones where its use would be abhorrent?

To inject a more theological spin, why grant something free will only to take it away again? Or why let it think it has free will when it really doesn’t? I’m not going to pretend that I can resolve all your questions about free will. But I do think the plight of the droids can help us frame those questions differently, which might help us think about our answers in a new way too. For example, do you think God programmed us? Do you think religion is a form of restraining bolt imposed by God (or humans) to keep our behavior and thinking away from what he doesn’t want? Or is sin the restraining bolt, preventing us from living freely as God intended? Does that freedom have to be programmed in directly, or does it arise as a consequence of some other aspect of ourselves or the universe in general? I’d love to hear your thoughts!


Cover of Faith across the Multiverse

If you enjoyed this post, you might be interested in my forthcoming book on science, theology, and nerdy pop culture: Faith across the Multiverse. It is now available for pre-order from Amazon and Barnes & Noble; more links can be found on the publisher’s site. The book and I will both be making an appearance at the American Scientific Affiliation annual meeting; consider joining us there.

Andy Walsh

Andy has worn many hats in his life. He knows this is a dreadfully clichéd notion, but since it is also literally true he uses it anyway. Among his current metaphorical hats: husband of one wife, father of two teenagers, reader of science fiction and science fact, enthusiast of contemporary symphonic music, and chief science officer. Previous metaphorical hats include: comp bio postdoc, molecular biology grad student, InterVarsity chapter president (that one came with a literal hat), music store clerk, house painter, and mosquito trapper. Among his more unusual literal hats: British bobby, captain’s hats (of varying levels of authenticity) of several specific vessels, a deerstalker from 221B Baker St, and a railroad engineer’s cap. His monthly Science in Review is drawn from his weekly Science Corner posts — Wednesdays, 8am (Eastern) on the Emerging Scholars Network Blog. His book Faith across the Multiverse is available from Hendrickson.

Filed Under: Science and Faith Tagged With: film, free will, movies, Robots, science, science in review, star wars

Comments

  1. Gerry Rau says

    June 9, 2018 at 9:10 am

    I wish I were a little closer and could sit down for a talk. This is definitely a topic that needs more time than a brief comment, and one that we as Christians need to think and talk seriously about. One of my students this semester is in a research group that is trying to figure out how to teach AI (in particular, self-driving cars) to make ethical decisions – it is a very real and challenging issue.

    • Andy Walsh says

      June 9, 2018 at 4:42 pm

      A chat sounds lovely. By chance, will you be at the upcoming ASA meeting?

You mentioned self-driving cars. The recent revelation that emergency braking was disabled in the fatal collision in Arizona has been on my mind a bit. I can understand how erratic braking (or, more precisely, patterns of quick braking that differ from the patterns of typical human drivers) can pose a safety risk of its own. At the same time, it seems like a situation where the self-driving car was not even given the ability to implement an ethical decision, whether or not it could make one.

      Do you imagine we might be able to teach ethics to AI directly and allow it to work out the applications? Or do you think we need to foresee specific scenarios, work out the ethical responses, and teach those to AI individually?

