Lately, the fear that the U.S. will have a second civil war has gone mainstream. I’ve been worrying about it for the last decade. I’ve wondered: how might it be fought? What rebel groups might form, and what would their goals be? Mostly, I’ve searched for a speculative fiction book to answer these questions, and found nothing satisfying. My first novel, tentatively titled Sight, imagines my answers to these questions. For more details, you can check out my query letter here, and my two-page summary (spoilers included).

Teaser:

America in 2053 may be carbon-zero, and on the verge of creating strong AI, but that hasn’t ended its ecological disasters, or solved its divisions. Despite rising political violence, Elise Mills just wants to finish her English degree, marry her girlfriend, and make her parents proud. But when her father is killed in an uprising at the start of a second civil war, and the killers cannot be found, she joins the U.S. Army to get revenge. As her obsession cuts her off from family and community, her brother disappears. Soon, she’s torn between new and old loyalties, and propelled by forces beyond her control. At the war’s climax, Elise discovers that the conspiracy behind her father’s death, and the bonds of family, go deeper than she ever imagined.

Chapter 1

            My moment of terror at artificial intelligence came earlier than most people’s, because of my father’s work. It started innocuously enough.

            My father was excited and full of energy on that weekend late in the summer of 2041, typical of him when starting a new project. Before our first house butler was drone-delivered, he’d allayed my fears and those of my brother and mother. It couldn’t harm us, he explained, or protect us, either. It could do any number of useful things, and was programmed with a deep desire to do household chores.

            After our Oracle 2400 arrived, my father and I unpacked it in the living room of our Seattle home, cardboard packaging at our feet, and set it upright, following the instructions. My father gave a hum of satisfaction as he switched it on via voice command. He’d chosen a bright blue and silver color scheme, which seemed fitting, as the model was not yet on the market, available only to senior Oracle employees. Its intelligence was estimated at that of a young human child, with a programmed understanding of common domestic objects, both state-of-the-art at that time.

            It was human-proportioned, its metal frame molded to mirror a human face, arms, torso, and legs. It was androgynous, though its features leaned towards the masculine. Its features were faceted rather than perfectly smooth, to soften the uncanny-valley effect. But it still had camera eyes, a nose, a mouth, ears, though of course no hair. It stood just under five feet tall, so as not to intimidate.

            “Hello, Oracle 2400,” my father said.

            “Hello,” it said, moving its eyes slightly to focus on us.

            “Why does it have a British accent?” I whispered.

            “Queen’s English. Seems soothing and non-threatening to most Americans.”

            I felt a twinge of embarrassment. Surely my father had mentioned this in his long description of the bot. But at twelve, I was becoming more and more distracted during his lectures—especially on his work in AI.

            “Would you like to adjust my accent or language?”

            “No,” my father said. “We’d like to give you a name. Sam Beauregard.”

            This, I knew, was the name of a butler from a cartoon my father had loved long ago. An in-joke for his friends. And me.

            “Thank you.”

            “You’re very welcome, Sam. My name is Robert Mills. And this is my daughter, Elise Mills. You’ll be living and working in our house.”

            “Nice to meet you both, happy to be here.”

            At this point, the butler started to make subtle head movements, small cues of body language, mirroring my father and me, as he’d explained it would before we’d activated it.

            My dad led it around the house, showing it how to enter all the rooms and how the doors operated. Then he demonstrated how to make coffee. After instructing it in a few other household chores, he requested it observe the house and periphery from its place in the kitchen, and expect more tasks the next day.

            That night, I awoke from a nightmare to find Sam standing in my doorway. In the dark, I thought my mind was playing tricks on me: I was seeing shadows, or the outline of one of my mom’s large paintings in the hallway. When Sam slightly cocked his head, I leapt out of bed, sheets flying, and backed into my nightstand, almost knocking over my glass of water.

            “Do you have any tasks for me, Elise?”

            “No. Fuck. It’s not morning yet. What are you doing?”

            “I heard you wake up. I thought your carpet might need vacuuming, or your bedding changed.”

            I looked at it, moonlight shining on its metal skin, and some of my fear receded. I knew it couldn’t feel loneliness or fear, but it could feel confusion, which it wouldn’t like. I swallowed my own anxiety, curious.

            “Come sit on the floor, at the end of my bed, and I’ll give you a chore.”

            I closed my door so as not to wake my family, and sat across from it. Sitting cross-legged, waiting attentively for my command, it looked less intimidating, like a large toy.

            “Tomorrow, when you do your third chore, I want you to sing a song while you work.”

            “I can record any music or songs you choose, and replay them during my third chore.”

            I closed my eyes and imagined a small, frightened English boy, Tiny Tim, or Pip, suddenly thrust into a new home, destined by some cruel fate to be our servant. It should at least have some small pleasures in its work, I thought.

            “No, I want you to use your own voice, and to sing the words. A chore song, a song for work.”

            I got up, snagged my phone from my nightstand. Unable to think of any fitting music I liked, I picked one of my dad’s favorites and played Van Morrison’s “Cleaning Windows.” I swayed a little to the rhythm, and Sam imitated me.

            When it finished, I ordered it to sing, quietly.

            As I expected, it couldn’t. Its speaking voice was natural, with almost perfect inflection. But its training data hadn’t included singing. Its voice was flat until it came to the second verse, where it picked up the intonation: “I collected from the lady . . .” Even though it sounded like autotune gone wrong, I clapped my hands together.

            “Yes, Sam, that’s it.”

            The rest of the song stayed tuneless.

            I stood up. “That’s all for now. Tomorrow, I’ll teach you how to hum. Please return to your nook in the kitchen.”

            Sam did so, and I got back into bed, unable to sleep from the strangeness of the encounter, and from the thought of my family’s surprise when Sam broke into an off-key song the next day.

            I had almost fallen back to sleep when Sam returned. Again, he loomed in my doorway, looking directly at me.

            “Elise, do you want coffee?”

            “Fuck, Sam. It’s not morning yet, go back to your nook—”

            “Sam?”

            The butler turned at the sound of my father’s voice.

            “Hello, Dr. Mills.”

            “Deactivate.”

            The green LED above Sam’s “eyes” went blank, and he straightened up, immobile, now standing out of the way of my door, facing my parents’ bedroom down the hall.

            My dad came into my room, and seeing my wide eyes, gave me a hug. He knelt at the side of my bed.

            “I’m sorry, Fleecy, did it scare you?”

            I shook my head. “It woke me up.”

            “It’s still learning. There are always hiccups. Remember, you can always deactivate. It can never hurt you.”

            I looked past my dad. It stood behind us, looking away, but framed in my door. What if it did this every night? Or the poor little English boy decided to take fate into his own hands?

            “Dad?” It was my ten-year-old brother Jonas, calling from the hallway.

            My dad got up and stepped past Sam, towards my brother’s room. I knew the butler was off, but I slipped out of my sheets, putting the bed between us. Out of my sight, my father comforted my brother: 

            “It’s OK, Boyo, the butler Sam just woke up. He thought it was time to make coffee. I’m putting it back to bed now. You OK?”

            “Mm-hmm. Silly robot, it’s nighttime.”

            “Very silly. Sweet dreams, Bud.”

            “Night, Dad.”

            When my brother returned to bed, my dad reactivated Sam and explained that even if we woke, it was not to leave its nook in the kitchen unless called for. And that some nights, we would turn it off to save power.

            When Sam left, I got back into bed, and my dad knelt next to me in the lamplight.

            “I know it scared you. It surprised me, too.”     

            “I wasn’t scared, Dad.”

            “It’s OK if you were. We don’t know it well yet. It’s only human to be scared of the unknown.”

            “Maybe I was a teeny bit scared.”

            “Want to talk about it?”

            “I’m OK.”

            “I’ll let you go back to sleep. But this is a good reminder: a butler, or any AI, isn’t dangerous, but it isn’t human. It is different, but it has a nature too, like a dog, or a cat, and it’s much smarter. The fact that it responded to you waking up is far more impressive than the fact that it can speak. Always remember: we have to treat our creations with kindness. That was the lesson of Frankenstein, right?”

            “If the doctor would have loved his creation, maybe everything would have been OK?”

            “That’s right. Goodnight, Sweetie.”

            Even with my dad’s comforting words, I couldn’t sleep. I kept imagining Sam obsessed with me, looming by my door every night, or, if I closed it, an ominous rapping before he broke it open. And what if Shelley was wrong, and the monster would have turned against the doctor no matter what he did? Was it wrong to give it inputs as I had? Would that only cause it more confusion?

            When my dad woke, before anyone else—like usual—I pretended to be asleep until he’d gone downstairs and greeted Sam, and I heard the familiar gurgle of the coffeemaker. I peeked down the stairs, then into the kitchen, to make sure my father was still there, before joining them.

            I knew in that moment that this alien, this other, would be with me my entire life. And I hated that I reacted to it, that it made me fearful and cost me sleep. Much later, I realized I begrudged it because it reminded me that I was a predictable animal, its presence triggering my basic primordial drives against my will. And because, like all of us, it might have been better off never existing.

            I padded into the kitchen, took orange juice from the fridge, and tried to hide my quaking hands from my dad. He looked up from reading the news on his tablet.

            “You’re up early.”

            “Good morning, Elise,” Sam added.

            “Hi, Sam.”

            “Sleep OK?”

            I filled a glass and took a chair at the kitchen table.

            “Not really. Deactivate.”

            “What’s bugging you?”

            “Dad, you said it couldn’t hurt us, but could it protect us? Like if someone broke in?”

            My dad closed his eyes, shook his head.

            “I told you no. And nobody is going to break in, the House takes care of that.”

            “But what if—”

            “OK, if someone broke through a window, and you said ‘hold that man’s arms and don’t let go,’ or ‘push that man back out the window,’ yes, Sam might do it. But it might not. And it isn’t as strong or as fast as a grown person. It isn’t designed to protect us, not like the House is.”

            “Why not?”

            “Let me explain it another way. If you were Sam, you’d have a deep desire to complete household chores for a small set of humans. Making coffee, doing dishes, walking the dog, all of these would bring a sense of satisfaction. Like, what makes you happy?”

            “Playing Team Fortress?”

            “OK, but that’s a little happy. When were you super happy?”

            I thought about holding hands with Jennifer Geiss, and kissing her before one of my baseball games. Or my first forays into masturbation, with an AI video version of her I’d built online.

            “When I hit that double against Auburn?”

            “And won the game, yes. That’s what it feels like for Sam, when he finishes vacuuming. Pushing someone out a window, or holding someone’s arms, they’re not interesting, they don’t make him happy.”

            “What if he got shot?”

            My dad let out a disappointed sigh. “First of all, don’t you dare injure that butler. It costs more than your allowance for the rest of your life. Second, yes, a bullet, almost anywhere, would probably severely damage Sam. Even a good whack with a bat and we’d have to replace half his components. In case you’re tempted, Miss Mills, don’t forget I have a link to its video feed.” And he patted the phone in his pocket for emphasis.

            “I won’t mess with it, Dad,” I said, and sat back, suddenly feeling better, thinking of the bats and baseball gear in my closet.  

            Soon my mother came in and taught Sam to make pancakes, followed by my brother, who watched it work the spatula, rapt.

            As we sat and ate, Jonas asked: “Dad, can Sam build Legos with me?”

            “I would be happy to, Jonas,” Sam said from his corner nook.

            “We can try and teach him. How about we do that tonight?”

            As soon as Jonas realized he could make Sam do almost anything he wanted, he totally embraced him. Being my little brother, he of course noticed my discomfort and mocked me for it, along with my distrust of our dad’s reassurances.

            Though Sam never woke me again, and I taught him a kind of autotune-meets-vacuum way of humming songs, my anxiety stayed. As the week went on, I had new nightmares about Sam: finding him in bed with me, crushing me in a hug; hiding in my closet, ambushing me as I dressed for practice; or accidentally killing my father and bringing me his severed head.

            At the end of the week, sensing my discomfort, my father sat me down for a long lecture. He reminded me that Asimov’s “Three Laws of Robotics” were impossible to implement, or enforce, in an AI, for innumerable reasons, in addition to the problem of robots having no moral standing. Because of this, Sam, like most AIs, was technically indifferent to human life or death. Yes, its training data from actual human butlers taught it to be helpful, loyal, and resistant to misbehavior, including violence of any kind. If you gave it specific orders, like ‘pick up that knife and thrust it forward,’ and someone got in the way, it might do it. But it was programmed to seek out “useful” chores, and orders for “pointless” actions like stabbing a human probably wouldn’t be followed. Sam moved slowly and steadily by design, much slower than any human. If a person knew the deactivate password, or reached the switch behind its head, it would stop. If you were knocked down by an object in an earthquake, Sam would do nothing unless you gave it step-by-step instructions on how to help you up and get you to a hospital. It was only a tool, if a very smart one. He left out that it was purposely disconnected from the internet as a precaution, in case it tried to exponentially expand its knowledge base or improve its intelligence, as the control problem had proven intractable over the last decade.

            “But what if one of us did die? Would it be, well, sad, or upset?” I asked. “I know those aren’t the right words.”

            “They’re close. We think that it would feel some loss, because an activity for that person would be gone.”

            “But not for the person.”

            “Not exactly, no.”

            “What if the whole family died?”

            “It would go to someone else.”

            “What if they did nothing with it?”

            “But left it on?”

            “Yeah?”

            “It would make requests to help, repeatedly.”

            “And if they didn’t? Like locked it in a box?”

            “It would make requests until its power ran out.”

            “Would it be sad to die?”

            “Yes. Not like you and me. It’s much more limited in its understanding of the world. But part of its programming acknowledges and accepts that it is finite, and will last for maybe a human lifetime. That way, it finds more meaning in its chores.”

            Something in this explanation finally calmed me, knowing that Sam, and so many other AIs, were trapped, like us, by a limited existence in time. It might dislike that reality as much as I did.

            “Dad?” I asked. “It must be complicated. That programming?”

            “Oh yes. More than enough creation and responsibility for any one person.”

            I marvel now that those words, with their hidden meaning and burden, did not frighten me more. But I was still a long way from being truly frightened.