All Systems Red (2017)
- All page numbers are from the trade paperback first edition published by Tor, ISBN 978-0-7653-9753-9
- Won the 2018 Hugo, Nebula, and Locus awards for best novella
- That sounded like a great plan, in that it didn’t involve me.
- Chapter 2 (p. 32)
- They’re academics, surveyors, researchers, not action-hero explorers from the serials I liked because they were unrealistic and not depressing and sordid like reality.
- Chapter 3 (p. 50)
Artificial Condition (2018)
- All page numbers are from the hardcover first edition published by Tor, ISBN 978-1-250-18692-8
- Won the 2019 Hugo and Locus awards for best novella, and was nominated for the 2019 Nebula
- SecUnits don’t care about the news…mostly because the news was boring and I didn’t care what humans were doing to each other as long as I didn’t have to a) stop it or b) clean up after it.
- Chapter 1 (p. 9; ellipsis represents the elision of several minor reasons)
- I would defer to your expertise in shooting and killing things. You should defer to mine in data analysis.
- Chapter 4 (p. 54)
- Yes, the giant transport bot is going to help the construct SecUnit pretend to be human. This will go well.
- Chapter 4 (p. 60)
- I am terrible at estimating human ages because it’s not one of the few things I care about. Also most of my experience is with the humans on the entertainment feed, and they aren’t anything like the ones you see in reality. (One of the many reasons I’m not fond of reality.)
- Chapter 4 (p. 64)
- Humans are nervous of me because I’m a terrifying murderbot, and I’m nervous of them because they’re humans.
- Chapter 4 (pp. 65–66)
- Young humans can be impulsive. The trick is keeping them around long enough to become old humans. This is what my crew tells me and my own observations seem to confirm it.
- Chapter 7 (p. 125)
- I was only 97 percent certain this meeting was a trap.
- Chapter 7 (p. 138)
Rogue Protocol (2018)
- All page numbers are from the hardcover first edition published by Tor, ISBN 978-1-250-19178-6
- There needs to be an error code that means “I received your request but decided to ignore you.”
- Chapter 1 (p. 10)
- They were all annoying and deeply inadequate humans, but I didn’t want to kill them. Okay, maybe a little.
- Chapter 1 (p. 13)
- I know that’s actually not a permanent solution and pretending bad things aren’t happening is not a great survival strategy in the long run, but there was nothing I could do about it now.
- Chapter 5 (pp. 112–113)
- This was another reason I didn’t like human security consultants. Some of them enjoyed their job too much.
- Chapter 5 (p. 119)
Exit Strategy (2018)
- All page numbers are from the hardcover first edition published by Tor, ISBN 978-1-250-19185-4
- Possibly I was overthinking this. I do that; it’s the anxiety that comes with being a part-organic murderbot. The upside was paranoid attention to detail. The downside was also paranoid attention to detail.
- Chapter 1 (p. 14)
- Hard work really did make you improve; who knew?
- Chapter 1 (p. 15)
- The station approach traffic was heavy, and we were showing a twenty-seven-minute docking delay. Twenty-seven minutes was more than enough time for me to do something stupid.
- Chapter 2 (p. 38)
- Real humans don’t act like the ones in the media.
- Chapter 3 (p. 56)
- The company is like an evil vending machine, you put money in and it does what you want, unless somebody else puts more money in and tells it to stop.
- Chapter 3 (p. 67)
- This lobby was on multiple levels and had large square biozones depicting different ecologies, with furniture arranged around them. It looked nice, inviting humans to sit around and discuss proprietary information in the hotel’s choked feed so the hotel could record it and sell it to the highest bidder.
- Chapter 4 (p. 76)
- Disinformation, which is the same as lying but for some reason has a different name, is the top tactic in corporate negotiation/warfare.
- Chapter 5 (p. 96)
- “I don’t want to be human.”
Dr. Mensah said, “That’s not an attitude a lot of humans are going to understand. We tend to think that because a bot or a construct looks human, its ultimate goal would be to become human.”
“That’s the dumbest thing I’ve ever heard.”
- Chapter 8 (pp. 154–155)
- It was very dramatic, like something out of a historical adventure serial. Also correct in every aspect except for all the facts, like something out of a historical adventure serial.
- Chapter 8 (p. 158)