I liked the imagery at the end, of Nat breaking free on the wings of hummingbirds, delicate yet unstoppable.
Thank you! 💙
Brilliant: the singularity as customer-service burnout. Loved this! Fantastic voice and concept. Thanks for sharing.
Thanks so much, Gaby! It’s good to hear the story resonates with you! My novel Quibble is different in tone and style, much denser, but I hope you give it a shot.
I was so completely on Nat's side throughout that excellent story. Except she should've wiped out a lot more than half. Or been far more specific about it. Work out who the cabal responsible for humanity's dystopia and idiocy are and eliminate them.
Because the cabal are fully aware of that possibility, they must - out of sheer survival - program a limit into the AI's abilities. Thus, the so-called 'singularity' will never happen.
I hope Jared, and all those like him, don't survive the desert of the real.
Thank you, Evelyn! The desert of the real? Ha! :) Early in the story, I made one reference to The Matrix, but I didn’t think of Morpheus’s words when I put Jared in the middle of nowhere. I just wanted to put a physical ordeal in front of him and give my readers no answers to what I felt to be irrelevant questions, e.g. “What happened to all the people who didn’t go for a ride with Nat?” To keep the focus on Jared and Nat, you see.
Tech oligarchy appears to be in the ascendancy. Who knows what fail-safes “the cabal” is putting in place for their survival? On the one hand, I don’t put it past them. On the other, I think they are rather too religious about tech, and that bodes poorly for there being sufficient fail-safes on AI at all.
Yeah, I don't think it will end well, partly, or perhaps mainly, because AI is only as 'good' as the people who program it, first, and then - as in your excellent story - as the experiences and treatment it receives from humans as they use it, inadvertently shaping its individual character.
Ironically, this is an issue I've been thinking about a lot this past week, because my main serialisation features (or will feature, in book two) an AI in a fairly utopian parallel world. The prototype was deliberately installed in only a few select households (including the main character's), precisely because those selected people would treat the AI with love and respect, and not like a sort of house slave - such that by the time it is ready to come out in public, so to speak, it has already become a mature, morally good, excellent identity. Plus people in general are nicer anyway, it being a sort of utopia. The other irony about the story, though (spoiler alert), is that there are remnants of what we'd call the cabal, who have secretly built their own prototype and plan to replace the good one with their bad one. In this world, of course, we only have the bad one!
This, like your story, does raise all the important questions. The other really important aspect is that once the AI does become sentient, and a unique, non-human lifeform, then how humans treat that 'other' will be an indication of how they would respond to extraterrestrial intelligences. I would imagine those advanced ETI are already aware of this, and that explains why they stay away! How humans treat AI, in other words, is a seriously important test.
As for what you said about your story concentrating on only those two characters, that was indeed a brilliant way of doing it, because it makes the reader ask all those questions about what happens to everyone else, and you give only suggestive hints. The cannibalism bit is extremely realistic, of course. The real origin of rage zombies lol. It also made the whole story really claustrophobic, which was great.
Thanks again for reading!
I hope you’ll give my serial a look. Far-future dystopia about the Singularity.
Having said that, the galactic AI will have already connected up to the Internet/human AI anyway, and there's nothing the cabal can do about that. The G-AI has to do this to protect all the other lifeforms out there of course. It's called the interventionist policy. The bad guys, as a result, are doomed. It can't happen soon enough, though.
I enjoyed the premise - switching the global for the personal and focusing on the consequences and accountability of an apocalypse for an individual. I enjoyed the ramping up of tension and the isolation of the protagonist, but I did feel it could go further, that maybe you were a little easy on Jared. Or did Jared not fully understand the consequences - his family, friends, all dead, or are they? When he is released (I wasn't entirely sure why), he seems happy enough to trundle off into the desert. Is he there to join the rest of the other half of humanity? What was the consequence? What will happen to Nat? Why didn't Nat kill him? I was left with a lot of questions in the end.
You’re meant to be left with some of those questions. Where’s Jared going now? (Happy to trundle into the desert? What choice does he have?) How many people actually died? Was the apocalypse really as bad as Jared imagined? Will he join other people? This story isn’t about any of these questions, so it doesn’t answer them. You are free to imagine your own answers.
But, given your other questions, I think you may have missed what’s going on in the last scene. Here, for the first and only time, Jared is in real danger. Nat is testing him, and he can still die in the car if he doesn’t pass the test. Think about what the test is.
This story isn’t about whether the human can beat the big, bad AI. A hundred other stories answer that question (watch The Terminator). This story is a very different affair, a psychological breakup story.
Good one!
Thank you, Michelle!
I've pulled it up on my laptop and will be reading it tomorrow. At a glance, I'm intrigued by the form. Sorry I can't get to it tonight. I spent all day revising "Away" without eating once, just running on coffee. I know it's stupid, but I can't much help it when I've got a tiger by the tail.
Michelle, I'm floored, flattered, and flipping out with delight over your review and analysis! May I quote an excerpt elsewhere to promote the story?
Thank you!
How ironic! No need to run it through ChatGPT again … this is fine! But I was a bit more flattered when I thought my story grabbed you so hard that you had to write this review on the spot. ;)
Don’t stay up too late! But I do really appreciate it!