Christopher Galt (the pseudonym of Craig Russell) examines the consequences—both positive and negative—of technological advances in his near-future thriller Biblical.

How do you feel about rapidly accelerating technology?

I think this is the most exciting time to be alive. We have progressed more technologically in the last 50 years than in all the rest of human history. I think the benefits of advances in neuroscience, medicine, genetics, artificial intelligence, and computing have all had a massive positive effect on our lives and will continue to do so. Research into the potential uses of graphene, an ultrathin form of carbon, for example, hasn’t yet really registered with the general population, but it’s going to change the lives of everyone on the planet on a massive scale.

Is there a peril?

I am deeply concerned about the use of these technologies in the military field. The military is developing automated weapons, many of which move on legs rather than wheels, that will enter hostile territory and make decisions, based on algorithms, on whether to kill the people they find. I honestly believe that, before there’s any further development, autonomous weapons based on AI should be banned under international convention, much as chemical weapons are.

What do you see as the future of these developments?

The biggest threat to humanity posed by AI and other technologies is paradoxically seen as their biggest benefit, and this is one of the main engines behind Biblical. Some people believe our only hope of surviving the technological singularity is to engage with it fully and to use emergent technologies to enhance ourselves as human beings. Transhumanists believe we actually have a duty to do this and that the next phase in human evolution should be kick-started by humans themselves. To avoid becoming slaves to technologies of our own making, which we will no longer understand, we fuse with the technology. We become posthuman. Others argue, with some logic, that doing so would be self-defeating, and we would lose everything that makes us human. That we would, effectively, evolve humanity into extinction.

Can we anticipate the consequences?

There’s more than a touch of Boiled Frog Syndrome at work. Everyone is aware of the rapid advance of information and artificial intelligence technologies, but only because their TVs get thinner and sharper every year, and their smartphones get smarter. We really are within a decade of seeing driverless cars become the norm rather than the exception, and that’s something that crept up on us really fast. My concern is that other, less beneficent technologies might similarly sneak up on us.