One week after Hachette Book Group pulled Mia Ballard’s Shy Girl over strong suspicions of AI use, the industry is reeling—and struggling to contend with the implications of the novel’s cancellation.

Ballard’s originally self-published horror novel, which was released by Hachette UK last fall, was set to be published by Hachette’s Orbit imprint in April. For months, readers had been raising alarms about the book’s prose and what they saw as a lack of depth.

"Ballard’s lackluster sophomore outing is a gory and intense attempt at feminist horror that doesn’t have much new to say," PW's review opened.

But it took until last Thursday for Hachette to pull the book from its website, shortly after the New York Times approached the publisher with evidence of AI use.

In a statement about the book’s cancellation, Hachette cited its commitment to protecting “original creative expression and storytelling.” But book industry researcher Rachel Noorda argued that Hachette’s actions reflected market demands more than anything.

"When publishers limit authors from using AI, this tends to reflect a market position rather than purely an ethical principle," she told PW via email. "In the Shy Girl example, Hachette became concerned with the possibility that the book was written with the assistance of Gen AI after reviewers and readers began pushing back against it."

Hachette's concern may have come late, but some speculate the publisher knew from the start about the debates Shy Girl had triggered. In a blog post, Thad McIlroy, a PW contributing editor and industry consultant, said that it "strains credulity to imagine that no one connected with the book at Hachette knew about the online discussions." He added, "the main reason a publisher acquires rights to a self-published book is because of all the online chatter (and the accompanying sales activity)."

While countless readers and authors have spoken out about the debacle in the past week, many publishers have avoided public statements about how they're adapting to the reality of AI-assisted authorship.

Penguin Random House was the sole Big Five publisher to reply to PW's requests for comment, sharing information about its internal process for consulting with authors about their use of generative AI. PRH's author contracts do require "original" work, but given the slipperiness of the word, the publisher has also trained editorial staff on how to "set parameters around A.I. use for authors and illustrators" on an interpersonal level.

Meanwhile, agents, whose job boils down to liaising between authors and publishers, have been outspoken on the subject.

In a conversation with PW, Jennifer Weltz, president of the Jean V. Naggar Literary Agency, suggested that history is doomed to repeat itself unless authors and publishers find a way to talk about AI use.

"Communication and transparency is what I feel was the downfall here, and where we all as an industry need to improve when it comes to AI," Weltz said.

Hannah Bowman, a literary agent at Liza Dawson Associates, agreed that mistrust—between readers and publishers, but also between publishers and authors—is the industry's greatest peril.

"I think that it’s essential for all parties in the publishing process to have transparency and clarity in conversations about how AI tools are being used by any party, especially in the creative process," Bowman, who also chairs the Association of American Literary Agents' AI Special Committee, said.

While it's always newsworthy when a major book contract is broken, the Shy Girl situation has morphed into a scandal largely because the use of AI in creative works is an ethically polarizing issue.

McIlroy, on his blog, emphasized the deep material consequences the situation has had for Ballard. "Hachette threw Mia Ballard under the bus, sullied and cancelled without a chance to defend herself in the court of public opinion," he wrote.

Ballard told the Times that her writing was original, but that a friend editing the book had, unbeknownst to her, employed generative AI. She added that she is "pursuing legal action" and has since wiped her social media presence.

"Accusations of AI are incredibly difficult to prove and will inevitably involve accusations for authors who did not use AI," Noorda told PW. "When accusations of AI use and contract cancellations become more common, how will authors who haven’t used AI prove the 'humanness' of their writing?"

Noorda also highlighted the imbalance of power between authors and publishers, who are increasingly incorporating AI into their workflows.

On a page on its website offering guidance to authors on AI use, Hachette says the company itself uses AI for tasks it defines as "operational" rather than "creative," a distinction that the publisher admits can be "difficult" to define. For example, Noorda drew attention to Hachette's mention of using AI to "generate short marketing text or social media assets to facilitate broadest possible reach for all titles."

The Shy Girl situation is unfolding against a backdrop of ever-multiplying lawsuits against big tech firms for pirating and using copyrighted material to train their AI models. While the first headline cases, such as the $1.5 billion class action settlement with Anthropic, were brought by authors, publishers are now joining in. Last month, Hachette joined an infringement suit against Google that was originally brought by authors and illustrators in 2023.

In the long run, publishers' and authors' material stakes in generative AI development are aligned, and Weltz, for one, is still hopeful that common ground, though ethically craggy, can be found.

"We're in the weeds with it just like everybody else," she said of agents. "There are going to be mistakes made. There's no question. But with each mistake, hopefully we will move forward with a better structure."