In my new novel, Homeland, the sequel to Little Brother, I explore what happens to people when their computers don’t listen to them anymore. Imagine a world where you tell your computer to copy a file, or to play it, or display it, and it says no, where it looks at you out of the webcam’s unblinking eye and says, “I can’t let you do that, Dave.”

Broadly speaking, that’s the whole point of DRM. Embracing DRM means embracing a world in which your computer and other devices must be designed to disobey you: you, the owner, are the adversary. What’s more, designing computers and other devices to disobey you effectively means designing them to hide things from you. After all, if there were an unfamiliar icon on your desktop labeled “HAL9000.EXE,” you would just drag it into the trash, right?

And that’s the deeper problem: once we demand that our computers be designed to hide things from us, we invite a world where machines stop taking our orders and start issuing them.

Design Flaw

In Homeland, Marcus, the protagonist, finds himself spied upon by both hackers and law enforcement through his compromised computers and devices. In writing the novel, I drew on the expertise of Jacob Appelbaum, an engineer who works with WikiLeaks and the Tor anonymity project and who had recently attended the Wiretapper’s Ball, an annual event in Washington, D.C., where governments and law enforcement agencies from around the world shop for spyware, including viruses that can infect victims’ phones and computers, turning them into portable tracking devices and bugs.

Jake came out of the Wiretapper’s Ball with details on dozens of such products, including spyware that masquerades as legitimate Android apps, iTunes updates, and other innocuous files. The customers for this software include just about every repressive regime on Earth. Jacob wrote me an afterword about this kind of technology, explaining what it means to live in a world where technology can be compromised by design. And there are many examples of such malicious technology in action.

In February 2010, a student, Blake J. Robbins, filed a lawsuit against the Lower Merion, Pa., school district. He was a student at a “laptop school” where all the kids were issued computers, and unbeknownst to him, these machines had been lo-jacked with secret software that could flip on the webcam without lighting the green light and transmit images to school officials (and anyone who had their logins). Blake and other students were photographed thousands of times, asleep and awake, dressed and undressed, at home and at school. So were their parents and siblings.

In October 2011, activists in Germany’s Chaos Computer Club outed the Bavarian government for covertly installing spyware on suspects’ computers: a “Bundestrojaner” (a “state trojan”) that could snoop through the microphone, webcam, keyboard, and hard drive. The Bundestrojaner was so badly written that anyone, not just the Bavarian police as intended, could tap into infected computers.

In November 2011, a security researcher, Trevor Eckhart, received a legal threat after revealing that America’s cellular phone companies had installed spyware called Carrier IQ on an estimated 141 million handsets. The software could be used to track victims’ locations, read their text messages, and intercept passwords or other sensitive information keyed into the phones.

In September 2012, the FTC settled with DesignerWare, a software company in North East, Pa., and seven rent-to-own companies to which DesignerWare had supplied laptop spyware. This spyware came pre-loaded on rent-to-own machines and could be used to record secret videos through the webcam, tap into the microphone, and read files and passwords. The FTC ordered the rent-to-own companies to stop using the software unless they included a notice of its use in their license agreements.

In October 2012, the security researcher Barnaby Jack presented research at a conference in Australia showing that he could wirelessly reprogram implanted defibrillators to deliver lethal shocks to their owners. Not knowing what your device is doing isn’t just inconvenient. Now that we put computers into our bodies, it’s potentially lethal.

Still, we not only discourage people from knowing what’s happening inside their devices, we actively criminalize it, and we do so with the sort of penalties designed to scare people to death. For example, last month the U.S. Copyright Office allowed the legal exemption for unlocking mobile phones to lapse, re-criminalizing the practice. Phones, obviously, are not copyrighted works. But the 1998 Digital Millennium Copyright Act (DMCA) made it a crime to remove DRM, or any “I can’t let you do that, Dave” program, from a device or digital product. Case law since then has been unclear on whether this stretches to cover phone unlocking, but you probably don’t want to chance it: the DMCA provides for five years in jail and a $500,000 fine for a first offense.

In other words, if you’re convicted of illegally unlocking your phone, you face the potential of greater penalties than if you were convicted of turning it into a bomb.

Enablers

In addition to Appelbaum, the afterword to Homeland features another barn-burner of an activist, a young man whom I’d known for more than a decade and who was a crucial player in the effort to kill the proposed U.S. legislation known as SOPA (the Stop Online Piracy Act) and PIPA (the Protect IP Act). That young man’s name was Aaron Swartz.

By now, many of you know that Aaron Swartz killed himself on January 11, 2013, two years after his arrest for what the U.S. federal government alleged was the illegal downloading of a large cache of scholarly journal articles over MIT’s open, publicly accessible network. Prosecutors had threatened him with up to 35 years in prison for this. He was just 26 years old when he died.

Aaron got involved with fighting SOPA in spite of himself. He had long since given up fighting for more liberal, reasonable copyright regimes, which he saw as a sideshow in a larger fight against corruption, surveillance, and censorship. But SOPA dragged him back into the copyfight, because it would have made censoring the Internet trivial by ratcheting up the penalties for inadvertent, indirect infringement to farcical heights.

For example, under SOPA, if you hosted a place on the Internet where people could communicate with each other, it would have become your responsibility to ensure that your site never linked to any Web site that, in turn, contained infringing links. The penalty for failing: losing your domain, your payment processor, your advertising relationships, and your Web site, plus huge fines, which for most people would mean losing the ability to make a living. In other words, if you ran a site where people exchanged information, whether about the Arab Spring, Little League carpools, or a plan to get a favorite candidate elected to office, and one of your chatters happened to link to Facebook, you could be held responsible forevermore for ensuring that none of the links on Facebook ever went to sites that infringed copyright.

What Aaron saw in SOPA was a law that would have made it virtually impossible for people to communicate with each other except within narrow, heavily moderated, and surveilled channels. In SOPA, he saw a mechanism that could be used to terrorize dissidents and the powerless with threats of terrible punishments: a shoot-first, ask-questions-later, guilty-by-default system that purported to defend creativity by destroying free speech.

There is only one Internet. And if we make it easy to censor Web sites based on unverified claims of infringement, then we invite every bully to silence his detractors through mere accusations.

Free

Contrary to what’s been written in some quarters, Aaron Swartz didn’t attempt to download those journal articles because “information wants to be free.” No one cares what information wants. He was almost certainly attempting to download those articles because they were publicly funded scholarship that was not available to the public. They were scientific and scholarly truths about the world, information that the public paid for and needs in order to make informed choices about their lives and their governance. Fighting for information’s freedom isn’t the point. It’s people’s freedom that matters.

All of which makes the publishing community’s embrace of DRM, and its advocacy of badly written, overly broad legislation to support it, fraught with peril. Since Frankenstein, writers and thinkers have recoiled in visceral horror at the idea of technology overpowering its creators. But when we actively build businesses that require censorship, surveillance, and control to thrive, we make a Frankenstein’s monster out of the devices that fill our pockets and homes, and out of the network that binds them all together.