TECHSPLOITATION War changes everything, including technology. We are roughly six years into what the George W. Bush administration calls the war on terror and what hundreds of thousands of soldiers know as the occupation of Iraq. Gizmos that a decade ago would have been viewed entirely as communications tools and toys are now potential surveillance and killing machines.
Don't believe me? Consider how much the Web has changed. Referred to naively 10 years ago by Bill Clinton and Co. as the friendly, welcoming "information superhighway," the Web is now the National Security Agency's surveillance playground. Last year a whistle-blower at AT&T revealed that every bit of Internet traffic routed by AT&T was also being routed through an NSA surveillance system. Millions of innocent people's private Internet information, including online purchases and e-mail, was being watched without warrants.
Cuddly consumer robots epitomized by Sony's Aibo robot dog have changed too. The company that makes adorable Roomba vacuum robots, iRobot, just announced a huge deal with the United States military to make reconnaissance and killing robots called PackBots for use in combat zones. Already, 50 PackBots have been deployed in Iraq and Afghanistan. These are the ground versions of uncrewed aerial vehicles, the remote-controlled spy planes that can also fire weapons.
Tech security expert Bruce Schneier describes technology as having "dual uses": one for peacetime and one for war. The Wii video game console, for example, is great for translating physical movements into movements onscreen. That makes the Wii great for party games in which you swing your arms to move dancing penguins on the screen. It also makes a great interface for remote-controlled guns in a combat robot. Just move your arm to aim.
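To see how thin the line between the two uses is, here's a toy sketch of the kind of mapping a motion interface performs. Everything in it is invented for illustration; it is not the Wii's actual API, just a minimal model of turning accelerometer deltas into on-screen offsets. Whether those offsets steer a dancing penguin or a gun turret is entirely up to whatever consumes them.

```python
# Toy model of a motion-controller interface (hypothetical, not the Wii API).
# Raw accelerometer deltas (arbitrary units) are linearly scaled into
# screen-pixel offsets. The mapping itself is use-agnostic: the same
# numbers could drive a game cursor or a remote weapon's aim point.

def map_motion_to_screen(dx, dy, sensitivity=10.0):
    """Translate accelerometer deltas into screen-pixel offsets."""
    return (dx * sensitivity, dy * sensitivity)

# A swing of the arm to the right, slightly downward:
offset = map_motion_to_screen(1.5, -0.2)
print(offset)  # -> (15.0, -2.0)
```

The point of the sketch is that the "dual use" lives outside the interface code: nothing in the mapping knows or cares what it is aiming.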
In a time of war you can't enjoy a party game without thinking about your game console being used to kill people. I realize that sounds melodramatic, but looked at pragmatically it's quite simply true.
Once you realize that every form of technology has a dual use, it becomes much easier to argue for ways of limiting the uses that aren't ethical or legal. Consider that a roboticized antiaircraft cannon (similar to the PackBot) turned on its operators during a field exercise in South Africa in October 2007, killing nine people before it ran out of ammo. The software error that led this robot to slaughter friendly soldiers is no different from errors that make your Roomba crash. What do we draw from this analogy? Perhaps robots that are perfectly legal as vacuums should be illegal on the battlefield. Perhaps no weapon should ever be completely autonomous like the Roomba.
Questions like these led me and my colleagues at Computer Professionals for Social Responsibility to put together a conference at Stanford University on the topic of technology in wartime, focusing especially on ethics and the law. Coming up on Jan. 26, the conference will be a day packed with talks and panels about everything from dual-use technology (Schneier will be a keynote speaker) to what happens when robots commit war crimes. We'll also hear from people who are appropriating military technologies for human rights causes: the very technologies that let military spies hide online also help human rights workers and dissidents shield themselves while still getting out their subversive messages.
We'll also have a panel on so-called cyberterrorism, or destructive hacks aimed at taking down a nation's tech infrastructure. But should fears of cyberterror lead to total government surveillance of the Internet?