Tag Archives: Wired Magazine

Hackers Remotely Kill a Jeep on the Highway – With Me in It

by Andy Greenberg | Security | 07.21.15 | 6:00 AM

If consumers don’t realize this is an issue, they should, and they should start complaining to carmakers. This might be the kind of software bug most likely to kill someone. – CHARLIE MILLER

 

I WAS DRIVING 70 mph on the edge of downtown St. Louis when the exploit began to take hold.

Though I hadn’t touched the dashboard, the vents in the Jeep Cherokee started blasting cold air at the maximum setting, chilling the sweat on my back through the in-seat climate control system. Next the radio switched to the local hip hop station and began blaring Skee-lo at full volume. I spun the control knob left and hit the power button, to no avail. Then the windshield wipers turned on, and wiper fluid blurred the glass.

As I tried to cope with all this, a picture of the two hackers performing these stunts appeared on the car’s digital display: Charlie Miller and Chris Valasek, wearing their trademark track suits. A nice touch, I thought.

The Jeep’s strange behavior wasn’t entirely unexpected. I’d come to St. Louis to be Miller and Valasek’s digital crash-test dummy, a willing subject on whom they could test the car-hacking research they’d been doing over the past year. The result of their work was a hacking technique—what the security industry calls a zero-day exploit—that can target Jeep Cherokees and give the attacker wireless control, via the Internet, over any of thousands of vehicles. Their code is an automaker’s nightmare: software that lets hackers send commands through the Jeep’s entertainment system to its dashboard functions, steering, brakes, and transmission, all from a laptop that may be across the country.

To better simulate the experience of driving a vehicle while it’s being hijacked by an invisible, virtual force, Miller and Valasek refused to tell me ahead of time what kinds of attacks they planned to launch from Miller’s laptop in his house 10 miles west. Instead, they merely assured me that they wouldn’t do anything life-threatening. Then they told me to drive the Jeep onto the highway. “Remember, Andy,” Miller had said through my iPhone’s speaker just before I pulled onto the Interstate 64 on-ramp, “no matter what happens, don’t panic.”1

Charlie Miller (left) and Chris Valasek (right) hacking into a Jeep Cherokee from Miller’s basement as I drove the SUV on a highway ten miles away. Photo: Whitney Curtis for WIRED

As the two hackers remotely toyed with the air-conditioning, radio, and windshield wipers, I mentally congratulated myself on my courage under pressure. That’s when they cut the transmission.

Immediately my accelerator stopped working. As I frantically pressed the pedal and watched the RPMs climb, the Jeep lost half its speed, then slowed to a crawl. This occurred just as I reached a long overpass, with no shoulder to offer an escape. The experiment had ceased to be fun.

At that point, the interstate began to slope upward, so the Jeep lost more momentum and barely crept forward. Cars lined up behind my bumper before passing me, honking. I could see an 18-wheeler approaching in my rearview mirror. I hoped its driver saw me, too, and could tell I was paralyzed on the highway.

“You’re doomed!” Valasek shouted, but I couldn’t make out his heckling over the blast of the radio, now pumping Kanye West. The semi loomed in the mirror, bearing down on my immobilized Jeep.
I followed Miller’s advice: I didn’t panic. I did, however, drop any semblance of bravery, grab my iPhone with a clammy fist, and beg the hackers to make it stop.

Wireless Carjackers

This wasn’t the first time Miller and Valasek had put me behind the wheel of a compromised car. In the summer of 2013, I drove a Ford Escape and a Toyota Prius around a South Bend, Indiana, parking lot while they sat in the backseat with their laptops, cackling as they disabled my brakes, honked the horn, jerked the seat belt, and commandeered the steering wheel. “When you lose faith that a car will do what you tell it to do,” Miller observed at the time, “it really changes your whole view of how the thing works.” Back then, however, their hacks had a comforting limitation: The attacker’s PC had been wired into the vehicles’ onboard diagnostic port, a feature that normally gives repair technicians access to information about the car’s electronically controlled systems.

A mere two years later, that carjacking has gone wireless. Miller and Valasek plan to publish a portion of their exploit on the Internet, timed to a talk they’re giving at the Black Hat security conference in Las Vegas next month. It’s the latest in a series of revelations from the two hackers that have spooked the automotive industry and even helped to inspire legislation; WIRED has learned that senators Ed Markey and Richard Blumenthal plan to introduce an automotive security bill today to set new digital security standards for cars and trucks, an effort first sparked when Markey took note of Miller and Valasek’s work in 2013.

As an auto-hacking antidote, the bill couldn’t be timelier. The attack tools Miller and Valasek developed can remotely trigger more than the dashboard and transmission tricks they used against me on the highway. They demonstrated as much on the same day as my traumatic experience on I-64. After narrowly averting death by semi-trailer, I managed to roll the lame Jeep down an exit ramp, re-engage the transmission by turning the ignition off and on, and find an empty lot where I could safely continue the experiment.

Miller and Valasek’s full arsenal includes functions that at lower speeds fully kill the engine, abruptly engage the brakes, or disable them altogether. The most disturbing maneuver came when they cut the Jeep’s brakes, leaving me frantically pumping the pedal as the 2-ton SUV slid uncontrollably into a ditch. The researchers say they’re working on perfecting their steering control—for now they can only hijack the wheel when the Jeep is in reverse. Their hack enables surveillance too: They can track a targeted Jeep’s GPS coordinates, measure its speed, and even drop pins on a map to trace its route.

Miller attempts to rescue the Jeep after its brakes were remotely disabled, sending it into a ditch. Photo: Andy Greenberg for WIRED

All of this is possible only because Chrysler, like practically all carmakers, is doing its best to turn the modern automobile into a smartphone. Uconnect, an Internet-connected computer feature in hundreds of thousands of Fiat Chrysler cars, SUVs, and trucks, controls the vehicle’s entertainment and navigation, enables phone calls, and even offers a Wi-Fi hot spot. And thanks to one vulnerable element, which Miller and Valasek won’t identify until their Black Hat talk, Uconnect’s cellular connection also lets anyone who knows the car’s IP address gain access from anywhere in the country. “From an attacker’s perspective, it’s a super nice vulnerability,” Miller says.

 

From that entry point, Miller and Valasek’s attack pivots to an adjacent chip in the car’s head unit—the hardware for its entertainment system—silently rewriting the chip’s firmware to plant their code. That rewritten firmware is capable of sending commands through the car’s internal computer network, known as a CAN bus, to its physical components like the engine and wheels. Miller and Valasek say the attack on the entertainment system seems to work on any Chrysler vehicle with Uconnect from late 2013, all of 2014, and early 2015. They’ve only tested their full set of physical hacks, including ones targeting transmission and braking systems, on a Jeep Cherokee, though they believe that most of their attacks could be tweaked to work on any Chrysler vehicle with the vulnerable Uconnect head unit. They have yet to try remotely hacking into other makes and models of cars.
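
The CAN bus that carries those commands is a simple broadcast network: each frame is just a short identifier plus up to eight data bytes, and any node able to transmit on the bus can emit frames that other ECUs will act on. As a rough illustration of what sending a single frame looks like, here is a minimal sketch using the python-can library over a Linux SocketCAN interface; the arbitration ID and payload are placeholders, not the Jeep commands, which Miller and Valasek have not published.

```python
# Minimal sketch of transmitting one CAN frame via SocketCAN (Linux).
# Requires the python-can package; the arbitration ID and data bytes
# below are hypothetical placeholders, not real Jeep Cherokee commands.
import can

def send_frame(channel: str = "can0") -> None:
    bus = can.interface.Bus(channel=channel, bustype="socketcan")
    msg = can.Message(
        arbitration_id=0x123,            # placeholder ID, not a real ECU address
        data=[0x01, 0x02, 0x03, 0x04],   # placeholder payload
        is_extended_id=False,
    )
    try:
        bus.send(msg)
        print(f"Sent frame {msg.arbitration_id:#x} on {channel}")
    except can.CanError as err:
        print(f"Transmission failed: {err}")
    finally:
        bus.shutdown()

if __name__ == "__main__":
    send_frame()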

After the researchers reveal the details of their work in Vegas, only two things will prevent their tool from enabling a wave of attacks on Jeeps around the world. First, they plan to leave out the part of the attack that rewrites the chip’s firmware; hackers following in their footsteps will have to reverse-engineer that element, a process that took Miller and Valasek months. But the code they publish will enable many of the dashboard hijinks they demonstrated on me as well as GPS tracking.

Second, Miller and Valasek have been sharing their research with Chrysler for nearly nine months, enabling the company to quietly release a patch ahead of the Black Hat conference. On July 16, owners of vehicles with the Uconnect feature were notified of the patch in a post on Chrysler’s website that didn’t offer any details or acknowledge Miller and Valasek’s research. “[Fiat Chrysler Automobiles] has a program in place to continuously test vehicles systems to identify vulnerabilities and develop solutions,” reads a statement a Chrysler spokesperson sent to WIRED. “FCA is committed to providing customers with the latest software updates to secure vehicles against any potential vulnerability.”

Unfortunately, Chrysler’s patch must be manually implemented via a USB stick or by a dealership mechanic. That means many—if not most—of the vulnerable Jeeps will likely stay vulnerable.

Chrysler stated in a response to questions from WIRED that it “appreciates” Miller and Valasek’s work. But the company also seemed leery of their decision to publish part of their exploit. “Under no circumstances does FCA condone or believe it’s appropriate to disclose ‘how-to information’ that would potentially encourage, or help enable hackers to gain unauthorized and unlawful access to vehicle systems,” the company’s statement reads. “We appreciate the contributions of cybersecurity advocates to augment the industry’s understanding of potential vulnerabilities. However, we caution advocates that in the pursuit of improved public safety they not, in fact, compromise public safety.”

The two researchers say that even if their code makes it easier for malicious hackers to attack unpatched Jeeps, the release is nonetheless warranted because it allows their work to be proven through peer review. It also sends a message: Automakers need to be held accountable for their vehicles’ digital security. “If consumers don’t realize this is an issue, they should, and they should start complaining to carmakers,” Miller says. “This might be the kind of software bug most likely to kill someone.”

In fact, Miller and Valasek aren’t the first to hack a car over the Internet. In 2011 a team of researchers from the University of Washington and the University of California at San Diego showed that they could wirelessly disable the locks and brakes on a sedan. But those academics took a more discreet approach, keeping the identity of the hacked car secret and sharing the details of the exploit only with carmakers.
Miller and Valasek represent the second act in a good-cop/bad-cop routine. Carmakers who failed to heed polite warnings in 2011 now face the possibility of a public dump of their vehicles’ security flaws. The result could be product recalls or even civil suits, says UCSD computer science professor Stefan Savage, who worked on the 2011 study. “Imagine going up against a class-action lawyer after Anonymous decides it would be fun to brick all the Jeep Cherokees in California,” Savage says.2

For the auto industry and its watchdogs, in other words, Miller and Valasek’s release may be the last warning before they see a full-blown zero-day attack. “The regulators and the industry can no longer count on the idea that exploit code won’t be in the wild,” Savage says. “They’ve been thinking it wasn’t an imminent danger you needed to deal with. That implicit assumption is now dead.”

471,000 Hackable Automobiles

Miller and Valasek’s exploit uses a burner phone’s cellular connection to attack the Jeep’s Internet-connected entertainment system. Photo: Whitney Curtis for WIRED

Sitting on a leather couch in Miller’s living room as a summer storm thunders outside, the two researchers scan the Internet for victims.

Uconnect computers are linked to the Internet by Sprint’s cellular network, and only other Sprint devices can talk to them. So Miller has a cheap Kyocera Android phone connected to his battered MacBook. He’s using the burner phone as a Wi-Fi hot spot, scouring for targets using its thin 3G bandwidth.

A set of GPS coordinates, along with a vehicle identification number, make, model, and IP address, appears on the laptop screen. It’s a Dodge Ram. Miller plugs its GPS coordinates into Google Maps to reveal that it’s cruising down a highway in Texarkana, Texas. He keeps scanning, and the next vehicle to appear on his screen is a Jeep Cherokee driving around a highway cloverleaf between San Diego and Anaheim, California. Then he locates a Dodge Durango, moving along a rural road somewhere in the Upper Peninsula of Michigan. When I ask him to keep scanning, he hesitates. Seeing the actual, mapped locations of these unwitting strangers’ vehicles—and knowing that each one is vulnerable to their remote attack—unsettles him.
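
The article doesn’t disclose the exact scanning code or the service it probes, but the general pattern is ordinary network reconnaissance: walk through a block of addresses, try to connect to the port a vulnerable service listens on, and record whatever identifying data comes back. The sketch below shows only that generic pattern, with a documentation-only subnet and an invented port, not the real Sprint address space or the Uconnect service.

```python
# Illustrative scanning loop: probe a range of addresses for a listening
# service and record responders. The subnet and port are placeholders,
# not the real Sprint address space or Uconnect service port.
import socket
from ipaddress import ip_network

PLACEHOLDER_SUBNET = "192.0.2.0/28"   # TEST-NET-1, documentation-only addresses
PLACEHOLDER_PORT = 9999               # hypothetical service port

def scan(subnet: str = PLACEHOLDER_SUBNET, port: int = PLACEHOLDER_PORT):
    responders = []
    for host in ip_network(subnet).hosts():
        try:
            with socket.create_connection((str(host), port), timeout=1.0) as sock:
                try:
                    banner = sock.recv(256)   # whatever the service announces, if anything
                except socket.timeout:
                    banner = b""
                responders.append((str(host), banner))
        except OSError:
            continue                          # closed, filtered, or unreachable
    return responders

if __name__ == "__main__":
    for addr, banner in scan():
        print(addr, banner[:60])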

When Miller and Valasek first found the Uconnect flaw, they thought it might only enable attacks over a direct Wi-Fi link, confining its range to a few dozen yards. When they discovered the Uconnect’s cellular vulnerability earlier this summer, they still thought it might work only on vehicles on the same cell tower as their scanning phone, restricting the range of the attack to a few dozen miles. But they quickly found even that wasn’t the limit. “When I saw we could do it anywhere, over the Internet, I freaked out,” Valasek says. “I was frightened. It was like, holy fuck, that’s a vehicle on a highway in the middle of the country. Car hacking got real, right then.”

That moment was the culmination of almost three years of work. In the fall of 2012, Miller, a security researcher for Twitter and a former NSA hacker, and Valasek, the director of vehicle security research at the consultancy IOActive, were inspired by the UCSD and University of Washington study to apply for a car-hacking research grant from Darpa. With the resulting $80,000, they bought a Toyota Prius and a Ford Escape. They spent the next year tearing the vehicles apart digitally and physically, mapping out their electronic control units, or ECUs—the computers that run practically every component of a modern car—and learning to speak the CAN network protocol that controls them.

When they demonstrated a wired-in attack on those vehicles at the DefCon hacker conference in 2013, though, Toyota, Ford, and others in the automotive industry downplayed the significance of their work, pointing out that the hack had required physical access to the vehicles. Toyota, in particular, argued that its systems were “robust and secure” against wireless attacks. “We didn’t have the impact with the manufacturers that we wanted,” Miller says. To get their attention, they’d need to find a way to hack a vehicle remotely.

Charlie Miller. Photo: Whitney Curtis for WIRED
Chris Valasek. Photo: Whitney Curtis for WIRED

So the next year, they signed up for mechanic’s accounts on the websites of every major automaker and downloaded dozens of vehicles’ technical manuals and wiring diagrams. Using those specs, they rated 24 cars, SUVs, and trucks on three factors they thought might determine their vulnerability to hackers: how many and what types of radios connected the vehicle’s systems to the Internet; whether the Internet-connected computers were properly isolated from critical driving systems; and whether those critical systems had “cyberphysical” components—that is, whether digital commands could trigger physical actions like turning the wheel or activating the brakes.
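
Their survey boiled down to a coarse scorecard: give each vehicle a score on each of the three factors, combine them, and rank. The toy example below captures the shape of that exercise; every number in it is invented and does not reflect the researchers’ actual ratings.

```python
# Toy scorecard in the spirit of Miller and Valasek's survey: rank vehicles
# by three factors. All scores below are invented for illustration and do
# not reflect the researchers' actual ratings.
vehicles = {
    "Vehicle A": {"attack_surface": 3, "network_isolation": 1, "cyberphysical": 3},
    "Vehicle B": {"attack_surface": 2, "network_isolation": 2, "cyberphysical": 2},
    "Vehicle C": {"attack_surface": 1, "network_isolation": 3, "cyberphysical": 1},
}

def hackability(scores: dict) -> int:
    # More radios and more cyberphysical reach raise the score;
    # better isolation lowers it.
    return scores["attack_surface"] + scores["cyberphysical"] - scores["network_isolation"]

for name, scores in sorted(vehicles.items(), key=lambda kv: hackability(kv[1]), reverse=True):
    print(f"{name}: hackability score {hackability(scores)}")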

Based on that study, they rated the Jeep Cherokee the most hackable model. Cadillac’s Escalade and Infiniti’s Q50 didn’t fare much better; Miller and Valasek ranked them second- and third-most vulnerable. When WIRED told Infiniti that at least one of Miller and Valasek’s warnings had been borne out, the company responded in a statement that its engineers “look forward to the findings of this [new] study” and will “continue to integrate security features into our vehicles to protect against cyberattacks.” Cadillac emphasized in a written statement that the company has released a new Escalade since Miller and Valasek’s last study, but that cybersecurity is “an emerging area in which we are devoting more resources and tools,” including the recent hire of a chief product cybersecurity officer.

After Miller and Valasek decided to focus on the Jeep Cherokee in 2014, it took them another year of hunting for hackable bugs and reverse-engineering to prove their educated guess. It wasn’t until June that Valasek issued a command from his laptop in Pittsburgh and turned on the windshield wipers of the Jeep in Miller’s St. Louis driveway.

Since then, Miller has scanned Sprint’s network multiple times for vulnerable vehicles and recorded their vehicle identification numbers. Plugging that data into an algorithm of the kind wildlife researchers use in tag-and-recapture studies to estimate animal populations, he calculated that as many as 471,000 vehicles with vulnerable Uconnect systems are on the road.
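
The idea behind that estimate is mark-and-recapture statistics. In its simplest form, the Lincoln-Petersen estimator, you record the VINs seen in one scan, scan again later, and use the size of the overlap to extrapolate the total population: N ≈ (n1 × n2) / m, where n1 and n2 are the number of VINs in each scan and m is the number seen in both. The sketch below uses invented counts, since the article doesn’t publish Miller’s raw scan data.

```python
# Lincoln-Petersen mark-and-recapture estimate, the simplest version of the
# population-sizing idea described above. The VIN sets below are invented
# placeholders, not Miller's actual scan data.
def lincoln_petersen(first_scan: set, second_scan: set) -> float:
    recaptured = len(first_scan & second_scan)
    if recaptured == 0:
        raise ValueError("no overlap between scans; estimate undefined")
    return len(first_scan) * len(second_scan) / recaptured

# Hypothetical example: 1,200 VINs in the first scan, 1,100 in the second,
# 300 seen in both, which extrapolates to roughly 4,400 vehicles in total.
first = {f"VIN{i:05d}" for i in range(1200)}
second = {f"VIN{i:05d}" for i in range(900, 2000)}
print(f"Estimated vulnerable population: {lincoln_petersen(first, second):,.0f}")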

Pinpointing a vehicle belonging to a specific person isn’t easy. Miller and Valasek’s scans reveal random VINs, IP addresses, and GPS coordinates. Finding a particular victim’s vehicle out of thousands is unlikely through the slow and random probing of one Sprint-enabled phone. But enough phones scanning together, Miller says, could allow an individual to be found and targeted. Worse, he suggests, a skilled hacker could take over a group of Uconnect head units and use them to perform more scans—as with any collection of hijacked computers—worming from one dashboard to the next over Sprint’s network. The result would be a wirelessly controlled automotive botnet encompassing hundreds of thousands of vehicles.

“For all the critics in 2013 who said our work didn’t count because we were plugged into the dashboard,” Valasek says, “well, now what?”

Chris Valasek. Photo: Whitney Curtis for WIRED

Congress Takes on Car Hacking

Now the auto industry needs to do the unglamorous, ongoing work of actually protecting cars from hackers. And Washington may be about to force the issue.

Later today, senators Markey and Blumenthal intend to reveal new legislation designed to tighten cars’ protections against hackers. The bill (which a Markey spokesperson insists wasn’t timed to this story) will call on the National Highway Traffic Safety Administration and the Federal Trade Commission to set new security standards and create a privacy and security rating system for consumers. “Controlled demonstrations show how frightening it would be to have a hacker take over controls of a car,” Markey wrote in a statement to WIRED. “Drivers shouldn’t have to choose between being connected and being protected…We need clear rules of the road that protect cars from hackers and American families from data trackers.”

Markey has keenly followed Miller and Valasek’s research for years. Citing their 2013 Darpa-funded research and hacking demo, he sent a letter to 20 automakers, asking them to answer a series of questions about their security practices. The answers, released in February, show what Markey describes as “a clear lack of appropriate security measures to protect drivers against hackers who may be able to take control of a vehicle.” Of the 16 automakers who responded, all confirmed that virtually every vehicle they sell has some sort of wireless connection, including Bluetooth, Wi-Fi, cellular service, and radios. (Markey didn’t reveal the automakers’ individual responses.) Only seven of the companies said they hired independent security firms to test their vehicles’ digital security. Only two said their vehicles had monitoring systems that checked their CAN networks for malicious digital commands.

UCSD’s Savage says the lesson of Miller and Valasek’s research isn’t that Jeeps, or any other specific vehicles, are particularly vulnerable, but that practically any modern vehicle could be vulnerable. “I don’t think there are qualitative differences in security between vehicles today,” he says. “The Europeans are a little bit ahead. The Japanese are a little bit behind. But broadly writ, this is something everyone’s still getting their hands around.”

Miller (left) and Valasek demonstrated the rest of their attacks on the Jeep while I drove it around an empty parking lot. Photo: Whitney Curtis

Aside from wireless hacks used by thieves to open car doors, only one malicious car-hacking attack has been documented: In 2010 a disgruntled employee in Austin, Texas, used a remote shutdown system meant for enforcing timely car payments to brick more than 100 vehicles. But the opportunities for real-world car hacking have only grown, as automakers add wireless connections to vehicles’ internal networks. Uconnect is just one of a dozen telematics systems, including GM’s OnStar, Lexus Enform, Toyota Safety Connect, Hyundai Bluelink, and Infiniti Connection.

In fact, automakers are thinking about their digital security more than ever before, says Josh Corman, the cofounder of I Am the Cavalry, a security industry organization devoted to protecting future Internet-of-things targets like automobiles and medical devices. Thanks to Markey’s letter, and another set of questions sent to automakers by the House Energy and Commerce Committee in May, Corman says, Detroit has known for months that car security regulations are coming.

But Corman cautions that the same automakers have been more focused on competing with each other to install new Internet-connected cellular services for entertainment, navigation, and safety. (Payments for those services also provide a nice monthly revenue stream.) The result is that the companies have an incentive to add Internet-enabled features—but not to secure them from digital attacks. “They’re getting worse faster than they’re getting better,” he says. “If it takes a year to introduce a new hackable feature, then it takes them four to five years to protect it.”

Corman’s group has been visiting auto industry events to push five recommendations: safer design to reduce attack points, third-party testing, internal monitoring systems, segmented architecture to limit the damage from any successful penetration, and the same Internet-enabled security software updates that PCs now receive. The last of those in particular is already catching on; Ford announced a switch to over-the-air updates in March, and BMW used wireless updates to patch a hackable security flaw in door locks in January.
Corman says carmakers need to befriend hackers who expose flaws, rather than fear or antagonize them—just as companies like Microsoft have evolved from threatening hackers with lawsuits to inviting them to security conferences and paying them “bug bounties” for disclosing security vulnerabilities. For tech companies, Corman says, “that enlightenment took 15 to 20 years.” The auto industry can’t afford to take that long. “Given that my car can hurt me and my family,” he says, “I want to see that enlightenment happen in three to five years, especially since the consequences for failure are flesh and blood.”

As I drove the Jeep back toward Miller’s house from downtown St. Louis, however, the notion of car hacking hardly seemed like a threat that will wait three to five years to emerge. In fact, it seemed more like a matter of seconds; I felt the vehicle’s vulnerability, the nagging possibility that Miller and Valasek could cut the puppet’s strings again at any time.

The hackers holding the scissors agree. “We shut down your engine—a big rig was honking up on you because of something we did on our couch,” Miller says, as if I needed the reminder. “This is what everyone who thinks about car security has worried about for years. This is a reality.”

Update 3:30 7/24/2015: Chrysler has issued a recall for 1.4 million vehicles as a result of Miller and Valasek’s research. The company has also blocked their wireless attack on Sprint’s network to protect vehicles with the vulnerable software.

1Correction 10:45 7/21/2015: An earlier version of the story stated that the hacking demonstration took place on Interstate 40, when in fact it was Route 40, which coincides in St. Louis with Interstate 64.

2Correction 1:00pm 7/27/2015: An earlier version of this story referenced a Range Rover recall due to a hackable software bug that could unlock the vehicles’ doors. While the software bug did lead to doors unlocking, it wasn’t publicly determined to be exploitable by hackers.

A New Way for Tech Firms to Fight Orders to Unlock Devices


ALTHOUGH THE FEDERAL government recently backed down on its efforts to compel tech companies to install backdoors on their electronic devices, it doesn’t mean the government has given up on getting access to protected phones and other devices.

A ruling unsealed by a federal magistrate judge in New York last week has shone a light on a legal remedy dating to 1789 that prosecutors have dusted off in an attempt to force companies like Apple to unlock their customers’ devices. Last year, a District Court in California ordered Apple to unlock an iPhone for investigators. Similarly, the District Court in Manhattan ordered an unnamed phone manufacturer to do the same.

But last week a federal magistrate in New York declined to fall in step with the government’s demand to access an Apple device seized by investigators, fanning the flames of a national debate that has been playing out in the media and in the halls of the executive branch with no resolution to date.

Magistrate Judge James Orenstein, in the Eastern District of New York, didn’t reject the request outright, but instead asked Apple to respond this week about whether it would even be technically possible to disable the security lock on the device in question. He also asked Apple if doing so would be unduly burdensome for the company. This may be a moot point if it turns out the device in question is not, as the Washington Post reported this weekend, an iPhone running an older version of iOS, which has built-in capabilities for Apple to unlock it, as opposed to the new iOS 8, which locks even Apple out of devices.

But by burdensome, Orenstein didn’t just mean how much effort Apple would have to expend to unlock the device. He also meant how significant of a cost the company might bear in the marketplace for capitulating to government demands to unlock its customers’ devices.

Apple may have concluded that failing to provide its customers with privacy protection ‘would have long-term costs’ on its business prospects.
This raises interesting new issues around surveillance that have only come into play in the wake of the Edward Snowden revelations and the public’s changing views about government surveillance.

Few details around the New York case are known, since all documents, except the magistrate’s response to the government’s motion to compel, are sealed in the case. But that documented response reveals that the government invoked the All Writs Act to make its case to compel Apple to unlock an unspecified device. The All Writs Act is a part of the 1789 Judiciary Act, which was established centuries ago to give federal courts the power to issue writs when appropriate to compel third parties to help execute another court order—for example, a search warrant. The writs aren’t intended to be an end-run around existing statutes but simply to give courts a tool to enforce existing statutory authorities, particularly when there might be a gap in what the statutes cover.

“The general idea is this is a supplemental authority, and it does allow them to call on third parties to help them execute a search warrant or a valid court order,” says Andrew Crocker, staff attorney with the Electronic Frontier Foundation.

The government asserted in a motion to the New York court that it had the authority, under the All Writs Act, to compel Apple to unlock a device investigators had seized. But Orenstein wasn’t so sure.
He noted that the Supreme Court has asserted that courts can’t use a writ if an existing law already covers the issue at hand. Nor can a writ be used simply when compliance with existing statutory procedures is “inconvenient or less appropriate.” The Supreme Court has also ruled that a court could issue an order to compel only as long as the order did not impose an unreasonable burden on the third party being compelled.

After examining the case at hand, Orenstein concluded that prosecutors were asking the court to give them authority that Congress has so far specifically chosen not to give them—that is, the authority to compel a company to unlock a protected device.

Lawmakers and the public, he noted, are still wrestling with the issue. The fact that no statute currently exists specifically giving courts the authority to compel a company to unlock a device can’t be interpreted as an oversight on the part of lawmakers, or a sign that the courts should step in to fill the gap left by the absence of a statute, Orenstein argued. Instead, the lack of a clear statute seems to indicate lawmakers’ ambivalence on whether such a law compelling companies is appropriate or necessary. Issuing an order to compel Apple to unlock the device would assume an intent on the part of lawmakers that isn’t there.

“[T]he question becomes whether the government seeks to fill in a statutory gap that Congress has failed to consider, or instead seeks to have the court give it authority that Congress chose not to confer,” Orenstein ponders in the document.

In fact, he notes, Sen. Ron Wyden (D-Oregon) and a bipartisan group of Congressional lawmakers introduced bills in 2015 that would specifically preclude the government from forcing a private entity like Apple to compromise data security in the way the government is seeking. Although the bills have not advanced as of yet, they signal at the very least an ambivalence and lack of consensus around granting the authority to compel that prosecutors in this case are seeking.

It’s not the first time Orenstein has pushed back against government surveillance. In 2005, in a different case involving the All Writs Act, he ruled that cell-site location data is protected under the Fourth Amendment and therefore investigators need a warrant to obtain it. Orenstein called the government’s attempt to use the All Writs Act in that case a “Hail Mary play” and denied it on grounds that granting the executive branch authority to use investigative techniques that were explicitly denied it by the legislative branch was inappropriate. A decade later, the government is using the same playbook, and Orenstein is still resisting.

To bolster its request in the current case, the government cited United States v. New York Telephone Co., a 1977 Supreme Court case in which the justices ruled that a court could use the All Writs Act to compel the New York Telephone Company to install a pen register at its facilities to assist in executing a search warrant. The phone company, the court reasoned, was a public utility with a duty to serve that already regularly used pen registers, so no burden would be placed on it by installing the requested surveillance tool. The court also noted that a writ was important because there was no other method for investigators to acquire the information they needed.
Orenstein rejected this argument, however, saying it didn’t fit the present circumstances. Apple, as a private commercial entity, is not a public utility with a duty to serve and is “free to choose to promote its customers’ interest in privacy over the competing interest of law enforcement.” And Apple, unlike the New York Telephone Company, does not own the equipment the government wants to unlock. What’s more, the government can obtain the information it wants in another way—it can compel the device owner, through the court, to unlock the device instead of compelling Apple.

The government argued that Apple has unlocked phones in the past under court order, and therefore, like the New York Telephone Company, it would suffer no burden to do so under another court order.

But the deciding argument may lie in how burdensome it would be for Apple to unlock the device. Orenstein has given Apple until October 15th to respond to his question about whether it is technically feasible to unlock the device without undue burden.

If the device is indeed using an older version of software prior to iOS 8, then there would be no technological hurdle for Apple. But he seems to have left open the possibility for a different kind of burden that unlocking the device might entail—economic and market burden.

Orenstein writes that in the past the burden on a third party had been presumed to be “limited to the physical demands and immediate monetary costs of compliance.” Likewise, he notes, the government in the current case has indicated that Apple is not likely to suffer any unreasonable burden in meeting the request.

“I am less certain,” Orenstein writes in his remarkable conclusion. “The decision to allow consumers to encrypt their devices in such a way that would be resistant to ready law enforcement access was likely one that Apple did not make in haste, or without significant consideration of the competing interests of public safety and the personal privacy and data security of its customers.”

He goes on to say that Apple may have concluded that failing to provide its customers with privacy protection “would have long-term costs” on its business prospects.

It’s possible, Crocker acknowledges, that Apple could argue that even though it has the ability to unlock the phone and in fact has done so in the past, the political environment and public support for surveillance have changed since the device was sold, and by extension the economic consequences of unlocking a device have also changed. Where previously it might not have been a burden to comply, it is now.

“I don’t see why Apple couldn’t raise that argument,” says the EFF’s Crocker. “It sounds like they would get a sympathetic reading from Orenstein [if they did].”

Hackers Can Silently Control Siri From 16 Feet Away


SIRI MAY BE your personal assistant. But your voice is not the only one she listens to. As a group of French researchers have discovered, Siri also helpfully obeys the orders of any hacker who talks to her—even, in some cases, one who’s silently transmitting those commands via radio from as far as 16 feet away.

A pair of researchers at ANSSI, a French government agency devoted to information security, have shown that they can use radio waves to silently trigger voice commands on any Android phone or iPhone that has Google Now or Siri enabled, if it also has a pair of headphones with a microphone plugged into its jack. Their clever hack uses those headphones’ cord as an antenna, exploiting its wire to convert surreptitious electromagnetic waves into electrical signals that appear to the phone’s operating system to be audio coming from the user’s microphone. Without speaking a word, a hacker could use that radio attack to tell Siri or Google Now to make calls and send texts, dial the hacker’s number to turn the phone into an eavesdropping device, send the phone’s browser to a malware site, or send spam and phishing messages via email, Facebook, or Twitter.

‘The sky is the limit here. Everything you can do through the voice interface you can do remotely and discreetly through electromagnetic waves.’
“The possibility of inducing parasitic signals on the audio front-end of voice-command-capable devices could raise critical security impacts,” the two French researchers, José Lopes Esteves and Chaouki Kasmi, write in a paper published by the IEEE. Or as Vincent Strubel, the director of their research group at ANSSI puts it more simply, “The sky is the limit here. Everything you can do through the voice interface you can do remotely and discreetly through electromagnetic waves.”

The researchers’ work, which was first presented at the Hack in Paris conference over the summer but received little notice outside of a few French websites, uses a relatively simple collection of equipment: It generates its electromagnetic waves with a laptop running the open-source software GNU Radio, a USRP software-defined radio, an amplifier, and an antenna. In its smallest form, which the researchers say could fit inside a backpack, their setup has a range of around six and a half feet. In a more powerful form that requires larger batteries and could only practically fit inside a car or van, the researchers say they could extend the attack’s range to more than 16 feet.
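
Conceptually, the transmitter’s job is simple: amplitude-modulate an audio waveform, the spoken command, onto a carrier that couples into the headphone cord, so the phone’s audio front end recovers something that looks like microphone input. The researchers built theirs with GNU Radio and a USRP; the numpy sketch below shows only the modulation step, writing complex baseband samples to a file that an SDR transmit tool could replay. The tone, carrier offset, and sample rate are placeholders, not the parameters ANSSI used.

```python
# Toy amplitude-modulation step: modulate a stand-in "voice" waveform onto
# a carrier and write complex baseband samples to disk. Frequencies and
# rates are illustrative placeholders, not the parameters used by ANSSI.
import numpy as np

SAMPLE_RATE = 1_000_000        # 1 MS/s baseband, placeholder
CARRIER_HZ = 100_000           # placeholder offset within the baseband
DURATION_S = 2.0

t = np.arange(int(SAMPLE_RATE * DURATION_S)) / SAMPLE_RATE

# Stand-in for recorded speech: a 440 Hz tone, normalized to [0, 1].
audio = 0.5 * (1.0 + np.sin(2 * np.pi * 440 * t))

# Classic AM: (1 + m * audio) * carrier, expressed as complex baseband samples.
modulation_index = 0.8
carrier = np.exp(2j * np.pi * CARRIER_HZ * t)
iq = ((1.0 + modulation_index * audio) * carrier).astype(np.complex64)

# Write interleaved 32-bit float I/Q, a format most SDR transmit tools accept.
iq.tofile("am_command_baseband.iq")
print(f"Wrote {iq.size} samples at {SAMPLE_RATE} S/s")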

The experimental setup Kasmi and Esteves used to hijack smartphones’ voice commands with radio waves. JOSÉ LOPES ESTEVES

In a video demonstrating the attack, the researchers commandeer Google Now via radio on an Android smartphone and force the phone’s browser to visit the ANSSI website. (That experiment was performed inside a radio-wave-blocking Faraday cage, the researchers say, to abide by French regulations that forbid broadcasting certain electromagnetic frequencies. But Kasmi and Esteves say that the Faraday cage wasn’t necessary for the attack to work.)

The researchers’ silent voice command hack has some serious limitations: It only works on phones that have microphone-enabled headphones or earbuds plugged into them. Many Android phones don’t have Google Now enabled from their lockscreen, or have it set to only respond to commands when it recognizes the user’s voice. (On iPhones, however, Siri is enabled from the lockscreen by default, with no such voice identity feature.) Another limitation is that attentive victims would likely be able to see that the phone was receiving mysterious voice commands and cancel them before their mischief was complete.

Then again, the researchers contend that a hacker could hide the radio device inside a backpack in a crowded area and use it to transmit voice commands to all the surrounding phones, many of which might be vulnerable and hidden in victims’ pockets or purses. “You could imagine a bar or an airport where there are lots of people,” says Strubel. “Sending out some electromagnetic waves could cause a lot of smartphones to call a paid number and generate cash.”
Although the latest version of iOS now has a hands-free feature that allows iPhone owners to send voice commands merely by saying “Hey Siri,” Kasmi and Esteves say that their attack works on older versions of the operating system, too. iPhone headphones have long had a button on their cord that allows the user to enable Siri with a long press. By reverse engineering and spoofing the electrical signal of that button press, their radio attack can trigger Siri from the lockscreen without any interaction from the user. “It’s not mandatory to have an always-on voice interface,” says Kasmi. “It doesn’t make the phone more vulnerable, it just makes the attack less complex.”

Of course, security conscious smartphone users probably already know that leaving Siri or Google Now enabled on their phone’s login screen represents a security risk. At least in Apple’s case, anyone who gets hands-on access to the device has long been able to use those voice command features to squeeze sensitive information out of the phone—from contacts to recent calls—or even hijack social media accounts. But the radio attack extends the range and stealth of that intrusion, making it all the more important for users to disable the voice command functions from their lock screen.

The ANSSI researchers say they’ve contacted Apple and Google about their work and recommended several fixes: They advise that better shielding on headphone cords would force attackers to use a higher-power radio signal, for instance, or that an electromagnetic sensor in the phone could block the attack. They note that the attack could also be prevented in software, by letting users create their own custom “wake” words that launch Siri or Google Now, or by using voice recognition to block out strangers’ commands. Neither Google nor Apple has yet responded to WIRED’s inquiry about the ANSSI research.

Without the security features Kasmi and Esteves recommend, any smartphone’s voice features could represent a security liability—whether from an attacker with the phone in hand or one that’s hidden in the next room. “To use a phone’s keyboard you need to enter a PIN code. But the voice interface is listening all the time with no authentication,” says Strubel. “That’s the main issue here and the goal of this paper: to point out these failings in the security model.”

Original Source: http://www.wired.com/2015/10/this-radio-trick-silently-hacks-siri-from-16-feet-away/

The Most Wanted Man in the World


BY JAMES BAMFORD

THE MESSAGE ARRIVES on my “clean machine,” a MacBook Air loaded only with a sophisticated encryption package. “Change in plans,” my contact says. “Be in the lobby of the Hotel ______ by 1 pm. Bring a book and wait for ES to find you.” ¶ ES is Edward Snowden, the most wanted man in the world. For almost nine months, I have been trying to set up an interview with him—traveling to Berlin, Rio de Janeiro twice, and New York multiple times to talk with the handful of his confidants who can arrange a meeting. Among other things, I want to answer a burning question: What drove Snowden to leak hundreds of thousands of top-secret documents, revelations that have laid bare the vast scope of the government’s domestic surveillance programs? In May I received an email from his lawyer, ACLU attorney Ben Wizner, confirming that Snowden would meet me in Moscow and let me hang out and chat with him for what turned out to be three solid days over several weeks. It is the most time that any journalist has been allowed to spend with him since he arrived in Russia in June 2013. But the finer details of the rendezvous remain shrouded in mystery. I landed in Moscow without knowing precisely where or when Snowden and I would actually meet. Now, at last, the details are set.


I am staying at the Hotel Metropol, a whimsical sand-colored monument to pre-revolutionary art nouveau. Built during the time of Czar Nicholas II, it later became the Second House of the Soviets after the Bolsheviks took over in 1917. In the restaurant, Lenin would harangue his followers in a greatcoat and Kirza high boots. Now his image adorns a large plaque on the exterior of the hotel, appropriately facing away from the symbols of the new Russia on the next block—Bentley and Ferrari dealerships and luxury jewelers like Harry Winston and Chopard.

I’ve had several occasions to stay at the Metropol during my three decades as an investigative journalist. I stayed here 20 years ago when I interviewed Victor Cherkashin, the senior KGB officer who oversaw American spies such as Aldrich Ames and Robert Hanssen. And I stayed here again in 1995, during the Russian war in Chechnya, when I met with Yuri Modin, the Soviet agent who ran Britain’s notorious Cambridge Five spy ring. When Snowden fled to Russia after stealing the largest cache of secrets in American history, some in Washington accused him of being another link in this chain of Russian agents. But as far as I can tell, it is a charge with no valid evidence.

I confess to feeling some kinship with Snowden. Like him, I was assigned to a National Security Agency unit in Hawaii—in my case, as part of three years of active duty in the Navy during the Vietnam War. Then, as a reservist in law school, I blew the whistle on the NSA when I stumbled across a program that involved illegally eavesdropping on US citizens. I testified about the program in a closed hearing before the Church Committee, the congressional investigation that led to sweeping reforms of US intelligence abuses in the 1970s. Finally, after graduation, I decided to write the first book about the NSA. At several points I was threatened with prosecution under the Espionage Act, the same 1917 law under which Snowden is charged (in my case those threats had no basis and were never carried out). Since then I have written two more books about the NSA, as well as numerous magazine articles (including two previous cover stories about the NSA for WIRED), book reviews, op-eds, and documentaries.

But in all my work, I’ve never run across anyone quite like Snowden. He is a uniquely postmodern breed of whistle-blower. Physically, very few people have seen him since he disappeared into Moscow’s airport complex last June. But he has nevertheless maintained a presence on the world stage—not only as a man without a country but as a man without a body. When being interviewed at the South by Southwest conference or receiving humanitarian awards, his disembodied image smiles down from jumbotron screens. For an interview at the TED conference in March, he went a step further—a small screen bearing a live image of his face was placed on two leg-like poles attached vertically to remotely controlled wheels, giving him the ability to “walk” around the event, talk to people, and even pose for selfies with them. The spectacle suggests a sort of Big Brother in reverse: Orwell’s Winston Smith, the low-ranking party functionary, suddenly dominating telescreens throughout Oceania with messages promoting encryption and denouncing encroachments on privacy.

Of course, Snowden is still very cautious about arranging face-to-face meetings, and I am reminded why when, preparing for our interview, I read a recent Washington Post report. The story, by Greg Miller, recounts daily meetings with senior officials from the FBI, CIA, and State Department, all desperately trying to come up with ways to capture Snowden. One official told Miller: “We were hoping he was going to be stupid enough to get on some kind of airplane, and then have an ally say: ‘You’re in our airspace. Land.’ ” He wasn’t. And since he disappeared into Russia, the US seems to have lost all trace of him.

I do my best to avoid being followed as I head to the designated hotel for the interview, one that is a bit out of the way and attracts few Western visitors. I take a seat in the lobby facing the front door and open the book I was instructed to bring. Just past one, Snowden walks by, dressed in dark jeans and a brown sport coat and carrying a large black backpack over his right shoulder. He doesn’t see me until I stand up and walk beside him. “Where were you?” he asks. “I missed you.” I point to my seat. “And you were with the CIA?” I tease. He laughs.
Snowden is about to say something as we enter the elevator, but at the last moment a woman jumps in so we silently listen to the bossa nova classic “Desafinado” as we ride to an upper floor. When we emerge, he points out a window that overlooks the modern Moscow skyline, glimmering skyscrapers that now overshadow the seven baroque and gothic towers the locals call Stalinskie Vysotki, or “Stalin’s high-rises.” He has been in Russia for more than a year now. He shops at a local grocery store where no one recognizes him, and he has picked up some of the language. He has learned to live modestly in an expensive city that is cleaner than New York and more sophisticated than Washington. In August, Snowden’s temporary asylum was set to expire. (On August 7, the government announced that he’d been granted a permit allowing him to stay three more years.)

Entering the room he has booked for our interview, he throws his backpack on the bed alongside his baseball cap and a pair of dark sunglasses. He looks thin, almost gaunt, with a narrow face and a faint shadow of a goatee, as if he had just started growing it yesterday. He has on his trademark Burberry eyeglasses, semi-rimless with rectangular lenses. His pale blue shirt seems to be at least a size too big, his wide belt is pulled tight, and he is wearing a pair of black square-toed Calvin Klein loafers. Overall, he has the look of an earnest first-year grad student.

Snowden is careful about what’s known in the intelligence world as operational security. As we sit down, he removes the battery from his cell phone. I left my iPhone back at my hotel. Snowden’s handlers repeatedly warned me that, even switched off, a cell phone can easily be turned into an NSA microphone. Knowledge of the agency’s tricks is one of the ways that Snowden has managed to stay free. Another is by avoiding areas frequented by Americans and other Westerners. Nevertheless, when he’s out in public at, say, a computer store, Russians occasionally recognize him. “Shh,” Snowden tells them, smiling, putting a finger to his lips.


DESPITE BEING THE subject of a worldwide manhunt, Snowden seems relaxed and upbeat as we drink Cokes and tear away at a giant room-service pepperoni pizza. His 31st birthday is a few days away. Snowden still holds out hope that he will someday be allowed to return to the US. “I told the government I’d volunteer for prison, as long as it served the right purpose,” he says. “I care more about the country than what happens to me. But we can’t allow the law to become a political weapon or agree to scare people away from standing up for their rights, no matter how good the deal. I’m not going to be part of that.”

Meanwhile, Snowden will continue to haunt the US, the unpredictable impact of his actions resonating at home and around the world. The documents themselves, however, are out of his control. Snowden no longer has access to them; he says he didn’t bring them with him to Russia. Copies are now in the hands of several news organizations, including: First Look Media, set up by journalist Glenn Greenwald and American documentary filmmaker Laura Poitras, the two original recipients of the documents; The Guardian newspaper, which also received copies before the British government pressured it into transferring physical custody (but not ownership) to The New York Times; and Barton Gellman, a writer for The Washington Post. It’s highly unlikely that the current custodians will ever return the documents to the NSA.


Edward Snowden explains in his own words why he decided to reveal secret details of the domestic surveillance being conducted by US intelligence services. PLATON

That has left US officials in something like a state of impotent expectation, waiting for the next round of revelations, the next diplomatic upheaval, a fresh dose of humiliation. Snowden tells me it doesn’t have to be like this. He says that he actually intended the government to have a good idea about what exactly he stole. Before he made off with the documents, he tried to leave a trail of digital bread crumbs so investigators could determine which documents he copied and took and which he just “touched.” That way, he hoped, the agency would see that his motive was whistle-blowing and not spying for a foreign government. It would also give the government time to prepare for leaks in the future, allowing it to change code words, revise operational plans, and take other steps to mitigate damage. But he believes the NSA’s audit missed those clues and simply reported the total number of documents he touched—1.7 million. (Snowden says he actually took far fewer.) “I figured they would have a hard time,” he says. “I didn’t figure they would be completely incapable.”

Asked to comment on Snowden’s claims, NSA spokesperson Vanee Vines would say only, “If Mr. Snowden wants to discuss his activities, that conversation should be held with the US Department of Justice. He needs to return to the United States to face the charges against him.”

Snowden speculates that the government fears that the documents contain material that’s deeply damaging—secrets the custodians have yet to find. “I think they think there’s a smoking gun in there that would be the death of them all politically,” Snowden says. “The fact that the government’s investigation failed—that they don’t know what was taken and that they keep throwing out these ridiculous huge numbers—implies to me that somewhere in their damage assessment they must have seen something that was like, ‘Holy shit.’ And they think it’s still out there.”

Yet it is very likely that no one knows precisely what is in the mammoth haul of documents—not the NSA, not the custodians, not even Snowden himself. He would not say exactly how he gathered them, but others in the intelligence community have speculated that he simply used a web crawler, a program that can search for and copy all documents containing particular keywords or combinations of keywords. This could account for many of the documents that simply list highly technical and nearly unintelligible signal parameters and other statistics.
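
That speculation, and it is only speculation, describes a straightforward tool: walk a document store, test each file against a keyword list, and copy whatever matches. A minimal sketch of that pattern follows; the paths and keywords are invented placeholders.

```python
# Minimal keyword-driven file crawler of the kind speculated about above:
# walk a directory tree and keep files containing any keyword. Paths and
# keywords are invented placeholders for illustration.
import os
import shutil

KEYWORDS = ("example-term-1", "example-term-2")      # placeholders
SOURCE_DIR = "/path/to/document/store"               # placeholder
DEST_DIR = "/path/to/collected"                      # placeholder

def crawl(source: str = SOURCE_DIR, dest: str = DEST_DIR) -> int:
    os.makedirs(dest, exist_ok=True)
    copied = 0
    for root, _dirs, files in os.walk(source):
        for name in files:
            path = os.path.join(root, name)
            try:
                with open(path, "r", errors="ignore") as fh:
                    text = fh.read()
            except OSError:
                continue
            if any(keyword.lower() in text.lower() for keyword in KEYWORDS):
                shutil.copy2(path, dest)
                copied += 1
    return copied

if __name__ == "__main__":
    print(f"Copied {crawl()} matching files")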
And there’s another prospect that further complicates matters: Some of the revelations attributed to Snowden may not in fact have come from him but from another leaker spilling secrets under Snowden’s name. Snowden himself adamantly refuses to address this possibility on the record. But independent of my visit to Snowden, I was given unrestricted access to his cache of documents in various locations. And going through this archive using a sophisticated digital search tool, I could not find some of the documents that have made their way into public view, leading me to conclude that there must be a second leaker somewhere. I’m not alone in reaching that conclusion. Both Greenwald and security expert Bruce Schneier—who have had extensive access to the cache—have publicly stated that they believe another whistle-blower is releasing secret documents to the media.

In fact, on the first day of my Moscow interview with Snowden, the German newsmagazine Der Spiegel comes out with a long story about the NSA’s operations in Germany and its cooperation with the German intelligence agency, BND. Among the documents the magazine releases is a top-secret “Memorandum of Agreement” between the NSA and the BND from 2002. “It is not from Snowden’s material,” the magazine notes.

Some have even raised doubts about whether the infamous revelation that the NSA was tapping German chancellor Angela Merkel’s cell phone, long attributed to Snowden, came from his trough. At the time of that revelation, Der Spiegel simply attributed the information to Snowden and other unnamed sources. If other leakers exist within the NSA, it would be more than another nightmare for the agency—it would underscore its inability to control its own information and might indicate that Snowden’s rogue protest of government overreach has inspired others within the intelligence community. “They still haven’t fixed their problems,” Snowden says. “They still have negligent auditing, they still have things going for a walk, and they have no idea where they’re coming from and they have no idea where they’re going. And if that’s the case, how can we as the public trust the NSA with all of our information, with all of our private records, the permanent record of our lives?”

The Der Spiegel articles were written by, among others, Poitras, the filmmaker who was one of the first journalists Snowden contacted. Her high visibility and expertise in encryption may have attracted other NSA whistle-blowers, and Snowden’s cache of documents could have provided the ideal cover. Following my meetings with Snowden, I email Poitras and ask her point-blank whether there are other NSA sources out there. She answers through her attorney: “We are sorry but Laura is not going to answer your question.”


THE SAME DAY I share pizza with Snowden in a Moscow hotel room, the US House of Representatives moves to put the brakes on the NSA. By a lopsided 293-to-123 tally, members vote to halt the agency’s practice of conducting warrantless searches of a vast database that contains millions of Americans’ emails and phone calls. “There’s no question Americans have become increasingly alarmed with the breadth of unwarranted government surveillance programs used to store and search their private data,” the Democratic and Republican sponsors announce in a joint statement. “By adopting this amendment, Congress can take a sure step toward shutting the back door on mass surveillance.”

It’s one of many proposed reforms that never would have happened had it not been for Snowden. Back in Moscow, Snowden recalls boarding a plane for Hong Kong, on his way to reveal himself as the leaker of a spectacular cache of secrets, and wondering whether the risk would be worth it. “I thought it was likely that society collectively would just shrug and move on,” he says. Instead, the NSA’s surveillance has become one of the most pressing issues in the national conversation. President Obama has personally addressed it, Congress has taken it up, and the Supreme Court has hinted that it may weigh in on warrantless wiretapping. Public opinion has also shifted in favor of curtailing mass surveillance. “It depends a lot on the polling question,” he says, “but if you ask simply about things like my decision to reveal Prism”—the program that allows government agencies to extract user data from companies like Google, Microsoft, and Yahoo—“55 percent of Americans agree. Which is extraordinary given the fact that for a year the government has been saying I’m some kind of supervillain.”

That may be an overstatement, but not by much. Nearly a year after Snowden’s first leaks broke, NSA director Keith Alexander claimed that Snowden was “now being manipulated by Russian intelligence” and accused him of causing “irreversible and significant damage.” More recently, Secretary of State John Kerry said that “Edward Snowden is a coward, he is a traitor, and he has betrayed his country.” But in June, the government seemed to be backing away from its most apocalyptic rhetoric. In an interview with The New York Times, the new head of the NSA, Michael Rogers, said he was “trying to be very specific and very measured in my characterizations”: “You have not heard me as the director say, ‘Oh my God, the sky is falling.’”

Snowden keeps close tabs on his evolving public profile, but he has been resistant to talking about himself. In part, this is because of his natural shyness and his reluctance about “dragging family into it and getting a biography.” He says he worries that sharing personal details will make him look narcissistic and arrogant. But mostly he’s concerned that he may inadvertently detract from the cause he has risked his life to promote. “I’m an engineer, not a politician,” he says. “I don’t want the stage. I’m terrified of giving these talking heads some distraction, some excuse to jeopardize, smear, and delegitimize a very important movement.”

But when Snowden finally agrees to discuss his personal life, the portrait that emerges is not one of a wild-eyed firebrand but of a solemn, sincere idealist who—step by step over a period of years—grew disillusioned with his country and government.

Born on June 21, 1983, Snowden grew up in the Maryland suburbs, not far from the NSA’s headquarters. His father, Lon, rose through the enlisted ranks of the Coast Guard to warrant officer, a difficult path. His mother, Wendy, worked for the US District Court in Baltimore, while his older sister, Jessica, became a lawyer at the Federal Judicial Center in Washington. “Everybody in my family has worked for the federal government in one way or another,” Snowden says. “I expected to pursue the same path.” His father told me, “We always considered Ed the smartest one in the family.” It didn’t surprise him when his son scored above 145 on two separate IQ tests.

Rather than spending hours watching television or playing sports as a kid, Snowden fell in love with books, especially Greek mythology. “I remember just going into those books, and I would disappear with them for hours,” he says. Snowden says that reading about those myths played an important role in his upbringing, providing him with a framework for confronting challenges, including moral dilemmas. “I think that’s when I started thinking about how we identify problems, and that the measure of an individual is how they address and confront those problems,” he says.

Soon after Snowden revealed himself as a leaker, there was enormous media focus on the fact that he quit school after the 10th grade, with the implication that he was simply an uneducated slacker. But rather than delinquency, it was a bout of mononucleosis that caused him to miss school for almost nine months. Instead of falling back a grade, Snowden enrolled in community college. He’d loved computers since he was a child, but now that passion deepened. He started working for a classmate who ran his own tech business. Coincidentally, the company was run from a house at Fort Meade, where the NSA’s headquarters are located.

Snowden was on his way to the office when the 9/11 attacks took place. “I was driving in to work and I heard the first plane hit on the radio,” he says. Like a lot of civic-minded Americans, Snowden was profoundly affected by the attacks. In the spring of 2004, as the ground war in Iraq was heating up with the first battle of Fallujah, he volunteered for the Army special forces. “I was very open to the government’s explanation—almost propaganda—when it came to things like Iraq, aluminum tubes, and vials of anthrax,” he says. “I still very strongly believed that the government wouldn’t lie to us, that our government had noble intent, and that the war in Iraq was going to be what they said it was, which was a limited, targeted effort to free the oppressed. I wanted to do my part.”

Snowden says that he was particularly attracted to the special forces because it offered the chance to learn languages. After performing well on an aptitude test, he was admitted. But the physical requirements were more challenging. He broke both of his legs in a training accident. A few months later he was discharged.

OUT OF THE Army, Snowden landed a job as a security guard at a top-secret facility that required him to get a high-level security clearance. He passed a polygraph exam and the stringent background check and, almost without realizing it, he found himself on his way to a career in the clandestine world of intelligence. After attending a job fair focused on intelligence agencies, he was offered a position at the CIA, where he was assigned to the global communications division, the organization that deals with computer issues, at the agency’s headquarters in Langley, Virginia. It was an extension of the network and engineering work he’d been doing since he was 16. “All of the covert sites—cover sites and so forth—they all network into the CIA headquarters,” he says. “It was me and one other guy who worked the late shifts.” But Snowden quickly discovered one of the CIA’s biggest secrets: Despite its image as a bleeding-edge organization, its technology was woefully out-of-date. The agency was not at all what it appeared to be from the outside.

As the junior man on the top computer team, Snowden distinguished himself enough to be sent to the CIA’s secret school for technology specialists. He lived there, in a hotel, for some six months, studying and training full-time. After the training was complete, in March 2007, Snowden headed for Geneva, Switzerland, where the CIA was seeking information about the banking industry. He was assigned to the US Mission to the United Nations. He was given a diplomatic passport, a four-bedroom apartment near the lake, and a nice cover assignment.

It was in Geneva that Snowden would see firsthand some of the moral compromises CIA agents made in the field. Because spies were promoted based on the number of human sources they recruited, they tripped over each other trying to sign up anyone they could, regardless of their value. Operatives would get targets drunk enough to land in jail and then bail them out—putting the target in their debt. “They do really risky things to recruit them that have really negative, profound impacts on the person and would have profound impacts on our national reputation if we got caught,” he says. “But we do it simply because we can.”

While in Geneva, Snowden says, he met many spies who were deeply opposed to the war in Iraq and US policies in the Middle East. “The CIA case officers were all going, what the hell are we doing?” Because of his job maintaining computer systems and network operations, he had more access than ever to information about the conduct of the war. What he learned troubled him deeply. “This was the Bush period, when the war on terror had gotten really dark,” he says. “We were torturing people; we had warrantless wiretapping.”

He began to consider becoming a whistle-blower, but with Obama about to be elected, he held off. “I think even Obama’s critics were impressed and optimistic about the values that he represented,” he says. “He said that we’re not going to sacrifice our rights. We’re not going to change who we are just to catch some small percentage more terrorists.” But Snowden grew disappointed as, in his view, Obama didn’t follow through on his lofty rhetoric. “Not only did they not fulfill those promises, but they entirely repudiated them,” he says. “They went in the other direction. What does that mean for a society, for a democracy, when the people that you elect on the basis of promises can basically suborn the will of the electorate?”

It took a couple of years for this new level of disillusionment to set in. By that time—2010—Snowden had shifted from the CIA to the NSA, accepting a job as a technical expert in Japan with Dell, a major contractor for the agency. Since 9/11 and the enormous influx of intelligence money, much of the NSA’s work had been outsourced to defense contractors, including Dell and Booz Allen Hamilton. For Snowden, the Japan posting was especially attractive: He had wanted to visit the country since he was a teen. Snowden worked at the NSA offices at Yokota Air Base, outside Tokyo, where he instructed top officials and military officers on how to defend their networks from Chinese hackers.

But Snowden’s disenchantment would only grow. It was bad enough when spies were getting bankers drunk to recruit them; now he was learning about targeted killings and mass surveillance, all piped into monitors at the NSA facilities around the world. Snowden would watch as military and CIA drones silently turned people into body parts. And he would also begin to appreciate the enormous scope of the NSA’s surveillance capabilities, an ability to map the movement of everyone in a city by monitoring their MAC address, a unique identifier emitted by every cell phone, computer, and other electronic device.

Even as his faith in the mission of US intelligence services continued to crumble, his upward climb as a trusted technical expert proceeded. In 2011 he returned to Maryland, where he spent about a year as Dell’s lead technologist working with the CIA’s account. “I would sit down with the CIO of the CIA, the CTO of the CIA, the chiefs of all the technical branches,” he says. “They would tell me their hardest technology problems, and it was my job to come up with a way to fix them.”

But in March 2012, Snowden moved again for Dell, this time to a massive bunker in Hawaii where he became the lead technologist for the information-sharing office, focusing on technical issues. Inside the “tunnel,” a dank, chilly, 250,000-square-foot pit that was once a torpedo storage facility, Snowden’s concerns over the NSA’s capabilities and lack of oversight grew with each passing day. Among the discoveries that most shocked him was learning that the agency was regularly passing raw private communications—content as well as metadata—to Israeli intelligence. Usually information like this would be “minimized,” a process where names and personally identifiable data are removed. But in this case, the NSA did virtually nothing to protect even the communications of people in the US. This included the emails and phone calls of millions of Arab and Palestinian Americans whose relatives in Israel-occupied Palestine could become targets based on the communications. “I think that’s amazing,” Snowden says. “It’s one of the biggest abuses we’ve seen.” (The operation was reported last year by The Guardian, which cited the Snowden documents as its source.)

Another troubling discovery was a document from NSA director Keith Alexander that showed the NSA was spying on the pornography-viewing habits of political radicals. The memo suggested that the agency could use these “personal vulnerabilities” to destroy the reputations of government critics who were not in fact accused of plotting terrorism. The document then went on to list six people as future potential targets. (Greenwald published a redacted version of the document last year on the Huffington Post.)

Snowden was astonished by the memo. “It’s much like how the FBI tried to use Martin Luther King’s infidelity to talk him into killing himself,” he says. “We said those kinds of things were inappropriate back in the ’60s. Why are we doing that now? Why are we getting involved in this again?”

In the mid-1970s, Senator Frank Church, similarly shocked by decades of illegal spying by the US intelligence services, first exposed the agencies’ operations to the public. That opened the door to long-overdue reforms, such as the Foreign Intelligence Surveillance Act. Snowden sees parallels between then and now. “Frank Church analogized it as being on the brink of the abyss,” he says. “He was concerned that once we went in we would never come out. And the concern we have today is that we’re on the brink of that abyss again.” He realized, just like Church had before him, that the only way to cure the abuses of the government was to expose them. But Snowden didn’t have a Senate committee at his disposal or the power of congressional subpoena. He’d have to carry out his mission covertly, just as he’d been trained.

THE SUN SETS late here in June, and outside the hotel window long shadows are beginning to envelop the city. But Snowden doesn’t seem to mind that the interview is stretching into the evening hours. He is living on New York time, the better to communicate with his stateside supporters and stay on top of the American news cycle. Often, that means hearing in almost real time the harsh assessments of his critics. Indeed, it’s not only government apparatchiks who take issue with what Snowden did next—moving from disaffected operative to whistle-blowing dissident. Even in the technology industry, where he has many supporters, some accuse him of playing too fast and loose with dangerous information. Netscape cofounder and prominent venture capitalist Marc Andreessen has told CNBC, “If you looked up in the encyclopedia ‘traitor,’ there’s a picture of Edward Snowden.” Bill Gates delivered a similarly cutting assessment in a Rolling Stone interview. “I think he broke the law, so I certainly wouldn’t characterize him as a hero,” he said. “You won’t find much admiration from me.”

Snowden with General Michael Hayden at a gala in 2011. Hayden, former director of the NSA and CIA, defended US surveillance policies in the wake of Snowden’s revelations.

Snowden adjusts his glasses; one of the nose pads is missing, making them slip occasionally. He seems lost in thought, looking back to the moment of decision, the point of no return. The time when, thumb drive in hand, aware of the enormous potential consequences, he secretly went to work. “If the government will not represent our interests,” he says, his face serious, his words slow, “then the public will champion its own interests. And whistle-blowing provides a traditional means to do so.”

The NSA had apparently never predicted that someone like Snowden might go rogue. In any case, Snowden says he had no problem accessing, downloading, and extracting all the confidential information he liked. Except for the very highest level of classified documents, details about virtually all of the NSA’s surveillance programs were accessible to anyone, employee or contractor, private or general, who had top-secret NSA clearance and access to an NSA computer.

But Snowden’s access while in Hawaii went well beyond even this. “I was the top technologist for the information-sharing office in Hawaii,” he says. “I had access to everything.”

Well, almost everything. There was one key area that remained out of his reach: the NSA’s aggressive cyberwarfare activity around the world. To get access to that last cache of secrets, Snowden landed a job as an infrastructure analyst with another giant NSA contractor, Booz Allen. The role gave him rare dual-hat authority covering both domestic and foreign intercept capabilities—allowing him to trace domestic cyberattacks back to their country of origin. In his new job, Snowden became immersed in the highly secret world of planting malware into systems around the world and stealing gigabytes of foreign secrets. At the same time, he was also able to confirm, he says, that vast amounts of US communications “were being intercepted and stored without a warrant, without any requirement for criminal suspicion, probable cause, or individual designation.” He gathered that evidence and secreted it safely away.

By the time he went to work for Booz Allen in the spring of 2013, Snowden was thoroughly disillusioned, yet he had not lost his capacity for shock. One day an intelligence officer told him that TAO—a division of NSA hackers—had attempted in 2012 to remotely install an exploit in one of the core routers at a major Internet service provider in Syria, which was in the midst of a prolonged civil war. This would have given the NSA access to email and other Internet traffic from much of the country. But something went wrong, and the router was bricked instead—rendered totally inoperable. The failure of this router caused Syria to suddenly lose all connection to the Internet—although the public didn’t know that the US government was responsible. (This is the first time the claim has been revealed.)

Inside the TAO operations center, the panicked government hackers had what Snowden calls an “oh shit” moment. They raced to remotely repair the router, desperate to cover their tracks and prevent the Syrians from discovering the sophisticated infiltration software used to access the network. But because the router was bricked, they were powerless to fix the problem.

Fortunately for the NSA, the Syrians were apparently more focused on restoring the nation’s Internet than on tracking down the cause of the outage. Back at TAO’s operations center, the tension was broken with a joke that contained more than a little truth: “If we get caught, we can always point the finger at Israel.”

MUCH OF SNOWDEN’S focus while working for Booz Allen was analyzing potential cyberattacks from China. His targets included institutions normally considered outside the military’s purview. He thought the work was overstepping the intelligence agency’s mandate. “It’s no secret that we hack China very aggressively,” he says. “But we’ve crossed lines. We’re hacking universities and hospitals and wholly civilian infrastructure rather than actual government targets and military targets. And that’s a real concern.”

The last straw for Snowden was a secret program he discovered while getting up to speed on the capabilities of the NSA’s enormous and highly secret data storage facility in Bluffdale, Utah. Potentially capable of holding upwards of a yottabyte of data, some 500 quintillion pages of text, the 1 million-square-foot building is known within the NSA as the Mission Data Repository. (According to Snowden, the original name was Massive Data Repository, but it was changed after some staffers thought it sounded too creepy—and accurate.) Billions of phone calls, faxes, emails, computer-to-computer data transfers, and text messages from around the world flow through the MDR every hour. Some flow right through, some are kept briefly, and some are held forever.

The massive surveillance effort was bad enough, but Snowden was even more disturbed to discover a new, Strangelovian cyberwarfare program in the works, codenamed MonsterMind. The program, disclosed here for the first time, would automate the process of hunting for the beginnings of a foreign cyberattack. Software would constantly be on the lookout for traffic patterns indicating known or suspected attacks. When it detected an attack, MonsterMind would automatically block it from entering the country—a “kill” in cyber terminology.

Programs like this had existed for decades, but MonsterMind software would add a unique new capability: Instead of simply detecting and killing the malware at the point of entry, MonsterMind would automatically fire back, with no human involvement. That’s a problem, Snowden says, because the initial attacks are often routed through computers in innocent third countries. “These attacks can be spoofed,” he says. “You could have someone sitting in China, for example, making it appear that one of these attacks is originating in Russia. And then we end up shooting back at a Russian hospital. What happens next?”
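Snowden’s objection is easier to see in miniature. The sketch below is a conceptual illustration of an automated block-and-retaliate loop, not MonsterMind itself, whose design has never been made public; the signature names and addresses are invented.

```python
# Conceptual sketch of the spoofing problem Snowden describes -- not the
# actual MonsterMind system. Signatures, addresses, and the packet format
# are hypothetical.
from dataclasses import dataclass

KNOWN_ATTACK_SIGNATURES = {"worm-xyz-handshake"}  # hypothetical pattern names

@dataclass
class Packet:
    source_ip: str   # the address the traffic *claims* to come from
    signature: str   # result of upstream pattern matching

def block(packet: Packet) -> None:
    print(f"blocked traffic matching {packet.signature}")  # the "kill"

def fire_back(target_ip: str) -> None:
    # The dangerous step: retaliation keyed to an address that may be spoofed,
    # so the response can land on an innocent third party.
    print(f"launching automated response against {target_ip}")

def automated_defense(packet: Packet) -> None:
    if packet.signature in KNOWN_ATTACK_SIGNATURES:
        block(packet)                 # defensible: stop the traffic at the border
        fire_back(packet.source_ip)   # the step Snowden objects to: no human in the loop

# A spoofed packet "from" an innocent machine triggers retaliation at that address.
automated_defense(Packet(source_ip="203.0.113.7", signature="worm-xyz-handshake"))
```

The blocking half is conventional intrusion prevention; it is the unattended counterstrike, aimed at whatever address the attacker chose to forge, that turns a defensive tool into a potential act of war.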

In addition to the possibility of accidentally starting a war, Snowden views MonsterMind as the ultimate threat to privacy because, in order for the system to work, the NSA first would have to secretly get access to virtually all private communications coming in from overseas to people in the US. “The argument is that the only way we can identify these malicious traffic flows and respond to them is if we’re analyzing all traffic flows,” he says. “And if we’re analyzing all traffic flows, that means we have to be intercepting all traffic flows. That means violating the Fourth Amendment, seizing private communications without a warrant, without probable cause or even a suspicion of wrongdoing. For everyone, all the time.” (A spokesperson for the NSA declined to comment on MonsterMind, the malware in Syria, or on the specifics of other aspects of this article.)

Given the NSA’s new data storage mausoleum in Bluffdale, its potential to start an accidental war, and the charge to conduct surveillance on all incoming communications, Snowden believed he had no choice but to take his thumb drives and tell the world what he knew. The only question was when.

On March 13, 2013, sitting at his desk in the “tunnel” surrounded by computer screens, Snowden read a news story that convinced him that the time had come to act. It was an account of director of national intelligence James Clapper telling a Senate committee that the NSA does “not wittingly” collect information on millions of Americans. “I think I was reading it in the paper the next day, talking to coworkers, saying, can you believe this shit?”

Snowden and his colleagues had discussed the routine deception around the breadth of the NSA’s spying many times, so it wasn’t surprising to him when they had little reaction to Clapper’s testimony. “It was more of just acceptance,” he says, calling it “the banality of evil”—a reference to Hannah Arendt’s study of bureaucrats in Nazi Germany.

“It’s like the boiling frog,” Snowden tells me. “You get exposed to a little bit of evil, a little bit of rule-breaking, a little bit of dishonesty, a little bit of deceptiveness, a little bit of disservice to the public interest, and you can brush it off, you can come to justify it. But if you do that, it creates a slippery slope that just increases over time, and by the time you’ve been in 15 years, 20 years, 25 years, you’ve seen it all and it doesn’t shock you. And so you see it as normal. And that’s the problem, that’s what the Clapper event was all about. He saw deceiving the American people as what he does, as his job, as something completely ordinary. And he was right that he wouldn’t be punished for it, because he was revealed as having lied under oath and he didn’t even get a slap on the wrist for it. It says a lot about the system and a lot about our leaders.” Snowden decided it was time to hop out of the water before he too was boiled alive.

At the same time, he knew there would be dire consequences. “It’s really hard to take that step—not only do I believe in something, I believe in it enough that I’m willing to set my own life on fire and burn it to the ground.”

But he felt that he had no choice. Two months later he boarded a flight to Hong Kong with a pocket full of thumb drives.

THE AFTERNOON OF our third meeting, about two weeks after our first, Snowden comes to my hotel room. I have changed locations and am now staying at the Hotel National, across the street from the Kremlin and Red Square. Like the Metropol, it is an icon, and much of Russia’s history passed through its front doors at one time or another. Lenin once lived in Room 107, and the ghost of Felix Dzerzhinsky, the feared chief of the old Soviet secret police who also lived here, still haunts the hallways.

But rather than the Russian secret police, it’s his old employers, the CIA and the NSA, that Snowden most fears. “If somebody’s really watching me, they’ve got a team of guys whose job is just to hack me,” he says. “I don’t think they’ve geolocated me, but they almost certainly monitor who I’m talking to online. Even if they don’t know what you’re saying, because it’s encrypted, they can still get a lot from who you’re talking to and when you’re talking to them.”

More than anything, Snowden fears a blunder that will destroy all the progress toward reforms for which he has sacrificed so much. “I’m not self-destructive. I don’t want to self-immolate and erase myself from the pages of history. But if we don’t take chances, we can’t win,” he says. And so he takes great pains to stay one step ahead of his presumed pursuers—he switches computers and email accounts constantly. Nevertheless, he knows he’s liable to be compromised eventually: “I’m going to slip up and they’re going to hack me. It’s going to happen.”

Indeed, some of his fellow travelers have already committed some egregious mistakes. Last year, Greenwald found himself unable to open a large trove of NSA secrets that Snowden had passed to him. So he sent his longtime partner, David Miranda, from their home in Rio to Berlin to get another set from Poitras, who fixed the archive. But in making the arrangements, The Guardian booked a transfer through London. Tipped off, probably as a result of surveillance by GCHQ, the British counterpart of the NSA, British authorities detained Miranda as soon as he arrived and questioned him for nine hours. In addition, an external hard drive containing 60 gigabytes of data—about 58,000 pages of documents—was seized. Although the documents had been encrypted using a sophisticated program known as TrueCrypt, the British authorities discovered a paper of Miranda’s with the password for one of the files, and they were able to decrypt about 75 pages, according to British court documents. *

Another concern for Snowden is what he calls NSA fatigue—the public becoming numb to disclosures of mass surveillance, just as it becomes inured to news of battle deaths during a war. “One death is a tragedy, and a million is a statistic,” he says, mordantly quoting Stalin. “Just as the violation of Angela Merkel’s rights is a massive scandal and the violation of 80 million Germans is a nonstory.”

Nor is he optimistic that the next election will bring any meaningful reform. In the end, Snowden thinks we should put our faith in technology—not politicians. “We have the means and we have the technology to end mass surveillance without any legislative action at all, without any policy changes.” The answer, he says, is robust encryption. “By basically adopting changes like making encryption a universal standard—where all communications are encrypted by default—we can end mass surveillance not just in the United States but around the world.”
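In practice, “encrypted by default” simply means there is no unencrypted path for a message to take. The sketch below illustrates the idea using the third-party Python cryptography library; the key handling and transport are deliberately simplified and hypothetical, and this is an illustration of the principle, not of any particular product Snowden endorses.

```python
# Minimal sketch of "encrypted by default": every message is wrapped in
# authenticated encryption before it leaves the sender. Requires the
# third-party `cryptography` package (pip install cryptography); the
# transport itself is omitted.
from cryptography.fernet import Fernet

# In a real system the key would be negotiated or exchanged out of band;
# it is generated locally here only to keep the example self-contained.
key = Fernet.generate_key()
channel = Fernet(key)

def send(plaintext: str) -> bytes:
    """Encrypt unconditionally -- there is no 'send in the clear' path."""
    return channel.encrypt(plaintext.encode("utf-8"))

def receive(token: bytes) -> str:
    """Decrypt and authenticate the received token."""
    return channel.decrypt(token).decode("utf-8")

ciphertext = send("meet at the usual place")
assert receive(ciphertext) == "meet at the usual place"
print(ciphertext)  # an eavesdropper sees only this opaque token
```

Even so, as Snowden notes about his own communications, encryption hides content rather than metadata: who is talking to whom, and when, can still be observed.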

Until then, Snowden says, the revelations will keep coming. “We haven’t seen the end,” he says. Indeed, a couple of weeks after our meeting, The Washington Post reported that the NSA’s surveillance program had captured much more data on innocent Americans than on its intended foreign targets. There are still hundreds of thousands of pages of secret documents out there—to say nothing of the other whistle-blowers he may have already inspired. But Snowden says that information contained in any future leaks is almost beside the point. “The question for us is not what new story will come out next. The question is, what are we going to do about it?”

*CORRECTION APPENDED [10:55 am, August 22, 2014]: An earlier version of this story incorrectly reported that Miranda retrieved GCHQ documents from Poitras; it also incorrectly stated that Greenwald had not gained access to the complete GCHQ documents.

Original Source:

The Most Wanted Man in the World
BY JAMES BAMFORD
