Category Archives: Privacy & Security

Greenwald Meets Bernstein: From Watergate to Snowden

Glenn Greenwald and Carl Bernstein discuss how NSA surveillance has affected contemporary investigative journalism with Fredrik Laurin, a journalist at Swedish Radio. Greenwald and Bernstein review the U.S. government's history of placing journalists, activists, and whistleblowers under surveillance. Greenwald describes how the Snowden revelations have changed the precautions investigative journalists must take to protect their sources (and themselves), as well as the Obama administration's practice of prosecuting whistleblowers. Bernstein explores how institutional secrecy has increased since Watergate and suggests that there is much less oversight of intelligence abuses than in the past.

NSA Program Stopped No Terror Attacks, Says White House Panel Member

A member of the White House review panel on NSA surveillance said he was “absolutely” surprised when he discovered the agency’s lack of evidence that the bulk collection of telephone call records had thwarted any terrorist attacks.

“It was, ‘Huh, hello? What are we doing here?’” said Geoffrey Stone, a University of Chicago law professor, in an interview with NBC News. “The results were very thin.”

While Stone said the mass collection of telephone call records was a “logical program” from the NSA’s perspective, one question the White House panel was seeking to answer was whether it had actually stopped “any [terror attacks] that might have been really big.”

“We found none,” said Stone.

Under the NSA program, first revealed by ex-contractor Edward Snowden, the agency collects in bulk the records of the time and duration of phone calls made by persons inside the United States.

Stone was one of five members of the White House review panel – and the only one without any intelligence community experience – that this week produced a sweeping report recommending that the NSA’s collection of phone call records be terminated to protect Americans’ privacy rights.

The panel made that recommendation after concluding that the program was “not essential in preventing attacks.”

“That was stunning. That was the ballgame,” said one congressional intelligence official, who asked not to be publicly identified. “It flies in the face of everything that they have tossed at us.”

Despite the panel’s conclusions, Stone strongly rejected the idea they justified Snowden’s actions in leaking the NSA documents about the phone collection. “Suppose someone decides we need gun control and they go out and kill 15 kids and then a state enacts gun control?” Stone said, using an analogy he acknowledged was “somewhat inflammatory.” What Snowden did, Stone said, was put the country “at risk.”

“My emphatic view,” he said, “is that a person who has access to classified information — the revelation of which could damage national security — should never take it upon himself to reveal that information.”

Stone added, however, that he would not necessarily reject granting an amnesty to Snowden in exchange for the return of all his documents, as was recently suggested by a top NSA official. “It’s a hostage situation,” said Stone. Deciding whether to negotiate with him to get all his documents back was a “pragmatic judgment. I see no principled reason not to do that.”

The conclusions of the panel's report were directly at odds with public statements by President Barack Obama and U.S. intelligence officials. "Lives have been saved," Obama told reporters last June, referring to the bulk collection program and another program that intercepts communications overseas. "We know of at least 50 threats that have been averted because of this information."

But in one little-noticed footnote in its report, the White House panel said the telephone records collection program – known as Section 215, based on the provision of the U.S. Patriot Act that provided the legal basis for it – had made “only a modest contribution to the nation’s security.” The report said that “there has been no instance in which NSA could say with confidence that the outcome [of a terror investigation] would have been any different” without the program.

The panel's findings echoed those of U.S. District Judge Richard Leon, who in a ruling this week found the bulk collection program to be unconstitutional. Leon said that government officials were unable to cite "a single instance in which analysis of the NSA's bulk metadata collection actually stopped an imminent attack, or otherwise aided the Government in achieving any objective that was time-sensitive in nature."

Stone declined to comment on the accuracy of public statements by U.S. intelligence officials about the telephone collection program, but said that when they referred to successes they seemed to be mixing the results of domestic metadata collection with the intelligence derived from the separate, and less controversial, NSA program, known as 702, to intercept communications overseas.

The comparison between 702 overseas interceptions and 215 bulk metadata collection was “night and day,” said Stone. “With 702, the record is very impressive. It’s no doubt the nation is safer and spared potential attacks because of 702. There was nothing like that for 215. We asked the question and they [the NSA] gave us the data. They were very straight about it.”

He also said one reason the telephone records program is not effective is that, contrary to the claims of critics, it does not actually collect a record of every American's phone calls. Although the NSA does collect metadata from major telecommunications carriers such as Verizon and AT&T, there are many smaller carriers from which it collects nothing. Asked if the NSA was collecting the records of 75 percent of phone calls, an estimate that has been used in briefings to Congress, Stone said the real number was classified but "not anything close to that" and far lower.

When panel members asked NSA officials why they didn’t expand the program to include smaller carriers, the answer they gave was “money,” Stone said. “They were setting financial priorities,” said Stone, and that was “really revealing” about how useful the bulk collection of telephone calls really was.

An NSA spokeswoman declined to comment on any aspect of the panel’s report, saying the agency was deferring to the White House. Asked Wednesday about the surveillance panel’s conclusions about telephone record collection, White House press secretary Jay Carney said that “the president does still believe and knows that this program is an important piece of the overall efforts that we engage in to combat threats against the lives of American citizens and threats to our overall national security.”

Related:

In NSA-Intercepted Data, Those Not Targeted Far Outnumber the Foreigners Who Are

Files provided by Snowden show extent to which ordinary Web users are caught in the net

July 5, 2014

Ordinary Internet users, American and non-American alike, far outnumber legally targeted foreigners in the communications intercepted by the National Security Agency from U.S. digital networks, according to a four-month investigation by The Washington Post.

Nine of 10 account holders found in a large cache of intercepted conversations, which former NSA contractor Edward Snowden provided in full to The Post, were not the intended surveillance targets but were caught in a net the agency had cast for somebody else.

Many of them were Americans. Nearly half of the surveillance files, a strikingly high proportion, contained names, e-mail addresses or other details that the NSA marked as belonging to U.S. citizens or residents. NSA analysts masked, or "minimized," more than 65,000 such references to protect Americans' privacy, but The Post found nearly 900 additional e-mail addresses, unmasked in the files, that could be strongly linked to U.S. citizens or U.S. residents.

The surveillance files highlight a policy dilemma that has been aired only abstractly in public. There are discoveries of considerable intelligence value in the intercepted messages — and collateral harm to privacy on a scale that the Obama administration has not been willing to address.

Among the most valuable contents — which The Post will not describe in detail, to avoid interfering with ongoing operations — are fresh revelations about a secret overseas nuclear project, double-dealing by an ostensible ally, a military calamity that befell an unfriendly power, and the identities of aggressive intruders into U.S. computer networks.

Months of tracking communications across more than 50 alias accounts, the files show, led directly to the 2011 capture in Abbottabad of Muhammad Tahir Shahzad, a Pakistan-based bomb builder, and Umar Patek, a suspect in a 2002 terrorist bombing on the Indonesian island of Bali. At the request of CIA officials, The Post is withholding other examples that officials said would compromise ongoing operations.

Many other files, described as useless by the analysts but nonetheless retained, have a startlingly intimate, even voyeuristic quality. They tell stories of love and heartbreak, illicit sexual liaisons, mental-health crises, political and religious conversions, financial anxieties and disappointed hopes. The daily lives of more than 10,000 account holders who were not targeted are catalogued and recorded nevertheless.

In order to allow time for analysis and outside reporting, neither Snowden nor The Post has disclosed until now that he obtained and shared the content of intercepted communications. The cache Snowden provided came from domestic NSA operations under the broad authority granted by Congress in 2008 with amendments to the Foreign Intelligence Surveillance Act. FISA content is generally stored in closely controlled data repositories, and for more than a year, senior government officials have depicted it as beyond Snowden’s reach.

The Post reviewed roughly 160,000 intercepted e-mail and instant-message conversations, some of them hundreds of pages long, and 7,900 documents taken from more than 11,000 online accounts.

The material spans President Obama’s first term, from 2009 to 2012, a period of exponential growth for the NSA’s domestic collection.

[Photo caption: A composite image of two of the more than 5,000 private photos among data collected by the National Security Agency from online accounts and network links in the United States. The images were included in a large cache of NSA intercepts provided by former agency contractor Edward Snowden. (Images obtained by The Washington Post)]

Taken together, the files offer an unprecedented vantage point on the changes wrought by Section 702 of the FISA amendments, which enabled the NSA to make freer use of methods that for 30 years had required probable cause and a warrant from a judge. One program, code-named PRISM, extracts content stored in user accounts at Yahoo, Microsoft, Facebook, Google and five other leading Internet companies. Another, known inside the NSA as Upstream, intercepts data on the move as it crosses the U.S. junctions of global voice and data networks.

No government oversight body, including the Justice Department, the Foreign Intelligence Surveillance Court, intelligence committees in Congress or the president’s Privacy and Civil Liberties Oversight Board, has delved into a comparably large sample of what the NSA actually collects — not only from its targets but also from people who may cross a target’s path.

Among the latter are medical records sent from one family member to another, résumés from job hunters and academic transcripts of schoolchildren. In one photo, a young girl in religious dress beams at a camera outside a mosque.

Scores of pictures show infants and toddlers in bathtubs, on swings, sprawled on their backs and kissed by their mothers. In some photos, men show off their physiques. In others, women model lingerie, leaning suggestively into a webcam or striking risque poses in shorts and bikini tops.

“None of the hits that were received were relevant,” two Navy cryptologic technicians write in one of many summaries of nonproductive surveillance. “No additional information,” writes a civilian analyst. Another makes fun of a suspected kidnapper, newly arrived in Syria before the current civil war, who begs for employment as a janitor and makes wide-eyed observations about the state of undress displayed by women on local beaches.

By law, the NSA may “target” only foreign nationals located overseas unless it obtains a warrant based on probable cause from a special surveillance court. For collection under PRISM and Upstream rules, analysts must state a reasonable belief that the target has information of value about a foreign government, a terrorist organization or the spread of nonconventional weapons.

Most of the people caught up in those programs are not the targets and would not lawfully qualify as such. “Incidental collection” of third-party communications is inevitable in many forms of surveillance, but in other contexts the U.S. government works harder to limit and discard irrelevant data. In criminal wiretaps, for example, the FBI is supposed to stop listening to a call if a suspect’s wife or child is using the phone.

There are many ways to be swept up incidentally in surveillance aimed at a valid foreign target. Some of those in the Snowden archive were monitored because they interacted directly with a target, but others had more-tenuous links.

If a target entered an online chat room, the NSA collected the words and identities of every person who posted there, regardless of subject, as well as every person who simply “lurked,” reading passively what other people wrote.

“1 target, 38 others on there,” one analyst wrote. She collected data on them all.

In other cases, the NSA designated as its target the Internet protocol, or IP, address of a computer server used by hundreds of people.

The NSA treats all content intercepted incidentally from third parties as permissible to retain, store, search and distribute to its government customers. Raj De, the agency’s general counsel, has testified that the NSA does not generally attempt to remove irrelevant personal content, because it is difficult for one analyst to know what might become relevant to another.

The Obama administration declines to discuss the scale of incidental collection. The NSA, backed by Director of National Intelligence James R. Clapper Jr., has asserted that it is unable to make any estimate, even in classified form, of the number of Americans swept in. It is not obvious why the NSA could not offer at least a partial count, given that its analysts routinely pick out “U.S. persons” and mask their identities, in most cases, before distributing intelligence reports.

If Snowden’s sample is representative, the population under scrutiny in the PRISM and Upstream programs is far larger than the government has suggested. In a June 26 “transparency report,” the Office of the Director of National Intelligence disclosed that 89,138 people were targets of last year’s collection under FISA Section 702. At the 9-to-1 ratio of incidental collection in Snowden’s sample, the office’s figure would correspond to nearly 900,000 accounts, targeted or not, under surveillance.
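
The arithmetic behind that estimate is easy to check. A back-of-the-envelope sketch, using only the two figures reported above:

```python
# Back-of-the-envelope check of the estimate above, using only the two
# figures reported in the story: the ODNI's 89,138 Section 702 targets
# and the 9-to-1 ratio of incidental to targeted account holders in the
# cache The Post reviewed.
targets = 89_138             # ODNI "transparency report" figure
incidental_per_target = 9    # 9 of 10 account holders were not targets

accounts_implied = targets * (1 + incidental_per_target)
print(f"{accounts_implied:,}")  # 891,380 -- i.e., "nearly 900,000" accounts
```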

‘He didn’t get this data’
U.S. intelligence officials declined to confirm or deny in general terms the authenticity of the intercepted content provided by Snowden, but they made off-the-record requests to withhold specific details that they said would alert the targets of ongoing surveillance. Some officials, who declined to be quoted by name, described Snowden’s handling of the sensitive files as reckless.

In an interview, Snowden said “primary documents” offered the only path to a concrete debate about the costs and benefits of Section 702 surveillance. He did not favor public release of the full archive, he said, but he did not think a reporter could understand the programs “without being able to review some of that surveillance, both the justified and unjustified.”

“While people may disagree about where to draw the line on publication, I know that you and The Post have enough sense of civic duty to consult with the government to ensure that the reporting on and handling of this material causes no harm,” he said.

In Snowden’s view, the PRISM and Upstream programs have “crossed the line of proportionality.”

“Even if one could conceivably justify the initial, inadvertent interception of baby pictures and love letters of innocent bystanders,” he added, “their continued storage in government databases is both troubling and dangerous. Who knows how that information will be used in the future?”

For close to a year, NSA and other government officials have appeared to deny, in congressional testimony and public statements, that Snowden had any access to the material.

As recently as May, shortly after he retired as NSA director, Gen. Keith Alexander denied that Snowden could have passed FISA content to journalists.

“He didn’t get this data,” Alexander told a New Yorker reporter. “They didn’t touch —”

“The operational data?” the reporter asked.

“They didn’t touch the FISA data,” Alexander replied. He added, “That database, he didn’t have access to.”

Robert S. Litt, the general counsel for the Office of the Director of National Intelligence, said in a prepared statement that Alexander and other officials were speaking only about “raw” intelligence, the term for intercepted content that has not yet been evaluated, stamped with classification markings or minimized to mask U.S. identities.

“We have talked about the very strict controls on raw traffic, the training that people have to have, the technological lockdowns on access,” Litt said. “Nothing that you have given us indicates that Snowden was able to circumvent that in any way.”

In the interview, Snowden said he did not need to circumvent those controls, because his final position as a contractor for Booz Allen at the NSA’s Hawaii operations center gave him “unusually broad, unescorted access to raw SIGINT [signals intelligence] under a special ‘Dual Authorities’ role,” a reference to Section 702 for domestic collection and Executive Order 12333 for collection overseas. Those credentials, he said, allowed him to search stored content — and “task” new collection — without prior approval of his search terms.

“If I had wanted to pull a copy of a judge’s or a senator’s e-mail, all I had to do was enter that selector into XKEYSCORE,” one of the NSA’s main query systems, he said.

The NSA has released an e-mail exchange acknowledging that Snowden took the required training classes for access to those systems.

‘Minimized U.S. president’
At one level, the NSA shows scrupulous care in protecting the privacy of U.S. nationals and, by policy, those of its four closest intelligence allies — Britain, Australia, Canada and New Zealand.

More than 1,000 distinct “minimization” terms appear in the files, attempting to mask the identities of “possible,” “potential” and “probable” U.S. persons, along with the names of U.S. beverage companies, universities, fast-food chains and Web-mail hosts.

Some of them border on the absurd, using titles that could apply to only one man. A “minimized U.S. president-elect” begins to appear in the files in early 2009, and references to the current “minimized U.S. president” appear 1,227 times in the following four years.
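
For readers unfamiliar with the mechanics, minimization is essentially a substitution pass over report text: identities judged to be U.S. persons are swapped for generic labels before a report is distributed. Here is a minimal sketch of that step in Python; the names and term list are entirely hypothetical, since the agency's actual tooling and term lists are classified.

```python
# Toy illustration of "minimization": masking references marked as U.S.
# persons before a report is distributed. All identities and labels here
# are invented; the NSA's real term lists and tooling are classified.
MINIMIZATION_TERMS = {
    "jane.doe@example.com": "minimized U.S. person",
    "Barack Obama": "minimized U.S. president",
    "Example University": "minimized U.S. university",
}

def minimize(report: str) -> str:
    """Replace each flagged identity with its generic minimization label."""
    for identity, label in MINIMIZATION_TERMS.items():
        report = report.replace(identity, f"[{label}]")
    return report

print(minimize("Target forwarded mail from jane.doe@example.com about Barack Obama."))
# -> Target forwarded mail from [minimized U.S. person] about [minimized U.S. president].
```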

Even so, unmasked identities remain in the NSA’s files, and the agency’s policy is to hold on to “incidentally” collected U.S. content, even if it does not appear to contain foreign intelligence.

In one exchange captured in the files, a young American asks a Pakistani friend in late 2009 what he thinks of the war in Afghanistan. The Pakistani replies that it is a religious struggle against 44 enemy states.

Startled, the American says “they, ah, they arent heavily participating . . . its like . . . in a football game, the other team is the enemy, not the other teams waterboy and cheerleaders.”

“No,” the Pakistani shoots back. “The ther teams water boy is also an enemy. it is law of our religion.”

“haha, sorry thats kind of funny,” the American replies.

When NSA and allied analysts really want to target an account, their concern for U.S. privacy diminishes. The rationales they use to judge foreignness sometimes stretch legal rules or well-known technical facts to the breaking point.

In their classified internal communications, colleagues and supervisors often remind the analysts that PRISM and Upstream collection have a “lower threshold for foreignness ‘standard of proof’ ” than a traditional surveillance warrant from a FISA judge, requiring only a “reasonable belief” and not probable cause.

One analyst rests her claim that a target is foreign on the fact that his e-mails are written in a foreign language, a quality shared by tens of millions of Americans. Others are allowed to presume that anyone on the chat “buddy list” of a known foreign national is also foreign.

In many other cases, analysts seek and obtain approval to treat an account as “foreign” if someone connects to it from a computer address that seems to be overseas. “The best foreignness explanations have the selector being accessed via a foreign IP address,” an NSA supervisor instructs an allied analyst in Australia.

Apart from the fact that tens of millions of Americans live and travel overseas, additional millions use simple tools called proxies to redirect their data traffic around the world, for business or pleasure. World Cup fans this month have been using a browser extension called Hola to watch live-streamed games that are unavailable from their own countries. The same trick is routinely used by Americans who want to watch BBC video. The NSA also relies routinely on locations embedded in Yahoo tracking cookies, which are widely regarded by online advertisers as unreliable.
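
A toy example shows how fragile an IP-based "foreignness" inference is; the addresses and geolocation table below are invented for illustration, not the agency's actual criteria. The same American looks domestic on a direct connection and foreign the moment a proxy sits in the path.

```python
# Toy sketch: why "accessed from a foreign IP address" is weak evidence
# of foreignness. The geolocation table and addresses are invented
# (IETF documentation ranges), not real data.
GEOIP = {"198.51.100.4": "US", "203.0.113.7": "GB"}  # IP -> country

def looks_foreign(ip: str) -> bool:
    """Naive rule: treat any apparently non-US source address as foreign."""
    return GEOIP.get(ip, "unknown") != "US"

# The same American user, first connecting directly, then through a UK
# proxy (e.g., to watch geo-blocked video):
print(looks_foreign("198.51.100.4"))  # False -- direct US connection
print(looks_foreign("203.0.113.7"))   # True  -- identical user via UK proxy
```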

In an ordinary FISA surveillance application, the judge grants a warrant and requires a fresh review of probable cause — and the content of collected surveillance — every 90 days. When renewal fails, NSA and allied analysts sometimes switch to the more lenient standards of PRISM and Upstream.

“These selectors were previously under FISA warrant but the warrants have expired,” one analyst writes, requesting that surveillance resume under the looser standards of Section 702. The request was granted.

‘I don’t like people knowing’
She was 29 and shattered by divorce, converting to Islam in search of comfort and love. He was three years younger, rugged and restless. His parents had fled Kabul and raised him in Australia, but he dreamed of returning to Afghanistan.

One day when she was sick in bed, he brought her tea. Their faith forbade what happened next, and later she recalled it with shame.

“what we did was evil and cursed and may allah swt MOST merciful forgive us for giving in to our nafs [desires]”

Still, a romance grew. They fought. They spoke of marriage. They fought again.

All of this was in the files because, around the same time, he went looking for the Taliban.

He found an e-mail address on its English-language Web site and wrote repeatedly, professing loyalty to the one true faith, offering to “come help my brothers” and join the fight against the unbelievers.

On May 30, 2012, without a word to her, he boarded a plane to begin a journey to Kandahar. He left word that he would not see her again.

If that had been the end of it, there would not be more than 800 pages of anguished correspondence between them in the archives of the NSA and its counterpart, the Australian Signals Directorate.

He had made himself a target. She was the collateral damage, placed under a microscope as she tried to adjust to the loss.

Three weeks after he landed in Kandahar, she found him on Facebook.

“Im putting all my pride aside just to say that i will miss you dearly and your the only person that i really allowed myself to get close to after losing my ex husband, my dad and my brother.. Im glad it was so easy for you to move on and put what we had aside and for me well Im just soo happy i met you. You will always remain in my heart. I know you left for a purpose it hurts like hell sometimes not because Im needy but because i wish i could have been with you.”

His replies were cool, then insulting, and gradually became demanding. He would marry her but there were conditions. She must submit to his will, move in with his parents and wait for him in Australia. She must hand him control of her Facebook account — he did not approve of the photos posted there.

She refused. He insisted:

“look in islam husband doesnt touch girl financial earnings unless she agrees but as far as privacy goes there is no room….i need to have all ur details everything u do its what im supposed to know that will guide u whether its right or wrong got it”

Later, she came to understand the irony of her reply:

“I don’t like people knowing my private life.”

Months of negotiations followed, with each of them declaring an end to the romance a dozen times or more. He claimed he had found someone else and planned to marry that day, then admitted it was a lie. She responded:

“No more games. You come home. You won’t last with an afghan girl.”

She begged him to give up his dangerous path. Finally, in September, she broke off contact for good, informing him that she was engaged to another man.

“When you come back they will send you to jail,” she warned.

They almost did.

In interviews with The Post, conducted by telephone and Facebook, she said he flew home to Australia last summer, after failing to find members of the Taliban who would take him seriously. Australian National Police met him at the airport and questioned him in custody. They questioned her, too, politely, in her home. They showed her transcripts of their failed romance. When a Post reporter called, she already knew what the two governments had collected about her.

Eventually, she said, Australian authorities decided not to charge her failed suitor with a crime. Police spokeswoman Emilie Lovatt declined to comment on the case.

Looking back, the young woman said she understands why her intimate correspondence was recorded and parsed by men and women she did not know.

“Do I feel violated?” she asked. “Yes. I’m not against the fact that my privacy was violated in this instance, because he was stupid. He wasn’t thinking straight. I don’t agree with what he was doing.”

What she does not understand, she said, is why after all this time, with the case long closed and her own job with the Australian government secure, the NSA does not discard what it no longer needs.

Jennifer Jenkins and Carol D. Leonnig contributed to this report.

Barton Gellman writes for the national staff. He has contributed to three Pulitzer Prizes for The Washington Post, most recently the 2014 Pulitzer Prize for Public Service.

Open Access Explained!

What is open access? Open Access (OA) stands for unrestricted access to and unrestricted reuse of scholarly materials. Here's why that matters. Most publishers own the rights to the articles in their research and professional journals. Anyone who wants to access the articles must pay a fee. Anyone who wants to re-publish the findings must obtain permission from the publisher and is often required to pay an additional fee. Yet much of this scientific and scholarly research is publicly funded through government research grants. Open access advocates therefore argue that publicly funded research should be freely accessible to the public: publishers should not be given exclusive ownership rights to research knowledge or be allowed to create a financial obstacle to its public dissemination by charging fees. Nick Shockey and Jonathan Eisen of PHD Comics take us through the world of open access publishing and explain just what it's all about.

Fair Use Notice

This video contains copyrighted material. Such material is made available for educational purposes only in an effort to advance the understanding of human rights and social justice issues and is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. This constitutes a ‘fair use’ of any such copyrighted material in accordance with Title 17 U.S.C. Section 107 of the U.S. Copyright Law.

A Primer on Open Access to Science and Scholarship

This article originally appeared in Against the Grain, vol. 16, no. 3 (June 2004). The published version is not open access and I haven’t seen it. This is the version of the text that I submitted to the editor.  – Peter Suber

The scientific journal was invented in 1665.[1] For readers, the new form of publication surpassed books for learning quickly about the recent work of others. For authors, it surpassed books for sharing new work quickly with the wider world and, above all, for establishing priority over other scientists working on the same problem. Because authors were rewarded in these strong, intangible ways, they accepted the fact that journals couldn’t afford to pay them. Over time, journal revenue grew but authors continued in the tradition of writing articles for impact, not for money. Scholars may write books and software for money, but for journal articles they are amply paid by advancing knowledge and advancing their careers.

The tradition that started in 1665 continues today and makes the scientific or scholarly journal article nearly unique in the landscape of intellectual property. It’s not rare for authors to earn nothing from their work. But it is rare and peculiar for authors to earn nothing regardless of the sales of their work. It’s rare and peculiar for authors to submit their work voluntarily, even eagerly, to publishers who will not buy it or pay royalties. One measure of its rarity is that our copyright law is designed for literature that will produce royalties; it provides a temporary monopoly on distribution, and limits fair use, primarily in order to protect royalties. Even content that users get for free, such as TV and radio shows supported by advertising, generates royalties for its creators.

Scientific and scholarly journal articles are among the most significant examples of this rare and peculiar breed, which could be called royalty-free literature. The only other examples of comparable significance are the statutes and judicial opinions of public law. But in most countries, the texts of public law are uncopyrightable. This makes the research journal literature even more rare and peculiar: it is royalty-free and still copyrightable. Researchers relinquish revenue for their journal articles but do not necessarily relinquish intellectual property rights.[2]

Open access may come one day to royalty-producing bodies of literature, like books, and other categories of royalty-producing content, like music and movies. But the open-access movement is focused on royalty-free literature. To understand why, note two important consequences that follow from the fact that this literature generates no royalties. First, authors of journal articles can consent to open access without losing revenue. Second, journal publishers who charge subscription fees and then limit access to paying customers are violating the interests of these authors, not advancing them. Authors of scientific and scholarly journal articles want to disseminate their work as widely as possible, so that it will be noticed, read, taken up, built upon, applied, used, and cited. Publishers who stand between authors and readers, and charge access fees, negate the authors’ sacrifice, shrink the authors’ audience, and reduce the authors’ impact.

In the age of print, journals had significant expenses that could only be recovered through subscription fees. Price was a barrier for readers seeking access and for authors seeking readers, but the economics of print left no alternative. Moreover, until the 1970s or so, the price barrier was fairly low. But since the 1970s, journal prices have risen faster than inflation, and since 1986 they have risen four times faster.[3] Libraries now speak of a pricing crisis and cope with exorbitant price increases by canceling subscriptions and cutting into their book budgets. Today the price barrier is so high that even the wealthiest research institutions in the world are canceling journals by the hundreds and issuing public statements condemning the price increases, licensing terms, bundling policies, and negotiating tactics of the major publishers.[4]

The explosive growth of published knowledge will only intensify the problem. Adhering to business models that charge for access (subscription fees, licensing fees, or pay-per-view) will guarantee that over time researchers will have access to a rapidly shrinking percentage of the published knowledge in their field.[5]

Against this background, the internet emerged in the 1990s as a kind of miracle. For the first time, it became physically and economically possible to distribute perfect copies to a worldwide audience at virtually no cost. For the first time, it became physically and economically possible to connect authors, who want to give away their work, with readers who want to read and build on it. This new form of distribution —digital, online, free of charge, and free of most licensing restrictions— is now called open access.[6]

Only the creators of royalty-producing content had any reason to hold back from taking full advantage of the new medium. The creators of royalty-free content could finally share their work in a way that matched their interests.

The internet is not just a faster, cheaper, searchable alternative to print. By widening distribution and reducing expenses at the same time, it frees us to adopt new business models. We are not limited to the business models that evolved for print. We can seriously explore models that dispense with subscriptions and other price barriers, and use new and better ways to recover the (newly reduced) expenses of a peer-reviewed journal. We can dispense not only with subscriptions, which are costs for users, but with many costs at the journal end as well: the costs of soliciting and renewing subscribers, the costs of maintaining their addresses or authentication data, the costs of blocking online access to non-subscribers, and the costs of drafting and enforcing licensing agreements. We can seize rather than fear the opportunities the new medium creates for making copies and sharing them with others. We can serve the research interests of researchers first, rather than subordinate them to the economic interests of intermediaries. We can no longer say, as we could say in the age of print, that the access barriers attached to traditional journals are an unavoidable consequence of the best available method of distribution.

Open access is compatible with copyright. Authors hold copyright on their articles until and unless they transfer copyright to someone else. If authors consent to open access while they still hold copyright, then open access is fully authorized and lawful. The fact that most musicians and movie-makers do not consent to open access should not make us pessimistic about open access to science and scholarship. Musicians and movie-makers create royalty-producing content and have understandable concerns that open access will diminish their revenue (even if there is growing evidence that it could actually enhance it). Authors of royalty-free literature do not share this concern, and have everything to gain and nothing to lose by consenting to open access.

Open access is compatible with print. Users who prefer to read printed text can print any open-access file they like. Libraries and publishers can use print for long-term preservation. Both BioMed Central and the Public Library of Science, the two leading open-access journal publishers, offer print editions at cost for those who want them. As long as journals offer an open-access edition, then priced, printed, or enhanced editions do not interfere in any way. As long as we make print copies and deposit them in libraries, then open-access literature has at least the longevity of print literature.

Open access is compatible with peer review. In fact, all the major open-access projects and campaigns —the Budapest Open Access Initiative, BioMed Central, the Public Library of Science, SPARC, the Bethesda group, the Berlin Declaration— insist on the importance of peer review. Open access to science and scholarship is not about using the internet to bypass peer review. It’s about removing the barrier of price, not the filter of quality control.

Peer review is a mix of professional and clerical tasks. The professional task is editorial judgment exercised by subject-matter experts. In most journals in most fields, the editors and referees who exercise editorial judgment donate their labor, just like the authors. The clerical tasks are increasingly being taken over by software: assigning files to referees, distributing files, monitoring progress, nagging dawdlers, collecting comments, sharing comments with the right people, tracking revisions, and collecting data. Because the professional labor in this process is largely donated, the cost is already low. Because the clerical labor is being automated, even by open-source software,[7] the overall cost of peer review continues to drop.[8]

But even low expenses must be recovered if open access is to be sustainable. Open access repositories, which do not perform peer review, have negligible expenses, are built on open-source software,[9] and are supported by the institutions that benefit from increasing the visibility and impact of their faculty.

Open access journals, which do perform peer review, are usually supported by processing fees on accepted articles paid by the author’s sponsor rather than access fees paid by the reader’s sponsor. These processing fees are closely related to the costs of peer review, manuscript preparation, and hosting. When they cover all of a journal’s expenses in producing and distributing an article, then the journal is on sound financial footing to provide free online access to the article for all connected users worldwide. This model is similar to the economic model of television or radio, in which some viewers pay for everyone, or advertisers pay the costs of production and distribution upfront so that the public needn’t pay anything for access.

One reason to think that this model is economically sustainable is that it works in an industry, broadcasting, with much higher costs and none of the advantages arising from the use of royalty-free intellectual property. A more specific and persuasive reason is that the costs of vetting and disseminating articles online are much lower than the prices currently charged by publishers, and paid by libraries, for access to them.

Some traditional publishers charge that this business model endangers the integrity of peer review. The objection is that charging processing fees on accepted articles gives a journal an incentive to accept more articles regardless of their quality. This objection seems strong at first, but its strength disappears as we look more closely. In fact, the objection takes a turn and starts to raise suspicions about the business model of conventional subscription-based journals. (1) First we must note that accepting more excellent submissions does not compromise peer review. A journal would only compromise peer review if it lowered its standard and accepted more weak submissions. (2) Conventional journals that justify price increases by pointing to the growing number of articles they publish have exactly the same incentive that the objection attributes to open-access journals. (3) But conventional journals must fill their allotted space and give subscribers their money's worth. Hence, when they face a temporary dearth of excellent submissions, their business model gives them an incentive to lower their standards in order to fill the issue. This incentive does not exist for open-access journals, which have no allotted space to fill and no subscribers who paid for a certain volume of content. If open-access journals face a large number of excellent submissions, they can publish a large issue. If they face a small number of excellent submissions, they can publish a small issue.

Moreover, (4) the processing fees charged by open-access journals barely cover their costs. So if these fees do create an incentive to lower standards, it is a weak force. Essentially all the financial gain of increasing the acceptance rate would be offset by the costs of publishing the additional articles. But subscription-based journals that tie their prices to their volume, raise prices faster than inflation, and realize high profit margins, have much more to gain by lowering their standard, accepting more papers, and increasing their subscription price.[10]
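
A rough worked example makes the asymmetry concrete. All figures below are invented for illustration except the roughly $400 peer-review cost per accepted article cited in note 8:

```python
# Illustrative arithmetic for the incentive argument above. Only the
# ~$400 peer-review cost per accepted article (note 8) comes from the
# text; every other figure is an assumption for the sketch.
peer_review_cost = 400    # per accepted article, from note 8
other_costs = 600         # assumed: manuscript preparation, hosting, admin
processing_fee = 1_000    # assumed: fee set to roughly cover costs

# Open-access journal: marginal gain from accepting one more weak paper.
print(processing_fee - (peer_review_cost + other_costs))   # 0: no gain

# Subscription journal that ties price to volume: one more paper helps
# justify a price increase collected from every subscriber.
subscribers, price_increase = 2_000, 5                     # assumed
print(subscribers * price_increase
      - (peer_review_cost + other_costs))                  # 9,000: real gain
```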

Finally, we may add that conventional journals with low submission and rejection rates have higher profit margins than conventional journals with high submission and rejection rates. This gives their publishers an incentive to lower standards until the margins peak. It also gives them an incentive to bundle the lower-quality journals with higher-quality journals so that it is costly for libraries to cancel them. These incentives do not arise for open-access journals.

In any case, the upfront funding model is not the only business model for open-access journals. It works best in fields like biomedicine where most research is funded and where the major funders are already on the record as willing to pay the upfront fees.[11] But in less prosperous fields, including the humanities, one attractive model is for university libraries to publish open-access journals. The Philosophers’ Imprint, for example, is a peer-reviewed, open-access journal published by the University of Michigan.[12] Its motto is, “Edited by philosophers, published by librarians.” Because the philosophers and librarians are already on the university payroll, the journal needn’t charge processing fees. Documenta Mathematica is another peer-reviewed, open-access journal that charges no processing fees. It covers its expenses through the sale of printed volumes.[13] What’s important is that there is not just one way to cover the expenses of a peer-reviewed, open-access journal, and we have a long way to go before we can say that we’ve exhausted our cleverness and imagination.

There are many reasons why open access is moving more slowly in the humanities than the STM fields (science, technology, and medicine). One is the relatively low level of research funding. This reduces the money available to subsidize open-access dissemination of research results. Because there is much less government funding of the humanities than the STM fields (to match the lower level of private funding), humanists get less traction than scientists with the “taxpayer argument” for open access, or the argument that taxpayers shouldn’t have to pay a second fee for access to publicly funded research.[14]

A major difference is that journal prices in the STM fields are much higher than in the humanities, drawing urgent attention to open access as an obvious method of relief and giving institutions a motive to join scholars in promoting it. Open-access preprint exchanges thrive in the STM fields, where researchers need to put a time-stamp on their results in order to establish their priority. By contrast, open-access preprint exchanges are rare in the humanities. Conversely, demand for journal articles in the humanities persists longer than in the STM fields, which means that humanities journals have a greater fear than STM journals that offering open access even after some delay (say, six months after publication) will undercut subscriptions and reduce revenue. Similarly, the average rejection rate is higher at humanities journals than at STM journals, increasing the cost of peer review per accepted article, and making open access harder to subsidize through processing fees on accepted articles. Humanities journals often want to reprint poems or illustrations that require permission from a copyright holder, and this permission rarely extends to open-access publication.

Behind these specific reasons is a more general one. Journal articles are the primary literature in the sciences, but in the humanities they tend to report on the history and interpretation of the primary literature, which lies in books. But as we’ve seen, journal articles are royalty-free while books are royalty-producing. The logic of open access applies much better to journal-based fields than to book-based fields.[15]

All these are reasons why we should not expect to make progress toward open access in all disciplines at the same rate.[16] They are also reasons to expect that different business models will evolve in different fields to cover the costs of open-access journals. (By contrast, the costs of open-access repositories are already low and essentially constant across disciplines.)

Open access is within reach of scientists and scholars today. They can launch an open-access repository whenever they like, at essentially no cost, and more and more universities and disciplines are doing so. With a bit more planning and investment, scholars can launch an open-access journal. Scholars themselves decide whether to submit their work to open-access journals, whether to deposit it in open-access repositories, and whether to transfer copyright.

Conventional journals can experiment with open access to their back runs, to back issues after a certain embargo period, to all new articles, or to selected new articles, in order to learn the methods and economics of open-access publishing. But scholars needn’t wait for conventional journals to make these experiments, and needn’t persuade them to accept open access as a superior, or even desirable, alternative. The internet has already given scholars a chance to reclaim control of scholarly communication. For the first time since the journal appeared on the scene in 1665, price needn’t be an access barrier to this critical body of royalty-free literature. For the first time since the rise of the commercial publishing of scholarly journals, scholarly communication can be in the hands of scholars, who answer to one another, rather than corporations, who answer to shareholders. The only question is when scholars will fully seize this beautiful opportunity.[17]

—–

Portions of this article appeared in the InfoPaper for the World Summit on the Information Society for November 11, 2003, and the SPARC Open Access Newsletter for November 2, 2003, and February 2, 2004.

—–

Notes

1. Claude Guédon, “In Oldenburg’s Long Shadow: Librarians, Research Scientists, Publishers, and the Control of Scientific Publishing,” ARL Proceedings, May 2001.
http://www.arl.org/arl/proceedings/138/guedon.html

2. For more on royalty-free literature, its peculiarities, and its connection to open access, see my “Creating an Intellectual Commons through Open Access,” a preprint based on a presentation at the Workshop on Scholarly Communication as a Commons, Workshop in Political Theory and Policy Analysis, Indiana University, Bloomington, Indiana, April 1, 2004.
http://dlc.dlib.indiana.edu/archive/00001246/

3. See the data of the Association of Research Libraries, "Monograph and Serial Costs in ARL Libraries, 1986-2002."
http://www.arl.org/stats/arlstat/graphs/2002/2002t2.html

4. For example, Harvard, Stanford, MIT, Cornell, and Duke are in this category. See my list of “University actions against high journal prices,” first published in the SPARC Open Access Newsletter, April 2, 2004, and then put on the web for future updating. The list includes substantial excerpts from the public statements that accompanied the cancellation decisions.

Newsletter version
http://www.earlham.edu/~peters/fos/newsletter/04-02-04.htm#actions

Updated web version
http://www.earlham.edu/~peters/fos/lists.htm#actions

5. See my “The scaling argument,” SPARC Open Access Newsletter, March 2, 2004.
http://www.earlham.edu/~peters/fos/newsletter/03-02-04.htm#scaling

6. See the Budapest Open Access Initiative (February 2002), which made “open access” the term of art for this kind of free online availability.
http://www.soros.org/openaccess/

7. See the SPARC list of journal management software (both open and closed source).
http://www.arl.org/sparc/core/index.asp?page=h16#journals

8. A 2002 survey of the literature put the cost of peer review at about $400 per accepted article. See Fytton Rowland, “The peer-review process,” Learned Publishing, 15, 4 (October 2002) pp. 247-58, <http://makeashorterlink.com/?E2CA26BF7>. This is the amount that an average journal spends on peer review per accepted article, and therefore includes the cost of reviewing the average number of rejected articles per accepted article. Rowland’s survey was published one month before the launch of Open Journal Systems, <http://www.pkp.ubc.ca/ojs/>, the first open-source journal-management software. I’m not aware of more recent surveys that take new and open-source software into account.

9. See Raym Crow, “A Guide to Institutional Repository Software v 2.0,” Open Society Institute, January 2004. Crow’s guide is limited to open-source software.
http://www.soros.org/openaccess/software/

10. For a more detailed exposition of this argument, see my “Objection-reply: Whether the upfront payment model corrupts peer review at open-access journals,” SPARC Open Access Newsletter, March 2, 2004.
http://www.earlham.edu/~peters/fos/newsletter/03-02-04.htm#objreply

11. The largest private funder of medical research in the United States, the Howard Hughes Medical Institute, and the largest in Britain, the Wellcome Trust, have adopted this policy. In June 2003, they and other stakeholders issued the Bethesda Statement on Open Access Publishing, calling on others to follow suit.
http://www.earlham.edu/~peters/fos/bethesda.htm

Also see BioMed Central's list of funding agencies willing to allow grantees to use grant funds to cover article processing charges.
http://www.biomedcentral.com/info/about/apcfaq#grants

12. Philosophers’ Imprint,
http://www.philosophersimprint.org/

Another example is the Journal of Insect Science, published by the library of the University of Arizona at Tucson. For details on the arrangement, see Henry Hagedorn et al., "Publishing by the American Library," <http://www.arl.org/sparc/meetings/Henry_Hagedorn.htm>, a conference presentation from January 2004, and Eulalia Roel, "Electronic journal publication: A new library contribution to scholarly communication," College & Research Libraries News, January 2004,
http://www.ala.org/ala/acrl/acrlpubs/crlnews/backissues2004/crlbackjan504/electronicjournal.htm

13. Documenta Mathematica,
http://www.mathematik.uni-bielefeld.de/documenta/

Also see Ulf Rehmann, “Documenta Mathematica: A Community-Driven Scientific Journal,” High Energy Physics Libraries Webzine, October 2003.
http://library.cern.ch/HEPLW/8/papers/3/

14. See my “The taxpayer argument for open access,” SPARC Open Access Newsletter, September 4, 2003.
http://www.earlham.edu/~peters/fos/newsletter/09-04-03.htm

15. For a longer discussion, see my “Promoting Open Access in the Humanities,” a preprint based on a talk to the American Philological Association, January 3, 2004.
http://www.earlham.edu/~peters/fos/apa.htm

For more on the possibility of open access to books, see Section 6 of my “Creating an Intellectual Commons through Open Access,” cited in note 2, above.

16. For more, see my list, “Disciplinary differences relevant to open access.”
http://www.earlham.edu/~peters/fos/lists.htm#disciplines

17. See my list, “What you can do to help the cause of open access.”
http://www.earlham.edu/~peters/fos/lists.htm#do


Peter Suber
Research Professor of Philosophy, Earlham College
Open Access Project Director, Public Knowledge
Senior Researcher, SPARC
peters@earlham.edu

Copyright © 2004, Peter Suber. This is an open-access document.

PBS Frontline: Spying on the Home Front (2007)

Synopsis: Spying on the Home Front examines the FBI's massive data sweeps of U.S. citizens' records and the electronic surveillance of their communications. FRONTLINE investigates National Security Agency (NSA) wiretapping and how the FBI and other intelligence agencies "data mine" (sift through) the Internet communications of millions of Americans, and how they are mining social media and commercial-sector data banks to an unprecedented degree.

Experienced national security officials and government attorneys see a troubling and potentially dangerous collision between the strategy of pre-emption at home and the Fourth Amendment's protections against unreasonable search and seizure. Peter Swire, a law professor and former White House privacy adviser to President Clinton, tells FRONTLINE that since 9/11 the government has been moving away from the traditional legal standard of investigations based on individual suspicion toward generalized suspicion. The new standard, Swire says, is: "Check everybody. Everybody is a suspect." Former CIA Assistant General Counsel Suzanne Spaulding warns, "So many people in America think this does not affect them. They've been convinced that these programs are only targeted at suspected terrorists. … I think that's wrong. … Our programs are not perfect, and it is inevitable that totally innocent Americans are going to be affected by these programs."

Fair Use Notice

This video contains copyrighted material. Such material is made available for educational purposes only in an effort to advance the understanding of human rights and social justice issues and is distributed without profit to those who have expressed a prior interest in receiving the included information for research and educational purposes. This constitutes a ‘fair use’ of any such copyrighted material in accordance with Title 17 U.S.C. Section 107 of the U.S. Copyright Law.

How Western Intelligence Agencies Manipulate the Internet to Attack and Marginalize Government Critics

Glenn Greenwald

Feb. 24, 2014 6:25 p.m.

One of the many pressing stories that remains to be told from the Snowden archive is how western intelligence agencies are attempting to manipulate and control online discourse with extreme tactics of deception and reputation-destruction. It’s time to tell a chunk of that story, complete with the relevant documents.

Over the last several weeks, I worked with NBC News to publish a series of articles about "dirty trick" tactics used by GCHQ's previously secret unit, JTRIG (Joint Threat Research Intelligence Group). These were based on four classified GCHQ documents presented to the NSA and the other three partners in the English-speaking "Five Eyes" alliance. Today, we at The Intercept are publishing another new JTRIG document, in full, entitled "The Art of Deception: Training for Online Covert Operations."

By publishing these stories one by one, our NBC reporting highlighted some of the key, discrete revelations: the monitoring of YouTube and Blogger, the targeting of Anonymous with the very same DDoS attacks they accuse “hacktivists” of using, the use of “honey traps” (luring people into compromising situations using sex) and destructive viruses. But, here, I want to focus and elaborate on the overarching point revealed by all of these documents: namely, that these agencies are attempting to control, infiltrate, manipulate, and warp online discourse, and in doing so, are compromising the integrity of the internet itself.

Among the core self-identified purposes of JTRIG are two tactics: (1) to inject all sorts of false material onto the internet in order to destroy the reputation of its targets; and (2) to use social sciences and other techniques to manipulate online discourse and activism to generate outcomes it considers desirable. To see how extremist these programs are, just consider the tactics they boast of using to achieve those ends: "false flag operations" (posting material to the internet and falsely attributing it to someone else), fake victim blog posts (pretending to be a victim of the individual whose reputation they want to destroy), and posting "negative information" on various forums. Here is one illustrative list of tactics from the latest GCHQ document we're publishing today:

[GCHQ slide reproduced in the original post]

Other tactics aimed at individuals are listed here, under the revealing title "discredit a target":

[GCHQ slide reproduced in the original post]

Then there are the tactics used to destroy companies the agency targets:

[GCHQ slide reproduced in the original post]

GCHQ describes the purpose of JTRIG in starkly clear terms: “using online techniques to make something happen in the real or cyber world,” including “information ops (influence or disruption).”

Critically, the “targets” for this deceit and reputation-destruction extend far beyond the customary roster of normal spycraft: hostile nations and their leaders, military agencies, and intelligence services. In fact, the discussion of many of these techniques occurs in the context of using them in lieu of “traditional law enforcement” against people suspected (but not charged or convicted) of ordinary crimes or, more broadly still, “hacktivism”, meaning those who use online protest activity for political ends.

The title page of one of these documents reflects the agency’s own awareness that it is “pushing the boundaries” by using “cyber offensive” techniques against people who have nothing to do with terrorism or national security threats – work that, indeed, centrally involves law enforcement agents who investigate ordinary crimes:

No matter your views on Anonymous, “hacktivists” or garden-variety criminals, it is not difficult to see how dangerous it is to have secret government agencies able to target any individuals they want – who have never been charged with, let alone convicted of, any crimes – with these sorts of online, deception-based tactics of reputation destruction and disruption. There is a strong argument to make, as Jay Leiderman demonstrated in the Guardian in the context of the Paypal 14 hacktivist persecution, that the “denial of service” tactics used by hacktivists result in (at most) trivial damage (far less than the cyber-warfare tactics favored by the US and UK) and are far more akin to the type of political protest protected by the First Amendment.

The broader point is that, far beyond hacktivists, these surveillance agencies have vested themselves with the power to deliberately ruin people’s reputations and disrupt their online political activity even though they’ve been charged with no crimes, and even though their actions have no conceivable connection to terrorism or even national security threats. As Anonymous expert Gabriella Coleman of McGill University told me, “targeting Anonymous and hacktivists amounts to targeting citizens for expressing their political beliefs, resulting in the stifling of legitimate dissent.” Pointing to this study she published, Professor Coleman vehemently contested the assertion that “there is anything terrorist/violent in their actions.”

Government plans to monitor and influence internet communications, and covertly infiltrate online communities in order to sow dissension and disseminate false information, have long been the source of speculation. Harvard Law Professor Cass Sunstein, a close Obama adviser and the White House’s former head of the Office of Information and Regulatory Affairs, wrote a controversial paper in 2008 proposing that the US government employ teams of covert agents and pseudo-“independent” advocates to “cognitively infiltrate” online groups and websites, as well as other activist groups.

Sunstein also proposed sending covert agents into “chat rooms, online social networks, or even real-space groups” which spread what he views as false and damaging “conspiracy theories” about the government. Ironically, the very same Sunstein was recently named by Obama to serve as a member of the NSA review panel created by the White House, one that – while disputing key NSA claims – proceeded to propose many cosmetic reforms to the agency’s powers (most of which were ignored by the President who appointed them).

But these GCHQ documents are the first to prove that a major western government is using some of the most controversial techniques to disseminate deception online and harm the reputations of targets. Under the tactics they use, the state is deliberately spreading lies on the internet about whichever individuals it targets, including the use of what GCHQ itself calls “false flag operations” and emails to people’s families and friends. Who would possibly trust a government to exercise these powers at all, let alone do so in secret, with virtually no oversight, and outside of any cognizable legal framework?

Then there is the use of psychology and other social sciences to not only understand, but shape and control, how online activism and discourse unfolds. Today’s newly published document touts the work of GCHQ’s “Human Science Operations Cell,” devoted to “online human intelligence” and “strategic influence and disruption”:

Under the title “Online Covert Action”, the document details a variety of means to engage in “influence and info ops” as well as “disruption and computer net attack,” while dissecting how human beings can be manipulated using “leaders,” “trust,” “obedience” and “compliance”:

The documents lay out theories of how humans interact with one another, particularly online, and then attempt to identify ways to influence those outcomes – to “game” them:

We submitted numerous questions to GCHQ, including: (1) Does GCHQ in fact engage in “false flag operations” where material is posted to the Internet and falsely attributed to someone else?; (2) Does GCHQ engage in efforts to influence or manipulate political discourse online?; and (3) Does GCHQ’s mandate include targeting common criminals (such as boiler room operators), or only foreign threats?

As usual, they ignored those questions and opted instead to send their vague and nonresponsive boilerplate: “It is a longstanding policy that we do not comment on intelligence matters. Furthermore, all of GCHQ’s work is carried out in accordance with a strict legal and policy framework which ensures that our activities are authorised, necessary and proportionate, and that there is rigorous oversight, including from the Secretary of State, the Interception and Intelligence Services Commissioners and the Parliamentary Intelligence and Security Committee. All our operational processes rigorously support this position.”

These agencies’ refusal to “comment on intelligence matters” – meaning: talk at all about anything and everything they do – is precisely why whistleblowing is so urgent, the journalism that supports it so clearly in the public interest, and the increasingly unhinged attacks by these agencies so easy to understand. Claims that government agencies are infiltrating online communities and engaging in “false flag operations” to discredit targets are often dismissed as conspiracy theories, but these documents leave no doubt they are doing precisely that.

Whatever else is true, no government should be able to engage in these tactics: what justification is there for having government agencies target people – who have been charged with no crime – for reputation-destruction, infiltrate online political communities, and develop techniques for manipulating online discourse? But to allow those actions with no public knowledge or accountability is particularly unjustifiable.

Documents referenced in this article:

CONTACT THE AUTHOR:

Glenn Greenwald

Email: glenn.greenwald@theintercept.com

Twitter: @ggreenwald

 

 

 

Original Source:

https://theintercept.com/2014/02/24/jtrig-manipulation/

 

NSA Documents Reveal British Intelligence Partner GCHQ Targets Hacktivists and Dissidents

First Published FEB 4 2014, 6:26 PM ET

by Mark Schone, Richard Esposito, Matthew Cole and Glenn Greenwald, Special Contributor

 

A secret British spy unit created to mount cyber attacks on Britain’s enemies has waged war on the hacktivists of Anonymous and LulzSec, according to documents taken from the National Security Agency by Edward Snowden and obtained by NBC News.

The blunt instrument the spy unit used to target hackers, however, also interrupted the web communications of political dissidents who did not engage in any illegal hacking. It may also have shut down websites with no connection to Anonymous.

According to the documents, a division of Government Communications Headquarters (GCHQ), the British counterpart of the NSA, shut down communications among Anonymous hacktivists by launching a “distributed denial of service” (DDoS) attack – the same technique hackers use to take down bank, retail and government websites – making the British government the first Western government known to have conducted such an attack.

The documents, from a PowerPoint presentation prepared for a 2012 NSA conference called SIGDEV, show that the unit known as the Joint Threat Research Intelligence Group, or JTRIG, boasted of using the DDOS attack – which it dubbed Rolling Thunder – and other techniques to scare away 80 percent of the users of Anonymous internet chat rooms.

The existence of JTRIG had never previously been disclosed publicly.

The documents also show that JTRIG infiltrated Internet Relay Chat (IRC) rooms and identified individual hackers who had taken confidential information from websites. In one case JTRIG helped send a hacktivist to prison for stealing data from PayPal, and in another it helped identify hacktivists who attacked government websites.

In connection with this report, NBC is publishing documents that Edward Snowden took from the NSA before fleeing the U.S. The documents are being published with minimal redactions.

Intelligence sources familiar with the operation say that the British directed the DDOS attack against IRC chat rooms where they believed criminal hackers were concentrated. Other intelligence sources also noted that in 2011, authorities were alarmed by a rash of attacks on government and corporate websites and were scrambling for means to respond.

“While there must of course be limitations,” said Michael Leiter, the former head of the U.S. government’s National Counterterrorism Center and now an NBC News analyst, “law enforcement and intelligence officials must be able to pursue individuals who are going far beyond speech and into the realm of breaking the law: defacing and stealing private property that happens to be online.”

“No one should be targeted for speech or thoughts, but there is no reason law enforcement officials should unilaterally declare law breakers safe in the online environment,” said Leiter.

But critics charge the British government with overkill, noting that many of the individuals targeted were teenagers, and that the agency’s assault on communications among hacktivists means the agency infringed the free speech of people never charged with any crime.

[Video: British Spies Can Snoop on Social Media, Documents Reveal – 3:54]

“Targeting Anonymous and hacktivists amounts to targeting citizens for expressing their political beliefs,” said Gabriella Coleman, an anthropology professor at McGill University and author of an upcoming book about Anonymous. “Some have rallied around the name to engage in digital civil disobedience, but nothing remotely resembling terrorism. The majority of those embrace the idea primarily for ordinary political expression.” Coleman estimated that the number of “Anons” engaged in illegal activity was in the dozens, out of a community of thousands.

In addition, according to cyber experts, a DDOS attack against the servers hosting Anonymous chat rooms would also have shut down any other websites hosted by the same servers, and any other servers operated by the same Internet Service Provider (ISP), whether or not they had any connection to Anonymous. It is not known whether any of the servers attacked also hosted other websites, or whether other servers were operated by the same ISPs.
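To make the experts’ shared-hosting point concrete, here is a minimal sketch, in Python with hypothetical domain names, of how one can check whether several sites resolve to the same server. Any sites sharing an address also share that address’s fate under a denial-of-service flood:

    # Minimal illustration, not from the documents: many unrelated websites
    # are "virtually hosted" on one machine, so a single IP address can
    # serve dozens of sites. Flooding that address knocks all of them offline.
    import socket

    # Hypothetical domain names, for illustration only.
    domains = ["example-chat.net", "example-blog.org", "example-shop.com"]

    for name in domains:
        try:
            print(name, "->", socket.gethostbyname(name))
        except socket.gaierror:
            print(name, "-> could not resolve")

    # If several names print the same address, they share a host - and they
    # would all go down together if that host were hit by a DDOS attack.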

In 2011, members of the loose global collective called Anonymous organized an online campaign called “Operation Payback” targeting the pay service PayPal and several credit card companies. Some hacktivists also targeted U.S. and British government websites, including the FBI, CIA and GCHQ sites. The hacktivists were protesting the prosecution of Chelsea Manning, who took thousands of classified documents from U.S. government computers, and punishing companies that refused to process donations to WikiLeaks, the website that published the Manning documents.

The division of GCHQ known as JTRIG responded to the surge in hacktivism. In another document taken from the NSA by Snowden and obtained by NBC News, a JTRIG official said the unit’s mission included computer network attacks, disruption, “Active Covert Internet Operations,” and “Covert Technical Operations.” Among the methods listed in the document were jamming phones, computers and email accounts and masquerading as an enemy in a “false flag” operation. The same document said GCHQ was increasing its emphasis on using cyber tools to attack adversaries.

In the presentation on hacktivism that was prepared for the 2012 SIGDEV conference, one official working for JTRIG described the techniques the unit used to disrupt the communications of Anonymous and identify individual hacktivists, including some involved in Operation Payback. Called “Pushing the Boundaries and Action Against Hacktivism,” the presentation lists Anonymous, LulzSec and the Syrian Cyber Army among “Hacktivist Groups,” says the hacktivists’ targets include corporations and governments, and says their techniques include DDOS and data theft.

Under “Hacktivism: Online Covert Action,” the presentation refers to “Effects Operations.” According to other Snowden documents obtained by NBC News, “Effects” campaigns are offensive operations intended to “destroy” and “disrupt” adversaries.

The presentation gives detailed examples of “humint” (human intelligence) collection from hacktivists known by the on-line names G-Zero, Topiary and p0ke, as well as a fourth whose name NBC News has redacted to protect the hacker’s identity. The hacktivists were contacted by GCHQ agents posing as fellow hackers in internet chat rooms. The presentation includes transcripts of instant message conversations between the agents and the hackers in 2011.
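For readers unfamiliar with the medium, the sketch below (Python, with hypothetical server and channel names) shows how little machinery an IRC conversation involves. IRC is a plain-text protocol: any client that sends the correct handshake can join a channel and read or post messages, which is what made these rooms easy both for hacktivists to organize in and for agents posing as hackers to enter:

    # Minimal IRC client sketch - illustration only, with hypothetical names.
    # IRC is a line-based, plain-text protocol over TCP: register a nickname,
    # join a channel, and everything said in the channel is visible to you.
    import socket

    SERVER, PORT = "irc.example.net", 6667          # hypothetical network
    NICK, CHANNEL = "observer123", "#demo-channel"  # hypothetical names

    sock = socket.create_connection((SERVER, PORT))
    sock.sendall(f"NICK {NICK}\r\n".encode())
    sock.sendall(f"USER {NICK} 0 * :{NICK}\r\n".encode())
    sock.sendall(f"JOIN {CHANNEL}\r\n".encode())

    while True:
        data = sock.recv(4096)
        if not data:
            break                                   # server closed the connection
        for line in data.decode(errors="replace").splitlines():
            if line.startswith("PING"):             # answer keepalives or be dropped
                sock.sendall(line.replace("PING", "PONG", 1).encode() + b"\r\n")
            else:
                print(line)                         # channel traffic, joins, nick lists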

“Anyone here have access to a website with at least 10,000+ unique traffic per day?” asks one hacktivist in a transcript taken from a conversation that began in an Operation Payback chat room. An agent responds and claims to have access to a porn website with 27,000 users per day. “Love it,” answers the hacktivist. The hackers ask for access to sites with traffic so they can identify users of the site, secretly take over their computers with malware and then use those computers to mount a DDOS attack against a government or commercial website.

A GCHQ agent then has a second conversation with a hacker known as GZero who claims to “work with” the first hacktivist. GZero sends the agent a series of lines of code that are meant to harvest visitors to the agent’s site and make their computers part of a “botnet” operation that will attack other computers.

The “outcome,” says the presentation, was “charges, arrest, conviction.” GZero is revealed to be a British hacker in his early 20s named Edward Pearson, who was prosecuted and sentenced to 26 months in prison for stealing 8 million identities and information from 200,000 PayPal accounts between Jan. 1, 2010 and Aug. 30, 2011. He and his girlfriend were convicted of using stolen credit card identities to purchase take-out food and hotel stays.

In a transcript taken from a second conversation in an Operation Payback chat room, a hacktivist using the name “p0ke” tells another named “Topiary” that he has a list of emails, phone numbers and names of “700 FBI tards.”

An agent then begins a conversation with p0ke, asking him about what sites he’s accessed. The hacktivist responds that he was able to defeat the security on a U.S. government website, and pulled up credit card information that’s attached to congressional and military email addresses.

[Image: screenshot of a PowerPoint slide (NBC News)]

The agent then asks whether p0ke has looked at a BBC News web article called “Who loves the hacktivists?” and sends him a link to the story.

“Cool huh?” asks the agent, and p0ke responds, “ya.”

When p0ke clicked on the link, however, JTRIG was able to pull up the IP address of the VPN (virtual private network) the hacktivist was using. The VPN was supposed to protect his identity, but GCHQ either hacked into the network, asked the VPN for the hacker’s personal information, or asked law enforcement in the host nation to request the information.
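The mechanics of that link are worth pausing on. Any web server sees the IP address each visitor connects from; a VPN merely substitutes its own exit address for the user’s. The following minimal sketch (Python, with hypothetical page content) shows how simply a server behind a shared link can log every visitor’s connecting address:

    # Minimal sketch, illustration only: a tiny web server that records the
    # IP address of everyone who clicks a link pointing at it. A visitor on
    # a VPN exposes the VPN's exit address, not their home address - which
    # is why the click handed JTRIG the VPN's IP rather than p0ke's own.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class TrackedLinkHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            visitor_ip = self.client_address[0]  # address the TCP connection came from
            print(f"visit to {self.path} from {visitor_ip}")
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<html><body>a perfectly ordinary news story</body></html>")

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), TrackedLinkHandler).serve_forever()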

A representative of the VPN told NBC News the company had not provided GCHQ with the hacker’s information, but indicated that in past instances it has cooperated with local law enforcement.

In whatever manner the information was retrieved, GCHQ was able to establish p0ke’s real name and address, as shown in the presentation slides. (NBC News has redacted the information).

P0ke was never arrested for accessing the government databases, but Topiary – actually Jake Davis, an 18-year-old from Scotland who was a member of Anonymous and a spokesman for LulzSec – was arrested in July 2011, soon after LulzSec mounted hack attacks against Congress, the CIA and British law enforcement.

Two weeks before his arrest, the Guardian published an interview with Davis in which he described himself as “an internet denizen with a passion for change.” Davis later pled guilty to two DDOS attacks and was sentenced to 24 months in a youth detention center, but was released in June 2013 after five weeks because he had worn an electronic ankle tag and been confined to his home without computer access for 21 months after his arrest. Davis declined comment to NBC News.

In the concluding portion of the JTRIG presentation, the presenters sum up the unit’s “Effects on Hacktivism” as part of “Op[eration] Wealth” in the summer of 2011 and apparently emphasize the unit’s success against Anonymous, including the DDOS attack. The listed effects include identifying top targets for law enforcement and “Denial of Service on Key Communications outlets.”

A slide headlined “DDOS” refers to “initial trial info” from the operation known as “Rolling Thunder.” It then quotes from a transcript of a chat room conversation between hacktivists. “Was there any problem with the IRC [chat room] network?” asks one. “I wasn’t able to connect the past 30 hours.”

“Yeah,” responds another. “We’re being hit by a syn flood. I didn’t know whether to quit last night, because of the DDOS.” (A “syn flood” swamps a server with connection requests that are deliberately never completed, tying up its capacity until legitimate users can no longer get through.)

The next slide is titled “Information Operations,” and says JTRIG uses Facebook, Twitter, email, instant messenger, and Skype to dissuade hacktivists with the message, “DDOS and hacking is illegal, please cease and desist.”

The following slide lists the outcome of the operation as “80% of those messaged where (sic) not in the IRC channels 1 month later.”

Gabriella Coleman, the author and expert on Anonymous, said she believed the U.K. government had punished a large number of people for the actions of a few. “It is hard to put a number on Anonymous,” she said, “but at the time of those events, there were thousands of supporters and probably a dozen or two individuals who were breaking the law.”

Said Coleman, “Punishing thousands of people, who are engaging in their democratic right to protest, because a couple people committed vandalism is … an appalling example of overreacting in order to squash dissent.”

Jason Healey, a former top White House cyber security official under George W. Bush, called the British government’s DDOS attack on Anonymous “silly,” and said it was a tactic that should only be used against another nation-state.

He also questioned the time and energy spent chasing teenage hackers.

“This is a slippery slope,” said Healey. “It’s not what you should be doing. It justifies [Anonymous]. Giving them this much attention justifies them and is demeaning to our side.”

In a statement, a GCHQ spokesperson emphasized that the agency operated within the law.

“All of GCHQ’s work is carried out in accordance with a strict legal and policy framework,” said the statement, “which ensure[s] that our activities are authorized, necessary and proportionate, and that there is rigorous oversight, including from the Secretary of State, the Interception and Intelligence Services Commissioners and the Parliamentary Intelligence and Security Committee. All of our operational processes rigorously support this position.”

Told by NBC News that his on-line alias appeared in the JTRIG presentation, the hacker known as p0ke, a college student in Scandinavia, said he was confused about why he hadn’t been confronted by authorities. (NBC News is withholding his name, age and country of residence.)

But p0ke said he had stopped hacking because he’d grown bored with it, and was too busy with his studies. He was never a “hacktivist” anyway, he said. “Politics aren’t mah thang,” he said in an online interview. “Seriously tho, I had no motive for doing it.”

He said that hacking had only satisfied an urge to show off. “Fancy the details for a while,” he wrote, “then publish em to enlarge my e-penis.”

A British hacktivist known as T-Flow, who was prosecuted for hacking alongside Topiary, told NBC News he had long suspected that the U.K.’s intelligence agencies had used hacker techniques to catch him, since no evidence of how his identity was discovered ever appeared in court documents. T-Flow, whose real name is Mustafa Al-Bassam, pleaded guilty but did not serve time in an adult facility because he was 16 when he was arrested.

“When I was going through the legal process,” explained Al-Bassam, “I genuinely felt bad for all those attacks on government organizations I was involved in. But now that I know they partake in the exact same activities, I have no idea what’s right and wrong anymore.”

Journalist Glenn Greenwald was formerly a columnist at Salon and the Guardian. In late 2012 he was contacted by NSA contractor Edward Snowden, who later provided him with thousands of sensitive documents, and he was the first to report on Snowden’s documents in June 2013 while on the staff of the Guardian. Greenwald has since reported on the documents with multiple media outlets around the world, and has won several journalism awards for his NSA reporting both in the U.S. and abroad. He is now helping launch, and will write for, a new, non-profit media outlet known as First Look Media that will “encourage, support and empower … independent, adversarial journalists.”


ACLU: Domestic Surveillance Program Goes Too Far

AP September 20, 2013, 11:50 AM

SAN FRANCISCO – Two men of Middle Eastern descent were reported buying pallets of water at a grocery store. A police sergeant reported concern about a doctor “who is very unfriendly.” And photographers of all races and nationalities have been reported taking snapshots of post offices, bridges, dams and other structures.

The American Civil Liberties Union and several other groups released 1,800 “suspicious activity reports” Thursday, saying they show the inner workings of a domestic surveillance program that is sweeping up innocent Americans and forever placing their names in a counterterrorism database.

Shortly after the 9/11 attacks, the federal government created a multibillion-dollar information-sharing program meant to put local, state and federal officials together to analyze intelligence at sites called fusion centers.

Instead, according to a Senate report, the Government Accountability Office and now the ACLU, the program has duplicated the work of other agencies, has appeared rudderless and hasn’t directly been responsible for any terror-related prosecutions. According to the GAO, the government maintains 77 fusion centers throughout the country, and their operations are funded by federal and local sources.

The ACLU obtained about 1,700 suspicious activity reports filed with the Sacramento office through a California Public Records Act request. Another 100 were submitted as part of a court case in Los Angeles filed by the ACLU on behalf of photographers who say they are being harassed by Southern California law officials.

The documents do not appear to show valuable counterterrorism intelligence.

A report from Bakersfield, phoned in to a police officer by a “close personal friend,” describes two men who appear to be of Middle Eastern descent stocking up on water.

Another report shows a Lodi police sergeant “reporting on a suspicious individual in his neighborhood.” The sergeant, whose name was redacted, said he “has been long concerned about a residence in his neighborhood occupied by a Middle Eastern male adult physician who is very unfriendly.”

A third report states, “An off-duty supervising dispatcher with Sacramento P.D. noticed a female subject taking pictures of the outside of the post office in Folsom on Riley Street this morning. The female departed as a passenger in a silver Mazda.”

The fusion center project was a target of a blistering congressional report last year complaining that too many innocent Americans engaging in routine and harmless behavior have become ensnared in the program.

The ACLU and others are calling on the Obama administration to overhaul the program so that only activities with legitimate links to terrorism investigations are reported.

“We want the administration to stop targeting racial and religious minorities,” ACLU lawyer Linda Lye said.

A Senate report last year concluded that the program has improperly collected information and produced little valuable intelligence on terrorism. The report suggested the program’s intent ballooned far beyond anyone’s ability to control.

What began as an attempt to put local, state and federal officials in the same room analyzing the same intelligence has instead cost huge amounts of money for data-mining software, flat screen televisions and, in Arizona, two fully equipped Chevrolet Tahoes that are used for commuting, investigators found.

The lengthy, bipartisan report was a scathing evaluation of what the Department of Homeland Security has held up as a crown jewel of its security efforts.

A Homeland Security spokesman countered that the program is “safe and effective.”

“In recent years, reporting of suspicious activity by the public has led to the arrest of multiple individuals planning mass casualty attacks,” Peter Boogaard said. “These programs are governed by robust privacy and civil rights and civil liberty protections.”

When the Senate report was released, Homeland Security Department spokesman Matthew Chandler called it “out of date, inaccurate and misleading.”

He said it focused entirely on information being produced by fusion centers and didn’t consider the benefit to involved officials from receiving intelligence from the federal government.

“If we desire respect for the law, we must first make the law respectable.” – U.S. Supreme Court Justice Louis D. Brandeis