Chelsea Manning to Technologists: Please Take the Time To Contemplate Your System’s Potential Misuse

Chelsea Manning will be speaking at the Fifth Annual Aaron Swartz Day Evening Event – Saturday, November 4, 2017 – 7:30 pm – TICKETS

Chelsea E. Manning at Dolores Park in San Francisco, September, 2017.

From October 8, 2017, in New York City (at the New Yorker Festival):

I think the most important thing that we have to learn, because I think it’s been forgotten, is that every single one of us has the ability to change things. Each and every one of us has this ability. We need to look to each other and realize our values are what we care about, and then assert them, and say these things, and take action in our political discourse to make that happen. Because it’s not going to happen at the ballot box. It’s not.

Make your own decisions. Make your own choices. Make your own judgement.

You have to pay attention. For engineers in particular. We design and we develop systems, but the systems that we develop can be used for different things. The software that I was using in Iraq for predictive analysis was the same that you would use in marketing. It’s the same tools. It’s the same analysis. I believe engineers and software engineers and technologists… (That’s a new term that came out while I was away. :-))

I guess technologists should realize that we have an ethical obligation to make decisions that go beyond just meeting deadlines or creating a product. What actually takes some chunks of time is to ask: “What are the consequences of this system?” “How can this be used?” “How can this be misused?” Let’s try to figure out how we can keep a software system from being misused. Or decide whether you want to implement it at all. There are systems that, if misused, could be very dangerous. — Chelsea E. Manning, October 8, 2017.

Excerpt WNYC The New Yorker Radio Hour (starts at 31:45):
http://www.wnyc.org/story/chelsea-manning-life-after-prison/

About the Ethical Algorithms Panel and Technology Track

See Caroline Sinders and Kristian Lum, live at 2pm, on November 4th.

Technology Track – Ethical Algorithms
2:00 – 2:45 pm – Ethical Algorithms Panel – w/Q and A.
Kristian Lum (Human Rights Data Analysis Group – HRDAG) As the Lead Statistician at HRDAG, Kristian’s research focus has been on furthering HRDAG’s statistical methodology (population estimation or multiple systems estimation—with a particular emphasis on Bayesian methods and model averaging).
Caroline Sinders (Wikimedia Foundation) – Caroline uses machine learning to address online harassment at Wikimedia, and before that, she helped design and market IBM’s Watson. Caroline was also just named one of Forbes’ “8 AI Designers You Need to Know.” Plus special guests TBA.

About the Ethical Algorithms Panel and Technology Track
by Lisa Rein, Co-founder, Aaron Swartz Day

I created this track based on my phone conversations with Chelsea Manning on this topic.

Chelsea was an Intelligence Analyst for the Army and used algorithms in the day to day duties of her job. She and I have been discussing algorithms, and their ethical implications, since the very first day we spoke on the phone, back in October 2015.

Chelsea recently published a New York Times Op-Ed on the subject: The Dystopia We Signed Up For.

From the Op-Ed:

“The consequences of our being subjected to constant algorithmic scrutiny are often unclear… algorithms are already analyzing social media habits, determining credit worthiness, deciding which job candidates get called in for an interview and judging whether criminal defendants should be released on bail. Other machine-learning systems use automated facial analysis to detect and track emotions, or claim the ability to predict whether someone will become a criminal based only on their facial features. These systems leave no room for humanity, yet they define our daily lives.”

A few weeks later, in December, I went to the Human Rights Data Analysis Group (HRDAG) holiday party, and met HRDAG’s Executive Director, Megan Price. She explained a great deal to me about the predictive software used by the Chicago police, and how it was predicting crime in the wrong neighborhoods based on the biased data it was getting from meatspace. The data itself was “good” in that it was accurate, but the less-than-desirable behavior of the Chicago PD was being used as a guide for sending officers out into the field. In effect, the department’s existing bad behavior was being used to prescribe its future behavior.

This came as a revelation to me. We have a chance to stop the cycle of bad behavior by using technology to predict where the next real crime may occur, but instead, we have chosen to memorialize the faulty techniques of the past in software, to be used forever.
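This feedback loop can be illustrated with a toy simulation (hypothetical numbers, not any real department’s system): two neighborhoods with identical true crime rates, but biased historical records. If patrols follow the records, and crime only enters the data where officers are sent, the initial bias never washes out.

```python
import random

random.seed(42)  # deterministic for reproducibility

# Two neighborhoods with IDENTICAL true crime rates.
TRUE_CRIME_RATE = {"A": 0.3, "B": 0.3}

# Biased history: past patrols concentrated in "A", so "A"
# starts with three times as many recorded incidents.
recorded = {"A": 60, "B": 20}

PATROLS_PER_DAY = 10

for day in range(365):
    for _ in range(PATROLS_PER_DAY):
        total = recorded["A"] + recorded["B"]
        # "Predictive" dispatch: patrol where the past records point.
        hood = "A" if random.random() < recorded["A"] / total else "B"
        # Crime is only recorded where an officer is present.
        if random.random() < TRUE_CRIME_RATE[hood]:
            recorded[hood] += 1

share_a = recorded["A"] / (recorded["A"] + recorded["B"])
print(f"Share of recorded crime in neighborhood A: {share_a:.0%}")
```

Even after a year of simulated patrols, the records still say neighborhood A is the problem, because the model’s own dispatch decisions generate its future training data.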

I have gradually come to understand that, although these algorithms are being used in all aspects of our lives, it is not often clear how or why they are working. Now, it has become clear that they can develop their own biases, based on the data they have been given to “learn” from. Often the origin of that “learning data” is not shared with the public.

I’m not saying that we have to understand exactly how every useful algorithm works (which I understand would be next to impossible), but I’m not sure a completely “black box” approach is best, at least when the public, public data, and public safety are involved. (Thomas Hargrove’s Murder Accountability Project‘s “open” database is one example of a transparent approach that seems to be doing good things.)

There also appears to be a disconnect within law enforcement. Some precincts seem content to rely on technology for direction, for better or worse, such as the predictive software used by the Chicago Police Department. In other situations, such as Thomas Hargrove’s Murder Accountability Project (featured in the article “Murder He Calculated”), technologists are having a hard time getting law enforcement to take these tools seriously. Even when these tools appear to have the potential to find killers, there are numerous invisible hurdles in the way of any kind of timely implementation. Even in these life-and-death cases, Hargrove has had a very hard time getting anyone to listen to him.

So, how do we convince law enforcement to do more with some data while we are, at the same time, concerned about the oversharing of other forms of public data?

I find myself wondering what can even be done, if simple requests such as “make the NCIC database’s data for unsolved killings searchable” seem to be falling on deaf ears.

I am hoping to have some actual action items that can be followed up on in the months to come, as a result of this panel.

References:

1. The Dystopia We Signed Up For, Op-Ed by Chelsea Manning, New York Times, September 16, 2017. (Link goes to a free version not behind a paywall, at Op-Ed News)

2. Pitfalls of Predictive Policing, by Jessica Saunders for Rand Corporation, October 11, 2016. https://www.rand.org/blog/2016/10/pitfalls-of-predictive-policing.html

3. Predictions put into practice: a quasi-experimental evaluation of Chicago’s predictive policing pilot. by Jessica Saunders, Priscillia Hunt, John S. Hollywood, for the Journal of Experimental Criminology, August 12, 2016. https://link.springer.com/article/10.1007/s11292-016-9272-0

4. Murder He Calculated – by Robert Kolker, for Bloomberg.com, February 12, 2017.

5. Murder Accountability Project, founded by Thomas Hargrove. http://www.murderdata.org/

6. Secret Algorithms Are Deciding Criminal Trials and We’re Not Even Allowed to Test Their Accuracy – By Vera Eidelman, William J. Brennan Fellow, ACLU Speech, Privacy, and Technology Project, September 15, 2017. https://www.aclu.org/blog/privacy-technology/surveillance-technologies/secret-algorithms-are-deciding-criminal-trials-and

7. Machine Bias – There’s software used across the country to predict future criminals. And it’s biased against blacks. by Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner, ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

8. Criminality Is Not A Nail – A new paper uses flawed methods to predict likely criminals based on their facial features. by Katherine Bailey for Medium.com, November 29, 2016. https://medium.com/backchannel/put-away-your-machine-learning-hammer-criminality-is-not-a-nail-1309c84bb899


What Makes A Good Lightning Talk

Lightning Talk Schedule

Our lightning talks are only 20 minutes in length, and usually focus on working code – or often, a collection of working implementations that someone has done over time.

These are very advanced, not general in scope, and implementation-oriented. Additionally, the goal is to feature projects that represent our community’s ideals.

Saturday Lightning talks are meant to explain potential hackathon projects.

Sunday talks are to present work done on projects over the weekend.

Think of a topic this way:

– What is the exact problem space?
– How do you plan to fix it?
– How is this idea different from other ideas for fixing that problem?
– How have you *implemented* your idea? Preferably with at least on-screen code, if not working code.

Barrett Brown and Steve Phillips Discuss The Pursuance Project

Barrett Brown is an award-winning writer and anarchist activist.

Barrett Brown and Steve Phillips are speaking both Saturday and Sunday at the Aaron Swartz Day
San Francisco Hackathon.

(RSVP)

Steve Phillips is a privacy developer who has now brought an incredible project to life, The Pursuance Project, with none other than Barrett Brown.

The Pursuance Project is more than software. The project proposes a much-needed new way of organizing and sharing information: a new way of drilling down to get to the truth as a team of people. It can be a team of people in the same building, or scattered all around the world. All that matters is that a group of people who really care about a topic are joining together to do something about it.

Perhaps Pursuance could be one of the missing pieces we need to organize ourselves towards a better democracy.

——

It’s not just about the software; it’s about thinking about new ways to organize and create positive change. Of course, this is not a concept that Aaron invented, but it is one that he lived.

So, let’s learn as much as we can about Pursuance, as quickly as possible, ahead of time, so that we can all get as much as possible out of our in-person time with Barrett and Steve, when they demo Pursuance for the first time at the Aaron Swartz Day Hackathon on November 4th at 3pm, and again on November 5th, when they go over more of the technical details.

I spoke to Barrett and Steve to find out how they met and how they pulled all this off in less than a year.

LR: What does Pursuance actually do?

BB: The pursuance system is a framework for process democracy. That is, it allows individuals with no prior relationship to self-organize into robust, agile entities governed via a “proceduralism of agreement.” These entities, called pursuances, in turn engage and collaborate among themselves to whatever extent they choose.

SP: Fundamentally, the Pursuance System software enables you to create a pursuance (which is a sort of organization), invite people to that pursuance (with the level of permissions and privileges that you choose), assign those people tasks (manually, or automatically based on their skill set!), brainstorm and discuss what should be done, rapidly record exciting ideas or strategies in an actionable format (namely as tasks), share files and documents, be notified when relevant events occur (e.g., when you are assigned a task or mentioned), and effectively get help from others.
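The feature set Steve describes (organizations, permissioned membership, skill-based task assignment) implies a data model along these lines. This is a purely hypothetical sketch in Python, for illustration only, not the actual Pursuance schema:

```python
from __future__ import annotations

from dataclasses import dataclass, field
from enum import Enum


class Role(Enum):
    """Permission levels a member can hold within a pursuance."""
    OBSERVER = 1
    MEMBER = 2
    ADMIN = 3


@dataclass
class Member:
    name: str
    role: Role
    skills: set[str] = field(default_factory=set)


@dataclass
class Task:
    title: str
    required_skill: str | None = None
    assignee: Member | None = None


@dataclass
class Pursuance:
    """An organization: permissioned members plus a shared task list."""
    name: str
    members: list[Member] = field(default_factory=list)
    tasks: list[Task] = field(default_factory=list)

    def invite(self, member: Member) -> None:
        self.members.append(member)

    def assign(self, task: Task) -> None:
        """Auto-assign to the first member holding the required skill."""
        for m in self.members:
            if task.required_skill is None or task.required_skill in m.skills:
                task.assignee = m
                break
        self.tasks.append(task)
```

Creating a pursuance, inviting members, and adding a task with a `required_skill` would then route the task to whichever member lists that skill.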

LR: But is it simply end-to-end encrypted project management software? It seems like there is something larger going on here.

BB: A variety of existing tools for crowd-sourced research and secure communication will be integrated into the system. The ecosystem will be seeded with about 200 individuals and groups with a track record of advancing individual rights, state accountability, and robust journalism and information dissemination; each of these initial users will have the right to bring others into the system, and so on. This is not a content-neutral medium; although any political ideology or combination of views is permitted in theory, everyone who joins does so under the condition that they oppose the drug war, police state, and national security state (although participants are free to interpret these issues broadly, and need not agree entirely on definitions or solutions).

This is a server-based ecosystem of collaboration and self-governance in which all participants will have equal opportunity to create and join pursuances: structured entities best thought of as evolvable organizational charts, with a wide range of customization available, as well as the ability for individual pursuances to link up in various ways; indeed, the ultimate goal of this process, which will provide a superior means by which to organize collaborative activism, is to eventually give rise to a sort of technocratic super-organism capable of confronting criminalized institutions and ultimately rolling them back.

SP: Aside from the specific software features, we are quite excited about having an ecosystem of like-minded individuals with shared goals and interests. The world needs an energetic network of activists effectively collaborating to achieve such things as prison reform, an end to the drug war, an end to mass, suspicionless surveillance, and various other issues. We need many researchers to assist journalists in finding the facts and getting stories right. And we need a great number of people to assist non-profits and political action groups in achieving their political ends. Pursuance amplifies these efforts.

LR: Other articles referenced its potential as a tool for democracy, could you elaborate? :-)

BB: As opposed to institutional democracy, whereby some artificial structure is generally implemented from above, Pursuance allows everyone the equal opportunity to define the exact terms of their associations with others, either by creating a Pursuance or by joining one that provides what they consider to be sufficient agency. Pursuances themselves may or may not involve voting; they can certainly be structured so that some, most, or all decisions, major or minor, require majority votes by all participants, but others are driven more by free association, depending upon the ability of individuals to quickly and easily form new Pursuances with particular requirements so as to create a polity that’s sufficiently in agreement that participants are comfortable giving most responsibilities to a few people.

Importantly, the ease of creating, applying to join, and leaving pursuances will encourage experimentation and evolution, such that differing models of participation can be used and improved upon. One pursuance may be doing the exact same sort of work as another, but simply with a more regimented system whereby everyone is taking orders from above, with one person initially delegating power to others along a structure whereby no voting is done at all; another may involve each participant having the exact same degree of control, with decisions subject to majority votes or even requiring unanimous ones. By allowing every participant to employ free association, and by providing a structure that makes it easy to try different approaches to governance, we’re providing a highly customizable framework for collaboration that’s universal enough to be used for everything from running a bike drive to governing a political party.

LR: How did you two connect? Did Steve write to you when you were in prison?

BB: Steve saw the Wired article on my release, which went into the broad aspects of the project, and tracked me down to D Magazine, where he called me. We spoke and then he flew down to Dallas for a meeting. Over those three or four hours, we came up with many of the major additions to the basic idea that will ultimately be used; he happened to be perfect for this, both as programmer and project manager as well as a broad thinker with a great deal of knowledge relevant to this undertaking.

SP: Backstory: in 2015 I gave a DEF CON talk regarding my project CrypTag, which makes encrypted data partially searchable and stores it in any folder or file-syncing service. I started a non-profit around CrypTag with the slogan, “Secure mobile and desktop apps for activists, journalists, and you,” and with the 10-year goal of providing “data privacy for every Internet user”. I launched a graphical, user-friendly encrypted wiki/note-taking app — CrypTag Notes — solicited and got some great user feedback, and had some people using it.

But there were a couple problems.

First, I hadn’t found a significant number of people who thought they needed their privacy protected. Secondly, I didn’t have a means through which I could reach such people, and I wasn’t networked with that many activists other than a few I’d met at Occupy. Thirdly, since I have extremely broad interests and, thanks to the Internet, am aware of many problems in the world that I would like to see solved (if not help solve), I was concerned that even in the best-case scenario, if I could help fundamentally solve the problem of human privacy, that this wouldn’t be nearly enough in light of all that we face — global warming and environmental destruction, superhuman AI, Neoliberalism, racial injustice, political bribery, technological unemployment and the apparent need for a basic income, and more.

But in the last week of March I was reading a Wired article, “Anonymous’ Barrett Brown is Free — and Ready to Pick New Fights,” which reads, in part:

[Barrett] intends to build a piece of software called Pursuan[ce], designed to serve as a platform for coordinating activists, journalists, and troublemakers of all stripes. Pursuan[ce], as Brown describes it, would be an open-source, end-to-end-encrypted collaboration platform anyone could host on their own server. Users will be able to create a “pursuance,” an installation of the software focused on a group’s particular cause or target for investigation. The software would offer those groups the same real-time collaboration features as Slack or Hipchat, but also include a kind of org-chart function to define different users’ roles, the ability to host and search large collections of documents, and a Wiki feature that would allow collaborators to share and edit their findings from those documents.

Brown has yet to recruit a team of coders or volunteers to launch Pursuan[ce]. … But Brown has never had trouble finding followers …

I quickly realized that not only did Barrett have the public platform that I lacked, he also attracts and excites thousands of activists who *know* they need privacy protections because they are opposing the corrupt and powerful elements of the status quo.

It was also immediately clear that I had exactly what Barrett needed — experience building secure, user-friendly software; open source development; managing small teams of developers; and recruiting other technical people, as I was hosting weekly privacy hackathons at Noisebridge (which continue to this day), and I had recently moved to San Francisco.

I figured this was a once-in-a-lifetime opportunity to work with someone like a Barrett Brown, or a John Kiriakou, or an Edward Snowden, or a Glenn Greenwald, or a Laura Poitras, and that I must take massive action to turn into reality this amazing possibility to work with Barrett Brown to amplify the efforts of activists and journalists in order to help them solve as many of the world’s problems as possible.

I could not believe how much overlap there was between what Barrett and I wanted to accomplish, and how much we could complement each other.

So I brainstormed with a friend about the best course of action, which led to my aggressively reaching out to people I knew might be connected to Barrett, attempting to contact him in several different ways all in parallel, and successfully getting through just two days later. He said he was interested in having me involved, so I then flew to Texas, met twice with Barrett, began designing the software, then flew back to California. Two days later, Barrett emailed the others involved and said, “this is Steve Phillips based in San Francisco, and he is in charge of building the Pursuance System” — the very software I had been merely reading about less than two weeks prior.

That was just six months ago, and it’s been a hell of a ride since. (And of course, John Kiriakou and others are on our board of directors.)

My extreme excitement about what can be accomplished with Pursuance continues to this day.

LR: Steve mentioned that you both were inspired by one of Aaron’s posts, entitled When Is Transparency Useful? – could you elaborate on that please? :)

SP: I was talking to a friend about Pursuance, and he pointed me to one of Aaron Swartz’s essays. Part of what blew me away was this line and the argument leading up to it:

Imagine it: an investigative strike team, taking on an issue, uncovering the truth, and pushing for reform. They’d use technology, of course, but also politics and the law.

I found that this complemented Barrett’s thinking very well regarding what can be accomplished with a diverse mix of complementary skill sets, rather than having silos of just journalists working by themselves, and my experience with seeing tech geeks building more tech for geeks rather than solving bigger problems.

I knew that Aaron had co-invented RSS at the age of 14, that he had the foresight to create software that has become SecureDrop, and that he convinced Larry Lessig that getting money out of politics is fundamental, but this is yet another example of Aaron being ahead of his time.

BB: Transparency is something we generally want to apply to institutions, particularly governments that are funded by their populations and have a legal monopoly on violence, and specifically on government entities that have a history of misusing secrecy. On the other hand, the question of transparency becomes vastly more complicated when we’re talking about private entities. Within Pursuance, a given pursuance can be entirely opaque to outsiders, which in some cases will be a necessary defense against states and powerful firms that have a history of retaliating against activists and even journalists. But most of them, I think, will be highly transparent, both as basic policy and as a means of better allowing other pursuances to find areas where they might want to collaborate.

A good part of the concept behind Pursuance is to encourage not just individuals to arrange themselves into efficient entities, but also to encourage pursuances to eventually develop similar connections, sharing information, resources, and talent. This also goes for those existing non-profits and NGOs and the like that we’ll be actively recruiting; with this system, they’ll be able to easily create a pursuance presence by which to organize their supporters as well as finding areas of efficient potential partnerships with both pursuances and other institutions who’ve come on to the system. Those areas are most easily discoverable when everyone concerned can quickly see what other groups are doing and how they’re doing it.

Barrett Brown and Steve Phillips are speaking Saturday and Sunday at the Aaron Swartz Day San Francisco Hackathon.

Saturday, November 4th, 3pm – 4:30pm – Barrett Brown and Steve Phillips – Building a Better Opposition: Process Democracy and the Second Wave of Online Resistance w/ Q and A (First live demo of the Pursuance Project!)

Sunday, November 5th, 2pm – 3pm – Pursuance Advanced Tech (w/ Q and A) – Steve Phillips and Barrett Brown

RSVP for hackathon here.

Non-hackathon folks: Tickets for evening event are still available.

EFF Pioneer Awards – Part Two – Ashley Nicole Black

Come to the Fifth Annual Aaron Swartz Day Evening Event! Only 75 tickets left :)

“I’m afraid of my own government targeting me for surveillance because I make fun of the President for a living, and while I do it, I’m also black. I need government transparency and accountability. I need Freedom of Speech. I need quality journalism by journalists who feel safe to do their jobs. Because, without them, I can’t do my job.”

– Ashley Nicole Black, September 14, 2017

Ashley Nicole Black is an American comedian, actress, and writer from Los Angeles, California. In 2016, she became a writer and correspondent for Full Frontal w/ Samantha Bee. She was the Keynote Speaker for the 2017 EFF Pioneer Awards.

2013 Post by Jason Leopold About Aaron’s PACER-related FOIA Requests

Jason Leopold, who is speaking Saturday at the San Francisco hackathon, and also that night at the evening event, wrote about Aaron’s FOIA requests immediately following his death.

We will be going through the articles referenced in this excerpt below, one by one.

Aaron Swartz’s FOIA Requests Shed Light on His Struggle

From the Truthout article:

Swartz filed his first FOIA request in December 2010, more than two years after he landed on the government’s radar. He was seeking information about himself.

In 2008, Swartz’s friend and fellow open government activist Carl Malamud, the founder of the nonprofit public.resource.org, wanted to make federal court documents housed on the Public Access to Court Electronic Records system (PACER) available to the public for free. Using $600,000 he raised from supporters, Malamud purchased 50 years’ worth of appellate court documents and posted them on his website.

Then, the government started a pilot program in which access to federal court documents on PACER would be made available to users at no cost at 17 libraries around the country. Malamud urged activists like Swartz to visit the libraries, download the documents and send them over to him so he could make them available to the public via his website.

“So Aaron went to one of them and installed a small PERL script he had written that cycled sequentially through case numbers, requesting a new document from Pacer every three seconds, and uploading it to” Amazon’s Elastic Compute (EC2) Cloud server, Wired reported. “Aaron pulled nearly 20 million pages of public court documents, which are now available for free on the Internet Archive.”
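The approach that quote describes (step sequentially through case numbers, wait three seconds between requests, upload each document) can be sketched in Python. This is a hypothetical reconstruction for illustration; the URL pattern and the `archive` step are placeholders, not Aaron’s actual Perl script:

```python
import time
import urllib.error
import urllib.request

# Placeholder endpoint; the real PACER URLs were different.
BASE_URL = "https://pacer.example.gov/doc?case={num}"
DELAY_SECONDS = 3  # one request every three seconds


def fetch_case(num: int) -> bytes:
    """Download one document by sequential case number."""
    with urllib.request.urlopen(BASE_URL.format(num=num)) as resp:
        return resp.read()


def archive(num: int, data: bytes) -> None:
    """Stand-in for uploading to remote storage (e.g., EC2)."""
    with open(f"case_{num}.pdf", "wb") as f:
        f.write(data)


def crawl(start: int, stop: int) -> None:
    """Cycle through case numbers, pausing between requests."""
    for num in range(start, stop):
        try:
            archive(num, fetch_case(num))
        except urllib.error.HTTPError:
            pass  # gaps in the numbering are expected; skip them
        time.sleep(DELAY_SECONDS)
```

The fixed delay is the notable design choice: the script deliberately throttled itself rather than hammering the server.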

The court documents Swartz legally accessed were worth $1.5 million. The government shut down the PACER pilot program and the FBI launched an investigation. Malamud has since published on his website emails he exchanged with Swartz about the incident.

On December 10, 2010, Swartz filed a FOIA request with the Justice Department’s Criminal Division seeking “documents related to me, Aaron Swartz, as well as any documents related to any associated PACER investigation.” The Justice Department responded by stating it could not locate any records. He also filed an identical FOIA request that day with the Executive Office of United States Attorneys. That office identified 72 documents, all of which were withheld in full.

Case Challenging PACER Fees is allowed to move forward

Editor’s Note: we will be writing a lot about this in the weeks to come, regarding how the PACER system in the United States is highly questionable, as it actually forces members of the public to pay, page by page (and only if they have a credit card), to view the law.

Theodore D’Apuzzo received a favorable opinion, denying the U.S. Government’s Motion to Dismiss in his case against PACER.

In a nutshell:

  1. The case can proceed.
  2. Stay on discovery is now lifted.
  3. Government must now answer the complaint by October 10, 2017.