Chelsea Manning’s Statement for the Fourth Annual Aaron Swartz Day and International Hackathon

Please sign the petition: “President Obama: Give Chelsea Manning #timeserved.”

(As read to the crowd at Aaron Swartz Day, at the Internet Archive, San Francisco, November 5, 2016)

By Chelsea Manning

Thinking forward, I can imagine — or envision really — a world of endless possibilities. This is a world where new technology can clean up the environment and start to repair centuries of damage from human activity. Our cities would become more integrated, optimized, and harmonized. Our health would improve, and improvements in safety would dramatically decrease accidental deaths. New opportunities for work, education and recreation would spread. Our lives would be better — a “utopia.”

That said — I can also envision a world of despair. This is a world where technology has divided society into two distinctly unequal classes. Military, law enforcement, and intelligence are indistinguishably blended. This world fosters an extensive police and surveillance state. What I like to call “microcrimes” — relatively minor actions that, for those who don’t have power, are policed and enforced aggressively — follow you for the rest of your life. Identification cards and keys, as well as RFIDs and their cousins, are intertwined and enmeshed into all aspects of life — from shopping at the store, to walking into a subway station. Loss of unskilled jobs would cause depression and idleness. In essence, our lives would be worse — a “dystopia.”

Yet, these two worlds are not mutually exclusive. These worlds, in some regard, actually exist. The debates over issues such as income inequality, economic policy, and civil liberties are no longer separated from the technology sector. The algorithms and platforms we develop increasingly act as a new “invisible arbiter,” determining who wins and who loses in a zero-sum game. There’s now commercial, political and legal separation — and sometimes discrimination.

In fact, our technology has rapidly gentrified our cities. Just take a moment sometime and look around you. We have created an increasingly segregated society. This is especially visible here in the San Francisco Bay Area. Of course, there is no conspiracy, but it is becoming clear that those of us who are skilled and lucky can end up working in Palo Alto, Mountain View, or downtown San Francisco — while others move further and further away from the opportunities in our cities and in our corporations.

Consider machine learning — how are our logical “black boxes” working? Neural networks provide us with opportunities for noticing correlations — like how Republicans are more likely to own a truck or SUV, and Democrats are more likely to use public transportation or car sharing. The enormous information asymmetry that is developing between algorithms, their mechanisms, and public understanding is particularly troubling. Are our algorithms creating self-fulfilling prophecies? Can they go horribly wrong? Sometimes this can be comical — just look at the “deep dream” technique that produces trippy JPEGs. Or it can be dangerous and deadly. This is especially the case for “self-driving” or “self-flying” vehicles. If we weaponize our algorithms for the politically uncertain “cyberwar gap” — I must point out here that the prefix “cyber” makes me gag — are we going to be able to contain and control them once they start to adapt?
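To make the correlation point concrete, here is a minimal sketch of how a simple classifier latches onto correlations like the vehicle and transit examples above. The data is entirely synthetic and the feature names are hypothetical; the point is only that the learned weights encode correlation in the data, not causation in the world.

```python
# A minimal sketch of how a simple classifier picks up demographic
# correlations. The data is entirely synthetic and the feature names
# ("owns_truck_or_suv", "uses_transit_or_carshare") are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

owns_truck = rng.integers(0, 2, n)    # 1 = owns a truck or SUV
uses_transit = rng.integers(0, 2, n)  # 1 = uses transit or car sharing

# A synthetic "party" label that is merely correlated with the features.
p = 0.5 + 0.2 * owns_truck - 0.2 * uses_transit
label = rng.random(n) < p

X = np.column_stack([owns_truck, uses_transit])
model = LogisticRegression().fit(X, label)

# The learned weights act as the "invisible arbiter": they reflect
# whatever patterns were in the training data, nothing more.
print(dict(zip(["owns_truck_or_suv", "uses_transit_or_carshare"],
               model.coef_[0].round(2))))
```

Nothing in such a model explains why the correlation holds, yet decisions built on top of it can quietly sort people all the same.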

Is the Google search engine going to suddenly “come alive” and claim global, military, and political superiority in order to more effectively provide relevant search results? You might laugh, but do we know whether this is really possible or not? I suspect you know the answer.

Who is responsible if things go wrong? If a car crashes and injures you, who takes the blame? If a state-created computer virus goes berserk, who do you point the finger at?

We need to make our algorithms and machine learning mechanisms as accountable and transparent as possible. We should tread carefully and thoughtfully as our sometimes awkward selves quickly enter into the politics and ethics of technology.

There’s already been a promising debate in the public. Even in the “mainstream,” we are seeing opinion columns and editorials that are asking these questions. We are bringing our conundrums to the attention of an increasingly curious, diligent and aware public. We have a responsibility to continue to encourage the spread of this debate. Now, what about our “sprawling surveillance apparatus?” Apple and the FBI had a legal feud over phone encryption this year. How many other feuds are happening behind the scenes? How many small and medium-sized companies and organizations are responding? Are they quietly complying?

Even if we can legally protect our information, how do we protect it for the long term, when someone could potentially build a quantum computer 10 or 15 years from now that renders today’s encryption horribly obsolete? We need to develop a viable “post-quantum” encryption system. There are several current proposals worth exploring — such as “lattice-based cryptography,” which I have found an interest in myself lately.
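For a sense of what “lattice-based” means in practice, here is a toy, deliberately insecure sketch in the spirit of learning-with-errors constructions. The parameters, names, and structure are illustrative assumptions only; real post-quantum proposals are far more involved and nothing here should be treated as a usable scheme.

```python
# A toy, insecure sketch of lattice-style (learning-with-errors) encryption
# of a single bit. Illustrative only: tiny parameters, no security.
import numpy as np

rng = np.random.default_rng(1)
n, m, q = 16, 32, 3329           # toy dimensions and modulus

# Key generation: secret vector s, public pair (A, b = A*s + e mod q).
s = rng.integers(0, q, n)
A = rng.integers(0, q, (m, n))
e = rng.integers(-1, 2, m)       # small noise in {-1, 0, 1}
b = (A @ s + e) % q

def encrypt(bit):
    """Encrypt one bit as a pair (u, v) built from random public rows."""
    r = rng.integers(0, 2, m)    # random 0/1 selection of rows
    u = (r @ A) % q
    v = (r @ b + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    """Remove the secret's contribution; the leftover noise is small,
    so the result sits near 0 (bit 0) or near q/2 (bit 1)."""
    d = (v - u @ s) % q
    return int(min(d, q - d) > q // 4)

for bit in (0, 1):
    assert decrypt(*encrypt(bit)) == bit
print("toy LWE round-trip works")
```

The appeal of this family of schemes is that recovering the secret from the noisy public data is believed to stay hard even for quantum computers, which is exactly the long-term property the credit-card and medical-record examples below call for.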

Time is not on our side. It’s one thing to worry about encryption of frequently expiring credit card information. What about medical records — or, mental health records? What about users of SecureDrop? How can we protect journalistic sources for years to come?

I just want you to ponder these things when you go home, or to your hotel, or wherever you just happen to sleep: Are we doing the right things? Are we paying attention to the right issues? Is what we are creating, developing or modifying going to have an impact on someone? What is it going to look like? Can you think of anything from your own work and experience?

Aaron was an insatiably curious person. His boundless curiosity reminds me of the physicist Richard Feynman. This was his greatest strength. Yet we now know, from Aaron, that curiosity might be punished, so it might be good to think through any necessary legal defenses ahead of time.

Nevertheless, we need to continue to be curious. We need to ask questions. How else are we going to understand our world?