Category Archives: Aaron Swartz Day 2017

Meet “The Planet” Hackerspace In Cairo: A few words with organizer Tarek Nasr

About the Aaron Swartz Day Hackathon in Cairo, Egypt:

Date: November 4, 2017
Time: 10:00 am-6:00 pm
Contact: Tarek Nasr – shisha@theplanet.com.eg

Location: The Planet
9 Gamal Eldin Kholousy
off Shahin, Agouza
Cairo, Egypt

 

Tarek Nasr

ASD: Tell us about “The Planet” hackerspace.

TN: Okay, so within The Planet, we have 10 core team members. We all met during the #Jan25 revolution in Egypt. We were all active in taking to the streets, organizing online, creating digital content around the protests, and organizing to get activists out of jail.

For example, we played a big role in the online movement around the #FreeAJStaff campaign, when three Al Jazeera journalists were arrested in Egypt; that campaign ultimately helped get them released.

ASD: How are you able to be activists in Egypt, with so much at stake, should you offend the wrong person?

TN: There is no way to do so, to be honest. We have decreased our activity significantly.

ASD: As far as I can tell, you are already taking a risk even existing. Is that correct, to a certain extent? Or would you say that that is an exaggeration?

TN: It is not an exaggeration at all.

ASD: Can you talk about any more specifics with regard to the #FreeAJStaff campaign?

TN: Yes. If you recall, three Al Jazeera (AJ) journalists were arrested in what is referred to locally as the "Marriott Cell." Two of them were foreigners and the third was Egyptian. Local and international media focused only on the two foreign journalists, all the campaigns centered on them, and their governments (Australia and Canada) worked tirelessly to get them out.

We put together an online campaign aiming to get the third (Egyptian) journalist included in the narrative, so he could benefit indirectly from the buzz foreign governments and news outlets were making around the case. Within days we reached tens of thousands of people online and began being approached by international news outlets, including the BBC, that asked us about the third journalist. We acted as his voice to the world and ultimately succeeded in including him in the global and local narrative, which forced the government's hand to release him when it released the second foreign journalist.

ASD: So you decided that you worked well together, and formed “The Planet?”

TN: Yes. We started the business before the revolution, and we found ourselves and formed bonds during the revolution.

ASD: How did you guys hear about the hackathon and Aaron? Is there anything in particular, a personal story or something, that resonated with one or all of you and led you to want to participate this year?

TN: For me, Aaron is a role model. He could have just focused on making money, but he wanted to make sure he added true value, tried to make the world a better place, and essentially gave his life for what he believed in. I was first introduced to him two or three years ago, when I came across the documentary, and was absolutely shocked. I've been spreading "the gospel" ever since.

ASD: Is open source software popular in Egypt?

TN: Within a niche techie community, yes it is. The majority of developers here use open source technologies for work and personal projects. Most of the web market in Egypt is based on open source software and technologies: for example, WordPress, Drupal, Magento (which requires a license), PHP (Laravel), Python (Django), and JavaScript (Node.js, Angular.js).

ASD: What is licensing like for software in Egypt? Are you able to sell your software creations to the global market?

TN: Yes.

ASD: Are the app stores pretty much centralized, so it doesn’t matter where the actual software is being created?

TN: Yes.

ASD: Are young and independent developers being exploited in Egypt with this “gig economy” like they are here in the US? (Underpaid generally, no health benefits.)

TN: In Egypt they are actually thriving. Because the gig economy pays in USD and the exchange rate is so favorable, they can make a killing, on Upwork for example, and only need to work two weeks a month.

References

Al Jazeera Journo Mohamed Fahmy’s Egypt Hell Memoir ‘The Marriott Cell’ Being Developed Into Feature Film, Variety, Nick Vivarelli, February 1, 2016.

Egypt’s “Marriott Terror Cell” Travesty, The New Yorker, By Jon Lee Anderson, June 25, 2014.

A Brief History of Blockchain Name Systems – Lightning Talk

Title: A Brief History of Blockchain Name Systems
Speaker: John Light

Video of this talk: (YouTube) – (Internet Archive)

This talk took place Sunday, November 5, 2017

Description:
Aaron Swartz once published a blog post entitled “Squaring the Triangle,” hypothesizing that a blockchain could be used to create a name system that had secure, decentralized, and human-readable names, thus “squaring” Zooko’s Triangle.

Since that post was published, numerous blockchain name systems have been developed, putting Aaron’s idea into practice. This talk will give a brief overview of the most popular blockchain name systems* in production and show some of their applications.
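To make the idea concrete, here is a minimal sketch (in Python, with made-up names, keys, and values, and no actual blockchain) of the core mechanism these systems share: an append-only log of name registrations in which the first valid claim wins, so anyone who replays the same log independently arrives at the same name-to-value mapping without a central registrar.

```python
# Toy first-come-first-served name registry in the spirit of "Squaring the
# Triangle." Illustrative only: a real system (Namecoin, Blockstack, ENS)
# replaces this Python list with a blockchain, so that no single party
# controls the log. All names, keys, and addresses below are invented.
from dataclasses import dataclass
from typing import List, Optional

@dataclass(frozen=True)
class Registration:
    name: str    # human-readable name, e.g. "aaron"
    owner: str   # registrant's public key (placeholder string here)
    value: str   # what the name resolves to, e.g. an address or content hash

def resolve(log: List[Registration], name: str) -> Optional[Registration]:
    """Replay the append-only log; the earliest claim on a name wins."""
    for entry in log:
        if entry.name == name:
            return entry
    return None

# Everyone replaying the same log gets the same answer, which is what keeps
# the names secure and decentralized while remaining human-readable.
log = [
    Registration("aaron", "pubkey-A", "203.0.113.7"),
    Registration("aaron", "pubkey-B", "198.51.100.9"),  # later claim, ignored
]
print(resolve(log, "aaron"))  # -> the first ("pubkey-A") registration
```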

References:

  1. Squaring Zooko’s Triangle – Aaron Swartz’ Raw Thought, January 6, 2011 – http://www.aaronsw.com/weblog/squarezooko
  2. Frequently Asked Questions (FAQ), January 14, 2011 – https://squaretriangle.jottit.com/faq
  3. A Censorship-Resistant Web (Re: “The Distribution Problem”) – Aaron Swartz’ Raw Thought, December 21, 2010 – http://www.aaronsw.com/weblog/uncensor
  4. *The most popular blockchain name systems are:

Namecoin – https://namecoin.org/

Namecoin was the first fork of Bitcoin and still is one of the most innovative “altcoins”. It was first to implement merged mining and a decentralized DNS. Namecoin was also the first solution to Zooko’s Triangle, the long-standing problem of producing a naming system that is simultaneously secure, decentralized, and human-meaningful.

Blockstack – https://blockstack.org/

Blockstack is a new internet for decentralized apps where users own their data. With Blockstack, users get digital keys that let them own their identity. They sign in to apps locally without remote servers or identity providers.

Ethereum Name System (ENS) – https://ens.domains/

ENS offers a secure and decentralised way to address resources both on and off the blockchain using simple, human-readable names. ENS is built on smart contracts on the Ethereum blockchain, meaning it doesn’t suffer from the insecurity of the DNS system. You can be confident names you enter work the way their owner intended.
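For a concrete taste of the machinery under the hood, here is a small Python sketch of ENS's namehash algorithm (specified in EIP-137), which turns a human-readable, dot-separated name into the fixed 32-byte identifier that ENS contracts key their records on. It assumes the eth-utils package for keccak-256, and the example name is made up.

```python
# Minimal sketch of ENS's namehash (EIP-137). Assumes: pip install eth-utils
from eth_utils import keccak

def namehash(name: str) -> bytes:
    """Hash a dot-separated name label by label, rightmost label first."""
    node = b"\x00" * 32                          # namehash of the empty name
    if name:
        for label in reversed(name.split(".")):
            node = keccak(node + keccak(text=label))
    return node

# The same name always maps to the same 32-byte node, no matter who computes
# it, so records can be looked up without trusting a central directory.
print(namehash("example.eth").hex())
```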

Chelsea Looks Back At Her Teenage Years

Chelsea Manning will be speaking at the Fifth Annual Aaron Swartz Day Evening Event – Saturday, November 4, 2017 – 7:30 pm – TICKETS (Just going to the hackathon? It’s free.)

Chelsea E. Manning at the New York City Pride Parade, June 24, 2017

From October 8, 2017, in New York City (at the New Yorker Festival):

I grew up in central Oklahoma. A small town, Crescent, Oklahoma. And my parents were both voting Republicans and I wasn’t aware there was an alternative. Everybody held those views. And I didn’t really understand them.

I’m trans and I felt different than everybody else. I knew I was different. I didn’t have words to like, describe that. All of my friends. All of my family. All of my teachers. They all knew it as well. It felt like there was something about me that was different. It caused friction. And it caused difficulty for me.

My mother is British, and when my mother and my father split up, my mother decided to move back to the UK, and so I went and I spent four years there. I went to school there, ya know, and it was different. I was a kid from the Midwest. I didn't fit in. I didn't know. It was just a completely different world for me.

My father exposed me to computers at a young age. I learned how to program by the time I was about 8 or 9, although I didn’t fully understand probably till I was about 10. And my parents, we always had a computer in the house. And we always had internet access. So, it was a “normal” thing for me. Even though, at the time, in the early to mid 90s, it wasn’t a normal thing. And there were a lot of communities on the Internet in this time. And so, I was exploring. I was exploring who I was. I was exploring different ways of presenting myself.

I spent more time text messaging and instant messaging my friends than actually spending time with them. The term is IRL (In Real Life), but, ya know, we weren't spending a whole lot of time IRL. My mother didn't know how to write checks, so I used the internet to learn how. It ended up being a symbiotic relationship, but also my mother had a drinking problem, and as I got older, I realized how bad it was. And I love my mother. It's just, I realized this was not the environment I needed to be in at the time. So I decided to move after my mom had a medical problem happen. And it was a scare for me, because I realized, if something happened to my mother, I didn't have a backup plan. I didn't have anywhere else to go.

So, I moved back. We didn't get along, to say the least. I was 17, and I moved back to the States, and it was just very difficult because she (my father's wife) didn't like me, and so she was creating all these rules that were impossible to follow. Like, "you can't leave your bedroom after 8 pm."

So she called the police on me one night, after an argument. It was over a sandwich, because I wanted to have a sandwich. It was 8:30 at night. So, I went out of the room, and I used *her* kitchen, after like 8 o'clock or whatever, to like make a sandwich. It was a Swiss cheese and baloney sandwich. And I would cut it with a knife, so I had a knife in my hand. I wasn't wielding it or anything like that. She had run off and, like, called the police on me. And I'm just like, OK, that's weird. And so the Oklahoma Police Department knocked on the door. I'm like "hello," and they're like "we're here for a domestic incident." And I was like "Okay. She's in there." And so, like, the police officer understood what was going on. He basically said "you shouldn't go back there."

I borrowed my dad's truck. I ended up driving to Chicago and living on the streets there for a summer, and here I am living out of a pickup truck, and dealing with that.

My aunt did some detective work, and she asked around all the people that I used to hang out with. She told me that she called about 50 or 60 people, until she finally found somebody that had my cell phone number. So, I get a call from my aunt, and she’s like “come to my house,” and I did. I drove a night and a day, all the way to Maryland. And I lived with her for a year. It was so wonderful for her to be there for me at a time like this, and I realize now, that she really saved my life in many ways, and I didn’t realize it, I didn’t understand it at the time, cause I was so used to being in crisis mode that even whenever I was there, I was like “this is temporary.” So I was scared.

I was trying to re-establish a relationship with my father, and so I'm calling him, and he kept on saying "You need structure. You need the military. I was in the Navy for four years: you should go into the Navy or the Air Force." And, at that time, the Iraq war was going on. So I saw the images on TV every day of chaos and violence in Baghdad, and I really wanted to do something. And I joined the Army because, ya know, it was Baghdad, where the fight was, and I wanted to help with that. I thought, "if I become an intelligence analyst, I can use my skills or learn something, and make a difference, and maybe stop this." — Chelsea E. Manning, October 8, 2017.

Excerpt from WNYC's The New Yorker Radio Hour (starts at 3 minutes, 19 seconds in):
http://www.wnyc.org/story/chelsea-manning-life-after-prison/

Chelsea Manning to Technologists: Please Take the Time To Contemplate Your System’s Potential Misuse

Chelsea Manning will be speaking at the Fifth Annual Aaron Swartz Day Evening Event – Saturday, November 4, 2017 – 7:30 pm – TICKETS (Just going to the hackathon? It’s free.)

Chelsea E. Manning at Dolores Park in San Francisco, September 2017.

From October 8, 2017, in New York City (at the New Yorker Festival):

I think the most important thing that we have to learn, because I think it's been forgotten, is that every single one of us has the ability to change things. Each and every one of us has this ability. We need to look to each other and realize our values are what we care about, and then assert them, and say these things, and take action in our political discourse to make that happen. Because it's not going to happen at the ballot box. It's not.

Make your own decisions. Make your own choices. Make your own judgement.

You have to pay attention, for engineers in particular. We design and we develop systems, but the systems that we develop can be used for different things. The software that I was using in Iraq for predictive analysis was the same software that you would use in marketing. It's the same tools. It's the same analysis. I believe engineers and software engineers and technologists... (That's a new term that came out while I was away. :-)

I guess technologists should realize that we have an ethical obligation to make decisions that go beyond just meeting deadlines or creating a product. What actually takes some chunks of time is to say, "What are the consequences of this system?" "How can this be used?" "How can this be misused?" Let's try to figure out how we can mitigate a software system from being misused, or decide whether you want to implement it at all. There are systems that, if misused, could be very dangerous. — Chelsea E. Manning, October 8, 2017.

Excerpt from WNYC's The New Yorker Radio Hour (starts at 31:45):
http://www.wnyc.org/story/chelsea-manning-life-after-prison/

About the Ethical Algorithms Panel and Technology Track

This panel is part of the San Francisco Aaron Swartz Day Hackathon. Admission is FREE.

See Caroline Sinders and Kristian Lum, live at 2pm, on November 4th.

Technology Track – Ethical Algorithms
2:00 – 2:45 pm – Ethical Algorithms Panel – w/Q and A.
Kristian Lum (Human Rights Data Analysis Group – HRDAG) As the Lead Statistician at HRDAG, Kristian’s research focus has been on furthering HRDAG’s statistical methodology (population estimation or multiple systems estimation—with a particular emphasis on Bayesian methods and model averaging).
Caroline Sinders (Wikimedia Foundation) – Caroline uses machine learning to address online harassment at Wikimedia, and before that, she helped design and market IBM's Watson. Caroline was also just named one of Forbes' "8 AI Designers You Need to Know." Plus special guests TBA.

About the Ethical Algorithms Panel and Technology Track
by Lisa Rein, Co-founder, Aaron Swartz Day

I created this track based on my phone conversations with Chelsea Manning on this topic.

Chelsea was an Intelligence Analyst for the Army and used algorithms in the day-to-day duties of her job. She and I have been discussing algorithms, and their ethical implications, since the very first day we spoke on the phone, back in October 2015.

Chelsea recently published a New York Times Op-Ed on the subject: The Dystopia We Signed Up For.

From the Op-Ed:

“The consequences of our being subjected to constant algorithmic scrutiny are often unclear… algorithms are already analyzing social media habits, determining credit worthiness, deciding which job candidates get called in for an interview and judging whether criminal defendants should be released on bail. Other machine-learning systems use automated facial analysis to detect and track emotions, or claim the ability to predict whether someone will become a criminal based only on their facial features. These systems leave no room for humanity, yet they define our daily lives.”

A few weeks later, in December, I went to the Human Rights Data Analysis Group (HRDAG) holiday party and met HRDAG's Executive Director, Megan Price. She explained a great deal to me about the predictive software used by the Chicago police, and how it was predicting crime in the wrong neighborhoods based on the biased data it was getting from meatspace. Meaning, the data itself was "good" in that it was accurate, but unfortunately, the actual less-than-desirable behavior of the Chicago PD was being used as a guide for sending officers out into the field. Basically, the existing bad behavior of the Chicago PD was being used to assign its future behavior.

This came as a revelation to me. Here we have a chance to stop the cycle of bad behavior, by using technology to predict where the next real crime may occur, but instead we have chosen to memorialize the faulty techniques of the past in software, to be used forever.
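To see why feeding past arrest data back into patrol decisions locks in the old pattern, here is a toy Python simulation (not HRDAG's, Chicago's, or any vendor's actual model; the neighborhoods, rates, and counts are invented). Two neighborhoods have identical true crime rates, but because one starts with more recorded arrests, a naive "predict where arrests happened before" rule keeps sending most patrols there, and the new arrests those patrols generate keep confirming the prediction.

```python
# Toy feedback-loop simulation for the bias described above.
# NOT any department's real model: all numbers are invented for illustration.

true_crime_rate = {"A": 0.10, "B": 0.10}   # underlying crime rates are identical
arrests = {"A": 50, "B": 5}                # but the historical arrest data is skewed

for year in range(5):
    total = sum(arrests.values())
    # Naive "predictive" rule: allocate patrols in proportion to past arrests.
    patrol_share = {hood: count / total for hood, count in arrests.items()}
    # New arrests can only be recorded where officers are actually sent.
    for hood in arrests:
        arrests[hood] += round(100 * patrol_share[hood] * true_crime_rate[hood])
    print(year, {hood: round(share, 2) for hood, share in patrol_share.items()})

# Neighborhood A keeps receiving roughly 90% of the patrols even though its
# true crime rate is no higher than B's: yesterday's policing pattern is
# simply re-predicted as tomorrow's crime.
```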

I have gradually come to understand that, although these algorithms are being used in all aspects of our lives, it is not often clear how or why they are working. Now, it has become clear that they can develop their own biases, based on the data they have been given to “learn” from. Often the origin of that “learning data” is not shared with the public.

I'm not saying that we have to understand exactly how every useful algorithm works, which I understand would be next to impossible, but I'm not sure a completely "black box" approach is best, at least when the public, public data, and public safety are involved. (Thomas Hargrove's Murder Accountability Project's "open" database is one example of a transparent approach that seems to be doing good things.)

There also appears to be a disconnect within law enforcement. Some precincts seem content to rely on technology for direction, for better or worse, such as with the predictive software used by the Chicago Police Department. In other situations, such as Thomas Hargrove's Murder Accountability Project (featured in the article Murder He Calculated), technologists are having a hard time getting law enforcement to take these tools seriously. Even when the tools appear to have the potential to find killers, there seem to be numerous invisible hurdles in the way of any kind of timely implementation. Even in these "life and death" cases, Hargrove has had a very hard time getting anyone to listen to him.

So, how do we convince law enforcement to do more with some data while we are, at the same time, concerned about the oversharing of other forms of public data?

I find myself wondering what can even be done, if simple requests such as “make the NCIC database’s data for unsolved killings searchable” seem to be falling on deaf ears.

I am hoping to have some actual action items that can be followed up on in the months to come, as a result of this panel.

References:

1. The Dystopia We Signed Up For, Op-Ed by Chelsea Manning, New York Times, September 16, 2017. (Link goes to a free version not behind a paywall, at Op-Ed News)

2. Pitfalls of Predictive Policing, by Jessica Saunders for Rand Corporation, October 11, 2016. https://www.rand.org/blog/2016/10/pitfalls-of-predictive-policing.html

3. Predictions put into practice: a quasi-experimental evaluation of Chicago’s predictive policing pilot. by Jessica Saunders, Priscillia Hunt, John S. Hollywood, for the Journal of Experimental Criminology, August 12, 2016. https://link.springer.com/article/10.1007/s11292-016-9272-0

4. Murder He Calculated – by Robert Kolker, for Bloomberg.com, February 12, 2017.

5. Murder Accountability Project, founded by Thomas Hargrove. http://www.murderdata.org/

6. Secret Algorithms Are Deciding Criminal Trials and We’re Not Even Allowed to Test Their Accuracy – By Vera Eidelman, William J. Brennan Fellow, ACLU Speech, Privacy, and Technology Project, September 15, 2017. https://www.aclu.org/blog/privacy-technology/surveillance-technologies/secret-algorithms-are-deciding-criminal-trials-and

7. Machine Bias – There’s software used across the country to predict future criminals. And it’s biased against blacks. by Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner, ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

8. Criminality Is Not A Nail – A new paper uses flawed methods to predict likely criminals based on their facial features. by Katherine Bailey for Medium.com, November 29, 2016. https://medium.com/backchannel/put-away-your-machine-learning-hammer-criminality-is-not-a-nail-1309c84bb899


What Makes A Good Lightning Talk

Lightning Talk Schedule

Our lightning talks are only 20 minutes in length, and usually focus on working code – or often, a collection of working implementations that someone has done over time.

These are very advanced, not general in scope, and implementation-oriented. Additionally, the goal is to feature projects that represent our community’s ideals.

Saturday Lightning talks are meant to explain potential hackathon projects.

Sunday talks are to present work done on projects over the weekend.

Think of a topic this way:

– What is the exact problem space?
– How do you plan to fix it?
– How is this idea different from other ideas for fixing that problem?
– How have you *implemented* your idea? (Preferably with at least on-screen code, if not working code.)

Caroline Sinders Named By Forbes as an “AI Designer That You Need To Know”

See Caroline Sinders at this year's Aaron Swartz Day International Hackathon: at the San Francisco Hackathon's Ethical Algorithms Panel, Saturday at 2 pm, and at the evening event, Saturday night, November 4, at 7:30 pm.

"8 AI Designers That You Need To Know," by Adelyn Zhou for Forbes.

Caroline Sinders – Machine Learning Designer and Researcher, former Interaction Designer for IBM Watson

Caroline Sinders

Caroline is an artist, designer, and activist who also loves writing code. She helped design and market IBM Watson, a billion-dollar artificial intelligence system built on advanced natural language processing, automated reasoning, machine learning, and other technologies. Sinders' work on Watson focused on user flows and the impact of human decision-making in the development of robotics software. She recently left her dream job at IBM to pursue an equally challenging fellowship at Open Labs. A passionate crusader against online harassment, Caroline probes the different ways design can influence and shape digital conversations, with the ultimate goal of using machine learning to address online harassment. You can weigh her strong opinions on Twitter, Medium, LinkedIn, and her personal website.

Plan A November 4th Hackathon In Your Town

Hackathons are being planned for November 4th in San Francisco, New York, and even Cairo!

So far, the projects are the Freedom of the Press Foundation’s SecureDrop and these topics:

  1. Ethical Algorithms
  2. Usable Crypto
  3. Post-Quantum Crypto
  4. FOIA

Send an email to lisa@lisarein.com if you are planning a hackathon :-)

We are putting together new “Hackathon 101” materials too!

So, if you have good “how to have a hackathon” resources, please email them too! :-)

Setting the Record Straight

Seems like a good time for a reminder. (This content is from our “Setting the Record Straight” page that has been up since October 2014.)

FACT: Aaron implemented a piece of software that downloaded articles from the JSTOR website faster than JSTOR originally intended. Aaron’s software downloaded articles from the JSTOR website to Aaron’s laptop, just like a live person would have downloaded them, but without his having to sit there and click through each of the steps manually.  Source: Alex Stamos, http://unhandled.com/2013/01/12/the-truth-about-aaron-swartzs-crime/

FACT: Aaron did not hack into any of MIT’s computers. The CFAA requires that a person gain access to a computer that they weren’t authorized to access. Aaron was obviously authorized to access his own laptop.

FACT: Aaron did not hack into MIT's network. Aaron connected his laptop to MIT's open network by walking into an open computer closet on MIT's open campus and simply plugging into an unused ethernet port.  Source: Alex Stamos, http://unhandled.com/2013/01/12/the-truth-about-aaron-swartzs-crime/

FACT: Aaron was a “Fellow” at the Harvard University Edmond J. Safra Center for Ethics at the time. Aaron was exactly the type of academic researcher that MIT meant to have downloading articles from the JSTOR database over its open network. Aaron’s past research in this regard was the basis of a Stanford Law Review Article where he found troubling connections between corporations and their funding of legal research. Source: Stanford Law Review
http://www.stanfordlawreview.org/print/article/punitive-damages-remunerated-research-and-legal-profession

FACT: Aaron wasn't even violating JSTOR's Terms of Service at the time. JSTOR and MIT had contractual agreements allowing unlimited downloads to any computers on MIT's network.
Source: Alex Stamos, http://unhandled.com/2013/01/12/the-truth-about-aaron-swartzs-crime/

FACT: Downloading JSTOR articles was one minor footnote among the many amazing projects Aaron was working on at the time. From the fall of 2010 until his death in 2013, Aaron's projects included, but were not limited to: SecureDrop, the leak-protecting technology for journalists now implemented by outlets ranging from The New Yorker to Forbes to The Guardian; the SOPA/PIPA fight; The Flaming Sword of Justice (now The Good Fight), a podcast about activism which went on to reach the top of the iTunes charts; VictoryKit, an online campaigning toolset still mobilizing activists around the world; and co-founding Demand Progress.