
Chelsea Manning, Caroline Sinders, and Kristian Lum: “Technologists, It’s Time to Decide Where You Stand On Ethics”

(Left to Right) Kristian Lum, Caroline Sinders, Chelsea Manning.

A lot of folks were wondering what Chelsea Manning meant when she discussed a “Code of Ethics” during her SXSW talk last March. Well, there’s no need to wonder, because Chelsea discussed this in detail, with her co-panelists Kristian Lum (Human Rights Data Analysis Group) and Caroline Sinders (Wikimedia Foundation), during the Ethical Algorithms track at the last Aaron Swartz Day at the Internet Archive.


By Lisa Rein for Mondo 2000.

Link to the complete video of the Ethical Algorithms panel.

Chelsea Manning

Chelsea Manning: Me personally, I think that we in technology have a responsibility to make our own decisions in the workplace – wherever that might be. And to communicate with each other, share notes, talk to each other, and really think – take a moment – and think about what you are doing. What are you doing? Are you helping? Are you harming things? Is it worth it? Is this really what you want to be doing? Are deadlines being prioritized over – good results? Should we do something? I certainly made a decision in my own life to do something. It’s going to be different for every person. But you really need to make your own decision as to what to do, and you don’t have to act individually.

Kristian Lum and Caroline Sinders.

Caroline Sinders: Even if you feel like a cog in the machine, as a technologist, you aren’t. There are a lot of people like you trying to protest the systems you’re in. Especially in the past year, we’ve heard rumors of widespread groups and meetings of people inside of Facebook, inside of Google, really talking about the ramifications of the U.S. Presidential election, of questioning, “how did this happen inside these platforms?” – of wanting there even to be accountability inside of their own companies. I think it’s really important for us to think about that for a second. That that’s happening right now. That people are starting to organize. That they are starting to ask questions.

Aaron Swartz Ceramic Statue (by Nuala Creed) and Kristian Lum.

Kristian Lum: There are a lot of models now predicting whether an individual will be re-arrested in the future. Here’s a question: What counts as a “re-arrest?” Say someone fails to appear for court and a bench warrant is issued, and then they are arrested. Should that count? So I don’t see a whole lot of conversation about this data munging.
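Kristian’s point about label definitions can be made concrete with a toy sketch (all records and field names below are invented for illustration, not taken from any real dataset): whether bench-warrant arrests count as “re-arrest” changes the outcome variable a model is trained on, before any modeling even begins.

```python
# Hypothetical arrest records; "event" values are invented labels.
records = [
    {"id": 1, "event": "new_offense_arrest"},
    {"id": 2, "event": "bench_warrant_arrest"},  # failure to appear
    {"id": 3, "event": None},                    # no arrest
    {"id": 4, "event": "bench_warrant_arrest"},
]

def label(record, count_bench_warrants):
    """Binary 're-arrest' outcome used to train a risk model."""
    if record["event"] == "new_offense_arrest":
        return 1
    if record["event"] == "bench_warrant_arrest":
        return 1 if count_bench_warrants else 0
    return 0

# The same four people yield different "re-arrest" rates depending
# on a munging decision made long before any model is fit.
broad = sum(label(r, count_bench_warrants=True) for r in records)
narrow = sum(label(r, count_bench_warrants=False) for r in records)
print(broad, narrow)  # 3 1
```

The model downstream is identical either way; the quiet choice of what counts as the outcome is what differs.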

Read the whole thing here. Watch the whole video here.

See all the Aaron Swartz Day 2017 videos here with the New Complete Speaker Index!

Thanks to ThoughtWorks for sponsoring the Ethical Algorithms Track at Aaron Swartz Day 2017. This track has also led to the launch of our Aaron Swartz Day Police Surveillance Project, and we have lots to tell you all about it, very soon :-)

About the Ethical Algorithms Panel and Technology Track

This panel is part of the San Francisco Aaron Swartz Day Hackathon. Admission is FREE.

See Caroline Sinders and Kristian Lum, live at 2pm, on November 4th.

Technology Track – Ethical Algorithms
2:00 – 2:45 pm – Ethical Algorithms Panel – w/Q and A.
Kristian Lum (Human Rights Data Analysis Group – HRDAG) As the Lead Statistician at HRDAG, Kristian’s research focus has been on furthering HRDAG’s statistical methodology (population estimation or multiple systems estimation—with a particular emphasis on Bayesian methods and model averaging).
Caroline Sinders (Wikimedia Foundation) – Caroline uses machine learning to address online harassment at Wikimedia, and before that, she helped design and market IBM’s Watson. Caroline was also just named one of Forbes’ “8 AI Designers You Need to Know.” Plus special guests TBA.

About the Ethical Algorithms Panel and Technology Track
by Lisa Rein, Co-founder, Aaron Swartz Day

I created this track based on my phone conversations with Chelsea Manning on this topic.

Chelsea was an Intelligence Analyst for the Army and used algorithms in the day to day duties of her job. She and I have been discussing algorithms, and their ethical implications, since the very first day we spoke on the phone, back in October 2015.

Chelsea recently published a New York Times op-ed on the subject: The Dystopia We Signed Up For.

From the Op-Ed:

“The consequences of our being subjected to constant algorithmic scrutiny are often unclear… algorithms are already analyzing social media habits, determining credit worthiness, deciding which job candidates get called in for an interview and judging whether criminal defendants should be released on bail. Other machine-learning systems use automated facial analysis to detect and track emotions, or claim the ability to predict whether someone will become a criminal based only on their facial features. These systems leave no room for humanity, yet they define our daily lives.”

A few weeks later, in December, I went to the Human Rights Data Analysis Group (HRDAG) holiday party, and met HRDAG’s Executive Director, Megan Price. She explained a great deal to me about the predictive software used by the Chicago police, and how it was predicting crime in the wrong neighborhoods based on the biased data it was getting from meatspace. Meaning, the data itself was “good” in that it was accurate, but unfortunately, the actual less-than-desirable behavior of the Chicago PD was being used as a guide for sending officers out into the field. Basically, the Chicago PD’s existing bad behavior was being used to dictate its future behavior.

This came as a revelation to me. Here we have a chance to stop the cycle of bad behavior, by using technology to predict where the next real crime may occur, but instead, we have chosen to enshrine the faulty techniques of the past in software, to be used forever.
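The feedback loop Megan Price described can be caricatured in a few lines of Python. This is a deliberately crude sketch with invented numbers — not HRDAG’s analysis or the actual Chicago system: two neighborhoods have the same true crime rate, but patrols follow historical arrest counts, and arrests can only be recorded where officers are present.

```python
# Two neighborhoods with the SAME underlying crime rate, but "A"
# starts with more recorded arrests because it was over-policed.
true_crime_rate = {"A": 0.1, "B": 0.1}  # identical by construction
recorded = {"A": 20, "B": 10}           # biased historical data

for day in range(1000):
    # "Predictive" dispatch: patrol wherever past data shows the
    # most arrests.
    patrol = max(recorded, key=recorded.get)
    # A deterministic stand-in for the 10% crime rate; crime occurs
    # equally often in both neighborhoods, but an arrest is only
    # recorded in the patrolled one.
    if day % 10 == 0:
        recorded[patrol] += 1

print(recorded)  # {'A': 120, 'B': 10}
```

Neighborhood B’s crime is never observed, so the data keep “confirming” that A is where the crime is — the initial bias is amplified, never corrected.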

I have gradually come to understand that, although these algorithms are being used in all aspects of our lives, it is not often clear how or why they are working. Now, it has become clear that they can develop their own biases, based on the data they have been given to “learn” from. Often the origin of that “learning data” is not shared with the public.

I’m not saying that we have to understand exactly how every useful algorithm works, which I understand would be next to impossible. But I’m not sure a completely “black box” approach is best, at least when the public, public data, and public safety are involved. (Thomas Hargrove’s Murder Accountability Project, with its “open” database, is one example of a transparent approach that seems to be doing good things.)

There also appears to be a disconnect within law enforcement. Some precincts seem content to rely on technology for direction, for better or worse, as with the predictive software used by the Chicago Police Department. In other situations, such as Thomas Hargrove’s Murder Accountability Project (featured in the article “Murder He Calculated”), technologists are having a hard time getting law enforcement to take these tools seriously. Even when these tools appear to have the potential to find killers, there are numerous invisible hurdles in the way of any kind of timely implementation. Even in these “life and death” cases, Hargrove has had a very hard time getting anyone to listen to him.

So, how do we convince law enforcement to do more with some data while we are, at the same time, concerned about the oversharing of other forms of public data?

I find myself wondering what can even be done, if simple requests such as “make the NCIC database’s data for unsolved killings searchable” seem to be falling on deaf ears.

I am hoping to have some actual action items that can be followed up on in the months to come, as a result of this panel.

References:

1. The Dystopia We Signed Up For, Op-Ed by Chelsea Manning, New York Times, September 16, 2017. (Link goes to a free version not behind a paywall, at Op-Ed News)

2. Pitfalls of Predictive Policing, by Jessica Saunders for Rand Corporation, October 11, 2016. https://www.rand.org/blog/2016/10/pitfalls-of-predictive-policing.html

3. Predictions put into practice: a quasi-experimental evaluation of Chicago’s predictive policing pilot, by Jessica Saunders, Priscillia Hunt, and John S. Hollywood, for the Journal of Experimental Criminology, August 12, 2016. https://link.springer.com/article/10.1007/s11292-016-9272-0

4. Murder He Calculated, by Robert Kolker, for Bloomberg.com, February 12, 2017.

5. Murder Accountability Project, founded by Thomas Hargrove. http://www.murderdata.org/

6. Secret Algorithms Are Deciding Criminal Trials and We’re Not Even Allowed to Test Their Accuracy – By Vera Eidelman, William J. Brennan Fellow, ACLU Speech, Privacy, and Technology Project, September 15, 2017. https://www.aclu.org/blog/privacy-technology/surveillance-technologies/secret-algorithms-are-deciding-criminal-trials-and

7. Machine Bias – There’s software used across the country to predict future criminals. And it’s biased against blacks. by Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner, ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

8. Criminality Is Not A Nail – A new paper uses flawed methods to predict likely criminals based on their facial features. by Katherine Bailey for Medium.com, November 29, 2016. https://medium.com/backchannel/put-away-your-machine-learning-hammer-criminality-is-not-a-nail-1309c84bb899


 

Caroline Sinders Named By Forbes as an “AI Designer That You Need To Know”

See Caroline Sinders at this year’s Aaron Swartz Day International Hackathon, at the San Francisco Hackathon‘s Ethical Algorithm Panel, Saturday at 2pm, and at the evening event, Saturday night, November 4, 7:30 pm.

8 AI Designers That You Need To Know by Adelyn Zhou for Forbes.

Caroline Sinders – Machine Learning Designer and Researcher, former Interaction Designer for IBM Watson

Caroline Sinders

Caroline is an artist, designer, and activist who also loves writing code. She helped design and market IBM Watson, a billion-dollar artificial intelligence system built on advanced natural language processing, automated reasoning, machine learning, and other technologies. Sinders’ work on Watson focused on user flows and the impact of human decision-making in the development of robotics software. She recently left her dream job at IBM to pursue an equally challenging fellowship at Open Labs. A passionate crusader against online harassment, Caroline probes the different ways design can influence and shape digital conversations, with the ultimate goal of using machine learning to address online harassment. You can weigh her strong opinions on Twitter, Medium, LinkedIn, and her personal website.

Time for this year’s Aaron Swartz Day and International Hackathon

TICKETS 

The Internet Archive is hosting the Fifth Annual Aaron Swartz Day International Hackathon and Evening Event:

Location: Internet Archive, 300 Funston Ave, San Francisco, CA 94118

November 4, 2017: 6:00–7:00 pm (Reception), 7:30–9:30 pm (Speakers)

The purpose of the evening event, as always, is to inspire direct action toward improving the world. Everyone has been asked to speak about whatever they feel is most important.

The event will take place following this year’s San Francisco Aaron Swartz International Hackathon, which is going on Saturday, November 4, from 10-6 and Sunday, November 5, from 11am-6pm at the Internet Archive.

Hackathon Reception: 6:00–7:00 pm. (A paid ticket for the evening event also gets you into the Hackathon Reception.)

Come talk to the speakers and the rest of the Aaron Swartz Day community, and join us in celebrating the many incredible things that we’ve accomplished this year! (Although there is still much work to be done.)

We will toast to the launch of the Pursuance Project (an open source, end-to-end encrypted Project Management suite, envisioned by Barrett Brown and brought to life by Steve Phillips).

Migrate your way upstairs: 7:00–7:30 pm. The speakers are starting early this year, at 7:30 pm, and we are also providing a stretch break at 8:15 pm, for those who might have arrived late.

Speakers upstairs begin at 7:30 pm.

Speakers in reverse order:                

Chelsea Manning (Network Security Expert, Former Intelligence Analyst)

Lisa Rein (Chelsea Manning’s Archivist, Co-founder Creative Commons, Co-founder Aaron Swartz Day)

Daniel Rigmaiden (Transparency Advocate)

Barrett Brown (Journalist, Activist, Founder of the Pursuance Project) (via SKYPE)

Jason Leopold (Senior Investigative Reporter, Buzzfeed News)

Jennifer Helsby (Lead Developer, SecureDrop, Freedom of the Press Foundation)

Cindy Cohn (Executive Director, Electronic Frontier Foundation)

Gabriella Coleman (Hacker Anthropologist, Author, Researcher, Educator)

Caroline Sinders (Designer/Researcher, Wikimedia Foundation, Creative Dissent Fellow, YBCA)

Brewster Kahle (Co-founder and Digital Librarian, Internet Archive, Co-founder Aaron Swartz Day)

Steve Phillips (Project Manager, Pursuance)

Mek Karpeles (Citizen of the World, Internet Archive)

Brenton Cheng (Senior Engineer, Open Library, Internet Archive)

TICKETS

About the Speakers (speaker bios are at the bottom of this invite):

Chelsea Manning – Network Security Expert, Transparency Advocate

Chelsea E. Manning is a network security expert, whistleblower, and former U.S. Army intelligence analyst. While serving seven years of an unprecedented 35-year sentence for a high-profile leak of government documents, she became a prominent and vocal advocate for government transparency and transgender rights, both on Twitter and through her op-ed columns for The Guardian and The New York Times. She currently lives in the Washington, D.C. area, where she writes about technology, artificial intelligence, and human rights.

Lisa Rein – Chelsea Manning’s Archivist, Co-founder, Aaron Swartz  Day & Creative Commons

Lisa Rein is Chelsea Manning’s archivist, and ran her @xychelsea Twitter account from December 2015 – May 2017. She is a co-founder of Creative Commons, where she worked with Aaron Swartz on its technical specification, when he was only 15. She is a writer, musician and technology consultant, and lectures for San Francisco State University’s BECA department. Lisa is the Digital Librarian for the Dr. Timothy Leary Futique Trust.

Daniel Rigmaiden – Transparency Advocate

Daniel Rigmaiden became a government transparency advocate after U.S. law enforcement used a secret cell phone surveillance device to locate him inside his home. The device, often called a “Stingray,” simulates a cell tower and tricks cell phones into connecting to a law enforcement controlled cellular network used to identify, locate, and sometimes collect the communications content of cell phone users. Before Rigmaiden brought Stingrays into the public spotlight in 2011, law enforcement concealed use of the device from judges, defense attorneys and defendants, and would typically not obtain a proper warrant before deploying the device.

Barrett Brown – Journalist, Activist, and Founder of the Pursuance Project

Barrett Brown is a writer and anarchist activist. His work has appeared in Vanity Fair, the Guardian, The Intercept, Huffington Post, New York Press, Skeptic, The Daily Beast, al-Jazeera, and dozens of other outlets. In 2009 he founded Project PM, a distributed think-tank, which was later re-purposed to oversee a crowd-sourced investigation into the private espionage industry and the intelligence community at large via e-mails stolen from federal contractors and other sources. In 2011 and 2012 he worked with Anonymous on campaigns involving the Tunisian revolution, government misconduct, and other issues. In mid-2012 he was arrested and later sentenced to four years in federal prison on charges stemming from his investigations and work with Anonymous. While imprisoned, he won the National Magazine Award for his column, The Barrett Brown Review of Arts and Letters and Prison. Upon his release, in late 2016, he began work on the Pursuance System, a platform for mass civic engagement and coordinated opposition. His third book, a memoir/manifesto, will be released in 2018 by Farrar, Straus and Giroux.

Jason Leopold, Senior Investigative Reporter, Buzzfeed News

Jason Leopold is an Emmy-nominated investigative reporter on the BuzzFeed News Investigative Team. Leopold’s reporting and aggressive use of the Freedom of Information Act has been profiled by dozens of media outlets, including a 2015 front-page story in The New York Times. Politico referred to Leopold in 2015 as “perhaps the most prolific Freedom of Information requester.” That year, Leopold, dubbed a “FOIA terrorist” by the US government, testified before Congress about FOIA (PDF) (Video). In 2016, Leopold was awarded the FOI award from Investigative Reporters & Editors and was inducted into the National Freedom of Information Hall of Fame by the Newseum Institute and the First Amendment Center.

Jennifer Helsby, Lead Developer, SecureDrop (Freedom of the Press Foundation)

Jennifer is Lead Developer of SecureDrop. Prior to joining FPF, she was a postdoctoral researcher at the Center for Data Science and Public Policy at the University of Chicago, where she worked on applying machine learning methods to problems in public policy. Jennifer is also the CTO and co-founder of Lucy Parsons Labs, a non-profit that focuses on police accountability and surveillance oversight. In a former life, she studied the large scale structure of the universe, and received her Ph.D. in astrophysics from the University of Chicago in 2015.

Cindy Cohn – Executive Director, Electronic Frontier Foundation (EFF)

Cindy Cohn is the Executive Director of the Electronic Frontier Foundation. From 2000-2015 she served as EFF’s Legal Director as well as its General Counsel. The National Law Journal named Ms. Cohn one of the 100 most influential lawyers in America in 2013, noting: “[I]f Big Brother is watching, he better look out for Cindy Cohn.”

Gabriella Coleman – Hacker Anthropologist, Author, Researcher, Educator

Gabriella (Biella) Coleman holds the Wolfe Chair in Scientific and Technological Literacy at McGill University. Trained as an anthropologist, her scholarship explores the politics and cultures of hacking, with a focus on the sociopolitical implications of the free software movement and the digital protest ensemble Anonymous. She has authored two books, Coding Freedom: The Ethics and Aesthetics of Hacking (Princeton University Press, 2012) and Hacker, Hoaxer, Whistleblower, Spy: The Many Faces of Anonymous (Verso, 2014).

Caroline Sinders – Researcher/Designer, Wikimedia Foundation

Caroline Sinders is a machine learning designer, user researcher, and artist. For the past few years, she has been focusing on the intersections of natural language processing, artificial intelligence, abuse, online harassment, and politics in digital, conversational spaces. Caroline is a designer and researcher at the Wikimedia Foundation, and a Creative Dissent fellow with YBCA. She holds a master’s degree from New York University’s Interactive Telecommunications Program.

Brewster Kahle, Founder & Digital Librarian, Internet Archive

Brewster Kahle has spent his career intent on a singular focus: providing Universal Access to All Knowledge. He is the founder and Digital Librarian of the Internet Archive, which now preserves 20 petabytes of data – the books, Web pages, music, television, and software of our cultural heritage, working with more than 400 library and university partners to create a digital library, accessible to all.

Steve Phillips, Project Manager, Pursuance Project

Steve Phillips is a programmer, philosopher, and cypherpunk, and is currently the Project Manager of Barrett Brown’s Pursuance Project. In 2010, after double-majoring in mathematics and philosophy at UC Santa Barbara, Steve co-founded Santa Barbara Hackerspace. In 2012, in response to his concerns over rumored mass surveillance, he created his first secure application, Cloakcast. And in 2015, he spoke at the DEF CON hacker conference, where he presented CrypTag. Steve has written over 1,000,000 words of philosophy culminating in a new philosophical methodology, Executable Philosophy.

Mek Karpeles, Citizen of the World, Internet Archive

Mek is a citizen of the world at the Internet Archive. His life mission is to organize a living map of the world’s knowledge. With it, he aspires to empower every person to overcome oppression, find and create opportunity, and reach their fullest potential to do good. Mek’s favorite media includes non-fiction books and academic journals — tools to educate the future — which he proudly helps make available through his work on Open Library.

Brenton Cheng, Senior Engineer, Open Library, Internet Archive

Brenton Cheng is a technology-wielding explorer, inventor, and systems thinker. He spearheads the technical and product development of Open Library and the user-facing Archive.org website. He is also an adjunct professor in the Performing Arts & Social Justice department at University of San Francisco.

TICKETS

For more information, contact:

Lisa Rein, Co-founder, Aaron Swartz Day
lisa@lisarein.com
http://www.aaronswartzday.org