Tag Archives: ACLU

ACLU: Amazon Needs To Get Out Of The Surveillance Business

“But wait,” you may say, “I didn’t even know Amazon was IN the surveillance business.”

Yeah. Neither did we. :-/

This is pretty much our worst fears realized: a huge corporation quietly implementing biased facial recognition software without any oversight from anyone.

Needless to say, this situation falls under the territory of our #EthicalAlgorithms mandate.

Here’s an ACLU Petition with links to more information:

Amazon: Get out of the surveillance business

(https://action.aclu.org/petition/amazon-stop-selling-surveillance)

We are still evaluating the documents and will be planning a specific strategy to deal with this situation – Aaron Swartz Day style :-)

We have been making enormous progress on the Aaron Swartz Day Police Surveillance Project – a 100% successful experiment done in collaboration with the EFF, Oakland Privacy, cell phone privacy expert Daniel Rigmaiden, and the wonderful MuckRock.

The project provides letter templates to make it easy to ask your local police and sheriff’s departments what surveillance equipment they may have already purchased; they have to give you receipts and contracts if you guess correctly. (It’s like a little game show.)

So we are still in catch up mode at this time – but we are on the case. And we have many experts and technologists working to explain and expose the truth, before it’s too late.

If we can’t stop it from being implemented in the short term, perhaps we can develop technologies to stop it from functioning properly. While we are working out these issues in the courts, there is nothing saying we can’t share information and take defensive action. If you know techniques that folks should know about, email us at aaronswartzday [@] gmail.com

More on the situation from the New York Times.

New York Times: Amazon Pushes Facial Recognition to Police.

Sign the ACLU petition here. More on this issue here.

The Ethical Algorithms Panel & Track will be even more full than last year – at Aaron Swartz Day 2018’s San Francisco Hackathon. We will have projects for you to hack on from afar. (Keep your eyes right here for more information this week! :-) ProPublica’s story on Machine Bias is here.

New York Times: Amazon Pushes Facial Recognition to Police.

By Nick Wingfield for the NY Times:

On Tuesday, the American Civil Liberties Union led a group of more than two dozen civil rights organizations that asked Amazon to stop selling its image recognition system, called Rekognition, to law enforcement. The group says that the police could use it to track protesters or others whom authorities deem suspicious, rather than limiting it to people committing crimes.

Here is the full text of the article – because, in our opinion, it is a clear-cut case of Fair Use, being information that is clearly in the public interest (and should not be behind a paywall in the first place).

*****

By Nick Wingfield

May 22, 2018

SEATTLE — In late 2016, Amazon introduced a new online service that could help identify faces and other objects in images, offering it to anyone at a low cost through its giant cloud computing division, Amazon Web Services.

Not long after, it began pitching the technology to law enforcement agencies, saying the program could aid criminal investigations by recognizing suspects in photos and videos. It used a couple of early customers, like the Orlando Police Department in Florida and the Washington County Sheriff’s Office in Oregon, to encourage other officials to sign up.

But now that aggressive push is putting the giant tech company at the center of an increasingly heated debate around the role of facial recognition in law enforcement. Fans of the technology see a powerful new tool for catching criminals, but detractors see an instrument of mass surveillance.

On Tuesday, the American Civil Liberties Union led a group of more than two dozen civil rights organizations that asked Amazon to stop selling its image recognition system, called Rekognition, to law enforcement. The group says that the police could use it to track protesters or others whom authorities deem suspicious, rather than limiting it to people committing crimes.

Facial recognition is not new technology, but the organizations appear to be focusing on Amazon because of its prominence and what they see as a departure from the company’s oft-stated focus on customers.

“Amazon Rekognition is primed for abuse in the hands of governments,” the group said in the letter, which was addressed to Jeff Bezos, Amazon’s chief executive. “This product poses a grave threat to communities, including people of color and immigrants, and to the trust and respect Amazon has worked to build.”

With the letter, the A.C.L.U. released a collection of internal emails and other documents from law enforcement agencies in Washington County and Orlando that it obtained through open records requests. The correspondence between Amazon and law enforcement officials provides an unusual peek into the company’s ambitions with facial recognition tools, and how it has interacted with some of the officials using its products.

Many of the companies supplying the technology are security contractors little known to the public, but Amazon is one of the first major tech companies to actively market technology for conducting facial recognition to law enforcement. The efforts are still a tiny part of Amazon’s business, with the service one of dozens it offers through Amazon Web Services. But few companies have Amazon’s ability to effectively push widespread adoption of tech products.
Amazon’s campus in downtown Seattle. The American Civil Liberties Union and other civil rights groups are asking the company to stop selling its image-recognition system, Rekognition, to law enforcement authorities. Credit: Ruth Fremson/The New York Times

“The idea that a massive and highly resourced company like Amazon has moved decisively into this space could mark a sea change for this technology,” said Alvaro Bedoya, executive director at the Center on Privacy & Technology at the Georgetown University Law Center.

In a statement, a spokeswoman for Amazon Web Services stressed that the company offered a general image recognition technology that could automate the process of identifying people, objects and activities. She said amusement parks had used it to find lost children, and Sky News, the British broadcaster, used it last weekend to automatically identify guests attending the royal wedding. (The New York Times has also used the technology, including for the royal wedding.)

The spokeswoman said that, as with all A.W.S. services, the company requires customers to comply with the law.

The United States military and intelligence agencies have used facial recognition tools for years in overseas conflicts to identify possible terrorist suspects. But domestic law enforcement agencies are increasingly using the technology at home for more routine forms of policing.

The people who can be identified through facial recognition systems are not just those with criminal records. More than 130 million American adults are in facial recognition databases that can be searched in criminal investigations, the Center on Privacy & Technology at Georgetown Law estimates.

Facial recognition is showing up in new corners of public life all the time, often followed by challenges from critics about its efficacy as a security tool and its impact on privacy. Arenas are using it to screen for known troublemakers at events, while the Department of Homeland Security is using it to identify foreign visitors who overstay their visas at airports. And in China, facial recognition is ubiquitous, used to identify customers in stores and single out jaywalkers.

There are also concerns about the accuracy of facial recognition, with troubling variations based on gender and race. One study by the Massachusetts Institute of Technology showed that the gender of darker-skinned women was misidentified up to 35 percent of the time by facial recognition software.

“We have it being used in unaccountable ways and with no regulation,” said Malkia Cyril, executive director of the Center for Media Justice, a nonprofit civil rights organization that signed the A.C.L.U.’s letter to Amazon.

The documents the A.C.L.U. obtained from the Orlando Police Department show city officials considering using video analysis tools from Amazon with footage from surveillance cameras, body-worn cameras and drones.

Amazon may have gone a little far in describing what the technology can do. This month, it published a video of an Amazon official, Ranju Das, speaking at a company event in Seoul, South Korea, in which he said Orlando could even use Amazon’s Rekognition system to find the whereabouts of the mayor through cameras around the city.
Video from an Amazon event where a company official spoke about the company’s facial recognition system. Credit: Amazon Web Services Korea

In a statement, a spokesman for the Orlando Police Department, Sgt. Eduardo Bernal, said the city was not using Amazon’s technology to track the location of elected officials in its jurisdiction, nor did it have plans to. He said the department was testing Amazon’s service now, but was not using it in investigations or public spaces.

“We are always looking for new solutions to further our ability to keep the residents and visitors of Orlando safe,” he said.

Early last year, the company began courting the Washington County Sheriff’s Office outside of Portland, Ore., eager to promote how it was using Amazon’s service for recognizing faces, emails obtained by the A.C.L.U. show. Chris Adzima, a systems analyst in the office, told Amazon officials that he fed about 300,000 images from the county’s mug shot database into Amazon’s system.

Within a week of going live, the system was used to identify and arrest a suspect who stole more than $5,000 from local stores, he said, adding there were no leads before the system identified him. The technology was also cheap, costing just a few dollars a month after a setup fee of around $400.

Mr. Adzima ended up writing a blog post for Amazon about how the sheriff’s office was using Rekognition. He spoke at one of the company’s technical conferences, and local media began reporting on their efforts. After the attention, other law enforcement agencies in Oregon, Arizona and California began to reach out to Washington County to learn more about how it was using Amazon’s system, emails show.

In February of last year, before the publicity wave, Mr. Adzima told an Amazon representative in an email that the county’s lawyer was worried the public might believe “that we are constantly checking faces from everything, kind of a Big Brother vibe.”

“They are concerned that A.C.L.U. might consider this the government getting in bed with big data,” Mr. Adzima said in an email. He did not respond to a request for comment for this article.

Deputy Jeff Talbot, a spokesman for the Washington County Sheriff’s Office, said Amazon’s facial recognition system was not being used for mass surveillance by the office. The company has a policy to use the technology only to identify a suspect in a criminal investigation, he said, and has no plans to use it with footage from body cameras or real-time surveillance systems.

“We are aware of those privacy concerns,” he said. “That’s why we have a policy drafted and why we’ve tried to educate the public about what we do and don’t do.”

Artificial General Intelligences (AGIs) & Corporations Seminar at the Internet Archive Tomorrow (Sunday)

Note: if you can’t make this event, check out this literature review and this paper, which will still give you a good idea of some of the subject matter :)

When: Sunday, April 8, 2018
Where: The Internet Archive, 300 Funston Ave, San Francisco, CA
Time: 2-6pm

Artificial General Intelligences & Corporations

Description:

Even if we don’t know yet how to align Artificial General Intelligences with our goals, we do have experience in aligning organizations with our goals. Some argue that corporations are, in fact, Artificial Intelligences – legally, at least, we already treat them as persons.

The Foresight Institute, along with the Internet Archive, invites you to spend an afternoon examining AI alignment, especially whether our interactions with different types of organizations, e.g. our treatment of corporations as persons, allow insights into how to align AI goals with human goals.

While this meeting focuses on AI safety, it merges AI safety, philosophy, computer security, and law and should be highly relevant for anyone working in or interested in those areas.

Why this is really really important:

As we learned during last year’s Ethical Algorithms panel, there are many different ways that unchecked black box algorithms are being used against citizens daily.

This kind of software can literally ruin a person’s life through no fault of their own – especially if they are already being discriminated against or profiled unfairly in real life. This is because algorithms tend to amplify and exaggerate any biases that already occur in the data being fed into the system (the data it “learns” on).
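To make that amplification concrete, here is a toy, purely illustrative sketch (all numbers and distributions are invented, and bear no relation to any real recognition system): a one-parameter “classifier” is fit on data where one group is underrepresented and distribution-shifted, and a per-group audit then shows the learned threshold serves the majority group noticeably better than the minority group, even though the training procedure never mentions groups at all.

```python
import random

random.seed(0)

def make_group(n, pos_mean, neg_mean):
    """Toy match scores: label True means a genuine match."""
    data = []
    for _ in range(n):
        label = random.random() < 0.5
        score = random.gauss(pos_mean if label else neg_mean, 1.0)
        data.append((score, label))
    return data

# A skewed training set, like an archive dominated by one group:
# 900 samples from group A, only 100 from group B, whose scores are
# shifted (standing in for, say, a sensor tuned for lighter skin).
train = make_group(900, pos_mean=1.0, neg_mean=-1.0) \
      + make_group(100, pos_mean=0.0, neg_mean=-2.0)

def error(data, t):
    """Fraction of samples the threshold classifier gets wrong."""
    return sum((score >= t) != label for score, label in data) / len(data)

# "Training" = picking the threshold with the lowest pooled training error.
threshold = min((t / 100 for t in range(-300, 301)),
                key=lambda t: error(train, t))

# Audit the result per group on fresh samples from the same distributions.
test_a = make_group(5000, 1.0, -1.0)
test_b = make_group(5000, 0.0, -2.0)
print(f"learned threshold: {threshold:.2f}")
print(f"group A error rate: {error(test_a, threshold):.1%}")
print(f"group B error rate: {error(test_b, threshold):.1%}")
```

The point of the sketch is the audit step: overall accuracy looks fine, and only breaking the error rate out by group reveals that the underrepresented group pays for the skew in the training data.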

Algorithms are just one of many tools that an AGI (Artificial General Intelligence) might use in the course of its daily activities on behalf of whatever corporation it operates for.

The danger lies in the potential for misinterpretation by these AGIs, should they make decisions based on the faulty interpretations of unchecked black-box algorithmic calculations. For this reason, preservation of and public access to the original data sets used to train these algorithms is of paramount importance. And currently, that just isn’t the case.

The promise of AGIs is downright exciting, but how do we ensure that corporate-driven AGIs do not gain unruly control over public systems?

Arguably, corporations are already given too many rights – rights that rival or surpass those of actual humans at this point.

What happens when these Corporate “persons” have AGIs out in the world, interacting with live humans and other AGIs, on a constant basis? (AGIs never sleep.) How many tasks could your AGI do for you while you sleep at night? What instructions would you give your AGI? And whose “fault” is it when the goals of an AGI conflict with those of a living person?

Joi Ito, the Director of the MIT Media Lab, wrote a piece for the ACLU this week, concluding that AI Engineers Must Open Their Designs to Democratic Control: “The internet, artificial intelligence, genetic engineering, crypto-currencies, and other technologies are providing us with ever more tools to change the world around us. But there is a cost. We’re now awakening to the implications that many of these technologies have for individuals and society…

AI is now making decisions for judges about the risks that someone accused of a crime will violate the terms of his pretrial probation, even though a growing body of research has shown flaws in such decisions made by machines,” he writes. “A significant problem is that any biases or errors in the data the engineers used to teach the machine will result in outcomes that reflect those biases.”

Joi explains that the researchers at the M.I.T. Media Lab have been starting to refer to these technologies as “extended intelligence” rather than “artificial intelligence.” “The term ‘extended intelligence’ better reflects the expanding relationship between humans and society, on the one hand, and technologies like AI, blockchain, and genetic engineering on the other. Think of it as the principle of bringing society or humans into the loop,” he explains.

Sunday’s seminar will discuss all of these ideas and more, working towards a concept called “AI Alignment” – where the Corporate-controlled AGIs and humans work toward shared goals.

The problem is that almost all of the AGIs being developed are, in fact, some form of corporate AGI.

That’s why a group of AGI scientists founded OpenCog, to provide a framework that anyone can use.

Aaron Swartz Day is working with OpenCog on building an in-world robot concierge for our VR Destination, and we will be discussing and teaching about the privacy and security considerations of AGI and VR in an educational area within the museum – and of course on this website :-). Also #AGIEthics will be a hackathon track this year, along with #EthicalAlgorithms :-)

So! If this is all interesting to you – PLEASE come on Sunday :-) !

There will also be an Aaron Swartz Day planning meeting – way early this year, because really we never stopped working on the projects from last November – you are gonna love it! The meeting is at the Internet Archive on May 23, 2018 at 6pm. There will be an RSVP soon – but save the date! :-)

More on that soon! :)

References

  1. AGI and Corporations Seminar, Internet Archive & Foresight Institute, April 8, 2018
  2. AI Engineers Must Open Their Designs to Democratic Control, by Joi Ito for the ACLU, April 2, 2018
  3. Machine Bias – There’s software used across the country to predict future criminals. And it’s biased against blacks. By Julia Angwin, Jeff Larson, Surya Mattu and Lauren Kirchner, ProPublica, May 23, 2016
  4. The OpenCog Foundation – Building Better AGI Minds Together
  5. The Swartz-Manning VR Destination, An Aaron Swartz Day Op
  6. The Algorithmic Justice League
  7. Gendershades.org


Interview with Alison Macrina, Founder of the Library Freedom Project

Alison Macrina, Founder, Library Freedom Project

About the Library Freedom Project, the ACLU, and Tor

The Library Freedom Project (LFP), along with its partners the ACLU and the Tor Project, provides trainings for library communities, teaching people their rights under the law, and how to find and use free and open source, privacy protective technologies.

Alison spoke at this year’s Aaron Swartz Day event (video, transcript).

LFP had a bit of excitement last summer, when it and the Tor Project worked with the Kilton Library in Lebanon, New Hampshire, to set up a Tor relay. Those who run Tor relays are providing a public service, as Tor is a free, open network that helps people defend against mass surveillance by providing them anonymity online. Tor depends on thousands of volunteers who run “relays” (computer servers that support the Tor network).

Libraries are ideal locations to host Tor relays, because they are staunch supporters of intellectual freedom and privacy, and because they provide access to other essential internet services. This was the spirit behind the Kilton Library seeking to become one of the many nodes in Tor’s worldwide internet freedom system.
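For librarians curious what “running a relay” involves at the configuration level, the heart of it is just a few lines in Tor’s torrc file. The sketch below is a generic non-exit relay example – the nickname and contact address are placeholders, not the Kilton relay’s actual settings – so consult the Tor Project’s relay documentation before deploying anything:

```
## Minimal torrc sketch for a non-exit ("middle") relay.
## All values below are placeholders -- adjust for your own network.

# Public name and operator contact for the relay
Nickname librarydemo
ContactInfo sysadmin@example.org

# Port other Tor relays and clients connect to
ORPort 9001

# Carry traffic inside the network, but never act as an exit
ExitRelay 0
ExitPolicy reject *:*

# Donate a bounded slice of the library's spare bandwidth
RelayBandwidthRate 1 MBytes
RelayBandwidthBurst 2 MBytes
```

Flipping `ExitRelay` to 1 and loosening the `ExitPolicy` is what turns a middle relay into an exit, which is the change the Kilton Library later made.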

Tor is used by human rights activists, diplomats, journalists, government officials, and anyone else who values privacy. For instance, journalists in repressive countries use it to publish their work without fear of government surveillance, censorship or prosecution. Domestic violence survivors use it so that they cannot be tracked by former partners. People in African countries like Zimbabwe and South Africa use it to report poaching of endangered animals without fear of retribution.

Human Rights Watch recommends Tor for human rights advocates in their report about censorship in China. Reporters Without Borders suggests that journalists and bloggers all over the world should use Tor to keep themselves and their sources safe.

Tor was originally developed by the US Navy and still gets funding from the State Department, as it is used by many high officials in the US Government.

When LFP announced the Tor relay project at the Kilton Library, that project received popular media attention and overwhelming community support. Then, in mid-August (2015), the Boston office of the Department of Homeland Security contacted the Portsmouth and Lebanon Police Departments, to warn them, falsely, that Tor’s primary use is to aid and abet criminal activity. In the face of this Federal Law Enforcement pressure, the Kilton Library shut down the project.

This kind of pre-emptive “thought crime” enforcement was disturbing, to say the least. LFP compared the move to shutting down public parks for fear that crimes might be committed there in the future. The Kilton Letter, published by LFP on September 2, provides a more thorough explanation of what took place and why. The letter was signed by members of the ACLU, the Tor Project, the Electronic Frontier Foundation, and the Freedom of the Press Foundation.

Luckily, the Lebanon Board of Trustees had a change of heart, as explained in the Valley News article, Despite Law Enforcement Concerns, Lebanon Board Will Reactivate Privacy Network Tor at Kilton Library:

The Lebanon Library Board of Trustees let stand its unanimous June decision to devote some of the library’s excess bandwidth to a node, or “relay,” for Tor, after a full room of about 50 residents and other interested members of the public expressed their support for Lebanon’s participation in the system at a meeting Tuesday night.

“With any freedom there is risk,” library board Chairman Francis Oscadal said. “It came to me that I could vote in favor of the good … or I could vote against the bad. I’d rather vote for the good because there is value to this.”

Interview with Alison Macrina

Lisa:  So the good guys won in Kilton! Is the Tor relay still up and going strong?

Alison: Quick note: we won in Lebanon, New Hampshire. The name of the library is Kilton Library, of the Lebanon Libraries. And yes, the board and community decided unanimously to keep the relay online. Chuck McAndrew, the IT librarian, recently turned it from a non-exit into an exit, so we’re going to write a blog post soon detailing the success of the pilot and encouraging other libraries to get on board.

Lisa: Can other libraries contact you about setting up their own Tor relay?

Alison: Yes, they can contact us at exits@libraryfreedomproject.org for all the information and supporting materials they might need. We have a questionnaire for them to fill out regarding their network details. And then we can schedule a time for us to do a site visit.

Lisa: What is your advice to Librarians who are thinking about setting up a Tor relay, that might be getting pressured by their local law enforcement to not do so?

Alison: We can’t guarantee that law enforcement won’t try to halt other libraries from participating in this project, but we can use Kilton Library’s example in case such a thing happens again. If law enforcement pressures another library, we will do what we did in Lebanon — rally a network of global support to stand behind the library and urge them to continue their participation in the project. We think that our overwhelming victory at Kilton shows us that we’ll be victorious at other libraries, should it come to that.

Lisa: So there’s nothing inherently criminal about using Tor any more than there is something inherently criminal about using the Internet?

Alison: Not at all! Privacy-enhancing technologies like Tor are perfectly legal. Tools like Tor are also the best ways to protect ourselves against government and corporate surveillance. By using and promoting Tor Browser and running Tor relays, libraries can help ordinary people protect their privacy and other basic civil rights.

Alison Macrina, Founder of the Library Freedom Project, spoke at this year’s Celebration of Hackers and Whistleblowers, on November 7th, and also gave a two-hour tutorial on Sunday morning, at the Privacy-enabling Mini-Conference, on November 8th.


Congrats to Citizenfour’s Oscar Win! Ed Snowden’s Statement via the ACLU

Congratulations to Laura Poitras and her team for winning an Oscar for Best Documentary! Her film is truly unprecedented.

Laura lists SecureDrop (the whistleblower submission platform originally developed by Aaron Swartz and Kevin Poulsen) in the credits of tools she used during the making of Citizenfour.


Ed Snowden is legally represented by the ACLU. (See his statement on the film winning here, and also reprinted below.) He is on the Board of Directors of the Freedom of the Press Foundation, the organization that picked up SecureDrop’s development, at Kevin Poulsen’s request, after Aaron’s death.

Garrett Robinson, Lead Developer of SecureDrop, presented at last year’s Aaron Swartz Day (video). Here’s a relevant interview with Garrett Robinson from last year about why SecureDrop is so important for a functioning democracy.

The purpose of SecureDrop is to provide a secure, anonymous platform where citizens can upload information to a news organization, but without having to potentially put their whole life at risk in the process. There are now 15 SecureDrop implementations all over the world!

Here’s the ACLU press release:

Edward Snowden Congratulates Laura Poitras for Winning Best Documentary Oscar for Citizenfour

The following is a statement from Edward Snowden provided to the American Civil Liberties Union, which represents him:

“When Laura Poitras asked me if she could film our encounters, I was extremely reluctant. I’m grateful that I allowed her to persuade me. The result is a brave and brilliant film that deserves the honor and recognition it has received. My hope is that this award will encourage more people to see the film and be inspired by its message that ordinary citizens, working together, can change the world.”

Anthony D. Romero, executive director of the ACLU, had this reaction:

“Laura’s remarkable film has helped fuel a global debate on the dangers of mass surveillance and excessive government secrecy. The ACLU could not be more delighted that she has been recognized with an Academy Award.”

The ACLU’s petition asking President Obama to grant clemency to Snowden is at:
https://www.aclu.org/secure/grant_snowden_immunity

Information on government spying is at:
https://www.aclu.org/nsa-surveillance

Help Protect The Next Aaron Swartz (ACLU Petition)