Community Data Workshop: Case Studies

AI Case Studies/Objects:
* Most of the case studies below are from 2018–20 in the US; there are many others. Feel free to suggest other potential case studies/objects.
Facial Recognition Technologies (FRTs):

  • Algorithmic Justice League’s Gender Shades report on racial and gender bias in machine learning applications led to studies by the National Institute of Standards and Technology
  • ACLU of Northern California’s audit of Amazon Rekognition, which falsely matched members of Congress and minority athletes to criminal mugshots.
  • Proposed federal legislation – the Commercial Facial Recognition Privacy Act of 2019, the No Biometric Barriers to Housing Act of 2019, and the Facial Recognition Technology Warrant Act of 2019
  • Washington became the first state to pass legislation outlining how and when FRTs can be used by law enforcement.
  • San Francisco (CA), Oakland (CA), Cambridge (MA), Brookline (MA), and Springfield (MA) approved moratoriums on government use of FRTs
  • Microsoft’s funding of the facial-recognition surveillance company AnyVision, which targets Palestinians in the West Bank, allowing Israeli authorities to identify Palestinian individuals and track their movements in public space.
  • Australian Parliament ordered a complete pause on the use of a national face database.
  • Several US states (Washington, Texas, California, Arkansas, New York, and Illinois) have begun actively restricting and regulating biometric collection and facial recognition. Washington, Michigan, California, Massachusetts, Arizona, and Florida have introduced efforts, with the Florida Biometric Privacy Act, the California Consumer Privacy Act, Bill S. 1385 in Massachusetts, SB 1203 in New York, and HB 1493 in Washington explicitly modeled after the Biometric Information Privacy Act (BIPA), a 2008 Illinois privacy law
  • Illinois class-action lawsuit under the Biometric Information Privacy Act against Facebook’s use of facial-recognition technology; in August, a federal appeals court found that Facebook’s collection of biometric face data injured users’ privacy rights
  • Chicago and Detroit purchased software to deploy facial recognition in video feeds from city cameras
  • IBM, Amazon, and Microsoft pledge moratoriums on FRTs following 2020 Black Lives Matter protests.
  • France has announced plans to establish a national facial-recognition database.
  • Police in Cardiff and London, UK began trial use of FRTs, leading to legal challenges and objections by civil society groups, academics, and at least one department’s ethics committee
  • China’s use of biometric recognition as a weapon of state power to target a Muslim minority and Hong Kong protestors, despite Hong Kong’s GDPR-style Personal Data (Privacy) Ordinance (PDPO)

Affect Recognition:

  • Amazon’s Rekognition, Kairos, HireVue, and VCV screen job candidates for personality traits
  • BrainCo is creating headbands that purport to detect and quantify students’ attention levels through brain-activity monitoring
  • Converus examines eye movements and changes in pupil size to flag potential deception.
  • Oxygen Forensics’ data-extraction tools include facial recognition and emotion detection, and “analysis of videos and images captured by drones used to identify possible known terrorists” for the FBI, Interpol, the London Metropolitan Police, and Hong Kong Customs.
  • Sound Intelligence, Face++, Microsoft’s Face API – emotion-recognition programs to detect stress and aggression before violence erupts.
  • China employs affect recognition to try to identify criminals at airports and subway stations.

AI Industry + Labor Organizing

  • Google Walkout for Real Change, protesting the company’s handling of sexual-harassment claims, becomes the largest global labor action in tech
  • Riot Games employees walk out in protest of the company’s stance on forced arbitration, following allegations by multiple employees that the company violated California’s Equal Pay Act and claims of gender-based discrimination and harassment.
  • Developers in China protest what they described as the 996 schedule—9 a.m. to 9 p.m., six days a week—through a GitHub repository (996.ICU) listing companies and projects demanding excessive hours.
  • Google workers again walked out, hosting a rally of hundreds of workers in San Francisco protesting retaliation against two organizers. Following this rally, Google fired four organizers, signaling both the growing power of such efforts to impact Google and the company’s intolerance of them.
  • 2016 Working Partnerships USA report finds that 58 percent of blue-collar contract workers in tech are Black or Latinx and make an average of $19,900 annually, while only 10 percent of direct-employee tech workers, who make over $100,000 annually, are Black or Latinx.
  • Contract workers’ role in the recent wave of tech-worker organizing: from temp workers at Foxconn factories protesting unpaid wages and bonuses, to Amazon warehouse workers walking out on Prime Day and successfully winning compromises to improve conditions. Amazon-owned Whole Foods’ contract workers published a letter demanding Amazon end its involvement with ICE and shared a video revealing the company’s union-busting tactics.
  • Union organizing among workers at tech firms started in 2014, with food-service workers at Airbnb, Facebook, and Yahoo, and shuttle drivers and security guards at a host of Silicon Valley firms.
  • Unionized Amazon warehouse workers strike to demand higher pay and better working conditions.
  • 14 software engineers at the start-up Lanetix were fired shortly after unionizing in 2019; workers filed charges with the National Labor Relations Board and ultimately won their case.
  • Google hires a consulting firm known for its anti-union work amid employee unrest
  • Uber drivers’ strikes in cities around the globe included drivers’ occupying Uber’s offices in France
  • Ola drivers in India protested decreasing driver incentives amid increasing fuel prices.
  • California State Assembly Bill 5 (AB5) introduced to make it much harder for companies like Uber to label workers as independent contractors, granting them basic worker protections
  • New Jersey’s Department of Labor found that Uber had misclassified drivers as independent contractors, and demanded the company pay $649 million in unpaid employment taxes
  • US senators wrote to Google objecting to the company’s heavy reliance on temporary workers (over half its workforce) and urging it to end worker-classification abuses that deny workers essential benefits. Low-paid contract workers are an essential labor force for labeling AI training data and moderating content on large algorithmically driven platforms.

Algorithmic Impact Assessments (AIAs)

  • Canada’s implementation of AIAs appears under its Directive on Automated Decision-Making, as part of the Pan-Canadian AI Strategy
  • Australia’s AI Ethics Framework contemplates the use of AIAs.
  • Washington became the first state to propose AIAs for government with its House and Senate bills HB 1655 and SB 5527.
  • Scholars advocate for a model AIA to complement the GDPR
  • Alabama, New York City, and Vermont establish commissions and task forces for algorithmic impacts, with legislation pending in Massachusetts, Washington, and New York State

AI in Hiring

  • Target, Unilever, and Goldman Sachs integrate predictive technologies into the hiring process (systems shape employment advertising, résumé ranking, and active and passive recruitment).
  • Illinois’s enactment of the Artificial Intelligence Video Interview Act requires employers to disclose their use of AI to analyze video interviews, obtain consent, and engage in data-minimization practices
  • Electronic Privacy Information Center (EPIC) complaint with the Federal Trade Commission against the AI hiring company HireVue for deceptive practices regarding the validity of its algorithmic results

Face Datasets:

  • Clarifai used profile photos from the dating website OkCupid
  • IBM’s “Diversity in Faces” report
  • The Duke MTMC dataset was collected by setting up surveillance cameras on a college campus; the Brainwash dataset was collected from a livestreaming camera in a café
  • Algorithmic Justice League’s Gender Shades report on racial and gender bias in machine learning

Health Datasets:

  • Apple Watch, Fitbit
  • Project Nightingale, a partnership between Google and Ascension, one of the largest nonprofit health systems in the US, reported to pose serious privacy risks
  • IBM and APS Healthcare sell Medicaid eligibility-assessment tools that determine eligibility and compliance for Medicaid; similar tools are used to assess eligibility and compliance for other public benefits.
  • Appriss sells prescription-drug-monitoring databases; some states apply proprietary algorithms to these databases to identify possible doctor shopping or improper prescribing.

Algorithmic Management/Worker Productivity AI

  • Amazon’s AI system sets ideal performance targets for workers in fulfillment warehouses
  • Uber/Lyft traffic routing
  • Justicia for Migrant Workers’ documentation of tracking and productivity technologies used in Canada’s agriculture sector to regiment workers’ pace and production levels
  • Apps used to manage hotel, restaurant, and health-care workers (e.g., the Philadelphia Marriott’s app that gives its housekeepers room assignments)
  • Mijente joined Media Justice (who worked on San Francisco’s facial-recognition ban) and Tech Workers Coalition to host Take Back Tech for community organizers, tech workers and students

AI + Climate

  • Athena Coalition’s multi-issue approach around Amazon includes warehouse-worker rights and climate justice
  • In April 2019, ~9,000 Amazon workers publicly signed a letter calling on the company to address its contributions to climate change through a shareholder resolution, and in September staged a Climate Walkout alongside workers at other tech companies, the first labor action coordinated across multiple tech companies
  • Greenpeace, OpenAI, and researchers at the University of Massachusetts Amherst estimated the carbon footprint of training large natural-language-processing models

Company Oversight Boards + AI

  • Google’s Advanced Technology External Advisory Council (ATEAC), an ethics board that was dismantled over controversy
  • Facebook’s 2019 Civil Rights Audit

Biased Ads + AI

  • Department of Housing and Urban Development (HUD), civil rights groups, and labor organizations lawsuit against Facebook in 2019 for discriminatory online advertising filtering
  • ACLU’s Racial Justice Project has highlighted how Facebook’s ad-targeting platform can allow advertisers to illegally discriminate against people of color by limiting their audience for housing advertisements by “ethnic affinity.”

Autonomous Vehicles

  • National Transportation Safety Board investigation into Uber’s autonomous driving system following over 37 accidents involving autonomous Uber vehicles.

Military AI

  • Stanford students’ 2018 circulation of a pledge not to accept interviews from Google until the company canceled its work on Project Maven, a US military effort to build AI-enabled drone surveillance, and committed to no further military involvement.
  • Students around the US demonstrated against recruiting events on campus by technology companies known to be supporting border control or policing activities, such as Amazon, Salesforce, and Palantir. Over 1,200 students representing 17 campuses signed a pledge asserting they would not work at Palantir because of its ties to ICE.
  • Microsoft employees signed an open letter to the company asking it not to bid on JEDI, a major Department of Defense cloud-computing contract, which the company ultimately won
  • Microsoft employees in 2019 call to cancel a $480 million contract to provide augmented reality headsets to the US military, saying they did “not want to become war profiteers.”
  • Students from Carnegie Mellon University fought against the creation of a university Army AI Task Force that was poised to endorse the military use of AI.

  • Microsoft funded an Israeli firm to conduct facial-recognition surveillance on West Bank Palestinians in public space.

AI + Immigration:

  • Mijente’s 2019 report, based on FOIA record requests, reveals lucrative tech-company contracts with military and border agencies and the central role of tech in racialized detention and immigration enforcement; Mijente spearheads the #NoTechforICE campaign
  • Never Again Action and Jews for Racial and Economic Justice’s campaign against Amazon, protesting the company’s provision of cloud-computing services to ICE.
  • Immigrant rights group Make the Road New York organized academics and tech professionals to demand that prominent conferences drop Palantir as a sponsor for its role in ICE detention of immigrants; UC Berkeley’s Privacy Law Scholars Conference, Lesbians Who Tech, and the Grace Hopper Celebration all pulled Palantir as a sponsor
  • Tech workers at Salesforce, Microsoft, Accenture, Google, Tableau, GitHub, and Palantir signed petitions and open letters protesting their companies’ contracts with ICE.
  • Developer Seth Vargo pulled his open-source code out of the codebase used by the company Chef after learning of Chef’s contract with ICE; Chef subsequently committed to not renewing the contract
  • Palantir Technologies provides database management and AI to ICE, allowing them to combine and analyze information from varying government databases, and to use this to track, target, and detain people whom they believe are in the US “illegally.”
  • Palantir workers circulated two open letters expressing mistrust of and frustration with the company’s leadership over its decision to keep its contract with ICE
  • Public-records requests showed that 300 police departments in California have access, through Palantir, to data collected and stored by the Department of Homeland Security’s Northern California Regional Intelligence Center, without any requirement to disclose their access to this information.
  • A “smart wall” utilizing drones, sensors, and increased facial recognition to detect individuals is receiving bipartisan support in design and implementation. Anduril Industries, a technology company that recently replaced Google on a Project Maven Department of Defense contract developing AI-based surveillance systems, and that also produces autonomous drones, now provides solar-powered “sentry” towers for the Customs and Border Protection (CBP) agency.
  • The EU aims to deploy an AI-based “lie detector” developed under the iBorderCtrl project, but makes no mention of the predictive accuracy of, or the inherent bias that might exist within, such tools.
  • The UK Home Office’s facial-recognition systems were found to be wrongfully identifying travelers as criminals, delaying and detaining them with no due process.

AI + Economic/Urban Development

  • Anti-Displacement Coalition in San Francisco and the Bay Area-wide Tenant Organizing Network, formed by rent-control protection groups and tenant unions
  • Cluj, Romania (the “Silicon Valley of Eastern Europe”) housing-justice group Social Housing Now (Căși Sociale Acum), formed after Roma evictions
  • Serve the People San Jose formed around Google’s campus growth and concerns over mass displacement and unaffordability, organizing marches, Google bus blockades, and City Council demonstrations
  • Berlin’s Google Is Not a Good Neighbor (Google ist kein guter Nachbar) in 2018 collectively blocked Google from launching a new tech campus in the neighborhood of Kreuzberg.
  • Athena Coalition – that includes groups like ALIGN, New York Communities for Change, Make The Road New York, Desis Rising Up and Moving, and others – successfully fought a new NYC Amazon campus in Queens in 2019
  • Toronto organizers committed to stopping gentrification induced by Google/Alphabet’s Sidewalk Labs launched the #BlockSidewalk campaign; in 2019, the Canadian Civil Liberties Association (CCLA) filed a lawsuit against Waterfront Toronto over Sidewalk.
  • IBM and Cisco’s Smart City Technologies
  • Siemens’ €600 million Berlin contract to create a smart-city neighborhood
  • Replica (a Sidewalk Labs spinoff supplying urban-planning software) holds a regional-transportation contract with Portland, Oregon, that provides no public access to Replica’s algorithms
  • Huawei’s $1.5 billion project to create smart cities in Africa includes a project in Nairobi where it installed 1,800 cameras, 200 traffic surveillance systems, and a national police command center as part of its “Safe City” program. Huawei’s Safe City technology has been used by some African governments to spy on political opponents.
  • San Diego has installed thousands of microphones and cameras on street lamps in recent years in an effort to study traffic and parking conditions; police have used the video footage in more than 140 cases without any oversight or accountability.
  • Miami is actively considering a 30-year contract with Illumination Technologies, providing the company with free access to set up light poles containing cameras and license-plate readers, collecting information that will filter through the Miami Police Department (and that the company can use in unchecked ways).

AI + Policing

  • Surveillance technologies made by Palantir, Vigilant Solutions, Cognitec, Amazon, Microsoft, Motorola, IBM, and Axon are used by local and state law enforcement; these include, but are not limited to, facial recognition (including on body cameras), automatic license-plate readers, and visual or data analytics systems. Law enforcement agencies also use data-mining software that processes large quantities of data from commercial and government sources to identify relationships or connections between people, places, and things
  • Predictive-policing systems sold by PredPol, Azavea (HunchLab), Palantir, Starlight, Bair Analytics, and IBM (RTMDx) analyze available data to predict either where a crime may happen in a given time window or who will be involved in a crime as either victim or perpetrator.
  • Los Angeles Police Department’s predictive-policing program LASER, which claimed to identify individuals likely to commit violent crimes, was temporarily suspended following community complaints
  • Stop LAPD Spying Coalition partners with UCLA students to demonstrate how LAPD used proxy data to discriminate against Latinx and Black community members and to stop UCLA’s development of the predictive policing tool PredPol
  • St. Louis, Missouri residents protest a proposed agreement between St. Louis police and the company Predictive Surveillance Systems to deploy surveillance planes to collect images of citizens on the ground
  • Canadian RCMP troop in Red Deer, Alberta, launched a program called CAPTURE to enable community assisted policing through the use of recorded evidence
  • Project Green Light, part of Detroit’s “Neighborhood Real-Time Intelligence Program,” launched in 2016 and expanded in 2019 with $9 million in state and federal funds to install surveillance equipment at 500 Detroit intersections (on top of the over 500 cameras already installed at businesses) and to utilize facial-recognition software to identify potential criminals.
  • Amazon Ring’s video doorbell and smart-security devices allow users to see, talk to, and record visitors from their doorbell. Amazon has negotiated Ring video-sharing partnerships with more than 700 police departments across the US
  • Amazon Ring is paired with the Neighbors app and others like Nextdoor and Citizen, which allow users to view local crime in real time, discuss it with one another, and share videos and photos
  • DNA-analysis systems, also known as probabilistic genotyping, sold by STRmix and Cybergenetics (TrueAllele), interpret forensic DNA samples containing a mixture of DNA from different people to determine the probability that a sample derives from a potential suspect.

AI + Education

  • In Kansas, New York, Pennsylvania, and Connecticut, parents and students opposed the use of a web-based educational platform called Summit Learning in high schools. High schoolers staged sit-ins, and parents protested at school board meetings, emphasizing that the work of teachers could not be outsourced to technology-based platforms.
  • Sweden’s data protection authority fined a high school for its facial-recognition attendance registry as a violation of the GDPR
  • France’s data protection authority, CNIL, declared it illegal to use facial recognition in schools based on privacy concerns

Government Predictive Analytics

  • Disability Rights Oregon (DRO) sued the state’s Department of Human Services over sudden cuts in Oregonians’ disability benefits due to the State hard-coding a 30-percent across-the-board reduction of hours into their algorithmic assessment tool.
  • In Michigan, a group of unemployment beneficiaries brought a class-action lawsuit against the Michigan Unemployment Insurance Agency (UIA) over a failed automation project, called MiDAS, that claimed to be able to detect and “robo-adjudicate” claims of benefits fraud algorithmically
  • State v. Loomis (2016) in front of the Wisconsin Supreme Court, the defendant claimed that the court’s use of a risk assessment software when determining his sentence was a due process violation.
  • Allegheny County (PA) Department of Human Services use of an automated decision system for evaluating the risk of child abuse or neglect, which augments its own data with data from local police departments, mental health services, and public benefits agencies.
  • Pretrial risk assessments sold by Northpointe (COMPAS) and the University of Chicago Crime Lab analyze information collected during interviews with an arrested person to assess the person’s likelihood of nonappearance, rearrest, and rearrest for a violent crime. Most pretrial risk assessments use a simple algorithm reliant on a small number of input variables, usually determined by state law. When predicting which defendants might not appear in court, such risk assessments are sometimes called “failure to appear” tools.
  • Sentencing risk-assessment systems, sold by MHS Assessments (juvenile sentencing tool) and the PA Sentence Risk Assessment, are designed to reduce recidivism by targeting defendants who are considered “high risk” and to reduce prison populations by diverting “low risk” defendants from prison.
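
The "simple algorithm reliant on a small number of input variables" described for pretrial risk assessments can be illustrated with a minimal sketch. The variables, weights, and cutoffs below are hypothetical, invented only to show the weighted-sum-plus-tiers structure; they do not come from COMPAS or any real tool.

```python
# Hypothetical sketch of a pretrial risk score: a weighted sum of a few
# statutorily defined input variables, bucketed into risk tiers.
# All weights, caps, and thresholds here are illustrative assumptions.

def pretrial_risk_score(age: int, prior_arrests: int, prior_ftas: int,
                        pending_charge: bool) -> int:
    """Return a 0-10 risk score from a handful of inputs."""
    score = 0
    if age < 23:
        score += 2                  # younger defendants weighted higher
    score += min(prior_arrests, 3)  # capped contribution from prior arrests
    score += min(prior_ftas, 3)     # prior failures to appear (FTAs)
    if pending_charge:
        score += 2
    return min(score, 10)

def risk_tier(score: int) -> str:
    """Map the numeric score to a coarse release-recommendation tier."""
    if score <= 3:
        return "low"
    if score <= 6:
        return "moderate"
    return "high"

# Young defendant with two prior arrests, one FTA, and a pending charge.
print(risk_tier(pretrial_risk_score(21, 2, 1, True)))  # → high
```

The point of the sketch is that such tools are often far simpler than "AI" branding suggests: a handful of inputs, fixed weights, and hard cutoffs, which is precisely why their variable choices and thresholds carry so much policy weight.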

National Biometric ID

  • Aadhaar, India’s national ID created to make welfare delivery more efficient, enabled (until intervention by the Indian Supreme Court) any private entity to use the state’s biometric ID infrastructure for authentication, including banks, telecom companies, and a range of other private vendors, with little scrutiny or privacy safeguards.
  • Report on ID databases in Ghana, Kenya, Rwanda, Tunisia, Uganda, and Zimbabwe shows how they are facilitating “citizen scoring” exercises, like credit-reference bureaus, emerging at scale.
  • A security flaw in the Estonian ID system enabled a breach of ID databases.
  • Ireland’s Data Protection Commissioner ordered the government to delete the ID records of 3.2 million people after it was discovered that the new “Public Services Card” was being used without limits on data retention or sharing between government departments

Data Protection

  • More than 130 countries have passed comprehensive data protection laws, with Kenya and Brazil the latest to model theirs largely on the GDPR
  • Māori scholars, government leaders, and Aboriginal rights advocates published the book Indigenous Data Sovereignty: Toward an Agenda in response to oversights in the United Nations Declaration on the Rights of Indigenous Peoples (UNDRIP)
  • US Indigenous Data Sovereignty Network (USIDSN) was established to “link American Indian, Alaska Native, and Native Hawaiian data users, tribal leaders, information and communication technology providers, researchers, policymakers and planners, businesses, service providers, and community advocates together to share stories about data initiatives, successes, and challenges, and resources”
  • Global Indigenous Data Alliance (GIDA) launched in 2019 to respond to international open-data and open-science debates. GIDA has put forward a set of “CARE principles” (Collective benefit, Authority to control, Responsibility, and Ethics) to address the power differentials and historical contexts neglected by the open-data movements.