An article on Medium reports that Amazon has described its Rekognition technology as offering “real-time face recognition across tens of millions of faces, and detection of up to 100 faces in challenging crowded photos.” This technology was combined with real-time tracking across several camera feeds throughout Orlando, Florida, and Washington County, Oregon. People were concerned enough about these trials that Orlando Police Chief John Mina released a statement saying, “We would never use this technology to track random citizens, immigrants, activists, or people of color.” The fact that the police chief felt compelled to release such a statement shows how concerned people are about misuse and ethical issues surrounding surveillance technologies in the United States and throughout the world.
The London Metropolitan Police’s facial recognition matches have been shown to be 98 percent inaccurate. The New York Times reports that facial recognition algorithms from Microsoft, IBM, and Face++ are highly accurate, but mostly for white men. Darker-skinned women, for example, were misidentified 35 percent of the time. Because these systems tend to amplify racial and gender biases, there is now a pressing need for greater accountability from tech companies and law enforcement.
Enter the disrupters at Hyphen-Labs, an international collective of women technologists of color. Led by Ece Tankal, Ashley Baccus Clark, and Carmen Aguilar y Wedge, and in collaboration with the artist and researcher Adam Harvey, the collective has designed a face scarf that scrambles and confuses facial recognition algorithms, causing them to pull up incorrect matches. Hyphen-Labs says that tricking the machines by feeding them faulty inputs is a way to use them to “address the issues that arise when we depend on them too much.” They state that facial recognition is steeped in identity politics: “Beyond this, the control of identity and image has been a way to oppress freedom from groups who have been historically and systemically marginalized both in the U.S. and globally.”
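The underlying idea of fooling a recognition system with deliberately crafted inputs is known in machine learning as an adversarial example. As a minimal illustration of the principle only (the toy linear "matcher" below is hypothetical and bears no relation to Hyphen-Labs' actual design or any real face-recognition model), a small perturbation chosen against a model's gradient can reliably push its match score down:

```python
import numpy as np

# Toy linear "face matcher": score = w . x, where a high score means "match".
# This is a hypothetical stand-in, not a real recognition system.
rng = np.random.default_rng(0)
w = rng.normal(size=64)   # the matcher's learned weights
x = rng.normal(size=64)   # a "face" feature vector

score = w @ x

# Adversarial perturbation in the spirit of the fast gradient sign method:
# nudge each feature a small step against the gradient of the score
# (for a linear model, the gradient of w . x with respect to x is just w).
eps = 0.5
x_adv = x - eps * np.sign(w)
adv_score = w @ x_adv

# The perturbed input's score is strictly lower:
# adv_score = score - eps * sum(|w|)
print(score > adv_score)  # True
```

The same logic, scaled up to the high-dimensional inputs of a deep network and printed onto fabric, is what lets a patterned scarf present "faulty inputs" to a camera while remaining an ordinary garment to a human observer.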
Because of the potential for supercharged discrimination, it is vital that we take a close look at these technologies while they are still developing and before they are everywhere. We deserve to have control over our identity and our image. It is imperative that we discuss the reality of a world where cameras are always watching, capturing our images regardless of consent. Designing ways to subvert technology is one way to promote better tech.
Reality Changing Observations:
Q1. Where do you see cameras being used in your community?
Q2. How can facial recognition technology be used for good?
Q3. What should our rights be when it comes to having our image captured and identified?