Facial Recognition Bias in Civil Rights Enforcement
Key Questions
What bias was found in facial recognition technology?
NIST testing found facial recognition error rates up to 100 times higher for Black women than for other demographic groups, raising concerns about the technology's use in civil rights enforcement.
What facial recognition plans does Dallas PD have?
Dallas PD allocated $51.5 million for camera infrastructure ahead of the FIFA World Cup that may involve facial recognition, a program scrutinized through FOIA requests filed by Geiger.
What concerns exist about AI in sentencing?
Simulations of AI-assisted sentencing have highlighted racial biases, concerns tied to DOJ/NMA/ICE activity and documented in CCJ and NFHA reports.
What action did civil rights groups take regarding Meta?
A coalition of 75 civil rights organizations urged Meta to halt facial recognition features in its smart glasses, citing bias risks.
What FISA-related issues connect to surveillance?
Calls to reform FISA Section 702 address abuses such as the FBI's warrantless searches of U.S. citizens' data.