Police Say a Simple Warning Will Prevent Face Recognition Wrongful Arrests. That's Just Not True.

Face recognition technology in the hands of police is dangerous. Police departments across the country frequently use the technology to try to identify images of unknown suspects by comparing them to large photo databases, but it often fails to generate a correct match. And numerous [studies](https://www.washingtonpost.com/technology/2019/12/19/federal-study-confirms-racial-bias-many-facial-recognition-systems-casts-doubt-their-expanding-use/) have shown that face recognition technology misidentifies Black people and other people of color at higher rates than white people. To date, there have been at least seven known wrongful arrests in the United States due to police reliance on incorrect face recognition results, and those are just the cases that have come to light. In nearly every one of those instances, [the](https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html) [person](https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html) [wrongfully](https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence/) [arrested](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html) [was](https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html) [Black](https://www.freep.com/story/news/local/michigan/detroit/2020/07/10/facial-recognition-detroit-michael-oliver-robert-williams/5392166002/).

Supporters of police using face recognition technology often portray these failures as unfortunate mistakes that are unlikely to recur. Yet they keep coming. Last year, six Detroit police officers showed up at the doorstep of an [eight-months-pregnant woman](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html) and wrongfully arrested her in front of her children for a carjacking that she could not plausibly have committed.
A month later, the prosecutor dismissed the case against her.

Related: [I Did Nothing Wrong. I Was Arrested Anyway.](https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway) (American Civil Liberties Union): "Over a year after a police face recognition tool matched me to a crime I did not commit, my family still feels the impact. We must stop this..." [Image: Robert Williams and his daughter, Rosie Williams]

Police departments should be doing everything in their power to avoid wrongful arrests, which can turn people's lives upside down and result in loss of work, inability to care for children, and other harmful consequences. So what's behind these repeated failures? As the ACLU explained in a [recent submission](https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e) to the federal government, there are multiple ways in which police use of face recognition technology goes wrong. Perhaps most glaring is that the most widely adopted police policy designed to avoid false arrests in this context *simply does not work*. Records from the wrongful arrest cases demonstrate why.

It has become standard practice among police departments and companies making this technology to warn officers that a result from a face recognition search does not constitute a positive identification of a suspect, and that additional investigation is necessary to develop the probable cause needed to obtain an arrest warrant.
For example, the International Association of Chiefs of Police [cautions](https://www.theiacp.org/sites/default/files/2019-10/IJIS_IACP%20WP_LEITTF_Facial%20Recognition%20UseCasesRpt_20190322.pdf) that a face recognition search result is "a strong clue, and nothing more, which must then be corroborated against other facts and investigative findings before a person can be determined to be the subject whose identity is being sought." The Detroit Police Department's face recognition technology [policy](https://detroitmi.gov/sites/detroitmi.localhost/files/2020-10/307.5%20Facial%20Recognition.pdf), adopted in September 2019, similarly states that a face recognition search result is only "an investigative lead and IS NOT TO BE CONSIDERED A POSITIVE IDENTIFICATION OF ANY SUBJECT. Any possible connection or involvement of any subject to the investigation must be determined through further investigation and investigative resources."

Related: [ACLU Comment re: Request for Comment on Law Enforcement Agencies' Use of Facial Recognition Technology, Other Technologies Using Biometric Information, and Predictive Algorithms (Exec. Order 14074, Section 13(e))](https://www.aclu.org/documents/aclu-comment-facial-recognition-and-biometric-technologies-eo-14074-13e) (American Civil Liberties Union)

Police departments across the country, from [Los Angeles County](https://lacris.org/LACRIS%20Facial%20Recognition%20Policy%20v_2019.pdf), to the [Indiana State Police](https://www.in.gov/iifc/files/Indiana_Intelligence_Fusion_Center_Face_Recognition_Policy.pdf), to the U.S. [Department of Homeland Security](https://www.dhs.gov/sites/default/files/2023-09/23_0913_mgmt_026-11-use-face-recognition-face-capture-technologies.pdf), provide similar warnings. However ubiquitous, these warnings have failed to prevent harm.

We've seen police treat the face recognition result as a positive identification, ignoring or not understanding the warnings that face recognition technology is simply not reliable enough to provide a positive identification.

In Louisiana, for example, police relied solely on an incorrect face recognition search result from Clearview AI as purported probable cause for an arrest warrant.
The officers did this even though the law enforcement agency had signed a contract with the face recognition company acknowledging that officers "must conduct further research in order to verify identities or other data generated by the [Clearview] system." That overreliance led to [Randal Quran Reid](https://www.nytimes.com/2023/03/31/technology/facial-recognition-false-arrests.html), a Georgia resident who had never even been to Louisiana, being wrongfully arrested for a crime he couldn't have committed and held for nearly a week in jail.

In an [Indiana investigation](https://www.courierpress.com/story/news/local/2023/10/19/evansville-police-using-clearview-ai-facial-recognition-to-make-arrests/70963350007/), police similarly obtained an arrest warrant based only upon an assertion that the detective "viewed the footage and utilized the Clearview AI software to positively identify the female suspect." No additional confirmatory investigation was conducted.

But even when police do conduct additional investigative steps, those steps often *exacerbate and compound* the unreliability of face recognition searches. This is a particular problem when police move directly from a face recognition result to a witness identification procedure, such as a photographic lineup.

Face recognition technology is designed to generate a list of faces that are *similar* to the suspect's image, but often will not actually be a match. When police think they have a match, they frequently ask a witness who saw the suspect to view a photo lineup consisting of the image derived from the face recognition search, plus five "filler" photos of other people. Photo lineups have long been known to carry a high risk of misidentification, and the addition of face recognition-generated images only makes it worse. Because the face recognition-generated image is likely to appear more similar to the suspect than the filler photos, there is a [heightened chance](https://www.newyorker.com/magazine/2023/11/20/does-a-i-lead-police-to-ignore-contradictory-evidence/) that a witness will [mistakenly choose](https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4101826) that image out of the lineup, even though it is not a true match.
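To make those mechanics concrete, here is a minimal sketch, in Python, of how this kind of search behaves. It assumes the generic embedding-plus-nearest-neighbor design that underlies most modern face recognition search; the function names, the toy gallery, and the data are all hypothetical, not any vendor's actual API. The key property it shows is that the search always returns the *most similar* faces it has, whether or not the true suspect is in the database at all.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed(image: np.ndarray) -> np.ndarray:
    """Hypothetical embedding function: maps a face image to a unit vector.
    Real systems use a learned model; this crude stand-in just truncates
    and normalizes the pixels."""
    v = image.astype(float).ravel()[:128]
    v = np.pad(v, (0, max(0, 128 - v.size)))
    return v / (np.linalg.norm(v) + 1e-9)

# Toy gallery: embeddings of enrolled people (e.g., mugshot or license photos).
gallery_ids = [f"person_{i}" for i in range(10_000)]
gallery = rng.normal(size=(10_000, 128))
gallery /= np.linalg.norm(gallery, axis=1, keepdims=True)

def search(probe: np.ndarray, k: int = 5) -> list[tuple[str, float]]:
    """Return the k gallery entries most similar to the probe.

    Note what this does NOT do: it never answers "is this person in the
    gallery?" It always returns someone -- the closest faces it has --
    even if the true suspect was never enrolled.
    """
    scores = gallery @ probe                 # cosine similarity (unit vectors)
    top = np.argsort(scores)[::-1][:k]       # k highest-scoring candidates
    return [(gallery_ids[i], float(scores[i])) for i in top]

# A probe image of an unknown suspect.
probe = embed(rng.normal(size=(64, 64)))
candidates = search(probe)

# A lineup built directly from the search result: the top candidate plus
# five "filler" photos. The top candidate was selected precisely for being
# maximally similar to the suspect, so it stands out against the fillers --
# the bias described above.
lineup = [candidates[0][0]] + [f"filler_{j}" for j in range(5)]
print(candidates[0], lineup)
```

Even a threshold on the similarity score only filters candidates; it cannot turn "most similar face in the database" into a positive identification.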
This problem has contributed to known cases of wrongful arrests, including the arrests of [Porcha Woodruff](https://www.nytimes.com/2023/08/06/business/facial-recognition-false-arrest.html), [Michael Oliver](https://www.freep.com/story/news/local/michigan/detroit/2020/07/10/facial-recognition-detroit-michael-oliver-robert-williams/5392166002/), and [Robert Williams](https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html) by Detroit police (the ACLU represents Mr. Williams in a [wrongful arrest lawsuit](https://www.aclu.org/news/privacy-technology/i-did-nothing-wrong-i-was-arrested-anyway)). In these cases, police obtained an arrest warrant based solely on the combination of a false match from face recognition technology and a false identification from a witness viewing a photo lineup constructed around the face recognition lead and five filler photos. Each of the witnesses chose the face recognition-derived false match instead of deciding that the suspect did not, in fact, appear in the lineup.

A lawsuit filed earlier this year in Texas alleges that a similar series of failures led to the wrongful arrest of [Harvey Eugene Murphy Jr.](https://www.theguardian.com/technology/2024/jan/22/sunglass-hut-facial-recognition-wrongful-arrest-lawsuit?ref=upstract.com) by Houston police. And in New Jersey, police wrongfully arrested [Nijeer Parks](https://www.nytimes.com/2020/12/29/technology/facial-recognition-misidentify-jail.html) in 2019 after face recognition technology incorrectly flagged him as a likely match to a shoplifting suspect. An officer who had seen the suspect (before he fled) viewed the face recognition result and said he thought it matched his memory of the suspect's face.

After the Detroit Police Department's third wrongful arrest from face recognition technology became public last year, Detroit's chief of police [acknowledged](https://www.facebook.com/CityofDetroit/videos/287218473992047) the problem of erroneous face recognition results tainting subsequent witness identifications. He explained that by moving straight from face recognition result to lineup, "it is possible to taint the photo lineup by presenting a person who looks most like the suspect" but is not in fact the suspect. The department's policy, which merely told police to conduct "further investigation," had not stopped them from engaging in this bad practice.

Because police have repeatedly proved unable or unwilling to follow face recognition searches with adequate independent investigation, police access to the technology must be strictly curtailed, and the best way to do this is through strong [bans](https://www.aclu.org/sites/default/files/field_document/02.16.2021_coalition_letter_requesting_federal_moratorium_on_facial_recognition.pdf). More than 20 jurisdictions across the country, from Boston to Pittsburgh to San Francisco, have done just that, barring police from using this dangerous technology.

Boilerplate warnings have proven ineffective. Whether these warnings fail because of human [cognitive bias](https://www.nytimes.com/2020/06/09/technology/facial-recognition-software.html) toward trusting computer outputs, poor police training, incentives to quickly close cases, implicit racism, lack of consequences, the fallibility of witness identifications, or other factors, we don't know. But if the experience of known wrongful arrests teaches us anything, it is that such warnings are woefully inadequate to protect against abuse.

Communities Should Reject Surveillance Products Whose Makers Won't Allow Them to be Independently Evaluated

March 6, 2024 at 16:05
American communities are being confronted with a lot of new police technology these days, much of which involves surveillance or otherwise raises the question: "Are we as a community comfortable with our police deploying this new technology?" A critical question when addressing such concerns is: "Does it even work, and if so, how well?" It's hard for communities, their political leaders, and their police departments to know what to buy if they don't know what works and to what degree.

One thing I've learned from following new law enforcement technology for over 20 years is that there is an awful lot of snake oil out there. When a new capability arrives on the scene, whether it's [face recognition](https://www.aclu.org/wp-content/uploads/publications/drawing_blank.pdf), [emotion recognition](https://www.aclu.org/blog/privacy-technology/surveillance-technologies/experts-say-emotion-recognition-lacks-scientific/), [video analytics](https://www.aclu.org/wp-content/uploads/publications/061819-robot_surveillance.pdf), or "[big data](https://www.aclu.org/news/privacy-technology/chicago-police-heat-list-renews-old-fears-about)" pattern analysis, some companies will rush to promote the technology long before it is good enough for deployment, which sometimes [never happens](https://www.aclu.org/blog/privacy-technology/surveillance-technologies/experts-say-emotion-recognition-lacks-scientific/). That may be even more true today in the age of artificial intelligence. "AI" is a term that often amounts to no more than trendy marketing jargon.

Related: [Six Questions to Ask Before Accepting a Surveillance Technology](https://www.aclu.org/news/privacy-technology/six-questions-to-ask-before-accepting-a-surveillance-technology) (American Civil Liberties Union): "Community members, policymakers, and political leaders can make better decisions about new technology by asking these questions."

Given all this, communities and city councils should not adopt new technology that has not been subject to testing and evaluation by an independent, disinterested party. That's true for all types of technology, but doubly so for technologies that have the potential to change the balance of power between the government and the governed, like surveillance equipment. After all, there's no reason to get [wrapped up in big debates](https://www.aclu.org/news/privacy-technology/six-questions-to-ask-before-accepting-a-surveillance-technology) about privacy, security, and government power if the tech doesn't even work.
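To make "how well does it work" concrete: for an identification system, independent evaluators typically report measured error rates rather than vendor claims, for example the false match rate (FMR: different-person comparisons the system wrongly accepts) and the false non-match rate (FNMR: same-person comparisons it wrongly rejects), across operating thresholds. Below is a minimal sketch of that computation; all scores and numbers are made up for illustration.

```python
# Hypothetical evaluation data: similarity scores from labeled test trials.
# "Genuine" trials compare two images of the same person; "impostor"
# trials compare images of different people.
genuine_scores = [0.91, 0.84, 0.77, 0.95, 0.62, 0.88]
impostor_scores = [0.31, 0.45, 0.72, 0.28, 0.51, 0.66]

def error_rates(genuine, impostor, threshold):
    """FNMR: fraction of genuine pairs scored below the threshold (rejected).
    FMR: fraction of impostor pairs scored at/above it (accepted)."""
    fnmr = sum(s < threshold for s in genuine) / len(genuine)
    fmr = sum(s >= threshold for s in impostor) / len(impostor)
    return fmr, fnmr

# The threshold is a policy choice: raising it trades false matches
# (which can point police at the wrong person) for false non-matches.
for t in (0.5, 0.7, 0.9):
    fmr, fnmr = error_rates(genuine_scores, impostor_scores, t)
    print(f"threshold={t}: FMR={fmr:.2f}, FNMR={fnmr:.2f}")
```

An independent tester can publish this whole tradeoff curve; a vendor-run test can simply report its most flattering operating point, which is one reason disinterested evaluation matters.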
One example of a company refusing to allow independent review of its product is the license plate recognition company Flock, which is pushing those surveillance devices into many American communities and tying them into a centralized national network. (We wrote more about this company in a 2022 [white paper](https://www.aclu.org/publications/fast-growing-company-flock-building-new-ai-driven-mass-surveillance-system).) Flock has steadfastly refused to allow the [independent](https://www.aclu.org/news/privacy-technology/are-gun-detectors-the-answer-to-mass-shootings) security technology reporting and testing outlet [IPVM](https://ipvm.com/) to obtain one of its license plate readers for testing, though IPVM has tested all of Flock's major competitors. That doesn't stop Flock from [boasting](https://ipvm.com/reports/flock-lpr-city-sued?code=lfgsdfasd543453) that "Flock Safety technology is best-in-class, consistently performing above other vendors." Claims like these are puzzling and laughable when the company doesn't appear to have enough confidence in its product to let IPVM test it.

Related: [Experts Say 'Emotion Recognition' Lacks Scientific Foundation](https://www.aclu.org/news/privacy-technology/experts-say-emotion-recognition-lacks-scientific) (American Civil Liberties Union)

Communities considering installing Flock cameras should take note.
That is especially the case when errors by Flock and other companies' license plate readers can lead to innocent drivers finding themselves with their [hands behind their heads](https://ipvm.com/reports/flock-lpr-city-sued?code=lfgsdfasd543453), facing jittery police pointing guns at them. Such errors can also expose police departments and cities to lawsuits.

Even worse is when a company pretends that its product has been subject to independent review when it hasn't. The metal detector company Evolv, which sells (wait for it) *AI* metal detectors, submitted its technology for testing by a supposedly independent lab operated by the University of Southern Mississippi, and publicly touted the results. But [IPVM](https://ipvm.com/reports/bbc-evolv) and the [BBC](https://www.bbc.com/news/technology-63476769) reported that the lab, the National Center for Spectator Sports Safety and Security ([NCS4](https://ncs4.usm.edu/)), had colluded with Evolv to manipulate the report and hide negative findings about the effectiveness of the company's product. Like Flock, Evolv refuses to allow IPVM to obtain one of its units for testing. (We wrote about Evolv and its product [here](https://www.aclu.org/news/privacy-technology/are-gun-detectors-the-answer-to-mass-shootings).)

One of the reasons these companies can prevent a tough, independent reviewer such as IPVM from obtaining their equipment is their subscription and/or cloud-based architecture. "Most companies in the industry still operate on the more traditional model of having open systems," IPVM Government Research Director Conor Healy told me. "But there's a rise in demand for cloud-based surveillance, where people can store things in cloud, access them on their phone, see the cameras. Cloud-based surveillance by definition involves central control by the company that's providing the cloud services." Cloud-based architectures can [worsen the privacy risks](https://www.aclu.org/news/civil-liberties/major-hack-of-camera-company-offers-four-key-lessons-on-surveillance) created by a surveillance system. Another consequence of their centralized control is that it increases a company's ability to decide who can carry out an independent review.

We're living in an era when a lot of new technologies are emerging, and many companies are trying to be the first to put them on the market. As Healy told me, "We see a lot of claims of AI, all the time. At this point, almost every product I see out there that gets launched has some component of AI." But like other technologies before them, these products often come in highly immature, premature, inaccurate, or outright deceptive forms, relying on little more than the use of "AI" as a buzzword.

It's vital for independent reviewers to contribute to our ongoing local and national conversations about new surveillance and other police technologies. It's unclear why a company that has faith in its product would attempt to block independent review, and that is all the more reason for buyers to take note of the companies that do.

What you can do: [Stop mass warrantless surveillance: End Section 702](https://action.aclu.org/send-message/stop-mass-warrantless-surveillance-end-section-702)