
Lawmakers Ask DHS Oversight To Look Into Agency Spending On Questionable Shot-Spotting Tech

By: Tim Cushing
May 31, 2024 at 00:31

More bad news for ShotSpotter, which recently re-branded to “SoundThinking” to distance itself from exactly this sort of negative press. Four legislators (three senators, one congressperson) are asking the DHS Inspector General to take a closer look at the tech the DHS is funding via one of its grant programs.

The problem with ShotSpotter is it seems unlikely to put a dent in the public’s arsenal. Multiple cities have chosen to dump the tech rather than continue to pay for false positives, altered shot reports, and nonexistent public safety increases.

The problem with the DHS is that it has already started spending money on a portable “Gunshot Detection System.” It’s capitalized for a reason. It’s a bespoke version of a product already offered by a company called [re-reads DHS press release] Shooter Detection Systems — a redesign of its [deep breath] Guardian Indoor Active Shooter Detection System.

According to the DHS’s PR team, the “enhanced” version of this off-the-shelf shot spotter will detect both sounds and light flashes, apparently aiming to reduce the number of false positives generated by acoustic-only detection systems… like the one offered by [coughs at first half of rebrand] SoundThinking, formerly ShotSpotter.

Whether adding “eyes” to “ears” to spot shots has accomplished a reduction in false positives is still an open question. Whether or not the DHS should continue to pay for shot spotting tech — namely the one offered by the former ShotSpotter — is exactly the question these four lawmakers would like the DHS Inspector General to answer.
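To make the “eyes plus ears” idea concrete, here is a minimal sketch of multimodal corroboration. Everything in it — class names, the time window, the event data — is invented for illustration and is not SDS’s or SoundThinking’s actual algorithm: the idea is simply that a shot is only reported when a muzzle flash and a bang arrive close together in time, so a sound-only event like a firework or a backfire is not flagged.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str         # "acoustic" (bang) or "optical" (muzzle flash)
    timestamp: float  # seconds since some epoch

def confirmed_shots(detections, window=0.05):
    """Report a shot only when an acoustic detection has a corroborating
    optical detection within `window` seconds. A lone bang (sound with no
    flash) is treated as a likely false positive and dropped."""
    acoustic = [d.timestamp for d in detections if d.kind == "acoustic"]
    optical = [d.timestamp for d in detections if d.kind == "optical"]
    shots = []
    for t in acoustic:
        if any(abs(t - f) <= window for f in optical):
            shots.append(t)
    return shots

events = [
    Detection("acoustic", 1.00), Detection("optical", 1.02),  # corroborated
    Detection("acoustic", 5.00),                              # sound only: suppressed
]
print(confirmed_shots(events))  # [1.0]
```

The trade-off is the obvious one: any real shot whose flash is occluded from the sensor now gets missed, which is part of why whether the fused system actually reduces false positives in the field remains an open empirical question.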

The question — as posed in this letter [PDF] from Sen. Ed Markey, Sen. Ron Wyden, Sen. Elizabeth Warren, and Congressperson Ayanna Pressley — is perhaps a bit leading. But the question is valid and the lawmakers’ letter contains plenty of evidence that lends validity to it: should the DHS really be spending federal dollars on grants to local law enforcement agencies seeking to acquire ShotSpotter tech?

Several recent reports have cast substantial doubt on the accuracy and effectiveness of the “ShotSpotter” gunshot detection system and have raised serious questions about its contribution to unjustified surveillance and over-policing of Black, Brown, and Latino communities. Through the Urban Area Security Initiative (UASI) grant program, the Department of Homeland Security (DHS) provides funding to localities to deploy the ShotSpotter system. We request that the DHS Office of Inspector General (OIG) investigate DHS’s spending of taxpayer dollars on ShotSpotter, including potential violations of Title VI of the Civil Rights Act of 1964, which prohibits recipients of federal financial assistance from discriminating based on race, color, and national origin.

And that’s only part of the problem. It’s not even necessarily a ShotSpotter problem per se, but a long-standing problem with law enforcement agencies, who almost always deploy new surveillance “solutions” in low-income neighborhoods, especially those heavily populated by minorities.

The other problem is more technical: the tech just doesn’t work as advertised. Multiple investigations have shown the tech is either (1) unable to reliably detect gunshots, or (2) doesn’t lead to better enforcement of gun-related crime. The cities now dumping the tech say it’s both unreliable and useless. Of course, SoundThinking/ShotSpotter insists otherwise in responses to the latest negative reporting and in its marketing materials, which are still somehow capable of convincing government entities to buy its tech.

That’s where the DHS comes in. It offers grant money to law enforcement agencies — funding that can be used to purchase acoustic gunshot detection tech. The biggest brand in the business is SoundThinking, so naturally that’s where most of this funding goes.

In Massachusetts alone, “UASI [Urban Area Security Initiative] has funded almost a decade of contracts for gunshot detection technology with ShotSpotter in Cambridge, Chelsea, Somerville, and Boston.” Since 2012, according to city records, Boston has spent more than $4 million on ShotSpotter. Elsewhere, municipalities across the country have used UASI funds for the ShotSpotter system. One study found that “[t]hrough an analysis of UASI funding in Los Angeles, Boston, New York City, and Chicago . . . cities spend millions of UASI dollars on contracts with surveillance corporations” such as ShotSpotter.

And what are we getting in return for this combination of federal and local spending? Not much.

The ShotSpotter system’s ineffectiveness has consequences for law enforcement, community response, and the prevention of gun violence. A 2021 study from the Journal of Public Health found “that implementing ShotSpotter technology has no significant impact on firearm-related homicides or arrest outcomes” and that “[p]olicy solutions may represent a more cost-effective measure to reduce urban firearm violence.” Another study from the MacArthur Justice Center at Northwestern University concluded “that more than 90% of ShotSpotter alerts lead police to find no evidence to corroborate gunfire when police arrive at the location ShotSpotter sent them: no shooting, no shell casings, no victims, no witnesses, no guns recovered.”

Not great. More bad stuff from studies and reports: 70% of people in neighborhoods with ShotSpotter systems are either Black or Latino. 75% of those neighborhoods had annual incomes well below the national median.

As I stated above, this is a cop problem: the long-held biases that subject the same people to any new surveillance option. The rest of it is a ShotSpotter problem: it doesn’t spot shots and it doesn’t stop crime. And yet, millions are being spent on it every year, with some of the funding flowing directly from the DHS.

The main point of this letter, however, is to nudge DHS oversight to take a close look at the end result of this funding in terms of purchasing ShotSpotter tech. The federal government is forbidden from spending money on anything that violates federal laws. And this funding might be doing that. Title VI of the Civil Rights Act of 1964 forbids recipients of federal funding from discriminating on the basis of race, color, or national origin. Does planting most of your shot-spotting mics in predominantly non-white neighborhoods violate the Civil Rights Act?

Well, that’s what these lawmakers hope to find out.

For all the preceding reasons, we respectfully request that you open an investigation into DHS’s funding of the ShotSpotter system to determine whether it is an appropriate use of taxpayer dollars, including the critical question of whether such funding may lead to Title VI violations.

It will likely be a while before we hear back on this. But given what’s already been discovered via studies, public records requests, and investigative journalism, it certainly looks as though cops with this tech are violating the law. And one would expect another investigation into ShotSpotter use is going to turn up more of the same biased policing. If that’s the case, it won’t stop cops from being racist. But it will mean they’ll have to spend local funds to keep minorities under their tech-enhanced thumbs.


Wyden Presses FTC To Crack Down On Rampant Auto Industry Privacy Abuses

By: Karl Bode
May 2, 2024 at 22:33

Last year Mozilla released a report showcasing how the auto industry has some of the worst privacy practices of any tech industry in America (no small feat). Massive amounts of driver behavior data are collected by your car, and even more is hoovered up from your smartphone every time you connect. This data isn’t secured, often isn’t encrypted, and is sold to a long list of dodgy, unregulated middlemen.

Last March the New York Times revealed that automakers like GM routinely sell access to driver behavior data to insurance companies, which then use that data to justify jacking up your rates. The practice isn’t clearly disclosed to consumers, and has resulted in 11 federal lawsuits in less than a month.

Now Ron Wyden’s office is back with the results of their preliminary investigation into the auto industry, finding that it routinely provides customer data to law enforcement without a warrant and without informing consumers. The auto industry, unsurprisingly, couldn’t even be bothered to adhere to a performative, voluntary pledge the whole sector made in 2014 to not do precisely this sort of thing:

“Automakers have not only kept consumers in the dark regarding their actual practices, but multiple companies misled consumers for over a decade by failing to honor the industry’s own voluntary privacy principles. To that end, we urge the FTC to investigate these auto manufacturers’ deceptive claims as well as their harmful data retention practices.”

The auto industry can get away with this because the U.S. remains too corrupt to pass even a baseline privacy law for the internet era. The FTC, which has been left under-staffed, under-funded, and boxed in by decades of relentless lobbying and mindless deregulation, lacks the resources to pursue these kinds of violations at any consistent scale, which is precisely how corporations like it.

Maybe the FTC will act, maybe it won’t. If it does, it will take two years to get the case together, the financial penalties will be a tiny pittance in relation to the total amount of revenues gleaned from privacy abuses, and the final ruling will be bogged down in another five years of legal wrangling.

This wholesale violation of user privacy has dire, real-world consequences. Wyden’s office has also been taking aim at data brokers who sell abortion clinic visitor location data to right wing activists, who then have turned around to target vulnerable women with health care disinformation. Wireless carrier location data has also been abused by everyone from stalkers to people pretending to be law enforcement.

The cavalier treatment of your auto data poses those same risks, Wyden’s office notes:

“Vehicle location data can reveal intimate details of a person’s life, including for those who seek care across state lines, attend protests, visit mental or behavioral health professionals or seek treatment for substance use disorder.”

Keep in mind this is the same auto industry currently trying to scuttle right to repair reforms under the pretense that they’re just trying to protect consumer privacy (spoiler: they aren’t).

This same story is playing out across a litany of industries. Again, it’s just a matter of time until there’s a privacy scandal so massive and ugly that even our corrupt Congress is shaken from its apathy, though you’d hate to think what it will have to look like.


Senate Approves Section 702 Reauthorization, Keeps Only The Bad Stuff

By: Tim Cushing
April 22, 2024 at 18:29

The government had a few years to sort this out, but as usual, the final call came down to the last minute. Shortly after Section 702 expired at midnight, April 19, the Senate pushed through a two-year reauthorization — one pretty much free of any reforms.

This happened despite there being a large and vocal portion of the Republican party seeking to curb the FBI’s access to these collections because some of their own had been subjected to the sort of abuse that has become synonymous with the FBI’s interaction with this particular surveillance program.

The reauthorization passed to the Senate from the House had been stripped of a proposed warrant requirement and saddled with an especially expansive definition of the term “electronic communication service provider.” Here’s how Senator Ron Wyden explained it while speaking out against the amendment:

Now, if you have access to any communications, the government can force you to help it spy. That means anyone with access to a server, a wire, a cable box, a wifi router, a phone, or a computer. Think about the millions of Americans who work in buildings and offices in which communications are stored or pass through.

After all, every office building in America has data cables running through it. These people are not just the engineers who install, maintain and repair our communications infrastructure; there are countless others who could be forced to help the government spy, including those who clean offices and guard buildings. If this provision is enacted, the government could deputize any one of these people against their will, and force them to become an agent for Big Brother.

For example, by forcing an employee to insert a USB thumb drive into a server at an office they clean or guard at night.

This could all happen without any oversight. The FISA Court won’t know about it. Congress won’t know about it. The Americans who are handed these directives will be forbidden from talking about it. And unless they can afford high priced lawyers with security clearances who know their way around the FISA Court, they will have no recourse at all.

So, instead of reform, we’re getting an even worse version of what’s already been problematic, especially when the FBI’s involved. As the clock ticked down on this vote (but not really: the FISA court had already granted the Biden administration’s request to keep the program operable as-is until 2025), attempts were made to strip the bill of this dangerous addition and add back in the warrant requirement amendment that had failed in the House.

None of this worked, as Gaby Del Valle reports for The Verge:

Sens. Ron Wyden (D-OR) and Josh Hawley (R-MO) introduced an amendment that would have struck language in the House bill that expanded the definition of “electronic communications service provider.” Under the House’s new provision, anyone “who has access to equipment that is being or may be used to transmit or store wire or electronic communications” [could be compelled to assist with surveillance]. The expansion, Wyden has claimed, would force “ordinary Americans and small businesses to conduct secret, warrantless spying.” The Wyden-Hawley amendment failed 34-58, meaning that the next iteration of the FISA surveillance program will be more expansive than before.

Sens. Rand Paul (R-KY) and Dick Durbin (D-IL) introduced separate amendments imposing warrant requirements on surveilling Americans. A similar amendment failed in the House on a 212-212 vote. Durbin’s narrower warrant requirement wouldn’t require intelligence agencies to obtain a warrant to query for those communications, though it requires one to access them.

The version headed to the president’s desk is the worst version. The rush to push this version of the bill through possibly gained a little urgency when two unnamed service providers informed the government they would stop complying with FISA orders pretty much immediately if the Senate didn’t renew the program.

One communications provider informed the National Security Agency that it would stop complying on Monday with orders under Section 702 of the Foreign Intelligence Surveillance Act, which enables U.S. intelligence agencies to gather without a warrant the digital communications of foreigners overseas — including when they text or email people inside the United States.

Another provider suggested that it would cease complying at midnight Friday unless the law is reauthorized, according to the people familiar with the matter, who spoke on the condition of anonymity to discuss sensitive negotiations.

We’ll never know how empty these threats might have been or if the Intelligence Community would have even noticed the brief interruption in the flow of communications. Section 702 has been given a two-year extension in the form approved by the Senate, superseding the FISA Court’s blessing of one more year of uninterrupted spying if discussions over renewal blew past the April 19, 2024 deadline.

If you’re a fan of bipartisan efforts — no matter the outcome — well… enjoy your victory, I guess. But there’s nothing about this renewal debacle that can actually be called a win. Unless you’re the FBI, of course. Then it’s all gravy.


Senate Must Follow House’s Lead In Passing Fourth Amendment Is Not For Sale Act

By: Mike Masnick
April 20, 2024 at 04:39

The Fourth Amendment exists for a reason. It’s supposed to protect our private possessions and data from government snooping, unless they have a warrant. It doesn’t entirely prevent the government from getting access to data, they just need to show probable cause of a crime.

But, of course, the government doesn’t like to make the effort.

And these days, many government agencies (especially law enforcement) have decided to take the shortcut that money can buy: they’re just buying private data on the open market from data brokers and avoiding the whole issue of a warrant altogether.

This could be solved with a serious, thoughtful, comprehensive privacy bill. I’m hoping to have a post soon on the big APRA data privacy bill that’s getting attention lately (it’s a big bill, and I just haven’t had the time to go through the entire bill yet). In the meantime, though, there was some good news, with the House passing the “Fourth Amendment is Not For Sale Act,” which was originally introduced in the Senate by Ron Wyden and appears to have broad bipartisan support.

We wrote about it when it was first introduced, and again when the House voted it out of committee last year. The bill is not a comprehensive privacy bill, but it would close the loophole discussed above.

The Wyden bill just says that if a government agency wants to buy such data, if it would have otherwise needed a warrant to get that data in the first place, it should need to get a warrant to buy it in the market as well.
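As a reading of that rule (my paraphrase, not the bill’s statutory text), it reduces to a single conditional: the acquisition method stops mattering, and only the warrant status of the underlying data does.

```python
def may_acquire(needs_warrant_to_compel: bool, has_warrant: bool) -> bool:
    """Sketch of the bill's core rule: if compelling the data would
    require a warrant, buying the same data from a broker requires one
    too. Data the government could freely obtain anyway is unaffected."""
    return (not needs_warrant_to_compel) or has_warrant

# Warrant-protected data bought from a broker without a warrant:
print(may_acquire(needs_warrant_to_compel=True, has_warrant=False))  # False
# Same data, but a warrant was actually obtained:
print(may_acquire(needs_warrant_to_compel=True, has_warrant=True))   # True
```

The point of writing it this way is that the purchase path no longer appears anywhere in the rule — which is exactly the loophole the bill closes.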

Anyway, the bill passed 219 to 199 in the House, and it was (thankfully) not a partisan vote at all.


It is a bit disappointing that the vote was so close and that so many Representatives want to allow government agencies, including law enforcement, to be able to purchase private data to get around having to get a warrant. But, at least the majority voted in favor of the bill.

And now, it’s up to the Senate. Senator Wyden posted on Bluesky about how important this bill is, and hopefully the leadership of the Senate understands that as well.

Can confirm. This is a huge and necessary win for Americans' privacy, particularly after the Supreme Court gutted privacy protections under Roe. Now it's time for the Senate to do its job and follow suit.


— Senator Ron Wyden (@wyden.senate.gov) Apr 17, 2024 at 3:30 PM



Once Again, Ron Wyden Had To Stop Bad “Protect The Children” Internet Bills From Moving Forward

By: Mike Masnick
March 7, 2024 at 22:36

Senator Ron Wyden is a one-man defense for preventing horrible bills from moving forward in the Senate. Last month, he stopped Josh Hawley from moving a very problematic STOP CSAM bill forward, and now he’s had to do it again.

A (bipartisan) group of senators traipsed to the Senate floor Wednesday evening. They tried to skip the line and quickly move some bad bills forward by asking for unanimous consent. Unless someone’s there to object, it effectively moves the bill forward, ending committee debate about it. Traditionally, this process is used for moving non-controversial bills, but lately it’s been used to grandstand about stupid bills.

Senator Lindsey Graham announced his intention to pull this kind of stunt on bills that he pretends are about “protecting the children” but which do no such thing in reality. Instead of it being just him, he rounded up a bunch of senators and they all pulled out the usual moral panic lines about two terrible bills: EARN IT and STOP CSAM. Both bills are designed to sound like good ideas about protecting children, but the devil is very much in the details, as both bills undermine end-to-end encryption while assuming that if you just put liability on websites, they’ll magically make child predators disappear.

And while both bills pretend not to attack encryption — and include some language about how they’re not intended to do so — both of them leave open the possibility that the use of end-to-end encryption will be used as evidence against websites for bad things done on those websites.

But, of course, as is the standard for the group of grandstanding senators, they present these bills as (1) perfect and (2) necessary to “protect the children.” The problem is that the bills are actually (1) ridiculously problematic and (2) will actually help bad people online by making end-to-end encryption a liability.

The bit of political theater kicked off with Graham having Senators Grassley, Cornyn, Durbin, Klobuchar, and Hawley talk on and on about the poor kids online. Notably, none of them really talked about how their bills worked (because that would reveal how the bills don’t really do what they pretend they do). Durbin whined about Section 230, misleadingly and mistakenly blaming it for the fact that bad people exist. Hawley did the thing that he loves doing, in which he does his mock “I’m a big bad Senator taking on those evil tech companies” schtick, while flat out lying about reality.

But Graham closed it out with the most misleading bit of all:

In 2024, here’s the state of play: the largest companies in America — social media outlets that make hundreds of billions of dollars a year — you can’t sue if they do damage to your family by using their product because of Section 230

This is a lie. It’s a flat out lie and Senator Graham and his staffers know this. All Section 230 says is that if there is content on these sites that violates the law, the liability goes after whoever created the content. If the features of the site itself “do damage,” then you can absolutely sue the company. But no one is actually complaining about the features. They’re complaining about content. And the liability on the content has to go to who created it.

The problem here is that Graham and all the other senators want to hold companies liable for the speech of users. And that is a very, very bad idea.

Now these platforms enrich our lives, but they destroy our lives.

These platforms are being used to bully children to death.

They’re being used to take sexual images, voluntarily and involuntarily obtained, and send them to the entire world. And there’s not a damn thing you can do about it. We had a lady come before the committee, a mother, saying that her daughter was on a social media site that had anti-bullying provisions. They complained three times about what was happening to her daughter. She killed herself. They went to court. They got kicked out by Section 230.

I don’t know the details of this particular case, but first off, the platforms didn’t bully anyone. Other people did. Put the blame on the people actually causing the harm. Separately, and importantly, you can’t blame someone’s suicide on someone else when no one knows the real reasons. Otherwise, you actually encourage increased suicides, as it gives people an ultimate way to “get back” at someone.

Senator Wyden got up and, as he did last month, made it quite clear that we need to stop child sexual abuse and predators. He talked about his bill, which would actually help on these issues by giving law enforcement the resources it needs to go after the criminals, rather than the idea of the bills being pushed that simply blame social media companies for not magically making bad people disappear.

We’re talking about criminal issues, and Senator Wyden is looking to handle it by empowering law enforcement to deal with the criminals. Senators Graham, Durbin, Grassley, Cornyn, Klobuchar, and Hawley are looking to sue tech companies for not magically stopping criminals. One of those approaches makes sense for dealing with criminal activity. And yet it’s the other one that a bunch of senators have lined up behind.

And, of course, beyond the dangerous approach of EARN IT, it inherently undermines encryption, which makes kids (and everyone) less safe, as Wyden also pointed out.

Now, the specific reason I oppose EARN IT is it will weaken the single strongest technology that protects children and families online. Something known as strong encryption.

It’s going to make it easier to punish sites that use encryption to secure private conversations and personal devices. This bill is designed to pressure communications and technology companies to scan users’ messages.

I, for one, don’t find that a particularly comforting idea.

Now, the sponsors of the bill have argued — and Senator Graham’s right, we’ve been talking about this a while — that their bills don’t harm encryption. And yet the bills allow courts to punish companies that offer strong encryption.

In fact, while it includes some language about protecting encryption, it explicitly allows encryption to be used as evidence for various forms of liability. Prosecutors are going to be quick to argue that deploying encryption was evidence of a company’s negligence in preventing the distribution of CSAM, for example.

The bill is also designed to encourage scanning of content on users’ phones or computers before information is sent over the Internet, which has the same consequences as breaking encryption. That’s why a hundred civil society groups, including the American Library Association — people that I think all of us have worked with — Human Rights Campaign, the list goes on… Restore the Fourth. All of them oppose this bill because of its impact on essential security.

Weakening encryption is the single biggest gift you can give to these predators and these god-awful people who want to stalk and spy on kids. Sexual predators are gonna have a far easier time stealing photographs of kids, tracking their phones, and spying on their private messages once encryption is breached. It is very ironic that a bill that’s supposed to make kids safer would have the effect of threatening the privacy and security of all law-abiding Americans.

My alternative — and I want to be clear about this because I think Senator Graham has been sincere about saying that this is a horrible problem involving kids. We have a disagreement on the remedy. That’s what is at issue.

And what I want us to do is to focus our energy on giving law enforcement officials the tools they need to find and prosecute these monstrous criminals responsible for exploiting kids and spreading vile abuse materials online.

That can help prevent kids from becoming victims in the first place. So, to do this, I have introduced the Invest in Child Safety Act to direct five billion dollars toward three specific things to deal with this very urgent problem.

Graham then gets up to respond and lies through his teeth:

There’s nothing in this bill about encryption. We say that this is not an encryption bill. The bill as written explicitly prohibits courts from treating encryption as an independent basis for liability.

We’re agnostic about that.

That’s not true. As Wyden said, the bill has some hand-wavey language about not treating encryption as an independent basis for liability, but it does explicitly allow for encryption to be one of the factors that can be used to show negligence by a platform, as long as you combine it with other factors.

Section (7)(A) is the hand-wavey bit saying you can’t use encryption as “an independent basis” to determine liability, but (7)(B) effectively wipes that out by saying nothing in that section about encryption “shall be construed to prohibit a court from considering evidence of actions or circumstances described in that subparagraph.” In other words, you just have to add a bit more, and then you can say “and also, look, they use encryption!”

And another author of the bill, Senator Blumenthal, has flat out said that EARN IT is deliberately written to target encryption. He falsely claims that companies would “use encryption… as a ‘get out of jail free’ card.” So, Graham is lying when he says encryption isn’t a target of the bill. One of his co-authors on the bill admits otherwise.

Graham went on:

What we’re trying to do is hold these companies accountable by making sure they engage in best business practices. The EARN IT Act simply says for you to have liability protections, you have to prove that you’ve tried to protect children. You have to earn it. It’s just not given to you. You have to have in place the best business practices that voluntary commissions lay out as the best way to harden these sites against sexual exploitation. If you do those things you get liability protection; it’s just not given to you forever. So this is not about encryption.

As to your idea, I’d love to talk to you about it. Let’s vote on both. But the bottom line here is there’s always a reason not to do anything that holds these people liable. That’s the bottom line. They’ll never agree to any bill that allows you to get them in court, ever. If you’re waiting on these companies to give this body permission for the average person to sue, it ain’t never going to happen.

So… all of that is wrong. First of all, the very original version of the EARN IT Act did have provisions to make companies “earn” 230 protections by following best practices, but that’s been out of the bill for ages. The current version has no such thing.

The bill does set up a commission to create best practices, but (unlike the earlier versions of the bill) those best practice recommendations have no legal force or requirements. And there’s nothing in the bill that says if you follow them you get 230 protections, and if you don’t, you don’t.

Does Senator Graham even know which version of the bill he’s talking about?

Instead, the bill outright modifies Section 230 (before the Commission even researches best practices) and says that people can sue tech companies for the distribution of CSAM. This includes using the offering of encryption as evidence to support claims that CSAM distribution resulted from “reckless” behavior by a platform.

Either Senator Graham doesn’t know which bill he’s talking about (even though it’s his own bill) or he doesn’t remember that he changed it to do something different from what it originally tried to do.

It’s ridiculous that Senator Wyden remains the only senator who sees this issue clearly and is willing to stand up and say so. He’s the only one who seems willing to block the bad bills while at the same time offering a bill that actually targets the criminals.

Lawmakers Want Pause on Federal Funds for Predictive Policing

By J.D. Tuccille | 21 February 2024, 13:00
Sen. Ron Wyden (D–Ore.) speaks in the U.S. Capitol | Annabelle Gordon - CNP/CNP / Polaris/Newscom

Should data scientists be in the business of fingering Americans for crimes they could commit, someday? Last month, a group of federal lawmakers asked the Department of Justice to stop funding such programs—at least until safeguards can be built in. It's just the latest battle over a controversial field of law enforcement that seeks to peer into the future to fight crime.

"We write to urge you to halt all Department of Justice (DOJ) grants for predictive policing systems until the DOJ can ensure that grant recipients will not use such systems in ways that have a discriminatory impact," reads a January letter to Attorney General Merrick Garland from U.S. Sen. Ron Wyden (D–Ore.) and Rep. Yvette Clarke (D–N.Y.), joined by Senators Jeff Merkley (D–Ore.), Alex Padilla (D–Calif.), Peter Welch (D–Vt.), John Fetterman (D–Pa.), and Ed Markey (D–Mass.). "Mounting evidence indicates that predictive policing technologies do not reduce crime. Instead, they worsen the unequal treatment of Americans of color by law enforcement."

The letter emphasizes worries about racial discrimination, but it also raises concerns about accuracy and civil liberties that, since day one, have dogged schemes to address crimes that haven't yet occurred.

Fingering Criminals-To-Be

Criminal justice theorists have long dreamed of stopping crimes before they happen. Crimes prevented mean no victims, costs, or perpetrators to punish. That's led to proposals for welfare and education programs intended to deter kids from becoming predators. It's also inspired "predictive policing" efforts that assume crunching numbers can tell you who is prone to prey on others. It's an intriguing idea, if you ignore the dangers of targeting people for what they might do in the future.

"For years, businesses have used data analysis to anticipate market conditions or industry trends and drive sales strategies," Beth Pearsall wrote in the Department of Justice's NIJ Journal in 2010. "Police can use a similar data analysis to help make their work more efficient. The idea is being called 'predictive policing,' and some in the field believe it has the potential to transform law enforcement by enabling police to anticipate and prevent crime instead of simply responding to it."

Interesting. But marketers targeting neighborhoods for home warranty pitches only annoy people when they're wrong; policing efforts have much higher stakes when they're flawed or malicious.

"The accuracy of predictive policing programs depends on the accuracy of the information they are fed," Reason's Ronald Bailey noted in 2012. "We should always keep in mind that any new technology that helps the police to better protect citizens can also be used to better oppress them."

Predictive Policing in (Bad) Action

People worried about the dangers of predictive policing often reference the 2002 movie Minority Report, in which a science-fiction take on the practice is abused to implicate innocent people. Recent years, though, have delivered real-life cautionary tales about misusing data science to torment people for crimes they haven't committed.

"First the Sheriff's Office generates lists of people it considers likely to break the law, based on arrest histories, unspecified intelligence and arbitrary decisions by police analysts," the Tampa Bay Times reported in 2020 of Pasco County, Florida's predictive policing program. "Then it sends deputies to find and interrogate anyone whose name appears, often without probable cause, a search warrant or evidence of a specific crime."

In practice, as a former deputy described the program's treatment of those it targeted: "Make their lives miserable until they move or sue."

Sue they did, with many plaintiffs represented by the Institute for Justice. Last year, with legal costs mounting, the sheriff's office claimed in court documents that it discontinued predictive policing efforts.

Garbage In, Garbage Out

A big problem with predictive policing is that it relies heavily on honesty and dispassion in people who create algorithms and enter data. As recent arguments over biases in internet search results and artificial intelligence reveal, the results that come out of a data-driven system are only as good as what goes in.

"One foundational problem with data-driven policing is that it treats information as neutral, ignoring how it can reflect over-policing and historical redlining," the Brennan Center for Justice's Ángel Díaz wrote in 2021. He added that tech vendors dealing with the NYPD's predictive policing program "proposed relying on data such as educational attainment, the availability of public transportation, and the number of health facilities and liquor licenses in a given neighborhood to predict areas of the city where crime was likely to occur."

Are those real predictors of criminal activity? Maybe. Or maybe they're excuses for making people's lives miserable until they move or sue, as happened in Pasco County.

Forecasts Fueled by the Feds

As with so many big ideas with scary potential, impetus for development and implementation comes from government funding and encouragement.

"The National Institute of Justice, the DOJ's research, development and evaluation arm, regularly provides seed money for grants and pilot projects to test out ideas like predictive policing," American University law professor Andrew Guthrie Ferguson commented earlier this month. "It was a National Institute of Justice grant that funded the first predictive policing conference in 2009 that launched the idea that past crime data could be run through an algorithm to predict future criminal risk."

Of course, it's not bad to seek innovation and to look for new tools that could make the public safer. But hopefully, those funding such research want it to make the world a better place, not worse. And when lawmakers asked the Justice Department in 2022 for some documentation on predictive policing, officials admitted they didn't really know how money was being spent, let alone its impact.

"It remains an unanswered [question], for example, to what degree such tools are, or ever have been, assessed for compliance with civil rights law," Gizmodo's Dell Cameron wrote at the time.

Hence the letter from Wyden and company. After years of haphazard funding and development, warnings from civil libertarians, and abuses by police, some lawmakers want the federal government to stop funding predictive policing efforts until due diligence is done and safeguards are in place.

You have to wonder if predictive policing programs predicted the field's own current troubles.

The post Lawmakers Want Pause on Federal Funds for Predictive Policing appeared first on Reason.com.
