96% Of Hospitals Share Sensitive Visitor Data With Meta, Google, and Data Brokers

By: Karl Bode

I’ve mentioned more than a few times how the singular hyperventilation about TikTok is kind of a silly distraction from the fact that the United States is too corrupt to pass a modern privacy law, resulting in no limit of dodgy behavior, abuse, and scandal. We have no real standards thanks to corruption, and most people have no real idea of the scale of the dysfunction.

Case in point: a new study out of the University of Pennsylvania (hat tip to The Register) analyzed a nationally representative sample of 100 U.S. hospitals, and found that 96 percent of them were doling out sensitive visitor data to Google, Meta, and a vast coalition of dodgy data brokers.

Hospitals, it should be clear, aren’t legally required to publish website privacy policies that clearly detail how and with whom they share visitor data. Again, because we’re too corrupt as a country to impose and enforce such requirements. The FTC does have some jurisdiction, but it’s too short-staffed and underfunded (quite intentionally) to tackle the real scope of U.S. online privacy violations.

So the study found that a chunk of these hospital websites didn’t even have a privacy policy. And of the ones that did, about half the time the overly verbose pile of ambiguous and intentionally confusing legalese didn’t really inform visitors that their data was being transferred to a long list of third parties. Or, for that matter, who those third parties even are:

“…we found that although 96.0% of hospital websites exposed users to third-party tracking, only 71.0% of websites had an available website privacy policy…Only 56.3% of policies (and only 40 hospitals overall) identified specific third-party recipients.”

Data in this instance can involve everything from email and IP addresses to what you clicked on, what you researched, demographic info, and location. This was all a slight improvement over a study they did a year earlier showing that 98 percent of hospital websites shared sensitive data with third parties. The professors clearly knew what to expect, but were still disgusted in comments to The Register:

“It’s shocking, and really kind of incomprehensible,” said Dr Ari Friedman, an assistant professor of emergency medicine at the University of Pennsylvania. “People have cared about health privacy for a really, really, really long time. It’s very fundamental to human nature. Even if it’s information that you would have shared with people, there’s still a loss, just an intrinsic loss, when you don’t even have control over who you share that information with.”

If this data is getting into the hands of dodgy international and unregulated data brokers, there’s no limit of places it can end up. Brokers collect a huge array of demographic, behavioral, and location data, use it to create detailed profiles of individuals, then sell access in a million different ways to a long line of additional third parties, including the U.S. government and foreign intelligence agencies.

There should be hard requirements for transparent, clear, and concise notifications of exactly what data is being collected and sold, and to whom. There should be hard requirements that users have the ability to opt out (or, preferably in the case of sensitive info, opt in). There should be hard punishment for companies and executives that play fast and loose with consumer data.

And we have none of that because our lawmakers decided, repeatedly, that making money was more important than market health, consumer welfare, and public safety. The result has been a parade of scandals that skirt ever closer to people being killed, at scale.

So again, the kind of people who whine about the singular privacy threat that is TikTok (like, say, FCC Commissioner Brendan Carr or Senator Marsha Blackburn) while having nothing to say about the much broader dysfunction created by rampant corruption are advertising that they either don’t know what they’re talking about, or aren’t addressing the full scope of the problem in good faith.
