04-03-2024, 02:26 AM
This article is not for those who shy away from paranoia...
I've heard many people crowing about how they don't care if mega businesses have collected vast amounts of data about them. Usually, their dismissal relates to their inability to see how that data might possibly have any meaningful value or represent any realistic danger at all.
I don't bother trying to justify my concerns, and usually just allow them the luxury of considering me paranoid. Life's easier that way.
From TechCrunch: ‘Reverse’ searches: The sneaky ways that police tap tech companies for your private data
U.S. police departments are increasingly relying on a controversial surveillance practice to demand large amounts of users’ data from tech companies, with the aim of identifying criminal suspects.
So-called “reverse” searches allow law enforcement and federal agencies to force big tech companies, like Google, to turn over information from their vast stores of user data. These orders are not unique to Google — any company with access to user data can be compelled to turn it over — but the search giant has become one of the biggest recipients of police demands for access to its databases of users’ information.
For example, authorities can demand that a tech company turn over information about every person who was in a particular place at a certain time based on their phone’s location, or who searched for a specific keyword or query. Thanks to a recently disclosed court order, authorities have shown they are able to scoop up identifiable information on everyone who watched certain YouTube videos.
Our collection of government agencies has a long history of "fishing" for crime (perhaps they can't find any crime otherwise) by scouring huge chunks of data collected "incidentally" by mega data companies like Google, Yahoo, Microsoft, etc. So rather than spend the time, money, and resources to actually investigate... they opt to "demand" the data from service providers... who generally don't say "no" to their requests.
A recently unsealed search application filed in a Kentucky federal court last year revealed that prosecutors wanted Google to “provide records and information associated with Google accounts or IP addresses accessing YouTube videos for a one week period, between January 1, 2023, and January 8, 2023.”
The search application said that as part of an undercover transaction, the suspected money launderer shared a YouTube link with investigators, and investigators sent back two more YouTube links. The three videos — which TechCrunch has seen and have nothing to do with money laundering — collectively racked up about 27,000 views at the time of the search application. Still, prosecutors sought an order compelling Google to share information about every person who watched those three YouTube videos during that week, likely in a bid to narrow down the list of individuals to their top suspect, who prosecutors presumed had visited some or all of the three videos.
This particular court order was easier for law enforcement to obtain than a traditional search warrant because it sought access to connection logs about who accessed the videos, rather than the higher-standard search warrant that courts can use to demand that tech companies turn over the contents of someone’s private messages.
The Kentucky federal court approved the search order under seal, blocking its public release for a year. Google was barred from disclosing the demand until last month when the court’s order expired...
“The government is essentially dragooning YouTube into serving as a honeypot for the feds to ensnare a criminal suspect by triangulating on who’d viewed the videos in question during a specific time period,” said Riana Pfefferkorn, a research scholar at Stanford, speaking about the recent order targeting YouTube users. “But by asking for information on everyone who’d viewed any of the three videos, the investigation also sweeps in potentially dozens or hundreds of other people who are under no suspicion of wrongdoing, just like with reverse search warrants for geolocation.”
Law enforcement might defend the surveillance-gathering technique for its uncanny ability to catch even the most elusive suspected criminals. But plenty of innocent people have been caught up in these investigative dragnets by mistake — in some cases as criminal suspects — simply by having phone data that appears to place them near the scene of an alleged crime.
The article makes one statement that I must protest...
Some companies choose not to store user data and others scramble the data so it can’t be accessed by anyone other than the user. That prevents companies from turning over access to data that they don’t have or cannot access — especially when laws change from one day to the next, such as when the U.S. Supreme Court overturned the constitutional right to access abortion.
The characterization of abortion access as a "constitutional right" does not stand up to scrutiny. Not that I want to get into that can of worms... but casually inserting it into this article shows the true effect of "activist journalism": it engenders tropes that are not correct (as well as detracting from the article's point).