And to do that, they also processed data on thousands of innocent people, probably without any legal basis or permission.
Exactly. This is why this scares me. The police are vacuuming up data on everyone, and who knows who else they'll go after, especially if the wrong person gets into power. Even at the state level. I sure hope DeSantis' Florida doesn't have this capability.
ctOS, it is happening
I'm less concerned about that if it's purely public data. If a police officer sat in a helicopter looking for drivers driving erratically, then notified a trooper on the ground to check on the car and perform a field sobriety test if there was cause to do so, I think that would fall within the confines of the law, even though thousands of cars could have been in their field of view and considered for potential DUI.
I am of the opinion that if the data is either directly in public view, or the user can opt out of persisting it and it is available to the general public (even if for a fee), then it's fine to use the data. I also think an AI algorithm's suggestions on their own should not be considered probable cause: you can use them to narrow down suspects, but you need actual evidence for a warrant or arrest.
I think the issue I have with this situation is collecting and storing such a vast amount of travel data on individuals without their consent. If leaked, that data could be used to track down victims of stalking and abuse, or political dissidents.