Seven Ways Technology Can Get You Arrested for Something You Didn’t Do

Millions of people are arrested every year: about 7.36 million in 2022 alone. That figure feels abstract to most of us, because most of us aren't criminals, and we assume that if we haven't done anything wrong, we have nothing to worry about.

But that assumption has at least two flaws. First, almost everyone does something illegal at some point in their life, whether a "victimless" offense or something they simply don't realize is against the law, so your chances of ever being arrested may be higher than you think. Second, law enforcement agencies and corporations are increasingly taking humans out of the loop and relying on technologies like automation, facial recognition, and artificial intelligence (often in combination), and these technologies are deeply flawed. Interacting with them can lead to false accusations and even arrest, even if you have done absolutely nothing wrong. Here are seven ways you could get arrested today without ever contemplating a crime.

Self-checkout

Self-checkout kiosks are controversial, and many retailers are rethinking them, but they are still quite common. And if they make a mistake, you could be in big trouble. Take the example of Olympic athlete Meagan Pettipiece, who rang up $176 worth of groceries at a Walmart self-checkout. The kiosk failed to register two of her items: ham and asparagus. Pettipiece scanned them, but the machine never recorded the sale; she had done nothing wrong. Police were called anyway, and when they searched her bag they found marijuana and prescription drugs. She was arrested and charged with theft and possession of a controlled substance.

The charges were later dropped, but not before Pettipiece's life was upended: she resigned from her coaching job and suffered reputational damage that will follow her for years. So the next time you buy groceries at the self-checkout, make sure every item actually scans.

Facial recognition

Facial recognition technology is flawed and unreliable (often in strikingly racist ways), but that hasn't stopped corporations and law enforcement from using it, with unsurprisingly disastrous results. For example, Harvey Murphy Jr. was arrested after facial recognition software used by the Houston retailers Macy's and Sunglass Hut identified him as the perpetrator of an armed robbery. Murphy spent two weeks in jail, during which he was allegedly assaulted by other inmates several times. But Murphy wasn't just innocent: he wasn't even in Texas when the robbery happened. Misidentifications like this happen often, and one could happen to you if a facial recognition tool glitches and spits out your name.

License plate cameras

Automatic license plate readers are used by police departments to identify vehicles involved in crimes. If a car is, say, linked to a robbery or a shooting, the readers can pick out its plate and alert the police, who then broadcast the car's make, model, and license number to officers.

You can guess where this leads: license plate readers make mistakes. In North Carolina, for example, Jacqueline McNeil was arrested for alleged involvement in a shooting. The arrest was based on an automated license plate read that incorrectly identified her car as being involved. She was held for several hours, interrogated, and then released. She ultimately settled a lawsuit against the city for $60,000.

Incorrect databases

If you've ever had a run-in with the law that was resolved (a dismissed case, a settled lawsuit), you may think your nightmare is over and you can get back to your life. But these days, case management is increasingly automated, and the software that handles it is as buggy and unreliable as any other software. A few years ago in California, a new case management system suddenly began counting old, resolved arrest warrants as active, and a wave of false arrests followed because police were acting on bad information. In other words, if a single flag in a complex database flips, something you cleared up years ago could get you arrested all over again.

Incorrect photo analysis

There's really no such thing as online privacy: your files, photos, voicemails, and messages are stored somewhere, and someone has access to them, even if they're supposedly protected. Companies like Google, which see huge amounts of media flow through their servers, routinely use automated scanning to identify and flag material that may be illegal, and when they get it wrong, the mistake can leave a ruined life or two in its wake.

In 2021, for example, a father took photographs of his baby and sent them to his doctor for review. Google's scanning algorithm flagged the photos and reported the man to law enforcement on suspicion of trafficking in child sexual abuse imagery. Police quickly cleared him of any wrongdoing, but Google refused to restore his accounts. The lesson: anything you post, store, email, or create on an internet-connected platform is not private, and it can easily be misinterpreted by a soulless algorithm, leading to your arrest, or worse.

Field drug tests

Police often use field drug tests when they suspect someone is carrying or under the influence of a controlled substance: about 773,000 of the roughly 1.5 million drug arrests in this country are based on evidence collected through field tests. But these tests are only considered "presumptive," because the technology is not very reliable. The tests are cheap (about $2 each) and disposable, and they are so ridiculously bad at their job that everything from cotton candy to vitamins has been mistaken for drugs.

Clarice Doku was arrested in 2018 after a field drug test identified the folic acid she was taking in hopes of becoming pregnant as ecstasy. She and her husband spent two weeks in jail; she lost her job, he missed his citizenship ceremony, and the charges were eventually dropped.

Technology makes our lives easier, until it makes them much, much harder, especially when it produces a false arrest based on nothing but phantom data you have no control over.
