A Chinese fugitive was arrested after an AI-powered facial recognition system alerted authorities to his presence in a crowd of 60,000 people attending a pop concert. Welcome to the age of robot snitches.

Wanted for “economic crimes,” the 31-year-old man was reportedly shocked when police apprehended him. He’d traveled nearly 100 km (about 60 miles) with his wife and friends to attend the event, a concert headlined by Cantopop star Jacky Cheung, before authorities nabbed him on a tip from a venue camera.

In his defense, Jacky Cheung is awesome.

Chinese authorities have fully embraced facial recognition systems and AI-powered surveillance monitoring.

The way it works is pretty simple: if you’re in public in China, anywhere a camera can see you, you’re being identified by AI and tracked in real time. The government has free rein in its use of AI to gather and process data based on your activities.
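At its core, the identification step amounts to comparing a face "embedding" extracted from a camera frame against a watchlist of known embeddings. The sketch below illustrates that idea only; the names, the tiny 4-dimensional vectors, and the similarity threshold are all invented for illustration (real systems use neural-network embeddings with hundreds of dimensions).

```python
import math

# Hypothetical watchlist of face embeddings. Real embeddings come from a
# neural network and have hundreds of dimensions; these are toy values.
WATCHLIST = {
    "suspect_a": [0.9, 0.1, 0.3, 0.7],
    "suspect_b": [0.2, 0.8, 0.5, 0.1],
}

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def match_face(embedding, threshold=0.95):
    """Return the watchlist ID most similar to the embedding, or None."""
    best_id, best_score = None, threshold
    for person_id, ref in WATCHLIST.items():
        score = cosine_similarity(embedding, ref)
        if score >= best_score:
            best_id, best_score = person_id, score
    return best_id

# A camera frame yields an embedding very close to suspect_a's reference.
print(match_face([0.88, 0.12, 0.31, 0.69]))  # suspect_a
# A face on nobody's watchlist falls below the threshold.
print(match_face([0.0, 0.0, 1.0, 0.0]))      # None
```

Scaled up to thousands of cameras and a national ID database, the same matching step is what turns a concert crowd into a searchable index of faces.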

On the surface, it seems like finding fugitives who hide in plain sight is a good thing. According to an FBI database, there are over 789,000 outstanding felony warrants in the US, but that number is sure to be a low-ball figure due to under-reporting by law enforcement agencies nationwide. Getting those people off the streets, or at least bringing them to justice, should arguably provide social benefit. But what cost are we willing to pay in order to achieve that goal?

AI isn’t just catching dangerous fugitives to keep the streets safe. In China, if you get caught jaywalking on camera you’ll be texted a fine. If you don’t pay the fine, your ‘social credit’ score will drop and you can be banned from public transportation, car rentals, airports, financing options, and making large purchases, among a myriad of other punishments.
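The penalty cascade described above can be sketched as a simple rule system. Everything here is invented for illustration: the threshold, the point penalty, and the list of banned services are assumptions standing in for rules that are not publicly specified.

```python
# Hypothetical sketch of the penalty cascade: an unpaid fine lowers the
# score, and a score below the threshold triggers service bans.
BAN_THRESHOLD = 600  # invented cutoff

def apply_unpaid_fine(score, penalty=50):
    """Drop a citizen's score when a fine goes unpaid (penalty is invented)."""
    return score - penalty

def banned_services(score):
    """Services denied once the score falls below the ban threshold."""
    if score < BAN_THRESHOLD:
        return ["public transit", "car rentals", "air travel",
                "financing", "large purchases"]
    return []

score = 620
score = apply_unpaid_fine(score)  # the jaywalking fine went unpaid
print(score)                      # 570
print(banned_services(score))     # every service on the list
```

The point of the sketch is how mechanical the escalation is: one unpaid fine, applied automatically, is enough to flip a citizen from full access to a blanket ban.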

Imagine crossing the street in the wrong place and later finding that you’re unable to travel or own a home.

Worse, this isn’t a problem that only concerns fugitives. People, by and large, accept the fact they’re being recorded in public. But that’s probably because the general public doesn’t yet understand the potential ramifications of feeding live video to AI.

In the US and Europe we’re beginning to see that there are dangers in giving any entity unchecked access to personal data, thanks to interference in the US presidential election and Brexit votes.

But the Cambridge Analytica scandal is just the tip of the iceberg when it comes to the dangers we could face when bad actors use our data. In China, it’s more like a drop in the Pacific Ocean.

If Facebook giving away your friends’ information to Cambridge Analytica bothers you – data which, for the most part, had freely been made public to begin with – then the idea of any government having total freedom over its citizens’ personal data should terrify you.
