A convenience store in Tacoma has installed a facial recognition security system to deny customers entry unless they’re approved by an AI. This news has likely been well-received by the city’s discrimination attorneys.

State-of-the-art facial recognition sucks. AI simply isn’t good at recognizing faces unless they’re white. This simple fact has been acknowledged by academics, experts, and the biggest technology companies on the planet. We’ve dedicated significant coverage to the danger facial recognition technology poses to people of color, as have many of our peers.

But, for whatever reason, Blue Line Technologies — the company responsible for the convenience store system — thinks it’s got it figured out. According to the Seattle Times, a spokesperson for the Missouri-based technology company said its software “has never misidentified anyone.”

This makes it seem like either the company is leaps and bounds ahead of Google and Amazon in the area of facial recognition, or its software hasn’t identified very many people.

We’re still seeking details on exactly how Blue Line’s AI works, but the gist when it comes to facial recognition technology is that it compares the pixels in images of one face against a database of others to see if any match. As mentioned, cutting-edge AI struggles to tell the difference between non-white faces, making its use ethically questionable in any situation where discrimination is a concern.
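To give a rough sense of how this kind of matching typically works, here’s a minimal sketch in Python. It assumes face images have already been converted into numerical embeddings by some face-recognition model; the names, data, and similarity threshold are purely illustrative and are not based on Blue Line’s actual software.

```python
import numpy as np

def cosine_similarity(a, b):
    # Compare two face embeddings; values near 1.0 mean "very similar".
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_match(probe, database, threshold=0.8):
    # Return the name of the closest enrolled face, or None if nothing clears the threshold.
    best_name, best_score = None, threshold
    for name, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical usage: 128-dimensional embeddings produced by some upstream model.
database = {"enrolled_customer": np.random.rand(128)}
probe = np.random.rand(128)
print(find_match(probe, database))
```

In a system like this, the threshold is a tuning knob: set it too loose and strangers get waved through as matches, set it too tight and legitimate customers get rejected — and if the underlying model produces less distinct embeddings for non-white faces, both kinds of errors hit those customers harder.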

The store in question, Jackson’s Food Store in Tacoma, appears to be aware of the privacy concerns surrounding the use of such products. It issued a statement assuring the community it won’t sell or share the data, but didn’t address the technology’s problems recognizing non-white faces.

When we spoke with Brian Brackeen, the CEO of facial recognition company Kairos, he told us without hesitation that he believed the technology wasn’t ready for public-facing use cases:

Imperfect algorithms, non-diverse training data, and poorly designed implementations dramatically increase the chance for questionable outcomes. Surveillance use cases, such as face recognition enabled body-cams, ask too much of today’s algorithms. They cannot provide even adequate answers to the challenges presented by applying it in the real world. And that’s before we even get into the ethical side of the argument.

Customers at Jackson’s, it appears, will have to get used to a paradigm where the color of their face could play a role in whether they’ll be allowed inside the store or not.

We reached out to Jackson’s Food Stores and Blue Line Technologies but didn’t receive an immediate response. We’ll update this article if we do.
