Surveillance AI with a thermal heat twist - another look at Athena Security, with COVID-19 in mind

Neil Raden, May 8, 2020
Summary:
The ethical questions raised by AI-powered surveillance are numerous. Athena Security has some thoughtful answers - but what happens when we extend those capabilities into thermal heat detection?

Woman with technology quotient hair © Kiselev Andrey Valerevich - shutterstock

In the very recent past, surveillance cameras merely recorded video. Some applications took the feed and analyzed it, but not in real time, and certainly not with spooky deep learning algorithms. That change is recent.

Using A.I. techniques to evaluate video as it occurs is, in my thinking, creepy, intrusive, and one more assault on our privacy. But a few months ago, I had the occasion to speak with Lisa Falzone, co-founder and CEO of Athena Security, and wrote about it in Facial recognition revisited - can it save lives and actually protect privacy?

Lisa explained to me that right after one of the mass school shootings, she came up with the idea to use A.I. to save people. The concept of A.I. saving lives isn't original, but in this case, it's actually in use. They developed a surveillance system that can identify a threat, especially one involving a gun, in three seconds or less with 99% accuracy. As I wrote:

Surveillance cameras are the first step. By employing dozens of professional actors to act out the numerous gestures of pulling a firearm, brandishing a weapon, and even, currently in beta, a knife, Athena Security was able to train their algorithms to pick up these threats in near-real-time. The system can also alert you to falls, accidents, and unwelcome visitors.

Despite the apparent benefit, it still struck me that any kind of facial recognition was fraught with ethical concerns:

The ethical saving grace is that facial recognition is not involved, only the gestures. And if there is a face, it is blurred. Athena advises clients to keep computing strictly on-premise, avoiding the cloud and big brother's grasp.

Lisa reassured me that they did not do facial recognition; their system was based on gesture analysis. I was a little leery at first, because the idea of any surveillance system that uses A.I. to evaluate streaming images makes me a little dyspeptic. Then I was fascinated by how they did it: they hired actors to enact a zillion scenarios with guns and used the recordings to train the system. So far, so good.

Besides, they DO NOT monetize the data. They flush everything within 48 hours (the inference data, that is; the video itself stays with the client). If faces appear, the system grays them out. They advise their clients to "air gap" the system; in other words, no internet connection. I was satisfied.

Recently, I was contacted by them again. This time, Athena Security had developed a thermal camera system that can detect people's temperatures. My mind started to race about all of the (bad) possibilities. Here is what they said:

 Athena Security can accurately detect fevers with its thermal A.I. security cameras:

  • accurate to within half a degree
  • detects 12 different points on the body
  • perfect for airports, grocery stores, hospitals
  • ethical - no facial recognition, no personal tracking

My first question was:

 "When Athena detects someone about to pull a gun, it's reasonable that you would take some action against that person. But if you are randomly scanning people for fevers, what is the ethical action to take?"

And the response:

Athena doesn't counsel customers on what to do in either event - guns or fevers - that's up to the hospital, airport, grocery market to act in the best interest of public health and safety.  As best practices are discovered and local authorities begin to explain the best steps for someone with a fever - we'll pass that along - but the general consensus if one has a fever is to call your doctor, seek medical help and get tested in a conscious way as not to come in contact with others.

Athena simply acts as an overlay to the existing system to promote more knowledge, funneling folks that show signs of fever to move swiftly to medical care and testing.

Let's deconstruct that answer. If the system is deployed in a conspicuous place and people are knowingly posing for the camera, that isn't troublesome. But what happens to the person in an airport, in line to vote, or in a grocery store? I have a hard time believing they would be given the gentle suggestion Athena proposes. I can't imagine a security guard saying:

"Excuse me, ma'am, you're running a 103-degree fever, and we'd rather you took these precautions for yourself and for the many people in this hospital, or grocery store, or wherever. This can help us avoid 2.2 million global deaths from COVID-19."

But being scanned anonymously, without consent, even if it has public health benefits, raises an ethical question that is not being addressed. In fact, I learned later that:

All tech can be redirected to perform evil or misleading and dastardly acts - just not the case with Athena and your factual reporting on their A.I. object detection tech and ethics details their pattern of thoughtful innovation - this is no different and yet another reason the co-founders came out of retirement to use their tech and scaling expertise to save lives - this time with a pandemic.

Now if the government purchased the tech and white-labeled it and there's no oversight - then sure - "who knows" what they could be up to.

My take

I'm not comfortable with this explanation, but I'm willing to give Athena Security a little of the benefit of the doubt. In general, public security and protection are needed, but there are questions about the ethics of digital surveillance. Governments point to the fight against terrorism, claiming the technology is not intrusive and that the benefit is worth it. My feeling is that surveillance technology has yet to prove its value in avoiding disasters, fighting crime, or advancing public health. What surveillance does do is cause injustice and eat up resources, while enhancing the potential for social and personal division and the invasion of privacy.
