It was a shock for the tech community when news broke in January that an independent developer's app shipped with preloaded personal data and could match photos of people to their online images: an invention that might eradicate anonymity. Raising legitimate fears, the news called into question the whole future of personal data.

Clearview AI, a tiny team of software developers, designed a tool that takes an uploaded photo of a person and searches the web – from news websites to social networks – for matching images. Scraping data from Facebook, YouTube, Venmo, and dozens of other websites produced a bulk of personal data that the developer distributes as part of the service.

The story affected me. First off, using photographs in such a manner breaks a good deal of laws and the end-user license agreements of social media platforms. A line was crossed – one that even large companies had so far respected on ethical grounds. Developing solutions based on facial recognition technology and artificial intelligence (AI) myself, I am well aware of how sensitive this topic is, and that the public considers some aspects of AI another Pandora's Box you'd rather not even touch. We all know the reason: if such a tool falls into the wrong hands, the consequences can be frightening.

So let's talk about how to handle the "Pandora's Box" part of AI.

Pandora’s Box and large companies

Let's take Google's reverse image search, for example, and play cops. The first thing we need is a photo. Let's take your LinkedIn profile photo, the one where you shine your brightest smile and are dressed for success. We open the Images tab, upload the photo, and use the Search by Image feature. Now all we have to do is wait for the algorithms to reveal their secrets, expecting them to dig into ten-year-old profiles and find that stupid picture (probably in some old friend's album) of you sneezing and looking anything but smart… But no. Instead of your old weekend photos, Google returns visually similar images – in our case, plenty of business faces looking to the future with confidence, none of which are yours. Why? Because the company doesn't provide AI-powered features that jeopardize its users' personal data. There are certain taboos and red lines not to be crossed, and this practice should not be the exception – it should be governed by law.
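To make the distinction concrete, here is a toy sketch of visual-similarity search in Python. The crawled index and the embeddings (produced by some off-the-shelf image model) are assumptions for illustration; this is not Google's actual pipeline, just the general nearest-neighbor idea:

    # Toy sketch: reverse image search as nearest-neighbor lookup over
    # generic image embeddings. Ranking by overall visual similarity
    # surfaces look-alike photos, not photos of the same person.
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def visually_similar(query_embedding, index, top_k=5):
        # index: list of (url, embedding) pairs for crawled images
        scored = [(url, cosine_similarity(query_embedding, emb))
                  for url, emb in index]
        scored.sort(key=lambda pair: pair[1], reverse=True)
        return scored[:top_k]  # look-alikes, ranked; identity is never checked

An identity search, by contrast, would compare face embeddings against a database of known people – exactly the capability a responsible vendor withholds from the public.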

Pandora’s Box and users

The next point is users – any device owner, basically, so this paragraph is about all of us. There are fitness trackers, smartphone apps, cookies, and even antivirus programs collecting your personal data. And nothing will stop six billion people from sharing, especially when the latest AI-based app is so good at editing photos of your friends and colleagues for yet another Instagram trend. It's all fun, but let's get to personal responsibility. I believe it's time for users to take it seriously. Are we ready to contribute our photos to further AI growth? Checking permission requests carefully before installing an app also makes sense.

Here comes the most interesting part:

Pandora’s Box and government

In my opinion, law enforcement agencies have every right to use AI-powered solutions for the sake of citizens' safety. Investigative work engages a variety of legal methods, so why not make use of AI, given its high performance?

First, the technology itself is not evil. Facial recognition and artificial intelligence are powerful tools that help catch offenders. However, you can't just distribute software packed with a preloaded bulk of personal data; a product like that tends to end up on the black market and stay there. NNTC stays strong and true to the light side. That is why we ship iFalcon Face Control with a blank database and never store or share any personal data. It is our customers who create and update their own databases. Retailers, for instance, accumulate data on shoplifters so they can watch them while they are shopping. Security companies integrate our solution with their databases of violators. In offices, the relevant database holds the employees who have signed personal data processing agreements. These examples show that iFalcon Face Control is a multipurpose tool easily adaptable to user needs, and that performing its prescribed function doesn't turn it into a ticking nuclear time bomb.
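To illustrate the blank-database approach, here is a minimal sketch built on the open-source face_recognition library. It is an illustration under stated assumptions, not iFalcon's actual code: the tool ships empty, and every watchlist entry is enrolled by the operator.

    # Minimal sketch of the blank-database pattern, using the open-source
    # face_recognition library as a stand-in (NOT iFalcon's actual code).
    import face_recognition

    class Watchlist:
        # Ships empty: the vendor preloads nothing.
        def __init__(self):
            self.encodings = []  # face embeddings enrolled by the customer
            self.labels = []     # customer-assigned identifiers

        def enroll(self, image_path, label):
            # The operator adds entries from their own, lawfully held photos.
            image = face_recognition.load_image_file(image_path)
            encodings = face_recognition.face_encodings(image)
            if not encodings:
                raise ValueError(f"no face found in {image_path}")
            self.encodings.append(encodings[0])
            self.labels.append(label)

        def match(self, image_path, tolerance=0.6):
            # Compare every face in the probe image against the watchlist.
            image = face_recognition.load_image_file(image_path)
            hits = []
            for probe in face_recognition.face_encodings(image):
                flags = face_recognition.compare_faces(self.encodings, probe, tolerance)
                hits.extend(label for label, flag in zip(self.labels, flags) if flag)
            return hits

A retailer's shoplifter list, a security firm's violator database, and an office's list of agreement signees are then just different sets of enroll() calls against the same empty core.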

Second, it's a good idea to keep some solutions with powerful engines out of the public domain, limiting their use to public service organizations. Compare it with access to weapons, and you'll see what I mean.

Pandora’s Box and independent developers

Alas, the frantic race for new solutions and income sources drives independent developers to release programs of this kind as if they were nothing serious. They hardly resemble Prometheus giving fire to humanity; instead, they play with fire and with Pandora's Box. There are two likely endings for such apps: the black market or the courtroom. Do today's developers feel personally responsible for their actions? And what will be the outcome of such reckless behavior?

NNTC takes responsibility for the solutions it develops, and, trust me, we are doing our best to make the world a better place. Some countries are already regulating new technologies successfully. The day will surely come when more countries join them with relevant law-making initiatives.

by Dmitry Doshaniy, NNTC General Manager