AI illiteracy

FaceApp has shown just how easy it is for us to drop our guard

Ian Hawkins

[Image: portrait of an elderly lady]

Have you updated your social media profile with a FaceApp picture?

This is the software that digitally manipulates your selfie to give you an idea of what you'll look like in later life, if looking at your own parents is not high-tech enough for you. 

If you want to keep up with the crowd (and what a crowd it is: over 100 million downloads on Google Play, and the top ranking on the iOS App Store) you'd better get a move on; the bad news is already out about the terms and conditions.

Those of us in the business of creating content are sensitive about having our intellectual property taken from us, but when that 'intellectual property' is your own face, you might want to look more closely at the terms and conditions that most people barely bother to scroll through before clicking the 'next' button: 

You grant FaceApp a perpetual, irrevocable, nonexclusive, royalty-free, worldwide, fully-paid, transferable sub-licensable license to use, reproduce, modify, adapt, publish, translate, create derivative works from, distribute, publicly perform and display your User Content and any name, username or likeness provided in connection with your User Content in all media formats and channels now known or later developed, without compensation to you. When you post or otherwise share User Content on or through our Services, you understand that your User Content and any associated information (such as your [username], location or profile photo) will be visible to the public. 


In other words, if they want to post your photo on a billboard or in a TV commercial, with your name - or something unflattering - underneath it, they can. They can sell it to whoever they like. Or create a deep fake, indistinguishable from the real you. 

Ask a stranger for their bank details, or whether you can take a look at their passport. Whether or not they went through last year's GDPR roll-out, they're likely to tell you where to go (and you won't need a passport of your own). Yet even as we unlock our mobile phones with our faces, biometric data doesn't feel like, well, data.

FaceApp's users may well be engaged in nothing more sinister than making funny photos, but those terms of use are pretty comprehensive. Your friends' data points may well be inside a computer right now, teaching it facial recognition, or waiting to resurface as a deep fake. Where does PEX Network fit into this?

The disquieting part of all this is that so many people have given up so much of their data so willingly. Biometric data was, it was hoped, impossible to copy, but it's also nearly impossible to hide. The trick is not to get a picture of your face; the trick is to get you to waive your rights to it. 


While we argue that AI and automation are job creators with the potential to address problems from politics to the polar ice caps, the public's hazy perception of the technology - its potential, its dangers and, on the strength of this, what it actually looks like - shows how vulnerable we are to misuse of data that doesn't look like data.

There's no shame in admitting to having been fooled by deep fakes (guilty); but we should all be concerned that AI is running away from our understanding of what it is.

The consequences of inaction are that we will live in a world without truth. Some may say this has already happened. And how would we know, anyway?

What do you think? Comment on LinkedIn.