Face ID: What it can be, and what it cannot

Face ID is, alongside the notch, the most prominent feature of the upcoming iPhone X. And as always when a new authentication method comes onto the market, it is eyed critically. Some critics, in relative ignorance of how Face ID actually works, bend the story into very gloomy scenarios of a new threat, complete with historical warnings about a resurgence of the inhumane tradition of physiognomy. That makes details like the ones Craig Federighi shared in an interview with TechCrunch about how Face ID works particularly important for bringing the discussion down to a somewhat cooler level.

What do we learn? For one thing, Apple has trained the AI behind Face ID on a very large pool of deliberately diverse data (faces), which will hopefully rule out the classic failures, such as poor recognition of especially dark faces. More importantly, none of the data Face ID generates ever leaves the device, migrates to the cloud, or is handed directly to other apps. As with Touch ID, this also ensures that Apple cannot hand this data over to the authorities, because it simply has no access to it. Unfortunately, there is no way to secure the iPhone X with Face ID plus an additional passcode, but at least Apple has had that discussion and is thinking about it.
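This separation is also how third-party apps integrate with Face ID without ever seeing biometric data: through the LocalAuthentication framework an app merely asks the system to authenticate the user and gets back a success or failure, nothing more. A minimal Swift sketch (the function name and reason string are just illustrative):

```swift
import LocalAuthentication

func unlockSensitiveContent() {
    let context = LAContext()
    var error: NSError?

    // Check whether biometric authentication (Face ID or Touch ID) is available at all.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) else {
        print("Biometrics unavailable: \(error?.localizedDescription ?? "unknown error")")
        return
    }

    // The system performs the Face ID scan; the app only receives a boolean result.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, evaluationError in
        if success {
            print("Authenticated – no face data was ever exposed to the app.")
        } else {
            print("Authentication failed: \(evaluationError?.localizedDescription ?? "unknown error")")
        }
    }
}
```

The app never learns anything about the face itself; the scan, the comparison, and the stored template all stay on the system side.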

The option of tapping the power button five times to deactivate Face ID apparently no longer exists, but you can now hold the side button and a volume button a little longer, which gets you to the same place. There are other situations in which Face ID is disabled and the passcode is required instead: after a reboot, after Face ID has not been used for 48 hours, or after five failed attempts to unlock the iPhone X with a non-matching face (as happened at the keynote). Things get a bit vague here, because apparently a passcode is also requested after four hours if Face ID has not been used to unlock the device in that time.
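Condensed into code, the fallback rules as described above might look roughly like this. This is purely an illustrative sketch, not Apple's implementation; all names are invented, and the vaguer four-hour rule is left out because its exact conditions are unclear:

```swift
// Illustrative only: conditions under which Face ID is bypassed and the passcode
// is required, as described in the interview. The real logic lives in the Secure Enclave.
struct DeviceState {
    let justRebooted: Bool
    let hoursSinceLastFaceIDUnlock: Double
    let consecutiveFailedFaceMatches: Int
    let buttonLockoutTriggered: Bool   // side button + a volume button held down
}

func passcodeRequired(_ state: DeviceState) -> Bool {
    if state.justRebooted { return true }                        // after every reboot
    if state.buttonLockoutTriggered { return true }              // manual lockout via the buttons
    if state.consecutiveFailedFaceMatches >= 5 { return true }   // five non-matching faces
    if state.hoursSinceLastFaceIDUnlock >= 48 { return true }    // Face ID unused for 48 hours
    return false
}
```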

Apple is soon to release a document explaining in "extreme" detail how the various sensors and cameras work together, but a few practical tips can already be found in the interview. Sunglasses can limit Face ID's functionality, but only if they strongly filter out IR light. Since Face ID needs eyes, nose, and mouth, niqabs are of course problematic, but helmets and scarves should not be an issue, as long as those parts of the face remain visible. Masks (as worn by doctors or as respiratory protection) naturally prevent Face ID recognition as well.

For people who cannot really look directly at their phone, there is still a way to use Face ID, because the "attention" feature can be disabled. However, this also means reduced security. Face ID should work from most angles at which a selfie would be possible, but recognition at particularly oblique angles may be somewhat problematic.

If you trust that the Secure Enclave, in which this Face ID data is processed, is really secure (which one should never take for granted, although Apple has a relatively good reputation in this regard), then the main problem with the data produced here is actually the part that is made accessible to third-party software.

There is a "depth map" of faces that is passed on from the Face ID hardware to ARKit and thus to other applications, and not only from the front camera but also from the rear camera. It is by no means as detailed as the data Face ID itself works with, but, as the introduction of new App Store rules restricting this data has already shown, it is not paranoia to fear that facial data could be collected whose use might turn out to be rather eerie.
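For a sense of what such access looks like in practice, here is a minimal ARKit sketch, assuming iOS 11 and a device with the TrueDepth camera: an app starts a face-tracking session and receives a coarse 3D mesh of the user's face via ARFaceAnchor, far less detailed than the Face ID data in the Secure Enclave, but face geometry nonetheless.

```swift
import UIKit
import ARKit

final class FaceMeshViewController: UIViewController, ARSessionDelegate {
    private let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        // Face tracking requires the TrueDepth camera (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // A coarse triangle mesh of the user's face, updated continuously.
            let geometry = faceAnchor.geometry
            print("Face mesh with \(geometry.vertices.count) vertices")
        }
    }
}
```

The crucial point: an app obtains this mesh simply by requesting camera access; no Face ID enrolment and no Secure Enclave involvement are required.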

So far, however, it is not clear exactly how detailed these depth maps are. The fact that Apple is already worried, even before the launch of ARKit and Face ID, that apps could use this data for authentication (which they officially are no longer allowed to do) should give us some pause.

Given that biometric facial data can be captured whenever you show your face in public, or is carried around in, say, a hackable passport, the real concern lies less with systems like Face ID, which at least do a great deal to keep this technology secure, and more with the everyday use of face detection in camera apps of all kinds. Every Snapchat mask, makeup app, AR face filter, or face-morphing app is based on more or less good biometric data, handed over voluntarily in every moment of fun without anyone knowing what is really done with it.

Moving the "best" of such data into locked security areas on the smartphone, with regulated access to only certain data (the "depth map"), is probably an additional safety factor in this context. Still, we are curious what hackers will find out in the coming months about the sensors and cameras involved, and about how Face ID works. Disasters can never be ruled out.

Dreamtale
