In a matter of seconds, it is possible to hold your phone up to your face and see what you will look like in 40 years. Or you could fuse the image of your face with that of a celebrity. You could even record a birthday song for a friend in the voice of their favourite artist. With deepfake technology, it is now simple to edit a person’s facial and vocal likeness with alarming accuracy. For the most part, this can be seen as harmless entertainment. But what if your likeness was used to drain your savings or commit fraud? As the technology to create deepfakes becomes easier and cheaper, the need to guard against these cybercrimes has come to the forefront.
A deepfake is a video, image, or audio recording that has been distorted, manipulated, or synthetically created using deep learning techniques to present an individual, or a hybrid of several people, saying or doing something they did not say or do. Deepfakes are often used in digital injection attacks: sophisticated, highly scalable, and replicable cyberattacks that bypass the camera on a device or are injected directly into a data stream.
Murray Collyer, Chief Operating Officer of iiDENTIFii, says, “Digital injection attacks present the highest threat to financial services, as the AI technology behind them is affordable and the attacks are rapidly scalable. In fact, a recent digital security report by our technology partner, iProov, illustrates how, in an indiscriminate attempt to bypass an organisation’s security systems, some 200 to 300 attacks were launched globally from the same location within a 24-hour period. As more and more South Africans embrace digital banking, deepfake technology is a serious threat.”
Recent research by Discovery Bank and Boston Consulting Group (BCG) into the future of retail banking in South Africa found that most South Africans (86%) are ready to do all their banking digitally, particularly via an app. Much of this growth comes from previously unbanked people entering the digital economy, and the Covid-19 pandemic accelerated the trend further.
As more South Africans set up digital accounts and bank online, financial crime and cybercrime have become inextricably linked. Interpol reports that financial and cybercrimes are the world’s leading crime threats and are projected to increase the most.
Collyer adds, “Deepfake technology is one of the most rapidly growing threats within financial services, yet not all verification technologies are resilient to it. Password-based systems, for example, are highly susceptible to fraud. South Africa needs to strengthen its technology to outwit cybercriminals.”
While deepfakes are a severe threat, the technology and processes exist to safeguard financial services companies against this method of fraud.
A growing share of face biometric technology incorporates some form of liveness check, such as wink-and-blink prompts, to verify and authenticate customers. Liveness detection uses biometric technology to determine whether the individual presenting is a real human being rather than a presented artefact. This means the technology can detect a deepfake that is played back on a device and held up to the camera.
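As a rough illustration of how an active blink check can work, the sketch below computes an eye aspect ratio (EAR) from facial landmarks and flags a blink when the eye stays closed for a few consecutive frames. This is a minimal sketch only: it assumes landmark coordinates have already been extracted by a face-tracking library, and the six-point eye layout and thresholds are illustrative rather than any vendor’s implementation.

```python
import math

# Illustrative only: assumes six (x, y) landmarks per eye, ordered around the
# eye contour as in the common 68-point facial landmark convention (p1..p6).
def eye_aspect_ratio(eye):
    """Ratio of eye height to width; it drops sharply when the eye closes."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical_1 = math.dist(p2, p6)
    vertical_2 = math.dist(p3, p5)
    horizontal = math.dist(p1, p4)
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

def detect_blink(ear_per_frame, closed_threshold=0.2, min_closed_frames=2):
    """Return True if the EAR sequence contains at least one plausible blink."""
    closed = 0
    for ear in ear_per_frame:
        if ear < closed_threshold:
            closed += 1
            if closed >= min_closed_frames:
                return True
        else:
            closed = 0
    return False
```

A replayed recording that never blinks when prompted would fail a check like this, but a sufficiently capable deepfake can synthesise a blink on cue, which is why the distinction drawn below between presentation attacks and digital injection attacks matters.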
While many liveness detection technologies can determine whether someone is committing fraud by holding a physical artefact (for example, a printed picture or mask of the person transacting) up to the camera, many solutions cannot detect digital injection attacks, which never pass through the camera at all.
Collyer says, “Specialised technology is required to combat deepfakes. At iiDENTIFii, we have seen success with sophisticated yet accessible 4D liveness technology, which includes a timestamp and is further verified through a three-step process in which the user’s selfie and ID document data are checked against the relevant government databases. This enables us to accurately authenticate someone’s identity.”
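To make the shape of that three-step flow concrete, here is a hedged sketch of how such an orchestration could look in Python: a liveness check on the selfie, a face match between the selfie and the ID document, and a lookup against an authoritative registry. All function names, parameters, and data structures here are hypothetical placeholders, not iiDENTIFii’s actual API.

```python
from dataclasses import dataclass

@dataclass
class VerificationResult:
    liveness_passed: bool
    selfie_matches_document: bool
    document_matches_registry: bool

    @property
    def verified(self) -> bool:
        # Identity is confirmed only when all three steps succeed.
        return (self.liveness_passed
                and self.selfie_matches_document
                and self.document_matches_registry)

def verify_identity(selfie_frames, id_document_image, id_number,
                    liveness_check, face_match, registry_lookup):
    """Orchestrates the three-step flow described above:
    1. confirm the selfie frames come from a live person,
    2. match that person's face to the photo on the ID document,
    3. confirm the document details against an authoritative record.
    The three callables are hypothetical stand-ins for a biometric SDK
    and a government registry client."""
    liveness_passed = liveness_check(selfie_frames)
    selfie_matches_document = (liveness_passed
                               and face_match(selfie_frames[-1], id_document_image))
    document_matches_registry = (selfie_matches_document
                                 and registry_lookup(id_number) is not None)
    return VerificationResult(liveness_passed,
                              selfie_matches_document,
                              document_matches_registry)
```

The design point is that each step gates the next: a failed liveness check stops the process before any document matching or database lookup takes place.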
“With the right technology, it is possible not only to protect consumers and businesses against deepfake financial crimes but also to create a user experience that is simple, accessible and safe for all,” Collyer concludes.