What’s Real, What’s Not?

January 25, 2021

Old and new forms of deception trick targets into believing an interaction is something it is not. Today, these interactions increasingly take place in cyberspace as technology takes social engineering and mimicking techniques to new levels of sophistication, according to cyber analytics specialist CyberCube.

Cyber criminals are increasingly creating fake images of real business leaders, and the use of deep fake video and audio technologies could become a major cyber threat within the next two years, the firm says.

In its new report, Social Engineering: Blurring reality and fake, CyberCube says the ability to create realistic audio and video fakes using artificial intelligence and machine learning has grown. Recent technological advances and the increased dependence of businesses on video-based communication have accelerated these developments. Because of the increasing number of video and audio samples of business people now accessible online – in part due to the pandemic – cyber criminals have a large supply of data from which to build photo-realistic simulations of individuals, which can then be used to influence and manipulate people.

A technology known as “mouth mapping,” developed at the University of Washington, can also be used to mimic the movement of the human mouth during speech with extreme accuracy. This complements existing deep fake video and audio technologies.

The report’s author, Darren Thomson, is California-based CyberCube’s head of cyber security strategy. He says that as the availability of personal information increases online, criminals are investing in technology to exploit this trend.

“New and emerging social engineering techniques like deep fake video and audio will fundamentally change the cyber threat landscape and are becoming both technically feasible and economically viable for criminal organizations of all sizes,” he warns.

Thomson offered a few hypotheticals. “Imagine a scenario in which a video of Elon Musk giving insider trading tips goes viral – only it’s not the real Elon Musk. Or a politician announces a new policy in a video clip, but once again, it’s not real. We’ve already seen these deep fake videos used in political campaigns; it’s only a matter of time before criminals apply the same technique to businesses and wealthy private individuals.”

The CyberCube report also examined the growing use of traditional social engineering techniques – exploiting human vulnerabilities to gain access to personal information and protection systems. One facet of this is social profiling, which involves assembling the information needed to create a fake identity from material available online or from physical sources such as refuse or stolen medical records. According to the report, the blurring of domestic and business information technology systems created by the pandemic, combined with the growing use of online platforms, is making social engineering easier for criminals. The report warns insurers that there is little they can do to combat the development of deep fake technologies, but it stresses that risk selection will become increasingly important for cyber underwriters.
