Anonymous hacker in front of his computer. | iStock photo

Fraud detection specialists in short supply as industry plans hiring push

While some fintechs are shedding jobs, one area where they remain short-staffed is fraud detection, according to a new survey from OCR Labs Global and FINTRAIL.

A copy of Fighting Identity Fraud in an Economic Downturn is available here. More than 50 global members of FINTRAIL’s Fintech FinCrime Exchange responded. The worldwide network collaborates on best practices in financial crime risk management.

Overall, 81% expect all fraud types to increase in 2023. Close to 60% saw a fraud spike in 2022, and a similar percentage said both the amount of money stolen and the volume of incidents increased.

Third-party fraud is on the rise. That is troubling because identity theft, account takeovers, and social engineering scams are the most difficult types to identify or prevent; 60% of respondents said they are also the most common tactics they see.

Origins of the fraud specialist shortage

Roughly 90% of fintechs plan to recruit more fraud prevention experts as they expect scams to soar in 2023. Close to 70% plan to spend more on fraud prevention management, including protection against deep fakes. Between 38% and 42% use fraud prevention software, AI and machine learning solutions, and digital identity verification (IDV) software.

While companies may have over-hired and the markets are now correcting, they are consistently hiring for their safety teams, OCR Labs Global CMO Loc Nguyen said. Layoffs tend to hit areas that attract new business, while hiring continues for roles that protect existing business.

He added that academia has not produced enough graduates in critical areas for more than a decade. Ten or 15 years ago, data science was an emerging, understaffed field. Today most people understand the job, but there is still a shortage. Nguyen sees the same trend repeating with fraud.

How machine vision and other technologies fight fraud

The nature of the work is changing thanks to advances in technology. Algorithms now address much of the heavy lifting previously done by humans. In some areas, like machine vision and chat, machines do it better than humans.

And machine vision still has room to grow, Nguyen said. He considers SightGPT the next advancement.

“That’s been the problem; we haven’t been able to do it fast enough,” Nguyen said. “ChatGPT existed (before), but it couldn’t do it fast enough. If we had to wait 15 minutes for a response, no one would use that. 

“What we’re doing with vision is being able to process in the seconds range. That’s been the huge breakthrough (because) humans can’t process that fast.”


Combatting bias in fraud technology

As digital identity systems proliferate, flaws in some designs are emerging. Poorly designed algorithms can simply replicate human flaws at digital speed because they reflect the biases of their designers.

Those biases include favoring white and male faces. Studies show these technologies can struggle with women, people of color, and people from non-Western cultures and ethnic backgrounds.

“Whether it’s in the workforce or our business, diversity matters because the systems that recognize diverse people are trained by datasets,” Nguyen said. “If you only show them specific colors, ages, genders, and races, the machine only learns them. 

“The idea in ChatGPT, that large language model where they expose them to lots and lots of different languages… with visuals, it’s hard to get that right because how do you get large data sets of people’s faces?”

The industry combats these biases by investing in technologies related to liveness and identity, Nguyen said. They know machine-based biases built into systems hurt the user experience. Businesses are demanding better solutions because they believe technology does not consistently deliver an unbiased experience.

Industry collaboration is a powerful fraud-fighting tool

New technologies allow businesses to collaborate in the fight against financial crime, Loc Nguyen said.

Businesses collaborate more today, Nguyen noted. Historically, criminals have worked in groups while businesses operate alone for fear of revealing secrets to competitors.

This shift is driven by technology in two ways. Some technologies, like deep fakes, have evolved to the point where they are readily accessible and easy to use; would-be fraudsters can find them in open libraries, apps, and platforms.

Many technologies are available that allow companies to collaborate safely. They need no longer fear spilling any secrets.

Nguyen explained that some of those solutions involve using machine vision to spot deep fakes. They can detect light reflections inconsistent with how live images respond. Technologies can detect micromovements under the skin by slowing down frame rates.
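The production detectors Nguyen describes are proprietary, but the frame-based idea can be illustrated. Below is a minimal, hypothetical sketch in Python (NumPy only): it measures frame-to-frame pixel variation in a short grayscale video clip and flags sequences whose "motion energy" falls outside a plausible band for a live subject. The function names and thresholds are illustrative assumptions, not OCR Labs' actual method.

```python
import numpy as np

def motion_energy(frames):
    """Mean absolute pixel change between consecutive grayscale frames
    (each frame is a 2-D NumPy array)."""
    diffs = [np.abs(b.astype(float) - a.astype(float)).mean()
             for a, b in zip(frames, frames[1:])]
    return float(np.mean(diffs))

def looks_live(frames, min_energy=0.5, max_energy=20.0):
    """Heuristic liveness check: a real face shows small natural
    micromovements between frames. Near-zero variation suggests a
    static replay; very large variation suggests a spliced feed.
    Thresholds here are illustrative, not calibrated."""
    energy = motion_energy(frames)
    return min_energy < energy < max_energy
```

A static image replayed to the camera produces zero motion energy and fails the check, while frames with small natural variation pass. Real systems would add the reflection and skin-movement cues Nguyen mentions on top of this kind of signal.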

The technologies supporting deep fakes can also be used for good, Nguyen said. Synthetic media is deployed to train AI systems to combat biases and remove them from systems.

Staying ahead of fraud rings

Identity theft, account takeovers, and social engineering scams are easy entry points. Scammers impersonate people, so businesses need verification.

But over the last five years, fraudsters developed the ability to fool those systems. Some techniques include intercepting the camera signal and substituting an image.

“We don’t use the camera off the laptop,” Nguyen said. “We only use the mobile phone’s camera to secure that communication. It’s the same reason Apple lets you unlock your phone with your face but doesn’t let you do that with their Mac.”

Scams are increasingly sophisticated, Nguyen added. Whereas in the past a criminal would immediately try to exploit a piece of stolen data, they now aggregate it and can sit on it for five years.

Building shared intelligence capacity helps. Machines can remember images tied to fraudulent activity that first surfaced five years ago. Nguyen likens it to police forces worldwide sharing their knowledge through Interpol.
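The article does not detail the sharing mechanism. One common way to pool image intelligence without exchanging the images themselves is a perceptual hash: each party shares only a compact fingerprint of known-fraud images, and new applicants' photos are compared by Hamming distance. This is a hedged sketch of a simple average hash in Python (NumPy only); it is an illustrative assumption, not the scheme Nguyen describes.

```python
import numpy as np

def average_hash(img, size=8):
    """Compute a 64-bit perceptual fingerprint of a grayscale image
    (2-D array): block-average down to size x size, then set each bit
    by comparing its block mean to the overall mean."""
    h, w = img.shape
    # crop so the image divides evenly into size x size blocks
    img = img[: h - h % size, : w - w % size].astype(float)
    bh, bw = img.shape[0] // size, img.shape[1] // size
    blocks = img.reshape(size, bh, size, bw).mean(axis=(1, 3))
    return (blocks > blocks.mean()).ravel()

def hamming(hash_a, hash_b):
    """Number of differing bits; small distances indicate near-duplicate
    images even after recompression or mild edits."""
    return int(np.count_nonzero(hash_a != hash_b))
```

Because the hash is robust to small pixel-level changes, a brightness-shifted copy of a flagged image still matches exactly, while a genuinely different image lands far away; competitors can compare fingerprints without ever seeing each other's underlying photos.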

This matters for lending and credit because scammers try to exploit real-time decisioning systems. Businesses are incentivized to catch them before they repeat the scheme across multiple brands.

“Allowing machines to let companies who normally compete collaborate without exposing each other’s private information… we have machine learning and infrastructure to do that,” Nguyen said. “That’s how we stay ahead.”

  • Tony Zerucha

    Tony is a long-time contributor in the fintech and alt-fi spaces. A two-time LendIt Journalist of the Year nominee and winner in 2018, Tony has written more than 2,000 original articles on the blockchain, peer-to-peer lending, crowdfunding, and emerging technologies over the past seven years. He has hosted panels at LendIt, the CfPA Summit, and DECENT's Unchained, a blockchain exposition in Hong Kong. Email Tony here.