With one new artificial intelligence capability after another entering the mainstream, it is tempting to treat them all the same. But some warrant more attention than others.
Consider deepfakes. Anyone can now use generative AI tools to create audio, images, or even live video that sounds or looks like specific people — and then request money transfers. As a result, there is a "huge" risk of such capabilities "breaking the trust and identity systems that our entire economy depends on," said Emily Cheo, CEO of Miami-based fintech Novo, speaking at the Fortune Most Powerful Women International Summit in Riyadh, Saudi Arabia, last week.
AI-powered fraud
She cited a case in Hong Kong last year in which a finance employee was deceived into transferring more than $25 million to fraudsters. The employee, though initially skeptical after receiving an email request for the money, was lured into a video call in which none of the attendees was real — though they looked and sounded like the company's UK-based chief financial officer and other colleagues.
A police official investigating the case told local media that while previous frauds had involved one-on-one video calls, "this time, in a multi-person video conference, it turns out that everyone you see is fake."
Yet as sophisticated as the AI technology behind such fraud has become, it is also relatively easy to access and use.
"The accessibility of these services has lowered the barrier to entry for cybercriminals — they no longer need special technological skill sets," David Fairman, chief security officer at security firm Netskope, told CNBC.
Arup, a UK-based engineering firm, later confirmed that it was the victim of the attack.
"Like many other businesses around the globe, our operations are subject to regular attacks, including invoice fraud, phishing scams, WhatsApp voice spoofing, and deepfakes," Arup CIO Rob Greig said in a statement. "This is an industry, business, and social issue, and I hope our experience can help raise awareness of the increasing sophistication and evolving techniques of bad actors."
An ongoing threat
Deloitte's Center for Financial Services recently weighed in on the issue, noting that generative AI is expected to significantly raise the threat of fraud, potentially costing banks and their customers up to $40 billion by 2027.
The Hong Kong incident shows that "we're going to be facing a world where our ability to trust and verify what's real — the trust system that commerce depends on, that fintech depends on — will be a real challenge," Cheo said.
That, of course, presents opportunities for companies that can develop effective solutions to the problem, "but it's not a solved situation yet," she added. "So it's something I'll be watching … even if you're outside of fintech."
This story was originally featured on Fortune.com