Artificial intelligence is increasingly used by fraudsters who post manipulated ads featuring famous people on social media. The victims are usually people investing in cryptocurrencies, among other ventures. According to experts, AI has the potential both to aid criminals and to hinder them. For instance, artificial intelligence enables identity verification. This is particularly important now, as we are on the cusp of a new version of the so-called ‘grandchild method’: thanks to AR overlays, a grandmother will soon see the actual face of her grandchild, speaking with his voice, in a messenger app. This rapidly developing technology will soon be widely used by criminals. That is why experts insist that serious emphasis be placed on education, and not only financial education.
Traps for investors
Fraudsters are increasingly using artificial intelligence in their activities. This is evidenced by examples from social media, where investment proposals, especially in the cryptocurrency market, appear. As Aleksander Łapiński from the Association of Security Experts of the Republic of Poland explains, the criminals’ scheme is very similar regardless of the platform. They use AI-based tools, without the consent of celebrities, to manipulate their image or voice. These technologies enable face swapping, voice cloning, and other manipulations at relatively low cost. From the manipulated material, they create advertising campaigns that direct users to professional-looking websites encouraging investment via a special application or a contact form.
– Social media, which offer broad possibilities for reaching potential victims, de facto facilitate such activities. For an inexperienced investor, the greatest danger is encountering heavily manipulated content that is difficult to distinguish from genuine offers. Moreover, augmented reality overlays, a rapidly developing technology, are just around the corner as the next stage of the so-called ‘grandchild method’, where a grandmother will see the actual face of her grandchild, speaking in his own voice, in a messaging app – comments prof. Dariusz Jemielniak, head of the Department of Management in a Networked Society at the Leon Koźmiński Academy.
In addition, prof. Krzysztof Piech from Lazarski University argues that easy access to AI is necessary, but every new, rapidly developing innovation attracts criminals, as was the case with cryptocurrencies and, earlier, the Internet. The law and law enforcement agencies usually lag behind at first, but eventually they catch up. The European Union is trying to regulate the AI field. Piech emphasizes that the problem with such advertising concerns not only Poles but also other nationalities. Generally, the less educated a society, the easier it is for fraudsters to operate. In terms of financial education, Poland has ranked among the last places in Europe for many years.
– We’ve helped people who fell victim to such investment scams recover their money. They admitted they were surprised by the fraudsters’ high level of professionalism in communication, language, and understanding of financial mechanisms – adds Aleksander Łapiński.
Double-edged sword
According to experts, AI has the potential both to aid fraudsters and to combat them. Aleksander Łapiński points out that artificial intelligence can make investment scams more credible, for instance by automating the generation of fake ads, websites, or emails that look very convincing. Moreover, advanced deepfakes featuring famous people are being created. He also points to the precision with which attacks can be personalized: AI can analyze personal data (for example, after a data leak) to create more tailored phishing campaigns. In addition, artificial intelligence can power chatbots that convincingly give false investment advice in real time.
– The development of AI-based tools capable of analyzing and detecting potential fraud is crucial for the future security of cryptocurrency investments. It is important to simultaneously develop protection and education systems that keep pace with the evolution of the technology. Unfortunately, social media platforms have zero interest in developing such tools. Moreover, they even block access to their content for scientists who could analyze the phenomenon and help counteract it – emphasizes prof. Dariusz Jemielniak.
As Jakub Martenka from Ari10 argues, artificial intelligence is already being used in identity verification. There are many programs that can assess whether a recording has been manipulated or whether a conversation is taking place with a real person. These solutions are very effective, so this will certainly be a growing trend.
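Tools of the kind Martenka describes typically boil down to running suspect material through a trained classifier. Below is a minimal Python sketch, assuming a pre-trained deepfake-detection model is available on the Hugging Face Hub; the model id, label names, and file name are placeholders, not recommendations of a specific product.

from transformers import pipeline

# Placeholder model id -- assumes an image-forgery/deepfake classifier
# published on the Hugging Face Hub; a real deployment would use a vetted model.
detector = pipeline("image-classification", model="example-org/deepfake-detector")

def looks_manipulated(frame_path: str, threshold: float = 0.8) -> bool:
    """Flag a photo or video frame as likely manipulated."""
    results = detector(frame_path)  # list of {"label": ..., "score": ...}
    fake_score = max(
        (r["score"] for r in results if "fake" in r["label"].lower()),
        default=0.0,
    )
    return fake_score >= threshold

if __name__ == "__main__":
    # Hypothetical input file: a frame grabbed from a suspicious investment ad.
    print(looks_manipulated("suspicious_ad_frame.jpg"))

In practice such a score is only one signal; verification services combine it with liveness checks and metadata analysis before deciding whether a recording or a live conversation is genuine.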
– Usually, the justice system is unable to keep up with a given market. Take the tracking of cryptocurrency addresses, for example. Do the Polish police have any tools of their own to trace “dirty cryptocurrencies”? No, although they could. Instead, they use American solutions that are not particularly adapted to our needs. And the same thing will happen with AI – points out prof. Krzysztof Piech.
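For illustration, tracing “dirty” coins usually starts with pulling an address’s transaction history from a public block explorer and following the outputs hop by hop. A rough Python sketch, assuming the public Blockstream Esplora API (endpoint layout varies between explorers, and the example address is a placeholder):

import requests

ESPLORA = "https://blockstream.info/api"  # assumed public Esplora instance

def outgoing_addresses(address: str) -> set[str]:
    """Collect addresses that received outputs in recent transactions involving `address`."""
    txs = requests.get(f"{ESPLORA}/address/{address}/txs", timeout=10).json()
    receivers = set()
    for tx in txs:
        for vout in tx.get("vout", []):
            dest = vout.get("scriptpubkey_address")
            if dest and dest != address:
                receivers.add(dest)
    return receivers

if __name__ == "__main__":
    # Real tracing tools repeat this hop by hop and cross-check the results
    # against known exchange, mixer, or sanctions address lists.
    print(outgoing_addresses("bc1q-example-placeholder-address"))

Commercial analytics platforms automate exactly this kind of graph traversal at scale, which is what Piech means by tools the Polish police currently have to source from abroad.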
In addition, Jakub Martenka notes that the number of so-called deepfakes is steadily increasing. Initially, these tools were fairly primitive, and their results differed significantly from the original recordings. Now, however, they pose a serious threat to the security of public figures because of the rising quality of the generated material. Ever more polished models make it genuinely difficult to distinguish truth from falsehood. Even though we see a person and hear their voice, we are often dealing with unauthorized use of their image and with manipulation. This means we are reaching the point where most of society, after watching manipulated material, will be unable to tell that it is fake.
Education and caution
Furthermore, Jakub Martenka emphasizes that we often make spontaneous purchasing decisions; those concerning investments should always be more deliberate and well thought out. Fraudsters can lure victims with various types of assets, not only cryptocurrencies. In the opinion of the Ari10 expert, there is a problem today with how cryptocurrencies are treated. Because of the many scams of this type, there is plenty of content suggesting losses. This signals potential danger, but there is little educational content, partly because such content simply does not get clicked.
– The problem is the lack of basic knowledge and financial hygiene among Poles. These are issues that need to be taught not only to children in schools. It is also necessary to run extensive information campaigns addressed to various social groups, using the access channels specific to them. For example, educational content can be woven into the soap operas that seniors willingly watch. The idea was tested several years ago in the United Kingdom – analyzes prof. Krzysztof Piech.
On the other hand, Aleksander Łapiński emphasizes that the scale of the problem with investment scams is huge, both in Poland and worldwide. Criminals exploit the naivety, or outright blindness, of victims, promising quick and easy profit without risk. The expert adds that in our country CERT Poland and UKNF have long been running campaigns aimed at educating users in this area. According to CERT Orange, fake-investment pages are among the most common types of websites that users are directed to as a result of such campaigns.
– Before deciding to send or spend money, you should think carefully and consult others. For example, there is a Facebook group where inquiries appear almost every day asking whether a given entity or project is a scam. Unfortunately, in more than 90% of cases it is already too late to help these people: they usually report only after the harm has been done, i.e. after transferring money to scammers. If you don’t know what cryptocurrencies are and how they work, and you have only heard about the possibility of profit, you had better refrain. It’s too early for you – summarizes prof. Krzysztof Piech.
Source: https://managerplus.pl/za-chwile-zrobi-sie-glosno-o-nowej-metodzie-na-wnuczka-wszystko-dzieki-ai-i-social-mediom-30949