Rick Holland, VP Strategy at Digital Shadows, discusses how research indicates that criminals are increasingly exploiting the growth of ‘fake news’ for commercial gain.
The growth of so-called fake news has traditionally been associated with the political sphere and ideological motives. However, research conducted by analysts at Digital Shadows reveals a growing market of toolkits and services – available for as little as $7 – aimed specifically at causing financial and reputational damage to companies.
In particular, Digital Shadows has noticed a recent trend towards so-called ‘Pump and Dump’ services on the dark web. These work by gradually buying up large holdings of altcoins (cryptocurrencies other than Bitcoin) and drumming up interest in the coin through posts on social media. The tool then trades the coins between its own multiple accounts, driving the price up, before selling them to unsuspecting traders on cryptocurrency exchanges who are looking to buy while the price is still rising. An analysis of the Bitcoin wallet of one such popular service showed that it received the equivalent of $326,000 from wannabe criminals in less than two months.
Similarly, Digital Shadows identified more than ten services that allow users to download software that controls the activities of social media bots. One offers users the downloadable software for a trial period costing just $7. Other tools claim to promote content across hundreds of thousands of platforms, including forums, blogs and bulletin boards. They work by controlling large numbers of bots: armies of computers under the operator’s control, which can be configured to post on specific types of forums on different topics. Mentions of these sites across criminal forums give an indication of their popularity – they have more than tripled in two years, from 418 in 2015 to 1,381 in 2017 so far.
The battle against fake news could become even more difficult: advertisements for these toolkits increasingly claim to include built-in features for bypassing CAPTCHA checks, which were originally introduced to prevent bots and automated scripts from posting indiscriminately across these platforms.
Unsurprisingly, media organisations are a particular target for purveyors of fake news. Digital Shadows analysed the top 40 global news websites and checked over 85,000 possible variations of their domains, discovering some 2,858 live spoof domains. Simply by altering characters in a domain (for example, changing an “m” to an “rn”) and using cloning services, it is possible to create a realistic-looking fake website for a legitimate news organisation.
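To illustrate the kind of domain-variation check described above, the sketch below generates simple character-substitution variants of a domain (such as swapping an “m” for an “rn”) and flags any that currently resolve in DNS. The substitution table, helper names and example domain are illustrative assumptions, not Digital Shadows’ actual methodology or tooling.

```python
import socket

# Illustrative character substitutions; a real monitoring tool would use a
# much larger table of homoglyphs, typos and alternative TLDs.
SUBSTITUTIONS = {
    "m": ["rn", "nn"],
    "l": ["1"],
    "o": ["0"],
    "w": ["vv"],
}

def spoof_candidates(domain):
    """Yield variants of `domain` with one character replaced by a look-alike."""
    name, dot, tld = domain.partition(".")
    for i, ch in enumerate(name):
        for sub in SUBSTITUTIONS.get(ch, []):
            yield f"{name[:i]}{sub}{name[i + 1:]}{dot}{tld}"

def live_spoofs(domain):
    """Return the candidate spoof domains that currently resolve in DNS."""
    live = []
    for candidate in spoof_candidates(domain):
        try:
            socket.gethostbyname(candidate)
            live.append(candidate)
        except socket.gaierror:
            pass  # candidate does not resolve; ignore it
    return live

if __name__ == "__main__":
    # "examplenews.com" is a placeholder, not one of the 40 sites in the research.
    for spoof in live_spoofs("examplenews.com"):
        print("live spoof domain:", spoof)
```

A production check would also compare page content against the legitimate site and monitor new domain registrations continuously rather than testing DNS on demand.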
Retailers are also a target. One managed service offers ‘Amazon ranking, reviews, votes, listing optimization and selling promotions’, with prices ranging from $5 for an unverified review and $10 for a verified review up to a $500 monthly retainer.
“The sheer availability of tools means that barriers to entry are lower than ever, and this activity now extends beyond geopolitical interests to financial interests that affect businesses and consumers”, said Rick Holland, VP Strategy, Digital Shadows. “Of course, rumours, misinformation and fake news have always been part of human society. What has changed in the digital world is the speed at which such techniques spread, and the fact that tools are freely available on the dark and surface web, allowing anyone wanting to carry out these sorts of campaigns to do so with ease by locating and using the services and tools they need online.”
Digital Shadows has issued the following advice for firms looking to combat disinformation:
- Combat domain spoofing – organisations should proactively monitor for the registration of malicious domains and have a defined process for dealing with infringements when they occur. An agile and scalable takedown capability is critical for combating domain spoofing
- Combat the ‘bots’ – monitor social media for brand mentions and seek to detect bot accounts. This is not always immediately obvious, but there are often clues, such as the age of the account, the content being posted, and the number of friends and followers (see the scoring sketch after this list)
- Monitor forums for information that could manipulate the share price – organisations should search for mentions of their brand or staff across forums, which could reveal malicious actors spreading disinformation
- Keep an eye on trending activity – monitor trending topics as they relate to the organisation’s digital footprint, which can help identify disinformation activity early
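The clues mentioned in the second recommendation (account age, posting behaviour and friend/follower counts) can be combined into a simple heuristic score. The sketch below is a minimal illustration assuming a generic account record; the field names, weights and thresholds are assumptions for illustration only, not a Digital Shadows detection method.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Account:
    """A generic social media account record (hypothetical fields)."""
    created_at: datetime
    followers: int
    following: int
    posts_per_day: float
    duplicate_post_ratio: float  # share of posts that are near-identical

def bot_score(account: Account, now: Optional[datetime] = None) -> float:
    """Return a rough 0-1 score; higher means more bot-like.
    Weights and thresholds are illustrative assumptions only."""
    now = now or datetime.now(timezone.utc)
    age_days = (now - account.created_at).days
    score = 0.0
    if age_days < 30:                        # very new account
        score += 0.3
    if account.followers < 10 and account.following > 500:
        score += 0.3                         # follows many, followed by few
    if account.posts_per_day > 50:           # implausibly high posting rate
        score += 0.2
    if account.duplicate_post_ratio > 0.5:   # mostly repeated content
        score += 0.2
    return min(score, 1.0)

if __name__ == "__main__":
    suspect = Account(
        created_at=datetime(2017, 10, 1, tzinfo=timezone.utc),
        followers=3,
        following=1200,
        posts_per_day=80.0,
        duplicate_post_ratio=0.7,
    )
    print(f"bot-likeness score: {bot_score(suspect):.2f}")
```

In practice a score like this would only be a first filter, with accounts above a threshold passed on for manual review or richer behavioural analysis.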