This could be the beginning of a larger push at Facebook, and one which would be beneficial on various fronts. This week, the UK Competition and Markets Authority (CMA) has called upon The Social Network to do more to combat the sale of fake reviews on its platform, which could prompt expanded action.
The sale of false reviews is illegal under UK consumer law – as reported by TechCrunch:
The CMA said that it has found “troubling evidence” of a “thriving marketplace for fake and misleading online reviews,” though it also writes that it does not believe the platforms themselves are intentionally allowing such content to appear on their sites.
Indeed, there is a thriving marketplace for fake reviews, and it’s not difficult to find. A simple search for ‘Buy fake reviews’ in the Facebook search box returns the following:
I mean, they’re not even veiled – and as you can see from the search query, it’s a pretty blunt search. I have no doubt I could find many more if I changed up the search terms and considered how they might name themselves to avoid detection.
And that’s long been a large part of the frustration on this element – if I can find fake review sellers that easily, why can’t Facebook remove them? The platform has far more advanced algorithms and systems in place, so you would assume that Facebook would have the capacity to detect these sellers, and could then weed them out. So why hasn’t it?
The assumption could be that Facebook is happy to host as many businesses as possible, even those of questionable legitimacy, because each, potentially, increases the company’s bottom line by purchasing ads.
Facebook has long sought to take a “hands off” role in the content on its platform, leaning on the stance that it’s merely the host (you may recall its repeated “we’re not a media company” line in regards to the spread of misinformation via Facebook Pages). In more recent times, Facebook has been forced to accept some level of responsibility for such content, but it would prefer to maintain a level of separation in order to avoid getting into the territory of censorship – particularly in the case of businesses and ads, given their revenue potential.
But this case highlights a new area of concern – with regulatory bodies paying more attention to illegal, and potentially illegal, practices like this, and with the influence of online platforms growing and impacting more consumer decisions, Facebook may be forced to take more action against fakes. Which can only be a good thing for the broader social ecosystem.
In this case, the justification is fairly clear-cut – Facebook needs to take action because, as noted, it’s against the law in the UK to sell fake reviews. But another area in which Facebook has been slowly ramping up its action is the sale of fake engagement and followers on social media, which has become a much bigger element of concern, given the rise of influencer marketing.
In recent months, Facebook has taken legal action against several companies which had been found to be promoting the sale of fake accounts, likes and followers. This comes after New York’s Attorney General ruled in February that selling fake social media followers and likes is essentially illegal, in relation to the specific actions of a company called Devumi, which has since ceased operations.
Given that Facebook now has a legal precedent to draw upon, it can, and should, be taking action against these fakes, and removing their capacity to peddle false engagement, which can mislead and misinform consumers across the growing eCommerce sphere.
As noted, Facebook is already, if slowly, looking to ramp up its actions on this front, but cases like this in the UK may prompt even more action, and see a wider net cast over the sale of fake activity online.
As such, this is a great push – Facebook hasn’t announced specific action in response to the CMA’s call as yet. But hopefully, the shifting legal sands will empower Facebook to rid its network of all fakes, and lessen their impact on digital marketing.