Facebook has promised to suspend or ban users who repeatedly set up groups and profiles that promote, encourage and facilitate bogus or misleading reviews across Facebook and Instagram, both of which it owns.
The move comes after the Competition and Markets Authority (CMA) found more evidence of misleading content, and reiterated a request to Facebook to help prevent its promotion.
The business regulator has stepped in to push the social media platform to take swifter and more robust action to tackle the issue.
The move follows a series of investigations by Which? over the past two years to expose the source and scale of fake reviews, and the problems they cause for online shoppers.
It also comes as the government launched a dedicated watchdog to bring in and enforce a new code for governing the behaviour of digital platforms, including Facebook. The Digital Markets Unit (DMU) will be set up within the CMA from April 2021 once the necessary legislation is in place.
Find out how to spot a fake review and stay one step ahead.
What has Facebook promised to do?
Facebook has pledged before to better identify and investigate groups and other pages promoting misleading reviews, and is now stepping up its efforts.
It has already removed 16,000 groups on its site that solicited the buying and selling of bogus reviews and suspended or banned those who created these groups. It has also made changes to its systems to ensure it is able to fulfil its previous commitments, including:
- Suspending or banning users who are repeatedly creating Facebook groups and Instagram profiles that promote, encourage or facilitate fake and misleading reviews
- Introducing new automated processes that will improve the detection and removal of this content
- Making it harder for people to use Facebook’s search tools to find fake and misleading review groups and profiles on Facebook or Instagram
- Putting in place dedicated processes to make sure that these changes continue to work effectively and stop the problems from reappearing.
Andrea Coscelli, chief executive of the CMA, said: ‘If people lose trust in online reviews, they’re less able to shop around with confidence and will miss out on the best deals. It also means that businesses playing by the rules miss out.
‘Facebook has a duty to do all it can to stop the trading of such content on its platforms. After we intervened again, the company made significant changes – but it’s disappointing it has taken them more than a year to fix these issues.’
How Which? is combating fake reviews
Which? has been shining a light on the issues caused by fake reviews for years and first uncovered Facebook groups promoting fake or incentivised reviews in 2018. An undercover investigation revealed a range of groups willing to reimburse shoppers for Amazon purchases in return for positive reviews, while other sellers encouraged genuine customers to increase their ratings or refused the agreed refund.
The CMA has urged Facebook and eBay to combat the sale and promotion of fake reviews – which, on Facebook at least, have persisted. Subsequent Which? investigations found that Facebook was failing to stop the trading of fake reviews – groups with hundreds of thousands of members were still being found, generating tens of thousands of posts a month.
Rocio Concha, director of policy and advocacy at Which?, said: ‘We’ve previously raised the alarm about fake review factories continuing to operate at scale on Facebook, leaving online shoppers at huge risk of being misled. The tech giant failed to meet its earlier commitment to the CMA, so it’s positive that the regulator has stepped in and demanded more robust action.
‘Facebook must deliver this time round – it has shown it has the sophisticated technology to eradicate these misleading review groups, and needs to do so much more swiftly and effectively.
‘The CMA and Facebook now need to monitor the situation, and if the problems persist the regulator must take stronger measures to ensure that trust in online reviews doesn’t continue to be undermined.
‘Online platforms should also have greater legal responsibility for tackling fake and fraudulent content, and activity on their sites.’