Facebook failing to stop the trading of fake reviews

Amazon review groups continue to thrive on the social media platform, with tens of thousands of products advertised every month

Nearly two years after we uncovered a thriving marketplace in the trading of fake Amazon reviews on Facebook, where sellers offered free products in exchange for five-star reviews, our latest investigation reveals that Facebook still has a persistent problem on its platform.

Following action from the Competition and Markets Authority (CMA) in January this year, Facebook, along with eBay, committed to combating the trade of fake and misleading reviews on their sites. Both signed up to agreements to better identify, investigate and respond to this activity.

But in our latest investigation, we easily found evidence of dozens of Facebook groups continuing to trade in reviews. We analysed activity in 16 of these and found:

  • High membership, with 200,000 members across the groups
  • Active timelines – there were more than 82,000 posts between the groups in just one month
  • New tactics that sellers are using to avoid detection
  • Evidence of a direct impact on Amazon listings

eBay appears to have cracked down on listings selling five-star reviews – we found no evidence of sellers trading in reviews on the platform.

A hive of activity

It remained disconcertingly easy to find dozens of suspicious-looking groups in minutes on Facebook. We searched for ‘Amazon review’ and then analysed the top 25 results.

Despite Facebook’s agreement to take action on review trading on the platform, 16 of the top groups were clearly trading in incentivised reviews. In many, sellers were openly asking for reviews in exchange for refunds or commission.

These groups remain a hive of activity. One had more than 36,000 members, while several others boasted more than 20,000. Overall, there were more than 200,000 members across the 16 groups.

In the most active group, which has 27,000 members, there had been an incredible 6,406 posts in one day. Combined with the other groups, there were nearly 16,000 posts that day. Over a month, the number of posts was at least 82,000.

New tactics to avoid detection

In some of the groups, we saw admins telling members not to use prohibited terms such as ‘refund’, ‘review’ or ‘4 or 5 stars’. Instead, sellers used disguised spellings such as ‘revie-ws’, ‘r*fund’ or ‘p*y-pal’.

Facebook committed to an agreement with the CMA to ‘introduce more robust systems to detect and remove such content’, and the appearance of ‘prohibited words’ in groups suggested that sellers are adopting new tactics to try to avoid the systems the social media platform has implemented.

During the course of our research, five of the groups were shut down by Facebook. When we reported our findings, the remaining groups were closed. A spokesperson told us: ‘Fraudulent activity is not allowed on our platforms, including offering or trading fake reviews. We will continue to invest in technology and our safety and security teams to proactively prevent this kind of activity.’

But we believe the action taken by Facebook so far doesn’t go far enough, and it is failing to stem the tide of review trading flooding its platform. It took one Which? researcher just minutes and a simple search for ‘Amazon review’ to uncover these groups – an action that could easily have been taken by Facebook.

The impact of review trading on Amazon listings

The difference incentivisation can make to the visibility of a product listing on Amazon is clear. Since we began investigating review trading on Facebook, we’ve uncovered 36 groups with total membership topping half a million people. This means potentially hundreds of thousands of incentivised reviews, designed to mislead shoppers, may have made their way onto Amazon.

It may be simple to spot the more obviously suspicious reviews, such as those that are overly effusive, but ratings and reviews also feed into well-trusted endorsements – potentially helping sellers to rise up the rankings.

In one Facebook group we found an Enacfire Bluetooth speaker listed with the promise of a refund after review. On Amazon the same product, which wasn’t being sold by Enacfire directly, had a massive 2,558 ratings and a customer score of 4.9 out of 5. It was also appearing as the top result in a search for ‘Bluetooth speakers’ that had returned more than 30,000 listings, and the blue version of the speaker carried the Amazon’s Choice endorsement.

It wasn’t the only product we found that had Amazon’s coveted Choice endorsement attached. Two pieces of Backture camping equipment posted by an agent in a Facebook group – an inflatable camping mat and inflatable lounger – were also labelled as Amazon’s Choice.

When we checked the products at the time of writing, the Amazon’s Choice logo had been removed on the Backture Inflatable Camping Mat and on the ‘Blue & Black’ version of the Backture Inflatable Lounger.

But it remained in place on the ‘Blue’ version of the Backture Inflatable Lounger, and on a variation of the Enacfire speaker.
This isn’t the first time we’ve uncovered issues with the Amazon’s Choice endorsement, with a previous investigation showing that potentially poor products had their reviews boosted through incentivisation.

In our investigation into Amazon’s Choice in February, we found an Amazon’s Choice fitness tracker with hundreds of reviews and a 4.2 customer rating – but one reviewer posted a photo of a card that had arrived in the box, offering a £15 incentive for a five-star review.
Amazon told us: ‘We don’t tolerate Amazon policy violations, such as review abuse, incentivized reviews, counterfeits or unsafe products.

‘When deciding to badge a product as Amazon’s Choice, we proactively incorporate a number of factors that are designed to protect customers from those policy violations. When we identify a product that may not meet our high bar for products we highlight for customers, we remove the badge.’

Which? calls for greater commitment to stamp out fake reviews

Which? acknowledges that Facebook may face some technical challenges in dealing with this issue, but we expected the online platform to have made far more progress six months on from its pledge to the CMA.

Facebook must now show it is taking the problem seriously and demonstrate what more it will do to meet its commitment to the regulator.

The CMA must also ensure it is closely monitoring Facebook, setting out its next steps if the site fails to crack down on the problem on its platform.

We reported our findings to the regulator, and a CMA spokesperson said: ‘As a result of our action, Facebook committed to improving the way it identifies, investigates and responds to fake and misleading reviews, including introducing more robust systems and taking regular action to identify and remove the trading of these kinds of reviews on their platform. It also committed to regularly reviewing the effectiveness of these systems and actions.

‘We’ll look carefully at these worrying findings and raise them with Facebook as a matter of urgency.’

Which? also believes that the continued, unhindered trading of fake and misleading reviews on Facebook is further evidence that online platforms must be made more responsible for the content and activity on their sites.

Natalie Hitchins, Head of Home Products and Services at Which?, said: ‘Our research shows that review trading groups continue to thrive on Facebook, leaving online shoppers at huge risk of being duped into buying products on Amazon that have been boosted by fake reviews.

‘It’s clear that Facebook has not lived up to its commitment to the CMA and must urgently address the spate of fake review groups on its site, or the regulator must intervene again.

‘The failure of sites like Facebook to crack down on bad practice underlines the need for online platforms to have more responsibility for content and activity on their sites.’
