'Goodbye Meta AI' viral post - what is it and why you shouldn't share it

A viral post dubbed ‘Goodbye Meta AI’ falsely claims that users of platforms such as Facebook, Instagram and WhatsApp can opt out of having their posts used to train the company's AI models. But what’s the point of the post?
Hundreds of thousands of users of the three platforms, owned by Meta, have re-posted a statement which says that an 'attorney' has advised users to share a post to refuse permission for Meta to use their data to train its AI.
This follows news that Meta plans to use data from users' posts on its sites to train its AI models.
But in reality, copying and pasting this message won't do anything to prevent your posts from being used by Meta. Instead, you can opt out using Meta's official objection form.
Viral hoax posts like this, and the ones below, may seem pointless, but there can be a more sinister purpose behind them.
Missing boy in Hartlepool
A Facebook post shared in a group about buying and selling motorbikes and scooters in the local area spoke of a missing boy in Hartlepool.
The post used an image of a bruised child (which we’ve blurred out) to shock users and urge them to engage with the post.
It says that a missing boy was found by the police and taken to the police station but no one knows how he got there or who he is. It ends with: ‘Let's flood our feeds so that this post may reach his family’ and has been shared 54 times so far.
The profile behind the scam also posted a fake missing dog in the Hartlepool-based group. We found the same image posted on social media three other times saying that the dog was found in three separate areas.
We also found the image of the 'missing boy in Hartlepool' posted in US-based groups, where the boy was said to have been found locally in twelve other places.
Missing boy in King's Lynn
Another post, also featuring a bruised child, was shared 86 times in a local community group called King’s Lynn. It said:
‘This little boy, approximately 2 years old, was found last night walking behind a home here in Kings Lynn. Deputy Tyler Cooper saved him and took him to the Police Station, but no one has an idea where he lives.’
We also found this post shared in a Facebook group for users in Louisville in the US, as well as on Instagram, where the boy was said to have been found in Riverside. This post had 22,373 likes.
When we searched both of these images using a reverse image search tool, we only found identical scam posts, indicating that the images are most likely of non-existent people created using artificial intelligence (AI).
Why do scammers create viral posts?
Scammers create posts that will attract wide interest and attention in the hopes that they’ll be shared numerous times.
Once the post has been shared enough times, scammers sometimes edit it into a scam post to promote their dodgy schemes. Previously, we’ve reported on how missing person posts have turned into investment scams.
Other times, scammers use these posts to bring attention to scam profiles through likes and friend requests so that these profiles can eventually contact you and try to lure you into their scams.
If you come across a post like this, particularly one that’s been shared in a local community group and has nothing to do with the usual posts in that group, think twice before engaging with it.
Check the profile behind the post. If it’s newly created or has no other content, be suspicious.
You can also use a reverse image search tool such as Google (by selecting the camera icon in the search bar) or TinEye to see if images used in the post appear elsewhere on the internet.
If you lose any money to a scam, call your bank immediately using the number on the back of your bank card, and report it to Action Fraud, or call the police on 101 if you’re in Scotland.