Meta has recently sparked global outrage among parents and children’s rights activists after being accused of using images of young girls in advertisements targeted towards grown men.
The company, which owns Threads, Instagram and Facebook, reportedly used images of young girls, some as young as 13, dressed in schoolgirl outfits as part of a campaign to attract middle-aged men to its social media platforms.
One user, a 37-year-old man (who wished to remain anonymous), reported receiving adverts displaying images of young schoolgirls dressed in what he described as “short skirts” on his Instagram feed. He claimed that the advertisements only portrayed female schoolchildren, and described the images as “deliberately provocative” and “exploitative”.
The images were used in adverts for the Meta app Threads and raised safety concerns as many images included both the names and faces of those pictured.
The images were taken from parents’ back-to-school Instagram posts without the knowledge or consent of either the parents or the girls themselves. Some were sourced from private accounts.
The father of a 13-year-old girl who featured in one of Meta’s recent adverts stated that he was “disgusted” by the “sexualised way” the company had used an image of his young daughter.
The incident highlights a lack of informed consent in Meta’s advertising practices. The mother of a 15-year-old whose image was used described the advertisements as “absolutely disgusting”.
She said: “Not for any money in the world would [she] let them use a girl dressed in a school uniform to get people on [a social media platform].”
The post featuring her child amassed almost 7,000 views, 90% of which came from accounts that did not follow her. Half of the post’s viewers were aged over 44, and 90% of them were men.
This suggests that the adverts may have been deliberately and algorithmically recommended to older male Instagram users.
The anonymous 37-year-old man reported that he had never posted or liked any similar content, again suggesting that the Instagram algorithm is recommending images of young girls to older men despite having no user data to justify that decision.
Meta denies any foul play in its choice of advert images, stating that there is no issue with using publicly posted images for promotional purposes. A company spokesperson said: “The images shared do not violate our policies and are back-to-school photos posted publicly by parents. We have systems in place to help ensure we don’t recommend Threads shared by teens, or that go against our recommendation guidelines, and users can control whether Meta suggests their public posts on Instagram.”
Although Meta claims that its use of photos of children in adverts is appropriate, children’s rights activists disagree.
Beeban Kidron, a campaigner for children’s safety online, stated: “Offering up school-age girls as bait to advertise a commercial service is a new low, even for Meta. At every opportunity, Meta privileges profit over safety, and company growth over children’s right to privacy. It is the only reason they could think it appropriate to send pictures of schoolgirls to a 37-year-old man – as bait. Meta is a wilfully careless company.”
Activists, including Kidron, have appealed to the regulator Ofcom to ban companies from using potentially sexualised images of children for advertising purposes. Ofcom already has ‘illegal harms’ codes in place, aimed at tackling online grooming. These codes require that “children’s profiles and locations – as well as friends and connections – should not be visible to other users”.
Campaigners and activists are concerned that sharing the images alongside names, school logos, and locations could put Meta in violation of these ‘illegal harms’ codes. Opponents of the campaign call the promotion an exploitative trade of children’s privacy for profit, and a further decline in the ethics of big media platforms.

