Facebook is still targeting adverts to children based on personal data

When the UK’s Age Appropriate Design Code came into force this September, it was quietly preceded by announcements from big tech companies highlighting the changes they were making. TikTok gave children enhanced protections by default[1], Google made SafeSearch the default for under-18s and gave them the right to request the removal of problematic images, and Facebook/Instagram stopped advertisers from targeting specific children based on their profiles on the platform, beyond age, gender and location. Facebook has since gone further, restricting advertisers’ ability to target people based on their health, race and ethnicity, political affiliation, religion and sexual orientation.

For many it looked like the proverbial leopard had changed its spots. Facebook, a company widely associated with the mass profiling of its users, which generates over 98.5% of its revenue from advertising, seemed finally to be putting the interests of its younger users above its own, and those of its advertisers, by turning off its powerful targeting engine when advertisers tried to reach children.

In its first public statement, Facebook claimed that, after consulting with youth advocates, it would take ‘a more precautionary approach in how advertisers can reach young people’. This meant Facebook would ‘only allow advertisers to target ads to people under 18 based on their age, gender and location.’ The company confirmed this point in its appearance before the US Senate in September.

However, as the research published today by Fairplay, Reset Australia and Global Action Plan shows, these statements may have been highly misleading about what was actually changing in how children are targeted on Facebook.

The reality the report reveals is that while Facebook has indeed removed advertisers’ ability to select attributes beyond age, gender and location, this does not mean that adverts will be shown at random to children who meet those limited criteria. Instead, Facebook appears to be taking over the targeting of children on behalf of advertisers, powered by its own machine-learning delivery algorithm.

The changes Facebook is making may, in fact, make the targeting of children more problematic than before. As Facebook itself notes when describing the ad-delivery algorithm, ‘each time an ad is shown, the delivery system’s predictions of relevance become more accurate. As a result, the more an ad is shown, the better the delivery system becomes.’

This means the algorithm learns to optimise the delivery of adverts so that they are as ‘relevant’ to the targeted user as possible, often with little or no human guidance. As it optimises for engagement, it could end up targeting children with adverts that are inherently problematic. As Frances Haugen noted in her recent testimony, ‘algorithms learn correlations’: the system could learn to identify when a teen’s mood suggests they are particularly vulnerable to advertising in general, and possibly to specific types of adverts or certain products.
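To make that feedback loop concrete, here is a minimal, hypothetical sketch of an engagement-optimising delivery system, written in Python. It is not Facebook’s actual code or algorithm; the DeliveryModel class, its epsilon parameter and the ‘signal’ inputs are all invented for illustration. It simply shows how a system that maximises predicted engagement ends up targeting specific users even when no advertiser has selected any attributes.

    import random
    from collections import defaultdict

    class DeliveryModel:
        """Hypothetical engagement-optimising ad delivery loop (illustration only)."""

        def __init__(self, epsilon=0.1):
            self.epsilon = epsilon            # fraction of impressions spent exploring
            self.shows = defaultdict(int)     # (ad, signal) -> impressions served
            self.clicks = defaultdict(int)    # (ad, signal) -> engagements observed

        def predicted_relevance(self, ad, signal):
            n = self.shows[(ad, signal)]
            return self.clicks[(ad, signal)] / n if n else 0.0

        def choose_ad(self, ads, signal):
            # Mostly exploit whichever ad looks most 'relevant' for this user
            # signal; occasionally pick at random so estimates keep improving.
            if random.random() < self.epsilon:
                return random.choice(ads)
            return max(ads, key=lambda ad: self.predicted_relevance(ad, signal))

        def record(self, ad, signal, engaged):
            # Every impression feeds back into the model: the more an ad is
            # shown, the sharper its relevance estimates become.
            self.shows[(ad, signal)] += 1
            if engaged:
                self.clicks[(ad, signal)] += 1

    # Tiny simulation with invented engagement rates. If users showing a
    # hypothetical 'low_mood' signal engage more with a diet ad, the loop
    # learns that correlation and concentrates delivery on those users.
    model = DeliveryModel()
    ads = ["diet_ad", "game_ad"]
    for _ in range(10_000):
        signal = random.choice(["low_mood", "neutral"])
        ad = model.choose_ad(ads, signal)
        engaged = random.random() < (0.30 if (ad, signal) == ("diet_ad", "low_mood") else 0.05)
        model.record(ad, signal, engaged)
    print(model.predicted_relevance("diet_ad", "low_mood"))   # approaches 0.30

The point of the sketch is that the ‘targeting’ emerges from the feedback loop itself: no human ever chose to pair the diet advert with the vulnerable group.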

This is all particularly concerning, not just because we can see big tech trying to bamboozle us, but because children find it much harder than adults to distinguish adverts from organic content and are especially vulnerable to the pressures of marketing. Children also generally have little understanding of how online surveillance-based advertising works. As the letter released today highlights, ‘ever more personalised, ever more optimised advertising to children has the capacity to amplify these harms.’[2]

Transferring the targeting power from the advertiser to a Facebook algorithm not only does nothing to mitigate the harms of surveillance-based advertising but in all likelihood exacerbates them. The only way forward for Facebook is to ban targeting based on any personal data for all children, across all of its platforms, including targeting by its own algorithms.


[1] https://newsroom.tiktok.com/en-us/furthering-our-safety-and-privacy-commitments-for-teens-on-tiktok-us

[2] Letter to Facebook