This article contains graphically sexual language.
Google’s Keyword Planner, which helps advertisers choose which search terms to associate with their ads, offered hundreds of keyword suggestions related to “Black girls,” “Latina girls,” and “Asian girls” — the majority of them pornographic, The Markup found in its research.
Searches in the Keyword Planner for “boys” of those same ethnicities also primarily returned suggestions related to pornography.
Searches for “White girls” and “White boys,” however, returned no suggested terms at all.
Google appears to have blocked the Keyword Planner from returning results for terms that combine a race or ethnicity with either “boys” or “girls” shortly after The Markup reached out to the company for comment on the issue.
These findings indicate that, until The Markup brought it to the company’s attention, Google’s systems contained a racial bias that equated people of color with objectified sexualization while exempting White people.