YouTube is finalizing a plan under which it would no longer serve targeted ads in videos that are popular with kids, Bloomberg reported, citing anonymous sources. The move is intended to appease U.S. regulators that have been investigating whether YouTube violated privacy laws that prohibit online services from collecting data on children, according to the report.

Google declined to comment on the report. It’s unclear how YouTube would determine which videos would no longer be eligible to serve targeted ads.

Last month, the Federal Trade Commission reached a settlement with Google over allegations that YouTube violated the Children’s Online Privacy Protection Act, which bans companies from collecting personally identifiable information from children under 13, according to the Washington Post. The FTC would impose a multimillion-dollar fine under the agreement, which is subject to Justice Department approval, but other details of the settlement aren’t known, the Post reported.

In July 2018, a coalition of 23 advocacy groups alleged in a complaint filed with the FTC that YouTube had, for years, illegally collected data on children’s video viewing.

Kids’ content on YouTube, regardless of intended audience, is hugely popular. A study by the Pew Research Center released last month, analyzing English-language YouTube videos from the platform’s largest channels during the first week of January 2019, found that videos featuring children who appeared to be under the age of 13 received nearly three times as many views on average as other types of videos.

Google’s policies prohibit children under 13 from signing up for a YouTube account, but critics say kids are easily able to watch videos on the site.

YouTube offers YouTube Kids, a separate platform with enhanced parental controls and curated video playlists. According to Google, interest-based ads are already prohibited in the YouTube Kids app, as are paid ads with remarketing or other tracking pixels. All ads in YouTube Kids must be pre-approved by YouTube’s policy team, the company says.

Separately, this past February, the FTC levied a $5.7 million fine against TikTok, the short-form video app owned by China’s Bytedance, over allegations that predecessor app Musical.ly violated COPPA by failing to get parental consent for users who were under 13.