In response to the United Kingdom’s new children’s and teens’ privacy law, popular apps and websites have recently announced significant changes to their official policies for young users.
Washington (October 8, 2021) – As major tech companies have announced policy changes intended to protect young users online in response to a new United Kingdom children’s privacy law, Senator Edward J. Markey (D-Mass.) and Representatives Kathy Castor (FL-14) and Lori Trahan (MA-03) wrote to the Federal Trade Commission today urging the agency to use its full authority—including its authority under Section 5 of the FTC Act—to ensure these companies comply with their new policies. The Age Appropriate Design Code (AADC) took effect in the U.K. this September and requires online services available to children and teens to meet 15 key children’s privacy standards, many of which are similar to U.S. legislative proposals to update Senator Markey’s 1998 law, the Children’s Online Privacy Protection Act (COPPA).
“The need to protect young people from privacy threats online is more urgent than ever. Since 2015, American children have spent almost five hours each day watching their screens, and children’s and teens’ daily screen time has increased by 50 percent or more during the coronavirus pandemic,” the lawmakers wrote in their letter. “We therefore encourage you to use every tool at your disposal to vigilantly scrutinize companies’ data practices and ensure that they abide by their public commitments.”
A copy of the letter can be found HERE.
In response to the AADC, Instagram publicly announced it is “defaulting young people into private accounts, making it harder for potentially suspicious accounts to find young people, [and] limiting the options advertisers have to reach young people with ads.” Google and its subsidiary YouTube announced they will be “tailoring product experiences for kids and teens” by changing to “private” the default video upload setting for teens between the ages of 13 and 17; turning off location history (without the option of turning it back on) for users under 18; and “block[ing] ad targeting based on the age, gender, or interests of people under 18,” among other changes. In a similar vein, prior to the AADC’s enactment, TikTok stated last year that it had disabled messaging for the accounts of users under the age of 16 and increased parental controls.
The lawmakers note, “These policy changes are no substitute for congressional action on children’s privacy, but they are important steps towards making the internet safer for young users.”