New York, United States
Lawmakers on Tuesday grilled executives from YouTube, Snap and TikTok about mounting concerns that their platforms can harm children and teenagers.
A bipartisan group of senators expressed concerns that the companies' software steered young people toward inappropriate posts, mishandled consumer data and did not do enough to spot dangerous content on their platforms. Lawmakers repeatedly said their staff had been able to find harmful content, including posts related to self-harm and pornography, inside the companies' products, sometimes while logged in as a teenager.
Sen. Richard Blumenthal, D-Conn., opened the hearing by accusing the companies of drawing young people further and further into their products.
"Everything that you do is to add users, especially kids, and keep them on your apps for longer," said Blumenthal, who leads the subcommittee of the Senate Commerce Committee holding the hearing.
The companies sent executives with political experience to answer the questions. TikTok was represented by Michael Beckerman, its head of public policy for the Americas, who used to lead a top lobbying group for internet companies. Leslie Miller, YouTube's vice president for government affairs and public policy and a former Democratic political aide, appeared on behalf of the streaming site. Snap, the parent company of Snapchat, sent Jennifer Stout, its vice president for global public policy and John Kerry's former deputy chief of staff.
Two weeks ago, Frances Haugen, a former Facebook product manager who leaked thousands of pages of internal documents, told the committee how the company knew that its products made teenagers feel worse about themselves. The decision to invite executives from other companies reflects how the lawmakers' concerns go beyond Facebook and its photo app, Instagram, to include other major platforms across the web.
The companies quickly tried to distance themselves from each other, while arguing they were already taking significant steps to protect child users.
Stout said that Snapchat was an "antidote to social media" and stressed the differences between Snapchat and Instagram. She said that her company's app focused on connecting people who already knew each other in real life, rather than feeding them a constant stream of content from strangers. And she said it focused on privacy, making images and messages delete by default.
She also stressed that Snapchat moderates the public content it promotes more heavily than other social media companies. Human moderators review content from publishers before promoting it in Discover, the public section of Snapchat that contains news and entertainment, Stout said. Content on Spotlight, Snap's creator program that promotes videos from its users, is reviewed by artificial intelligence before being distributed, and reviewed by human moderators before it can be watched by more than 25 users, Stout added.
Beckerman said that TikTok was different from other platforms that focus more on direct communication between users. "It's about uplifting, entertaining content," he said. "People love it."
He said that policymakers should look at the systems that verify whether users were old enough to use a product, suggesting that legislation should include language on age verification "across apps."
Lawmakers also hammered Beckerman about whether TikTok's Chinese ownership could expose consumer data to Beijing. Critics have long argued that the company would be obligated to turn Americans' data over to the Chinese government if asked.
"Access controls for our data is done by our US teams," said Beckerman. "And as independent researchers, independent experts have pointed out, the data that TikTok has on the app is not of a national security importance and is of low sensitivity."
Senators repeatedly tried to prod the companies to commit to more transparency for researchers to investigate the health and safety of their platforms as well as support for elements of potential privacy legislation.
Miller of YouTube refused to be pinned down in a series of exchanges with senators. When Blumenthal asked whether the companies would allow independent researchers access to algorithms, data sets and data privacy practices, Miller responded, "It would depend on the details, but we're always looking to partner with experts in these important fields."
Blumenthal shot back that YouTube's answer "indicates certainly a strong hesitancy if not resistance to providing access."
Similarly, Miller seemed reluctant to commit to aspects of potential privacy legislation such as a proposed update to the Children's Online Privacy Protection Act. Specifically, she waffled on whether YouTube would support a ban on targeted advertising for children or curbs on adding "likes" or comments on videos, even as Miller said that the company already did not permit such features on children's content.
The companies frequently argued that they were already taking the kind of steps that could be required by laws in the future.
"We believe that regulation is necessary but given the speed at which technology develops and the rate at which regulation can be implemented, regulation alone can't get the job done," Stout said.
Lawmakers resisted efforts by the executives to paint their employers as the exception to concerns about children's safety online.
"I understand from your testimony that your defense is: We're not Facebook," Blumenthal said. "Being different from Facebook is not a defense. That bar is in the gutter."