Zuckerberg – who runs Meta, the parent company of Instagram and Facebook – turned to the parents in the audience and said “no one should have to go through” what they had gone through.
He and the chief executives of TikTok, Snap, X and Discord were questioned for almost four hours by senators from both parties.
Lawmakers want to know what the companies are doing to protect children on their platforms.
Legislation is pending in Congress that seeks to hold social media companies accountable for material posted on their platforms.
Wednesday’s hearing (31 January) was a rare opportunity for US senators to question the heads of technology companies.
Zuckerberg and TikTok CEO Shou Zi Chew voluntarily agreed to testify – but the chief executives of Snap, X (formerly Twitter) and messaging platform Discord initially refused and were subpoenaed.
Sitting behind the five tech executives were families who said their children had been harmed, or had taken their own lives, as a result of content on social media.
Parents spoke out throughout the hearing — booing when CEOs entered and applauding when lawmakers asked tough questions.
While the hearing focused primarily on protecting children from online sexual exploitation, the issues varied widely, with senators seizing the opportunity to question five powerful executives under oath.
TikTok CEO Shou Zi Chew – whose company is owned by Chinese firm ByteDance – was asked whether it shared US user data with the Chinese government, which he denied.
Senator Tom Cotton asked Chew, who is from Singapore, if he had ever belonged to the Chinese Communist Party.
“Senator, I’m from Singapore. No,” Chew replied.
Cotton then asked, “Have you ever been associated or affiliated with the Chinese Communist Party?”
Chew replied, “No, Senator. Again, I’m from Singapore.”
He added that as a father of three young children, he knew the issues being discussed were “horrible and every parent’s nightmare.”
He said his own children did not use TikTok because of rules in Singapore that prohibit children under 13 from creating accounts.
A central focus of the hearing was the companies’ stance on online safety legislation currently before Congress.
This was summed up in a tense exchange between Discord’s Jason Citron and Republican Senator Lindsey Graham.
Graham ran through a list of online safety bills pending in Congress, asking whether Citron supported each one.
Although Graham gave Citron little opportunity to respond, the Discord boss seemed to have reservations about most of them.
Graham concluded: “So here it is – if you expect these guys to solve the problem, we’re going to die waiting.”
Before the hearing, Meta announced new security measures, including that minors will not be able to receive messages from strangers on Instagram and Messenger.
Social media industry analyst Matt Navarra told the BBC the hearing resembled many previous showdowns, with plenty of grandstanding from politicians – and, in his view, Zuckerberg seizing a perfect photo opportunity with his apology.
He said that although senators agree on the need for bipartisan legislation to regulate the platforms, it remains unclear what, if anything, will be passed.
“We’ve seen these hearings over and over again and, so far, they haven’t generated any significant or substantial rulemaking,” he said.
“It’s 2024 and the US has virtually no regulation, as was pointed out in the hearings, when it comes to social media companies.”
At the hearing, the executives revealed how many people they employ to moderate content on their platforms.
Meta and TikTok, which have the largest user bases of the platforms represented, said they each had about 40,000 moderators; Snap said it had 2,300, X said it had 2,000, and Discord – which described itself as a smaller platform – said it had “hundreds” of moderators.
Discord is a messaging platform and has been questioned in the past about how it detects and prevents child abuse on its platform.
After the hearing, some of the parents in the room organized a rally outside, with several calling on lawmakers to urgently pass legislation to hold companies accountable.
“Like I once did, many parents still think the harms we are talking about today will not affect their families,” said Joann Bogard, whose son Mason died in May 2019 after taking part in a choking challenge on TikTok.
“These harms are happening to ordinary, everyday children,” she said. “We have the testimonies. Now it’s time for our lawmakers to pass the Kids’ Online Safety Act.”
Arturo Béjar, a former senior Meta employee who testified before Congress in November 2023, also attended Wednesday’s hearing and told the BBC: “Meta is trying to shift its responsibility to provide a safe environment for teenagers, but won’t even add a button where a teenager can say they have been approached in an unwanted way.”
“How can they make this a safe environment for teens without that?”
During this week’s hearing, Meta said it had built “more than 30 tools” to keep teenagers safe online.