In an hours-long hearing, Congress questioned Facebook CEO Mark Zuckerberg, Google CEO Sundar Pichai, and Twitter CEO Jack Dorsey over a wide range of concerns, including COVID-19 misinformation, racial bias, targeted advertising, child exploitation, and algorithmic amplification of disinformation.
The hearing was held remotely, with Zuckerberg and Pichai appearing from office spaces and Dorsey testifying from a kitchen.
Despite being asked what disinformation experts characterized as informed and critical questions, the three CEOs adeptly avoided giving meaningful answers.
The hearing comes as Congress is considering what measures to take against tech companies in light of the January 6 Capitol riot and the spread of COVID-19 misinformation online.
Several policies are reportedly being considered for reform, including Section 230 of the Communications Decency Act of 1996, which shields tech firms from liability for content posted on their platforms.
Yes or No Questions
In the middle of the hearing, Dorsey appeared to take a break as he tweeted a “yes or no” poll with a single question mark as its text.
Dorsey’s tweet is said to capture the essence of the joint hearing before two House Energy and Commerce subcommittees, which tried to press the tech giants’ CEOs about their roles in promoting disinformation and extremism online.
Rep. Billy Long asked the tech leaders whether they knew the difference between the words “yes” and “no.” When all three CEOs answered in the affirmative, Long commented that he had won a steak dinner from a colleague for managing to get a straight answer.
However, the CEOs evaded many other questions, at times showing clear frustration with the proceedings.
Concerns about misinformation spreading
While the tech CEOs have faced similar hearings on Capitol Hill before, this was their first questioning since the January 6 riot, and several lawmakers focused on social media’s responsibility in creating conditions that encourage violence.
“Unfortunately, this disinformation and extremism doesn’t just stay online, it has real-world, dangerous and even violent consequences and the time has come to hold online platforms accountable for their part,” said Rep. Frank Pallone.
When Rep. Mike Doyle pressed the CEOs on whether the Capitol rioters had organized on their platforms, Zuckerberg attempted to shift the blame to the rioters and to other platforms. Neither Zuckerberg nor Pichai gave Doyle a clear answer about their platforms’ role in the attack, but Dorsey obliged.
“Yes,” said Dorsey, adding that “you also have to take into consideration the broader ecosystem.”
In the run-up to Thursday’s hearings, critics turned their attention to social media, with advocacy groups publishing reports on Facebook’s role in the Capitol attack and the spread of COVID-19 misinformation.
Twelve state attorneys general also signed a letter urging Facebook and Twitter to remove anti-vaccination misinformation from their platforms.
Algorithm influence on social media users
Congress members also focused their attention on platforms’ algorithms and their influence on users’ behavior. Lawmakers pointed out that as these algorithms, most of which are developed to get users to spend more time on the platforms, control what users see in their feed, they could easily push people towards more extreme and dangerous groups.
Faced with a question about the influence of Facebook’s algorithm on the platform’s users, Zuckerberg brought up Section 230 reform, which would likely affect the company negatively, and suggested that platforms should have systems in place to address “unlawful content” but should not be held liable if those systems fail.
Pichai also commented on the Section 230 reform, warning that it would have “unintended consequences” for free expression.
Claire Wardle, a co-founder of First Draft, a nonprofit organization that provides research and training on misinformation for journalists, commented that while CEOs are trained to deflect questions, lawmakers have become more aware of the complexity of these issues. Wardle described the questions asked as “specific and well-researched” and expressed hope that “we will start to see political action around harmful content online.”