WASHINGTON • Top executives with major tech companies faced questions Wednesday from a U.S. Senate committee chaired by Mississippi’s Roger Wicker as political leaders wrestle with whether any sort of legislation designed to curb mass gun violence will receive any kind of vote.
Representatives with Facebook, Google and Twitter offered statements and took questions from the U.S. Senate Commerce Committee. A representative of the Anti-Defamation League was also present as a witness.
In response to queries by Wicker and other Commerce Committee members, those tech company leaders described an array of efforts to identify content that portrays or promotes violence, terrorism and the like and to remove it.
In his opening remarks to the hearing, Wicker – a Republican from Tupelo – invoked the recent mass shootings in El Paso, Texas, and Dayton, Ohio.
“Following the shooting, President Trump called on social media companies to work in partnership with local, state, and federal agencies to develop tools that can detect mass shooters before they strike – I certainly hope we talk about that challenge today,” Wicker said.
Following calls by the president for some kind of response, Senate Majority Leader Mitch McConnell, R-Kentucky, called on Wicker, Sen. Lindsey Graham, R-South Carolina, and Sen. Lamar Alexander, R-Tennessee, to hold hearings related to the issue of gun violence.
Wednesday’s hearing was light on discussion of any legislative response. Rather, questioning variously centered on how tech and social media companies can cooperate with law enforcement, act more proactively and identify potentially violent individuals.
“I sincerely hope we can engage in a collaborative discussion about what more can be done within the jurisdiction of this committee to keep our communities safe from those wishing to do us harm,” Wicker said.
Technology companies have come under particular scrutiny as mass shooters have increasingly posted manifestos online or even streamed their violent acts over the internet, something Wicker highlighted in his remarks.
Witnesses before the committee Wednesday touted their efforts to create systems – including artificial intelligence – that identify and remove objectionable content.
Witnesses appearing before the committee from tech companies were Monika Bickert, head of global policy management for Facebook; Nick Pickles, public policy director for Twitter; and Derek Slater, global director of information policy for Google.
Citing recent statistics, Bickert said that “more than 99 percent” of content removed for connections with terrorism or violence was identified by software programs or artificial intelligence.
Sen. Richard Blumenthal, D-Connecticut, did use the hearing to promote “red flag” legislation backed by himself and Sen. Lindsey Graham, R-South Carolina.
Red flag laws typically allow family members or law enforcement to seek a court order temporarily taking guns from individuals thought to be a risk to themselves or others.
The bill supported by Blumenthal and Graham would offer federal grants to assist and encourage states to enact red flag laws.
In her time for questions, Sen. Tammy Duckworth, D-Illinois, derided Wednesday’s hearing as an inadequate response to the scourge of mass shootings in the United States.
“Nothing highlights the absurdity of Congress’s inability to solve the gun-violence crisis more than seeing 318 mass shootings in 260 days, and then holding a hearing on extremism in social media,” Duckworth said.