YouTube suspended President Donald Trump from uploading new videos to his official account for at least a week, making the decision days after fellow social media giants Twitter and Facebook shut the president out of his accounts over concerns that his posts would incite violence.
The Google-owned video site was the last of the major social media networks to suspend Trump after the attack on the U.S. Capitol. It said it removed a video uploaded Tuesday for violating its policies and "in light of concerns about the ongoing potential for violence."
YouTube wouldn't confirm which video broke its rules, but a review of archived versions of its site suggests it was a clip from a news conference Trump gave to reporters where he said his comments to supporters before the Capitol attack were "totally appropriate."
In the same clip, which is available on C-SPAN, Trump said social media companies were making a "catastrophic mistake" and doing a "horrible thing for our country" by penalizing him.
The White House did not respond to a request for comment. On Wednesday, the White House Twitter account shared a video statement from Trump. He didn't directly address tech companies, but he hinted at the issue, noting the "unprecedented assault on free speech we have seen in recent days."
"These are tense and difficult times," he said. "The efforts to censor, cancel and blacklist our fellow citizens are wrong and they are dangerous. What is needed now is for us to listen to one another, not to silence one another."
Last week, Facebook said it would cut the president off indefinitely, "for at least the next two weeks." Facebook chief operating officer Sheryl Sandberg later told Reuters that the company had no plans to reinstate the president's account. YouTube took down one video from the president's account. A day later, Twitter banned him.
Texas Attorney General Ken Paxton on Wednesday issued formal legal demands to Amazon, Apple, Facebook, Google and Twitter, aiming to investigate their recent moves to ban Trump and shut down the alternative social network Parler.
YouTube's decision came after a weekend of criticism that the company hadn't acted strongly enough against the president. The newly formed Alphabet Workers Union, a collection of Google employees and contractors, put out a statement saying YouTube's actions in taking down just one video were "lackluster, demonstrating a continued policy of selective and insufficient enforcement of its guidelines."
YouTube has a three-strike process when deciding which channels to take down, which directly affects the speed at which it moves. Facebook also has a strike system, but big, complex decisions often roll up directly to Sandberg and CEO Mark Zuckerberg. At Twitter, decisions are made by the company's policy team and signed off on by CEO Jack Dorsey.
Dorsey and Zuckerberg have become familiar faces on Capitol Hill, where they were called to testify before Congress about tech's power and role in misinformation last year. Google CEO Sundar Pichai has testified as well, but YouTube CEO Susan Wojcicki has so far escaped questioning.
On Wednesday, Google also said it wouldn't allow political ads until at least Jan. 21, the day after the inauguration of President-elect Joe Biden. The company paused political ads in the week after the November presidential election, as well, following a policy Facebook had laid out earlier.
The strike against Trump's account means he can't add new videos for a minimum of seven days, YouTube said in a Twitter post late Tuesday. The company will also disable comments on his channel indefinitely. A second strike within the next three months would net Trump a two-week suspension, and a third would result in a ban, according to YouTube's policies.
Trump's YouTube account is still visible, and past videos can still be viewed.
Information for this article was contributed by Elizabeth Dwoskin of The Washington Post.