Facebook, Twitter and YouTube plan to take a series of steps on Election Day to prevent the spread of misinformation, particularly around the results and the integrity of voting.
At Facebook, an operations center staffed by dozens of employees — what the company calls a war room — will work Tuesday to identify efforts to destabilize the election. The team, which will work virtually because of the coronavirus pandemic, has already been in action, Facebook said.
In a Twitter post on Monday night, Facebook said that the team was also tracking "potentially dangerous activity we saw with the swarming of Biden campaign buses this weekend," a reference to the caravan of Trump supporters that surrounded a Biden campaign bus in Texas. "We are monitoring closely and will remove content calling for coordinated harm or interference with anyone's ability to vote," it added.
Facebook's app will also look different on Tuesday. To prevent candidates from prematurely and inaccurately declaring victory, the company plans to add a notification at the top of News Feeds letting people know that no winner has been declared until election results are verified by news outlets like Reuters and The Associated Press.
Twitter's strategy is twofold. One group of employees will work to root out false claims and networks of bots that spread such information by using both algorithms and human analysts, while another team will highlight reliable information in the Explore and Trends sections of its service.
Twitter plans to add labels to tweets from candidates who claim victory before the election is called by authoritative sources. At least two news outlets will need to independently project the results before a candidate can use Twitter to celebrate his or her win, the company said.
On Monday, Twitter labeled as "manipulated media" a post by Richard Grenell, President Trump's former acting director of national intelligence, showing Joseph R. Biden Jr. on a campaign plane without a mask. The photo used by Mr. Grenell, a former U.S. ambassador to Germany, was from 2019, before the pandemic.
But social media companies had a haphazard response on Monday to a video of Mr. Biden that was deceptively edited to make it appear as though he were admitting to voter fraud, labeling some versions of the video and not others. The video was viewed more than 17 million times on social media platforms, according to Avaaz, a progressive human-rights nonprofit that studied it.
The video was an edited clip from an Oct. 24 appearance by Mr. Biden, on the podcast "Pod Save America," in which he discussed the Obama administration's efforts to combat voter fraud and said that he had put together "the most extensive and inclusive voter fraud organization in the history of American politics."
YouTube said it would be especially sensitive about videos that attempt to challenge the election's integrity. YouTube already does not allow videos that mislead voters about how to vote or the eligibility of a candidate, or that incite people to interfere with the voting process. The company said it would take down such videos quickly, even if one of the speakers was a presidential candidate.
As the polls close, YouTube will feature a playlist of live election results coverage from what it deems authoritative news sources.
New York Times, November 3, 2020
Voices4America Post Script. This is what YouTube, Facebook and Twitter are doing today to protect the integrity of our elections. Do your part too. #ShareThis #ElectionDay2020 #Vote #GetOutTheVote
This happened too: "Twitter hides Trump mail voting tweet ahead of polling day" - BBC News