Kristine Hostetter was a beloved fourth-grade teacher. Then came the pandemic, the election and the Jan. 6 riot in Washington.
After losing an ugly congressional race last year, Denver Riggleman is leading a charge against the conspiracy-mongering coursing through his party. He doesn’t have many allies.
It failed the test for conspiracy connoisseurs, and the public.
Allowing the U.S. government to be held hostage by political extremists is unacceptable.
The agency, responding to what the force called “a possible plot to breach the Capitol,” again sounded the alarm that pro-Trump conspirators may be planning an attack.
The news is awash with stories of platforms clamping down on misinformation and the angst involved in banning prominent members. But these are Band-Aids over a deeper issue — namely, that the problem of misinformation is one of our own design. Some of the core elements of how we’ve built social media platforms may inadvertently increase polarization and spread misinformation.
If we could teleport back in time to relaunch social media platforms like Facebook, Twitter and TikTok with the goal of minimizing the spread of misinformation and conspiracy theories from the outset … what would they look like?
This is not an academic exercise. Understanding these root causes can help us develop better prevention measures for current and future platforms.
As one of the Valley’s leading behavioral science firms, we’ve helped brands like Google, Lyft and others understand human decision-making as it relates to product design. We recently collaborated with TikTok to design a new series of prompts (launched this week) to help stop the spread of potential misinformation on its platform.
The intervention reduced shares of flagged content by 24%. While TikTok is unique among platforms, the lessons we learned there have helped shape ideas on what a social media redux could look like.
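The kind of friction prompt described above can be sketched in a few lines. This is an illustrative mock-up, not TikTok's actual implementation; the field names and flow are invented. The idea is simply to interrupt the share action on flagged content with a confirmation step rather than blocking it outright:

```python
# Hypothetical sketch of a share-flow friction prompt for flagged content.
# Field names ("flagged_unverified") and the flow are illustrative only.

def share_flow(video, confirm_share):
    """Return True if the share goes through.

    `confirm_share` is a callback that shows the prompt to the user
    and returns their choice when the video is flagged as unverified.
    """
    if video.get("flagged_unverified"):
        # Interrupt with a prompt instead of blocking the share outright.
        return confirm_share(video)
    return True  # Unflagged content shares without friction.

# Example: a user who cancels after seeing the prompt.
video = {"id": "v1", "flagged_unverified": True}
shared = share_flow(video, confirm_share=lambda v: False)
print(shared)  # False: the prompt stopped this share
```

The key design choice is that the user retains the final say; the prompt only adds a moment of reflection, which is where the measured reduction in shares comes from.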
We can take much bigger swings at reducing the views of unsubstantiated content than labels or prompts.
In the experiment we launched together with TikTok, people saw an average of 1.5 flagged videos over a two-week period. Yet in our qualitative research, many users said they were on TikTok for fun; they didn’t want to see any flagged videos whatsoever. In a recent earnings call, Mark Zuckerberg also spoke of Facebook users’ tiring of hyperpartisan content.
We suggest giving people an “opt-out of flagged content” option — remove this content from their feeds entirely. To make this a true choice, this opt-out needs to be prominent, not buried somewhere users must seek it out. We suggest putting it directly in the sign-up flow for new users and adding an in-app prompt for existing users.
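An opt-out like this amounts to a single preference checked at feed-assembly time. The sketch below is a minimal illustration under assumed field names (nothing here reflects any platform's real schema):

```python
# Illustrative sketch of an "opt out of flagged content" preference.
# "hide_flagged_content" and "flagged_unverified" are hypothetical fields.

def build_feed(candidates, user_prefs):
    """Drop flagged videos entirely for users who opted out;
    otherwise leave them in the feed (where they would carry a label)."""
    if user_prefs.get("hide_flagged_content"):
        return [v for v in candidates if not v.get("flagged_unverified")]
    return candidates

feed = build_feed(
    [{"id": "a"}, {"id": "b", "flagged_unverified": True}],
    {"hide_flagged_content": True},
)
print([v["id"] for v in feed])  # ['a']
```

Surfacing the preference in the sign-up flow, as suggested above, is what makes it a genuine choice rather than a buried setting.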
Shift the business model
There’s a reason false news spreads six times faster on social media than real news: Information that’s controversial, dramatic or polarizing is far more likely to grab our attention. And when algorithms are designed to maximize engagement and time spent on an app, this kind of content is heavily favored over more thoughtful, deliberative content.
The ad-based business model is at the core of the problem; it’s why making progress on misinformation and polarization is so hard. One internal Facebook team tasked with looking into the issue found that “our algorithms exploit the human brain’s attraction to divisiveness.” But the project and the proposed work to address the issues were nixed by senior executives.
Essentially, this is a classic incentives problem. If business metrics that define “success” are no longer dependent on maximizing engagement/time on site, everything will change. Polarizing content will no longer need to be favored and more thoughtful discourse will be able to rise to the surface.
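The incentive shift can be made concrete with a toy ranking function. In this sketch, posts are scored by a blend of predicted engagement and a separate "quality" signal rather than engagement alone; the signal names and weights are invented for illustration, not any platform's real objective:

```python
# Toy ranking function showing the incentive shift: blend engagement
# with a "quality" signal instead of maximizing engagement alone.
# Signal names and weights are hypothetical.

def rank(posts, engagement_weight=0.3, quality_weight=0.7):
    def score(p):
        return (engagement_weight * p["predicted_engagement"]
                + quality_weight * p["quality_signal"])
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "divisive", "predicted_engagement": 0.9, "quality_signal": 0.1},
    {"id": "thoughtful", "predicted_engagement": 0.4, "quality_signal": 0.8},
]
print([p["id"] for p in rank(posts)])  # ['thoughtful', 'divisive']
```

With `engagement_weight=1.0` and `quality_weight=0.0` (the status quo objective), the same inputs rank the divisive post first. Changing the weights changes what rises to the surface; the hard part, of course, is defining and measuring the quality signal itself.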
Design for connection
A primary driver of the spread of misinformation is the feeling of being marginalized and alone. Humans are fundamentally social creatures who look to be part of an in-group, and partisan groups frequently provide that sense of acceptance and validation.
We must therefore make it easier for people to find their authentic tribes and communities in other ways (versus those that bond over conspiracy theories).
Mark Zuckerberg says his ultimate goal with Facebook was to connect people. To be fair, in many ways Facebook has done that, at least on a surface level. But we should go deeper. Here are some ways:
We can design for more active one-on-one communication, which has been shown to increase well-being. We can also nudge offline connection. Imagine two friends are chatting on Facebook Messenger or via comments on a post. How about a prompt to meet in person, when they live in the same city (post-COVID, of course)? Or if they’re not in the same city, a nudge to hop on a call or video.
In the scenario where they’re not friends and the interaction is more contentious, platforms can play a role in highlighting not only the humanity of the other person, but things one shares in common with the other. Imagine a prompt that showed, as you’re “shouting” online with someone, everything you have in common with that person.
Platforms should also disallow anonymous accounts, or at minimum encourage the use of real names. Clubhouse has good norm-setting on this: In the onboarding flow they say, “We use real names here.” Connection is based on the idea that we’re interacting with a real human. Anonymity obfuscates that.
Finally, help people reset
We should make it easy for people to get out of an algorithmic rabbit hole. YouTube has been under fire for its rabbit holes, but all social media platforms have this challenge. Once you click a video, you’re shown videos like it. This may help sometimes (getting to that perfect “how to” video sometimes requires a search), but for misinformation, this is a death march. One video on flat earth leads to another, as well as other conspiracy theories. We need to help people eject from their algorithmic destiny.
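Mechanically, "ejecting" from an algorithmic rabbit hole means clearing the interest profile the recommender has accumulated so the next feed is built from neutral defaults. The sketch below illustrates the idea with hypothetical field names; real recommender state is far more complex:

```python
# Sketch of an explicit "reset my recommendations" control.
# All field names here are hypothetical, for illustration only.

def reset_recommendations(profile):
    profile["topic_weights"] = {}           # forget inferred interests
    profile["watch_history_features"] = []  # stop seeding "more like this"
    profile["cold_start"] = True            # fall back to broad, diverse picks
    return profile

profile = {
    "topic_weights": {"flat_earth": 0.97},
    "watch_history_features": ["v1", "v2"],
}
reset_recommendations(profile)
print(profile["topic_weights"])  # {}
```

The point is not the code but the affordance: a one-tap reset, surfaced prominently, gives users an escape hatch the current platforms mostly lack.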
With great power comes great responsibility
More and more people now get their news from social media, and those who do are less likely to be correctly informed about important issues. It’s likely that this trend of relying on social media as an information source will continue.
Social media companies are thus in a unique position of power and have a responsibility to think deeply about the role they play in reducing the spread of misinformation. They should absolutely continue to experiment and run tests with research-informed solutions, as we did together with the TikTok team.
This work isn’t easy. We knew that going in, but we have an even deeper appreciation for this fact after working with the TikTok team. There are many smart, well-intentioned people who want to solve for the greater good. We’re deeply hopeful about our collective opportunity here to think bigger and more creatively about how to reduce misinformation, inspire connection and strengthen our collective humanity all at the same time.
Facebook has gone out of its way to help law enforcement officials identify those who participated in the January 6 riot at the US Capitol, the company said in a Thursday conference call with reporters.
“We were appalled by the violence,” said Monika Bickert, Facebook’s vice president of content policy. “We were monitoring the assault in real time and made appropriate referrals to law enforcement to assist their efforts to bring those responsible to account.”
She added that this “includes helping them identify people who posted photos of themselves from the scene, even after the attack was over” and that Facebook is “continuing to share more information with law enforcement in response to valid legal requests.”
Politicians and political scientists wonder if there are electoral reforms that might blunt the lunacy.
Ms. Cheney, having fended off a challenge to her House leadership role, was defiant in defending her impeachment vote and called for Republicans to be “the party of truth.”
Recast by President Trump’s most ardent supporters as a MAGA martyr, Michael T. Flynn has embraced his role as the man who spent four years unjustly ensnared in the Russian investigation.
The mass execution cult has roots in three decades of demonization.
Senator Mark Warner, the committee’s new chairman, said he hoped to lead a bipartisan investigation of the groups, their overseas ties and amplification of their message by foreign powers.
Democrats in the House voted to strip freshman Georgia Representative Marjorie Taylor Greene of some of her responsibilities Thursday, citing her penchant for violent, anti-democratic and at times anti-Semitic conspiracy theories.
Greene has expressed support for a range of alarming conspiracies, including the belief that the 2018 Parkland school shooting that killed 17 people was a “false flag.” That belief prompted two teachers unions to call for her removal from the House Education Committee — one of her new committee assignments.
The vote on a resolution to remove Greene from her committee assignments broke along party lines, with nearly all Republicans opposing the resolution. Some of her colleagues voted in Greene’s defense despite having condemned her behavior in the past.
As the House moved to vote on the highly unusual resolution, the new Georgia lawmaker claimed that her embrace of QAnon was in the past.
“I never once said during my entire campaign ‘QAnon,’” Greene said Thursday. “I never once said any of the things that I am being accused of today during my campaign. I never said any of these things since I have been elected for Congress. These were words of the past.”
But as the Daily Beast’s Will Sommer reported, a deleted tweet from December shows Greene explicitly defending QAnon and directing blame toward the media and “big tech.”
In another recently uncovered post from January 2019, Greene showed support for online comments calling for “a bullet to the head” for House Speaker Nancy Pelosi and executing FBI agents.
Greene has also shared openly racist, Islamophobic and anti-Semitic views in Facebook videos, a track record that prompted Republican House Minority Leader Kevin McCarthy to condemn her statements as “appalling” last June. More recently, McCarthy defended Greene against efforts to remove her from committees.
Greene was elected in November to represent a conservative district in northwest Georgia after her Democratic opponent, Kevin Van Ausdal, dropped out, citing personal reasons. She had beaten her opponent in the Republican primary in August, winning 57% of the vote.
QAnon, a dangerous once-fringe collection of conspiracy theories, was well-represented in January’s deadly Capitol riot and many photos from the day show the prevalence of QAnon symbols and sayings. In 2019, an FBI bulletin warned of QAnon’s connection to “conspiracy theory-driven domestic extremists.” A year later, at least one person who had espoused the same views would win a seat in Congress.
The overlap between Greene’s beliefs and those of the violent pro-Trump mob at the Capitol escalated tensions among lawmakers, many of whom feared for their lives as the assault unfolded.
A freshman representative with little apparent appetite for policy or coalition-building, Greene wasn’t likely to wield much legislative power in the House. But as QAnon and adjacent conspiracies move from the fringe to the mainstream and possibly back again — a trajectory largely dictated by the at times arbitrary decisions of social media companies — Greene’s treatment in Congress may signal what’s to come for a dangerous online movement that’s more than demonstrated its ability to spill over into real-world violence.
Democrats pressed past Republicans’ objections to remove the Georgia freshman from her two committee posts in a vote without precedent in the modern Congress.
Does the G.O.P. have the spine to rein in Marjorie Taylor Greene?
In a statement, Representative Kevin McCarthy of California said he “unequivocally” condemned the Georgia freshman’s violent and conspiracy-laden remarks, but declined to revoke her committee posts.
Millions of Americans continue to actively participate in multiple conspiracy theories. Why?
The Democratic Congressional Campaign Committee has started a $500,000 effort on television and online portraying House Republicans as aligned with Marjorie Taylor Greene and QAnon conspiracy theories.
These steps, experts say, could prod more people to abandon the scourge of hoaxes and lies.
She embarrasses some Republicans, but she’s no outlier.
There’s still time for Republican leaders to reject Q.
When normal life recedes, ideology fills the vacuum.
In a newly unearthed video from 2018, Representative Marjorie Taylor Greene suggested that 9/11 was a hoax, President Barack Obama was a Muslim and the Clintons were guilty of murder.
The inauguration of Joe Biden shattered the fundamentals of the baseless QAnon theory. What happens to its followers now?
During the political fallout after four years of Donald J. Trump, one question is what will happen with the followers of conspiracy theories that bend Americans’ perceptions of reality.
A number of members of Congress have links to organizations and movements that played a role in the Jan. 6 assault on the Capitol.
They also discuss “flouncing” and whether a Substack newsletter would be too much work for Donald Trump.
Leaderless but united by racist ideology that has been supercharged by social media, extremists have built a web of real and online connections that worry officials.
QAnon adherents called it “the storm.” At midday on Wednesday, there were supposed to be blackouts across the US, military tribunals led by Donald Trump and the mass execution of Democrats in the streets.
It did not happen. Instead, Joe Biden was inaugurated as the 46th US president and the day of reckoning anticipated by the pro-Trump conspiracy cult failed to materialize, dismaying the faithful.
“QAnon believers invested all their remaining hopes in false beliefs that Trump would take action validating their theories before or during inauguration,” said Jared Holt, a research fellow focused on extremism at the Atlantic Council’s Digital Forensic Research Lab. “For some followers, watching Biden and (Vice-president Kamala) Harris sworn into office was a breaking point in their beliefs.”
As President Biden took office, some QAnon believers tried to rejigger their theories to accommodate a transfer of power.
Avril Haines, who has been nominated to be director of national intelligence, told senators that she would assist with a public written assessment of the threat from QAnon.
Merchandise with phrases like “Battle for Capitol Hill Veteran” could still be purchased on major e-commerce sites, a sign of how the platforms have struggled to remove the goods.
Valerie Gilbert posts dozens of times a day in support of an unhinged conspiracy theory. The story of this “meme queen” hints at how hard it will be to bring people like her back to reality.
A network of far-right agitators across the country spent weeks organizing and raising money for a mass action to overturn President Trump’s election loss.
It was a showdown between reality and dark digital fantasy. Fantasy didn’t lose.
After the Capitol Hill riot, the divide between reality and fantasy may become too wide to bridge.
Amazon has begun the process of removing QAnon-related products from its platform.
A spokesperson for the company said that the process may take a few days. Any sellers that attempt to evade the company’s systems and list products will be subject to action, including a blanket selling ban across Amazon stores.
News of the ban was first reported by The New York Times.
The company is cracking down on the conspiracy theory, removing products sold by QAnon adherents from its platform after supporters were prominently on display at the riot at the U.S. Capitol last week.
Amazon’s ban of Q-related products follows the company’s decision to remove Parler from its web servers and cloud services platform.
The ban applies to any self-published books that promote QAnon or any clothing, posters, stickers, or other merchandise related to the Q conspiracy theory.
Amazon has policies that prohibit products that “promote, incite, or glorify hate or violence toward any person or group,” the company said.
A cursory search of the company’s platform on Monday revealed that the ban isn’t being applied to all of the Q-related products for sale.
Seven pages of Q-related products surfaced in a search for “WWG1WGA,” an acronym for the Q-related phrase, “Where we go one, we go all.”
The widely discredited Q conspiracy theory was born from a stew of different conspiracy theories that emerged from the 4chan message boards back in 2017.
Since its emergence, the conspiracy theory has grabbed the attention of conservative activists, and its supporters were highly visible among the group of rioters that stormed the Capitol building last week — even as at least one Q-believer joined Congress the same week.
Amazon’s decision to ban the sale of Q-related goods comes many, many, many years after the movement was first linked to violence, as TechCrunch previously reported.
The conspiracy’s followers have also interfered with legitimate child safety efforts by hijacking the hashtag #savethechildren, and exporting their extreme ideas into mainstream conversation under the guise of helping children. Facebook, which previously banned QAnon, limited the hashtag’s reach in late 2020 because of the interference.
The actions followed the barring of President Trump from the service last week, as Twitter has moved to distance itself from violent content.
Twitter on Friday removed the accounts of several high-profile supporters of President Trump and the QAnon conspiracy theory. Targets included former Trump advisor Michael Flynn and former Trump lawyer Sidney Powell.
The rioters who broke into the Capitol on Wednesday were of course supporters of Donald Trump. Many were inspired by QAnon as well. The QAnon conspiracy theory centers around a supposed government insider who goes by “Q.” Members of this community have baselessly accused liberal leaders like Hillary Clinton, Barack Obama, and George Soros of running a sex trafficking ring and attempting to organize a coup against President Trump.
These paranoid beliefs evidently inspired some QAnon enthusiasts to stage an attempted coup of their own on Wednesday, as supporters broke into the Capitol to attempt to prevent Congress from certifying Joe Biden’s election. Their actions led to the deaths of five people, including US Capitol Police Officer Brian Sicknick.
Twitter took action against a pair of President Trump’s close associates, banning them from the platform Friday.
Trump ally Michael Flynn and former Trump campaign lawyer Sidney Powell were both suspended under Twitter’s “Coordinated Harmful Activity” policy. Ron Watkins, who previously ran 8kun (formerly 8chan), also saw his account removed.
“We’ve been clear that we will take strong enforcement action on behavior that has the potential to lead to offline harm, and given the renewed potential for violence surrounding this type of behavior in the coming days, we will permanently suspend accounts that are solely dedicated to sharing QAnon content,” a Twitter spokesperson told TechCrunch.
Each figure has promoted the QAnon conspiracy in recent months. As part of Trump’s post-election legal team, Powell became a heroic figure to the QAnon crowd, which believes that a master plan being orchestrated behind the scenes will give the president a second term.
Michael Flynn, Trump’s first national security adviser, embraced the QAnon movement last year, reciting an oath and saying the popular QAnon motto “where we go one, we go all!” Flynn has also been actively involved in Trump’s quest to overturn the results of the November election.
Of the three, Watkins is the furthest from Trump and the closest to the heart of QAnon. As the administrator of QAnon’s central online hub, Watkins played a key role in QAnon’s explosion into the mainstream over the last few years. Beyond the ranks of believers, some QAnon observers believe that Ron Watkins or his father Jim Watkins are the mysterious “Q” figure, perpetuating the elaborate scheme by doling out cryptic bread crumbs for QAnon adherents.
It might be time to crack down, rather than reach out.
Electoral violence is in our DNA.
After 14 years in the military, Ashli Babbitt bought a pool supply company and delved into far-right politics.
For obvious reasons, Trump doesn’t have a TikTok account. But the President’s speeches that helped incite the mob who yesterday stormed the U.S. Capitol will have no home on TikTok’s platform. The company confirmed to TechCrunch its content policy around the Capitol riots will see it removing videos of Trump’s speeches to supporters. It will also redirect specific hashtags used by rioters, like #stormthecapitol and #patriotparty, to reduce their content’s visibility in the app.
TikTok says that Trump’s speeches, where the President again reiterated claims of a fraudulent election, are being removed on the grounds that they violate the company’s misinformation policy. That policy defines misinformation as content that is inaccurate or false. And it explains that while TikTok encourages people to have respectful conversations on subjects that matter to them, it doesn’t permit misinformation that can cause harm to individuals, their community or the larger public.
A rioting mob intent on stopping democratic processes in the United States seems to fit squarely under that policy.
However, TikTok says it will allow what it calls “counter speech” against the Trump videos. This is a form of speech that’s often used to fight misinformation, where the creator presents the factual information or disputes the claims being made in another video. TikTok in November had allowed counter speech in response to claims from Trump supporters that the election was “rigged,” even while it blocked top hashtags that were used to promote these ideas.
In the case of Trump’s speeches, TikTok will allow a user to, for example, use the green screen effect to comment on the speech — unless those comments support the riots.
In addition, TikTok is allowing some videos of the violence that took place at the Capitol to remain. For example, if the video condemns the violence or originates from a news organization, it may be allowed. TikTok is also applying its recently launched opt-in viewing screens on “newsworthy” content that may depict graphic violence.
These screens, announced in December, appear on top of videos some viewers may find graphic or distressing. Videos with the screens applied are not eligible for TikTok’s main “For You” feed, but they are not prohibited. When a viewer encounters a screen, they can tap a button to skip the video or choose to “watch anyway.” (The company could not provide an example of the screens in use, however.)
Anecdotally, we saw videos that showed the woman who was shot and killed yesterday appear on TikTok and then quickly disappear. But those we came across were from individual users, not news organizations. They were also not really condemning the riot — they were just direct video footage. It’s unclear if the specific videos we saw were those that TikTok itself censored or if the user chose to remove them instead.
Separately from graphic content, TikTok says it will remove videos that seek to incite, glorify, or promote violence, as those also violate its Community Guidelines. In these cases, the videos will be removed as TikTok identifies them — either via automation or user reporting.
And, as it did in November, TikTok is proactively blocking hashtags to reduce content’s visibility. It’s now blocking tags like #stormthecapitol and #patriotparty among others, and redirects those queries to its Community Guidelines. There are currently redirections across dozens of variations of those hashtags and others. The company doesn’t share its full list in order to protect its safeguards, it says.
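Catching "dozens of variations" of a blocked hashtag implies some normalization before matching. The sketch below is a guess at how such a block-and-redirect rule might work; TikTok does not publish its list or its matching strategy, and the stems and logic here are illustrative only:

```python
# Hedged sketch of hashtag blocking with normalization, so simple
# variations (#StormTheCapitol, #stormthecapitol2021) hit the same rule.
# The matching strategy and blocklist are assumptions, not TikTok's.

import re

BLOCKED_STEMS = {"stormthecapitol", "patriotparty"}

def search_destination(query):
    # Strip '#', digits, punctuation, and casing before matching.
    tag = re.sub(r"[^a-z]", "", query.lower())
    for stem in BLOCKED_STEMS:
        if stem in tag:
            return "community_guidelines"  # redirect instead of results
    return "search_results"

print(search_destination("#StormTheCapitol2021"))  # community_guidelines
print(search_destination("#cooking"))              # search_results
```

Substring matching on a normalized tag is deliberately broad; the trade-off, as the later discussion of misspellings and coded language suggests, is that determined users can still route around any static rule.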
We should point out that for all Twitter’s posturing about safety and moderation, it allowed Trump to return to its app after a few key tweets were deleted. And it has yet to block hashtags associated with false claims, like #stopthesteal, which continues to work today. Facebook, on the other hand, banned Trump from Facebook and Instagram for at least two weeks. Like TikTok, it had previously blocked the #stopthesteal and #sharpiegate hashtags with a message about its Community Standards. (Today those searches are erroring out with messages that say “This Page Isn’t Available Right Now,” we noticed.)
TikTok’s content moderation efforts have been fairly stringent in comparison with other social networks, as it regularly hides, downranks, and removes users’ posts. But it’s also been accused of engaging in “censorship” by those who believe it’s being too aggressive about newsworthy content.
That’s led to users finding more creative ways to keep their videos from being banned — like using misspellings, coded language or clever editing to route around TikTok policies. Other times, creators will simply give up and direct viewers to their Instagram where their content is backed up and less policed.
“Hateful behavior and violence have no place on TikTok,” a TikTok spokesperson told TechCrunch, when we asked for a statement on the Capitol events. “Content or accounts that seek to incite, glorify, or promote violence violate our Community Guidelines and will be removed,” they added.
In a new statement, former First Lady Michelle Obama calls on Silicon Valley specifically to address its role in the violent insurrection attempt by pro-Trump rioters at the U.S. Capitol building on Wednesday. Obama’s statement also calls out the starkly biased treatment the primarily white pro-Trump rioters received from law enforcement relative to that of mostly peaceful BLM supporters during their lawful demonstrations (as opposed to Wednesday’s criminal activity), but it includes a specific call to action for the tech industry’s leaders and platform operators.
“Now is the time for companies to stop enabling this monstrous behavior – and go even further than they have already by permanently banning this man from their platforms and putting in place policies to prevent their technology from being used by the nation’s leaders to fuel insurrection,” Obama wrote in her statement, which she shared on Twitter and on Facebook.
The call for action goes beyond what most social platforms have done already: Facebook has banned Trump, but though it describes the term of the suspension as “indefinite,” it left open the possibility of restoring his accounts in as little as two weeks’ time, once Joe Biden has officially assumed the presidency. Twitter, meanwhile, initially removed three tweets it found violated its rules by inciting violence, and then locked Trump’s account pending his deletion of the same. Earlier on Thursday, Twitter confirmed that Trump had removed these, and that his account would be restored twelve hours after their deletion. Twitch has also disabled Trump’s channel at least until the end of his term, while Shopify has removed Trump’s official merchandise stores from its platform.
No social platform thus far has permanently banned Trump, so far as TechCrunch is aware, which is what Obama is calling for in her statement. And while both Twitter and Facebook have discussed how Trump’s recent behavior has violated their policies regarding use of their platforms, neither has yet provided any detailed information on how they’ll address similar behavior from other world leaders going forward. In other words, we don’t yet know what would be different (if anything) should another Trump-styled megalomaniac take office and use available social channels in a similar manner.
Obama is hardly the only political figure to call for action from social media platforms around “sustained misuse of their platforms to sow discord and violence,” as Senator Mark Warner put it in a statement on Wednesday. Likely once the dust clears from this week’s events, Facebook, Twitter, YouTube, et al. will face renewed scrutiny from lawmakers and public interest groups around any corrective action they’re taking.
A chaotic scene unfolded in Washington D.C. on Wednesday as a large crowd of pro-Trump protesters stormed the U.S. Capitol Building.
The Trump supporters flooded into the nation’s capital to attend a rally held earlier by President Trump outside the White House. The rally was timed to protest lawmakers gathering Wednesday to certify President-elect Joe Biden’s electoral win.
At his own event, Trump encouraged his supporters to continue demonstrating against Congress, claiming incorrectly that Vice President Mike Pence holds the power to overturn the election results. While the situation is still unfolding, protesters penetrated the Capitol building and injuries have been confirmed, including at least one gunshot victim.
As Trump supporters flooded up the Capitol steps with “Make America Great Again” hats and “Stop the Steal” banners, the president did little to quell the violence. “Mike Pence didn’t have the courage to do what should have been done to protect our Country and our Constitution, giving States a chance to certify a corrected set of facts, not the fraudulent or inaccurate ones which they were asked to previously certify,” Trump wrote in a tweet. “USA demands the truth!”
Twitter appended a warning label calling Trump’s election fraud claims “disputed” to the tweet. After his supporters already made their way into the Capitol building, the president seemed to walk back his calls to action, calling for supporters to remain peaceful.
The Stop the Steal movement grew out of online conspiracies boosting Trump’s unfounded claims that Democrats had in some way rigged the presidential election. In reality, U.S. electoral results were decisively in favor of Biden, though votes trickled in over an extended period of time, as expected, due to a massive expansion of pandemic-related mail-in voting.
Facebook made efforts to rein in Stop the Steal groups soon after the election, blocking the hashtag for violating its rules around election misinformation. “The group was organized around the delegitimization of the election process, and we saw worrying calls for violence from some members of the group,” Facebook spokesperson Andy Stone said at the time.
Stop the Steal supporters also found a foothold on many other platforms, including Reddit, Twitter and alternative social networks like Gab and Parler, which have attracted far-right users with policies much friendlier to extremist content. The crowd at the capitol also shares considerable overlap with QAnon, a constellation of conspiracy theories that exploded on Facebook, YouTube and other online platforms over the last few years.
This story is developing.
The sprawling online conspiracy network is at the center of Trump’s attempt to overturn the election.