
World

Facebook and Twitter said they would crack down on QAnon, but the delusion seems unstoppable


James Wolfe, 45, no longer believes in the conspiracy theory QAnon. For a year, though, it dominated his life.

“The thing with QAnon is that Q is dropping these little breadcrumbs every day or every couple days,” said Wolfe. “It’s so easy to feel you’re special or in on something.”

QAnon, which has been called a collective delusion and a cult, is a baseless conspiracy theory about a secret cabal of liberal, Satan-worshipping elites who are running a child sex trafficking ring that President Trump will soon uncover. QAnon believers follow the anonymous online postings of “Q,” who claims to be a Trump insider with knowledge of the cabal and the president’s plans. And at QAnon’s core are some deeply anti-Semitic tropes, like the centuries-old idea of blood libel and the Protocols of the Elders of Zion.

After being introduced by a friend to the idea in late 2017, Wolfe — who was recently unemployed and recovering from a serious physical injury — quickly started spending as much as eight hours a day consuming an endless stream of content from people interpreting Q’s cryptic prophecies on YouTube, Facebook, and Twitter. Wolfe pulled himself out of QAnon after seeking counseling for his depression, which he thinks was at the root of his obsession with the theory. And he quit using any social media at all.

“It was like, ‘government is going to take your guns’ or ‘government is doing some sort of operation in Texas and shutting down things’ or ‘earthquakes are happening because the cabal is happening,’” Wolfe told Recode. “I look back at it and say, ‘How stupid was I?’ But at the time it seemed really real, and it’s not.”

QAnon has millions of devoted followers who, like Wolfe, have become hooked on the theory through social media. But unlike Wolfe, many of them may have no way out. For three years, platforms like Facebook, Twitter, and YouTube left QAnon largely unchecked. Now, these platforms have identified QAnon as a serious source of harmful misinformation linked to real-world violence, and in the past few months they have taken down thousands of QAnon accounts and tried to minimize the group’s reach. But social media giants also appear unwilling or unable to take some of the easiest steps to combat dangerous QAnon-related content — like removing all groups that mention the conspiracy theory in their name — for fear of overly restricting content that’s not explicitly violent but is overtly delusional and radicalizing.

In many ways, any effort to stop QAnon’s rising influence is too late. The theory continues to grow online, both in the number of its followers and in the strength of its political influence in the Republican Party. The movement’s growing political clout is especially worrisome for misinformation researchers, who say QAnon is potentially becoming one of the largest networks of extremism in the United States. And QAnon is gaining broad appeal well beyond the extremely online, male-dominated 4chan message board crowd where it was born; it’s increasingly popular with suburban moms and yoga-loving wellness gurus on Instagram and Twitter.

In August, a Guardian report found that about 170 QAnon Facebook groups had some 4.5 million aggregate followers. And a series of Pew studies suggests that awareness of QAnon is growing: the percentage of Americans who say they have heard “a lot” or “a little” about QAnon roughly doubled, from 23 percent in March to 47 percent in September (though Pew polled the same group of people each time, so some respondents may have first heard about QAnon through the poll itself). Not everyone who’s heard of QAnon supports it; in fact, some studies suggest that most probably don’t. A recent poll by research firm Civiqs found that out of nearly 4,000 people polled, only 7 percent supported QAnon.

Many civil rights groups are calling for Facebook, Twitter, YouTube, and other platforms to crack down more strictly on these theories as the companies weigh concerns that restricting QAnon too much could compromise their commitment to free speech. Facebook CEO Mark Zuckerberg has long argued that platforms have an obligation to protect users’ free speech, including fringe beliefs, in all but the most extreme cases. Complicating the matter further, a growing number of established Republican politicians promoting QAnon makes it harder for social media companies to take content down without being accused of political suppression.

“There’s a lot of general framing in QAnon of ‘here’s what they don’t want you to know’ — and in that, people circulate a lot of content that’s meant to outrage and mobilize people toward specific ends,” said Joan Donovan, an online misinformation researcher at Harvard University.

The FBI has identified QAnon as a domestic terrorism threat. About a dozen people inspired by QAnon have been charged with committing or attempting to commit violent crimes, including two murders, an armed standoff on the Hoover Dam, and an attempted child kidnapping. There was also an incident in which a woman linked to QAnon conspiracy videos was arrested after driving a car from Illinois to New York, armed with a dozen knives, shortly after she posted messages online threatening to “take out” Joe Biden. Meanwhile, dozens of people have posted personal stories to online support groups about feeling they’ve lost their loved ones to QAnon’s radicalizing, all-encompassing influence.

In order to better understand how QAnon spreads online, Recode investigated how new followers find their way to content about the conspiracy theory. This led to a paradoxical conclusion: For myriad reasons, the QAnon movement is difficult to contain on social media, but companies like Facebook and Twitter seem to be missing opportunities to do even the easy stuff.

Falling down the Q rabbit hole

To find out how easy it is to get drawn into the abyss of QAnon on Facebook and Twitter, Recode ran a quick experiment in early September and repeated it in October.

In early September, we created brand new accounts on both Facebook and Twitter to see how aggressively each platform promoted QAnon. In both cases, all it took was one quick search to start falling into the rabbit hole.

On Facebook, we started by searching for “QAnon.” The first three results were reputable news articles about the conspiracy theory from CNN, USA Today, and The Hill. After that, however, Facebook’s search results pointed users toward QAnon propaganda. One of the first accounts Facebook returned in its search was a profile called “QAnon Angel,” which had 677 followers and posts containing links to conspiracies about adrenochrome, a chemical compound that QAnon followers falsely believe people like Hillary Clinton are extracting from young children and ingesting.

Next, the Facebook search for QAnon returned a series of posts from a group called “QAnon Truth Movement,” which had a much larger base of more than 6,000 followers. The group was full of posts about Trump’s plans to uncover the purported secret cabal and promoting baseless conspiracy theories like the false claim that Michelle Obama is a man. The group also contained bigoted memes such as an anti-Semitic cartoon depicting a man with an exaggeratedly large nose sitting at a dining table ordering a white man to give his food away to a Black man. Shortly after Recode notified Facebook about the results of the experiment, the QAnon Truth Movement page was taken down.

When Recode repeated this experiment in October by creating another new account and searching “QAnon” in the search bar, we found results similar to those from a month prior.

The first results were a series of reputable news articles. When we searched under the “People” tab, the first result was still QAnon Angel, the same account we had come across a month earlier. And the “Pages” and “Groups” sections also returned obviously pro-QAnon accounts, like a group called “Q TRUMP NEWS #WWG1WGA #TRUSTTHEPLAN #Q+ #QANON #GODWINS #144K #TRUMP2020” with 861 members and recent posts asserting that Trump doesn’t actually have Covid-19 and that this is all part of a plan to uncover the alleged cabal.

A post on a QAnon Facebook page that was one of the top search results in the Groups section of Facebook in Recode’s experiment.

It’s equally easy to get sucked into a QAnon filter bubble on Twitter, according to our experiment. After starting a new Twitter account, we searched for accounts named “QAnon”; the first account that came up under the “People” tab was an account promoting the QAnon conspiracy theory with more than 500,000 followers. As soon as we visited that account, Twitter recommended three other QAnon influencer accounts in the “You Might Like” section. And after following a few right-wing (but not QAnon) accounts, like Fox News, Ben Shapiro, and Breitbart News, Twitter’s algorithm appeared to suggest QAnon keywords. When Recode typed in the letters “WW” in the search bar, for example, Twitter served up a recommendation for the popular QAnon code “WWG1WGA,” which stands for “Where We Go One, We Go All.”

When we repeated the experiment by creating another new Twitter account in October, we found that searching for QAnon brought up similar results. The first result under “People” was the same account with more than half a million followers. And after viewing that account, Twitter’s algorithm then suggested three similar accounts to follow on the right-hand sidebar, two of which were openly promoting QAnon.

Though these sorts of recommendations might seem mundane, research shows that these signals can have a snowball effect and help radicalize users. For example, someone might first “like” a Facebook post that’s skeptical of vaccines or that claims 5G networks are harmful, according to Aoife Gallagher, a researcher at the extremism think tank the Institute for Strategic Dialogue, which recently published a report on the spread of QAnon. From there, social media networks can continue to suggest more and more convoluted ideas, including QAnon.

“If someone starts to kind of get into a conspiracy, they’ll just continue to be recommended this kind of content,” said Gallagher. “It really creates a rabbit hole of tunnel vision. And it just means that people are surrounded with conspiracies every time they go online.”

The problem with banning a sprawling conspiracy theory

When Recode discussed the results of its experiment with Facebook and Twitter, spokespersons for both companies acknowledged that it shouldn’t be so easy for new users to come across QAnon. But they also said it’s only been a few months since new policies limiting the conspiracy theory’s reach were rolled out and that they’re working to improve this in the future.

A few weeks ago, for example, Facebook said it started demoting content from QAnon groups and pages in people’s News Feeds.

“In August, we expanded our policy to address QAnon, but this is a challenge that requires ongoing vigilance,” a spokesperson for Facebook said in a statement to Recode. “Since people will always create new Groups, content, and hashtags to try and evade our enforcement, our internal specialists work with external experts to keep our finger on the pulse of this activity so we can enforce our policies.”

A spokesperson for Twitter issued a similar statement.

“We’ve been clear that we will take strong enforcement action on behavior that has the potential to lead to offline harm,” said the Twitter spokesperson. “We aim to be iterative and transparent in our approach, and we recognize we cannot review every Tweet containing a falsehood. That’s why, under this expansive framework, we noted that we will prioritize the removal of content that has the greatest potential for harm.”

It might have been easier for Facebook and Twitter to rein in QAnon if the companies had taken the movement seriously sooner, as Reddit did when it issued a sweeping ban on discussions promoting the theory back in 2018 — two years ahead of the larger social media networks.

“The tech companies have not been very quick to respond to QAnon and have not responded very aggressively,” said Michael Jensen, a senior researcher at the University of Maryland studying domestic radicalization. “When QAnon first pops up in 2017, the tech companies are still really reeling from ISIS spreading on their platform. Then white supremacists pop up on their platform, and there’s congressional demands to take care of those problems. QAnon is kind of an afterthought.”

And even now that Twitter and Facebook are finally taking QAnon more seriously, they haven’t fully banned it. Instead, both companies are only taking down accounts from QAnon followers or groups that have directly encouraged violence or engaged in coordinated “harmful activity.” But deciding what meets the bar for harmful speech — as opposed to mere delusional hypothesizing — is a difficult line to draw.

If Facebook and Twitter take down too little QAnon content, they run the risk of letting the extremist communities that form around it thrive, which has proven to lead to real-life violence. But if they take down too much, they could cause a backlash among users — and politicians — who think the companies are going too far in restricting their right to discuss their personal beliefs, however dangerous those beliefs may be. This is especially fraught now, when Republican leaders like Trump have made unproven allegations of anti-conservative bias against tech companies. Twitter has said that, overall, since it introduced its new policies, impressions on accounts promoting QAnon have dropped by 50 percent.

But despite these recent attempts by Facebook and Twitter to clamp down on the movement in a nuanced way, it appears QAnon continues to thrive and find a growing audience on social media platforms. In early September, some of the 17 most popular QAnon Twitter accounts had, by Recode’s count, a combined 2.4 million followers. By early October, all but two of those 17 accounts were still active, with even more followers than the month prior.

This is consistent with outside reporting. The Washington Post, citing the nonpartisan research firm Advance Democracy, recently reported that around 93,000 QAnon accounts remained on Twitter after it announced its new policies. And the New York Times reported in mid-September that about 100 QAnon groups on Facebook had been adding 13,600 new followers a week since mid-August, when Facebook implemented its QAnon rules.

“Facebook and Twitter have shut down some QAnon groups, but it’s not nearly enough,” said Sarah Hightower, an independent QAnon researcher. “They could try to fully solve the problem by giving people offramps — so when a user is looking into this type of content, and you know that it’s dangerous, you could nudge them in another direction.”

A spokesperson for Facebook told Recode in August that another reason it’s hard for the company to crack down on QAnon is that its followers evade restrictions by changing their codewords, symbols, and calls to arms. For example, many groups changed the spelling of “Q” to “Cue.” And major influencer accounts that get banned simply start new ones, then ask other Q influencers who haven’t been purged to promote them, effectively evading the bans as other conspiracy groups have done on Facebook.

But that’s only part of the problem. A larger issue is that the movement is so sprawling that it’s hard to crack down on QAnon’s full scope. QAnon seems to absorb and interact with a long litany of other conspiracy theories in what Vice’s Anna Merlan has called “the conspiracy singularity.”

“QAnon has become a shorthand for a much broader coalition, especially since the pandemic,” explained Donovan, the Harvard researcher. That coalition includes anti-vaxxers, anti-Covid-19 lockdown protesters, 9/11 truthers, white supremacist groups, armed militias, and “some of the MAGA crowd,” she said.

Several misinformation researchers Recode spoke with said that if companies like Facebook or Twitter wanted to, they could easily go after QAnon more harshly — as they have in the past with groups like Al-Qaeda or ISIS. But one of the main holdups for these platforms is that a majority of individual QAnon supporters aren’t openly, explicitly calling for violence.

Some conspiracy theory experts, such as Joe Uscinski, a professor of political science at the University of Miami, have defended social media companies’ reluctance to completely ban QAnon discussion altogether.

“We have to understand that if we’re going to be puritanical about only allowing true stuff on social media, a lot of things are going to come down,” said Uscinski, who pointed out that if these companies started taking down QAnon content simply because it’s not demonstrably true, the same reasoning could be applied in arguing that platforms take down posts about God, for example.

“Everyone is happy to ban the other guy’s conspiracy theory, but they don’t realize that their own beliefs often qualify as conspiracy theories,” said Uscinski.

Still, many people who are worried about QAnon’s pernicious influence, including some former QAnon believers, say that social media companies need to do more to stop QAnon’s reach.

It’s not just social media that is to blame for QAnon’s rapid growth in popularity. Mainstream media plays a role, too. Many local news outlets covered “#SaveTheChildren” rallies, for example, without realizing that QAnon organizers were behind them. At the same time, outlets like NBC News and the Guardian have been doing excellent reporting dispelling QAnon theories and exposing the movement’s dangerous consequences — which makes it important for platforms to distinguish meaningful coverage of QAnon from dangerous QAnon content.

Political influence and the upcoming elections

QAnon has proven itself to be a dangerous movement, but that hasn’t stopped some politicians from seizing the political opportunity of appealing to its built-in, devoted fanbase.

“QAnon is a bunch of people who are highly networked online and have spiderwebbed out into many, many, other spaces,” said Donovan. “This is potentially reaching millions of people. If you can tap into that and have those groups support you … they become groups that are able to distribute your media and your content for free.”

President Trump has refused to denounce QAnon and has instead tacitly endorsed it and welcomed its followers by retweeting QAnon-promoting accounts hundreds of times, as Media Matters’ Alex Kaplan has diligently tracked. Meanwhile, dozens of Republican congressional candidates have supported QAnon, and one of them, Marjorie Taylor Greene — a former staunch QAnon believer who has only recently distanced herself from the theory — will likely be elected to Congress in November. Trump has welcomed Greene as a rising Republican star, inviting her as a guest to the Republican National Convention.

“Republicans are letting QAnon into their ranks — they publicly keep dangerous conspiracy theories at arm’s length but are secretly flirting with them, continuing to keep them in their caucus,” said Darwin Pham, the Deputy National Press Secretary for the Democratic Congressional Campaign Committee (DCCC).

Facing weeks of pressure and calls from Democrats to dismiss QAnon, some Republican Party leaders have started to publicly denounce QAnon while still supporting its proponents. Vice President Pence told CBS that he didn’t know much about QAnon but dismissed it “out of hand” in late August. Around the same time, Republican House minority leader Kevin McCarthy put out a statement condemning QAnon as a conspiracy theory. But McCarthy has also embraced Greene’s likely arrival in Congress and said he looks forward to her win.

“This is a vile and hateful movement; it is promoting one of the oldest lies in the world under a new guise,” said Rep. Tom Malinowski, a New Jersey Democrat who authored a bill condemning QAnon, which recently passed the House 378-18. The bill was co-authored by Rep. Denver Riggleman, one of the few Republican members of Congress who swiftly condemned the movement. In total, 17 Republicans, one independent, and no Democrats voted against the bill.

“You cannot have the fringe dictating public policy or political discourse, that is going to destroy the constitutional basis of this country,” said Riggleman, who lost a recent election in his district to a more conservative Republican contender. Riggleman added that he’s used to standing by his principles, even when that upsets some members of his party, and that if conservatives continue to embrace QAnon, “That’s the end of the Republican Party.”

The fate of QAnon as a political force could rest on how its supporters handle the outcome of this year’s election. Some worry that QAnon followers, who have proven effective at mobilizing an active base, might take matters into their own hands if Trump loses. Already, several major QAnon groups Recode reviewed have shared memes claiming the election is being “rigged” by corrupt elites, a conspiracy theory that Trump himself has perpetuated. Many QAnon supporters overlap with extremist groups like the Proud Boys and the boogaloo movement, which openly support armed militia violence and inciting a civil war.

“The danger isn’t whether QAnon followers are going to move votes or not, but whether they will accept reality,” said Michael Hayden, an investigative researcher on extremism for the Southern Poverty Law Center.

QAnon has earned its designation as a domestic terrorism threat for a reason. There is concern that QAnon followers could sow chaos on social media on Election Day and incite violence, even if only a small percentage of them take action. So while Facebook and Twitter say they’re doing everything they can to prevent the theory from spreading, the record shows that the threat of violence from QAnon is more imminent than ever, which means these platforms have an imperative to do better.

“I believe in free speech,” said Wolfe. “But I think there’s a line there when you’re pushing someone to the point where they’re driving across the country to kill politicians.”


Help keep Vox free for all

Millions turn to Vox each month to understand what’s happening in the news, from the coronavirus crisis to a racial reckoning to what is, quite possibly, the most consequential presidential election of our lifetimes. Our mission has never been more vital than it is in this moment: to empower you through understanding. But our distinctive brand of explanatory journalism takes resources. Even when the economy and the news advertising market recovers, your support will be a critical part of sustaining our resource-intensive work. If you have already contributed, thank you. If you haven’t, please consider helping everyone make sense of an increasingly chaotic world: Contribute today from as little as $3.



In defense of Quibi


Quibi was a bad idea, poorly executed. Now it’s dead, just six months after it debuted.


It was easy to be skeptical about Quibi before launch because … see above. The real surprise is that it failed so quickly. And even that surprise is a little bit couched. Once news got out that Katzenberg was trying to sell, the only question was whether he’d find a buyer or have to shutter. As I wrote last month, you don’t try to sell your startup five months after launch if things aren’t going terribly, even though Katzenberg insisted otherwise in sales pitches.

But, that said: I would like to see more Quibis in the future.

Not the concept or the execution (again, see above) but the model: Running a media business the old-fashioned way, where you ask people to make something, pay them for it, and then try to re-sell that work to someone else. Because there’s another version of running a media business — what YouTube, Twitter, and Facebook do — and I don’t feel great about that one in 2020.

To recap: Katzenberg and Meg Whitman, the CEO he hired away from Hewlett Packard, paid Hollywood studios, TV networks, and digital shops like Vox Media (which owns this site) to make short videos. Then they tried selling subscriptions to those videos to you.

That’s one way — the old way — to run a media business.

There are lots of variants, and you can debate the right way to scale those companies and how much money you need to make them work, etc. The model includes everything from your local newspaper (if it still exists) to TV networks to Spotify to Netflix. But they’re all using the same basic playbook.

There is also the new — and often much more successful — way to run a media business: Get people to give you stuff for free, get people to consume that stuff for free, and sell their attention to advertisers. You may not want to call yourself a media business — for strategic, valuation, or legal reasons — but you are most definitely in the media business. This has worked really, really well for YouTube, Twitter, and Facebook.

But as we spend a lot of time discussing these days, it’s not clear that the model that YouTube, Twitter, and Facebook use — which is dependent on ingesting as much free content as possible, and distributing as widely and quickly as possible, with as little input from the people who run those businesses as possible — is good for the rest of us.

And at the core of all the proposals to fix those businesses is the idea that they should act a lot more like … traditional media businesses. These proposals call for the people who run these platforms to pay attention to what they distribute, and even make judgment calls about whether that stuff should be distributed. And, yes: It also involves paying people who make some of the stuff they distribute.

I don’t want to belabor this thought, and I don’t want to oversell it. Quibi would have likely struggled using any model because it didn’t have stuff people wanted to see, and it didn’t have the distribution it needed to get it in front of them, anyway.

And while the Facebooks of the world run on free content, they certainly have to spend money on lots of other stuff. TikTok, for example, spent $1 billion on marketing in a single year in order to get its free videos, uploaded for free by its users, in front of people around the world.

But if you’re going to dunk on Quibi for failing so big, so fast, at least give them this: They failed the old-fashioned way. Which still has an upside.




FDA says there is no timeline for a Covid-19 vaccine


A health worker works in a lab during clinical trials for a Covid-19 vaccine at Research Centers of America in Hollywood, Florida, on September 9. Eva Marie Uzcategui/Bloomberg/Getty Images

US Food and Drug Administration Commissioner Dr. Stephen Hahn said Wednesday that the agency does not have a set timeline to review a Covid-19 vaccine.

The goal, he said, is that everyone could get a vaccine by spring. But it “really depends on a number of factors.”

“We want to expedite it,” Hahn said at a conference sponsored by the Milken Institute, a nonpartisan think tank founded by ex-banker Michael Milken.

“We’ve said that we will schedule a vaccine advisory committee to review those data. We have committed for every application to have a vaccine advisory committee,” Hahn said.

“We will make that public, as I mentioned. Our scientists will make an initial determination, will ask specific questions about the product from the vaccine advisory committee. And then we will incorporate that in our decision making,” Hahn said.

“At the end of the day, only our career scientists in the Center for Biologics Evaluation and Research will be making this decision, and they will be making it solely upon the science and data that come from the clinical trials.”

To speed up the process, Hahn said, the FDA has been working with manufacturers from day one and has stayed in touch throughout the manufacturing process, rather than reviewing everything at the end.

“We need to make sure that there’s quality and consistency and that every lot has the same ability to provide protection to all of Americans,” Hahn said. “We have a lot of confidence in the manufacturing of these developers, and we will be doing our part with respect to working with them to make sure that manufacturing can be ramped up as quickly as possible.”



Trump slams US stimulus deal on Twitter as talks continue


Trump’s evening tweets came hours after all three major stock indexes fell over the ongoing stimulus deadlock.

United States President Donald Trump expressed skepticism that an agreement could be reached with Democratic leaders on a new round of coronavirus relief, seemingly torpedoing hopes for a stimulus plan even as talks continue between Democratic House Speaker Nancy Pelosi and Treasury Secretary Steven Mnuchin.

“Just don’t see any way Nancy Pelosi and Cryin’ Chuck Schumer will be willing to do what is right for our great American workers, or our wonderful USA itself, on Stimulus,” Trump wrote on Twitter Wednesday evening. “Their primary focus is BAILING OUT poorly run (and high crime) Democrat cities and states….Should take care of our people. It wasn’t their fault that the Plague came in from China!”

The tweets came after all three major US stock indexes fell on Wednesday amid dwindling hopes of a stimulus plan before Americans head to the polls on Election Day, November 3.

Pelosi has proposed $2.2 trillion to help struggling businesses and families, while the White House rolled out a $1.8 trillion proposal, which Trump has since said he would be willing to go beyond. But Pelosi and Mnuchin are reportedly getting closer to a deal, with the pair due to speak again on Thursday, Pelosi’s spokesman said in a tweet.

Experts have warned the US needs another round of financial relief for struggling businesses and families in order to recover from the pandemic’s economic downturn. The last round of aid expired at the end of July, including an additional $600 per week in federal unemployment benefits meant to shore up workers in addition to state aid.

Earlier this month, Federal Reserve Chairman Jerome Powell warned that the country’s entire economic recovery is in danger of derailing if the government does not step up to the plate.

“Too little support would lead to a weak recovery, creating unnecessary hardship for households and businesses,” Powell said during an October 7 event with economists and strategists.

Trump has frequently used Twitter to weigh in on the continuing stimulus talks. On the same day Powell spoke, Trump tweeted that he had told his representatives to halt stimulus negotiations with Democrats until after the election – before making an about-face hours later and urging legislators to pass targeted relief for businesses and households.

Last week, he tweeted, “STIMULUS! Go big or go home!!!” even as Senate Republicans expressed support for a pared-down aid package.

On Wednesday, Trump’s chief of staff, Mark Meadows, told Fox Business “the president’s willing to lean into this” with Republican senators if a deal is reached.

Talks are continuing as the timeframe for a pre-Election Day vote narrows, with investors around the world closely watching what comes out of Pelosi and Mnuchin’s discussions Thursday.
