BudmanTX
Well-Known Member
and from the looks of Facebook being down........i wouldn't be surprised....
off their twitter feed....
After four years of almost continuous scandal, Facebook is approaching its latest controversy over political polarization and the toxic effects of social media in a more aggressive and defiant way than it has previously, say current and former employees, including executives who helped shape the company’s earlier responses.
Gone is the familiar script in which chief executive Mark Zuckerberg issues a formal apology — sometimes in long blogs on his personal Facebook page or over live-streamed video for a Congressional hearing — then takes responsibility and promises change.
In its place, the company has deployed a slate of executives to mount a public defense while quibbling with the details of allegations from Frances Haugen, the former project manager who left Facebook with tens of thousands of documents. Those documents detail the company’s research into how it spreads hate, incites violence, and, through its Instagram subsidiary, contributes to teenage girls’ negative body images and suicidal thoughts.
Haugen, who revealed her identity in a nationally televised interview Sunday night, has brought a formal whistleblower complaint against the company with the Securities and Exchange Commission and will testify Tuesday on Capitol Hill.
“They’ve moved away from talking about responsibility and an apologetic stance to one that is much more aggressive, defensive and dismissive of the whistleblower’s claims,” said Brian Boland, a former vice president who resigned last year over concerns that the company’s products were accelerating polarization.
“When our work is being mischaracterized, we’re not going to apologize; we’re going to defend our record,” said Facebook spokesman Tucker Bounds.
The unapologetic defenses are part of a broader strategy to fight Facebook’s mounting woes, including potential antitrust legislation in the United States, a sprawling antitrust case brought by the U.S. Federal Trade Commission, and impending laws in Europe that could significantly hurt the company’s business. On Monday, the company asked a federal judge to dismiss the FTC case while it contended with another crisis during a prolonged and still unexplained outage across all its apps, which include WhatsApp and Instagram.
Fueling the approach, say the former employees — who spoke on the condition of anonymity to describe sensitive matters — is a desire by Zuckerberg to distance himself from the social network’s problems as he focuses on virtual reality and his ambitions to make Facebook a major player in hardware production.
The former company employees say senior executives also have grown weary of their own repeated public apologies and have come to believe the company needs to claim more credit for what it has done, much as a political campaign would. Others said that the company was only responding in kind to stepped-up attacks from all sides, including leakers making incomplete assertions.
Taking the lead has been Nick Clegg, a former British deputy prime minister who is Facebook’s vice president for communications and policy. In a memo to employees Friday, he suggested that there was something about the United States in particular that causes polarization, as opposed to Facebook use in general.
“Mature democracies in which social media use is widespread hold elections all the time — for instance Germany’s election last week — without the disfiguring presence of violence,” Clegg wrote in the memo, which was first reported by the New York Times.
The defiant tone of the memo — as well as its aggressive timing, coming hours ahead of Haugen’s appearance on CBS’s “60 Minutes” — caught the attention of former employees and also some longtime critics.
“They’ve signaled that they’re going to apologize less,” said Rashad Robinson, president of civil rights group Color of Change, which in the past has organized advertiser boycotts against the company. “They know they aren’t accountable, and now they aren’t going to be apologizing, either.”
The claims from Haugen echo ones leveled against the company for years but are unusual for the thousands of pages of documentary evidence she downloaded from the company’s own computer systems and shared with lawmakers and news organizations, including the Wall Street Journal, which first reported on her allegations.
Although the company’s stock price was down nearly 5 percent after the unusual outage on Monday, the company’s previous controversies have not seriously undermined its business. Facebook now boasts 3.5 billion users — more than half of the world’s Internet users — across its subsidiaries, including Instagram, Messenger and WhatsApp, and its reach is so great that the outage was treated as a major news story across the world.
Starting with revelations in 2017 that Russian operatives had used Facebook to interfere in the U.S. presidential election a year earlier, the company has been subject to a degree of public scrutiny other social media companies largely have avoided. It has been subjected to advertiser boycotts, viral calls for users to quit the service, civil rights allegations, a record penalty from the FTC for privacy failures and harsh criticism by leaders of both major political parties.
There have also been numerous whistleblowers, including some former executives who have spoken out against the company. One of them, Sophie Zhang, also helped bring forward a trove of internal documents that alleged that the company had turned a blind eye to foreign governments that spread disinformation.
Yet each time, the company has weathered the attacks, its stock price maintaining steady growth, with only brief declines, that has made Facebook one of the 21st century’s leading corporate success stories. Even with Monday’s decline, Facebook’s market capitalization of nearly $920 billion makes it among the world’s most valuable companies.
Investors who resisted the urge to sell on July 26, 2018, for instance — following a $100 billion decline that was the largest in Wall Street history and widely blamed on Facebook’s privacy missteps — saw their holdings roughly double in value before the most recent scandal-related downturn.
“There’s a ton of horrible things on Facebook, and there’s a lot of valuable stuff on Facebook,” said Claire Wardle, U.S. director of First Draft, an organization dedicated to fighting misinformation that has a partnership with the company but also has criticized its failings. “The stock price keeps going up,” she said. “From a regulatory standpoint, I don’t see anything coming any time soon.”
Boland said top corporate leaders are blinded to the platform’s problems because they still believe that the company does far more good than harm. That means efforts to understand and mitigate the problems aren’t taken seriously enough or are ended prematurely, he said.
That matches Haugen’s allegation that numerous steps the company took to curb misinformation and the spread of extremist content ahead of the 2020 presidential election were withdrawn too soon when ongoing implementation might have calmed the vote’s turbulent aftermath, which culminated with the Jan. 6 attack on the U.S. Capitol.
The “research points” uncovered by the whistleblower “are not definitive, but they are red flags,” Boland said. “If you were really concerned about the impact of your product, you would want to chase those red flags and see if they were true or not. You would not be satisfied if your car ran well only 80 percent of the time.”
Haugen revealed evidence that a change in the formula Facebook uses to direct content to a user’s news feed resulted in the belief among European political parties that they had to adopt more extreme stances to reach audiences on Facebook.
She has also argued that the company’s algorithms have the effect of surfacing extreme content that entices users to click more — and generate more profit for the company — despite evidence that such tactics intensify partisan feelings and distort democratic political debate.
Haugen’s complaint to the SEC that the company’s private research contradicted its public representations to investors was embraced by the company’s critics in Congress.
“Facebook certainly misled and deceived the public, and so their investors may well have been deceived as well,” Sen. Richard Blumenthal (D-Conn.) told The Washington Post on Sunday. He called on the SEC to take her allegations “very seriously.”
Both Democrats and Republicans have lambasted Facebook for years, amid polls showing the company is deeply unpopular with much of the public. But despite that, little has been done to bring the company to heel.
Former White House adviser Steve Bannon suggested on Tuesday that Facebook whistleblower Frances Haugen is part of a "psyops" campaign to destroy former President Donald Trump's MAGA movement.
Bannon spoke out on his daily broadcast as Haugen was testifying before a Senate hearing on Facebook's history of putting profit over its users' health.
During an interview with GETTR CEO Jason Miller, Bannon alleged that Haugen's testimony was a psychological operation to "shut down the [former] president's movement."
But Miller disagreed.
"People are going to look back at what's happened the past two weeks with the Facebook whistleblower, with what's happening with the Wall Street Journal exposé," Miller said. "The hearing that Marsha Blackburn is leading in the Senate right now, this is the Goths sacking Rome."
Bannon pushed back by pointing to Haugen's 60 Minutes interview as evidence of "psyops."
"You do agree that this is psyops?" Bannon asked. "This whistleblower is not really -- she's dumping MAGA information. Correct? I agree the Goths are sacking Rome. But the oligarchs and the mandarins in the mainstream media are trying to protect this by taking down the Trump movement."
"Two things can be true at the same time," Miller agreed.
The GETTR CEO went on to suggest that Monday’s five-hour worldwide Facebook outage had been deliberately orchestrated by the company.
"The fact that a company that big, that powerful would do some multi-platform upgrade all at the same time for Facebook, Instagram, WhatsApp -- it's absurd. It defies logic, Steve. I'm not buying it. I'm not the hardcore Area 51 truther but I'm saying people aren't talking about targeting pre-teen girls anymore."
Testifying before a Senate committee looking into allegations that Facebook has been putting profits before the safety of Americans, whistleblower Frances Haugen noted an abrupt change in policy immediately following the Jan 6th Capitol riot that led U.S. lawmakers to flee for their lives.
Speaking with Sen. Amy Klobuchar (D-MN), Haugen -- whose explosive allegations have rocked the foundations of the social media behemoth -- was asked about the political implications of what Facebook was promoting prior to the November presidential election that saw Donald Trump fail to be re-elected.
After explaining how Facebook staffers tweak the algorithms to promote continued engagement, she explained changes that were made prior to the election that were suddenly abandoned after Jan 6th.
"You said Facebook implemented safeguards to reduce misinformation ahead of the 2020 election, but turned off those safeguards right after the election," the Democratic senator prompted. "And you know that the insurrection occurred January 6th. Do you think they turned it off because it was reducing profits?"
"Facebook has been emphasizing a false choice," Haugen replied. "They said the safeguards in place before the election implicated free speech. The choices happening on the flat platform were about how reactive and how twitchy was the platform -- how viral was that platform. Facebook changed those safety defaults in the run-up to the election because they knew they were dangerous and because they wanted that growth back."
"They wanted the acceleration of the platform back after the election, they returned to their original defaults," she continued. "And the fact that they had to 'break the glass' on January 6th and turn them back on, I think that is deeply problematic."
Something is wrong, he's only wearing a single button down shirt. It's pretty telling that the right wing insurrection propagandists are so worked up about someone exposing hate speech and other manipulative social media methods that they think are out to step on their con.
https://www.rawstory.com/steve-bannon-facebook-psyops/
Fake News!!! It looks like it is blue under black shirt. lol but you made me look.
WASHINGTON (AP) — Nearly all Americans agree that the rampant spread of misinformation is a problem.
Most also think social media companies, and the people who use them, bear a good deal of blame for the situation. But few are very concerned that they themselves might be responsible, according to a new poll from The Pearson Institute and the Associated Press-NORC Center for Public Affairs Research.
Ninety-five percent of Americans identified misinformation as a problem when they’re trying to access important information. About half put a great deal of blame on the U.S. government, and about three-quarters point to social media users and tech companies. Yet only 2 in 10 Americans say they’re very concerned that they have personally spread misinformation.
More — about 6 in 10 — are at least somewhat concerned that their friends or family members have been part of the problem.
For Carmen Speller, a 33-year-old graduate student in Lexington, Kentucky, the divisions are evident when she’s discussing the coronavirus pandemic with close family members. Speller trusts COVID-19 vaccines; her family does not. She believes the misinformation her family has seen on TV or read on questionable news sites has swayed them in their decision to stay unvaccinated against COVID-19.
In fact, some of her family members think she’s crazy for trusting the government for information about COVID-19.
“I do feel like they believe I’m misinformed. I’m the one that’s blindly following what the government is saying, that’s something I hear a lot,” Speller said. “It’s come to the point where it does create a lot of tension with my family and some of my friends as well.”
Speller isn’t the only one who may be having those disagreements with her family.
The survey found that 61% of Republicans say the U.S. government has a lot of responsibility for spreading misinformation, compared to just 38% of Democrats.
There’s more bipartisan agreement, however, about the role that social media companies, including Facebook, Twitter and YouTube, play in the spread of misinformation.
According to the poll, 79% of Republicans and 73% of Democrats said social media companies have a great deal or quite a bit of responsibility for misinformation.
And that type of rare partisan agreement among Americans could spell trouble for tech giants like Facebook, the largest and most profitable of the social media platforms, which is under fire from Republican and Democratic lawmakers alike.
“The AP-NORC poll is bad news for Facebook,” said Konstantin Sonin, a professor of public policy at the University of Chicago who is affiliated with the Pearson Institute. “It makes clear that assaulting Facebook is popular by a large margin — even when Congress is split 50-50, and each side has its own reasons.”
During a congressional hearing Tuesday, senators vowed to hit Facebook with new regulations after a whistleblower testified that the company’s own research shows its algorithms amplify misinformation and content that harms children.
“It has profited off spreading misinformation and disinformation and sowing hate,” Sen. Richard Blumenthal, D-Conn., said during a meeting of the Senate Commerce Subcommittee on Consumer Protection. Democrats and Republicans ended the hearing with acknowledgement that regulations must be introduced to change the way Facebook amplifies its content and targets users.
The poll also revealed that Americans are willing to blame just about everybody but themselves for spreading misinformation, with 53% of them saying they’re not concerned that they’ve spread misinformation.
“We see this a lot of times where people are very worried about misinformation but they think it’s something that happens to other people — other people get fooled by it, other people spread it,” said Lisa Fazio, a Vanderbilt University psychology professor who studies how false claims spread. “Most people don’t recognize their own role in it.”
Younger adults tend to be more concerned that they’ve shared falsehoods, with 25% of those ages 18 to 29 very or extremely worried that they have spread misinformation, compared to just 14% of adults ages 60 and older. Sixty-three percent of older adults are not concerned, compared with roughly half of other Americans.
Yet it’s older adults who should be more worried about spreading misinformation, given that research shows they’re more likely to share an article from a false news website, Fazio said.
Before she shares things with family or her friends on Facebook, Speller tries her best to make sure the information she’s passing on about important topics like COVID-19 has been peer-reviewed or comes from a credible medical institution. Still, Speller acknowledges there has to have been a time or two that she “liked” or hit “share” on a post that didn’t get all the facts quite right.
“I’m sure it has happened,” Speller said. “I tend to not share things on social media that I didn’t find on verified sites. I’m open to that if someone were to point out, ‘Hey this isn’t right,’ I would think, OK, let me check this.”
___
The AP-NORC poll of 1,071 adults was conducted Sept. 9-13 using a sample drawn from NORC’s probability-based AmeriSpeak Panel, which is designed to be representative of the U.S. population. The margin of sampling error for all respondents is plus or minus 3.9 percentage points.
A man who helped design and run a pro-Trump fake news empire during the 2016 presidential campaign now regrets his actions -- and he tells Ars Technica that he wants to make amends.
The hacker in question is named Robert Willis and he tells Ars Technica that he was hired in 2015 by a company called Koala Media whose stated goal was denying Hillary Clinton the presidency.
While working at the company he created a network of fake news websites that all syndicated one another's content and then spammed stories out to Facebook.
They quickly found that Trump supporters on Facebook were highly engaged and would share any pro-Trump, anti-Clinton articles they wrote, no matter how outlandish.
"Pieces that ran... claimed, among other things, that Clinton had plans to 'criminalize' gun owners, to kill the free press, to forcefully 'drug' conservatives, to vaccinate people against their wills, to euthanize some adults, and to ban the US flag," notes Ars Technica.
Willis quit the company in 2017, and Koala Media is now a shell of its former self after having been thoroughly banned from Facebook.
That said, Willis says he sees the same tactics that worked to elect Trump now being used to sow doubt about the novel coronavirus vaccines.
"COVID has shown me the deadly side of fake news and anti-vaccination people," Willis explains. "After multiple conversations with my father, who refuses to wear a mask or get vaccinated, I was getting very concerned. I asked him what sites he would read the conspiracy-based things on, and he mentioned the website that ran the network I had built the machine on."
Read the whole interview with Willis here.