Facebook quietly tests fake news flagging

Signs of Facebook’s much-anticipated war on fake news are bubbling up into the news feeds of some users. The fake news flag tests appear to have been around for over a week, though they seem to be vanishing as quickly as they appear. That behavior is in line with the way Facebook frequently tests new product features via limited, live rollouts to small swaths of its huge user base.


The warning message, highlighted in red, states “This website is not a reliable news source.” The example above, which appeared live today, flagged a story from occupydemocrats.com, a site included on a widely circulated list of fake news sources. The headline reads “2,000 Veterans Just Arrived At Standing Rock To Form ‘Human Shield’ Around Protesters.”

Notably, this particular news story checks out and was also picked up by The New York Times and Reuters, among other reputable fact-checking outlets. That suggests that Facebook might be flagging entire websites rather than individual stories in its effort to stem the flow of fake news on its platform.


Hacker News user ideonexus noted the same flag feature nine days ago:

“Coincidentally, I was on FB today and started seeing a red flag on two friends’ posts reading, “This website is not a reliable news source. Reason: [Reason]” with reasons like state-sponsored news and unclassified. The flag appeared on an article about Pokemon Go that was clearly sensationalist speculation and a conspiracy website about George Soros. The div tag in the post had a class “bsAlert” and, my favorite part, a poop icon beside it. None of my other friends are seeing the flag, so I suspect I’m one of the lucky A/B testers.”
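The only concrete technical detail in that account is the “bsAlert” class name. Purely as a hypothetical illustration (the file name and everything else below are assumptions, not anything Facebook has documented), this is roughly how one could scan a saved copy of a feed page for that class:

```python
# Hypothetical sketch: look for elements carrying the "bsAlert" class the
# Hacker News commenter describes, in a locally saved copy of a feed page.
# The file name and surrounding markup are assumptions for illustration only.
from bs4 import BeautifulSoup

with open("saved_feed.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f.read(), "html.parser")

for div in soup.find_all("div", class_="bsAlert"):
    # Print whatever warning text is attached to each flagged post.
    print(div.get_text(strip=True))
```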

We reached out to Facebook about the test feature and will update when we get comment.


Facebook Alone Didn’t Create Trump—The Click Economy Did

In mid-October I wandered into a Trump field office in Youngstown, Ohio and met Coni Kessler, a kind 75-year-old Youngstown native with penciled-on eyebrows and a Women for Trump button on her Trump 2016 t-shirt. She sat me down in a chair just beside her and, for more than an hour, explained why she detested Hillary Clinton and was ecstatic to vote for Trump this year.

Clinton, she told me, is an atheist who wears an earpiece during debates so billionaire George Soros can feed her talking points. The day Clinton collapsed into the back of her van when she was sick with pneumonia? According to Kessler, the Clintons hired a young actress to run up and give Clinton a hug for a staged photo following the collapse. Kessler also said she’d seen videos of Bill Clinton raping an underage girl, but that the video had mysteriously disappeared. She wondered why no one was talking about Bill Clinton’s illegitimate, half-black son. And she said that whenever she talks negatively about Clinton online, “they”—presumably the technology overlords—shut her phone down.

At some point, I stopped Kessler to ask her where she’d gotten all these stories, stories I knew were false Clinton conspiracy theories. Her answer: “It was on my Facebook page.”

Kessler’s stories were extreme, yes, but I should have known then that Trump enjoyed the support of enough people like Kessler, being fed an entirely different narrative about the two candidates, to help carry this election. But I missed it completely. Because of her conviction that these stories were true, I regarded Kessler as part of a fringe that couldn’t possibly gain the backing of enough voters to win. I missed this because, well, I was living in my own bubble, too.

I’ll stop here to say that, no, this is not another story about how Facebook gave us President-elect Trump. By now, there are plenty of solid stories like that, and they’re not wrong. To be sure, Facebook played an instrumental role in this election by allowing the kind of fake news stories Kessler described to me to proliferate and by giving all of us the option of only seeing the content we “like.” But it would be wrong to lay the election results entirely at Facebook’s feet. Trump won in large part because he mastered this particular moment in social media’s evolution in a way that no presidential candidate—not even Barack Obama, whose embrace of social media was historic in its own way—has ever done before.

We may not know exactly how rich our soon-to-be president really is. But in an economy where clicks are currency, Trump is King Midas.

The Popularity Vote

Nicco Mele was as clueless as anybody that Trump was going to win the election. This is strange, given that Mele in fact predicted just that in his 2012 book, The End of Big. Mele describes how the internet enabled insurgent political candidates like Barack Obama in 2008 to succeed by corroding traditional top-down political power structures. That corrosion is in many ways a good thing, he wrote, because the political class is corrupt and divisive. “Yet in hastening the demise of parties and empowering upstarts, radical connectivity also paves the way for a dangerous populism to take hold of our political system,” he says. “We get exciting candidates like Barack Obama who can shake up the system, but also extremist or fringe candidates who if elected could bring the whole house down.”

“That’s exactly what’s happened,” Mele says today.

Like Barack Obama, Trump used social media as a direct line between himself and the American people. But unlike Obama in ‘08, Trump had spent years cultivating this community with his tirades about everything from President Obama’s birthplace to Robert Pattinson and Kristen Stewart’s relationship. By the time he ran for president, Trump’s online following was already strong.

That mastery proved powerful. From last summer onward, Trump regularly dominated the conversation on both Facebook and Twitter. But his frequent online outbursts didn’t just captivate voters. They captivated the media. “He had an ability to communicate in an unfiltered fashion like no candidate before,” says Patrick Ruffini, a leader of the #NeverTrump movement and co-founder of the polling group Echelon Insights. “She was background noise in a way.”

Clinton herself, on the other hand, often seemed to be a step removed from her online persona. On social media, her large digital team crafted messages that referred to Clinton in the third person. These expertly manicured status updates included tastefully designed graphics. But they sometimes didn’t seem to include the candidate herself, at least not in a way that felt like you were in direct contact with a real person. As we wrote during the primaries, the tone of Clinton’s emails, which she didn’t intend to be viewed by the public, gave a better sense of who she is as a person than her social media posts ever did. They felt like the authentic Clinton.

For better or worse, Trump’s online persona has always been raw. “He seemed like a guy who was really upset at 3 in the morning,” Cenk Uygur, host of the left-leaning web series The Young Turks, says of Trump’s infamous “sex tape” tweet. “We were all on the edge of our seats for the next Trump tweet. The American people like to be entertained. Yes, it’s a show. Yes, it’s a popularity contest. Literally.”

The mistake much of the media made, Uygur says, was believing that Trump was undoing his candidacy with each 140-character missive. They made the same mistake in believing that Senator Bernie Sanders was too rumpled and unpolished to ever become president. “That was not the bug. That was the feature,” he says. “It’s all about authenticity.”

High on Outrage, Low on Substance

It’s not just that Trump seemed authentic online. It’s also that he seemed angry. While Clinton preached optimism, Trump preached despair. According to Jonah Berger, author of Contagious: Why Things Catch On, anger, anxiety, and other so-called “high arousal emotions” are among the most important ingredients of virality online.

In a 2011 paper, Berger tracked how people shared every New York Times article over a three-month period. He found that emotions like awe, anger, and anxiety went viral far more readily than sadness and contentment.

“Whether this was his strategy or they lucked into it, I don’t know,” Berger says. “But they’ve been playing on these emotions.”

We also know that people often share viral content without even reading what they’re sharing. In 2011, web traffic outfit Chartbeat published a study that showed a large percentage of people who clicked on an article on Slate never even scrolled down the page. A vast majority only ever made it halfway.

“They’re reading the headlines, getting an emotional reaction, and they’re passing them along,” Berger says.

It’s a trend that feels tailor-made for a candidate like Trump, whose campaign was high on emotion but low on detail. Clinton’s campaign, meanwhile, used fact-checking as its first line of defense. It seemed logical at the time. Online, most facts are findable, so why not use them? But it could be that all that work only served to heap more detail on an electorate that was scarcely making it through the headlines. That’s not a behavior limited to Trump supporters. That’s everyone.

The Facebook Effect

As I mentioned before, Trump’s rise was not all Facebook’s doing. But the social networking giant has had a huge influence. Between March 23, 2015, when Ted Cruz became the first declared presidential primary candidate, and November 1, 2016, 128 million people in the US generated 8.8 billion Facebook likes, posts, comments, and shares related to the election. According to Pew Research, 44 percent of Americans get their news from the site.

That fact has not gone unnoticed by spurious purveyors of so-called news, who now use Facebook’s huge scale to build an audience on the backs of stories that don’t have to be factual, just shareable. In a study of partisan news sites, BuzzFeed found that some right-wing Facebook pages published false or misleading information 38 percent of the time, while left-wing pages did so nearly 20 percent of the time.

“Whatever is true is not necessarily viral, and whatever is viral is not necessarily true,” says Vincent Hendricks, director of the Center for Information and Bubble Studies at the University of Copenhagen. But Facebook’s algorithms prize virality. They’re trained to show people the kind of news they’re going to like, allowing so much of this fake news to go unchallenged.

Facebook is aware of its fake news problem and is trying to solve it with technology. “In News Feed we use various signals based on community feedback to determine which posts are most likely to contain inaccurate information, and reduce their distribution,” Adam Mosseri, a Facebook vice president, said in a statement. “In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing.”

But the echo chamber effect is harder to overcome. Human beings have always sorted themselves into like-minded groups, and social media allows that sorting to happen with the speed of a few clicks. As writer Bill Bishop describes in his book The Big Sort, Americans are increasingly choosing to live in places where most people think, believe, and vote as they do. Which shouldn’t be so surprising, considering most of us choose friends because of what we have in common. In fact, some evidence even suggests that Facebook may expose us to more opposing viewpoints than we might otherwise see in the real world.

Still, it doesn’t take much digging through the election exit polls to see the country is already deeply divided between red and blue states, cities and rural areas, ages and races. Naturally that division manifests itself online, too. If Trump’s stunning upset teaches us anything, it’s that we were blind to what the other side was saying. The good news? For better or worse, the other side is only ever a click away. Donald Trump figured out early on how to take advantage of that digital closeness. To succeed, his opponents will have to figure out how to do the same.



Facebook Live will host a collaborative escape room game today

The hour-long interactive murder mystery focuses on a security guard at an exclusive art gallery who finds himself trapped in a room with a dead body. The guard (who will be wearing a head-mounted camera during the stream) has 30 minutes to piece together what happened before he is framed for the crime himself.

Sounds like any other escape room puzzle, right? Well, because Framed is being hosted on Facebook Live, thousands (if not millions) of users can potentially get involved. If you do decide to tune in, you’ll be encouraged to debate clues in the comments and asked to vote using Facebook’s Reactions emoticons. The Guardian reports that moderators will also be on hand to wade through the comments and pick out the answers that will help the protagonist progress through the story.

With a potentially massive audience working together, the story’s creators think it’s very possible that the crime will be solved sooner than expected: “In a way that will just prove our premise, that the audience are a bunch of brilliant amateur detectives,” UKTV’s Sam Pearson told The Guardian.

Framed will launch on October 13th at 3pm ET/8pm BST on Alibi’s Facebook channel.


Facebook has been exaggerating video views for two years

In a post on its advertising help center, a Facebook employee announced the discrepancy and explained the difference between how it defined the statistic and how it was actually measured.

We had previously *defined* the Average Duration of Video Viewed as “total time spent watching a video divided by the total number of people who have played the video.” But we erroneously had *calculated* the Average Duration of Video Viewed as “the total time spent watching a video divided by *only* the number of people who have viewed a video for 3 or more seconds.”

In response, Facebook says it’s introducing two new metrics:

Video Average Watch Time: the total watch time for your video, divided by the total number of video plays. This includes plays that start automatically and on click. This will replace the Average Duration of Video Viewed metric.

Video Percentage Watched: reflects the percentage of your video someone watches per session, averaged across all sessions of your video where the video auto-played or was clicked to play. This will replace the Average % Video Viewed metric.
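To see why the miscalculation skewed the numbers upward, here is a toy example with made-up figures (none of these numbers come from Facebook): dropping sub-three-second plays from the denominator can only raise the average.

```python
# Hypothetical watch times in seconds for ten plays of one video; many
# auto-plays are abandoned almost immediately. Illustrative numbers only.
watch_times = [0, 1, 1, 2, 2, 4, 10, 20, 30, 60]

total_watch_time = sum(watch_times)  # 130 seconds in total

# The metric as Facebook *defined* it: divide by every play.
avg_all_plays = total_watch_time / len(watch_times)  # 13.0 seconds

# The metric as it was *calculated*: divide only by plays of 3+ seconds.
long_plays = [t for t in watch_times if t >= 3]
avg_long_plays = total_watch_time / len(long_plays)  # 26.0 seconds

print(avg_all_plays, avg_long_plays)  # here the reported average is double
```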

As a user, this probably doesn’t affect you much. But even though Facebook says the discrepancy did not affect billing, advertisers who relied on the numbers and outlets (like Engadget) who posted video to the platform might have more questions. Bloomberg points out that Facebook is set to meet top advertisers next week during the Advertising Week conference — we probably haven’t heard the last of this.

Privacy groups cry foul over WhatsApp sharing data with Facebook

Specifically, the privacy group says it’s planning to file a complaint against the companies for violating statutes of the Federal Trade Commission Act that warn against “unfair or deceptive acts or practices.” Here, EPIC is accusing WhatsApp of lying to users when it promised its 2014 sale to Facebook wouldn’t affect its privacy policy — which pledged never to share or sell “personally identifiable information” like the phone number, name, and profile information shared under the new policy.

WhatsApp says it wants to share limited data with Facebook to test new features designed to help users “communicate with businesses,” such as receiving fraud notifications from a bank or flight delay notifications from an airline. WhatsApp also maintains that all messages will still be fully encrypted and unreadable by both Facebook and WhatsApp employees.


Users also have up to 30 days to opt out of the data-sharing portion of the new terms of service, but according to EPIC, that doesn’t shield the companies from the FTC’s consent order. The order apparently requires the company to obtain opt-in consent from users before asking them to agree to the new terms. WhatsApp does technically provide an opt-in option, but it’s not clear how to access it: one must click “read” to view the terms-of-service agreement before the opt-in checkbox appears.

It might sound like privacy groups are splitting hairs, but how user data is handled can have unforeseen legal consequences. It’s not just special interest groups who are concerned — the United Kingdom’s Information Commissioner is also investigating the WhatsApp policy change to make sure it complies with the Data Protection Act. It’s a complicated little mess, but Facebook, at least, is confident it’s on the right side of the law. “WhatsApp complies with applicable laws,” a spokesperson said in a Motherboard interview. “As always, we consider our obligations when designing updates like this.”
