
Facebook is torn between the rise of fake news and its business goals

Faced with soaring fake news in the days following the November 3rd US election, Facebook decided to temporarily revise its algorithm, giving priority to traditional media such as CNN and the New York Times, which score well on its internal "quality of information" metric. This temporary change to the news feed reduces the reach of hyperpartisan sites like Breitbart and Occupy Democrats.

“Many variables play a role in every decision we make, but all of them aim to create the best possible experience for people,” said Facebook spokesman Joe Osborne in an interview with The Verge.


Promote a healthier and less divisive platform

Facebook employees have asked company executives to keep the algorithm change in place for the long term, in order to foster a healthier, less divisive news platform. Executives, however, are torn between fighting disinformation and the platform’s economic growth: they want to limit polarization on the social network while preserving strong growth, such as this year’s 22% increase in revenue over last year.

Employees are less proud of Facebook

While Facebook employees’ salaries are among the highest in tech, they are less proud of the company than in previous years. According to a poll reported by the New York Times, only half of them believe Facebook is having a positive impact on the world, down from about three-quarters at the beginning of the year.

Last May, amid the surging Black Lives Matter movement, staff did not hesitate to voice their disapproval after Mark Zuckerberg controversially decided not to remove Donald Trump’s inflammatory post: “when the looting starts, the shooting starts.” According to Facebook’s CEO, the post did not violate the social network’s rules, even though those rules prohibit hateful statements and direct attacks on an individual or group.

Fight fake news upstream

After this summer’s fake-news awareness campaign and the rollout of a retroactive fake-news alert system, Facebook decided to limit the distribution of content likely to trigger waves of hate speech. According to Guy Rosen, Facebook’s vice president of integrity, the software used to detect hateful content on the social network could “unfairly penalize publishers for hateful comments left by users, or allow bad actors to reduce a page’s reach by bombing it with toxic comments.”

After a tumultuous election riddled with misleading and inaccurate statements, Facebook employees hope the algorithm experiment will continue, making for a calmer experience on a platform used by 2.74 billion people. Time will tell whether this algorithmic change can help contain the scourge of fake news that threatens democracy, and what impact it will have on other media.
