Facebook reportedly killed a feature meant to expose its users to different perspectives over fears of being seen as having a liberal bias, according to a Wall Street Journal report published Sunday.
Facebook’s most prominent conservative executive and global policy head Joel Kaplan reportedly pushed for the feature to be quashed, according to the Journal.
Facebook has long faced accusations of having a liberal slant, which have ramped up as the company’s top executives have had to face lawmakers to testify over its use of user data. Kaplan, in particular, has emerged as a controversial figure as the company has entered the limelight, most notably after attending Justice Brett Kavanaugh’s hearing over alleged sexual misconduct. The Journal report found that Kaplan, a former aide to President George W. Bush, had key input on the types of information that would be exposed to Facebook’s users.
Kaplan reportedly raised concerns over an internal analysis by Facebook about how its users were exposed to a range of information. The report found that Facebook’s right-leaning users were less likely to be exposed to different viewpoints and therefore more polarized, according to the Journal. Facebook’s so-called Common Ground initiative, which proposed in part to boost articles in the News Feed that were well liked and commented on by a range of users across the political spectrum, was meant to bring a greater range of perspectives to users. Kaplan argued that the initiative would quiet conservative voices to a disproportionate degree, the Journal reported.
In a statement, a Facebook spokesperson said, “Understanding a wide variety of perspectives is an important part of our product development process and is integral for building products and services that serve everyone. The public policy team, led by Joel Kaplan, is tasked with understanding the perspectives of groups, regulators, governments, NGOs and other stakeholders from around the world and using that understanding to inform product discussions and decisions. The team plays an essential role in ensuring that we adopt objective standards and that our policies are applied fairly and consistently.”
Facebook has tried other ideas to stop the spread of misinformation on its platform and help users distinguish quality information from that which is less well-sourced. It has added a number of third-party fact-checkers to help it rank information in the News Feed and give context to the source of various articles. The program has faced criticism of its own, including that it can only review a limited amount of information based on the capacity of its fact-checkers.
Subscribe to CNBC on YouTube.
Watch: Inside Facebook’s effort to fight election manipulation