If antitrust regulators were in any doubt about how complicit social media giant Facebook is in hate speech, misinformation, and violence, a former employee, Frances Haugen, may have just pointed them firmly in the right direction.
In a one-hour-long interview with 60 Minutes, Frances Haugen claims that whenever the social media conglomerate (Facebook's suite of apps) faced a choice between profit and ethics, it chose profit every single time.
The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.
Frances Haugen
Frances initially filed the complaints with federal law enforcement anonymously last month, keeping the company guessing as to the informant's identity. She was referred to simply as The Facebook Whistleblower.
Before quitting the company, Frances made sure to take incriminating evidence with her: thousands of pages of internal research showing that the company lied about making significant progress against hate, violence, and misinformation.
Frances, who says she joined the company to fight misinformation after losing a friend to online conspiracy theories, was disappointed when the company reversed course once the US elections passed without major violence, which led to her distrust of the company's real motives.
According to the now-famous whistleblower, the crux of the problem with Facebook lies in a 2018 change to its algorithms, which now consistently recommend content similar to what a user already consumes, even though there are millions of other trajectories a user's browsing activity could follow.
The effect is that the feed amplifies whatever the user is already interested in and drowns out contrary content. Repeated often enough, this gives the average user the false impression that the information they keep seeing is nearly the only information out there, creating polarized audiences. In exchange for keeping users engaged on the platform for longer, Facebook essentially takes away their freedom to choose.
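The amplification loop described above can be sketched in a few lines of code. To be clear, this is a hypothetical toy ranker, not Facebook's actual (proprietary) 2018 algorithm; the post structure and the `rank_feed` helper are illustrative assumptions. It simply shows how weighting posts by a user's past engagement pushes familiar topics to the top and crowds out everything else.

```python
# Toy illustration of engagement-weighted feed ranking -- NOT Facebook's
# real algorithm. Posts on topics the user has engaged with before are
# scored higher, so the feed keeps reinforcing existing interests.

from collections import Counter

def rank_feed(posts, engagement_history, top_n=3):
    """Rank posts by how often the user engaged with each post's topic."""
    topic_weight = Counter(engagement_history)  # hypothetical engagement signal
    # Familiar topics float to the top; unfamiliar ones sink out of view.
    return sorted(posts,
                  key=lambda p: topic_weight[p["topic"]],
                  reverse=True)[:top_n]

posts = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "cooking"},
    {"id": 3, "topic": "politics"},
    {"id": 4, "topic": "travel"},
]
# A user who mostly clicked political content sees mostly political content.
history = ["politics", "politics", "politics", "cooking"]
feed = rank_feed(posts, history)
# "travel" never makes the cut, no matter how good the post is.
```

Each cycle of this loop narrows the feed further: the more the user engages with the amplified topic, the heavier its weight grows on the next ranking pass.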
While this may be relatively harmless if you are watching comedy videos, it takes on a whole new level of importance when the content you are consuming promotes violence or discrimination, and even emboldens people to carry that violence out. Everything you choose to engage with on the platform becomes amplified.
Frances' tip-off opens a can of worms for regulators to clean up, and raises the possibility of similar ills being covered up elsewhere in the technology industry. It also brings a lot of questions to the forefront.
Facebook's leader, Mark Zuckerberg, has testified on multiple previous occasions that the company engages in best practices. Could this new testimony make him culpable in some way? Could regulators shut Facebook down over continued and prolonged violations?
Can Facebook ever restore public trust in its offerings?