Facebook is to allow users to turn off the algorithm powering its News Feed and control the order in which items appear, amid growing calls for algorithmic transparency.
The company will also provide more context on suggested content by expanding the “Why am I seeing this?” tab, as well as letting users control who can comment on their posts.
It comes ahead of the first draft of the UK’s new Online Safety Bill, which is likely to include powers for Ofcom to audit how social media companies’ algorithms work.
Algorithmic transparency has become a focus for regulators, especially where companies are seen to be promoting harmful content or creating enclaves in which people who believe conspiracy theories encounter only others who share those beliefs.
Apple chief executive Tim Cook recently criticised Facebook, although he did not name the company specifically, in the wake of the mob attack on the Capitol building.
Mr Cook said: “We can no longer turn a blind eye to a theory of technology that says all engagement is good engagement.
“At a moment of rampant disinformation and conspiracy theories juiced by algorithms… it’s long past time to stop pretending that this approach doesn’t come with a cost – of polarisation, of lost trust and, yes, of violence.”
There are a range of approaches to tackling these so-called “echo chambers” or “filter bubbles”, although the global movement to address the problem has yet to agree on best practice.
Some suggest increased privacy controls to reduce the ability for platforms to profile users, while others suggest increased transparency in how this profiling takes place.
The move towards transparency comes as Sir Nick Clegg, the UK’s former deputy prime minister and now Facebook’s vice president of global affairs, published an article addressing “common misconceptions about the algorithms’ impact on users and society”.
Sir Nick writes: “Even if you agree that Facebook’s incentives do not support the deliberate promotion of extreme content, there is nonetheless a widespread perception that political and social polarisation, especially in the United States, has grown because of the influence of social media.
“This has been the subject of swathes of serious academic research in recent years – the results of which are in truth mixed, with many studies suggesting that social media is not the primary driver of polarisation after all, and that evidence of the filter bubble effect is thin at best,” he argues.
When Twitter introduced suggested content on users’ home pages, it did so alongside an option to use the traditional chronological feed.
TikTok, in contrast, has often been cited for the way it suggests content without offering the option to view material in a chronological fashion.
Facebook has also confirmed it will adopt another Twitter feature, enabling users to control who can comment on their News Feed posts. It will also explain the factors that influence suggested posts in the News Feed, including:
• Related engagement: A post may be suggested for you if other people who interacted with the post also previously interacted with the same group, page, or post as you.
• Related topics: If you’ve recently engaged with a certain topic on Facebook, we may suggest other posts that are related to that topic. For example, if you recently liked or commented on a post from a basketball page, we could suggest other posts about basketball.
• Location: You may see a suggested post based on where you are and what people near you are interacting with on Facebook.
“In the long run, people are only going to feel comfortable with these algorithmic systems if they have more visibility into how they work and then have the ability to exercise more informed control over them,” added Sir Nick.
“Companies like Facebook need to be frank about how the relationship between you and their major algorithms really works. And we need to give you more control over how, and even whether, they work for you,” he added.