Instagram CEO explains how photo sharing platform decides what users see
In a bid to address misconceptions that Instagram intentionally hides certain posts, CEO Adam Mosseri has explained in detail how the Facebook-owned photo-sharing platform decides what users see.
Sharing the first in a series of posts on Tuesday intended to shed more light on how the platform's technology shapes people's experiences across the app, Mosseri said that Instagram doesn't have a single algorithm that oversees what people do and don't see.
“We use a variety of algorithms, classifiers, and processes, each with its own purpose. We want to make the most of your time, and we believe that using technology to personalise your experience is the best way to do that,” he said.
When first launched in 2010, Instagram was a single stream of photos in chronological order.
Today, each part of the app – Feed, Explore, Reels – uses its own algorithm tailored to how people use it.
“People tend to look for their closest friends in Stories, but they want to discover something entirely new in Explore. We rank things differently in different parts of the app, based on how people use them,” the company’s CEO said.
Mosseri then went on to explain how, over the years, the company learnt that Feed and Stories are where people want to see content from their friends, family, and those they are closest to.
“We start by defining the set of things we plan to rank in the first place. With Feed and with Stories this is relatively simple; it’s all the recent posts shared by the people you follow. There are a few exceptions, like ads, but the vast majority of what you see is shared by those you follow,” he explained.
“Next we take all the information we have about what was posted, the people who made those posts, and your preferences. We call these ‘signals’, and there are thousands of them.”
The ‘signals’ include everything from the time a post was shared, to whether you’re using a phone or the web, to how often you like videos.
The most important ‘signals’ across Feed and Stories, roughly in order of importance, are information about the post, information about the person who posted, your activity and your history of interacting with someone.
“From there we make a set of predictions. These are educated guesses at how likely you are to interact with a post in different ways. There are roughly a dozen of these,” said Mosseri.
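The pipeline Mosseri describes – gather recent posts from accounts you follow, compute signals, predict how likely you are to interact, and rank accordingly – can be sketched in a few lines. This is a minimal illustration, not Instagram's actual implementation: the signal names, prediction formulas, and weights below are all invented assumptions, and a real system would use learned models over thousands of signals.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    # Illustrative "signals"; real systems track thousands of them.
    signals: dict = field(default_factory=dict)

def predict_interactions(post, user_history):
    """Educated guesses at how likely the user is to interact with the post
    in different ways. These formulas are made up for illustration only."""
    closeness = user_history.get(post.author, 0.0)  # past interaction rate
    return {
        "like": 0.5 * closeness + 0.1 * post.signals.get("popularity", 0.0),
        "comment": 0.3 * closeness,
        "save": 0.2 * post.signals.get("is_video", 0.0) * closeness,
    }

def score(post, user_history, weights):
    # Combine the roughly-a-dozen predictions into a single ranking score.
    preds = predict_interactions(post, user_history)
    return sum(weights[k] * p for k, p in preds.items())

def rank_feed(candidates, user_history, weights):
    # Candidates are the recent posts from accounts the user follows;
    # sort them by combined predicted-interaction score, highest first.
    return sorted(candidates,
                  key=lambda p: score(p, user_history, weights),
                  reverse=True)

# Hypothetical data: the user interacts far more with a close friend
# than with a brand account, so the friend's post ranks first.
weights = {"like": 1.0, "comment": 2.0, "save": 1.5}
history = {"close_friend": 0.9, "brand": 0.1}
feed = rank_feed(
    [Post("brand", {"popularity": 0.8}),
     Post("close_friend", {"popularity": 0.2, "is_video": 1.0})],
    history, weights)
```

The key design point the sketch captures is that ranking is per-user: the same two posts would order differently for a user whose interaction history favoured the brand account.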
Another important case to call out is misinformation.
“If you post something that third-party fact checkers label as misinformation, we don’t take it down, but we do apply a label and show the post lower in Feed and Stories. If you’ve posted misinformation multiple times, we may make all of your content harder to find,” the Instagram CEO noted.
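The policy Mosseri outlines is a demotion, not a removal: labeled posts score lower, and repeat offenders see all their content ranked lower. A toy sketch of that idea, where the penalty factors and the strike threshold are illustrative assumptions rather than Instagram's actual values:

```python
def demote(score, post_labeled, author_strikes,
           label_penalty=0.5, repeat_penalty=0.7, strike_threshold=2):
    """Down-rank rather than take down. All numeric values here are
    invented for illustration, not Instagram's real parameters."""
    if post_labeled:
        # Third-party fact checkers labeled this specific post:
        # it stays up but ranks lower in Feed and Stories.
        score *= label_penalty
    if author_strikes >= strike_threshold:
        # Repeat offender: all of the account's content is harder to find.
        score *= repeat_penalty
    return score
```

Applying both penalties compounds them, so a labeled post from a repeat offender ranks lower than either condition alone would place it.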
People often accuse Instagram of “shadowbanning” or silencing them.
“We recognise that we haven’t always done enough to explain why we take down content when we do, what is recommendable and what isn’t, and how Instagram works more broadly,” said Mosseri.
“We’re working hard on improvements here. We also manage millions of reports a day, which means making a mistake on even a small percentage of those reports affects thousands of people,” he added.