With its new “Why am I seeing this post?” feature, Facebook is inviting users to take a look at how and why posts appear in their News Feed.

It’s an expansion of the “Why am I seeing this ad?” feature, rolled out back in 2014, which lets users see what information about them is being shared with advertisers.

To help users control their feed

The News Feed is meant to be relevant. Despite a greater focus on reducing screen time in recent months, when it comes down to it the aim of the game is still to get you to scroll for as long as possible.

How does Facebook achieve that? By ensuring that the posts shown on the News Feed are relevant to users. Letting users see why they’re being shown posts, including which previous interactions on the platform have led to a given post appearing, and then giving them the option to opt out of content they find irrelevant, will only help Facebook serve up more tempting content as time goes on.

The interactions taken into account can include:

  • How often a user interacts with posts from people, Pages, or Groups
  • Which post types users most commonly interact with
  • The popularity of posts users have been engaging with

To increase awareness of ads and why a user has been targeted

Following the Cambridge Analytica scandal, in which the data of 87 million Facebook users was handed over to the political consultancy, users have become more concerned about what data Facebook holds on them.

To help address these concerns, Facebook is also improving its pre-existing “Why am I seeing this ad?” tool.

The new features will give users a better understanding of which factors, including demographics, interests, and websites visited, play a part in the ads that appear in the News Feed.

Users will now be able to see when advertisers uploaded data to the platform and if they were working with any marketing partners to run the ads.

Reputation management

With Facebook facing recent outcry over data and safety on the platform, especially following the New Zealand attacks live-streamed to the site, these updates are a form of reputation management.

Being more transparent about how decisions are made regarding data and safety on the platform will keep users more aware and informed. It should, in theory, be easier for Facebook to maintain users’ trust this way.

However, has the trust already been broken? Does it matter how many new insights are given?

Let me know what you think on Twitter @beccasocial.