Facebook is launching a new feature that explains how its algorithms decide what to display in your News Feed.
A new “Why am I seeing this post?” button will indicate what activity influenced Facebook's algorithms.
It is the first time the company has given people access to this insight directly in its app and on the website.
Facebook, Twitter, YouTube and others have been criticised for using algorithms to recommend content without explaining to users how they work.
Facebook told the BBC the new feature was available for some users in the UK today. It will roll out fully by 2 May.
The “Why am I seeing this post?” button will be found in the drop-down menu that appears at the top right of every post in the News Feed.
The tool will offer insights such as: “You've commented on posts with photos more than other media types.”
Facebook said it was also adding more information to the “Why am I seeing this ad?” button that has appeared on advertisements since 2014.
It will now let people know if details on their Facebook profile matched those on an advertiser's database.
The button already shows whether some of your online activity, such as the location from which you connected to the internet, is being used to target ads at you.
“Both of these updates are part of our ongoing investment in giving people more context and control across Facebook,” the company said in a blog post.
Facebook has faced intense scrutiny after a series of data breaches, privacy scandals and allegations that the platform was used to interfere in elections.
Last week, chief executive Mark Zuckerberg called for government regulation, saying the responsibility for monitoring harmful content was too great for companies to tackle alone.