Why are you seeing that news story? What exactly makes one article of higher news or journalistic “quality” than another? How should news ranking and recommendation systems work? To facilitate reflection on these questions, our team regularly curates links about journalism and news recommendation systems in the NewsQ Reading Library.
To kick off September, here is a sample of the articles, blog posts, and stories the NewsQ team has been reading over the summer.

How Do Platforms Rank News Online?
Earlier this summer, the NewsQ team took a look at how platforms rank news online. First, we reviewed Google, where news content shows up in several places, including Google News, Google Search, Google Assistant, YouTube, and Discover. This Google Developers blog post is a great place to start learning about Google and news content.
According to the Google Developers blog post, news publishers are automatically considered for Google News if they produce “relevant content” that demonstrates “high levels of expertise, authority, & trustworthiness; a consistent history of producing original news-related content.”
However, while publishers could once apply to have their news content included in Google News, with the launch of Google Publisher Center in 2019, the process has been automated. Writing for Search Engine Land, Barry Schwartz provides more details.
While the NewsQ team did a deep dive on Twitter into Facebook’s News Feed ranking algorithm, the place to start is a comprehensive blog post by the Facebook engineering team, How Does News Feed Predict What You Want to See?
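That Facebook engineering post frames ranking as predicting how likely a given viewer is to take various actions on each candidate post, then combining those predictions into a single relevance score. The sketch below is a minimal, hypothetical illustration of that scoring step; the action names and weights are invented for the example and are not Facebook’s actual values.

```python
# Toy sketch of "predict engagement, then combine into one relevance score".
# Action names and weights here are hypothetical, not Facebook's actual values.
ACTION_WEIGHTS = {"like": 1.0, "comment": 4.0, "share": 6.0}

def relevance_score(predicted: dict) -> float:
    """Combine per-action probability predictions into one ranking score."""
    return sum(ACTION_WEIGHTS[action] * prob for action, prob in predicted.items())

# Hypothetical model outputs for two candidate posts for a single viewer.
candidates = {
    "post_a": {"like": 0.30, "comment": 0.05, "share": 0.01},
    "post_b": {"like": 0.10, "comment": 0.12, "share": 0.04},
}

# Sort candidates by predicted relevance, highest first.
ranked = sorted(candidates, key=lambda p: relevance_score(candidates[p]), reverse=True)
print(ranked)  # ['post_b', 'post_a'] under these made-up numbers
```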
After Facebook and Google, we took a look at the “Microsoft News Dataset (MIND),” a large-scale dataset for news recommendation research. After releasing MIND, Microsoft launched the Microsoft News Recommendation Competition in 2020. This blog post provides a step-by-step walkthrough of developing an algorithm for the news recommendation problem in the now-finished competition.
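To give a feel for the shape of that problem, here is a minimal sketch (not Microsoft’s code) of a non-personalized click-popularity baseline over MIND’s behaviors.tsv, assuming the tab-separated layout described with the dataset: impression ID, user ID, time, click history, and labeled impressions such as N1234-1 for a click and N1234-0 for a skip.

```python
import csv
from collections import Counter

def popularity_baseline(behaviors_path: str, top_k: int = 10) -> list:
    """Count clicks per news ID in MIND's behaviors.tsv and return the
    top_k most-clicked items as a trivial, non-personalized ranking."""
    clicks = Counter()
    with open(behaviors_path, encoding="utf-8") as f:
        for row in csv.reader(f, delimiter="\t"):
            # Assumed columns: impression_id, user_id, time, history, impressions
            for item in row[4].split():
                news_id, _, label = item.rpartition("-")
                if label == "1":  # "1" marks a clicked article
                    clicks[news_id] += 1
    return [news_id for news_id, _ in clicks.most_common(top_k)]

if __name__ == "__main__":
    # Hypothetical local path to the MIND training split.
    print(popularity_baseline("MINDsmall_train/behaviors.tsv"))
```

A real competition entry would rank the candidate articles within each user’s impression rather than globally, and was evaluated with metrics such as AUC, MRR, and nDCG.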
Considering the Transparency of Online News Recommendation Systems
Next, we considered how these news recommendation systems actually work in practice for publishers and for audiences. Here are a few links from the NewsQ library that share how those “outside the black box” consider the transparency of platforms and their algorithms.
First, let’s take a look at a 2018 paper by Jaron Harambam, Natali Helberger, and Joris van Hoboken for the Royal Society. The three researchers suggest that while personalization of news enables media organizations to be more receptive to their audiences, “it can be questioned whether current deployments of algorithmic news recommenders […] live up to their emancipatory promise.”
Furthermore, the trio says, audiences “have little knowledge of what personal data is used and how such algorithmic curation comes about, let alone that they have any concrete ways to influence these data-driven processes.”
Next, Nick Diakopoulos and Daniel Trielli ask “How can journalists critique algorithms as government and big tech algorithms themselves become more pervasive and powerful?” In their 2020 paper, Diakopoulos and Trielli offer a systematic methodology for critiquing news recommendation algorithms that journalists and other non-specialists can use.
Bonus link: Auditing News Curation Systems: A Case Study Examining Algorithmic and Editorial Logic in Apple News, by Nick Diakopoulos and Jack Bandy.
How News Gets Processed Online
Over the summer, the NewsQ team also discussed how news gets processed online. We looked at how platforms such as Facebook and Google have a “news labeling problem,” as Emily Bell and Sara Sheridan described in their article for CJR in 2020.
In their CJR article, Bell and Sheridan note that Tow Center research ahead of the 2020 elections revealed “how partisan online news networks operated a thousand politically backed sites cropping up across the US producing largely automatically generated stories.”
Some websites were “serving lobbying or political interests by producing what appears to be local news content… Lack of transparency around funding sources is designed to deceive readers by making it difficult to detect political or commercial motives.”
Bell and Sheridan noted that, “For third party platforms that aggregate news content like Google and Facebook, there is an increasing need to flag instances where news production becomes lobbying or advertising.”
The problem of labeling (or the lack of it) had already been identified in 2017, as demonstrated in an article by Bill Adair and Rebecca Iannucci, both of the Duke Reporters’ Lab. Their article summarizes a labeling survey they conducted in 2017, and their general conclusion was that “Most (news) sites don’t include labels showing article type”:
Online journalism provides readers with access to thousands of news sources, but readers may not understand the type of article they’re reading. Our hypothesis was that many news organizations do not label article types to indicate whether they are news, analysis, opinion or a review.
Bonus link: An Introduction to Schemas for Journalists, produced in partnership by the Credibility Coalition and the Nieman Foundation for Journalism at Harvard and written by Samantha Sunne.
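For a concrete sense of what machine-readable labeling can look like, schema.org defines NewsArticle subtypes such as ReportageNewsArticle, AnalysisNewsArticle, OpinionNewsArticle, and ReviewNewsArticle. The sketch below is a hypothetical helper (not taken from the linked guide) that emits the JSON-LD a publisher could embed in a page to declare an article’s type.

```python
import json

# Map an editorial label to a schema.org NewsArticle subtype.
# The subtype names come from schema.org; the helper itself is illustrative.
ARTICLE_TYPE_TO_SCHEMA = {
    "news": "ReportageNewsArticle",
    "analysis": "AnalysisNewsArticle",
    "opinion": "OpinionNewsArticle",
    "review": "ReviewNewsArticle",
}

def news_article_jsonld(headline: str, article_type: str, url: str) -> str:
    """Return JSON-LD that labels an article's type for aggregators and search engines."""
    return json.dumps({
        "@context": "https://schema.org",
        "@type": ARTICLE_TYPE_TO_SCHEMA[article_type],
        "headline": headline,
        "url": url,
    }, indent=2)

print(news_article_jsonld("An example opinion piece", "opinion", "https://example.com/opinion"))
```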
NewsQ Guide to Platform News Products
In order to provide participants of our News Ranking Review Panels with information on each platform news product they will consider as panelists, NewsQ has created a regularly updated Guide to Platform News Products. The guide provides overviews for each news product, mostly from the perspective of the goals expressed by platform owners themselves. The NewsQ platform guide includes overviews of Google News, Apple News, Facebook News (News Tab), and Microsoft News.
Browse Our Zotero-Powered NewsQ Library!
It’s a work in progress, but the NewsQ team is busy curating a Zotero-powered library of reading materials about news ranking and recommendation systems. We welcome your suggestions for additions to this library… Just contact us on Twitter!