What aspects of diversity are most important to consider when it comes to ranking and recommending the news?
As more readers turn to platforms such as Twitter, Google, and Apple News as their main source of news, it’s more important than ever to discuss how to realistically incorporate journalistic principles into the algorithms that automatically surface and recommend news.
In this blog post, let’s consider some aspects of the journalistic principle of diversity — from hiring diverse staff to writing for diverse audiences — that appear in the policies and editorial guidelines of both traditional newsrooms and platform news products.
What do such guidelines tell us about the aspects of diversity being considered in editorial processes? And what are the challenges of translating a journalistic principle into rules that govern news algorithms?
Examining Diversity in the Newsroom
Discussions around diversity and journalism often center on staff, coverage, and audience. These all likely overlap, meaning more diversity in one area can lead to greater diversity in another. For example, the Columbia Journalism Review suggests that increasing diversity among newsroom staff leads to more diverse coverage which, in turn, means being able to reach new, diverse audiences. Newsrooms are also proactively connecting with journalism students from diverse backgrounds by offering fellowships, as well as pushing for diverse sourcing in editorial policies, among other methods.
How Do News Aggregator Platforms Approach Diversity?
While newsrooms rely on traditional human resources practices to bring diversity into their news coverage, how are news aggregator platforms — which rely partly or entirely on algorithms — incorporating diversity into their news products?
Like newsrooms, technology platforms emphasize diversity as an important journalistic principle that shapes their editorial decisions, and their public language around diversity often mirrors discussions happening in the journalism community.
While none of the platforms explicitly explain how diversity is incorporated into their algorithms, their public guidelines indicate that diversity as a journalistic principle is indeed considered when news articles are surfaced, ranked, and promoted in their news feeds. Three examples are diverse perspectives, audience diversity, and diversity of topics.
Google News, which relies primarily on algorithms rather than human curation to rank and recommend articles, states that it has taken the editorial approach of highlighting what Google calls “unpersonalized news” — algorithmically selected stories that are presented to all users — to “provide access to context and diverse perspectives” (emphasis added).
Editorial guidelines for Facebook’s human curation team also include a commitment to diverse voices and a diverse range of topics.
What Might Signals of Diversity Look Like?
At NewsQ, part of our work involves considering signals that can help algorithms identify when an article or publisher meets certain standards. These signals can help translate journalistic principles into algorithms and they can range from data that flags editorial staff diversity to coverage variety and beyond. Here are two examples of signals for diversity that we as a team have explored:
Media Ownership. According to the Media Ownership Monitor, an initiative of Reporters Without Borders, there is a correlation between media pluralism and diverse coverage. The Media Ownership Monitor maps the media landscape in countries around the world and scores them on indicators of risk to media pluralism.
Audience and Language. Another approach treats news organizations that cater to diverse audiences and language groups as signals of diversity when surfacing and recommending news. For example, the Latino News Media Map is an interactive directory of news outlets serving Latino communities throughout the US.
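To make the idea of a "signal" concrete, here is a minimal sketch of how signals like the two above might be combined into a single publisher-level score. Everything here is hypothetical: the field names, the weights, and the 0–1 scales are illustrative assumptions, not how any platform or NewsQ actually computes diversity.

```python
from dataclasses import dataclass

@dataclass
class Publisher:
    name: str
    # Hypothetical 0-1 score, imagined as derived from ownership-pluralism
    # indicators like those in the Media Ownership Monitor.
    ownership_pluralism: float
    # E.g., the outlet appears in a directory such as the Latino News Media Map.
    serves_underrepresented_audience: bool
    # Languages the outlet publishes in.
    languages: set

def diversity_signal(pub, w_ownership=0.5, w_audience=0.3, w_language=0.2):
    """Combine hypothetical diversity signals into one 0-1 score.

    The weights are arbitrary placeholders; in practice they would be
    the subject of exactly the editorial debates this post describes.
    """
    language_score = min(len(pub.languages), 3) / 3  # cap credit at 3 languages
    audience_score = 1.0 if pub.serves_underrepresented_audience else 0.0
    return (w_ownership * pub.ownership_pluralism
            + w_audience * audience_score
            + w_language * language_score)

outlet = Publisher("Example Daily", ownership_pluralism=0.8,
                   serves_underrepresented_audience=True,
                   languages={"en", "es"})
print(round(diversity_signal(outlet), 2))  # → 0.83
```

Even this toy version makes the editorial stakes visible: choosing the weights decides which aspect of diversity the ranking rewards most.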
Incorporating Diversity into Ranking Algorithms Could Result in Tradeoffs
Incorporating diversity into news algorithms is a challenge that requires thinking about the many ways that diversity can be represented in the news. Elevating or surfacing diversity may also require discussions around what aspects of diversity should be prioritized at the possible detriment of others.
An example of this challenge can be seen in a research audit of Apple News and a separate study of Google News. Researchers at Northwestern University found that by recommending Top Stories and Trending Stories without personalization or localization, Apple minimized the filter bubbles that recommend articles reinforcing pre-existing beliefs. However, this came at the expense of source diversity: publishers excluded from the top-ranked articles in the news feed also lost out on advertising opportunities.
In a different audit of Google News, researchers from Columbia University and the University of Oregon noted that minimal personalization meant readers were recommended news from outside of their stated personal partisan preference. However, the lack of personalization meant that news coverage overall was relatively similar — and less diverse — for every user in the study group.
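The tradeoff the two audits describe can be sketched as a toy greedy re-ranker that trades relevance against source repetition, in the spirit of maximal-marginal-relevance re-ranking. The feed data, source names, and the single `lam` tradeoff parameter are all hypothetical; this is not how Apple News or Google News rank stories, only an illustration of why one diversity goal can undercut another.

```python
def rerank_with_diversity(articles, lam=0.7):
    """Greedily re-rank (title, source, relevance) tuples.

    score = lam * relevance - (1 - lam) * (times this source already chosen)
    lam = 1.0 ranks purely by relevance; lower lam spreads the feed
    across more sources at the cost of relevance.
    """
    ranked, seen_sources = [], {}
    remaining = list(articles)
    while remaining:
        def score(a):
            _, source, rel = a
            return lam * rel - (1 - lam) * seen_sources.get(source, 0)
        best = max(remaining, key=score)
        remaining.remove(best)
        seen_sources[best[1]] = seen_sources.get(best[1], 0) + 1
        ranked.append(best)
    return ranked

feed = [("A", "WireCo", 0.95), ("B", "WireCo", 0.93),
        ("C", "LocalPost", 0.90), ("D", "WireCo", 0.88)]
# With lam=0.5, LocalPost's story C jumps ahead of WireCo's B
# once a WireCo story already appears in the feed.
for title, source, _ in rerank_with_diversity(feed, lam=0.5):
    print(title, source)
```

The single `lam` knob is the tradeoff in miniature: turning it down buys source diversity by demoting relevant stories, just as the audits found that suppressing personalization bought one kind of diversity at the expense of another.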
If we agree diversity is important in our news feeds, then what tradeoffs are we willing to make? What aspect of diversity should take precedence? Diverse audiences? Staff? Sources?
Is there a way to elevate signals of diversity without negatively affecting other equally important signals?
Through our work, NewsQ will continue to discuss these and other challenges of bridging journalism principles with automated systems.