What should quality journalism online — delivered through the distribution mechanisms of algorithms — look like? Over the coming months, the NewsQ initiative looks forward to creating resources and fora to support constructive engagement on this question.
After all, we continue to be uneasy about the news. We know this. Survey after survey finds that a majority of people around the world worry about their ability to discern false information, or about its potential to be weaponized. It is hard to say whether we are, on the whole, reading more news (Edelman 2019) or trying to avoid it altogether (Reuters Institute 2019), but it seems fair to say there is a certain amount of fatigue. And while we consider news in search results or social media to be problematic, we increasingly get some of our information this way, even as more traditional or ‘reputable’ news sources have recently regained some ground.
What would quality journalism look like?
How should things be different? Many of us think that news should be more accurate and less biased, especially as it is served to us via search results or social media recommendation systems. We also want discussions around the news to be more civil and less one-sided, and we worry about the harassment of journalists and about potential censorship (Pew Research, June 2019 and October 2019).
These responses make sense, but when it comes to processing news for millions of people in seconds, they are not specific enough. We need much more concrete visions of what news at scale is supposed to look like: not just how it should look in different languages and countries, but also across different topics, and against an understanding of the border between misinformation and the freedom of expression and opinion.
An algorithm, broadly speaking, is a “step-by-step procedure for solving a problem or accomplishing some end” (Merriam-Webster 2019). Search and discovery on the internet today are made possible by such algorithms; those of us who created our first HTML pages in the early 1990s know this all too well. So, for example, how should an algorithm navigate the following characteristics to serve up meaningful news for a given reader:
- Presence of a false claim
- Presence of several related, fact-checked claims
- Political slant
- Local versus national relevance
- Originality of the content
- Recommendation by the reader's friends
- Positivity of tone
- Diversity of the journalist's background
- Engagement in accountability journalism
What would be the trade-offs in emphasizing one characteristic over another? Should the recommended navigation differ if the topic is vaccinations, or immigration, or entertainment? Is the goal to expose a reader to a different point of view (Ash et al., 2019), or to give readers control over the information they see, even if it happens to be less accurate but more opinionated?
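To make these trade-offs concrete, one could imagine expressing each characteristic as a numeric signal and combining the signals with explicit weights, perhaps with different weight profiles per topic. The sketch below is purely illustrative — the signal names, the weights, and the per-topic overrides are our assumptions for the sake of discussion, not a description of any deployed ranking system.

```python
# Illustrative sketch only: hypothetical news-quality signals combined with
# explicit weights. Every name and number here is an assumption chosen to
# make the trade-off discussion concrete, not a real ranking algorithm.

# Baseline weights over signals scored 0.0-1.0 per article.
DEFAULT_WEIGHTS = {
    "no_false_claim": 2.0,            # absence of a false claim
    "fact_checked_claims": 1.0,       # related claims have been fact-checked
    "politically_balanced": 0.5,
    "local_relevance": 0.5,
    "original_content": 1.5,
    "friend_recommended": 0.3,
    "positive_tone": 0.2,
    "diverse_author": 0.5,
    "accountability_journalism": 1.5,
}

# Hypothetical per-topic overrides: accuracy might matter far more for
# vaccination stories than for entertainment coverage.
TOPIC_WEIGHT_OVERRIDES = {
    "vaccinations": {"no_false_claim": 4.0, "fact_checked_claims": 2.0},
    "entertainment": {"positive_tone": 0.5, "no_false_claim": 1.0},
}


def score_article(signals, topic=None):
    """Weighted sum of 0-1 signals; the weights encode editorial trade-offs."""
    weights = dict(DEFAULT_WEIGHTS)
    weights.update(TOPIC_WEIGHT_OVERRIDES.get(topic, {}))
    return sum(weights[name] * signals.get(name, 0.0) for name in weights)


# Example: the same article scores differently under a topic profile that
# weights accuracy more heavily.
article = {"no_false_claim": 1.0, "original_content": 1.0}
baseline = score_article(article)                      # 2.0 + 1.5 = 3.5
vaccines = score_article(article, topic="vaccinations")  # 4.0 + 1.5 = 5.5
```

The point of such a sketch is that every trade-off in the list above becomes a visible, debatable number: doubling `no_false_claim` relative to `friend_recommended` is an editorial judgment, and writing it down makes it open to scrutiny in a way that an opaque system is not.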
These are hard questions to answer, but we think a concrete attempt to answer them is worthwhile. With that in mind, NewsQ will be pursuing this in three ways:
Follow Along as We Make Progress
The NewsQ team looks forward to keeping you updated on our progress and on opportunities to connect. Let us know if you have questions.