Credibility collapse is driving a trust collapse that will drive a democracy collapse
Can a new credibility system help rebuild trust online (and off)?
This article is part of a series examining modern media and our civic life, including my new book For ALL the People, coming Feb 23, 2021.
Democracy is a system of faith: our institutions and processes work when we imbue them with belief and trust. Misinformation (unintentionally false content) and disinformation (intentionally, nefariously false content) have always been part of our daily information consumption, but the information landscape we live in now moves faster and at a volume well beyond what humans can cognitively handle. And in 2020, according to NewsGuard, unreliable sources nearly doubled their share of engagement compared to 2019. Discerning true from false, healthy from unhealthy, productive from nefarious is fundamental to our ability to understand the world and to a functioning democracy. But in our current media systems, both mis- and disinformation are thriving and occupy a larger percentage of the available information than ever.
We used to rely on gatekeepers and institutions to determine credibility, but they have lost power and control in a new landscape where we are all creators and distributors of content. The systems of distribution and discovery we rely on for storytelling, entertainment, news, and information are fundamental to culture and to democracy, yet they offer no ready replacement for that gatekeeping. And the tools we currently use are not optimized for our collective social wellbeing or civic health; they desperately need direction and help distinguishing and labeling the information they present, so that people have a chance to effectively discern what they consume.
NewsGuard is a technology company building a human-curated system for rating and surfacing the sources of content, aiming to make it possible to distinguish the credible from the garbage or the outright malevolent.
In their analysis of 2020 engagement (defined as likes, shares, or comments on Facebook and Twitter) with news content online, conducted in partnership with NewsWhip (a media analytics company), NewsGuard saw an explosion of news content about both the election and the pandemic — an increase consistent with other content research in a year when people craved good, public information about public crises. But NewsGuard also discovered that a large portion of that increase came from generally unreliable sources, which doubled their share of engagement over 2019.
At a moment when we most need reliable, healthy media and information systems, we’re getting dangerous and misleading content in greater percentages than ever, via systems that make it harder than ever to be good information consumers. It’s true that both Facebook and Twitter have taken some steps in recent weeks to remove and identify false and misleading content about COVID-19 and vaccination — thank god. But this is the barest beginning, and it does not address the ethics of their fundamental design principles. Ultimately, when look-back studies of this pandemic are done, we will see a meaningful separation in death rates across communities with different information consumption patterns.
I’m hopeful about this new beginning, but time will tell whether NewsGuard’s solution can become a new standard. Unquestionably, this kind of innovation is necessary for a future where media and information systems actually serve people and not just profits.
Welcome to 7 Bridges — a conversation about the future of humanity and democracy in America. If you’re joining us for the first time, hello! Subscribe via the button below to get this in your inbox for free.
Please consider becoming a paid subscriber to support this work, too. Subscribing to 7 Bridges is the best way to keep it free and open to all — and to support new voices and independent media.