Weekend Edition: A familiar breathlessness
What the last decade of wrestling with the potential and effects of social media might teach us about how we greet ChatGPT.
We’ve spent the last several years unpacking the unintended consequences of a decade of ad-optimized, profit-seeking, social-media-driven transformation of our information systems and public sphere. The unguided, unhinged, breathless chase to profit off these innovations has wreaked havoc on society and civic life in ways we are still suffering under and struggling to redesign. (I may have written a whole book on this, and others have been talking about it even longer.) In the last two months, an eerily familiar breathless conversation and chase seem to have begun around ChatGPT.
OpenAI's chatbot, powered by a large language model, has captured public attention and sent the big tech companies that have been chasing various kinds of generative and general AI systems for years scrambling. What we have seen is the glittering, intriguing possibility of a new tool that produces rapid, human-like text and can dramatically speed up how people find information through conversation, introducing a new (or reintroducing an old) feeling into the experience of discovery and learning. We have also seen it regularly display a dangerously confident wrongness that has been widely and openly mocked but that we have no real defense against and no guardrails to identify. OpenAI, the company that developed ChatGPT (originally founded as a 501(c)(3) nonprofit, now a capped-profit company), has a brilliant team of ethicists and a mission “to ensure that artificial general intelligence benefits all of humanity.” But Microsoft, now OpenAI’s largest investor, doesn’t have that mission. It has wildly broader reach, holds first rights to commercialize the technologies, and can leave all of the ethical responsibility at someone else’s feet while blindly chasing Google’s search market dominance. We have repeatedly (and recently) watched new technologies, lacking clear ethics or boundaries or purposes, impact society more broadly and more quickly than at any earlier point in human history while chasing the profits of the last era of venture-backed wealth creation. Microsoft isn’t even trying to be ambitious about a new business model. It is taking something with clear challenges, known profound consequences, and enormous potential for massive unintended consequences, commonizing those costs in spectacular fashion, just to eat into Google’s ad revenue built on the last generation of transformative technology.
In our haste, we are releasing a tool with weak safeguards onto a society reeling from its decreased capacity for sense-making and discernment, offering even worse guidance on knowledge creation from a tool known to regularly give confident wrong answers to people hungry for shortcuts after a decade of algorithmic sorting and overwhelming information overload. We've been trained to be trolls (often even by mistake), to expect the worst from the unfamiliar, to engage with each other as caricatures, and we are unleashing those habits into conversation with a new tool that could help us relearn better habits but hasn't been asked to. Of course, this is going to make productive discourse even harder. Of course, this is going to lead to strange, unexpected conflicts, and since it’s built on the last decade of open web text, of course, it will reinforce unhelpful conflicts we are still trying to find bridges out of. Of course, the hilarity of its errors in experiments by safe members of safe communities will distract the public from how dangerous it will be for unsafe members of unsafe communities.
Our impatience, made worse and more corrosive over the last decade, is biting us again. Our craving to be first, to profit, to capture is getting in the way of the possibility of real transformation. Learning through conversation again, but at scale, has the potential not just to expand access to information but to rebuild knowledge, to reconnect us to long-forgotten traditions and wisdom, to interconnect conversations that have long been segregated, to integrate culture, and to make it simpler to learn in subtle, nuanced, balanced ways that encourage strong agreements and take time. Right in front of us is the possibility of augmenting one of our most definitively human abilities with the capacity to make sense of and leverage exponential information systems that human neurology will never catch up to on its own. These new tools could automatically and gently provide the bridging conversations we so desperately need after a decade of intense bonding and sorting that has increasingly dis-integrated culture, and they could reintroduce the unfamiliar as a curiosity rather than a threat. They could be part of the answer to the bad designs of the last generation, part of the healing we so desperately need, if we demand that they tackle and remake those experiences as fundamental outcomes of their use. But instead we are focused on leveraging them to take ad market share from another company that is in turn hoping to use the same technology not to expand humanity but to protect the wealth and position it has already built.
There is a public conversation possible about the public goods these advances could enable, and a version of it is happening, but far too quietly. We have to have it now, bravely and broadly, to reclaim our agency in the face of these innovations and transformations. The profit will still be there if we wait. No one at Microsoft or Google will go hungry if forced into a longer conversation about what public goods must come from deploying these technologies and what expectations we should demand of systems meant to serve us. But reinforcing bad habits and doubling down on the bad designs of the last decade with even more powerful technology will force us down paths that will be harder to course-correct from later on, just like the civic dis-ease we are facing now after a decade of damaged discourses. Heading down a steeper slope is the wrong choice, especially when these tools could be part of the key to solving so much of our civic dysfunction.
Please consider becoming a paid subscriber to support this work. Subscribing to 7 Bridges is the best way to keep it free and open to all — and to support new voices and independent media.