A year and a half ago Netflix released The Social Dilemma, a docu-drama that dug into the harmful consequences of social media. Think political polarization, the spread of misinformation, and upticks in anxiety and depression across multiple demographics. Tristan Harris, a former Google design ethicist and co-founder of the Center for Humane Technology, is a central figure in the film. In a session at South by Southwest this week, Harris spoke about the steps we should be taking to get this technology, and our relationship with it, to a healthy place, or as he put it, the wisdom we need to steer technology and our future.
Harris opened with a quote from biologist Edward O. Wilson, who said, “The real problem of humanity is the following: We have Paleolithic emotions, medieval institutions, and godlike technology. And it is terrifically dangerous, and it is now approaching a point of crisis overall.”
In other words, technology is advancing far too fast for our brains to keep up and interact with it in a healthy way, or for our institutions to understand it and wisely regulate it.
Wilson spoke these words at a debate at the Harvard Museum of Natural History in 2009; that is, before the widespread adoption of platforms like Instagram and TikTok, or of tech like deepfakes, text generators, CRISPR, and other innovations that have the potential to transform humanity (for better or for worse).
We now have algorithms that can generate realistic images based on text, of anything from mountain sunsets to bombed-out buildings in Ukraine. We have GPT-3, which could write a convincing paper arguing mRNA vaccines aren't safe, citing real facts that are simply presented out of context. "This is like a neutron bomb for trust on the internet," Harris said. "And the complexity of the world is increasing every day." Our ability to respond, however, isn't keeping pace.
Issues that would have been considered separate from one another in the past (or that didn’t exist in the past) are now closely linked; consider the impact that misinformation and synthetic media could have on nuclear escalation (and the impact they’ve already had on elections and democracy), or the connection between artificial intelligence and global financial risk.
Our previous thinking around how to manage technology isn’t good enough in the face of this new complexity; how do we handle issues like privacy or freedom of speech when multiple actors are involved, there’s low accountability, and everyone’s definition of what’s “right” is different? “Technology has been undermining humanity’s capacity for wisdom,” Harris said. “Not just individually, but our collective ability to operate with the wisdom that we need.”
Wisdom, he said, means knowing the limits of how we actually work, having the self-awareness and humility to be inquisitive, and being able to think in terms of systems and root causes. Harris referenced the book Thinking in Systems by environmental scientist Donella Meadows, in which she details 12 leverage points for intervening in a system, that is, for changing the way a system works from its current state to something else. In Harris's opinion, the most relevant of Meadows' points to the tech conversation is the power to transcend paradigms.
Each of the tech industry's paradigms of thinking that got us where we are should be overhauled with a human-centered focus. Rather than shrugging off the harms of technology by asserting that there are always costs and benefits, we should focus on minimizing harmful externalities. Rather than giving users what they want, we must respect human weaknesses and vulnerabilities (for example, the way social media platforms exploit the brain's dopamine response). Rather than maximizing personalization to give users a satisfying experience (also known as creating our own unique little echo chambers), we should strive to create shared understanding.
The question is, how do we get more people to go from being typical users of social media and other tech to being what Harris calls humane technologists?
It starts with raising awareness and educating ourselves. Harris and his team at the Center for Humane Technology created an online course called Foundations of Humane Technology, which takes registrants through six values-centered tenets that, if we prioritize them when designing new tech (or changing the design of existing tech), can improve our experience both individually and as an interconnected community.
“We would like to have 100,000 humane technologists who are trained in this new paradigm,” Harris said. “It’s hard to think about these things when you feel like you’re the only one asking these questions.”
We’re at an inflection point where it’s crucial for those working on technology to help create shared understanding; the world isn’t about to get less complex or volatile. On the contrary, Harris predicts we’re heading into a period of increasing global catastrophes fueled by climate change, inequality, and unstable political regimes, among other factors.
It’s a lot to take on, even a lot to contemplate. But, Harris said, he has hope because he’s seen the system change much faster in the last few years than ever before. People from within the tech industry have spoken out about the risks and harms of the products they helped create, from former YouTube engineer Guillaume Chaslot to Facebook co-founder Chris Hughes to former Facebook data scientist Frances Haugen, and many more. “Technologists are actually waking up and saying, ‘I don’t want to participate in the toxic part of the industry, I want to help build a better part,’” Harris said.
Going back to Wilson's quote, Harris proposed the following: we need to embrace our Paleolithic emotions, upgrade our medieval institutions, and have the wisdom to wield our godlike technology. We need to be able to make sense of the world and have people from different sides come together and agree on the actions we should take, then take them. There should be no place for business models that are dependent on dividing people. "We need everyone working on helping us close that gap," Harris said.
Image Credit: Rodion Kutsaev on Unsplash