The Future Of Digital: How Can We Realign Technology With Humanity’s Best Interests?
Some of the most respected pioneers of the internet era are banding together to develop solutions for the dehumanizing tendencies of today’s digital platforms.
The tech industry has reached a tipping point. What started out as an optimistic and mostly unregulated dive into the digital world has opened an enormous can of ethical worms. Technologies that were originally designed to bring people closer together are arguably pulling us apart. For the first time in history, a handful of tech companies steer the thoughts and emotions of billions of people every day.
As society’s disenchantment with the dehumanizing aspects of digital grows, we need to accept that technology is not neutral and that the systems driving these forces have gone largely unchecked. So, what can and should be done about it?
Sara Holoubek is the CEO and founder of Luminary Labs, a consultancy that develops strategies and innovation systems for Fortune 500 and government leaders seeking to transform their organizations. At SMWNYC, she sat down with Max Stossel, a former marketer and internet entrepreneur who now serves as Head of Content and Storytelling for the recently launched Center for Humane Technology, for an illuminating discussion on the future of tech ethics. The session focused on:
- How technology is hijacking our attention and how we can stop it.
- How the Center for Humane Technology is working to hold digital platforms accountable for their dehumanizing tendencies.
- Why optimizing social media platforms for attention is counterproductive to a healthy, strong democracy.
- How to change the way we assess the ethics of our content, so that digital channels foster relationships rather than isolation.
Here are a few of the highlights.
The rise of the “attention economy”
There’s a hidden goal driving all of our technology, and that goal is the race for our attention. Thousands of engineers are designing digital features and products for the sole purpose of capturing and directing our attention. Shaped by the demands of an advertising economy, the internet has become a medium based on the desirability of interruption.
The Center for Humane Technology asserts that what’s best for capturing our attention isn’t best for our well-being. Founded by a team of tech pioneers including former Google Design Ethicist Tristan Harris, the Center is a coalition that emphasizes the need for a realignment of technology and prioritizes personal well-being over maximizing attention. It is gathering other industry insiders who have “seen the inside of the beast” to try and tackle how the system is designed to exploit the human experience.
Why are those who designed and developed the systems being criticized now coming together to change them?
Stossel admits, “I think I’d be lying if I didn’t say that Trump has created some gasoline on this fire. This issue is not about the fact that Donald Trump was elected in any way, shape, or form. We have been shouting about these issues a long time before then. But I think seeing how clearly and deeply an election can be manipulated or influenced by this new way we are all receiving and consuming information, I think lit a fire under people.”
His personal wake-up call began much earlier, though. In a previous role as a social media strategist, he discovered two things that really helped boost consumer engagement on Facebook – “showing audiences things they already believed in, and making extreme statements.” Over time, he noticed that news organizations were using the same tricks to engage their audiences. “They were exaggerating their headlines, they were preaching to their choirs, they were doing whatever it took to get those clicks and get that traffic”, said Stossel. “Once our news had been compromised by this system…we started to change the literal stories that we tell based on what works in these algorithms”, and he began to appreciate the impact this would have on billions of people. “We’ve really reached a point in time where the technology is so good at capturing and steering attention, that we really need to rethink if this is a way that we want to show up and live in the world.”
Addiction by design: Snapchat, Instagram, Facebook, and YouTube
Stossel discussed some of the techniques Snapchat, Instagram, Facebook, and YouTube are using to keep users engaged for longer, and the impact this can have on relationships, mental health, and society at large. For example:
- Snapchat turns conversations into streaks, redefining how children measure friendships.
- Instagram glorifies the picture-perfect life, eroding people’s feelings of self-worth.
- Facebook segregates us into echo chambers, fragmenting communities.
- YouTube auto-plays the next video within seconds, even if it eats into people’s sleep.
Holoubek also discussed how social media has been used to deliberately:
- Push lies to specific zip codes, races, or religions.
- Find people who are already prone to conspiracies or racism and automatically reach similar users with lookalike targeting.
These techniques essentially employ basic marketing tactics, but Stossel feels it’s different because “most people didn’t have any idea how it worked” until now.
“I’m far less concerned with the literal advertising, I’m much more concerned with what that inherent newsfeed is choosing to show us. Ultimately we tend to react to however it is working with our own organic content, but so does everyone else”, said Stossel.
Citing Jean Twenge’s research, Stossel noted a steep rise in feelings of isolation, suicide, and depression, particularly among young people, which may be linked to the growth of smartphone and social media use.
“If this were happening in the automobile industry, it would be a product recall”, said Stossel.
How can the tech and advertising industries try to address the unintended consequences of technology?
When it comes to addressing the unintended consequences of technology, Stossel acknowledges that there is no quick fix. Mark Zuckerberg may not have foreseen Facebook’s negative consequences when he started out, but we still have to take responsibility for the fallout. “Solutions are a tricky beast”, said Stossel. Change needs to happen at multiple levels. For example, tackling issues like “fake news” will require more than hiring additional human editors (although Holoubek noted LinkedIn is doing this). On further regulation, he said, “I love the idea of regulation, but we need to be so careful and so smart about what that looks like. It would be great if people with more of an understanding of tech could be in office.”
Stossel believes the intention of the current system is to extract as much attention as possible, and for technology to be good for humanity, the advertising model will need to change. “If we start being willing to pay for things which bring us value, we’re competing for what brings value to people instead of attention. We need to shift the business model.” For example, this could mean developing a different kind of app store or marketplace that competes to bring value to consumers’ lives. “I think the fundamental piece that is not being considered is that most of the time, the most meaningful choice for a human being is not on screen. Until we start to look at that choice architecture differently, I don’t think the real value shift will come.” Stossel believes consumers also need to demand better technological solutions – “tech that cares about us and doesn’t use us and farm us for our attention.”
Given the scale and reach of today’s technology, Holoubek and Stossel acknowledged that the potential for unintended consequences is greater than ever. Stossel believes more ethicists should be involved, especially on product teams. Citing a recent health tech example, Holoubek recommended that “every industry needs an ethicist in the room when decisions are made.” Former U.S. Chief Data Scientist DJ Patil is also trying to develop a code of ethics for the industry, to help manage the ethical use of the mind-boggling 2.5 quintillion bytes of data generated each day. Stossel predicted we might one day see some form of Hippocratic oath for designers as well.
How can individuals try to address the unintended consequences of technology?
On an individual level, Stossel suggested people try to have healthier relationships with their devices. He recommended:
- Turning off all notifications that aren’t from a person who is trying to reach you.
- Changing the settings on your phone to greyscale.
- For those who are connected all day for work (e.g. social media strategists), sleeping with their phones on the other side of the room and using a physical alarm clock, so they don’t wake up and look straight at their phones.
- For those who are managing teams, being more conscious of the human costs of having to respond to social media all day, and giving people sufficient breaks/time off.
- Measuring success in different ways (e.g. in real life connections created instead of reach, likes, clicks, and shares).
Mark Zuckerberg’s recent testimony to lawmakers in the wake of the Cambridge Analytica scandal gave the world a wake-up call regarding how technology and social media systems are influencing real-world events. The way technology has been used to manipulate and steer human evolutionary instincts is a massive, global problem that currently affects over two billion people.
While we may be left with more questions than answers at this stage, it is reassuring to see global leaders in technology and business stepping forward to demand a new lens through which to assess the ethics of social media and technology. We need to keep the conversation going, and Social Media Week, Luminary Labs, and the Center for Humane Technology are all endeavoring to navigate these murky ethical waters.