“The greatest problem of humanity is that we have paleolithic emotions, medieval institutions, and god-like technology.” - E. O. Wilson
Earlier this year, I was asked to speak at a professional development event for my high school alma mater. I saw an opportunity. A portion of the talk was to be focused on current trends in mental health, and there are few trends more concerning to me than the relationship between people and interactive screen technology.
I have spent the past 10 years counseling families who were perplexed by their children’s behavior around screens, families whose children became physically violent, attempted to jump from moving vehicles, and threatened suicide when their devices were taken from them. I worked with multiple 18-year-old men who still rode in the back seat of the car as Mom or Dad drove them to my appointments, headphones on, eyes glued to their phones. I have worked with families through gaming addictions, school absences, and screen detoxification periods. One memorable young man described literal panic and shortness of breath whenever he became aware that the WiFi was turned off, a tactic his parents employed at night so that he would go to bed. He requested that they turn it back on once he was asleep because he felt safer knowing it was there, like a tether holding him to the spaceship.
If you haven’t experienced this in your household, you might think these cases are extreme or rare. They aren’t. Or you might think the parents were negligent, laissez-faire, or naive. They weren’t. Fear not, parents: this is not a point-and-shame piece. But I am getting ahead of myself.
Excited about my speaking invitation, I took to Google Scholar to pull some studies. In no time I found a match to my query, a 2022 meta-analysis of 33 studies published in the journal Professional Psychology: Research and Practice. This was going to be easy. There it was, plain as day: “Across studies, evidence suggests that screen media plays…(wait for it)…little role in mental health concerns.” I won’t quote what came out of my mouth next. This couldn’t be right.
Back to the search engine, my fishing expedition was underway. I quickly found another study, this one in the Journal of the American Medical Association. With a sample of 5,412 adolescents, the authors reported worse mental health and stress associated with higher screen use and better social support and coping with lower screen use. Now that’s what I’m talking about! But then I found more conflicting information. A systematic review of eight studies showed mixed results; smartphone use was associated with reduced executive functioning, but increased time with video games was associated with improvements in working memory. I wasn’t going to find a clear answer.
Anyone with a science background would say that I have gone about this all wrong, and they would be right; a proper review of the literature is more systematic than this. Nonetheless, my experience illustrates the vast spectrum of findings related to the issue, and it’s enough to cause a collective action paralysis. Whom do we trust? How do we make decisions about our children and their future when the scientific community hasn’t rendered a cohesive recommendation?
The paralysis has already set in. Our post facto regulatory system tends to see how innovations fare in the market without considering second- and third-order effects. Before you know it, what was once an innovation has become foundational in our everyday lives. In 1962, Rachel Carson warned us about the dangerous chemicals in circulation that were already causing serious environmental and public health concerns. Since then, we have introduced countless products and services without consideration for their externalities: BPA-laced food packaging, trans fats, predatory gambling practices, synthetic CDOs, PFAS in our water, polarizing social media algorithms, and YouTube ads targeting five-year-olds. All of these have enjoyed the marketplace as their playground before being identified as problematic. All of them are still playing.
We are all well aware of the impact phones have on us. We can relate to the fatigue, depletion, mental fog, irritability, and regret that follow an hour of scrolling; we know about the social media traps, mindless phone games, email invasions, and internet rabbit holes that pull us away from the lives we want to live, the lives it seems everyone else is living on our Instagram feeds. Addictive phone habits are so ubiquitous that they’ve already been accepted as a given. It doesn’t take a degree in cognitive science to see what’s happening. The sad, familiar shrug of learned helplessness is creeping in, and we are struggling to extract ourselves from these toxins that have somehow gotten baked into our daily lives. This is what we know about ourselves already, individually and collectively, so what does that mean for our children?
There’s so much novelty in the world of a small child, and each discovery sets off a cascade of responses. Children are wired to respond quickly and sensitively to stimulation from their environment; it’s how they learn. Most parents can identify a kind of attention lock that occurs when a small child is in the presence of an active screen. The bedazzling, over-saturated colors, the exaggerated facial expressions, the rapid movements, and the alluring sounds outcompete almost any real-world environment. When I look at the programming for children today, what disturbs me most are the dialed-up emotions on display, particularly the over-expression of cheerfulness that depicts an artificially sweet and sticky world. The reward cycle that precedes addiction is already in place as the child’s dopaminergic system is thrown into overdrive, headed for the inevitable cliff when the show is over. Parents are unfairly thrown under the bus as they become the messengers of bad news: that things don’t always work out the way they do in neatly packaged cartoon segments.
We are still lugging around the same brains that our ancestors developed over 100,000 years ago. Just as it was then, the primary conduit for learning and development is play. Games help us learn about social norms, solve problems, and improve coordination, among other things. The advent of computing technology has created new gaming ecosystems, and many of them are remarkably inclusive. Whether table-top, athletic, or digital, games create a play space where people of all ages can experiment with different sets of rules and social norms, where we can collaborate or deceive, take risks, solve problems, and push our physical and mental limits. Some game spaces encourage users to become involved in design; they can build levels, characters, and explore different rule sets. It’s not surprising that adolescents and adults in the autism community have found a home in gaming, where they can explore and connect with others while doing away with the punitive social norms of everyday life.
Games and other art forms should evoke a connection between creator and audience that helps you understand something about the world and yourself. A partnership develops in which the artist’s creation conjures up an idea that binds with your imagination to produce something unique. The artist can create art for you, but she cannot experience it for you.
Unfortunately, much of what constitutes video gaming today is not really play but gamified reward systems designed to maximize user engagement, create play-to-earn tokens, display ads, or sell digital merchandise. As Eric Zimmerman, game designer and author of The Rules We Break: Lessons in Play, Thinking, and Design, said in a recent interview, “Gamification strip mines the superficial aspects of games…In order to get people to use an airline, they will have a frequent flier program and use things like points and levels that we get from games…but in strip mining the superficial elements, they leave the soul of game behind.” What is true of frequent flier programs is now true of the games themselves. They have lost their souls, and users have become, well, users.
Many of today’s games have morphed into elaborate currency exchanges where it seems the sole purpose of playing is to develop enough digital assets to trade or sell them off on the black market. Popular sports games renew annually, updating little more than the player rosters while raking in billions. Other games have done away entirely with the element of fun and gone straight to addiction. These so-called “idle games” task the user with monotonous chores leading to no outcome in particular. Cookie Clicker, for example, is a notoriously addictive game in which the sole aim is to click cookies, which opens up achievements and upgrades, which then improve the player’s ability to click more cookies, and that’s it; that’s the whole game. I suppose we can congratulate its designers - if you want to call them that - for proving that it’s possible to experience reward without actually having any fun.
Our research institutions may not be able to clearly identify the link between screen tech and mental health, but Facebook (sorry, “Meta”) did. In 2018 the company discovered internally that Instagram, which it owns, was actually encouraging eating disorder behavior in teenage girls, and it buried that information until it was exposed in the 2021 Wall Street Journal exposé, “The Facebook Files.” For many teenage girls, what started as a benign search for workouts led quickly to being pummeled by content about how to lose weight, ideal body types, what they shouldn’t be eating, and, eventually, how to hide disordered eating behaviors. The authors of the Facebook Files revealed that the company’s leadership knew its algorithms were shuffling traffic inevitably toward extreme content and did little about it. Although most of us did not see this coming, it’s not hard to understand how it happened in hindsight. Facebook is monetized through ads, and so to make more money they need to increase user engagement. More attention for more ads for more clicks for more money. These are the functions of so-called dispassionate programming algorithms.
If you filled a bucket one quarter full of rocks and another quarter with sand, and then you shook the bucket for a minute or so, the sand would make its way to the bottom of the bucket. You didn’t tell the sand to do that; it’s just a mathematical function of mass and volume. If you filled a social media platform with content and then programmed it to increase engagement, the algorithm would naturally amplify extreme content because it gets more reactions (e.g., more likes, more comments, more re-shares). It’s not that social media companies set out to cause chaos (we hope), it’s just a mathematical interaction between the algorithm and the human psyche. Nonetheless, chaos is upon us, and little has been done to stop it.
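The bucket analogy can be made concrete with a toy simulation. Everything below is hypothetical - invented posts, invented numbers, and weights of my own choosing, not any platform’s actual code - but it shows how ranking purely by predicted engagement pushes extreme content to the top without anyone instructing it to:

```python
# Toy illustration of an engagement-maximizing feed.
# All posts, numbers, and weights are hypothetical.

posts = [
    {"title": "Local bake sale raises $200", "likes": 12,  "comments": 3,   "reshares": 1},
    {"title": "Nuanced policy explainer",    "likes": 40,  "comments": 18,  "reshares": 5},
    {"title": "Outrageous conspiracy claim", "likes": 90,  "comments": 310, "reshares": 160},
]

def engagement_score(post):
    # Weight reactions the way an engagement objective might:
    # comments and reshares keep people on the platform longer than likes.
    return post["likes"] + 3 * post["comments"] + 5 * post["reshares"]

# The "shaking of the bucket": sort the feed by score, highest first.
feed = sorted(posts, key=engagement_score, reverse=True)

for post in feed:
    print(engagement_score(post), post["title"])
```

The conspiracy post ends up first, not because anyone told the sorter to prefer outrage, but because outrage generates more comments and reshares and the weights do the rest - the sand settling to the bottom of the bucket.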
In a scene from The Social Network, a 2010 biopic about Mark Zuckerberg, Sean Parker (Napster founder, played by Justin Timberlake) makes the comment “Facebook is cool. That’s what it’s got going for it. You don’t want to ruin it with ads because ads aren’t cool. It’s like you’re throwing the greatest party on campus and it’s got to be over by 11.” The scene is a fiction, but it portrays a collective memory of how we once thought of ads: annoying barnacles that attach themselves to free content floating through the media ocean.
Time has shattered this idea along with our innocence as we have come to learn what ads really are: perverse incentives for social media companies to prioritize engagement over the things that matter most, the quality of our relationships, our mental health, our free time, our attention spans, even our democracy. All of these have been eroded by what Tristan Harris, cofounder of the Center for Humane Technology, calls the race to the bottom of the brainstem. “We are worth more when we are addicted, distracted, outraged, polarized, and misinformed,” he says. What started out as a virtual town square has metamorphosed into a well-tuned marketplace. Social media companies are the sellers, advertisers the buyers, and your attention is the product, harvested continuously, day in and day out, rain or shine, 365 days a year.
The tragedy in all of this is that social media holds so much potential for good. Facebook and Instagram succeeded in reconnecting old friends and family, giving power to the people, and helping small businesses reach their communities. But they got greedy. They wanted to have their cake and eat it too. They wanted credit for producing a free service, but then did something far worse than charging fees: they turned the users into the product. And they didn’t stop there. Once Facebook succeeded in securing their market share, they set out to colonize less affluent countries with Free Basics, a program offering free cellular data restricted largely to Facebook’s own app and a handful of approved sites. Sounds nice, but in actuality they are solving the problem of internet access by making their own platform, for all intents and purposes, the internet.
In 2021, the shiny new object called TikTok surpassed Google and Facebook as the most popular site on the internet, and, apparently, the most addictive. As the flagship app of Chinese company ByteDance, TikTok has developed enormous influence over the social and cultural pulse of young people around the world.
TikTok tracks the user’s keystrokes whenever a website is accessed through the app, allowing the company to amass a trove of personal data. Although not directly controlled by the Chinese government, TikTok is subject to government influence and intrusion, and it has already been shown to amplify pro-China sentiments in its algorithm to great effect. The Chinese government also recognizes the inherent mental health implications of TikTok, which is why a healthier version of the app exists for Chinese youth, one that limits its youngest users to 40 minutes per day.
We have to start recognizing that the people leading these companies are bound by economic and political systems that don’t always consider the broader public good in their balance sheets. Free is not free when ads, political influence, or the expectation of a 10X return on investment are involved. We pay for these services with resources far more valuable than money: our time, our attention, and our opinions.
As screen technology becomes more immersive, our imaginations are becoming weaker and our tolerance for ambiguity fainter. Web3 techno-solutionists promise (once again) to save our humanity with new source code, billionaire self-styled “effective altruists” (mostly a bunch of white guys) want to justify the exploitative origins of their obscene wealth by ushering us into a digital Age of Enlightenment, and Zuckerberg’s virtual reality metaverse is poised to rescue us from the drab confines of reality. Finally, you’ll get to wear your favorite selfie filter all day long! Attempts to disguise the digital takeover of our consciousness are now discarded as companies ramp up their production of VR headsets that render the user literally blind to reality, a fact that might seem comically on the nose if only it weren’t just so.
Some have praised virtual reality as an opportunity to connect more deeply or develop empathy by walking realistically in someone else’s shoes. The Anti-Racism Extended Reality Studio (UA-ARXRS), for example, is a project designed by the University of Arizona Center for Digital Humanities to give people first-person observations of common experiences of racism. By immersing their participants in the identity of someone else, they hope to cultivate a level of empathy that might not occur through intellectual discussion. Founders of Optima Classical Academy, the world’s first virtual reality school, want to create rich learning experiences for their students that take them far beyond the boundaries of a typical classroom, where they might encounter otherwise unreachable places, cultures, galaxies, and epochs.
There’s incredible potential in these developments that cannot be overlooked, but of course, there’s little danger in that happening whenever business and venture capital are involved. What’s more concerning is that we adopt the next big thing into our daily lives, work, and learning environments without asking why and at what cost. Schools distributed iPads and personal computers to their students with remarkably little investigation of how they would support learning, and in doing so they have granted themselves the authority to decide when and how our children are introduced to screens. Today, much of what we call “individualized learning” takes place on these devices with gamified learning programs. It’s amazing that with all this dazzling technology, we still manage to bore students to tears.
I believe that if we had taken the time to think things through, we would still have adopted screen technology in our schools, but with one huge difference: we would have insisted that technology meet the needs of learning, not the other way around. That’s how Khan Academy came into existence. Sal Khan needed a way to tutor his cousin and used Yahoo! Doodle Images to illustrate concepts. When others began asking for help, he started uploading videos to YouTube, which eventually led to founding the nonprofit organization that is widely considered one of the most innovative institutions in modern education. Khan Academy’s technology has changed little in the 14 years since it was founded, but they have found myriad ways to use these tools to increase and deepen the connection between teachers and their students.
The interaction between screens, attention, and learning is a deeply personal subject for me. I was diagnosed with ADHD in grade school (it was called ADD back then), but since the diagnosis is so egregiously imprecise, it’s more accurate to say that my attention builds slowly and does not shift easily. As an adolescent I resented taking stimulants even though they were quite effective, and, to the eventual chagrin of my parents, I discontinued them in secret. It took most of my twenties to build systems to protect my concentration. It is a resource I guard fiercely, and yet, even after years of practice, I periodically slip into one screen sap or another: over-consumption of pointless news during election cycles, mindless phone games, crossword puzzles, a recurring obsession with online chess, and spiraling forays back into social media.
In November of 2022, when I decided to reboot the Kingsbury Newsletter and launch an independent series of essays, it was clear that I needed to do at least some promotion on social media. I stepped back into the fray, reclaimed my abandoned Twitter handle, and started checking LinkedIn again. I found little enjoyment in it. Despite the fact that LinkedIn feels obligatory and inane, and that Twitter is a bit, well, Musky, I couldn’t stop myself from falling into the attention vortex. Just like kids clicking cookies, I was hooked on something I didn’t even enjoy. Lucky for me I did not reactivate my Facebook account.
After only a couple weeks of carpal tunnel scrolling, I had stopped reading and writing entirely. I became irritable and depressed. I sat at my desk staring at a mile-long to-do list and couldn’t muster myself to push through it. Instead, I read about ways to grow a Twitter following from scratch (still 20 last I checked) and fell into despair at the thought of doing all the things they recommended in blogs, like following people just to get followed back or commenting on posts to draw attention. But most of all there were the rabbit holes - endlessly deep, mind-numbing, soul-crushing rabbit holes. Even though my phone use is less than the US average, its correlation with my mental health is profound and undeniable. Tracking stats and browsing social media diminish my desire to do the things I love: read books, exercise, spend time with my family, sketch out ideas, and write.
Look beyond the words on this page and you will find me, desperately clawing out of this malaise, trying to reclaim my sense of purpose and meaning. This article is my ticket out of the abyss. I have had the privilege of witnessing many adults and adolescents free themselves from addictive gaming and obsessive social media scrolling, and I remember their struggle as I also seek to find a balance for myself.
We can take back the power that was siphoned from us. After all, we did not cause this. No one asked to be surrounded by technology so immersive that it invades our thoughts. We did not ask for Silicon Valley to procure the attention of our children and sell it to the highest bidder. We were planted here, burdened with the ubiquity of screen technology just as family units were becoming more isolated from one another. It takes a village, they say, but we don’t have a village anymore.
We cannot afford to wait for the scientific community to produce cohesive guidelines, nor can we wait for government policy to catch up with exponential tech, nor can we expect a marketplace so addicted to the bounties of ad revenue to self-correct. The tide has pulled us too far from shore to be rescued, but it’s not as though we don’t have a vessel and some oars. We have to find each other, and then we have to find our way home. It may seem like everyone around you has given up on this, but they haven’t. Each person and each family is struggling to figure out how to live, as they say, in real life.
As writer David Foster Wallace put it, “the really important kind of freedom involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day. That is real freedom…The alternative is unconsciousness, the default setting, the rat race, the constant gnawing sense of having had, and lost, some infinite thing.”
If media companies are so desperate for our attention, then we should get to decide what our attention is really worth. Let’s make 2023 the year that we end the tech bro magic show. We can scrap our New Year’s resolutions and just opt out of the things that make us feel bad about our lives in the first place. We can turn off social media to love our bodies as they are, or set aside Slack chatter for a real life conversation with our kids. What is left when we opt out of the constant barrage of attention-hogging, dopamine-dispensing media? Well, at first, there’s a dullness, a blunted feeling of emptiness and wanting. But then, slowly but surely, there is fresh air, birds chirping, people laughing, wind blowing in your face; there is love and family and togetherness. There is, in short, consciousness, and the graphics are superb. Each of us must decide what screens are for, and more importantly, what they are not for. It’s time we exercise our power to choose.