It is a warm evening in April 2004. I am 23 years old. I am sitting in my underwear in my office. I’ve not showered for days. Detaching my sticky posterior from the surface of my office chair causes me to grimace. A blank screen stares me in the face, longing for something, anything to fill the empty pixel space. It’s time for me to write my final English 400 paper — the paper that I was supposed to have been working on for months.
It is 4:30am.
In retrospect it’s easy to understand how this happened. A PC game called Starcraft cast its shadow over my deadline. Wikipedia had drawn me into a destructive click-well where I glutted myself on the fictional biographies of Marvel superheroes. It was always about one last page of a comic book, one last link followed on Wikipedia, one last web search to make sure that rash I had found earlier that evening wouldn’t be the death of me. To any student who grew up in the internet age, a “digital native”, this must sound at least remotely familiar.
Chastened, I did what a determined weight-loss dieter might do — I started to keep a log of my time spent staring at screens. After a week of diligently tracking myself, I was floored by the results: I was spending in excess of 9 hours a day online or watching television.
Of course, I never ended up handing in that English 400 paper before the deadline. After all, the business of analyzing my shortcomings was far more consequential than the minutiae of graduating. It was not the last time that a deadline failed to rectify my lapses in self-control and focus — time and again I would repeat this sad story, of a weak-willed student battling with a computer and an internet that seeks only to feed me.
Like many in my generation, I grew up embracing technology and digital media without truly appreciating the cognitive and sociological effects it was having on my mind and inevitably my productivity. Yet here I am — married, a parent, employed and (mostly) productive when I need to be. That same student that pushed the limits of procrastination and sloth to such extremes is still with me. It’s not a stretch to say that he’ll never go away.
No Other Road
Not long after publishing this post, I’ll be deleting my Facebook, Twitter and Instagram accounts. I have failed time and again to halt their annexation of my time and focus — social media has left me with no recourse.
When I first considered leaving social media altogether, the first worry to enter my mind was how my friends would scorn me for seemingly blocking them out of my life. Then I thought better of them (and the world). The people that need to reach me will find me some way or another. The cognitive overhead of being distracted by my newsfeed just isn’t worth the cost to my professional and personal life.
But I knew I had to explain myself. Here are my reasons listed in a TL;DR fashion:
- I’m not the reader I used to be. I’ve become a chronic skimmer. The mere thought of reading anything that requires more than ten seconds of my time to digest and comprehend turns my stomach ever so slightly. I long for a return to being the mindful reader of my youth.
- I want more time to do the work that really matters. Deep work — the kind that requires focus and determination. I should be spending my evenings honing my craft, not enviously reading about the amazing work my peers are producing.
- Here’s the most important reason of all: I’m convinced that I distract myself in order to avoid facing the problems in my life that most need solving. My penchant for avoiding ambitious projects and challenging opportunities lies at the creamy centre of my distraction problem — and quitting social media is just the tip of the iceberg.
What are we talking about here? It is clear that my problems go much deeper than a propensity for checking my phone every few minutes for new Facebook posts or logging onto Slack at 1am. My problem is about more than social media. It’s about a pervasive consumption mindset — and for most human beings this is not unusual behaviour.
A Temple to Consumption
In North America, adults spend upwards of 11 hours a day consuming media across social networks, television, and radio. One can imagine how much higher this number is for knowledge workers who spend most of their days in front of a display. In 2015, Americans consumed media at a rate of 74 gigabytes per person daily, totalling an estimated 8.75 zettabytes nationally per year.
And social media keeps spinning these statistics to new pinnacles. Newsfeeds are designed to keep users engaged through a worrying synergy of user-to-user interactions bound with the limitless content generated by the inter-webs. Our society has built a grand temple to consumption, and lavish are the sacrifices upon its altars. Amongst these sacrifices — humanity’s ability to focus on weighty issues and our threshold for parsing quality information. We have laboured to erect a temple built atop tels made of clickbait articles and misleading sensationalist titles.
A 2016 Pew Research Center study shows that 67% of Americans are Facebook users, with 44% of those people reportedly reading news off their feeds. A majority of these social media news readers come across news organically (62–63% of Facebook and Twitter users get their news when coming onto the sites to do other things). This might allow us to infer several things. Firstly, that passive media consumption is emblematic of how a majority of us parse content on a daily basis — meaning that our engagement with content is likely distracted and cursory at best. Secondly, that the content we see in our feeds is heavily weighted to increase the likelihood that we’ll engage with it.
We know that content appears in our feeds based on how an article performs on the social network (by virtue of likes, comments and shares), in addition to what other content you interact with and the interests of your friends. This would mean, ostensibly, that the content we’re likely to see in our feeds most often confirms and reinforces our personal beliefs and perspectives of the world. It feels good to read something that reaffirms your beliefs, good enough to do again, and again, and again. The newsfeed is not just about seeing what your friends are doing — it is also a wall of self-validation.
For all the facts and research circulating around our world’s love affair with media consumption, how would one visualize the effects of this? It’s tempting to imagine a population of shut-ins: skin pallid and eyes bloodshot after binge-watching House of Cards over 2 days and 4 hours (this is a real number). In my case, this vision isn’t too far off the mark.
The Click-Well of Despair
As a young designer in my first job out of school, I was tasked with designing a poster for a charity event. As I sat down to sketch the idea out, my mind wandered to the meaningless tedium of reading my stable of email inboxes, checking to see if a parcel I’d been expecting had reached my door, or seeing who had recently commented on my Facebook post about my trip to the latest and trendiest donuterie.
When the clock finally hit five that day and the office hum had died to the faint clickety-clack of several co-workers growing vegetables on a Facebook farm, I finally got to work on that poster. I conceptualized, I sketched and finally it was time to execute — except, I wasn’t quite ready yet. The design required a special visual effect that I’d never worked with before. Perhaps I could write a script that could automate the effect, I thought, or find a tutorial that would teach me how to generate the effect.
Hours passed. Finally, at 9pm, when a tired old security guard came about to flush out the last dregs of the working class from the building, I awoke from my reverie, realizing I had been reading a post on Stack Overflow for hours. My Illustrator file remained empty, a solitary layer to its name. Dejected, I went home, determined to work through the night to meet my 9am deadline the next morning.
I don’t think I need to describe the rest of this night to you. Somehow, some way, the boundless distractions of the inter-webs crept into my evening. The rock-solid schedule I had pencilled in my notebook on the bus home had read:
11pm to 2am: work on poster.
It should have read:
10pm to midnight: watch old episodes of Seinfeld I had downloaded yesterday,
Midnight to 2am: Wikipedia gluttony,
2am to 7am: fall asleep at my keyboard, at the feet of an empty Adobe Illustrator file.
This might sound familiar to you. If it does not, I may strike you as a frivolous human being, incapable of honouring my commitments or recognizing the brevity of life. However, this exact scenario plays out in offices, schools and homes every day. It is beyond procrastination, beyond laziness and far past the point of preoccupation. I want to use the word ‘addiction’ to describe this condition, and most would be inclined to agree with this assessment.
An Interlude: Beyond Social Media
You’ll recall that my primary objective here was to lay out for you my journey away from social media. It’s important that I acknowledge here that my issues go far beyond the scope of social media’s wiles upon me. I’m often told that I confuse my need for so-called surveillance gratification (constant monitoring of user-posted status updates and photographs) with an internet addiction. There are umpteen avenues down which I might come across time-wasting and dopamine-inducing web experiences — RSS, Feedly, Reddit, Politico, to name a few of my favourite sources of procrastination-worthy content. My point is that social media spurs on this consumption mindset, a high-pressure hose nozzle equipped atop an already ceaseless tumult. In other words, the force of social media, with its captivating layer of user-to-user interaction, leaves me powerless to stop myself.
I know that I’ve not always been this way: some time in the fuzzy past I was an elementary school student content to read Eyewitness Books, gleefully flipping from cover to cover, appreciating every glossy picture and reading every sidebar. To think that I once read Paradise Lost in a week, nose buried between its covers for hours a day, seems now an impossible endeavour. How did this happen to me?
Don’t Call it an Addiction
The very act of Googling for information and finding the correct result spurs the production of dopamine, a chemical connected with the brain’s reward mechanism. The pings of incoming emails, text messages and Facebook comments give our brains a hit of the neurotransmitter that keeps us wanting more. We might even exhibit an elevated heart rate and the slightest high, finally followed by what might be described as an emptiness. Repeat.
What I’ve been most interested in is the power of dopamine to affect our behaviour when the “rewards” come about in an unpredictable fashion. Much like how casinos reward slot machine junkies with the occasional payout, email notifications and newsfeed updates arrive sporadically, prompting us to refresh and check in repeatedly, reinforcing a habit of expecting a reward. This is known as a variable-interval reinforcement schedule, described by Tom Stafford, a lecturer in psychology and cognitive science at the University of Sheffield, as the best way to train animals — providing rewards in return for tasks performed at irregular intervals. Performing the task without necessarily receiving the reward becomes habitual, and without knowing with any certainty when the next reward will arrive, a seemingly self-destructive behaviour is born. It’s easy to see how this plays out in browser refreshes for newsfeed updates and the constant, almost desperate, opening of an email app.
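For the programmers among you, the slot-machine dynamic above can be sketched as a short simulation. This is a hypothetical illustration of my own, not drawn from Stafford’s work: “rewards” (new notifications) arrive at unpredictable, exponentially distributed intervals, and a check of the feed pays off only if something has arrived since the last check.

```python
import random

def variable_interval_session(check_times, mean_interval, horizon, seed=7):
    """Simulate a variable-interval reward schedule: rewards arrive at
    unpredictable, exponentially distributed intervals, and a check
    pays off only if a reward arrived since the previous check."""
    rng = random.Random(seed)

    # Draw unpredictable arrival times until we pass the horizon.
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(1.0 / mean_interval)
        if t >= horizon:
            break
        arrivals.append(t)

    # Walk through the checks; count rewarded vs. empty checks.
    rewarded, empty, last = 0, 0, 0.0
    for check in sorted(check_times):
        if any(last < a <= check for a in arrivals):
            rewarded += 1
        else:
            empty += 1
        last = check
    return rewarded, empty, len(arrivals)
```

Run it over an eight-hour stretch with a notification arriving every half hour on average, checking every five minutes, and most checks come up empty — yet the occasional hit is precisely what keeps the habit alive.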
I’m often confronted with the surprising realization that I’ve opened my browser to Facebook, seemingly without thinking, like how smokers often raise a lit cigarette to their lips that they don’t remember lighting. My brain scans the page, looking for a red notification badge, ravenously consuming the top of the feed. Scroll, scroll, scroll, nothing new, close the tab. Just minutes later, this might all repeat itself. This past election didn’t help — during the campaign season I found myself repeatedly visiting sites to read about poll numbers, or anything refreshingly asinine one particular candidate spouted off.
Sociologist Sherry Turkle warns us in her book Alone Together, that using the term addiction in regards to defining the compulsive behaviour that surrounds heavy video game and internet users might not be the most apt metaphor for describing these experiences. She writes that
“talking about addiction subverts our best thinking because it suggests that if there are problems, there is only one solution. To combat addiction, you have to discard the addicting substance. But we are not going to ‘get rid’ of the Internet. We will not go ‘cold turkey’ or forbid cell phones to our children.”
Instead, Turkle suggests that we sidestep the hopelessness that the addiction framing implies, and look inwards in order to adapt to our circumstances and write a fresh narrative around how we can live alongside technology. And she’s right — we’ve crossed the Rubicon, and leaving the web entirely is, for most of us, absolutely not an option. I’m as optimistic as Turkle is, believing that the age of the web is in its infancy and that we still have time to right some wrongs in how we abuse our access to infinite knowledge.
Epochal evolutions in how humankind consumes media are often accompanied by a backlash of warnings about the supposed deleterious effects of the new medium. The mass proliferation of video games in my generation was often touted by the media as a pervasive force for brain degeneration. But this was, more often than not, seen as hyperbole.
Direct physiological change to the brain as a result of interaction with the internet is one such reason for my departure from social media, and an impetus behind my suspicion of mass internet usage.
In his seminal work, The Shallows, Nicholas Carr describes how comforting a delusion it is for us to assume that our interaction with the internet has no lasting physical effect on our brains, since to believe otherwise would “call into question the integrity of the self”. Our primate brain carries with it the ability to form new neural pathways and connections as a result of our experiences. Carr references a study of monkeys that showed growth in the motor areas of their brains after they were taught to use tools (and even more surprisingly, the monkeys began to interpret the tools as extensions of their hands).
Changes to brain anatomy are not exclusively the result of physical actions — it’s been shown that mental tasks, our very thoughts, can have an effect on brain anatomy. These anatomical changes that occur as a result of our experiences are thanks to the plastic nature of our neurons, or as it’s known in the scientific literature, neuroplasticity.
Carr cites a 2008 study that compared two groups of internet users. The first group consisted of experienced and frequent web search engine users (called the Net Savvy group); the second group (known as the Net Naïve group) used the web anywhere from once a month to once a week. The study had both groups perform a Google search task and a text reading task while undergoing an fMRI scan. It found that the experienced web users tended to exhibit higher brain activity on both tasks compared with the Net Naïve group. However, a follow-up test showed that after just five days of an hour’s searching per day, the Net Naïve group was beginning to show brain activity on par with the levels seen in the Net Savvy group. It’s now clear that with a trivial amount of time spent interacting with the web, our brains rework themselves at an anatomical level.
The irony of this situation is not lost upon me. In our pursuit of cat videos, articles about Donald Trump’s latest tirade and the Top 10 Productivity Hacks I’m Not Using, we’ve inexorably altered our physiology.
We’re All Shitty Readers Now
About a year ago, I realized that I was having trouble reading anything longer than 300 words at a time. Most notably, I realized that using the Pocket app on my mobile and my desktop computer was adding a layer of anxiety and urgency to my late night reading sessions. During the working day, I diligently file and tag articles for later reading in the Pocket app, with the intention of revisiting them after the business of the day gives way to the reassuring tick-tock of the clock near my reading chair.
The Pocket app allows users to save articles they find on the web for later perusal. It’s a fantastic app that I still use today. But I realized that, as my list of articles to be read expanded day by day, I’d never reach the end of it. The list stretched seemingly infinite screen lengths, requiring a Hermean speed of thumb-swiping to reach the end. It seemed a hopeless prospect to reach the bottom of a list of well over 800 articles of varying length.
The sheer quantity of material that lay before me began to wear on me. I frequently lost focus mid-paragraph, or found some other article more engaging (or shorter), and often stopped to check my email or scan my social media feeds. It felt as if I had lost my ability to read deeply, a feeling so dispiriting that in my despair, I marked as read all the items in my Feedly and Pocket lists.
The next day I picked up a dead-tree book, left my Kindle at home, and tried to get past the first several chapters of War & Peace. The effort was a complete bust, but I had lasted a fair margin longer than if I had read it on my tablet or Kindle. The journey back to the life of a deep and thoughtful reader won’t be easy, but I’m encouraged by fond memories of being lost in a narrative of faraway and fantastical places.
Nicholas Carr writes in The Shallows that reading on the web is an inherently distracted venture. In addition to notifications from the device, we’re accosted by hyperlinks that stop our eyes in their tracks as they make saccades over lines of typography. As our eyes rest on hyperlinks — styled differently on the web to contrast them with running text — our brains are forced to make micro-decisions as to whether to follow each link or ignore it in favour of continuing to read. It’s these deliberations that make reading in a browser decidedly more difficult. My paperback copy of Stephen King’s The Stand has the advantage of doing only one thing — lying on its back showing me dried ink on paper. It is not a portal to humanity’s collective knowledge.
There are very good reasons why tweets and Facebook posts are so easily digestible. First, sensationalist titles and bespoke (user-relevant) content increase session times and exposure to advertisement. Second, the content is kept modest in length by the interface itself. By nature, tweets are limited in their character count, and Facebook truncates text content, having users click a ‘read more’ button in order to view everything available. While there are practical design-related reasons for doing this, it’s also clear that social media platforms understand the limited scope and magnitude of human focus. When asked how people read on the web, Jakob Nielsen famously replied succinctly: “They don’t”.
Work Without Depth
The cognitive overhead of being drawn into social media for micro-moments of time, whether voluntarily or by way of push notifications, is like a hurricane tearing through our most valuable thoughts and creative processes. The distractions that we generally view as merely part and parcel of living in a connected world are destroying what obsessive and bookish nerds such as myself hold most dear — depth. I believe that we all wish to look deeply into our work and find value in what we do each day. But it’s not in the web’s best interests for us to spend hours away from its advertising and rapidly digested content.
Cal Newport, a computer science professor at Georgetown University and champion of depth, defines deep work thus:
“Professional activities performed in a state of distraction-free concentration that push your cognitive capabilities to their limit. These efforts create new value, improve your skill, and are hard to replicate.”
How we work today can be described as constant context switching, a costly and ultimately frivolous chain of actions that begins with allowing any possible derailing external factor to encroach on our most formidable working hours. Considering that the average Facebook user checks their profile fourteen times per day, and that multitasking across push notifications and messages produces a measurable drop in overall productivity, it is a marvel that employers don’t make efforts to communicate the dangers of this potentially costly black hole of efficiency.
Nothing can be more dispiriting than coming home, dragging the lethargy of the day in tow, with nothing to show for it except answered emails, a few tweets, Facebook likes and a cacophony of Instagram hashtags that no one will ever use. One day when I look back on the body of work I’ve produced, the observation must be that I’ve tackled problems worth solving. If instead I am faced with the shallow veneer of a distracted mind, the regret that would surely follow would consume me utterly.
Letting go of social media is a big step, especially for a compulsive personality like mine. But as Cal Newport reminds us:
“Sometimes in order to go deep, you must first go big”.
Through a Glass Darkly
With more than a couple all-nighters like the one I described under my belt, I am admittedly becoming more pragmatic in my approach to work. I find that I can mitigate the inevitability of procrastinating on a task by clearly defining the goal of the project I’m working on. Breaking down the tasks needed to get to ‘finished’ into manageable chunks allows me to feel like I’m not always forgoing an admittedly pleasurable plunge into a wretched subreddit.
Tools like Freedom and SelfControl stop me from accessing websites that might easily derail me. RescueTime gives me an overview of how I actually spend time on my devices. I turn off email, Slack and Twitter notifications in favour of phone calls. I prefer pen and paper when drafting the first iteration of a design. I’m a distraction addict on the road to recovery. (P.S. I have to admit at this point that whilst writing this segment of the post, I spellchecked nearly 20 times — worrying behaviour, but infinitely less distracting than checking my email.)
I want to believe that these tools are enough — enough to keep me on track, enough to stop any possible relapses into nights of ceaseless web-based procrastination. I want to believe that all this will fix me — these meagre software packages and an underlying fear of being mired in the mediocrity of being a mindless media consumer. The act of creation, I feel now, is closer and less clouded by the allure of social media’s surveillance gratification intoxicants. Turkle wasn’t kidding when she wrote that
“technology is seductive when what it offers meets our human vulnerabilities. And as it turns out, we are very vulnerable indeed.”
At Our Core
The most compelling reason I can think of to leave social media is perhaps the most nebulous to communicate. I am certain that in the act of constantly absorbing media, I am attempting to evade some core discontent, a deep-set inadequacy or pain that needs frivolous distraction to prevent it from rising to the surface of my consciousness. Nietzsche saw this flight from the self long before the internet existed:
“Even in our everyday work we slave more fiercely and busily than necessary in order to earn a living because it seems even more necessary to avoid reflection. The haste is universal because everyone is running from himself.”
Our contemporary lives are certainly devoid of self-reflection. As a society that bills cell phones by the second and rushes diners out of busy restaurants at lunch, we are told daily that idleness is an evil to be marginalized and planned against. Nietzsche continues his commentary:
“The air around us is filled with spirits; every instant of life wants to tell us something, but we refuse to listen to this ghostly voice. When we are alone and quiet, we fear that something will be whispered into our ear, and for this reason we hate the quiet and drug ourselves with social life”.
I don’t know what lies in the heart of me, what is driving me to smother my feelings with a ceaseless stream of distractions. I want to know, need to know. And I’ll never know, until I seek out the “quietness” that Nietzsche describes.
In a commencement speech at Hampton University in 2010, US President Barack Obama described exactly the concerns I’ve voiced.
“You’re coming of age in a 24/7 media environment that bombards us with all kinds of content and exposes us to all kinds of arguments, some of which don’t always rank that high on the truth meter. And with iPods and iPads; and Xboxes and PlayStations — none of which I know how to work — information becomes a distraction, a diversion, a form of entertainment, rather than a tool of empowerment, rather than the means of emancipation. So all of this is not only putting pressure on you; it’s putting new pressure on our country and on our democracy.
Class of 2010, this is a period of breathtaking change, like few others in our history. We can’t stop these changes, but we can channel them, we can shape them, we can adapt to them. And education is what can allow us to do so. It can fortify you, as it did earlier generations, to meet the tests of your own time.”
Obama couldn’t be more insightful — like Turkle, he believes that education and understanding can help us navigate this new uncertainty. Without an understanding of how we interact with these new challenges, we are certain to fall victim to a charismatic over-brain spawned from the internet.
I’ve learned through observation that the concept of the narrative is important to how our human minds conceptualize the world around us. Neuroscientist Susan Greenfield describes a narrative as
“a sequence — a chain of cause and effect in a non-random, strictly ordered sequence. Any narrative will, in some way, echo a life story. Stories arrange events into a context, a conceptual framework, and this order creates meaning.”
The idea of viewing our reality as a collection of narratives allows us to make sense of uncertainty and the vast body of knowledge that our species has rested its fate upon. Somehow, the instant, scattershot nature of the internet and social media has shattered this aspect of our perception. Information is broken up into shards of facts, and we are prevented from forming a deep understanding of how those shards connect together to form the world we live in.
It is late on a Thursday night, November 2016. I am 35 years old.
I am at home, writing the end of this post. Considering the personal nature of what I’ve written, I hesitate as my mouse hovers over the publish button. My breath rises, a ghostly wisp, as it escapes my mouth and swirls about as if making perceptible the erratic nature of my thoughts.
And hesitate I should — what I’ve shared with you is both personal and perhaps even shameful in some eyes. This tale of compulsive behaviour and TCP/IP incarceration will likely live on in perpetuity online exactly as I have written it. Nothing escapes the infinite memory of the internet.
Now consider the billions of internet users on this planet. Nearly 3.5 billion people accessed the internet in 2016. With low-cost internet-enabled mobile devices gaining ground in developing countries, we’re on track to see over 4 billion people online by the year 2020. Imagine everything I’ve shared with you about myself playing out exactly the same way on the other side of billions of internet-enabled devices. When considering this staggering possibility, I believe that what I’ve shared is less a personal story than a reflection of the human condition, a timeless human propensity to seek out distraction.
My fingers instinctively begin to type in the letters ‘f-a-c-’ in the browser URL field, yearning to log onto Facebook for one last look around. My iPhone’s home screen is a paragon of productivity, barren of Instagram’s gradient vortex. I look at the clock and then at the tome that sits atop the desk before me. It is a copy of Gibbon’s Decline and Fall of the Roman Empire, a treasured read that I’ve kept since my school years. I’ve since been too distracted to pick it up again. I flip the worn cover open and begin to read in earnest.
It is 4:30am.
Thanks to my wife Christina and my brother Michael for believing in my message, and to my friend Talent Pun for applying his critical eye to my writing.