Whenever I speak at a conference, someone always asks me to weigh in on the controversy over whether to label excessive cyber behaviors as addiction. I usually begin to tackle such questions by sharing conversations I have had with video game designers, marketing strategists, and industry executives. These people understand how the brain works, and they use this knowledge to design games and applications that embed themselves easily into the brain's reward circuitry. The dopamine circuits most closely associated with reward and pleasure become so active while a person plays a certain game or uses a certain application that these activities frequently become preferred. In a growing number of cases, the preference is so strong that individuals begin to turn away from friends and family, career opportunities, and even activities that were once a source of enjoyment.
News outlets focus on sensational and outlandish stories: the 16-hour-a-day World of Warcraft devotees, or the death from deep vein thrombosis after a 40-hour gaming binge. The more troubling reality, however, is that the average American child spends close to five hours a day in front of a screen. Excessive screen time correlates with obesity, attention issues, sleep troubles, poor performance in school, and social problems. America’s boardrooms are even starting to recognize that employees’ screen habits can hurt the bottom line. In a series of interviews with executives for a recent article in the New York Times, the consensus was “that the lure of constant stimulation — the pervasive demand of pings, rings and updates — is creating a profound physical craving that can hurt productivity and personal interactions.” Realizing the potential for harm, many of the giants of Silicon Valley have begun offering mindfulness classes and exercise programs, and suggesting healthier ways to balance and integrate screen time. It is of course a great irony that many of the developers of these alluring cyber amusements are now realizing their potential downside. The rest of society, however, lags far behind.
While it is heartening that companies like Facebook, Zynga, Microsoft, and eBay have begun to address this problem within their own organizations, I wish they would take some responsibility for the wider world as well. The Times article also pointed out that many companies view their activities through the “Fast-food Paradigm”: while they may provide cyber “junk food,” they are not responsible for the choices people make.
As I have in the past, I call on the companies who profit from technology to spend more resources on public service campaigns alerting citizens to the dangers of excessive technology use. Parents need to understand age-appropriate levels of screen time, and must be educated on how to guide their children properly, so that lives do not get swallowed up by the screen. Even mental health professionals lack the basic information needed to recognize cyber-related problems. The solution lies not in eliminating these technologies, but in a drastic increase in awareness of how to use them responsibly.
Video games, smartphones, social networking, the Internet, and computers are powerful tools. People adept with these technologies can use them to advance and succeed. Controlling Predator drones and monitoring battlefield activity are now accomplished through video-game-like interfaces. Facebook and Twitter are essential to sales and marketing. Smartphones can radically increase worker productivity. People who play moderate amounts of video games show improved visual-spatial acuity and hand-eye coordination. But we need far more consciousness about how to benefit from these technologies without diminishing our social skills. We need more research into how increasing “screen dependence” is rewiring our brains. We need to learn to use the offerings of the cyber world to expand our opportunities for fulfillment, not restrict them.