The Age of Screenagers

Bruce Wilson, PhD

“I think many people have this sense that something about modern society - the screens, the noise, the traffic, the constant busyness - has approached a point where living in the world feels somewhat unhealthy.” – Michael Finkel

 

Remember when tech companies used to advertise their products as ‘user friendly’?  Whatever happened to ‘user friendly’?  It disappeared, and the reasons are obvious. 

Screenagers Not Teenagers

Modern life starts off not with a rattle, or a stuffed animal, or a book, but with a digital device, usually what we call a smartphone, a label that is questionable to begin with.  From an early age, sometimes still in the pram, modern-day children are handed a device to play with, which frees parents to be on their own devices and serves as a very economical form of baby-sitting. 

When we reinforce behaviour, it sticks.  Young people are being programmed to be more and more digital, and this practice generates billions of dollars in corporate profits. 

The screenager’s life span is somewhat limited, however, because the pace of change in technology is now exponential.  You may know technology better than someone older because you were born into what is current, without any knowledge of what came before.  However, your expertise is guaranteed to waver as change continues to accelerate and you become the one struggling to understand what the new technology is all about. 

This allows tech companies to stop worrying about ‘user friendly’, because a new generation will always come along that knows only what is currently up to date.  Of course, this means each generation will have to realize, as time goes on, that it is falling behind in its knowledge of technology. 

Each generation will then experience what baby-boomers complained about when they realized they were lost in the modern digital world.  This was mainly because baby-boomers were familiar with something else, like a map instead of a GPS, before all the changes in technology occurred.  Many members of this older age group have simply opted out of technology, giving up because the stress of adapting to constant change was just too great.

Adjustment Disorder

Interestingly, the diagnosis of ‘adjustment disorder’ was revised in the DSM-5, published in May 2013.  Do you think the fast-paced growth of technology had anything to do with this happening?  “Adjustment disorder is defined as a maladaptive reaction to an identifiable stressor—such as divorce, job loss, or illness—resulting in emotional or behavioural symptoms that are out of proportion to the stressor and cause significant impairment.” (1)  

Do you think technology could be such an ‘identifiable stressor’?  It seems strange that divorce, job loss, and illness existed long before 2013, yet the need for this diagnosis was recognized right about the time many people were not coping well with the exponential changes in technology.  When a portion of the population opts out of technology because they are not coping, could that be construed as an ‘identifiable stressor’?  Perhaps the older generation’s trust simply collapsed under the accelerating pace of constant change.

Future Shock    

Over 50 years ago, Alvin Toffler disturbed and challenged the world with his classic work Future Shock. Toffler predicted that the biggest issue facing future generations would be our ability to adapt to the accelerating pace of change. Does the modern world embody many of Toffler’s ideas? Toffler predicted that environmental overstimulation would not only impact our physical and social worlds, but also our psyche.  

The accelerating pace of change is the key phrase here.  Technology not only initiates and propels this change, it embodies it.  When will the iPhone 97 be coming out?

Present Shock

In 2023, according to Google CEO Sundar Pichai, the greatest change to human existence, AI, is about to happen. Pichai admitted that the AI industry has a “black box”: an area we cannot predict or know about at the present time. He also admitted that we are going to need more than engineers to navigate the future of AI; we are going to need social scientists.

Interestingly, social scientists are deemed necessary only after the implementation of technology, not in its planning phases. This sounds very familiar. Isn’t this the same situation we find ourselves in with smartphones and other screen devices in the hands of two-year-olds? And over time, what we have learned is that these devices are creating a population that is more distracted, compulsive, lonely, depressed, and anxious.

The time to incorporate knowledge about the social impacts of technology on society is in the pre-implementation phase, not post-implementation. The history of technology to date is all about: “Because we can, we should.” Social scientists are now being drafted as firefighters, expected to clean up after the techno-bias that pushed society into this place too early.

Technocrats need to be prepared to investigate the potential impact of their creations before thrusting them onto an unsuspecting and uninformed population. Saying we have a “black box” is an admission that we are not ready to thrust AI on the public.

We have already started down this road in many areas, but more research is needed before AI becomes a liability rather than an asset. Studies indicate that in some current contexts, the downsides of AI systems disproportionately affect groups that are already disadvantaged by factors such as race, gender, and socio-economic background. (2)

Pichai admitted that fake news, already a major problem, could become even more problematic with the onset of AI.  The term “black box” is interesting, since that is what we usually look for after an aviation accident. We need to look at AI pre-accident.  Maybe humans are not ready to become the detritus of a failed human experiment.

Creating the flesh of technology, the machines, is the easy part. Alongside such efforts, designers and researchers from a range of disciplines need to conduct what we might call a social-systems analysis of AI: assessing the impact of these technologies on their social, cultural, and political settings. (3)  We need to qualify and quantify the spirit of AI.

How will AI affect the spirit of being human in a world that relies on superhuman machines? Will the human psyche adapt, as the engineers suggest? Or is there some aspect of human adaptability the engineers have not considered? We need that information before, not after.

References

1-American Psychiatric Association. (2013). Diagnostic and Statistical Manual of Mental Disorders (5th ed., DSM-5).

2-Crawford, K., & Calo, R. (2016). There is a blind spot in AI research. Nature, 538, 311–313.

3-Barocas, S., & Selbst, A. D. (2016). Big Data’s Disparate Impact. California Law Review, 104, 671–732.