Opinion: Going Digital Native

The term 'digital native' describes a generation that has grown up with computers, in contrast to an education system populated by people less familiar with technology. But who are these digital natives, and where are they to be found?

Feature by Michael Shea | 22 Dec 2015

In 2001, during an IT class in secondary school, our tutor turned to us and gleefully declared that teaching us was pointless as there were children about to be born who would spend their entire lives immersed in a digital environment. This new generation would have an instinctive understanding of computers that would render anyone born before the year 2000 obsolete.

That same year, the term “digital native” was coined by the education consultant Marc Prensky, in an article describing the conflict between a generation that has grown up with computers, mobile phones and video games, and an education system populated by older people less familiar with these things.

For Prensky there is a sharp divide between “native speakers” of the new digital language and everyone else: “So what does that make the rest of us? Those of us who were not born into the digital world but have, at some point later in our lives, become fascinated by and adopted many or most aspects of the new technology are, and always will be compared to them, Digital Immigrants.”

The pejorative native-immigrant analogy aside, it is unclear where the dividing line between these two groups falls. As his article was published in 2001 and referred to then-current university students, we can assume that Prensky’s digital natives were born in the 1980s. However, in media discourse the jargon used for this generation (Millennials, Generation Y or Generation Z) is variously defined to mean people born in the 1980s, 1990s, 2000s or later.

So what is a digital native? Is it someone who grew up with a Nintendo and a VCR, or someone whose first toy was a smartphone with a cracked screen? For me the year naught will always be 1995, as that was the year it seemed everyone I knew suddenly acquired a home computer and access to the internet.

This exposes another problem with the digital native concept: there are parts of the world where the majority of people have still never used a computer. Depending on where you were born, you might also find the introduction of mobile phones more influential than the advent of the internet, as is the case in many developing countries.

There is little hard evidence for a sharp divide between current and previous generations in their attitudes and behavior towards technology. One alternative to the native/immigrant dichotomy is that between digital “residents” and “visitors.” Rather than age, this model focuses on the difference in behavior between occasional users of technology, the visitors, and heavy users, the “residents” who live a significant portion of their lives online.

But aren’t all of these definitions unfair to the millions of computer-literate people born before the 1980s? Indeed, most of the research that produced the technology we currently enjoy emerged in the mid-twentieth century. The internet, the graphical user interface and the computer mouse, to name but a few, were all created by people who are now of retirement age.

There are plenty of middle-aged and elderly computer scientists who have a profound understanding of how various technological devices work. Having grown up with abundant access to such devices doesn’t necessarily equate to a better understanding of them.

Many young children today press their hands against TV screens, mistaking them for tablet computers and failing to understand that not all screens are interactive. While this certainly demonstrates a kind of instinctive familiarity with computing technology, it equally demonstrates a profound ignorance. Regardless of age, computer science, like anything else, must be learnt to be understood.

High school teacher Brianna Crowley describes the importance of teaching students to use technology for educational purposes and warns of the “myth” of the digital native: “Many adults think that because children have been immersed in technology since a young age, they are naturally ‘literate’ or skilled in using technology […] research suggests this labelling is outright false: students are no more literate with devices than their so-called digital immigrant parents.” Instead, argues Crowley, educators should design methods to teach students to use search engines, social media and other digital tools effectively.

Prensky has since abandoned his “digital natives” concept in favor of a more nuanced view; however, the notion continues to appear in media discourse on technology in various guises. While our relationship with computing technology has certainly undergone dramatic and permanent changes over the past few decades, the real divide is between those who are familiar with it, which in the UK is most people regardless of age, and those who really understand how it works, which remains a small minority.