The Myth of the Digital Native

There’s a term flying around that really gets my goat, to put it like a Nancy Drew character. “Digital native” purports to describe a young person who has grown up surrounded by digital technology. It is a dangerous, grossly misleading term that needs to be nuked from orbit if we ever hope to move forward into a healthy relationship with The Future. Here’s why.

There Is No Fork

i remember a quote making the rounds during a conference on kids and technology. i’m not sure if it was borrowed from somewhere, but the gist of it was this: we’re not excited about using forks, because we’ve grown up with forks all our lives. Kids today have the same relationship with the Internet.

It’s true: there now exists a generation of people who have never known a life without the Internet, smart phones, VoIP, video conferencing and game consoles. So it must follow, some people reason, that these new technologies are as commonplace to them as are eating utensils.

Ageism

To compare something as earth-shattering and civilization-changing as the Internet with something as mundane as a fork already betrays a lack of appreciation of the capability and complexity of the current Age … and i capitalize “Age” because i have no doubt that networked computers have ushered in a capital-A Age of human technological development: as in Stone > Bronze > Iron > Internet. An astoundingly myopic focus sees only Pinterest and cat pictures; what’s happened in the past few decades is nothing short of epochal.

The Internet has been compared to the printing press, but that invention was not made available at a very low cost to millions of people, enabling the unfettered transmission of type, sound, AND images – both moving and still – WITH automated language translation and free duplication and instant WORLDWIDE distribution. Take a much more macro view of human existence, and the printing press won’t even rank.

But more importantly, the term “digital native” subtly implies that because young people are surrounded by networked technology, they intuitively know how to use that technology. In fact, nothing could be further from the truth. It doesn’t matter what sort of technology you’re surrounded by: no one comes out of the womb knowing how to type a search engine query, pilot a spaceship, or even use a fork.

Priorities

The crucial difference, continuing with our fork/computer comparison, is that today’s parents know how to use a fork, they know the importance of using a fork, and they consequently teach their children how to use a fork. In contrast, today’s parents do not know how to use computers, they do not know the importance of using computers, and they therefore do not and cannot teach their children to use computers.

Father may know best, but he definitely doesn’t know how or why to defrag a hard drive.

Calling kids “digital natives” seems to leave technology education up to forces of nature, as if kids are somehow going to learn how to properly use a computer by osmosis – much like we’ve done with sex education, and look at how that’s turned out. i’ve seen the resulting ignorance that a tack like that produces; when i taught a group of first-year college students a few years ago, i required them to zip their midterm test file and email it to me as an attachment. The class erupted with protests. They did not know how to zip computer files. They did not know how to attach files to emails. They did not know how to send emails. And in which program were they enrolled? Video game design.

So in this computer course, you want me to … USE … a computer?

But why should they know how to send emails? Email is a very recent advancement. It’s really only seen widespread use for the past fifteen years. i didn’t really begin to use email heavily until i was working full-time in an office setting. And how were these kids supposed to know how to archive a collection of files? It’s an easy thing to do, but you don’t know what you don’t know. Built-in archiving is only a recent addition to operating systems; prior to its inclusion in Windows XP (i believe?), you had to download a shareware program like WinZip, gzip or WinRAR to archive files. It’s not really something you’d naturally know how to do until you’ve been required to do it.
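
(For what it’s worth, the mechanics are trivial once you know the concept exists. Here’s a rough sketch in Python, with made-up file names, of what “zip up your midterm” boils down to on the archiving side: bundling one or more files into a single compressed archive.)

    import zipfile

    # The file to hand in (a made-up name, purely for illustration).
    files_to_archive = ["midterm_test.doc"]

    # Bundle everything into a single compressed .zip archive.
    with zipfile.ZipFile("midterm_test.zip", "w", zipfile.ZIP_DEFLATED) as archive:
        for name in files_to_archive:
            archive.write(name)

Attaching the result to an email is a couple of clicks in any mail client, but again: you only know that once somebody has shown you.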

Tying your shoes: not incredibly difficult, but definitely a learned skill.

Wickedpedia

i found that the students i’ve taught and the young graduates i’ve mentored – “digital natives”, all – have been completely hopeless at using search engines, a skill i call “Google-Fu”. They’ve been taught by their high school teachers never to use Wikipedia as a source because it’s “unreliable”, due to the fact that “anyone can edit it”. (Teachers, if you think that just anyone is on Wikipedia writing extensive entries on complex mathematical theorems, ancient Jewish mysticism, and common practices in the manufacture of thumbtacks, kindly retire. The Future will take it from here.)

Lately, this admonishment has softened to become “fine – use Wikipedia, but it can’t be your only source”, which is equally ridiculous, because many well-written Wikipedia articles are already cross-referenced to the nines with links to all of the material that would turn up through diligent independent research anyway [citation needed]. And even articles that are further off the beaten path often have Talk pages featuring ongoing discussions on how those articles are being written and refined. Talk pages are excellent resources to help young researchers identify authorial biases and to develop media criticism skills.

And again, the fact that so many young people i meet have been told not to use Wikipedia as a source suggests an education system that, itself, does not understand the current Age and has been teaching neither adequately nor accurately.

If someone vandalizes a Wikipedia article to make Magellan a contemporary of Cap’n Crunch, and a student cites that passage verbatim, the problem is not Wikipedia.

Forgotten Knowledge from the Mists of Time

i attended college on the cusp of the changeover from a period when personal computing was a niche interest of hobbyists to the explosion of networked machines into the lives of everyone on the planet. And being there for the changing of the guard, i was very fortunate to attend a class at my school that unravelled some of the crucial mysteries of computing for me, and to this day, i am immensely thankful to have this knowledge.

The course taught me what a disk was, and explained the actual physical process involved in storing data inside a computer. i learned what RAM was, what a ROM was, and why waving a magnet around near your computer was a bad idea. i came to understand how digital displays worked, and the difference between our increasingly old-fashioned cathode ray tube monitors and these newfangled flat LCD monitors. i learned what a bus was, how a microprocessor worked, why we talked about “BOOTing” computers, and where the term “spam” came from. i learned how search engines indexed web pages on the Internet, and that knowledge alone has made me particularly adept at Google-Fu. i was taught about viruses, what they were and how to avoid them.

To this day, i understand how disk drives and CDs store digital information. This should be common knowledge.

All of this amazing and wonderful arcane knowledge is stuff that we no longer teach, because we have a generation populated by “digital natives”. Our kids know how to thumb around on tablet and smart phone devices that have one button. They can communicate with each other as long as it’s nothing too complicated, and as long as it all boils down to one gigantic shiny graphic element that says “SEND”. Some know it all boils down to 1s and 0s somewhere down the line, but they have no idea how or why, or why they should care. As long as it all just works, they’re fine. They can’t swap the battery out of their devices, but pretty soon they won’t need to: companies like Apple are leading the charge with perfectly impenetrable little boxes that we must return to them to service. The days of tinkering are disappearing. Our future – The Future – belongs to the companies who build the devices, who hold the keys, and who alone understand how things work.

Making Us Go

IANASTF (i am not a Star Trek fan), but one Trek episode introduces an alien race called the Pakleds:

The Pakleds appear to be very simple-minded, yet somehow they’re flying around in spaceships. That’s because they steal as much technology as they can get their hands on – “things to make us go” – without ever putting in the effort to develop their own technology, or to understand how their stolen technology works. They desire only the power that this technology brings, and they don’t care about the ramifications or consequences of using it.

The poisonous term “digital natives” excuses us from effectively teaching our children how to properly use, appreciate, and understand the incredible networked computer technology that now permeates our lives. We don’t want to learn how to program – we just want programs that work. We want things to make us go. We have become, and we are raising, a generation of Pakleds – a devolution of humankind which, instead of standing on the shoulders of giants, is dandruff on the shoulders of giants. To wit: we’re flaky. It’s time that we do away with the term “digital natives” altogether, accept our responsibility, and recognize the importance of teaching our young people how to effectively navigate and steer the incredible future they will soon inherit.

11 thoughts on “The Myth of the Digital Native”

  1. David F

    Totally bang-on. I recall reading a study that confirms all this, though my own digital aptitude apparently doesn’t extend to the ability to retrieve webpages visited a couple of years ago.

    However, this isn’t a new thing, and we’re all Pakleds. The technologies of the industrial revolutions and the atomic age, with their radical transformations in food generation, power generation, health care, etc. have arguably more radical implications for the past and future of humanity… but how many of us know how those work? How many of us are sufficiently informed to have real discussions about nuclear power or modern agriculture, the outcome of which will directly determine the survival of humanity within the next century?

    Assumption of knowledge of any sort is a dangerous proposition. But let’s be real here: learning things is haaaaaaaard.

    1. Ryan Henson Creighton (post author)

      In the first episode of his awesome series Connections, James Burke asks “what if the lights go out?”, and then traces mankind’s progress backwards through a succession of developments until he winds up at the one invention crucial to human survival: the plough. He argues that as long as we know how to operate a plough, we’ll be able to feed ourselves and start over.

      i don’t know about you, but i do not know how to use a plough.

  2. axcho

    I totally agree! I didn’t realize this until I started teaching game programming to middle school students. Who knew downloading and installing FlashDevelop could be such an ordeal?

    As far as ploughs go, they’ve got some downsides in the long run, in terms of soil erosion and such. I read it in a book called Dirt: The Erosion of Civilizations.

  3. Brian B

    Aah! I remember when my great grandfather would tell me stories about his first fork. It was the shiniest fork on the street, and he got a lot of ladies flashing that thing around! Ole Chromey Mc Pointy Tips, they just don’t mak’em like they used to.

  4. David Golden

    I’m going to take a guess and say that you probably don’t know what the Otto cycle is. But I’ll bet you’ve relied on it hundreds (thousands) of times in your life.

    Anyone who decries that people don’t know how computers and the Internet work is guilty of immense hypocrisy, given the many things in modern life that we rely on without having any clue how they work. And due to the increasing micro-specialization of scientific and engineering study, even the experts probably know only their own specialties.

    Why is this new? (Or news?) We’ve clearly crossed some critical point where the sum of human technology is no longer something that a reasonably bright, capable person can hold in a single brain.

    To the extent that kids are “digital natives” (and perhaps I’ve missed some big influential article to which you are indirectly alluding), I think it’s that they will take for granted things that we never did and will have a different relationship with time, space and history as a result.

    My 4.5-year-old understands that Skype lets us talk to people far away. That far away, it’s a different *time* (because it’s dark or light there when it’s different for us). He experiences time zones directly, not abstractly.

    He expects his parents and friends’ parents to be generally available for communication by text or email — where hours to get a response is a long delay.

    He regularly gets to review his entire life in photos because the cost of photography (even printing out pictures for his physical photo albums) is trivial compared to a few decades ago. I suspect his memory of his childhood will be vastly better than mine because he’ll be able to see all those pictures and videos and refresh his memory of things.

    These are not experiences that are taught. They are a consequence of the ubiquitous technology that surrounds him.

    Yes, the underpinnings of technology *should* be taught. We still have to explain what Skype is or that texting is sending a message to someone else. (I’m an engineer by training. I get that.) But to focus on that — or on whether kids need be taught or left to figure things out on their own — is to miss the point.

    Ubiquitous technology will change how they think about and relate to the world. Whether they understand how to use it well or how it works — any more than people understand today’s or yesterday’s technology — is a side issue.

    1. Ryan Henson Creighton (post author)

      The term “digital native” is an irresponsible invention of author Marc Prensky: http://en.wikipedia.org/wiki/Digital_native

      Your car argument only takes us so far. People don’t know what the Otto cycle is (i had to Google it myself … but i WAS able to perform a web search for it), but they also don’t know the process involved in smelting the metals to make the engine components, nor do they understand how those metals were mined, nor can they speak confidently about the multi-million-year processes that formed those metals in the Earth.

      There’s a healthy place to inhabit on the knowledge spectrum between Omniscient Deity and Utter Moron. My only point is that the needle is edging too close to “moron”, even as the expectation (or mistaken viewpoint) brought about by terms like “digital native” puts young people closer to “knowledgeable”, when they simply aren’t. You don’t just absorb knowledge through your skin because you’re surrounded by technology; your son knows what he knows about teleconferencing and time zones because he’s been on teleconference calls, and has witnessed the time difference.

      i’m more concerned about aspects of technology that are important to know, but that are generally hidden from view. We can’t witness them. And more and more of these aspects (changing batteries, file management) are disappearing from view every day due to controlling corporations like Apple.

      Here’s an example: many of us know that computers can “talk” to each other. And we’ve used wifi, so we know that this can happen through the air with no cable connection. Only some of us understand that when we type a URL into the browser, we’re requesting information from another computer. That’s where our understanding starts to fall apart, because the nuts and bolts of networking are, for the most part, hidden from view.
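
      To make that concrete, here’s a bare-bones sketch (in Python, using only the standard library; the address is just a placeholder) of the request a browser makes on our behalf when we type in a URL: open a connection to another computer and ask it for a page.

          import socket

          host = "example.com"  # the other computer (a placeholder address)
          request = (
              "GET / HTTP/1.1\r\n"
              "Host: example.com\r\n"
              "Connection: close\r\n"
              "\r\n"
          )

          # Connect to the server on port 80 and ask for its front page.
          with socket.create_connection((host, 80)) as conn:
              conn.sendall(request.encode("ascii"))
              print(conn.recv(4096).decode("ascii", errors="replace"))

      Everything else the browser does for us (looking up the address, encrypting the connection, rendering the page) happens out of sight in exactly the same way.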

      So when a man calls us on the phone saying that he’s with Some Important Organization, and that our computer has shown up on his grid as having a virus, and he points us to a never-before-seen area on our computer that appears to list serious issues, and the man offers to fix those issues for a small fee if we just provide our credit card number, we fall for it. And we don’t fall for it because we’re stupid – we fall for it because we’re ignorant. That ignorance is a side effect of not knowing enough because too many important details are hidden from our view. We don’t have to swing all the way to Omniscient Deity on the knowledge spectrum, but that needle does have to move.

      1. David Golden

        My car argument was that this is not terribly new. How many people have been (are still) swindled by malicious mechanics telling them that their car needs a new Fibulator because they are clueless about automotive technology? Or how many people were (are still) swindled by patent medicines or “natural remedies” because they have no clue about medicine? Ignorance has a long history of being taken advantage of.

        So I agree with you about that aspect of digital nativity. Or, at least, I chalk it up as another idiotic simplifying phrase like “information superhighway”.

        My rebuttal — to the extent it is one — is that I’m much more interested (concerned?) about the ways in which my kids will *think* differently in this new environment. When so much information is on demand, how will they learn patience for painstaking research? If Google doesn’t feed them an instant answer, what then? When high-fidelity communication is effectively free, will their social skills evolve differently than ours did? (Is introversion or extroversion more or less of a liability?) When every mistake they make is watched by Big Brother (or Big Mom&Dad) will they be more liberated? Or more paranoid?

        I think there *is* something different about the environment children today experience, and both the term “digital native” and your rebuttal of it seem to mostly miss that bigger picture.

  5. Sia

    I am one of those who grew up with the Internet. I learnt a bit about it boiling down to 0s and 1s, but this was years after learning how to use a computer and the Internet. Still, I never fell prey to the Nigerian prince scams. You know why? Because I knew it was a bad idea. I didn’t know WHY it was a bad idea, but I did know not to fall for it nonetheless.

    Guess what though? Knowledge of the intricacies of human biology is not a prerequisite to knowing not to run with scissors. Nor do I need to know how stuff is grounded to know that water and electricity don’t mix. Just as I didn’t need to be a nurse or an electrician to know those two basic safety rules, I needn’t have been a sociology professor to know that taking gifts from or giving gifts to strangers is a bad idea. That, and knowing that everyone you only know online is a stranger, is part of being smart online. I also watched my older brother and cousins closely.

  6. Pingback: Digital Nativity? | A Blog on LIST

  7. Bill Quirk

    The worst thing about the term digital native is that it implies that young people somehow learn differently than people who didn’t grow up with iPads, smart phones and all the apps. It’s not about access to these technologies; the problem is with the implicit argument that the learning process is evolving along with the technology.
    The term is being used to promote much lower-quality learning environments like online education. The very best online courses may be as good as an in-person course (and these won’t save money), but most of them are seriously deficient when it comes to the factor that really facilitates learning: interaction with a good teacher! If the term digital native is loaded with assumptions, and loaded with expectations about the potential of the next generation to use technology to solve social problems, and if it’s being used to lead us into a subpar educational experience for the masses … I say we ditch it.

