[text of speech delivered April 13, 2013 at the Global Health and Humanitarian Summit 2013 at Emory University, Atlanta, Georgia]
The age of digital technology has radically changed the way we interact with information, but has not fundamentally changed the way we learn. Educational institutions are struggling to reconcile traditional instructional goals with new technology, when actually the goals themselves should be realigned with the needs of a modern technology-centered society. Information is no longer precious, but the ability to make sense of it is more important than ever.
The most ancient form of education provides a key for learning how to distill knowledge from the ocean of data available in our hand-held devices.
Ten thousand years ago, a group of humans sat around a fire, telling stories. For this small band of hunter-gatherers, storytelling was an art form, a means of entertainment, and a social activity. The stories conveyed important information – cautionary tales to convey warnings and advice about the successful navigation of life’s dangers; heroic adventures to spark the imagination and stir aspirations; stories about interpersonal conflict to explore the nature of human emotions and to validate the social boundaries recognized by the group.
Stories were education; they promoted intellectual and emotional development as well as social cohesion. They allowed people to travel beyond their immediate surroundings to share in the experiences of others. Stories also gave people a conceptual framework for abstract ideas that may have been introduced by others, but were capable of being expanded and refined by anyone who chose to engage with them.
Stories were the original educational system, and their form is so familiar to us today that not only are they still an effective mode of communication, but they have the power to educate us while lulling us into the belief that we are simply being entertained. No matter what part of the world we are in, no matter what cultural standards apply, no matter the age of the people involved or formal education they may have had, stories are an intrinsic part of how we communicate. Stories are fundamental to how we learn.
Stories facilitate learning in nearly every field. In physics, the properties of matter and the laws of motion make sense to us in real-world examples of objects moving through space. In the biological sciences, the different characteristics of organisms make sense to us as we consider how these organisms behave over time. In mathematics, the wonderful power of precise numbers to describe phenomena in the real world can only be fully appreciated with an imprecise description rendered in everyday language.
In the humanities, the examples are even more obvious. The study of literature revolves around stories and explores their structure and effect upon the reader. Economics, which leans heavily upon numbers to prove its assertions, nevertheless requires narrative to explain its analysis of behavior. Psychology bases its theory and practices on client narratives.
In medicine, the ability to interpret patients’ stories is essential to effective treatment. Rarely are symptoms experienced as bulleted points on a list. They exist in the patient’s mind as a part of the pattern of their lives, an interruption in the status quo that makes sense only in the context of the story of life as it is normally lived. Diagnosis often requires the medical professional to wade into the patient’s narrative without preconceptions and without a checklist, to take the patient’s experiences at face value and apply medical expertise and insight to identifying the problem.
Stories are the most ancient formula for communicating important ideas, and yet understanding the power of stories and their capacity for education is more important than ever in this – the information age.
In an age when information is so easy to acquire through the internet, stories can provide a framework for constructing knowledge into a useful form.
As a history teacher, I have many times encountered people who told me, “I don’t like history,” or “history is so boring.” And yet I know that the non-fiction bestseller lists are made up of biographies and self-help books full of personal success stories. People love a good story, but almost all of us dislike the way history is taught in school – as discrete bits of information that we have little reason to care about. It is easy to miss the connection. The internet provides an opportunity – not just for students to learn the connection, but to create it – to construct knowledge from the raw material of information.
Twenty years ago, when I began my career in education, schools were just beginning to be outfitted with computer labs. Fifteen years ago, the high school where I taught prohibited students from having cell phones at school. For the last five years, smartphones with internet access have become increasingly common, to the point of being standard. A few weeks ago, I asked a class of 8th graders in a suburban Atlanta middle school how many of them had cell phones in their possession. Every student in the room raised a hand.
As technology has become more commonplace, access to the internet and to the ocean of information it contains has become almost universal. We used to teach students how to conduct research on the computer. Now they come to class more expert in surfing the web than almost any member of the faculty.
But the kind of information they are finding, and the process they are using to discover it, are new, and this process arises from an entirely new paradigm for creating knowledge. It is time to take a hard look at what we think we know about knowledge.
It’s not like it was in the old days.
It once was that we could make sense of the world by attributing everything that was unexplainable to a divine plan. For the more agnostic or pagan among us, we could attribute it all to forces of nature. We might be able to glimpse the beauty of the design, the majesty of the forces at play, but we could not expect to fully understand the how and why of everything we experienced.
And yet there is a natural human desire to explain the things we encounter in life. So we built elaborate belief systems constructed from uncritical observations of the natural world, folklore and faith.
This began to change in the Western world with the scientific revolution. We began to observe the natural world with a critical eye, to accumulate facts without immediately ascribing explanations, and to assemble what we learned into theories that were independent of belief systems. It was a new way to acquire knowledge – constructing it from empirical evidence.
Under this new system, independent scientists, university professors, and industry-sponsored researchers all worked to uncover factual evidence and develop theories. Academics, technicians, and practitioners in an infinite number of fields tested the application of the new knowledge. Books and journals reported the knowledge to the world where it was disseminated through libraries and textbooks.
The scientific method allowed for the expansion of the body of knowledge, but also acted as a filter to eliminate information that did not pass its rigorous tests. This was an essential function for any authority that governs information – and for a very practical reason.
For most of human history, there were natural limits to how much new information a society could handle. Whether information was based on faith and folklore or based on the scientific method, a selection process had to occur. There were only so many new books that could be published in any one year. There was only so much new material a library could absorb.
But it’s not like that any more. Information is everywhere. Thanks to the internet, there is no longer a practical limit on what can be published to the world. Peer-reviewed articles are available alongside user-edited sites such as Wikipedia. Crackpot opinions are just as easily accessible as rigorously tested knowledge.
With the technological and logistical constraints on the production of new knowledge eliminated, the established selection process has become irrelevant as a limitation on public access to information.
And not only is the information available vastly expanded, but also the process for discovering knowledge has changed. The fact that a single short article on the internet can contain dozens of links to other sources, and each of these sources can contain potentially hundreds more, means that the search for information has no natural boundaries.
As a result of the extraordinarily rapid change in the way we access information, the way we learn is also changing. This constitutes an alteration of behavior for those of us who grew up reading books and accepting the lines of demarcation between professionally edited works and off-the-cuff essays, between authoritative sources and amateur self-appointed experts, even between different academic disciplines. But for the generation that has grown up with the internet as their main source of information, the emerging new paradigm is all they have known.
The fact is that we live in a period of transition. We continue to accept as authoritative the kind of expert opinion that is published in scholarly works, but we also acknowledge the validity of information produced in an entirely different way.
Universities and government agencies may assemble groups of experts to study specific issues and solve highly technical problems, but crowd-sourcing on the internet allows for a larger and more diverse group of participants to contribute, sometimes with ideas that never would have occurred to professionals committed to their own particular lines of inquiry.
Scholarly works are still published that distill mountains of research and data into thoroughly annotated and well-supported conclusions, but now it is possible on the internet to link this supporting evidence so that anyone who wishes to do so can read the data and come to their own conclusions.
New technology allows for a different, far broader kind of interactivity. This is changing the way we think about new information. Books promote deep thinking. The internet promotes free association.
Books – and for that matter, stories in traditional form – are sequential. They begin at a logical starting point and proceed, including the information that the author has decided will lead the reader most directly towards a logical conclusion. By necessity, because of limitations on space and conventions of form, books must exclude relevant information. In fact, information that does not propel us forward seems like an unwanted distraction.
By contrast, the internet allows us to drop in on a body of information, follow a thread for as long as it holds our interest, then jump off at a point where a new thread appears to be more productive. The internet reader discovers his or her own conclusion – or simply surfs around, absorbing information but reaching no articulable conclusion.
The new technology, and the new ways that we are interacting with information, are creating a challenge for the education establishment.
Traditionally, we have recognized a specific body of knowledge that one must master in order to be considered a well-educated person. This body of knowledge can be divided and parceled out to different courses in school where it can be taught and tested and checked off as students progress through the years towards a diploma.
All of the material to be taught and tested is published in bullet point form. For example:
Georgia Performance Standard SSUSH 8 a. Explain how slavery became a significant issue in American politics; include the slave rebellion of Nat Turner and the rise of abolitionism (William Lloyd Garrison, Frederick Douglass, and the Grimke sisters).
Here is a topic with expansive opportunities for research and rich potential for insights into the nature of American politics, the ethical questions of human bondage, the dynamics of social reform movements, the role of violence in shaping public opinion, race relations, gender politics … the list could go on and on. But the students who take U.S. History in the state of Georgia had better be ready for that multiple choice question on the Grimke sisters or there will be trouble.
The problem should be obvious, and it is not that it is necessarily a bad thing to know about all the items on that bulleted list. It is not just that preparing for multiple-choice tests is not a very practical way to prepare for life after school. It is that the world of knowledge has so much more to offer, and the really valuable life skill is knowing how to navigate through that world and derive from it useful conclusions.
In this age of information without boundaries, of data that is too voluminous to grasp, an educated person cannot possibly know all there is to know. At the same time, having an outside body determine what is important and what is not – for instance that one should know about the Grimke sisters, but need not know about Theodore Dwight Weld – places an artificial limitation on learning that is destructive to the cause of education, and wastes the opportunity that the internet offers. The truly valuable experience is learning how to discriminate – having access to more information than one can use and selecting what is most valuable to make sense of the rest.
Students must learn how to construct a model that is consistent with the information available. This model does not need to represent the only possible way to organize the facts, but it does need to be one that incorporates the facts into a cohesive account that can be communicated to others. In the field of history, this would be an account of past events based upon authoritative sources, and told in the form of – a story.
Stories have characters and setting. Stories have a plot, with exposition, rising action, a climax, falling action, and resolution. Students must make choices about how to select historical facts to fill out these familiar features.
Stories have a conflict. In real life, conflicts often have many sources and evolving points of origin. But history students must look at the available facts and decide where to begin their story, and how to resolve the conflict within the framework of the story – if indeed a resolution is possible.
Each of the choices the student makes in constructing the story represents an act of higher-order thinking. Every fact included in the story can be linked to one or more sources, demonstrating competence in research skills and understanding of how the network of information available online can be utilized to support a conclusion.
We have been teaching history students information in bullet point form. We need to teach them how to be experts in the use of information, and the question of whether a student is proficient in the use of information cannot be answered on a multiple-choice test.
If we are to have high-stakes testing, we must assess real-life skills that will be essential to success in this century. That means teaching and testing proficiency in using the internet to harvest useful information. And while information can be used in an infinite variety of ways, the time-tested, universally recognized narrative form provides a natural structure for students to use in demonstrating what they have learned. This is certainly true in the discipline of history. I believe the same principle applies in other fields of study as well.