Four essential skills

We need a fundamental reassessment of the goals of education in the 21st century.  Students deserve to be prepared for lives that are fulfilling and uniquely their own.  They need to learn how to become lifelong learners.  Their creative potential must be unleashed, for the sake of each individual, and for the benefit of society.

Students need to practice the following skills, and be prepared to refine them throughout their lives:

  • find relevant information
  • choose thoughtfully
  • create boldly and intelligently, and
  • communicate effectively

Essential skill: Find relevant information

Information is no longer a precious commodity, confined to books and libraries.  It is no longer accessible only to those with money or advanced degrees.  It is available to anyone – often in the palm of their hand – and in greater abundance than would have been thought possible even a generation ago.

Students need to have the technical and intellectual tools to select the information that will be the most useful to them.  They need to develop the ability to discriminate, among the vast resources on the internet, between what is reliable and what is not.  They need to be prospectors and detectives, and most of all critical thinkers.

Reflection: Finding information in the world where you live

My sons and I are from different worlds.  When I was young, I listened to radio and purchased recorded music on either 45 rpm records with one song on each side or on 12-inch “long-playing” record albums (LPs) that contained a number of songs.  It was years before I learned that the term “album” when applied to recorded music had originated at least a generation earlier.

You see, my father and I are from different worlds as well.  In his day, records played at 78 rpm, and even though the discs were larger than the 45s I had in my collection, they could only hold one song on each side.  So there were hardcover books with pages that were actually sleeves for the 78 rpm records.  The original format for “record albums” was this type of book that looked like a photo album – but was in fact a place for organizing and storing recorded music.  The term “album” was retained for the collections of songs on one long-playing disc, even though the hardcover book was a thing of the past.

My father’s childhood records were replaced by the LPs of my youth, which were replaced by the CDs of my young adulthood.  My sons know about CDs, but it is entirely possible that they will never own one.  Their music comes from the internet. They never listen to the radio unless they are in the car with me.  The term “album” has no meaning for them in the context of music.  But they do know what a playlist is.

When I was young, I read books and watched television.  There were some books that everyone (or so I thought) read, and some “events” on television that everyone (I believed) experienced.  When I was very young, the film The Wizard of Oz was played on television every year.  I recall it being on the weekend after Thanksgiving, but I’m not sure about the accuracy of that memory.  I could look it up, I suppose, but I don’t feel any need to do so.  The memory serves me well as a place marker for a period of time when I felt that the country where I lived had a common culture, and a finite set of shared experiences that were essentially universal.  It wasn’t literally true, of course, but at one point in my life, everyone I knew could sing along to “If I Only Had A Brain.”

My sons don’t have any patience with the kind of hazy memory I hold related to the broadcasts of The Wizard of Oz.  Too many times for me to count, I have described a partial bit of knowledge and within a minute or two my 13-year-old has found the missing pieces in a quick online search.

My sons don’t have any real sense of the kind of shared experiences I had as a child.  Not long ago a new television series based on a comic book superhero premiered.  I asked my 11-year-old to watch it with me.  He said he would rather catch it later online.  And when he does, he may watch it over and over – something that is perfectly natural for him to do, but to me that seems an odd investment of time.  Not only does it diminish the special quality of the unique event, but it also takes time away from potential new experiences.

The generational differences in the way we encounter information – and in this context I include experiences like listening to music and watching movies – are significant in framing the way we think about information.  In fact, the reality that we go to the same source for mindless entertainment that we access for research for a term paper also frames the way we think about information.

Experiencing information is less precious if it is an experience we can have at will, on our own schedule, without raising our eyes from our tiny screens.  When filling in the gaps is easy, there is less opportunity to reflect on why we need the information in the first place.  There is some virtue – it seems to me – in having to work for it, and sometimes having to wait for it.

But I approach this from my own cultural context.  Anyone born in this century, and probably also the last decade of the 20th century, is likely to be able to find information in great detail with casual and refined ease.  It is less clear that they will be able to use this information effectively.

Schools have a role in teaching about the nature of information, the ways to distinguish between information in terms of quality and reliability, and the ways to use information to reach original conclusions that have merit and significance.  Schools should be operating in the same world as their students.

Essential skill: Choose thoughtfully

Life in human society for countless centuries involved very little real choice.  Between economic necessity and social stratification, lives were pretty well laid out for most people.  But today in the United States we are faced with an often bewildering variety of choices in everything from career decisions to family planning to political issues to how we spend our money and fill our leisure time.

Students need to be empowered to make good choices based on a genuine understanding of their individual needs and of the available options.  In terms of instructional practice this means that students should be shown how to direct their own learning, and given the opportunity to become experts in the field of their own choice.

Reflection: Choosing the perfect job

How many of us love our jobs so much that if we didn’t have bills to pay, we would still go to work for free?  How many of us, if given the opportunity to look for such a job, would know what to look for?  How many of us would even recognize such a job if we saw it?

In the final project of a high school economics class, I had the students select a career they would like to pursue.  There were various research tools they could use to discover the relevant numbers – compensation, rate of growth of the position, years of training required, expense of that education, etc.  Students were also required to identify where they would live and to research living expenses – cost of their residence, utility bills, transportation costs, cost of food, cost of clothing, cost of entertainment, and so forth.  The kids were given a detailed questionnaire that set out the categories of information they would need to find.  Their task was to find the information and make choices.  I was somewhat disappointed and quite concerned with the results.

A majority of the students chose their future career strictly on the basis of income potential.  This is not a supposition on my part – I took a poll and the students were very upfront about it.  Very few strongly considered their level of interest in the profession, fewer still considered their aptitude.  Students who had proved to me over the course of months in my class that they were not dedicated opted for careers as doctors and lawyers.  Students who showed average athletic ability on the school’s playing field projected themselves into long careers in the NFL.  It is good to set one’s sights high, I suppose, if one is willing to be realistic about what it takes to reach them.

But a surprising number of these students also imagined themselves living in the same suburban community where they had grown up – even if that meant a horrible commute to the central business district and forsaking the choice of a nicer neighborhood closer to work that was more in line with their substantial new income.

It was as if they could not quite imagine actually attaining the career goals they had set, so they were free to shoot for the moon.  But they also could not summon the courage to imagine moving away from the familiarity of home – even if staying in the community would mean a clear concession in their quality of life.

Most of my students were making choices on the basis of fantasies of wealth and sentimentality, not on the basis of aptitude and practicality.  I was left feeling frustrated and concerned that these high school seniors were about to go out on their own so ill-prepared to make important decisions, and so lacking in insight about their own true potential.

We all have talents and abilities that are uniquely ours, and which make us happy when we use them.  Unfortunately, most people have little awareness of their true gifts.  Most of us don’t know what will make us truly happy.  We feel lucky when we stumble across that rare and beautiful feeling that everything in life fits together perfectly.  And yet never before in history have there been so many choices for us to make, and so many opportunities for us to find that perfect fit.  We need to provide the next generation with better tools for building happy, productive lives.  We need for our schools to help students learn how to make better choices for themselves.

Essential skill: Create boldly and intelligently

Wealth is generated, and personal pride is grounded, in personal accomplishment.  The human race is imbued with incredible talent, and historically, we have allowed relatively few of those bright lights to shine.  The economic growth of the 21st century will be spurred by creativity.

Students must be given the freedom to generate new ideas and create practical solutions to problems.  Their work product should be assessed in its totality, not according to answers selected on a standardized test.

Reflection: Creating something new

“Everything that can be invented has been invented.” 

This quotation, attributed to Charles Holland Duell, commissioner of the U.S. Patent Office from 1898 to 1901, has often been used to ridicule the notion that the product of human inventiveness was finite – and that we had reached its point of exhaustion.  It is a pretty short-sighted perspective from anyone living in the modern era, particularly coming from the man responsible for the registry of new inventions.  But did Mr. Duell actually utter these words?

The quote was traced by researcher Samuel Sass to a book that had been published in 1981, long after Mr. Duell had died.  In 2011, law professor Dennis Crouch conducted a Google search for mention of the phrase “everything that can be invented” and essentially corroborated Sass’s finding.  Google found no mentions of that phrase prior to 1981.  But Crouch did discover another possible origin – an 1899 edition of Punch magazine.  Neither researcher found a contemporary attribution of the quote to Mr. Duell.[1]

Consider for a moment the idea that a person living at the dawn of the 20th century could think that technological advancement had run its course at the end of the 19th.  Now consider how much easier it was for Professor Crouch to do his research circa 2011, than it had been for Mr. Sass circa 1989, before the internet had come into widespread usage.  Even the process of debunking the erroneous attribution was affected by new inventions.

One invention opens the door for another invention.  Information technology facilitates the exchange of information, and leads to the generation of new ideas.  We keep trying to make our lives easier by turning what used to be laborious and time-consuming into a simple operation that might require just a few keystrokes.  Progress is measured by how much work we can turn over to the machines we have made.

We have disassembled the old order in which we used our natural skill at problem solving to devise better ways to do things by hand, and replaced it with one in which we interact with machines that do these things for us.  Freed from labor, we are now infinitely more free … but to do what, exactly?

In his book A Whole New Mind, Daniel Pink points out that manufacturing technology has eliminated much of the role of skilled craftsmen in our economy.  Furthermore, information technology has led to the creation of machines that can solve problems by evaluating far more information than a human mind could. So what is left after machines have taken over the jobs making things, and taken over the jobs figuring out how to do things?

According to Pink, it is creativity and empathy – the ability to generate new concepts and the ability to operate on the level of our shared humanity.  Medical schools are now teaching students how to understand patient histories through narratives rather than simply through questionnaires.  Corporate recruiters are now seeking applicants with Master of Fine Arts degrees, not just the usual crop of MBAs. [2]

Recognizing the importance of creativity and empathy requires a significant paradigm shift.  We have spent generations creating tools that can do what we can do, better, cheaper, and faster.  But by living among the machines that are supposed to free us from the rote and the mundane, we have to some extent made ourselves more machine-like and less free.

This is not some science fiction story in which the hapless humans become servants to their own machines.  But there is a reason stories like those have some resonance in our collective consciousness.  When our school systems invest enormous energy into preparing students for multiple-choice tests to generate data and measure progress, one must wonder – is this what we really need our schools to do – to train students to feed information to a computer?  Is it in anyone’s best interest that we reward only the kind of intellectual achievement that can be demonstrated with a No. 2 pencil on an answer sheet?

Fortunately, some decision-makers on the leading edge of economic growth have recognized the importance of creative thinking.  The medical schools and corporate leaders cited above understand the vast potential in individual human expression.  Even more encouraging are the growing possibilities the internet offers for small start-up businesses founded by those with imagination and faith in their creative powers.

But public schools are not contributing to this growth the way that they should.  We need to encourage creativity and empathy.  We must recognize the danger in standardized curriculums and standardized testing that are robbing us of our individuality and our ability to imagine the unimagined.

Essential skill: Communicate effectively

We live in an age of communication, and yet as our means of connecting with one another proliferate, our schools treat this new reality as an unpleasant distraction that must be stopped.

Students should be encouraged to use technology appropriately, and more importantly, to communicate with others in a productive manner.  This means according text messaging its realistic place in students’ lives, and it also means teaching spoken and written communication that will enable students to communicate effectively no matter the context or medium.  Communication in the 21st century involves both traditional modes with all their rules, and means of connecting that have yet to be invented.

Reflection: Communicating in many languages

“Go to, let us go down, and there confound their language, that they may not understand one another’s speech.”  (Genesis 11:7, KJV)

Genesis tells the story of the Tower of Babel, which ends with the Lord scattering the people across the world and giving them different languages.  Maybe that’s where it began, but the process of creating new languages is ongoing.

The lifeblood of human society is communication.  It is what holds us together.  It is what enables us to move forward together.  It gives us the power to name the things in the world in which we live, to define the roles we play in life, to share traditions as well as newly conceived ideas, to express devotion and fear, and to attempt to capture the mystical.

Communication is not limited to speech, or writing.  People can communicate through non-verbal sounds, through gestures, through music, through dance.  It is said that a picture is worth a thousand words.  We have developed a universal language for saying “no,” by using a circle with a diagonal line through it.  We have devised a language that is capable of describing with precision quantities, volume, shapes, movement through space, and degrees of force – the language of mathematics.

The languages we use are in a constant state of change.  New words appear, old words change their meaning.  “Wherefore art thou Romeo?” Juliet asked, as Romeo hid beneath her balcony.  Modern readers assume she was inquiring as to his location, but none of them actually use the word “wherefore” in their daily speech as people in Shakespeare’s time did.  It meant why, not where, and Juliet is asking why the boy she just developed a crush on had to turn out to be a member of a rival family.  Her next line is “Deny thy father and refuse thy name.  Or if thou wilt not … I’ll no longer be a Capulet.”  She is questioning fate, not her GPS.

Within each language, dialects appear with words that may or may not ever enter the standard version of the language. At any given time there might be a multitude of these variants from the standard, which may be perfectly suitable for communication for their users, but which could be confusing and seem “incorrect” to non-users.

Dialects can arise in ethnic communities and be nurtured in relatively closed environments.  Dialects can sweep across the land borne by mass media targeted at specific audiences.  Specialized versions of the language can develop along with technology – a whole vocabulary that is transparent to the tech-savvy, but opaque to most others.  And there can be a specialized language that develops for use in a particular medium, such as the abbreviations commonly used in texting.

Recently, a school administrator shared with me an insight that he seemed to find particularly encouraging.  He told me that kids are actually writing more today than they did 10 years ago – because of all the texting they are doing.  I had to fight the urge to LOL.  Could he really be suggesting equivalence between texting and, say, writing a persuasive essay?  Is the quantity of writing the relevant measure here?

The fact is that we all communicate, all the time. Many of us are fluent in more than one language.  Most of us are fluent in more than one dialect. Almost all of us, whether we are conscious of it or not, practice code-switching – alternating between two languages or dialects – on a regular basis.  The key is to choose the appropriate language for the setting in which it is used.

Schools should not deny the validity of dialects, especially those that are in vibrant use by students.  In fact, they should embrace and even celebrate the fact that students can communicate effectively in different modes.

But schools should not ignore their responsibility to teach effective means of communicating that the students may not be picking up on their own.  Students should leave high school able to use standard English in verbal discourse.  They should be comfortable with the conventions of public speaking.  They should be able to write a business letter, a persuasive essay, a research paper with proper citations.  Students should be cognizant of their own code-switching, understand its utility, and should not be trained to think that one form of communication is per se inferior to another.

The lifeblood of human society is communication.  We can’t allow ourselves to be confounded by the many forms it takes, or distracted by the mistaken idea that there is only one style of communication that is always appropriate and correct.  Form is dictated by content, by audience, and by purpose.  It is only incorrect when the intended message does not get through.  We need to be able to communicate effectively in all of the many languages that are spoken in the different areas of our lives.

Conclusion

These four skills are essential to modern life, but the standard practices in many schools work to discourage their development.  Providing a set body of facts for students to learn denies them the opportunity to find the knowledge they need, and the practice it takes to learn to distinguish what is relevant and valuable from what is not.  Enforcing compliance with established procedures denies students the opportunity to choose and experience the real-world outcomes of their choices.  Making standardized multiple-choice tests the measure of success in school denies students the opportunity to create and to demonstrate their learning in an authentic manner.  Restricting technology, a practice that is so clearly at odds with the real-world needs of students in the 21st century, hampers their ability to connect meaningfully with one another, and denies the reality that communication is not limited to one approved medium or one appropriate form.

We must do better by our children, and we can.  But it will take a fundamental reassessment of our goals and a willingness to revolutionize our approach to public education.

_______

[1] http://patentlyo.com/patent/2011/01/tracing-the-quote-everything-that-can-be-invented-has-been-invented.html

[2] Pink, Daniel. A Whole New Mind. New York: Riverhead Books, 2005.

Framing the story

Summer is coming and the story of the 2015-2016 academic year is approaching its conclusion.  Life, with all its untidy and ambiguous details, can be organized into stories, and the way we frame these stories determines how we understand life.

A school year is one way to frame a story, but there are others we often use.  Work anniversaries, the years I lived in this city or that house, when I was friends with this person, while I was married to that person – all are frames marking off the chronological segments of our time.

Historians use frames as well, and students of history need to be aware of this device and of its limitations.

We speak of the Great Depression not only as an economic event, but also as an era.  We tend to view everything that happened during times of war as somehow related to the war.  We use the administrations of different presidents as an indexing system, gathering every event under that label and treating them as if everything that happened, for instance, between January 20, 1981 and January 20, 1989 somehow belonged to Ronald Reagan.

Inclusion implies connection.  What is the relationship between the Reagan administration and The Cosby Show? Putting these overlapping events into the same story suggests we should understand them in that context.  But is that the only way they can be understood?

Framing the story not only brings into the narrative those things that fit within the frame, but also excludes those things outside the edges.

Yesterday I heard a story on NPR.  A young woman from Japan was found in the snow in rural North Dakota. Apparently she died of exposure while looking for the ransom money that was depicted in the film Fargo as having been buried in a field off a country road.  The national media told her story as a human interest piece – gullible girl, believed what she saw in a movie was real, foolish enough to go out alone into the winter air without a proper coat.

But the reporter for the NPR story wanted to know more. He travelled to Fargo, found and spoke with the handful of people she interacted with before she took her fateful trip out into the country.  He lay on the motel bed where she spent her last night.  He discovered that her last phone call was to Singapore, but when he called the number, there was no answer.  He tried to find someone she had spoken with about the movie Fargo to uncover some clue as to her plan for discovering the money.

Then he went to the police department to ask about the young woman.  “Oh, you mean the suicide,” said the investigator.

It turned out that a suicide note had been delivered by mail several days after her body was discovered, and after the grim human interest story about her death hit – and disappeared from – the national media.  It seems that she was a country girl who had come to the big city, had gotten a good job and found a boyfriend, only to lose the job and the boyfriend.  Her life spiraled downward from there.  She left everything behind in Japan, and came to North Dakota to end her life. The phone call to Singapore?  It was her ex-boyfriend’s number.  Why come to the frozen Northern plains?  We will probably never know for sure, but the reporter did discover that her ex-boyfriend was from Fargo.

The frame around the original story was a mistaken idea about a movie and a quest for lost treasure.  A wider search included additional information and invalidated the original frame while substituting a new explanation for her actions.  But – and this is perhaps the most important lesson here – the new frame, like the old one, leaves us with unanswered questions.  Would widening our search still further lead to new conclusions and invalidate our understanding of her motives in favor of a different theory – a different frame?

History is more than a catalogue of past events.  It is a method for understanding the past.  It is a method that may just as effectively lead us to misunderstand the past.  Life is untidy.  It is easy to draw that conclusion from our present day experiences.  But it is also important to remember that life in the past was just as untidy for those who experienced it.

From history, students can memorize facts, names, dates, sequences of events.  But those objective, easily testable data chunks are not all there is, nor is committing them to memory the most valuable learning opportunity we have in the study of history.  Motives, ideas, understandings and beliefs held by the actors of past events – these are things a student must learn how to construct. Learning how to interpret information, learning how to construct a frame that supports understanding – this is the real opportunity we have in the study of history.

______________

The NPR story referenced above can be found at: http://www.npr.org/2015/04/24/401965548/fargo

Ignore the rest

It was a month before the “testing window” and the principal called a meeting for all the teachers who had a high-stakes test associated with their classes.  I was teaching U.S. History, which had a state-mandated End-Of-Course Test.  This was in late March of 2014, the last year of the EOCTs, before the new Milestones tests were rolled out.

The principal reminded us of the importance of testing to the school and the importance of the procedures designed to ensure the validity of the results.  The notorious cheating scandal in neighboring Atlanta Public Schools had broken five years earlier, and the trials of several teachers and administrators would not be concluded until a year later.  The relaxed culture in which professionals were expected to do their jobs had been replaced by one in which teachers were treated as suspects.  The first year of the EOCTs, I had administered the tests to my own students, in our classroom.  This year, students would be herded into computer labs for testing, and their teachers would be held in another part of the building.

Then the principal turned to the subject of test-preparation.  “I want you to take a close look at each one of your students,” he said, “and divide them into three categories.

“One group will be the kids who could pass this test today, no problem, and no additional help on your part.

“The next group will be the kids who will never pass this test, no matter what kind of help or support we give them.

“Then the kids in the middle – the ones who might pass the test with a little extra work, and a little more attention on your part ….

“For the next month, I want you to focus your attention on those kids in the middle, because this school needs about half-a-dozen more students to pass their tests this year than passed last year.”

I looked around the room.  Could I have heard right?  Did he just ask us to ignore perhaps two-thirds of our students for a month so that the numbers could come out right for the school?

If anyone else in the room was as uncomfortable with this order as I was, I could not read it on their faces.  I don’t know if anyone else could read it on mine.

But it was at that moment that I realized my time as a public school teacher – 12-plus years at that point – needed to come to an end. I loved being a teacher, and I could see a future as a teacher in some other setting.  But when public schools are more interested in data collection than in the welfare of students – all of their students – there is really no role left for me to play that I am interested in playing.

It is with a touch of personal sadness that I see the testing window roll around each year.  But even more so, it is with concern and alarm that I hear stories of time – valuable learning time – spent on practice with multiple choice questions, drilling on the superficial knowledge required to recognize the correct answer. The human capacity for learning is so much greater.  The expectations of our schools should be so much greater.

For two years now, technological glitches have prevented the new Milestones tests from being counted.  This means that the hours of anxiety suffered by students, and the inconvenience of being shuffled around, waiting in uncomfortable chairs for computers that don’t work, are all for nothing.  But even more significantly, these tests that don’t count nevertheless continue to have a corrupting influence on instruction, and a debilitating effect upon learning.

We must cultivate learning and individual achievement in all students.  We cannot waste the youth of a generation by focusing on the narrow skill set required to pass a standardized test, while ignoring the vast potential of human ability.

Testing season

It’s that time of the year again.  Children are eagerly awaiting the day when their teachers suddenly cease to care, when lessons end and the film festivals begin.  School still goes on, but whatever passes for education through nine months of the year is replaced by babysitting. What a relief!

But first we have to get past these damn tests.

Testing has become the center of gravity in public schools.  All attention is focused on test preparation, testing, and dealing with the fallout from the test results.  Once the high-stakes tests are completed for the year, the only reason to continue coming to school seems to be to fulfill the requirement of a 180-day school year.  So why not show a bunch of movies until the scores are tabulated and students can learn if they have qualified to advance to the next grade?

Except that, for the second year in a row, Georgia students may not even be learning that.

Last year, the state department of education replaced the End of Course Tests and the Criterion-Referenced Competency Tests with a new model – the Milestones tests.  In a wise move, the state did not require that the scores gathered be used to determine student progress.  Instead the data was used to establish norms for the test itself.

This year, schools have been plagued by technology issues.  On Saturday, the Atlanta Journal-Constitution reported on some of the problems.[1]  Today’s paper had more tales of computer issues, as well as a story about parents who refuse to let their children take standardized tests.  Yesterday, a high school teacher told me that at her school, the internet connection was lost for two hours on a testing day, which resulted in a long wait and a delayed start.  Students were restless and stressed, and were not able to eat lunch until 2:30 in the afternoon.

It is fair to ask – what are we testing here, the acquisition of knowledge, or the ability to operate under adverse conditions?  Any competent researcher measuring the progress of schools would throw out these results as tainted or as outliers.  But in the world of public education, every student needs to be tested, and the drive to get it done is far more powerful than any desire to do it right.  We are using these scores to measure the progress of schools, but we also want to use them to measure individual student achievement.  A bad day at school, an unreliable internet connection, or a growling stomach can result in a student failing a class.

As unfortunate as this scenario is, it does not represent the worst damage that our infatuation with testing has done.  Standards and standardized testing have robbed teaching of its art, and reduced learning to the skill of fitting into a cookie cutter.  Professional educators are no longer allowed to rely on their judgment and experience to craft their instruction – they must cover every point in the state-mandated bullet-point list that defines their curriculum.  A student told me recently that for weeks now, his biology teacher has had the class do nothing but take online practice tests.  No instruction, no feedback, just practice.  Surely she did not study and work at becoming the kind of scientist who could teach others to love biology – only to do this to her students!

So let’s get past this season of testing and on to the babysitting portion of our program.  I love a good movie on a warm spring day.

________________________

[1] Walker, Marlon A. “State Testing Trips over Tech Troubles.” Atlanta Journal-Constitution. 30 Apr. 2016: B1-B2. Print.


Finally … Harriet

Harriet Tubman’s image will grace the $20 bill, the U.S. Treasury Department announced in a statement issued last week.  This is big news, and it was a long time coming.[1]

About a year ago, I read of a group called Women on 20s that was lobbying for a woman’s face to appear on the $20 bill, which would mean displacing Andrew Jackson, whose face appears there now.  The group had conducted a survey and Harriet Tubman, the former Underground Railroad conductor, had come out the clear winner.  Then last summer, the Treasury Department announced that a woman – yet to be named – would be chosen to appear on the $10 bill, a decision that would have left Jackson in place and would have bumped Alexander Hamilton, our first Secretary of the Treasury.

The fact that Treasury altered its plan in a way that conformed to Women on 20s’ original proposal may be entirely coincidental, or it may be the result of successful lobbying by a determined citizens’ group.  It may be due in part to the fact that Hamilton, a rather obscure figure to most Americans in recent decades, is now the subject of a hit Broadway musical.  It may be due in part to the fact that Jackson’s reputation doesn’t wear very well in the 21st century.  His leading role in causing the Trail of Tears and the fact that he was a slaveholder have made him problematic enough for the Democratic Party that in many states, the annual fundraising “Jefferson-Jackson Day” banquet has been renamed.[2]

As a teacher, and American history enthusiast, I am well aware of both Jackson’s and Hamilton’s virtues and shortcomings.  I understand how their legacies can be viewed very differently in today’s social and political climate than they were viewed by past generations.[3][4]

But what fascinated me from the beginning was the fact that Harriet Tubman seemed so different from either man.  I don’t mean racial or gender differences, or the fact that she never held high government office.  I mean the fact that, unlike Hamilton and Jackson, whose celebrated actions were very public and – while in some cases unpopular today – perfectly legal at the time, Tubman is celebrated today for actions that were illegal at the time.  She was a criminal who acted covertly.  It took the Civil War and the end of slavery to bring her story out into the open.[5]

It seems to me that Harriet Tubman’s rise to fame and approval, now solemnized by her inclusion on our nation’s currency, provides a lesson for anyone who wishes to learn from history.  There are unjust laws.  A person who acts illegally is not necessarily acting wrongly.  There is hope in a society where laws are made through citizen lobbying, by compassionate legislators, and by wise judges, that a better form of justice can emerge.[6]

I think it is a very good sign that Harriet Tubman, a woman who risked her life in opposition to unjust laws, is seen as a person worthy of honor today.  I hope Americans who care about justice will also see her as a person worthy of emulation.

______________

[1] Last June, I wrote a series of posts on the proposed change in currency:    https://jmarcuspatton.wordpress.com/2015/06/15/history-on-currency-part-i/

[2] A post from last August on Jefferson-Jackson Day and the evolution of the Democratic Party:  https://jmarcuspatton.wordpress.com/2015/08/17/jefferson-jackson-day/

[3] Hamilton’s legacy examined:  https://jmarcuspatton.wordpress.com/2015/06/22/history-on-currency-part-ii-hamilton/

[4] Jackson’s legacy examined:  https://jmarcuspatton.wordpress.com/2015/06/29/history-on-currency-part-iii-jackson/

[5] Harriet Tubman’s legacy examined:  https://jmarcuspatton.wordpress.com/2015/07/06/history-on-currency-part-iv-harriet-tubman-and-the-way-we-tell-the-story/

[6] A post on the issue of the law, morality, and history:  https://jmarcuspatton.wordpress.com/2015/05/18/the-problem-with-history-morality-and-the-law/


No problem?

Sometimes a slight incongruity can reveal a major fault line.  The words people use reveal a lot about their point of view.  Certainly, choice of words is also shaped by the extent of one’s vocabulary, personal habit, and the influence of popular culture.  But when communication is the goal, meaning drives usage far more than habit or popular fashion can.

As a teacher who has examined, explored, and evaluated writing and other forms of verbal expression for years, I notice how people express themselves.  There have been many times during the course of a routine interaction – for instance when I thanked someone over a counter in a commercial establishment for rendering a service I had just purchased – when the reply seemed oddly out of place.  I didn’t really understand why, until I stopped to figure out the implications of the response.

I have been struck in recent years by the use of the phrase, “No problem.”

If you think about it, there is a big difference between “No problem,” and “You’re welcome.”  Bear with me now.  This is not just another distinction without a difference, or an old man decrying modern slang.  Words do have meaning, and sometimes uncover truths we don’t consciously intend when we speak them.

“You’re welcome” implies that I am entitled to whatever I have been given.  If it is a service or good I just paid for, I certainly am entitled to it, and my “Thank you” is more of a polite acknowledgement than an expression of gratitude. “You’re welcome” is a polite statement of the obvious. There is nothing wrong with being polite.

“No problem,” on the other hand, implies that whatever prompted my “Thank you” would be considered an imposition but for the personal generosity of the person providing it.

Would you mind if I cut in front of you in line?  I am running late and I have only one item while you have 10.

That is the kind of situation that would call for a response of “No problem.”

But if I thank a cashier for handing me my change from a cash purchase I just made, “No problem” suggests that giving me my money was optional, but that the cashier was kind enough – or wasn’t too busy – to do it anyway.  The phrase does not fit the actual situation at hand – that the person handing me my money is being paid to do so as a routine part of his job.  If it is a problem for the cashier, that is a matter for him to take up with his employer, not a customer.

But I suspect that the real problem is that the cashier is struggling with his role as a server, and is longing to assert himself as an individual.

Americans are a proud people – proud of our individuality, proud of our personal autonomy, proud of our accomplishments.  And even if we find ourselves in a situation in which we are not feeling very autonomous or accomplished, we still tend to puff out our chests.

Which makes it hard sometimes to reconcile individual pride with social responsibility.  As much as Americans love the idea of the rugged individualist, in our communities we are part of a group, and we all have roles to play.  Some of these roles include very little autonomy.  Some of them require servility.  That’s a tough role for a proud individual, if it’s the only one we get to play.  But we do get plenty of practice, and from a very young age.

In the complex community of a public school, which is for most of us the training ground for life as adults, we often have a difficult time encouraging individual pride while teaching social responsibility.  Schools tend to be authoritarian institutions, where adolescents who are trying to discover what makes them unique find themselves standing in line to go through the same motions as everyone else.  Schools are very good at rewarding compliance, and outstanding performance within narrow parameters, but they are even better at punishing individuality and noncompliance.

For many, leaving school is a declaration of independence.  It is an opportunity to be free from the scholastic regime of compliance.  So where do young people go to express their individuality?  Where can they accomplish great things and experience the pride that comes with accomplishment?

Not as a temp worker in a warehouse.  Not as a server in a coffee shop.  But for many young people just out of high school, these kinds of jobs would have to be considered good gigs among the available options.

As a teacher, I have seen young people struggle to establish an identity within an educational institution that loves facelessness.  As a man in the wider community in which I live and work, I have seen young people in monotonous, dead-end jobs, seemingly numbed by the realization that this might just be all there is.

But in the bleak landscape of the workaday world, there are signs of life – pride, perhaps unsupported by any great accomplishment behind that coffee shop counter – but pride just the same, finding the sunlight like new green growth between the cracks of a sidewalk.  And like that new growth, a mildly inappropriate response over a counter in a coffee shop might go unnoticed a hundred times before it is recognized as unwanted.  It might never be recognized as an assertion of individual pride.  But I believe that is exactly what it is.

We have to do a better job of preparing children to take productive roles as adults in their communities.  We have to do a better job of encouraging individual pride in accomplishment for something other than meeting standards that someone else has set.  Schools are not doing as well as they should, and you don’t have to order a cup of coffee to see the evidence that this is true.


Learning from tests

“May I use a dictionary?” the student wanted to know.  His class was taking a rather challenging Advanced Placement U.S. History test, and I thought that it was a reasonable request, so I agreed that he could consult one.  It was only after the test was over that I saw that the book he had taken from the shelf was one titled Dictionary of American History.  It was less a “dictionary” in the conventional sense, and more of a reference book with explanations of different terms and events.  He had looked up – or tried to, anyway – the answers to all the questions he could not answer on his own.

I guess he thought that if I caught him in the act, he could defend himself by claiming that it said “dictionary” on the cover.  But he didn’t seem to expect what happened when I caught him after the fact.

I told him that he could keep whatever grade he earned by putting down correct answers on his test, but he had to tell me what he had learned from the reference book during the test.

A bit awkwardly, and seemingly unsure whether my offer was just a ruse to get him to confess, he began a recitation of the questions that had stumped him, and the information he had gathered from the reference book.  As I questioned him to find out more precisely what he knew or thought he knew before the test, and what he learned during the test, and even what he still hadn’t been able to find out, he warmed to the task.  It was like detective work, but instead of him playing the role of criminal suspect, he was a collaborator with me in discovering the missing knowledge.  In fact, he was bringing me up to speed.

In the end, he still had questions he could not answer – but he knew what those questions were.  And he had my encouragement and approval to research more and find out the answers.  I told him that I really didn’t care whether he walked into the classroom that day knowing everything he needed to know for the test.  It was much better to walk out of the classroom absolutely sure of what he did know.

Testing is an opportunity for learning.  There should not be a wall between the two.  Failure may be the best way to recognize what you need to do in order to succeed.  Failure is a bump on the road, but it does not need to be a milestone.  Why make a curious mind wait for the return of a paper covered with red ink – memorializing shortcomings?  Why not make testing about more than the retention of information?  Why not allow it also to be about the process of learning?
