It’s difficult to ignore the appealing certainty that the times in which we are alive are unique and fundamentally different to any that have gone before. The reason most often cited is that the internet has changed everything.

Technology has been transforming education for as long as either has been in existence. Language, arguably the most crucial technological advancement in human history, moved education from mere mimicry and emulation into the realms of cultural transmission; as we became able to express abstractions, so we could teach our offspring about the interior world of thought beyond the concrete reality we experienced directly.

This process accelerated and intensified with the invention of writing, which Socrates railed against, believing it would eat away at the marrow of society and kill off young people’s ability to memorise facts. He was right. The transformative power of writing utterly reshaped the way we think and how we use knowledge. From the point at which we were able to record our thoughts in writing, we no longer had to memorise everything we needed to know.

But education was very much a minority sport until the advent of the printing press, when books suddenly started to become affordable for the masses. Before Gutenberg, there was no need for any but a privileged elite to be literate, but as the number of printed works exploded, the pressure on societies to prioritise universal education slowly grew until, by the mid-twentieth century, education had become not only a requirement but a right.

The rate at which we now produce knowledge is staggering. The architect and inventor Buckminster Fuller identified what he called the Knowledge Doubling Curve.* He noticed that until 1900 human knowledge doubled approximately every century. By the mid-twentieth century, knowledge was doubling every 25 years. Today, on average, human knowledge doubles in just over a year, and some estimates suggest that what we collectively know will soon double every 12 hours.

No wonder so many have been persuaded that there is no longer a need to learn facts: what we know will quickly be superseded and, after all, we can always just look up whatever we need on the internet. This erroneous belief has certainly had a transformative, if largely nugatory, effect on education in the last decade or so. I say nugatory because knowledge is only knowledge if it lives and breathes inside us.

There’s a world of difference between knowledge – the stuff we think not just about, but with – and information. To make sense of the vast swathes of information available to us we need to know quite a lot. If you doubt this, consider what happens when you ask a student to look up an unfamiliar word in a dictionary: they may end up with five or six more words they have to look up in order to understand the definition of the first. Some things we just need to know. What we know makes us who we are. You can’t think about something you don’t know – try it for a moment – and the more you know about a subject, the more sophisticated your thoughts become. In order to critique the world we need to know as much as possible about its science, history, geography, languages, mathematics and culture.

But in response to the apparent obsolescence of knowledge, schools started reinventing themselves as places where children would learn transferable skills which would allow them to navigate the shifting, uncertain world of the future. Maybe the traditional curriculum of school subjects has had its day, as tech guru Sugata Mitra claims. Maybe all we have to do is teach kids how to use Google and they will magically teach themselves all they need to know? After all, most of what schools teach is, it seems, a waste of time. According to Mitra, the Chinese and Americans “don’t bother about grammar at all”, children don’t need to know how to spell, and “the less arithmetic you do in your head the better.”

There is nothing more philistine, more impoverished, than reducing the curriculum to the little that’s visible through the narrow lens of children’s current interests and passing fancies. How do they know what they might need to know? And in any case, do we really want to educate the next generation merely in what we think they will need?

Of course the future is uncertain and unknowable, so how best can we prepare students for it? Well, perhaps we should stop delivering rapidly outdated facts and instead teach students the skills they will need to thrive in the 21st century. And what are these futuristic skills? Typically they are considered to include critical thinking, problem-solving, communication, collaboration and creativity. Wonderful things, all of them – but attempting to substitute them wholesale for a more traditional school curriculum comes with problems.

Problem 1

Are these really ‘21st-century’ skills? Or hasn’t this stuff always been pretty important? And if it was important for Socrates to think critically, Julius Caesar to solve problems, Shakespeare to communicate, Leonardo da Vinci to be creative and the builders of the Great Wall of China to collaborate – how on earth did they achieve what they did without a specific, 21st-century learning curriculum? The point is, these skills are innate human characteristics. We all, to a greater or lesser degree, use them all the time. How could we not? Of course we can encourage children to be more creative, critical and collaborative, but can we actually teach these things as subjects in their own right?

Problem 2

How, exactly, do you teach someone to communicate or solve problems in more sophisticated ways? What is it we want students to communicate? What sorts of things do we want them to create? What do we want them to collaborate on? The problem with attempting to teach a generic skill like critical thinking is that you must have something to think critically about; if you know nothing about quantum physics, no amount of training in critical thinking is going to help you come up with anything very profound on the subject. Likewise, to be truly creative we need to know a lot about the form or discipline we’re trying to be creative in. Skills divorced from a body of knowledge are bland to the point of meaninglessness. These so-called 21st-century skills are in fact biologically primary evolutionary adaptations. As I explained here, we are innately creative; we solve problems as a matter of course and collaboration comes to us naturally. What makes people appear to struggle with these innate attributes is that we want them to use them to manipulate biologically secondary, abstract knowledge. Anyone can collaborate on a playground game, but to collaborate on finding a cure for cancer you would need a lot of highly specialised expertise. The only thing that makes these innate skills desirable in the 21st century is the academic content on which they depend.

Problem 3

Is teaching facts really such a bad thing? Of course it’s true that we’re discovering new information at an exponential rate and that no one can ever learn anything but the tiniest fraction of what is known. Apparently, when Newton formulated the laws of force and invented calculus he knew everything that was then known about science. This is no longer possible; as our collective knowledge grows, our individual ignorance seems to expand. It might be the case, then, that the amount of new information is doubling every two years – but is it really true that half of what students on a four-year technical degree learn in their first year will be outdated by their third, as the makers of Shift Happens assert? Maybe those studying highly specialised areas of computer science will find the programming languages they learn are quickly superseded, but that doesn’t make the practice and discipline of learning them in the first place useless. And in most other fields of human endeavour – medicine, engineering, law, teaching – new discoveries and practices build upon a settled body of knowledge. Depriving students of this foundation is in no one’s interest and will do nothing to prepare young people for an uncertain future.

Looking backwards to move forwards

We are still easily seduced by the bright lights and glamour of the new (even when it’s not ‘new’ at all, just packaged and lauded as such).
It’s all very well to criticise current qualifications, but to suggest that exams should be aligned with some supposed change to the way students learn and think is naively foolish. In case you doubt that anyone sensible might take this line, I give you this example:

Th[e] latest revision to curriculum and assessment has not been designed for young people living in the 21st century, with 21st-century minds, and should be challenged. It does not fit with the era we are living in and it penalises a generation of young people who use their brains and knowledge differently through technology.

We may be living in the 21st century but, despite the many ways in which technology has advanced, we are still very much the descendants of primitive hunter-gatherers. If we really wanted exams that fit the way we think, we’d probably be best off testing basic survival skills. Regrettably perhaps, the modern world places an increased value on those brains that can best rewire themselves to cope with applying and manipulating abstract knowledge. Thankfully – as I explained here – although we may be out of practice, remembering stuff is not only intellectually undemanding, it also helps students to think better.

* More properly this should be termed an information doubling curve.