Digital Learning : Tailored or Taylored?


The message I’m trying to send is that technology is political, and that many decisions that look like decisions about technology actually are not at all about technology – they are about politics, and they need to be scrutinized as closely as we would scrutinize decisions about politics.


Evgeny Morozov


They say a week is a long time in politics, and the past week has been quite a big one for the UK government, which has played not one but two cards in a recent initiative to demonstrate its belief in the role of technology in education.


In one initiative, the Year of Code, the government has positioned technology as an outcome of learning rather than an enabler of it. Although, to be fair, it’s not entirely clear what they have demonstrated beyond a woeful misunderstanding of the subject.


The initiative’s director, Lottie Dexter, was thrown into the spotlight, like a sacrificial lamb to the slaughter, to explain the project on national television, only to expose that she really didn’t know anything about computer programming beyond her scripted conviction that it was now an essential skill like reading and writing. It was regarded by many as car-crash TV that also revealed that the government-influenced committee of yes-people behind the initiative had next to no knowledge of the subject. Fast forward to 5:24 in the video below.



The second initiative, the Educational Technology Action Group (ETAG), seems more promising: a committee of the usual suspects and educational technology evangelists, chaired by the respected educationalist Stephen Heppell. It was set up by UK government ministers Michael Gove, Matthew Hancock and David Willetts with the brief


to identify barriers to the growth of technology that have been put in place (inadvertently or otherwise) by the Government, as well as thinking about ways that these barriers can be broken down.


A government that entered parliament on a mission to close quangos is now on a mission to create as many as possible in its own image.


What could have happened to engender this about face and commitment to technology for learning?


Could it be, as open data designer Adrian Short suggests, a demonstration of the administration’s “neoliberal agenda”, which calls for economic liberalization, free trade and open markets, privatization, deregulation, and an enhanced role for the private sector in modern society?


My concerns were raised initially by a speech given by UK skills minister Matthew Hancock at a private event in March 2013 to launch an EdTech incubator, where he showed scant understanding of the education sector but a good nose for potential business growth, which, after all, is his job.


Since then, having garnered the support of EdTech start-ups looking for the door marked entry, Mr Hancock has grown bolder in his statements. By December 2013 he was going on the record announcing his government’s plans for teachers to “take a backseat in the imparting of knowledge”.


This was followed by a speech at the recent BETT EdTech trade show in London, where he said: “An algorithm then takes that data, and works out how each child could learn more”.


It’s quite possible that Mr Hancock was using a standard-issue government algorithm for speech writing, given that in 2005 Ruth Kelly, Labour’s Education Secretary, said: “And in the future it will be more than simply a storage place – a digital space that is personalised, that remembers what the learner is interested in and suggests relevant web sites, or alerts them to courses and learning opportunities that fit their needs.”


Which brings us to these algorithms that are going to enable teachers to take a back seat and allow “technology” to decide what and how much your child can learn. I’m curious about who will own these algorithms, who will write them, how they will work and how they are biased. I say biased because, as we should know by now, algorithms aren’t neutral: they are designed and written by people, i.e. they are mediated. Suggesting they aren’t biased is like saying newspapers like the Daily Mail or The New York Times aren’t biased. Of course they are.
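To see where the bias of such an algorithm would live, consider a minimal sketch of an adaptive learning rule. Everything here is hypothetical: the function, the thresholds and the categories are illustrative design choices, not any real product’s logic. The point is that each number and rule is a human judgement about what counts as “mastery” or “struggling”.

```python
def next_step(score: float, attempts: int) -> str:
    """Decide what a learner sees next from two data points.

    The cut-offs below are the designer's judgements, baked into
    code: 0.8 as "mastery", 0.5 and 3 attempts as "struggling".
    The bias lives in these numbers and in the choice of which
    data to collect in the first place.
    """
    if score >= 0.8:
        return "advance"    # the designer decided 80% equals mastery
    if score < 0.5 and attempts >= 3:
        return "remedial"   # the designer decided who gets held back
    return "repeat"         # everyone else drills the same material

print(next_step(0.85, 1))   # advance
print(next_step(0.40, 3))   # remedial
print(next_step(0.60, 1))   # repeat
```

However sophisticated the statistics behind a real system, the same structure holds: someone chose the inputs, the thresholds and the possible outcomes, and those choices are not neutral.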


We know that digital platforms offer the most amazing possibilities for learning, and that isn’t the question here. My book and digital resources for Learning {RE}imagined will document many interesting digital deployments for learning from five continents. The question relates to the point that writer Evgeny Morozov makes in the opening quote of this post: technology is far from neutral, it is political.


In the early 1900s the American engineer and management consultant Frederick Taylor, in a desire to improve industrial efficiency, conceived the “scientific management” approach to manufacturing. The underpinning of scientific management is a disdain for tradition preserved merely for its own sake, or to protect the social status of particular workers with a particular skill set. Its objective was the transformation of craft production into mass production. Whilst Taylor’s management theory was largely obsolete by the 1930s, most of its themes are still important parts of industrial management thinking.


In particular, Taylor’s management approach fetishised data, collected at numerous points during the manufacturing process, which management could use to determine what steps to take to improve efficiency. Taylorism, therefore, was probably one of the first attempts, at the turn of the 20th century, to use what today we call “Big Data”.


The problem with all this data is that we arrive at what the French social theorist Jean Baudrillard suggested when he wrote, in his work Simulacra and Simulation, “We live in a world where there is more and more information, and less and less meaning”. What he means here is that the data tells us what is happening but not why it’s happening.


When I discussed some of my concerns with Stephen Heppell, he told me that it would be important for educators to remain vigilant in the face of these prospects and, of course, he is right. We need to make sure that digital platforms for learning are not appropriated within a political tactic to introduce Taylorism into our education systems. That is, we shouldn’t believe that technology is an opportunity to de-skill and de-professionalise the teaching profession, to remove the craft of teaching in order to achieve the efficient manufacturing of children to a set of industrialised test standards.


Understanding as we do that algorithms and technology aren’t neutral, that technology isn’t, as Noam Chomsky suggests, simply a tool like a hammer, we should remember that a love of technology by itself doesn’t breed change. We must, as Heppell suggests, be vigilant, and we must, as Morozov implores, scrutinise technological decisions as we would the political.


It seems common today for our techno-determinists, evangelists and fetishists to simply reject all criticism as anti-technological and anti-modern, but this is unhealthy and stifles an important discourse around the deployment of digital platforms within our education systems. Ironically, the stifling of this debate could mean that technology continues to have little or no transformative effect on learning; rather, it becomes a management tool for enforcing 19th-century ideas about schooling.


“They’ll have that repeated forty or fifty times more before they wake; then again on Thursday, and again on Saturday. A hundred and twenty times three times a week for thirty months. After which they go on to a more advanced lesson.”


“Till at last the child’s mind is these suggestions, and the sum of the suggestions is the child’s mind. And not the child’s mind only. The adult’s mind too—all his life long. The mind that judges and desires and decides—made up of these suggestions. But all these suggestions are our suggestions!


Aldous Huxley – Brave New World


Further reading


Silicon Valley – Open Up (Algorithmic Bias)
Evgeny Morozov