January 02, 2014

Shock of the new


Down through the ages, there has always been resistance to change. The infographic on this page bears testament to that fact. Specifically, there has always been opposition to new technologies. Sabotage - a word now synonymous with subversion through deliberate destruction - was first coined following the 15th century attempts of Dutch workers to break the newly introduced and very unpopular textile looms. It was rumoured that the workers threw their sabots (wooden clogs) into the machinery to break the cogs, because they feared that the new machines would render human workers obsolete. The same mentality was present when robots were introduced into the car manufacturing industry late in the last century, although the opposition manifested itself in less overt ways. Even today, many people still shun automated teller machines (ATMs) because they don't trust them.

Why are people technophobic? Is it the shock of the new? Is it that people are simply scared of change? Often it's the uncertainty that new technologies bring which seems to faze people into resisting them. Fear of the unknown has a strong effect on our thinking. Some of the warnings are, on the surface, quite reasonable, but if you look just beneath the facade of the caveats, there resides a kind of techno-panic - an unreasonable fear of what the technology might really bring to society.

Much of our fear of technology is represented in popular culture. In the Terminator and I, Robot movies, for example, our own creations become a threat to our future, our humanity, our very existence. This trope can be traced back at least to Mary Shelley's Frankenstein, and arguably even farther back in time to the myth of Prometheus. Bizarrely, there are strong links from Mary Shelley - via her poet husband Percy Bysshe Shelley and a holiday the couple spent with their close family friend and fellow poet Lord Byron - to Byron's daughter, the technophile Ada Byron (more commonly known as Ada Lovelace, acknowledged as the first computer programmer). Was there a connection between the two stances? One would surmise that the influences were there, and that conversations between the Shelleys and the Byrons might have led to discussions around the social implications of the emerging technology of the time.

Health warnings are the most prevalent warnings we hear today about new technology. When mobile (cell) phones were first introduced more than two decades ago, concerns were raised by health experts over the levels of non-ionising radiation emitted by the devices. Many articles published during that period expressed anxiety over the long-term legacy of using mobile devices close to the head, and predicted an epidemic of brain tumours and other health problems. More recently, the naysayers are still at it, bad-mouthing those who are pioneering wearable technologies with labels such as 'glassholes'. Contemporary critics of technology such as Andrew Keen, Baroness Susan Greenfield, Nicholas Carr and Tara Brabazon have variously argued that the Internet has negative impacts on education, memory and perception, knowledge representation, scholarship and culture. There seems to be no end to the column inches in the popular press that are dedicated to exposing the failure of education and the role technology may be playing.

History has shown us that the technophobes and doom merchants are always among us, and that they will not desist. But we also know this - technology will continue to be with us, and it is just as resilient as its detractors. Technology does not stand still, but continues to evolve at pace. Yes, we know that introducing technology into education is not a silver bullet. But it is up to us as educators and learners to wisely harness its potential for better learning. Applied in appropriate circumstances and contexts, technology is making a difference to our children's learning. To ignore it or shun it means we are depriving younger generations of opportunities for authentic, real-world learning.

"Every man is a creature of the age in which he lives, and few are able to raise themselves above the ideas of the time." - Voltaire

Graphic source: Fear of the New

Creative Commons License
Shock of the new by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


January 01, 2014

The mobile agenda

2013 was quite a busy year for me. Alongside my own commitments to teach and research, I was also invited to participate in events held in several countries outside the UK. During the year I found myself travelling to Singapore, Las Vegas, Doha, Riyadh, Ljubljana, Oslo, Brussels, Berlin, Prague, Cairo, Sligo and Malta. I also presented over a dozen live webinars, including keynotes for the excellent Reform Symposium on the future of education and the influential eLearning Guild. However, it was a short series of events held in the UK that arguably provoked my most productive few weeks, at least in terms of thinking and writing. I was very pleased to be invited by Learning Pool to headline their autumn tour, at venues in the cities of Sheffield, Cardiff and London. During the conference series, I worked closely with Denise Hudson-Lawson and Andrew Jacobs, two of the other invited speakers, in workshops on mobile learning. This was a subject high on the agenda at Learning Pool, and remains a very important trend for work-based learning. Mobile has interesting implications for the compulsory and higher education sectors too, not least because untethering learning has radical consequences for the future of resourcing, curriculum development, teacher roles and autonomy for learners. It also presents new challenges for organisations around interoperability, data protection/security, personal metrics and privacy.

Working with Andrew and Denise was a lot of fun - we were involved together in 3 excellent panel debates during the Learning Pool tour - and it was also creative and thought provoking to such an extent that between us we were inspired to generate several blog posts around mobile learning. Here are a few of Andrew's thoughts on our collaboration. The posts I wrote during this period are listed below, complete with the ensuing conversation from readers. I hope you find them as thought provoking and relevant to your own work as I have. As ever you are very welcome to leave your own comments on these posts and the ideas that they represent.
Photo by Steve Wheeler

Creative Commons License
The mobile agenda by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


December 30, 2013

Standing on the shoulders of giants

Steve and Sugata at EDEN in Oslo
Yesterday I posted some of my first reflections on a memorable year of encounters. Today, as the year draws to a close, I think back on the past year and I am amazed at the incredible people I have had the pleasure to work with, interview or simply spend some time with. You can learn a lot from just listening to others, and occasionally asking the right questions. I'm very grateful to have such a wonderful job, and it really feels like I'm standing on the shoulders of giants.

In June I had the singular honour of being invited to host a unique keynote session in Oslo for the EDEN annual conference. The event was attended by delegates from many nations, and all were assembled on Day 2 of the event to listen to two of the greatest education celebrities of our time. In the room with me was Sugata Mitra, famed for his 'Hole in the Wall' computer projects and child-driven education, and more recently for his work around 'minimally invasive education' and the School in the Cloud. Just before the conference, it had been announced that he had won the prestigious TED prize of one million US dollars for his innovative research into learning. Sharing the keynote session with him, on a live link from California, was education guru Sir Ken Robinson, celebrated for his TED talks and his unique perspectives on creativity and learning. His searing TED critique of the education system, 'Do Schools Kill Creativity?', has alone received over 20 million views.

My job was an absolute dream - to introduce Sir Ken and Sugata, and moderate the conversation between them. I didn't need to do very much, to be honest. I can't recall a better time spent on stage than I had listening to two of the sharpest minds choreographing such a fascinating, intellectual dance. The conversation was humorous, insightful and challenging in equal measure. To say the audience was enthralled would be an understatement, and listening to the conversations after the event, it was clear that it had been the highlight of the conference for them.

During the 3 days of EDEN I also had the pleasure of recording video interviews with several more well-known people who were attending the conference, including Michael Moore, Grainne Conole and the American Psychological Society president Bernard Luskin. EDEN in Norway will be remembered for many things, but for me, its keen focus on schools and compulsory education - not normally a major theme of the event - was an important feature. The children who took part, presenting their ideas for more engaging learning environments, were another great highlight of the conference. Read my review of the children's contribution right here.

You can read my interview with Professor Sugata Mitra on the EDEN site and watch our video conversation on the EDEN YouTube Channel. More reflections on 2013 tomorrow.

Photo by Steve Wheeler

Creative Commons License
Standing on the shoulders of giants by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


December 29, 2013

Learning from the legends

The year's end is a time for reflection. For me, it is a good time to reflect on what I have learnt and achieved during the year. This year I spent some time with a few unique, generous and very talented people. It's amazing what you can learn from others if you ask the right questions and listen to their replies. It was a great privilege to work alongside them and to learn from their experience, passion and knowledge, and not a single minute of the time I spent with them will go to waste. It is all stored up, and will be used, repurposed and shared in the coming years. Here's the first part of the story:

Nic Negroponte and me in London
In January, I was delighted to be invited to speak at the Learning Technologies event in London, a two-day conference and exhibition for Learning and Development professionals held at Olympia each year. Unfortunately I was only able to spend the first day at Learning Technologies, because the following day I was scheduled to give two keynotes at the BETT Show, across town. Event Chair Don Taylor invited me to dinner with several other invited speakers the evening before Learning Technologies. It was a great honour for me to sit next to Nicholas Negroponte, a man considered by many to be one of the living legends of learning technology. Nic was responsible for establishing the MIT Media Lab, and was an early investor in and columnist for Wired Magazine. He has also broken new ground through his innovative One Laptop per Child project. Nic's work has had global impact, and continues to positively influence the lives of learners worldwide. I spent a couple of blissful hours in conversation with him as he regaled me and my fellow dinner guests with tales of the early days of educational computing, and he even signed my 1995 first edition copy of Being Digital!

In February, I was in Riyadh, Saudi Arabia, as an invited speaker at the Ministry of Higher Education ELI Conference. Several other speakers from across the globe had also been invited to take part, and I spent some quality time with author and speaker Richard Gerver, who impressed me with his passion for education and his evangelistic fervour for providing creative learning environments for kids. Richard and I also spent part of a day together with a film crew, making a short documentary on learning and creativity for the Saudi Ministry.

Steve W meets Steve W: Wozniak and Wheeler in Riyadh
On the morning of my own speech at ELI, I wandered into the sumptuous speaker lounge the conference organisers had provided, and found myself face to face with a living legend - Steve Wozniak. Steve is co-founder (with Steve Jobs and Ronald Wayne) of Apple Computer, and was the force behind the development of the Apple I and Apple II personal computers and many subsequent household products. He is now one of the most sought-after keynote speakers on the learning technology circuit. I spent 20 minutes alone with him discussing education, technology and the way forward. He spoke of his own school experience, the creative processes involved in developing Apple technology, and his relationship with Steve Jobs. It was a fascinating, unforgettable time, and I'm so pleased I was able to spend a little time with someone who has literally changed the world. I just wish I'd been able to record the conversation.

Tomorrow: Standing on the shoulders of giants - interviewing Sugata Mitra and Sir Ken Robinson.

Photographs by Steve Wheeler

Creative Commons License
Learning from the legends by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


December 20, 2013

2014 - the year of the wearable?

Wearable computing is just one small step up from mobile technology. A large percentage of the world's population is already connected through mobile phones, e-readers, games consoles and tablet computers. The next stage is inevitable - that users begin to wear their tools as jewellery, clothing, eyewear and even as implants. Will 2014 be the year when wearable technology emerges as the next stage in our connected future?

Ben Hammersley seems to think so. In the January 2014 issue of Wired Magazine, he argues that whilst the last decade has been about smart mobile technologies, the next decade will be about wearable computing. Hammersley is convinced that it isn't the technology - the gadget - that is the driving force of this transition. Rather, it is the growing desire to have a device that is with us, or in the words of Gerd Leonhard, 'on us' at all times. The wearable device, whether it be Google Glass or some other wearable yet to be conceived, is only the physical manifestation of something a lot larger and more complex. Wearables are the tip of a vast, complex infrastructural, social and technological iceberg that includes cloud computing, social interaction, content generation and sharing, context awareness and a whole host of other services, effects and influences. They are the next stage of the interface between humans and the vast world of computational intelligence. Will wearable technologies fulfil their promise to become 'mind extensions', or will they end up as just another expensive gadget - a passing fad that disappears into the mists of obscurity? I think not.

My personal view is that the time is now ripe for the widespread and rapid adoption of wearable computing. I believe this for a number of reasons. The stage is already set: today's culture is rich in digital technology, and several generations have now grown up with the expectation that they can access information when and where they want it. Younger generations are wedded to their smart mobile technologies, both psychologically and socially.

Entire industries have evolved and emerged into the mainstream premised on the use and exploitation of smart technology. They would not be able to function without it.

We are also increasingly aware of the connections that make our world move forward, politically, economically and socially, and we are reluctant to step backwards or stand still and risk this progression. The rise in nomadic and itinerant working, and the flexibility that results, is also a very recognisable facet of life and employment in the 21st century. Knowledge has radically increased, and has been more widely disseminated, as a result of access to smart mobiles, social media and the Web.

Wearing computers as jewellery, embedded inside our clothing or even as implants inside our bodies seems to be the next stage in our development as technology users. We are already comfortable wearing such enhancements on our person - wrist watches inform us of the passage of time, and spectacles enhance our vision when it deteriorates. Plans to embed digital tools inside wristwatches and spectacles are already well under way.

Finally - and this is potentially what matters the most - venture capitalists are beginning to invest heavily in businesses that are developing new wearable devices. Where the money goes, innovation usually follows.

These are the signs that wearable computing will not go away, and will probably emerge as the next stage in our increasingly intimate relationship with technology. We will use computers in increasingly social and personal ways. As Hammersley intones at the end of his article: 'We will wear it like we wear our heart: on our sleeve.' 

Photo by Antonio Zugaldia on Wikimedia Commons

Creative Commons License
2014 - the year of the wearable? by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


December 19, 2013

Drive like an Egyptian

Anyone who has visited Cairo will have experienced the city's incredible traffic phenomenon. When the drivers of the millions of cars in the Egyptian capital are not at a complete nose-to-tail standstill (which is often), they are racing and weaving their way along the city's roads at top speed, watching for small gaps to open up, missing each other by fractions of a centimetre, all amidst a perpetual cacophony of horns. From the fog of dust and exhaust fumes you see people emerge, walking in the road, some actually in between the cars, and fortunately, the drivers are skilfully able to avoid them too. This chaotic choreography takes quite a bit of time to get used to. For the first few times you venture out in a car (please don't attempt to drive yourself - ask someone local to drive you), you find yourself holding on tightly, white-knuckled, to anything that is fixed, holding your breath, staring wide-eyed and horribly fascinated at the many near misses and the high speed chaos unfolding before you. You need nerves of steel. You sit there praying that your vehicle is not going to be hit. And somehow, it never is.

And then you notice that amongst all this chaos - the blaring horns, the brinkmanship as two drivers try to manoeuvre swiftly into a space that couldn't possibly accommodate them both, the endless revving of engines and clouds of exhaust fumes - some kind of organisation is actually present. Every driver is acutely vigilant ('360 degree vision', one of my drivers called it), and regularly sounds their horn to warn other drivers where they are in relation to each other. It is a kind of organised chaos that somehow works, because although the large majority of cars have dents and scratches, there tend to be very few serious accidents. Every driver plays the game supremely well, knowing exactly what the limits of the rules are and how they are applied. In a city the size of Cairo, with over 9 million inhabitants, this is both remarkable and expedient.

Such self organisation takes a little time to evolve, but those within it must learn quickly to survive. Imagine venturing out to drive on the streets of Cairo for the first time. You would need to learn pretty fast, and adopt the conventions of driving with your horn, or risk a serious accident. You would need to know that it is not unusual for two cars to occupy the same lane with just a centimetre between them, and that cars weave in and out of the lanes continuously. Driving at top speed and braking suddenly are also completely acceptable, and cutting across other drivers is just a fact of life. All of this is normal in Cairo, and its drivers know these rules implicitly by being immersed in the culture of Egyptian driving.

This is a metaphor for self organised learning spaces, where unwritten rules have evolved to maximise the potential of the tools and environments with which we are increasingly familiar. Learning is no longer linear. Learning in digital environments is a meandering experience, where hyperlinks take you down new and surprising avenues, and conversations take unexpected turns. On wikis and other shared spaces, there is a need to simply let go of content once you have submitted it, because as sure as there are pyramids in Egypt, someone will come along and edit (or perhaps even delete) your contribution. Be prepared for others to openly (and sometimes harshly) criticise your ideas when you blog, or post your video up to YouTube. If in doubt, or if you don't have a thick skin, don't post. There's your slip road out of the chaos. Avoid potential car crashes by checking your facts and ensuring that your arguments are defensible. There's your seatbelt. Check to ensure that the images and content you repurpose aren't copyrighted material. That's being streetwise. Watch out for those on social media who are simply out to scam you or rip off your ideas and make them their own. Moderate comments on your blog to eliminate spam and trolling. There's your 360 degree vision. And finally, maintain your digital presence, and protect your digital footprint - by engaging your brain before you tweet, post or upload, you will preserve your reputation (and maybe your job), and your indiscretions won't come back and rear-end you.

Photo by Steve Wheeler

Creative Commons License
Drive like an Egyptian by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


December 08, 2013

The future of knowledge navigation

Will we ever reach the point in our human development where our relationship with our technologies becomes so natural that they become a part of us? Will we ever be able to claim that they are a problem-free extension of our physical capabilities, seamlessly connected to our minds? Many would argue that we have already achieved this. Perhaps, though, this would have been an easier question to answer in the last century than it is today. The answer then would have been 'yes - we have already achieved it'. The widespread use of diverse technologies such as transportation vehicles, manufacturing tools, weapons and even writing implements has shown that we can create technology to extend our abilities beyond our natural physical skills, and also adapt our bodies and minds to incorporate tools. This is an effect epitomised in Marshall McLuhan's famous declaration that 'we shape our tools, and thereafter our tools shape us'. Although this has social and cultural connotations, it also reveals that we are naturally pliable, and can adapt our skills and expectations toward new ways of doing things. In this case, we learn to use our tools to extend and enhance our limited physical capabilities.

The problem is that as our tools become increasingly complex, we need to learn ever more complex skills to be able to optimise our use of them. The computer is a classic example of complex technology that can be difficult to use. Things have dramatically improved since the introduction of graphical user interfaces, and Siri and Kinect have done a little to bring us closer to improved voice and gesture control. However, computers have also exponentially increased in power and utility, and we will always need to run to keep up. We can speculate that most of us fail to harness the full potential of computers because we simply don't have the skills to exploit them fully.

Many skills and literacies are required to maximise our use of computers so that they can navigate knowledge on our behalf. The fact that we now carry very powerful computers around with us in our pockets does little to change the problem - we are sentient, autonomous and emotional, whereas computers are simply cold, unthinking machines that blindly follow the instructions they are given. This differential is stark and unforgiving. We still need to develop skills and competencies in the use of technology before we can reap its benefits. This takes time and effort, and no small amount of stress when things don't work out as we had anticipated.

Stephen Wolfram's recent announcement may change all that. The Wolfram Language he has announced seems to offer a solution to many complex human/computer interface problems. According to Wolfram, symbolic programming is the future of systems design. He says:

"There are plenty of existing general-purpose computer languages. But their vision is very different—and in a sense much more modest—than the Wolfram Language. They concentrate on managing the structure of programs, keeping the language itself small in scope, and relying on a web of external libraries for additional functionality. In the Wolfram Language my concept from the very beginning has been to create a single tightly integrated system in which as much as possible is included right in the language itself."

Wolfram also talks about the fluidity of the new language, suggesting that coding and data can become interchangeable:

"In most languages there’s a sharp distinction between programs, and data, and the output of programs. Not so in the Wolfram Language. It’s all completely fluid. Data becomes algorithmic. Algorithms become data. There’s no distinction needed between code and data. And everything becomes both intrinsically scriptable, and intrinsically interactive. And there’s both a new level of interoperability, and a new level of modularity."

Time will tell how far the Wolfram Language will actually go to ameliorate the problems we face when we try to use technology and complex interfaces to solve human problems. Yet one thing is clear: the new language will present new ways to navigate knowledge, and may indeed represent a clear advance in how we manage data, and how it can be incorporated into our everyday lives. Anyone who has used Wolfram Alpha as an answer engine would probably agree. I'm eagerly looking forward to seeing what the Wolfram Language will be able to do for our use of computers in the future.

Image by Frankdzines

Creative Commons License
The future of knowledge navigation by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 23, 2013

Sleepless in cyberspace

It was interesting to see that many of my students identified with some of the 'symptoms' of computer dependency presented in the slideshow below. It got me thinking - how many of us are dependent to some extent on the technology we carry around with us? Computer dependency (not the same thing as 'addiction') is defined as 'relying on computers to fulfill a need or compulsion'. Dependency manifests in many ways, from mild to severe. Have you ever, for example, been without your mobile phone or other device for a period of time and suffered 'withdrawal symptoms'? Have you ever started to become anxious that you are missing out on conversations on Twitter, or status updates from your friends on Facebook? Ever started to worry about all the emails that might be piling up, because you haven't been online for a few hours? Or days? Or weeks? Read more about computer dependency in this excellent post from one of my students, Christopher Nesbitt.

I once went on holiday to a very warm, overseas destination and deliberately left all my devices at home (except one mobile phone, to be used in emergencies only). I wanted to find out if I could survive without them for two weeks. For the first day or two I thought a lot about what I might be missing online. I wondered whether people had left comments on my blog. I thought about what text messages or calls I might be missing. What would people think of me if I didn't reply? I worried about the huge backlog of emails I would have to deal with when I returned home. For the first two days, this obsession was a little uncomfortable. Then gradually, I began to relax, and although I still thought about my online life periodically, I enjoyed my holiday - and we never had cold turkey for lunch.



There are several psychological theories that can be applied to help us understand the phenomenon of computer dependency. Many of them are represented in the slideshow on this page. For example, Julian Rotter's locus of control theory may explain some of the discomfort we experience when we feel we have lost control over our online lives. Lack of access to an internet-enabled device for a period of time might give you feelings of detachment from the 'flow' of the discussions and social connections you normally enjoy. When my daughter's phone was stolen, I remember her being upset to the point of feeling bereaved. She tearfully told me she had 'lost all her friends'. When I pointed out that no-one had actually died, she told me I didn't understand. All of her friends' contact details were in the memory of her phone, and she felt she had lost contact with them all. She had lost control of her social life. Find out more about your own locus of control by completing this online quiz.

Another theory relating to computer dependency is Leon Festinger's cognitive dissonance theory. When you spend far more time online than you know you should - perhaps you are engrossed in an online game, and can't stop until you reach that very difficult next level - and you know that you should be spending more time with your family, do you rationalise that you will 'make it up to them'? Do you make other excuses to justify the amount of time you spend online? According to Festinger, this is the result of cognitive dissonance, where a conflict of beliefs can be 'resolved' by a form of rationalisation - usually excuse-making that justifies doing what you know is bad for you. Addicts and gamblers do this a lot. Smoking causes lung cancer, but although it affects other smokers, it will never happen to me. I have lost a lot of money tonight on blackjack, but my next bet will win back all I have lost.

The implications of computer dependency can be quite profound. All teachers need to be aware that some children may be computer dependent, and may spend inordinate amounts of time online at home chatting or gaming when they should be getting on with their homework, or sleeping. It's a fine balance between using technology to benefit our lives, and becoming slaves to the routine and ritual of online life.

Photo by Ben Andreas Harding

Creative Commons License
Sleepless in cyberspace by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 16, 2013

Just an illusion?

The study of human perception is not easy, but it can be a lot of fun. How do we know, for example, that we all represent reality in the same way? How do I know that my perception of the colour blue is the same as yours? We can't really know for certain. Human perception has its limitations, and we can be highly suggestible. It is also absolutely fascinating, and studying the processes by which we represent reality through our senses is completely engaging. This week, some of my digital literacy students discovered just how thoroughly absorbing it can be to learn about perception. I took them on a brief tour of cognitive processes, including the human senses, memory and recall, and the representation of reality through perception. I showed how these processes connect up into cognitive architectures, through an analysis of the biological, psychological and physiological.

I showed them some optical illusions and got them to solve some lateral thinking problems. Then I demonstrated for them the most dangerous thing I have ever done in a live classroom setting. It involves a full can of baked beans and my index finger. It is quite stressful for all concerned, especially me, because if I get it wrong, I can severely damage my hand. (I might never play the guitar again!) I ask the students to decide whether what they are seeing is actually an illusion, or reality. The video of a previous demonstration is below:


I also showed them a video of the American illusionist David Copperfield making the Statue of Liberty disappear in front of a live audience. It is an astounding trick, and fools just about everyone who sees it. The video of the stunt, below, is absolutely breathtaking to watch, and full of showmanship, as you would expect from David Copperfield (no relation to the Charles Dickens character of the same name). I then told the students how it was done, and it all seemed just a little more mundane. I wasn't trying to ruin the magical effect of the illusion, merely showing them that much of life - just like all of the so-called 'magic tricks' we see on our TVs - can be illusory, and many surprising phenomena can actually be explained rationally.



What teachers do in the classroom is a little like what illusionists do. They not only perform in front of an audience; there are elements within their act that can change students' perceptions of reality. What is it to have to unlearn and relearn something? It can be difficult to accept that you have been proved wrong, when you have believed something for so long. What happens to a child's confidence when they are told they are absolutely brilliant at doing something? And what would be the result if several 'average' children were singled out as 'very bright' students and then lavished with attention and praise? The latter actually happened in an experiment conducted by the psychologists Robert Rosenthal and Lenore Jacobson in the late 1960s, subsequently known as Pygmalion in the Classroom (after the George Bernard Shaw play in which a street urchin is transformed into a socialite). The illusion was that the children were chosen at random (usually with average IQ scores) but were hailed as 'highly intelligent'. The reality was that when the researchers returned to measure progress, the singled-out children had indeed improved their IQ scores, and had risen above the rest of the class to become the highest achievers.

The effect of teachers believing that certain students are bright, whilst others are less intelligent, often results in a self-fulfilling prophecy. The representation of the illusion becomes a reality, and the children who are considered bright achieve more highly than those who have been told they are less bright. This is a salutary lesson for all teachers. Your decisions will shape the destiny of your students. When students are made to feel intelligent and capable, that is what they become. Are you changing illusions into reality?

Photo by Ian Stannard

Creative Commons License
Just an illusion? by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 11, 2013

PLN or CoP?

Type "PLN and CoP" into Google, and you're likely to be redirected to a currency conversion site (PLN is the abbreviation for Polish Zloty and COP stands for Columbian Pesos). That's quite an apt result because Google and many of the other large, supposedly 'free' social media tools are very much focused on making money to sustain their operations. But this post is not about money. Nor is it about the morality of social media companies. But it is about making connections for learning through the 'free' tools we have at our disposal - social media.

In the context of this post, PLN stands for Personal Learning Network, and CoP stands for Community of Practice. PLNs have been described as informal networks of the people one specifically interacts with inside one's personal learning environment. From a connectivist perspective, PLNs can emerge through our often random and serendipitous connections with others whom we encounter on the Web. Communities of Practice are described as groups of people who share a common interest, and can be instrumental as a network within which learning can take place because of the critical mass of contributions from the group's members.

PLNs and CoPs sound so similar, we could be forgiven for thinking that they are more or less synonymous. A quick search reveals that not a lot has been written about the juxtaposition of the two. Little if any research seems to have been conducted into a comparison between them. Are PLNs and CoPs therefore one and the same? After all, both involve learning, both represent the interactions between individuals who have similar interests, and both exhibit personalised activities positioned at the heart of a rich social context. Yet if we examine the theories behind the two concepts, we see there are some subtle differences. Let me give you my perspective:

One of the key differences I see between the two is that in PLNs, connections can be fairly random and interactions largely informal. Often there is a common ground such as a mutual interest or shared concern, but generally those who make up my PLN are a fairly ad hoc group of friends, colleagues, family and also those who have casually connected with me either through my instigation or theirs. In CoPs, connections are generally more deliberate, focused upon practice, often of a professional nature, and the interactions are focused largely upon the shared business of that community of practice.

Secondly, according to Lave and Wenger, for a CoP to exist, there needs to be a domain of expertise. The domain needs to be shared, and it needs to be formalised. A CoP is rarely a loose, informal network of friends, but instead exists as a central resource where community members learn more about their common expertise and can share, manage and disseminate their understanding for the greater benefit of the entire community. PLNs can be less focused, made up of disparate kinds of people spread across an entire spectrum of abilities, competencies and domain expertise.

Finally, CoPs are usually something you are inducted into. You work your way inwards, by way of legitimate peripheral participation, from the periphery to the core of your community of practice as you become more expert in the domain your CoP specialises in. Conversely, in your PLN, you are at the centre of the group from the outset. It is your personal network, developed by you and for you, and you decide on the membership. All the other members are potential resources that support your learning, as you develop connections with them and gain access to their knowledge.

That's my interpretation of the differences between Personal Learning Networks and Communities of Practice. How do they differ from yours?

Photo by Steve Wheeler

Creative Commons License
PLN or CoP? by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 06, 2013

The meaning of pedagogy

Yes, the Ancient Greeks used tablets!
Following on from my recent posts on praxis and the meaning of education, here are a few thoughts on pedagogy. If you ask someone what a pedagogue is, they are likely to reply 'a teacher'. One fairly limited definition of the word pedagogue is: a school teacher. Another less kind definition suggests that pedagogues are people who instruct in a dogmatic or pedantic manner. We seem to have many views on the nature of pedagogy and how it is conducted. Unfortunately, these often lead to confusion. To gain a clear understanding of pedagogy, we first need to examine the origin - the etymology - of the word.

The word pedagogy has its roots in Ancient Greece. Rich families in Ancient Greece would have many servants (often slaves), one of whom would be specifically tasked to look after the children. Often these slaves would lead or escort the children to the place of education. The Greek word for child (usually a boy) is pais (the stem of which is 'paid'), and the word for leader is agogos - so a paid-agogos, or pedagogue, was literally a leader of children. Later, the word pedagogue became synonymous with the teaching of our young. Taken in this context, we would probably all agree that pedagogy is about children's education. And yet this confines us to a very limited understanding of what pedagogy is, or has the potential to become.

If we take the principle of 'leading or guiding someone to education' (which in my last post I identified as deriving from the Latin word educere - 'to draw out from within'), then we open up a whole new world of possibilities for learning. It's a well known aphorism - teachers teach, but educators reach - and also a principle that is at the very heart of true pedagogy. True pedagogy is far more than someone instructing. Pedagogy is leading people to a place where they can learn for themselves. It is about creating environments and situations where people can draw out from within themselves, and hone the abilities they already have, to create their own knowledge, interpret the world in their own unique ways, and ultimately realise their full potential as human beings. It's certainly not about absolutes, but is more likely to be about uncertainties. Good pedagogy is about guiding students to learning. It's about posing challenges, asking the right questions, and presenting relevant problems for learners to explore, answer and solve. True pedagogy is where educators transport their students to a place where they will be amazed by the wonders of the world they live within.

As the ancient Greek philosopher Socrates once said: 'Wisdom begins in wonder.'

Photo from Wikimedia Commons

Creative Commons License
The meaning of pedagogy by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 02, 2013

Hey, teacher, leave them kids alone!

We use a lot of words in education without really understanding what they mean. Take the word 'education', for example. Education is often associated with schooling, but to assume that the two are one and the same would be a serious error. When Pink Floyd sang 'we don't need no education', what they really should have said was 'we don't need no schooling' (although it wouldn't have fitted the tune quite as well). Education, if experienced in its pure form, is liberating, mind-expanding, essential. Often, schooling fails to do that for children. School for many is about uniformity, standardisation and synchronisation of behaviour. Schooling is the industrial process children are put through by the state to ensure they become compliant with authority, inculcated into the skills of reading, writing and numeracy, and systematically instructed (and then tested) about the world around them. They are batch processed by age, their behaviour is managed, their performance is scrutinised, and there is little time for self expression. One size has to fit all.

This is not education. It's indoctrination. A closer examination of the origins of the word 'education' reveals that it comes from the Latin word educere, which means 'to draw out' or 'to lead from within'. What does this mean? If you are a teacher, you will know that you can either instruct from the front, or you can take a back seat and create opportunities for your students to learn for themselves. It's a choice each teacher makes, and over a period of time, it has consequences. The Swiss psychologist Jean Piaget once declared:

"Each time one prematurely teaches a child something he could have discovered for himself, that child is kept from inventing it and consequently from understanding it completely." (1)

To draw out a child from within themselves, we must first accept that the child has something within them to give. Every child has something unique to offer. Each has skills, abilities, knowledge, hopes, aspirations and an individual personality that can be nurtured, allowed to blossom, encouraged. Teachers who ignore this will not only fail to 'draw out' those individual attributes, they will also deprive children of a wonderful spectrum of opportunities to learn for themselves.

Whether children learn for themselves, or are instructed, depends on each teacher's personal philosophy on education. Does education for them mean schooling, or 'drawing out from within'? Most teachers probably take the middle ground and oscillate between instruction and facilitation of learning. Yet if they are honest, most teachers will admit they default to the instructional mode when they need to control behaviour, or 'get through' the content of the lesson.

Here's the bottom line: in its purest form, education is about drawing out the learner from within themselves, giving them space to express themselves, explore and play, ask the 'what if?' questions and learn in their own style and at their own pace. State funded schooling cannot and will not provide the flexibility for this kind of education to be realised. Friedrich Nietzsche once said: 'In large states education will always be mediocre, for the same reason that in large kitchens the cooking is usually bad.' The best we can hope for within the present industrial school system is that each teacher will be agile enough to interpret the curriculum that is imposed upon them in ways that offer children enough latitude to learn for themselves. A question all educators need to ask is: are we keeping them inside themselves, or are we drawing them out from within?

Next post: The meaning of Pedagogy

Reference
(1) Jean Piaget, quoted in the Early Years Development Framework for Child Care Centres, Ministry of Community Development, Youth & Sports, Republic of Singapore, 2011, p 9.

Photo from Wikimedia Commons

Creative Commons License
Hey, teacher, leave them kids alone! by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 31, 2013

Praxis makes perfect

Praxis is not as commonly referred to in the educational field as it should be. It is a poorly understood concept, and not particularly well researched either. And yet praxis is (or should be) at the very heart of what we do, and who we aspire to be, as educators. What is praxis? My explanation is that praxis is at the nexus - the overlap - between theory and practice. It's the sweet spot of education in action. Praxis is the essence of what happens when theory is applied to practice, and can be simplified in this Venn diagram. But there is a lot more to understand about praxis.

My colleague Oliver Quinlan wrote a very thoughtful post about praxis. He argued that the theoretical models we learn, and the skills we acquire as teachers, are inextricably entwined. They influence each other and, in effect, become a part of who we are - our identity as educators. He writes:

"...your theoretical framework influences your practice, but your experience in the classroom also continues to shape your framework; the two are not separate."

Others have also written eloquently about praxis. The Brazilian educator and theorist Paulo Freire for example, defined the gaining of praxis as a means to emerge from oppression and ignorance:

"One of the gravest obstacles to the achievement of liberation is that oppressive reality absorbs those within it and thereby acts to submerge human beings' consciousness. Functionally, oppression is domesticating. To no longer be prey to its force, one must emerge from it and turn upon it. This can be done only by means of the praxis: reflection and action upon the world in order to transform it." (Freire, 1970: 33).

Freire is concerned with liberty from oppression. This oppression takes the form of ignorance as much as it does chains, or prison bars, or the walls of a ghetto. He is saying that praxis gives us the awareness, or consciousness, of where we are - a realisation of the predicament we are in. It is an awakening to reality, and a call to action to do something about it. Knowing, and then doing something based on that knowledge, is a powerful response. But it's not as simple as that. Consider the following passage:

"We can now see the full quality of praxis. It is not simply action based on reflection. It is action which embodies certain qualities. These include a commitment to human well being and the search for truth, and respect for others. It is the action of people who are free, who are able to act for themselves. Moreover, praxis is always risky. It requires that a person 'makes a wise and prudent practical judgement about how to act in this situation' (Carr and Kemmis 1986: 190)."

Theory without action is just theory. Hot air. Action without theory can be just as hollow. How can you justify your actions and decisions in the classroom if you have no theory to support you? The best equipped teachers are those who are best informed. The best way to use theory is to test it out in practice. The most effective teachers are those who not only innovate in their practice, but also know how to justify their actions through the application of appropriate theory. Praxis is the contextualisation of theory within action. It can, and should, pervade every aspect of our professional practice and our identity as educators. It's time to stop thinking about theory and practice as separate concepts. It's time teachers began to meld the two together, so that thinking and action - theory and practice - combine to enable us to create, develop and maintain the best possible learning environments for our students. That's how important praxis is.

References
Carr, W. and Kemmis, S. (1986) Becoming Critical. Education, knowledge and action research, Lewes: Falmer Press.
Freire, P. (1970) Pedagogy of the Oppressed. London: Penguin Books.

Photo by Steve Wheeler

Creative Commons License
Praxis makes perfect by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 28, 2013

Global conversations

Do you remember the days before the Web? I do. I'm talking about the period just prior to 1995. We would converse on a one-to-one basis using telephones (almost always tethered, because mobile phones were rare before 1990, and the few that existed were the size of a house brick). We sent typed paper memos to each other via external post or internal mail services, and we arranged face-to-face meetings that were frequent, tedious and time-consuming. That was the way we got things done in the 1980s, and for the early part of the 1990s too. And then the technological innovations began to appear rapidly, one after another.

If someone from the 1980s were suddenly transported to today's world, what a difference they would see! Communication has been transformed beyond recognition. E-mail is already embedded in the culture of most organisations, and is used by everyone without much thought. Mobile phones are also commonplace, giving us the ability to connect with anyone, just about anywhere. What's more, many people stare down at their mobiles rather than holding them to their ear, which means that texting, too, has become normal practice, as has our use of touch screens, social media and search engines. A lot has changed in less than two decades.

My wife and I were having a conversation with one of our daughters last night, via Skype, on an iPhone. We could see and hear each other perfectly, with no degradation of audio or video. It was as clear as watching the television, but it was there in the palm of our hands, and we were mobile. For most of us, full-motion video conferencing on a small touch-screen device was science fiction 10 years ago. It's amazing to me how much we already take for granted, but probably the thing we take for granted most is our ability to have multiple, instantaneous and synchronous global conversations.

Consider how easy it is to have simultaneous conversations with several other people using e-mail, Twitter or Facebook. Think how easy it is to video conference using Google Hangouts or Skype. We already take these for granted, but they are the culmination of many years of technological evolution and convergence, resulting in devices that make communication across any distance a seamless experience. But what can we do with this ability that will transform education? How can these tools be harnessed for the benefit of our learners?

For me, the most important aspect of any global communication capability is the conversations we can have, sometimes at the drop of a hat. We can all learn a lot from each other, and the technology we have at our disposal can support that learning much more quickly than anything we could do in the past. Learning through conversation can involve many things: an exchange of ideas and views, discussion and argument, discursive activity resulting in the negotiation of meaning, reordering and repurposing of content, and the consensual organisation of knowledge. Such facets of global conversations not only enable us to connect with our peers worldwide to learn from each other; they can also facilitate exchanges that build bridges across language, cultural, ethnic and religious differences, political and social divides, and gulfs of historical misunderstanding.

Now these are the kinds of global conversations we simply cannot do without.

Photo by Stephen Janofsky

Creative Commons License
Global conversations by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 27, 2013

Games for girls

I'm a big advocate of student publications. Many students have some great ideas to share, and we encourage student blogging very strongly in Plymouth. There is nothing to stop students going further and publishing their work in mainstream journals - if their work is good enough, it should be shared widely. It's also very motivating for them. Some of my previous students have published in journals in the past few years. Check out this little gem from Dan Kennedy on the VLE/PLE debate. I'm therefore very pleased that we have another success. A third-year research assignment by one of my students, Lucy Kitching (which I subsequently collaborated on, helping her rewrite it for publication), has appeared in the current issue of the prestigious and highly accessed open access journal European Journal of Open, Distance and E-Learning. Congratulations Lucy! Here is the title and abstract:

Playing Games: Do Game Consoles have a Positive Impact on Girls’ Learning Outcomes and Motivation?

Games based learning is currently a hotly debated topic in education and is a fertile field of study (Holmes, 2011; Abrams, 2009). Many schools are exploring ways in which games can be embedded into the curriculum, to enhance learning through deeper engagement and higher levels of motivation (Miller and Robertson, 2010). This paper explores the use of game consoles to support learning for young students (ages 8-11) and evaluates their recent success in primary education. Over time game consoles and video games have been portrayed as a male oriented technology. This research investigated the current use of game consoles in learning and how it might positively affect a child’s learning and motivation, but focused solely on female students’ experiences. In the study we investigated the research question: ‘Do game consoles have a positive impact on girls’ learning and motivation?’ A semi-structured questionnaire was distributed to girls in Key Stage 2 (n=49) across three schools that have already incorporated game consoles into their curriculum. The study found that game consoles and video games can have a positive impact on girls’ learning and motivation and are key themes that have been raised by teachers. However, due to several limitations in this research some issues were not fully addressed, and we identify some future areas for research.

More student-led research projects are in the pipeline for publication in the coming months. Read the complete article at this link.

Related Links
What is it about games?
The games we play

Photo courtesy of U.S. Navy

Creative Commons License
Games for girls by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 26, 2013

Get it together

What can we learn from digital curation of content? Let's start with some theory: in Anderson and Krathwohl's revision of Bloom's cognitive taxonomy, 'creating' is suggested as the peak of achievement. It replaces evaluation at the pinnacle of the revised model, and many have wondered why Anderson and Krathwohl made the change. Why swap evaluation and synthesis in the taxonomy? I wrote about this question recently, and critiqued the new model in the context of emergent forms of digital learning. But all discussions need a reference point, a starting place from which the arguments can proceed. If we accept this premise, then it can be argued that activities such as curation should be placed at the apex, at this 'creating' level.

When you curate, you are actively seeking content, but you are also creating, organising and adding value to the content you have found. You may also have dialogue with your personal learning network as you discuss that content. During curation, you are synthesising content, concepts and contexts from disparate sources, and uniting them in one place. You are creating a shop window for that content through synthesis. 'Synthesis' in the old model is replaced by 'creating' in the new, revised Bloom model.
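
To make the renaming and repositioning easier to see, here is a trivial side-by-side comparison of the two hierarchies, ordered from lowest to highest level. The level names are the standard ones; the throwaway script is just my own illustration:

    # Bloom's original (1956) hierarchy and the Anderson and Krathwohl
    # (2001) revision, ordered from lowest to highest cognitive level.
    BLOOM_1956 = ["Knowledge", "Comprehension", "Application",
                  "Analysis", "Synthesis", "Evaluation"]
    REVISED_2001 = ["Remembering", "Understanding", "Applying",
                    "Analysing", "Evaluating", "Creating"]

    # 'Synthesis' is renamed 'Creating' and moves above 'Evaluating'.
    for old, new in zip(BLOOM_1956, REVISED_2001):
        print(f"{old:<13} -> {new}")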

Let's look at this from the beginning: at the start of the process, organising the content is not the most important task. Simply finding it, and making sure that it is accurate and in the correct category, is enough. This is a fairly low-level cognitive process, but it does require some discernment and decision-making ability.

The second stage, integrating your content within your repository, relies on a similar level of decision-making. Where is it best placed? The default is to place your most recently discovered piece of content at the top of the stack, and indeed in most cases that is where it sits. However, if you want a more defined display of content, sometimes you have to deliberately place it within a chronological, historical, cultural or alphabetical sequence. Some tools, such as Storify, naturally sequence content chronologically. Others, such as Scoop.it, provide the flexibility to promote or demote content.

The third stage, also a choice for the curator, is to add extra value to the content already within your repository. You could add notes (annotations) or highlight sections with colour, for example. Diigo is a tool that offers these options. Learning often emerges from the writing, rewriting and editing of this content.

Finally, aware of the social context, you can share your content (or indeed your entire repository) with your professional learning network. The dialogue that ensues can in itself be quite powerful but unpredictable, because no one can be sure which direction the conversation will take, or what conclusions might be drawn.
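
To draw the four stages together, here is a minimal sketch of how a curated collection might be modelled in code. Every name in it (CuratedItem, Repository, promote and so on) is hypothetical and invented purely for illustration; none of the tools mentioned above expose an API like this:

    from dataclasses import dataclass, field

    @dataclass
    class CuratedItem:
        # Stage one: find the content and place it in the correct category.
        url: str
        category: str
        # Stage three: notes and highlights that add value to the item.
        annotations: list = field(default_factory=list)

    @dataclass
    class Repository:
        items: list = field(default_factory=list)

        def add(self, item):
            # Stage two: by default the newest item sits at the top of
            # the stack, as it does in most curation tools.
            self.items.insert(0, item)

        def promote(self, item):
            # ...but the curator can deliberately re-sequence content.
            self.items.remove(item)
            self.items.insert(0, item)

        def annotate(self, item, note):
            item.annotations.append(note)

        def share(self):
            # Stage four: expose the collection to a learning network.
            return [f"{i.url} ({i.category}): {'; '.join(i.annotations)}"
                    for i in self.items]

The learning, of course, lives in the judgements the curator makes at each of these calls, not in the mechanics themselves.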

All of the above components demand specific abilities and skills from the curator. Some stages rely more heavily on critical judgement than others, but all of them, as parts of a process, offer learning possibilities. It's not difficult to see why curation is becoming a very popular knowledge management activity, and with the recent introduction of ready-to-use tools, it has never been easier. It is up to us, the users, to organise content on the Web, and we learn while we do it. It may look simple, and anyone can do it, but don't be deceived. Done extensively, and at the highest level, curation of content can be a complex and deeply engaging process, providing rich learning opportunities for curator and readers alike.

Photo by Julia Frost

Creative Commons License
Get it together by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 24, 2013

Courses, or learning episodes?


Delegates take pics during Andrew Jacob's Hero Story

During a recent Learning Pool Live event in London, I posted the following message on Twitter:

Move away from courses, towards events, experiences, challenges.

It was the result of a conversation and some thoughts during one of the presentations. It was retweeted several times, and one or two people asked for clarification. It's difficult to say very much in 140 characters, so here is a brief blog post to explain the thinking behind that tweet.

We were discussing a range of digital provisions for learning and development in the corporate world. It's universally accepted that, in the current economic climate, stringency and cost-cutting are hitting the training budgets of just about every organisation. Courses can be very expensive to deliver. They also take a lot of time to develop, and as Don Taylor recently wrote, organisational training suffers from a bad reputation. The perception of training courses delivered didactically in a classroom, or via e-learning as a 'just in case' provision, is far from the ideal of human development many companies wish to aim for. What can the learning professional do to ameliorate this situation without compromising the integrity of the learning and development offered in their company? Someone at the conference showed a cartoon depicting two managers discussing the training budget. One complains about the cost and asks: 'What if we train them and they leave?' The other counters: 'What if we don't and they stay?' Clearly, eliminating training is not an option, but modifying the offer might provide some solutions. That idea was reflected in my tweet.

It was argued that 'full course' delivery was no longer a viable option for many organisations. This was not simply because of cost, but also because of a lack of efficacy. Compliance training, for example, is routinely presented as a short course, made up of a sequence of information presented as electronic page-turning. The prevalent format is for learners to read the content, occasionally answer multiple-choice questions to check their understanding, and then conclude with a summary and a final test of memory. The 'next page' button is a constant feature of this kind of e-learning course, and is hated with a passion by many employees. Not a great deal is remembered from these training packages, and they are completed in a perfunctory manner with little thought about the meaning of the content. This is not just because they are presented in rather uninteresting packaging; it is also because the learning is fairly passive.

Just what are the options? Can we do better than the course? Some might argue that events, experiences and challenges (which I call 'episodes') are all components of courses. True, and there's the rub. What would stop organisations from extracting these from courses so that they become stand-alone learning activities, or learning 'episodes'? Nothing at all, and some companies are starting to do just that. The bite-sized learning experience is sometimes all that is needed to raise productivity, raise awareness or improve safety in the workplace. Such disaggregation of learning content also gives learners a greater choice of learning and development possibilities, where smaller and more focused experiences take less time to complete away from the job, and 'just enough' learning is achieved. Such bite-sized learning could also be pushed directly to employees' smart devices if the company wished.

Often, goes the argument, courses simply contain too much content (harking back to the 'just in case' curriculum Don Taylor talks about), much of which is not needed at that point in time. Presenting a menu of activities, including challenges, quizzes, problems, experiences and other learning 'episodes', does not preclude learners from eventually completing 'courses'. It simply means they can take their time, at their own pace, to accrue a portfolio or gain an open badge recording their achievements, whilst their learning is delivered at the point of need. Learners direct their own decision-making, choosing from the menu exactly what they require as they work, and over time they gain accreditation if it is required. Building more challenges and problems into the events would also encourage more active and deeper forms of learning.
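
As a thought experiment, the disaggregated model might look something like the sketch below. The names (Episode, Learner, earns_badge) are mine, invented purely to illustrate the granularity argument; this is not a description of any real system:

    from dataclasses import dataclass, field

    @dataclass
    class Episode:
        # A stand-alone learning activity extracted from a course:
        # a challenge, quiz, problem or experience.
        title: str
        kind: str
        minutes: int  # 'just enough' learning: a small time commitment

    @dataclass
    class Learner:
        name: str
        completed: list = field(default_factory=list)

        def complete(self, episode):
            # Episodes are taken at the point of need, in any sequence.
            self.completed.append(episode)

        def earns_badge(self, required_titles):
            # Accreditation accrues over time, once all the required
            # episodes are done - regardless of the order they were taken in.
            return set(required_titles) <= {e.title for e in self.completed}

A learner could then pick off, say, a fire safety challenge and a data protection quiz as the need arises, and still accrue the same badge a formal course would have conferred.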

I therefore suggest that learning episodes, rather than courses, could be the way forward for 'just in time' and 'just enough' learning that is personalised and delivered at the point of need. Ultimately, it's a matter of granularity: the idea is to make all of the components of a course available separately, in any sequence, and deliverable on any platform. Such flexibility is now both achievable and desirable. But how many organisations have the vision to make it happen?

Photo by Paul Clarke

Creative Commons License
Courses, or learning episodes? by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 18, 2013

Mobile learning and blended interaction

Veteran education theorist Michael G. Moore once wrote about three types of interaction. Learners interact, he said, with content, with their teachers and with each other. Other theorists subsequently expanded on this interactional triumvirate. Leslie Moller suggested a fourth kind of interaction - interaction with the interface. His proposal reflected not only the proliferation of computer technologies but also a growing interest in Human-Computer Interaction (HCI) and cognitive science.

The advent of mobile communication has expanded this taxonomy still further. In this post I explore how the use of mobile (cell) phones is liberating learners to interact in many new ways and in many different contexts. These are initial thoughts, and I would value your comments in shaping them into something more fully formed.

Consider the benefits of learning while on the move. Once, this could only be achieved using books. In previous posts I have argued that personal, handheld technologies such as smart phones, e-readers, tablet computers and games consoles enable mobile learning at the pace of the individual, in any place and at any time. Let's assume for the moment that we can connect to the Web from wherever we are, and that everyone has a mobile device (this is far from reality, but humour me). This would represent a paradigm shift for education and a personalised learning revolution for every student.

Learners would not only be able to learn whilst traversing any environment; they would experience continuous, seamless delivery of content, interaction with their tutors and connections with their fellow students - interpersonal interaction. They would also be able to interact with their environment and the objects within it, known as extrapersonal interaction, and with objects within their personal space, such as the interface of their device - peripersonal interaction. What is inevitable is intrapersonal interaction. This happens in all learning contexts, because it is the internal dialogue students have with themselves as they assimilate knowledge, reason, analyse, evaluate and reflect on their experiences. The difference here, though, is that mobile learners will be in a place of their own choosing, and will continue the internal self-talk whilst in total and perpetual contact with others. We can speculate that this internal interaction has the potential to be amplified through the mobile device to the network of others, across multiple interactions. What I am arguing is that the power of thinking (intrapersonal) can be amplified across the network, provoking dialogue (interpersonal), while each member of that network interacts with their devices (peripersonal) and environments (extrapersonal). What's more, I believe that when using mobile devices, these multiple interactions can be both blended and simultaneous.
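
To summarise the taxonomy, here is a small sketch of the four modes as data, with one hypothetical moment of mobile learning blending all of them at once. The enum and the example are mine, for illustration only:

    from enum import Enum, auto

    class Interaction(Enum):
        INTRAPERSONAL = auto()  # internal dialogue: reasoning, reflection
        INTERPERSONAL = auto()  # dialogue with tutors and fellow learners
        PERIPERSONAL = auto()   # the device interface within personal space
        EXTRAPERSONAL = auto()  # the surrounding environment and its objects

    # One blended, simultaneous moment: a learner photographs a landmark
    # (extrapersonal) on a touch screen (peripersonal), shares it with
    # peers (interpersonal) and reflects on what it means (intrapersonal).
    moment = {Interaction.EXTRAPERSONAL, Interaction.PERIPERSONAL,
              Interaction.INTERPERSONAL, Interaction.INTRAPERSONAL}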

I should also add that the advent of the smart phone brings with it the ability to transcend many of the previously insurmountable barriers to good interpersonal communication, including language and distance. There's a mobile app for everything, or if not, there soon will be. We are only just beginning to appreciate and comprehend the disruptive and transformational potential the mobile phone brings to learning.

Graphic by Steve Wheeler

Creative Commons License
Mobile learning and blended interaction by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

