
November 28, 2012

Authentic learning

In his 1970 book Deschooling Society, the radical philosopher Ivan Illich wrote: 'Most learning is not the result of instruction. It is rather the result of unhampered participation in a meaningful setting. Most people learn best by being "with it," yet school makes them identify their personal, cognitive growth with elaborate planning and manipulation.' 

This is a real challenge to many schools. Some of the most effective learning methods involve students doing and making, problem solving, and playing games, all of which comply with the notion of being in a meaningful setting. This kind of situated learning is powerful because it immerses students in contexts that are authentic. Medical students learn through problem-based learning, often a complex situated form of education that places them in the role of decision-maker. Pilots do a lot of their training in simulators, where 'real life' problems and challenges can be presented to them, and to which they must respond. This kind of learning, according to Jean Lave (1988), is powerful because it is rooted in context, and avoids much of the abstract nature of content that is delivered traditionally. Brown, Collins and Duguid (1989) agree, believing that authentic learning contexts are vitally important if students are to acquire and develop cognitive skills that are transferable to real-world living. 

So how do we bring these powerful ideas into school classrooms? Often, we see children bored or demotivated because they are presented with content that is abstract and meaningless, or without a specific context or 'situatedness'. It's not all bad news though. There is evidence that some schools are beginning to adopt authentic learning methods. Saltash.net, a school near my home, managed to get around this issue by placing children in situations where they had to use tools and techniques to solve real-life problems. In their small working farm located within the grounds of the school, they kept chickens, pigs and goats. The children took turns managing the farm, and were often required to purchase food for the animals, or sell eggs at the market. To do this they needed to know how a market operates, and had to understand concepts such as supply and demand, profit and loss, sell-by dates, and so on. Teaching them how to use an Excel spreadsheet would have been dull had it been confined to the four walls of a classroom or ICT suite. Taking this skill outside, and putting them in a position where they had to learn by applying spreadsheets to the problems of buying corn, selling eggs at a good price and maintaining records, placed their learning within a meaningful setting. There are endless examples of situated learning in a school near you. 

In one American school I visited, teachers chose two students each day who were tasked to edit and present the following day's morning news programme on school radio. All of the children took it in turns to be the morning DJs and news presenters, and their responsibility was to make sure their school was kept up to date on current affairs through their research, editing, filtering and presentation. Many schools in the UK are adopting the School Radio approach too, and children are relishing the challenge of informing their classmates and teachers, deciding on music playlists, reporting on weather and sport, while acquiring authentic critical, organisational and reflective skills. This is learning by stealth, and it is incredibly powerful.

Ultimately, it is the teacher's role to create learning contexts that support authentic learning. If teachers can situate learning in meaningful contexts and real life (or realistic) settings, not only will students become more motivated, they will also acquire authentic transferable skills that they can call upon for the rest of their lives.

References
Brown, J.S., Collins, A. and Duguid, P. (1989) Situated cognition and the culture of learning. Educational Researcher, 18(1), 32-42.
Illich, I. (1970) Deschooling Society. London: Marion Boyars Ltd.
Lave, J. (1988) Cognition in Practice: Mind, Mathematics and Culture in Everyday Life. Cambridge: Cambridge University Press.

Photo by Cobalt123

Authentic learning by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 26, 2012

Parabolic learning

Reflection and Amplification
Now that I have some time, I can sit down and reflect on an extraordinary two-hour session with my BA Education Studies students this morning. They are only a small group of a dozen students, but over the last few months, my elearning module group has created a very large amount of content, including blogs, wiki pages and videos. The group wiki is here if anyone wishes to view some of their content. We have previously explored a number of learning theories, new learning technologies, concepts around crowdsourcing, wisdom of crowds, folksonomies and user generated content, Web 2.0, mobile learning and a whole host of other themes during the course.

Today was different, because normally I prepare thoroughly for the sessions. Today, I took the risk of going into the room with just a germ of an idea to see how it would develop. That germ of an idea evolved over the course of the two-hour session into something beyond anything I could ever have planned. It proves to me that sometimes spontaneity can pay dividends. The incorporation of a number of social media tools into the mix proved to be an amazing platform from which the students and I could reflect on the process of learning, and amplify our ideas to each other and the world.

I started the session with the aim of encouraging the group to learn deeply and critically about a particular topic - MOOCs (Massive Open Online Courses). I asked them to prepare for a debate next week, and put up the slide: 'This house believes that MOOCs will signal the demise of campus based higher education'. I then divided the students randomly into two teams, one arguing for the motion, and the other arguing against. I asked the members of the two teams to research their arguments, with supporting evidence, and blog their ideas in preparation for next week's debate.

As a parallel strategy, I asked two students to act as content curators. Their task would be to create a new wiki page and begin to populate it with resources related to MOOCs. This would act as a baseline of reference materials for the two sides to incorporate into their arguments, but it would also mean that the two students would need to investigate both sides of the argument and post content related to the discourse around MOOCs.

I then tweeted (and encouraged the students to do the same) a few messages to the online educator community to ask them their views on the question of whether MOOCs would eventually replace traditional forms of education. This kind of crowdsourcing activity is always a risk and quite unpredictable, because you never know who will respond (if anyone) or what they will say. I added the hashtag #moocplym for good measure so we could track the conversation across the community. Next, I projected Twitterfall and VisibleTweets live backchannel feeds of responses on the large screen at the front of the classroom. Another task then came the way of the curator team. Their next challenge was to create an archive of all the tweets, blogs, and other content related to the hashtag #moocplym and maintain a chronological record throughout the week using Storify or some similar curation tool.

Over the coming week, the two teams (with the curation team in attendance) will therefore explore the history, culture, technology and pedagogy of MOOCs, a topic with which they are not particularly familiar. They will critically analyse the discourse surrounding MOOCs, create and share content on their learning, and reflect on it. Their ideas and associated content will be presented and amplified through the social media channels, and the ultimate act will be the debate, followed by a discussion of the entire process from start to finish. There will be a lot to talk about if it all goes according to plan. Oh, and why did I title this post parabolic learning? Because a parabolic reflector collects energy, focuses and transforms it, and then reflects it back with greater intensity. That's exactly what I want my students to do.

First image source
Second image by Steve Wheeler

Parabolic learning: reflection and amplification by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 25, 2012

Making a difference

Many times I've heard it said that there is no evidence that technology improves learning. This is a vacuous claim, based on ignorance of the research literature, and possibly borne out of a fear or dislike of technology in general. My usual retort to such a claim is that children with special educational needs are a classic example of technology improving learning. For children with special needs, especially those with sensory or physical impairments such as deafness or visual impairment, technology not only improves learning, it actually enables learning. Without adaptive technology, many disabled children could not access certain types of education. But there is a mass of evidence to show that technology is not only making a difference for all learners, it is actually creating new and previously unattainable opportunities for learning. Technology does make a difference.

A recent research study at Durham University in the North East of England suggests that multi-touch, multi-user surfaces can improve the learning of mathematics. Some 400 children were involved in the study, which demonstrated that 'smart tables' enabled better collaboration and problem solving during maths lessons. Class teachers receive a live feed of output from the children's interactions on the surface, and can intervene when necessary. The research has shown that the touch surfaces enable children to discover a range of alternative solutions to maths problems, simply through interacting with each other in new ways.

Image source

Making a difference by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 23, 2012

Are QR codes redundant?

It seems only a short while since we first became aware of Quick Response (QR) codes. In fact, they have been around since 1994, and were originally created to enable the Japanese car manufacturing giant Toyota to track its vehicles during the manufacturing process. Now QR tags are just about everywhere you look, including advertising hoardings, buses and trains, magazines and even coins. They are essentially two-dimensional barcodes that you can scan using your mobile device. The beauty (if you can call it that) of the QR tag is that it will quickly take your mobile device browser to a website with no more effort than a button click. But as many users will tell you, scanning a QR code can be a little hit and miss.

QR codes have polarised the education community over their usefulness. Some argue that they have no real use beyond faddishness and 'wow' factor, whilst other educators are forging ahead, developing ideas for their pedagogical use. Slowly over the last few years, educational uses have begun to emerge, with some pedagogical applications already being tried out in authentic contexts. And yet, even while QR codes in education are still in their emergent state, questions are being asked about their future, and whether they have already become redundant.

Enter Blippar, an augmented reality tool that is hailed as the QR killer. Apparently it can do everything QR codes can do, but a whole lot more too. I first heard of Blippar when I picked up the November 2012 issue of ShortList magazine, currently the most widely circulated free men's lifestyle magazine in the UK. The banner headline read 'Special Interactive Gaming Issue', which immediately piqued my interest. From cover to cover, the magazine's features, articles, adverts and editorial are all marked with a small yellow 'Scan this page for more' symbol. Using the downloadable app from Blippar, readers can capture the image of the page, which takes them to an interactive website or gaming application. Blippar's managing director Jessica Butcher is fairly triumphant about what she naturally considers to be the advantages of Blippar over QR tags, declaring 'Rather than adding an ugly black and white pixellated box to an ad creative, Blippar can take the creative itself (the whole poster, a logo, the product itself) as the trigger for an interactive engagement.'

She has a point. We certainly wouldn't wish to ruin the aesthetics of adverts, would we? Seriously, I have always thought QR tags to be a little ugly in their appearance. The Blippar app is designed to recognise an image from almost any angle, at a distance, and even in poor light conditions, depending on the quality of your mobile device camera. This makes it a whole lot more reliable than scanning a QR tag, in my experience at least. Just like QR codes, Blippar can also recognise where the user is geographically through the GPS system on the mobile device they are using. For advertisers this is a distinct advantage, but I can also see many educational uses for these features.

Ultimately, those who are speculating on the future of paper-based resources might like to consider Blippar and other similar data capture augmented reality tools. The future is likely to see a combination of paper-based books and e-books, or more likely a hybrid of paper-based and AR-enabled products, designed to function together with the user's mobile device, working in concert to provide students with interactive learning experiences wherever they are. Paper is not dead yet. It's just become enhanced.

Read the full article here: Can Blippar make QR codes redundant?

Image by Steve Wheeler

Are QR codes redundant? by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 21, 2012

Teaching artistry

I taught my first art lesson today. Ever. Passing colleagues were a little surprised to see me teaching in the art room, completely out of context. Normally I'm found teaching a session on educational theory or psychology, or information and communication technologies. Teaching an art lesson is therefore a little outside my comfort zone. And yet, earlier today, I found myself surrounded by students with easels, wielding pencils, as we conducted a drawing class.

The drawing session was part of our BA degree in Education Studies, within the module we were teaching - 'Creativity in Education' - which encourages students to explore, through embodied practice, the theoretical and practical relationships between education and creativity. Throughout the year we will be exploring creativity through a range of activities, including dance, photography, video, music, and art. During the module the students will be asked to keep a reflective blog or video diary. At the end of the module they will present their work as a creative portfolio, and the final session will see a public performance of their work. Many of the sessions will involve some aspect of learning by making, a powerful pedagogical method also known as constructionism.

I say the drawing session was outside my normal comfort zone, because it is quite a departure from my normal teaching topics. And yet those who know me will recall that when I was younger I studied fine art and graphic design for a couple of years at Hereford College of Art. I have never stopped being an artist. Whether painting a watercolour landscape (my favourite medium) or making a new slideshow for a talk, I always try to portray my ideas creatively, in a manner that is pleasing to the eye. Although I have never given an art lesson before, it seemed fairly natural to me to do so now. With the students we explored a range of drawing activities, from conventional still life drawing, through to speed drawing, where the objects were constantly changing. Of particular interest to me, as always, was the conversation I had with the students as we were working. Many also admitted to being outside their comfort zones as they participated in the drawing exercises, because they professed no skill or expertise in art. Their willingness to engage spoke volumes, because ultimately, the session was not about learning how to draw, but about learning to appreciate how creativity can be applied to classroom layout, curriculum design and teaching. One aim of the module is to encourage students to think creatively about education, using their imagination, and exploring a variety of perspectives on how creativity can be unleashed in the current school systems.

Most of us would acknowledge that teaching is an art as well as a science. There is a certain artistry that educators need to acquire and practise if they want success in the classroom. Teaching is a performance, and those who are creative are constantly able to reinvent lessons, resources and spaces. Creative teachers tend not to worry too much about barriers or constraints, but are constantly seeking solutions and new ways to do things, to improve and enhance learning. Too often, teachers and learners are constrained by their environment, time, school culture, legislation or simply not having access to appropriate resources. Probably the worst barrier to good teaching and learning, though, turns out to be a lack of imagination.

"Anything can make you look, but only art can make you see."

Image source

Teaching artistry by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 16, 2012

Next generation learning

In my previous blog post, the architecture of learning, I outlined some of the key characteristics of learning in a digital age, and started to identify some of the main differences between Learning 1.0 (before social media) and Learning 2.0. In the summary of the article, I suggested that the distinct differences between the two types of learning are mostly based on how learners are changing the ways they interact, and their increased ability to create, share and organise their own learning. Learning 2.0 is socially much richer and more participatory, and relies more on interaction with other learners than any previous learning approach. This change has been realised through access to inexpensive internet tools that offer easy ways to connect with others of similar interest. There is a growing understanding that it's not so much what you know anymore, but who you know. No longer is the computer your only mind tool and extension of your memory - now you can also call on everyone else in the world. Social media are enabling learners everywhere to connect and work together with each other, forming convenient communities and networks of shared interest. The full power of the Learning 2.0 approach has yet to be realised, but already we are seeing radical shifts in the way learning is conducted. I also argued that if we view sequenced versions of the Web, based on the way learners use it, we will inevitably have to think of Learning 3.0, and beyond. This led me to think about what we might see in the future of learning, based on present trends, and our anticipation of what new technologies and approaches we think are on the horizon. So here we go - Learning 3.0...

Learning 3.0, if we are to believe all the hype, will be located within a semantic-based architecture of webs - a 'meta-web'. I see it arising partly from what is happening on the web right now, but also as a result of new intelligent filtering tools. Increasingly, as users contribute to the content, links and pathways of the social web, it will become more 'intelligent', and will recommend to its users the best ways to find what they are looking for. It will also recommend things that users don't know they need yet, predicting their 'needs' based on their previous behaviour and choices. Learning 3.0 will see learners using sophisticated new web tools that are intricately connected to each other, are context aware, and are accessed through intuitive and natural interfaces. Here we begin to think not only of voice-activated, gesture-controlled interfaces, but we also need to start considering biometric recognition systems such as retinal scanning, facial recognition and even directly implanted devices that allow us to control our devices merely by thinking (see the table below). Where Learning 1.0 was organised around taxonomies and content was largely expert generated, Learning 2.0 has seen a shift toward user generated content, and the emergent property of folksonomies. We have known for some time that people learn better when they are actively engaged in making things, solving problems and engaging with others. Social media have provided the tools to achieve this on a global level.

Learning 3.0 will be user and machine generated, and will in all respects be represented in what I will call 'rhizonomies'. The rhizonomic organisation of content will emerge from chaotic, multi-dimensional and multi-nodal organisation of content, giving rise to an infinite number of possibilities and choices for learners. As learners choose their own self-determined routes through the content, so context will change and new nodes and connections will be created in what will become a massive, dynamic, synthetic 'hive mind'. Here I do not refer to any strong artificial intelligence model of computation, but rather a description of the manner in which networked, intelligent systems respond to the needs of individual learners within vast, ever expanding communities of practice. Each learner will become a nexus of knowledge, and a node of content production. Extending the rhizome metaphor further, learners will act as the reproduction mechanisms that sustain the growth of the semantic web, but will also in turn be nurtured by it. Learning 3.0 will be a facet of an ongoing, limitless symbiotic relationship between human and machine.

Whatever Learning 3.0 is or will become, we can be assured it will be completely different to what has preceded it. We will witness new modes of learning, new ways of interacting and new ways of representing knowledge that will be both robust and mutable, personally contextualised, but without boundaries. I believe the future of learning is going to be very exciting indeed.

Postscript: My thinking in this blogpost is embryonic and is, as ever, open to challenge. I may be hopelessly off target here, because this is uncharted territory for me. But I am taking the risk to air my views in public about this topic just to see what feedback I will receive from my professional learning network. I therefore value any dialogue (on this blog and elsewhere), corrections, advice and suggestions as I attempt to navigate my way through the thinking process about what kinds of learning we might see in the future.

Image source

Next generation learning by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 13, 2012

The architecture of learning

One of the characteristics of Web 2.0, according to the man who coined the phrase, is to be found in its architecture. As far as Tim O'Reilly is concerned, Web 2.0 tools are configured in such a way that they 'get smarter the more people use them.' This facet was explained very clearly in Michael Wesch's excellent video Web 2.0 ... The Machine is Us/ing Us, which shows how web tools work better the more people use them. Social tagging, for example, becomes stronger as people populate it with content and links. Blogs rely not only on content, but on users, and ultimately on the dialogue that ensues between all those who read the content. In his famous Wired article, Kevin Kelly predicted this by suggesting that Web 2.0 was about leveraging collective intelligence. Web 2.0 has marked a shift in emphasis from the personal computer to the web, and the services it conveys. Web 2.0 is qualitatively different to what preceded it. Essentially, where Web 1.0 was about pushed content, and a 'sticky internet' where users could change very little, the evolution of the web into Web 2.0 has been viewed as epitomising the power of participation, and arguably, it's also about the democratisation of the internet.

So how does Learning 2.0 fit into this landscape? In order to deconstruct Learning 2.0 - Stephen Downes was the first to coin the phrase eLearning 2.0 - we first need to decide what we mean by Learning 1.0. For me, Learning 1.0 (if there ever was such a thing and it can be equated to Web 1.0) represents a relatively passive individual learning mode where expert generated content is pushed at the learner. It represents a top-down, hierarchical delivery of content (and content really is king in this mode), which ideally demands specific (observable) behaviours from the learner that can be measured and assessed objectively.  Behaviourism and Cognitivism are theories that could comfortably be applied to describe the activities seen within a Learning 1.0 scenario. Bloom's taxonomy is also a framework that might be applied to underpin and explain the levels of activity that would ensue from Learning 1.0 type activities. It is reminiscent of the 1980s Computer Assisted Learning model, where learners sat at a computer, received linear sequences of content, responded to it by answering multiple choice questions, and were presented with remedial loops or 'relearning' when they failed to reach the required standard of understanding.

By contrast, Learning 2.0 is recognised by more active and participatory modes of learning, and these are rarely isolated learning activities. As Web 2.0 has evolved, we have seen an increasing amount of interactive content becoming available. This content is generated not only by the experts, but also increasingly by the learners themselves, and tends to be organised by the community rather than by the experts. It is not a hierarchy and it does not obey top-down rules, but is more likely to be a heterarchy. The emergent properties of content organisation are folksonomies, the product of loose organisation that is bottom-up rather than top-down. One of the best theories to describe how learning is organised in Web 2.0 environments is social constructivism, because learners increasingly rely on social interaction, and on appropriate tools to mediate dialogue. Collaborative, shared online learning spaces such as wikis and discussion forums are characteristic meeting places where content can be created and shared, and the community also organises and moderates this content using specialised services such as aggregation, curation and tagging tools.

When we talk about web versions, we inevitably travel down a road where significant step changes in the evolution of the web mark new ways of using it. If there really is a Web 1.0 and a Web 2.0, then we can expect eventually to see a Web 3.0, and can expect to see new forms of learning and social interaction advancing as a result. In my next blog post, I will try to describe what we can expect from Learning 3.0 using a similar explanatory framework.

Photo by Steve Wheeler

The architecture of learning by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 12, 2012

The future of gaming

Games based learning is one of the most important strategies for 21st Century education. We have enjoyed playing games since time immemorial, and video arcade games such as Asteroids and Space Invaders in the 1970s were just the start of the emergence of digital games. Recently, with the development of motion controllers (such as the Nintendo Wii's), 3D screens (Nintendo 3DS) and touchless gestural and voice controls (Microsoft's Xbox 360 Kinect), games have become increasingly captivating, and have an immersive quality. Games, whether digital or analogue, can motivate learners, challenge them to improve their dexterity, problem solving and reasoning skills, and encourage teamwork and collaboration (Nemerow, 1996) - especially in social games such as World of Warcraft or Call of Duty, where performance is under constant peer review. These match some of the key skills required to succeed in a world of work where digital technology is prevalent. Thiagarajan (1998) believes that games have five major characteristics that are important for learning: conflict, control, closure, contrivance, and competency. Clearly, digital games have a great deal to offer the future of learning. So what can we expect of games based learning in the future?

Recent interviews in ShortList magazine by Ellison (2012) feature the opinions of several acknowledged video games experts. Ben Wilson, editor of Official PlayStation Magazine UK, believes that games will continue to improve in quality, with characters exhibiting more realistic human behaviour. Drivers in racing games, for example, 'could be pressurised into making errors, footballers might make more realistic runs, or be angered into reacting to a late lunge or a dig in the ribs.' David Darling, one of the co-founders of Codemasters, sees games consoles becoming even smaller, and agrees with Wilson that the resolution of graphics will continue to improve. His main contention, though, is that games will become more augmented, and tied into human emotions via retinal projection. Darling sees us playing games in the near future by proxy, controlling our avatars from a distance, with our senses stimulated so that we feel we are 'virtually there.'

Brain control is also something predicted by David Cage. The visionary designer behind PS3 games such as Heavy Rain and Beyond: Two Souls believes that in the future, games playing will be radically different, requiring no controllers. We will simply think our way around the game using our mind power, via directly implanted sensors. Jon Hicks, editor of Official Xbox 360 Magazine, also takes a futuristic view. He feels that we have just about reached the limits of what we can achieve with screens and controllers. The next stage, he says, is to place the gamer even deeper into the virtual world. He believes that motion sensors will use information about our body postures, facial expressions and biofeedback to tap into our emotions, and then do 'amazing or even terrifying things with that information'. Combine this emotion tracking with augmented reality and we are approaching the ultimate experience. 'Imagine a Silent Hill game that can work out how scared you are, and change accordingly,' he says, ominously.

Whatever the future holds for gamers, we can be sure it will be different, more enhanced and more realistic than it is right now. So remember, the next time you venture into Azeroth, or don the hood of Connor Kenway, you may be taking your first steps toward a brave (and very scary) new world where reality blurs with fantasy, and where your learning will never be the same again.

References

Ellison, J. (2012) You're going to need a bigger living room. ShortList, 8 November, 250, 49-52.
Nemerow, L.G. (1996) Do classroom games improve motivation and learning? Teaching and Change, 3(4), 356-361.
Thiagarajan, S. (1998) Ask Thiagi. Thiagi Game Letter, 1(4), 6.

Photo by Steve Wheeler


The future of gaming by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 11, 2012

Skills for Learning 2.0

I have been thinking and writing about 'Learning 2.0' for some time now. This is the argument that there has been a paradigm shift in the way students learn - from 1.0 to 2.0, from passive to active, from individual to social and from consumer to producer. This shift seems to run parallel to the development of the web over the last decade, and resonates with many who observe 21st century, digitally mediated learning in all its forms.

The University of Toronto's Mark Federman is a major contributor to this discourse. The writings of his late Canadian compatriot Marshall McLuhan clearly pervade his work. During a recent live television programme on 21st Century learning, Federman was asked whether the three 'R's (Reading, Writing and aRithmetic) would still be relevant to this generation of learners. His response was slick and insightful, even though it had probably been scripted well in advance of the TV show. He declared confidently that for this generation, the three 'R's would not be as important as the four 'C's. Asked to expand on this, he listed them: Connection, Context, Complexity and Connotation. Although these are essentially characteristics of modern life, we can contextualise them as skills or literacies. Here are my thoughts and interpretation of Federman's framework, illustrated above with one of my most recent slide graphics.

Firstly, learners need to be able to connect. In today's fast paced and change ridden world, learners need now, like never before, to be able to connect through technology to peers, experts, content and services. One of the most valuable assets a 21st Century learner has is their personal learning network (PLN). And we are all 21st Century learners, even if we are not enrolled on an accredited study programme. A lot of what is learned (some claim up to 70 per cent) is informal, and with a powerful enough network of connections to a PLN, there is no limit to what a learner can achieve.

Secondly, learners need to be able to contextualise their learning. Bill Gates once famously stated that content was king. This is no longer the case. Now context is king, because situated learning is powerful, and access to content is just the start of learning. Learning can be contextualised in so many different ways, and this is why personal learning tools are so important. The capability to personalise learning environments, to exercise agency over the tools and systems you wish to use, and to apply learning to your own individual situation are extremely important components of successful learning today.

Thirdly, learners need to be able to work with complexity and be able to interpret, filter out extraneous content, and make meaning. They need to be prepared for uncertain futures, none of which can be accurately predicted. In short, they need to be able to see the wood for the trees. There are many tools available today that learners can use to harness the power of web based content, including aggregation, curation and tagging tools, all of which can simplify complexity and allow learners to gain a purchase on chaos.

Finally, learners today need to be able to make meaning from the mass of content they are bombarded with each and every day. Many learners make meaning through discussion, but increasingly we are witnessing a shift toward user generated content, where learners are creating their own videos, blogs, podcasts, slidesets and other digital artefacts to make meaning.

Graphic by Steve Wheeler

Creative Commons License
Skills for Learning 2.0 by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 07, 2012

Be open

In Lord of the Rings, the wizard Gandalf deliberated and wrestled long and hard to open the doors to the mines of Moria. In the story of Ali Baba and the 40 thieves, the mouth of the cave was opened by uttering the phrase iftaḥ ya simsim - 'Open Sesame'. In the New Testament, Jesus Christ healed a deaf man by uttering the word 'Ephphatha' - meaning 'be opened'. All through our history and popular culture we hear stories about difficult problems or barriers being solved or overcome. There are many, many problems in the world, some of which are impossible to solve. Others appear to be impossible to solve until someone comes up with a solution, and then we all say - ah yes, I can see the answer now.

One problem we face in the 21st Century is how to educate everyone. If we believe education is a fundamental human right, then we go all out to provide good, affordable, accessible opportunities to learn the important things we will need to survive in an uncertain world. And yet, 500 million children remain outside education because they cannot afford to attend. We have enough money to make it happen, but the same old problem persists. In the speech below, which I gave at the Solstice Conference in June 2012 at Edge Hill University, I argue that we need to be more open about our content and tools, ownership of learning, intellectual property and even the very practices we participate in on a daily basis - open scholarship, if you will. I talk about Creative Commons, open source software, open access journals, open educational resources, community led initiatives such as MOOCs and the whole idea of being open and sharing your learning. These ideas may not fully address the problem of how to educate everyone, but at least we will make a start by making learning more accessible.

Knowledge is like love. You can give it away as much as you like, but you never lose it. The more we give away our knowledge, the more we are educating our world. So be open. You know you want to.




Creative Commons License
Be open by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 05, 2012

Digital Learnscapes

It's that time of the year again when we are planning for the Pelecon - the Plymouth Enhanced Learning Conference - which we hold each year in April at Plymouth University. Pelecon 2012 was probably our best conference yet. Many have said that Pelecon events inspire and energise delegates and challenge perspectives on education, learning and development, and the role of technology. Others have said that Pelecon is one of the best forums for informal debate on the learning technology conference circuit. The Pelecon conference is for teachers and learning professionals in all sectors of education and training, and attracts delegates from all over the world. Pelecon 2012 was the seventh conference, and featured invited speakers including Alec Couros (Canada), Leigh Graves Wolf (USA), Helen Keegan, Simon Finch, Keri Facer, David Mitchell and Jane Hart. Pelecon 2013 will maintain the pace and dynamism of this year's event, and several well known keynote speakers have already been announced. Go to this link to keep up to date as we make further keynote speaker announcements over the next two weeks. The call for papers for Pelecon 13: Digital Learnscapes is now live, and is summarised below:

We live in a period of change and uncertainty. Many are bewildered by these changes and find it difficult to keep up, particularly in the education and training sectors. The ability to anticipate and prepare for change is the mark of innovative educators, as is the skill of harnessing new and emerging tools to promote good learning.

At Pelecon 13 we want to provide learning professionals with opportunities to explore, discover and discuss new approaches, new technologies and new ideas to enhance, enrich and extend their own professional practice. There will be particular emphasis this year on simulations and games, personal learning tools, new pedagogies and practices, learner and teacher voice, and digital literacies.

The deadline for submission of workshop, paper and demonstration proposals is January 25, 2013, and you can submit your abstracts here. We invite submissions from primary, secondary and tertiary education, as well as from learning and development and other training sectors. Just to whet your appetite and pique your interest, below is a teaser video made by our very own maestro Dr Jason Truscott. We hope to see you at the Pelecon in April!



Creative Commons License
Digital Learnscapes by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


November 01, 2012

Theories for the digital age: Paragogy

In previous posts in this series I have explored some of the characteristics of learning in the digital age. One more notable feature of 21st Century learning is peer learning. Highlighting the fast paced nature of the web, Thomas and Seely Brown (2011) suggest that peer learning can be both timely and transient. They show that never before has access to information and people been so easy and so widespread, and that we make connections with people who can help us manage, organise, disseminate and make sense of the resources. Such interconnectedness and willingness to share creates a new kind of peer mentoring that operates at multiple levels and many degrees of expertise, supporting learning in all its complexity. The notion of ‘paragogy’ (Corneli and Danoff, 2011) relates to the peer production of learning, but as Corneli (2012) warns, such an agenda may be at odds with established educational systems in some respects, and may even be opposed by some. This is due to the challenge that ‘students teaching themselves’ might pose to the privileged knowledge and power structures many formal educational institutions continue to hold in such high regard.


In essence, Corneli and Danoff’s paragogy thesis is premised on the argument that online environments are now sufficiently developed to support peer production of content which can be shared freely and widely, and can promote learning for all within any given community. Again, this echoes the connectivist and heutagogic ideals discussed in previous posts, whilst at the same time presenting a challenge in terms of the quality, reliability and provenance of content. The user generated content currently available on the web has been criticised for its inconsistent quality (Carr, 2010) and its potential to encourage plagiarism, piracy and a host of other nefarious practices (Keen, 2007). User generated content has also attracted criticism over issues of mediocrity, lack of accuracy and superficial scholarship (Brabazon, 2002; 2007). Notwithstanding these criticisms, many are now turning to web based user generated content to educate themselves and to share their learning. In many ways, the ability to use personal technologies to create, organise, share and repurpose content in many formats across the global web environment has become a democratising, liberating factor in education. There are now a variety of new ways we can create peer networks, learn from each other and share our ideas. In so doing, we are building what Illich (1971) once termed the ‘learning webs’ that will enable each of us to define ourselves both by learning, and by contributing to the learning of others.

References
Brabazon, T. (2002) Digital Hemlock: Internet Education and the Poisoning of Teaching. Sydney: University of New South Wales Press.
Brabazon, T. (2007) The University of Google: Education in the (Post) Information Age. Aldershot: Ashgate Publishing.
Carr, N. (2010) The Shallows: What the Internet Is Doing to Our Brains. New York, NY: W. W. Norton.
Corneli, J. and Danoff, C. J. (2011) Paragogy. In: Proceedings of the 6th Open Knowledge Conference, Berlin, Germany.
Corneli, J. (2012) Paragogical Praxis. E-Learning and Digital Media, 9 (3), 267-272.
Illich, I. (1971) Deschooling Society. London: Calder and Boyars.
Keen, A. (2007) The Cult of the Amateur: How today’s Internet is killing our culture and assaulting our economy. London: Nicholas Brealey Publishing.
Thomas, D. and Brown, J. S. (2011) A New Culture of Learning: Cultivating the Imagination for a World of Constant Change. CreateSpace.

Image source

Creative Commons License
Theories for the digital age: Paragogy by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 29, 2012

Theories for the digital age: Self regulated learning


Informal and self regulated learning are defining characteristics of 21st Century education. Various commentators suggest that as much as seventy per cent of learning occurs outside of formal educational settings (Cofer, 2000; Dobbs, 2000; Cross, 2006). If these statistics are accurate, they present a significant challenge to schools, colleges and universities. One challenge for education providers is to decide whether they will support the desire of students to self regulate their learning activities using personal technologies. Institutions that discourage the Bring Your Own Device (BYOD) movement may be perceived by their students as anachronistic. Those that do support BYOD for students and staff will need to invest significant time and resources into ensuring cross platform operability and seamless delivery to students’ personal technologies.

Self regulation of learning is thought to be a characteristic of individual students (Beishuizen, 2008) but increasingly can be contextualised within social learning environments. A number of collaborative and social networking tools regularly play a role within the average student PLE. Self regulation has been shown to enhance and improve learning outcomes (Paris & Byrnes, 1989; Steffens, 2008), enabling learners to achieve their full potential (Delfino et al, 2008).  Personal technologies are thought to enable self-regulation at a number of levels, including the ‘object’ and ‘meta’ levels of learning, supporting maintenance, adaptation, monitoring and control of a variety of higher level cognitive processes (Nelson & Narens, 1990). By using personal devices as ‘mindtools’ to offload simple cognitive tasks, students can extend their own memories (Jonassen  et al, 1999), build their confidence, and increase their motivation levels (Goldsworthy et al, 2006). Further, personal devices enable individuals to gain access and to participate at many levels within their communities of practice, from ‘entering by learning’ through to ‘transcending by developing’ (Ryberg & Christiansen, 2008). All of this is often achieved by students outside the formal surroundings of school or university, with no time or location constraints.

Moreover, there is a sense that personal technologies encourage learners to be self-determined in their approach to education. Hase and Kenyon’s (2007) conceptualisation of self-determined learning - or heutagogy - places the emphasis on non-linear, self-directed forms of learning, and embraces both formal and informal education contexts. The central tenet of heutagogy is that people inherently know how to learn. The role of formal education is to enable them to confidently develop these skills, encouraging them to critically evaluate and interpret their own personal reality according to their own personal skills and competencies. The ethos of heutagogy extends to learner choice, where students can create their own programmes of study, a feature often seen in the loose and unstructured aspects of some Massive Open Online Courses (MOOCs). In many ways, heutagogy is aligned to other digital age theories, in that it places an importance on ‘learning to learn’, and on the sharing rather than hoarding of that knowledge. It is not difficult to see that such sharing of knowledge can be easily achieved through social media and the use of personal digital technologies.

[This is an excerpt from a forthcoming publication entitled: Personal Technologies in Education: Issues, Theories and Debates]

References
Beishuizen, J. (2008) Does a community of learners foster self-regulated learning? Technology, Pedagogy and Education, 17 (3), 183-193.
Cofer, D. (2000) Informal Workplace Learning. Practice Application Brief No. 10, U.S. Department of Education: Clearinghouse on Adult, Career, and Vocational Education.
Cross, J. (2006) Informal Learning: Rediscovering the natural pathways that inspire innovation and performance. London: John Wiley and Sons. 
Delfino, M., Dettori, G. and Persico, D. (2008) Self-Regulated Learning in Communities. Technology, Pedagogy and Education, 17 (3), 195-205.
Dobbs, K. (2000) Simple Moments of Learning. Training, 35 (1), 52-58.
Goldsworthy, S., Lawrence, N. and Goodman, W. (2006) The use of Personal Digital Assistants at the Point of Care in an Undergraduate Nursing Program. Computers, Informatics, Nursing, 24 (3), 138-143.
Hase, S. and Kenyon, C. (2007) Heutagogy: A Child of Complexity Theory, Complicity: An International Journal of Complexity and Education, 4 (1), 111–118.
Jonassen, D. H., Peck, K. and Wilson, B. G. (1999) Learning with technology: A constructivist approach. Upper Saddle River, NJ: Prentice-Hall.
Nelson, T. O. and Narens, L. (1990) Metamemory: A theoretical framework and new findings. In G. H. Bower (Ed.) The Psychology of Learning and Motivation. New York, NY: Academic Press.
Paris, S. G. and Byrnes, J. P. (1989) The constructivist approach to self-regulation and learning in the classroom. In B. J. Zimmerman and D. H. Schunk (Eds.) Self Regulated Learning and Academic Achievement: Theory, Research and Practice. New York, NY: Springer.  
Ryberg, T. and Christiansen, E. (2008) Community and social network sites as Technology Enhanced Learning Environments. Technology, Pedagogy and Education, 17 (3), 207-220. 
Steffens, K. (2008) Technology Enhanced Learning Environments for self-regulated learning: A framework for research. Technology, Pedagogy and Education, 17 (3), 221-232.  

Drawing Hands by M C Escher 

Creative Commons License
Theories for the digital age: Self regulated learning by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 28, 2012

Theories for the digital age: Postmodern perspectives

Postmodernist views of society can be appropriated as lenses to analyse the personalised use of digital technology. Consumers of Web based content tend to search randomly and nomadically, due to the multi-layered, multi-directional nature of hyperlinked media, and this aligns neatly with some postmodern theory. The writings of Deleuze and Guattari (1980), for example, feature the nomadic thought processes that characterise contemporary perceptions, and portray the chaos of modern life. They employ the botanic metaphor of rhizomatic root systems to describe multiple, chaotic, non-hierarchical interpretations of knowledge. Rhizomes resist chronology and organisational structures, thereby more accurately representing the unstructured but purposeful manner in which many people now use the Web.

Significantly, because rhizomes are open ended, the importance of Deleuze and Guattari’s rhizome explanation is not invested in individual components, but rather in the direction of motion the entire organism can adopt at any given time. This is reminiscent of the participatory Web, which consists not so much of the insights and offerings of individuals, but rather of what Surowiecki (2009) has termed ‘the wisdom of crowds’ – the seemingly random folksonomic directions chosen by entire communities of users as having meaning and importance. The community decides what is important to learn, so in effect, the community becomes the curriculum (Cormier, 2008).

According to Cormier (2008) a rhizomatic interpretation of education is useful because it embraces the ever changing nature of knowledge, is open ended, and is not driven by specific curricula whilst learning is ‘constructed and negotiated in real time by the contributions of those engaged in the learning process.’  This form of negotiated meaning more clearly represents the knowledge acquisition processes that occur within the transient discussion threads and ephemeral collaborative spaces on the World Wide Web.

The colonisation of knowledge spaces by communities is self sustaining, and in Deleuze and Guattari’s terms, we see individuals assuming the roles of nomads, maintaining a constant state of becoming and transformation. Again, this is reminiscent of the random searching, scanning and jumping around content through hyperlinking that learners participate in when they traverse the digital landscape. In effect, students participate as flâneurs, acting as individual agents, investigators and explorers of their own personal digital terrains. Their seemingly aimless behaviour belies their essentially purposeful wandering, as learners interrogate their environment in attempts to make sense of it, understand it, participate in it, and ultimately portray it (Baudelaire, 1964). 

[This is an excerpt from a forthcoming publication entitled: Personal Technologies in Education: Issues, Theories and Debates]  

References

Baudelaire, C. (1964) The Painter of Modern Life, New York, NY: Da Capo Press. (Originally published in Le Figaro, in 1863).
Deleuze, G. and Guattari, F. (1980) A Thousand Plateaus: Capitalism and Schizophrenia. London: Continuum.
Surowiecki, J. (2009) The Wisdom of Crowds: Why the Many are Smarter than the Few. London: Abacus. 

Photo by Steve Wheeler

Creative Commons License
Theories for the digital age: Postmodern perspectives by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 27, 2012

Theories for the digital age: The digital natives discourse

Is learning in the 21st Century significantly different to learning in previous years? One of the more controversial theories of the digital age is the claim that technology is changing (or rewiring) our brains (Greenfield, 2009), whilst some also claim that prolonged use of the Web is detrimental to human intellectual development (Carr, 2010). It could be argued that these theories trace back to the seminal claim of Marshall McLuhan (1964) that ‘we shape our tools and thereafter, our tools shape us.’ This belief was also the basis for the Digital Natives and Immigrants theory (Prensky, 2001), a persistent discourse that has greatly influenced the thinking of educators in recent years. A significant body of work has arisen around the Digital Natives and Immigrants theory, including descriptions of younger students as ‘the Net Generation’ (Tapscott, 1998), ‘Screenagers’ (Rushkoff, 1996), ‘Born Digital’ (Palfrey and Gasser, 2008), ‘Millennials’ (Oblinger, 2003), and ‘Homo Zappiens’ (Veen and Vrakking, 2006). The latter theory suggests that younger students learn differently - through searching rather than absorbing, and through externalising rather than internalising information - and that they are better at multitasking and see no separation between playing and learning (Veen and Vrakking, 2006).

If these theories are true, and younger students do learn differently, the implications for education are profound, demanding changes to the way formal learning content is developed, delivered and organised, and a reappraisal of our conception of knowledge and what it means for education. There are, inevitably, objections to the Digital Natives position.

All of the above theories tend to characterise younger learners as being different to previous generations in their use of technology. These positions are countered by researchers who maintain that such claims are largely based on anecdotal and intuitive arguments, that there is no significant difference in the way younger or older students manage their online learning activities (Crook and Harrison, 2008; Ito et al, 2009; Kennedy et al, 2010), and that the current generation of learners is far from homogenous (Bennett et al, 2008; Jones and Healing, 2010). Bennett et al (2008) also assert that there is no clear evidence that multi-tasking is a new phenomenon and exclusively the preserve of younger learners. Jones and Healing (2010) criticise the Digital Natives and Immigrants theory as too simplistic, and point out that a greater complexity exists in the way students of all ages use technology, based not on generational differences, but on agency and choice. There is yet further dissent. Vaidhyanathan (2008) argues that ‘there is no such thing as a digital generation.’ He suggests that every generation has an equal distribution of individuals with low, medium and high levels of technology competency. Vaidhyanathan is uncomfortable with the misclassification of generations and the associated assumptions about technology competency levels, and warns: ‘We should drop our simplistic attachments to generations so we can generate an accurate and subtle account of the needs of young people – and all people, for that matter.’

Perhaps the most sensible advice comes from Selwyn (2009), who argues that contrary to the populist beliefs expressed in the Digital Natives discourse, young people’s engagement with technology is often unspectacular (Livingstone, 2009). According to Selwyn, accounts of Digital Natives are often based on anecdotal evidence, are inconsistent or exaggerated, and hold very little in common with the reality of technology use in the real world. The Digital Natives discourse tends to alienate older generations from technology, and teachers can make dangerous assumptions about the capabilities of young people (Kennedy et al, 2010). Selwyn counsels: ‘Whilst inter-generational tensions and conflicts have long characterised popular understandings of societal progression, adults should not feel threatened by younger generations’ engagements with digital technologies, any more than young people should feel constrained by the “pre-digital” structures of older generations’ (Selwyn, 2009, p. 376).

Arguably the most useful explanatory framework for current online activities is offered by White and Le Cornu (2011), who have argued that habitual use of technology develops sophisticated digital skills regardless of the age or birth date of the user. They call these users ‘Digital Residents’ and suggest that those who are ‘Digital Visitors’ are less likely to be digitally adept because of their casual or infrequent use of digital tools.

References
Bennett, S., Maton, K. and Kervin, L. (2008) The ‘digital natives’ debate: A critical review of the evidence. British Journal of Educational Technology, 39 (5), 775-786.
Carr, N. (2010) The Shallows: What the Internet Is Doing to Our Brains. New York, NY: W. W. Norton.
Crook, C. and Harrison, C. (2008) Web 2.0 Technologies for Learning at Key Stages 3 and 4. Coventry: Becta Publications.
Greenfield, S. (2009) The Quest For Identity In The 21st Century. London: Sceptre.
Ito, M., Horst, H., Bittanti, M. and Boyd, D. (2009) Living and Learning with New Media. Cambridge, MA: MIT Press.
Jones, C. and Healing, G. (2010) Net Generation Students: Agency and Choice and the New Technologies. Journal of Computer Assisted Learning, 26 (3), 344-356.
Kennedy, G., Judd, T., Dalgarno, B. and Waycott, J. (2010) Beyond Digital Natives and Immigrants: Exploring Types of Net Generation Students. Journal of Computer Assisted Learning, 26 (5), 332-343.
Livingstone, S. (2009) Children and the Internet. Cambridge: Polity Press.
Oblinger, D. (2003) Boomers, Gen-Xers, and Millennials: Understanding the new students. Educause Review, 38 (4).
Palfrey, J. and Gasser, U. (2008) Born Digital: Understanding the First Generation of Digital Natives. New York, NY: Basic Books.
Prensky, M. (2001) Digital Natives, Digital Immigrants. On the Horizon, 9 (5).
Rushkoff, D. (1996) Playing the Future: What we can learn from digital kids. London: Harper Collins.
Selwyn, N. (2009) The Digital Native: Myth and Reality. Aslib Proceedings, 61 (4), 364-379.
Tapscott, D. (1998) Growing up Digital: The Rise of the Net Generation. New York: McGraw Hill.
Vaidhyanathan, S. (2008) Generational Myth. The Chronicle of Higher Education.
Veen, W. and Vrakking, B. (2006) Homo Zappiens: Growing up in a Digital Age. London: Network Continuum Education.
White, D. S. and Le Cornu, A. (2011) Visitors and Residents: A new typology for online engagement. First Monday, 16 (9).

Image source

[This is an excerpt from a forthcoming publication entitled: Personal Technologies in Education: Issues, Theories and Debates]

Creative Commons License
Theories for the digital age: The digital natives discourse by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 26, 2012

Theories for the digital age: Connectivism

Learning in the industrialised world can now be contextualised within a largely technological landscape, where the use of digital media is assuming increasing importance. Much of this learning is informal (commentators such as Cofer (2000), Cross (2006) and Dobbs (2000) place the proportion of informal learning at around 70 per cent), and it is also generally location independent.

The present technology rich learning environment is characterised by a sustained use of digital media, their integration into formal contexts, and a shift toward personalisation of learning. These facets of modern life in combination have led educators to question the validity of pre-digital age learning theories. In recent years a variety of new explanatory theories have been generated that can be applied as lenses to critically view, analyse and problematise new and emerging forms of learning. 

One highly visible theory is Connectivism (Siemens, 2004). Connectivism has been lauded as a ‘learning theory for the digital age’, and as such seeks to describe how students who use personalised, online and collaborative tools learn in different ways to previous generations of students. The essence of Siemens’ argument is that today, learning is lifelong, largely informal, and that previous human-led pedagogical roles and processes can be off-loaded onto technology. Siemens also criticises the three dominant learning theories, namely behaviourism, cognitivism, and constructivism, suggesting that they all locate learning inside the learner. His counterargument is that through the use of networked technologies, learning can now be distributed outside the learner, within personal learning communities and across social networks.

Perhaps the most significant contribution of Connectivist theory is the premise that declarative knowledge is now supplemented or even supplanted by knowing where knowledge can be found. In a nutshell, connectivism argues that digital media have caused knowledge to be more distributed than ever, and it is now more important for students to know where to find knowledge they require, than it is for them to internalise it. This places the onus firmly upon each student to develop their own personalised learning tools, environments, learning networks and communities within which they can ‘store their knowledge’ (Siemens, 2004). In McLuhan’s view, as we embrace technology, ‘our central nervous system is technologically extended to involve us in the whole of mankind and to incorporate the whole of mankind in us’ (McLuhan, 1964, p. 4). Clearly our social and cultural worlds are influenced by new technology, but are there also biological implications?

References
Cofer, D. (2000) Informal Workplace Learning. Practice Application Brief No. 10, U.S. Department of Education: Clearinghouse on Adult, Career, and Vocational Education.
Cross, J. (2006) Informal Learning: Rediscovering the natural pathways that inspire innovation and performance. London: John Wiley and Sons.
Dobbs, K. (2000) Simple Moments of Learning. Training, 35 (1), 52-58.
McLuhan, M. (1964) Understanding Media. London: McGraw Hill.
Siemens, G. (2004) Connectivism: A Learning Theory for the Digital Age. eLearnspace.

Image source

[This is an excerpt from a forthcoming publication entitled: Personal Technologies in Education: Issues, Theories and Debates]

Creative Commons License
Theories for the digital age: Connectivism by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 25, 2012

Writer's block

Anyone who writes regularly will tell you this: There are times when you struggle to write something worthwhile ... or even anything at all. Call it writer's block, call it the white page syndrome (or white screen in the age of the word processor), call it whatever you wish - there are times when the words won't come, and there is very little you can do about it. At such times, I tend to either write rubbish and then ditch it (boy, you should read some of my rejects - you'd laugh yourself sick), or more likely, walk away from the page/screen and go and do something else instead.

Blog posting is a very immediate kind of writing, so you need to get it right before you post. Once you have clicked the Publish button, your ideas are out there for the whole world to read. It's publish and be damned. Lawrence Lessig said about blogging that it is 'the most important form of unchoreographed public discourse that we have.' Contrast this with Katie Hafner's wry parodic comment 'never have so many people written so much to be read by so few' and you will see that there are ups and downs to blogging (the patron saint of ups and downs is St Francis of a Seesaw). No matter how good your blog post is, no matter how incisive, devastatingly witty or profound your points are, if there is no audience for your writing, you may as well be whistling in the wind. Just how you drive people to your blog, though, is beyond the scope of this particular post (phew, escaped from that one).

So how do you start off writing a blog post, and avoid the writer's block syndrome? More importantly, how do you write something that is worth writing? My advice is to just start writing. If it turns out to be rubbish, you can always discard it. But write you must. Find a memorable or inspirational quote to start you off. Sometimes an evocative image will set your thought processes going. Write about something you know about, have an opinion on, or feel passionately about. You can also be controversial. Draw on evidence that supports your viewpoint, but also find those who argue against it and include those too, for some balance. Use language that is accessible and easy to understand. But don't compromise on your own writing voice, which is often the one tool you can wield with devastating effect in any writing genre. Most importantly, try to engage your reader. Address them personally. That's something that makes you want to keep reading, isn't it?

There are all sorts of bells and whistles you can put into a blog post, but I have elaborated on several of my own ideas already so I won't bore you again. Ultimately, you should write blog posts because you want to share your ideas and receive comments and feedback from your readership. When done correctly, blogging is not just writing - it's a conversation. As always I welcome your comments on this post.



Image by Daniel Gies

This post first appeared on 29 August 2011, on this blog.

Creative Commons License
Writer's block? by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


October 23, 2012

Learning with robots

In a previous blog post I wrote about learning by making, and discussed the theory of constructionism, which holds that we learn through immersing ourselves in, and engaging with, situations. It is not to be confused with constructivism (the theory first proposed in its cognitive form by Jean Piaget, and in its social form by Lev Vygotsky). A lot of research into learning by making was conducted by Seymour Papert, with notable learning tools such as the LOGO programming language being developed. As far as Papert is concerned, learning in this manner is important because it is a departure from transmission models of education, enabling us to construct and reconstruct knowledge in our own unique ways. One of the first uses for LOGO was to enable children to program a floor robot, giving it instructions to move around the room and perform simple tasks. This remains a very effective learning device - children love the idea of robots, and enjoy being able to control them. When used in conjunction with other tasks, floor robots such as Beebot can become very powerful in introducing children to new ideas and new skills, and can encourage them to experiment, learn from their mistakes and develop higher cognitive processes.
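The style of commands children give a Beebot-type floor robot can be sketched in a few lines of code. The example below is illustrative Python, not actual LOGO or Beebot syntax; the FloorRobot class and its command names are invented purely for the sketch:

```python
# A minimal, hypothetical sketch of floor-robot commands.
# Headings cycle north, east, south, west; position is a grid square.

class FloorRobot:
    HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # N, E, S, W

    def __init__(self):
        self.x, self.y = 0, 0
        self.heading = 0  # start facing north

    def forward(self, steps=1):
        dx, dy = self.HEADINGS[self.heading]
        self.x += dx * steps
        self.y += dy * steps

    def right(self):
        self.heading = (self.heading + 1) % 4  # quarter turn clockwise

    def left(self):
        self.heading = (self.heading - 1) % 4  # quarter turn anticlockwise

# Trace a square: after four forward-and-turn moves the robot
# should be back where it started - a classic LOGO exercise.
robot = FloorRobot()
for _ in range(4):
    robot.forward(2)
    robot.right()
print((robot.x, robot.y))  # → (0, 0)
```

Tracing a square like this, and predicting where the robot will finish before pressing 'go', is exactly the kind of task that gets children experimenting and learning from their mistakes.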

This video demonstrates how the theory can be applied to scaffold children's numeracy skills.


Creative Commons License
Learning with robots by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License. Based on a work at steve-wheeler.blogspot.com.


October 15, 2012

Working the system


One of the questions I discussed today with some of my third year teaching students was: what is the use of school exams? We discussed why we should put kids through the stress and anxiety of testing, when tests do little to help kids to learn meaningful things. Testing is essentially a snapshot of what the student knows when the test is administered. It's a very effective method of scaring kids to death, and it's also a very efficient method with which governments can gather data to indicate how well the cohort of students in each school has had their heads crammed full of useless facts. And so educators find themselves 'teaching to the test', just so that they can give their students a better chance of passing with a reasonable grade. The more students in a school who get good grades, the higher the school will appear in the league tables. Yesterday I wrote about the way the UK Government has cynically manipulated recent test results, with disastrous consequences.

So what about the kids? Isn't school meant to be for their benefit? Exams do little to help children to learn deep and meaningful stuff they can later translate into the reality of life beyond the school gates. How much do I recall from the exams I swotted for? Not a lot. What exams teach children is that if they rote learn lots of facts, figures and information, they can manipulate the system. Being able to regurgitate this kind of surface knowledge onto a test paper to score as high a grade as possible is as far removed from education as it is possible to be. Exams are at best a test of memory and a snapshot of what students 'know' when the test is administered. The exam itself tells us nothing about how children will cope with the messy, complex problems they will face in real life, or how good they are, for example, at working in a team. Exams tell us next to nothing about their creative abilities or their cognitive agility. Project work, continuous assessment and monitoring of progress are much more likely to be indicators of how well a child is doing in school.

Chris Husbands, Director of the Institute of Education, recently made a telling statement on the topic in the Guardian newspaper:

"I'm not sure there is any evidence that exams are an improvement device on their own. What improves education is improving teaching and learning. Where exams play a part is the extent to which they provide structures that encourage improved teaching and learning. It's really important that we have rigour in our assessment. It's also really important that we are clear about what rigour means. And rigour means assessing children and young people on the basis of the knowledge, skills and understanding that are going to prepare them for adult life."

Do we need an overhaul of the school examination system? I think in its current format, it is broken beyond repair. I would be very interested to hear your views.

Calvin and Hobbes cartoon courtesy of Universal Press Syndicate

Creative Commons License
Working the system by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License. Based on a work at steve-wheeler.blogspot.com.


October 13, 2012

Moving the goalposts

Listening to the Welsh Minister for Education and Skills speak yesterday at the iNet Conference in Cardiff made me question once more the reasons for school exams. What Leighton Andrews AM had to say in his speech made me also question the sanity of those behind the recent GCSE exams fiasco in England. Schools are now resorting to legal action to challenge the UK Government's decision to downgrade the results of an entire national cohort of students.

Earlier this year, without any consultation or warning, grade boundaries were changed on the order of the UK Government exam watchdog Ofqual. Teachers who had prepared students for a particular grade expectation had the carpet pulled out from under their feet. Many students were disappointed by their downgraded results. Schools, students, teachers and parents all feel betrayed. And there is no comeback, it seems. And yet the Welsh Assembly, which was devolved several years ago from central Westminster control, took the bold and intelligent step of saying 'no' to the results. As far as Leighton Andrews is concerned, the students who took the exams under one condition should be marked under that same condition, and their grades upheld. He ordered the WJEC (the Welsh examinations board) to regrade all the downgraded results so that students received the original grades they deserved. Andrews deserves a medal for his stand. He is one of very few who actually have the backbone to stand up and be counted on this issue. In his speech, Andrews asked how we could possibly expect school improvement, when devaluing examination results militates against schools' positions in the league tables? It's as if all schools are now being punished for simply following the rules.

The bottom line is this: In the UK, exams are used by Government more to provide indicators of school effectiveness than they are to provide students with qualifications. The GCSE qualifications are political footballs that are kicked around by both sides of the House, and ultimately, the metrics generated by each year's results are crunched together to produce school league tables. This disgraceful state of affairs has been happening for some time. Exams are no longer about giving students the opportunity to shine, to show what they have learnt. They are now purely a mechanism for data gathering. Yet according to some commentators, the current fiasco will render school league tables invalid, for this year at least.

Now we also have a politically motivated and grossly unfair assessment regime. Imagine Olympic athletes sprinting for the line, only to discover halfway through the race that the finishing tape had been moved another few hundred metres down the track. Imagine if they had trained for 400 metres and then had to run 800 instead. Unfair? Yes it would be. Grossly unfair. And yet this is exactly the same trick that has been perpetrated upon an entire year of students. We cannot prepare children for examinations using one set of standards, and then impose a new set without warning. We don't move the goalposts halfway through a football game. Why did the UK Government sanction grade boundary changes right in the middle of an academic year? What message does this send to an entire generation of young people? I remarked in my speech at iNet that it was a real shame that the English could not devolve from Westminster as the Welsh have done. It raised a few smiles, but it was a serious remark. Not only have the Welsh stood up against Westminster and refused to play the moving goalpost game, they have also banned standardised testing for under 16s throughout their school system. And for good reason.

Back in 2007, the General Teaching Council argued that school exams should be banned for children under 16 because the stress caused to young children was 'poisoning attitudes toward education'. The GTC also called for a review of all standardised testing practices because there is no evidence that exams are improving school standards. The GTC was disbanded by the Government in 2010, as part of its 'austerity' cutbacks. If exams are causing students unnecessary stress, and testing is not contributing toward school improvement, then why are we still persisting? One definition of madness is trying the same thing over and over again, in the hope that a different result might be obtained. There are many better methods of tracking student progress than exams. Many assessment methods are substantially more effective in assessing for learning. Isn't it about time we had that review the GTC called for?

Photo by Walter Baxter

Creative Commons License
Moving the goalposts by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License. Based on a work at steve-wheeler.blogspot.com.


October 05, 2012

Questions, questions

A friend of mine recently told me that his child's school has complained to him about his son. It seems his son has been asking too many questions, and it's interfering with the running of the class. The young lad keeps asking why he has to do certain tasks in the lessons. The teacher is getting sick of having to justify everything she is doing. Oh dear. How disruptive. What an unruly child...

Actually, that teacher probably needs a kick up the backside. What was she thinking? Why would any teacher want to stop children questioning? Why would any school discourage children from asking 'why'? Surely, questioning is a fundamental part of learning at any age. Asking questions is always more effective than receiving answers, because it opens up all the possibilities and allows the questioner to frame the world in their own unique, individual way. From questions come other questions. From those come learning. Children need a psychologically safe environment within which they can question, explore and make mistakes, with no negative repercussions. The moment teachers stifle a child's curiosity is the moment school ceases to be relevant, to that child, to the community, to society at large. If ever there was an ideal place for children to be encouraged to ask 'why?' it has to be the school. The problem with the current school system is that far too many demands are placed on teachers, and there is little time left to spend on exploration and discovery.

More time and space needs to be allocated during the school day for thinking and questioning. Children need to ask questions, because it's a natural part of their cognitive development. But when the school system, as it stands, serves to knock their curiosity out of them, something has to change. It's interesting to read Sir Ken Robinson's take on this issue. He suggests that as children grow older, their curiosity and their creativity tend to decline. This is not because they are 'growing up', he says, but rather because they have been 'educated'. Schooling has knocked the curiosity out of them. Alvin Toffler once said 'We don't need to reform the system, we need to replace the system'. He could easily have been talking about schooling.

I keep six honest serving-men (they taught me all I knew); their names are What and Why and When and How and Where and Who. - Rudyard Kipling


Creative Commons License
Questions, questions by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License. Based on a work at steve-wheeler.blogspot.com.


October 04, 2012

Keep juggling

How many balls can you keep in the air at one time? One of the most interesting stands at the World of Learning Exhibition and Conference in Birmingham this week was the Blue Beetle Training stand, which among other rolling performances featured ten minutes of tuition in three-ball juggling. Reminiscent of Simon Finch's finest Pelecon 2012 keynote moment, the presenter - Graham David - worked manfully to convince passing, reluctant delegates to stop for a while, engaging them by showing them how easy it was to learn to keep three balls in the air continuously. It was great audience participation, and quite entertaining to watch. Yet there was a serious underlying message, too.

Juggling is not easy, and takes a lot of practice. But in one sense we are all jugglers, because many of us regularly keep many 'balls in the air' including a full-time job, childcare and family duties, voluntary work, and so on. How many of us would like tuition in how to do that successfully? Take the job of teaching - how many things do we need to do simultaneously to be an effective teacher? What skills do we need to not only keep our heads above water in our jobs, but also to excel, to become the best we can possibly be in our chosen areas?

Blue Beetle Training is one of a number of companies popping up in the learning and skills sector that focus on developing creative and innovative new ways to learn. We certainly need more of that. Creative learning is going to be a growth area in Learning and Development, because many are tired of the old ways of training in rows. We have the technology, but that is not enough. We also need a sea change in the way learning and development are conceptualised. Learning by doing, particularly if that 'doing' is situated in work practices, is arguably one of the most effective ways of training employees. Problem-based learning, simulations, learning by making (constructionism) and experiential approaches to personal development have all been shown to be highly effective. Couple these with social learning mediated through the personal tools and devices that most employees carry around in their pockets (but many employers currently ban), and you have a very powerful, sustainable and lifelong method of keeping workers skilled and productive. Exactly what will your organisation be doing in the coming years to teach your employees how to juggle?

This week the World of Learning Conference and Exhibition celebrated its 20th anniversary at the NEC, Birmingham, UK.

Photo by Steve Wheeler

Creative Commons License
Keep juggling by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License. Based on a work at steve-wheeler.blogspot.com.


September 28, 2012

Smart learning

I spend most of my time in the future, but I do go home at weekends. At least, that is my explanation for why I am so fascinated with the future. I always have been, ever since I was a little boy and started reading science fiction novels. From Asimov to Heinlein, and Clarke to Wells, I hungrily devoured them all and fed my mind on what was to come.

I'm in a dream job now, talking about technology and learning, and how we can optimise one from the other. Inevitably, people invite me to speak at events and ask me to give my ideas about what is just around the corner.

My recent involvement with the New Media Consortium Horizon Report committee was another outlet for thinking about the future. That came up in today's keynote at the 15th International Conference on Interactive and Collaborative Learning (#ICL2012), here deep in the Austrian Alps, in a little town called Villach. I spoke of Learning 3.0, and speculated on what learning might look like in a few years. I proposed that a great deal of learning will take place in the future through the use of mobile tools, and that tablets, phones and other handheld devices would be just the start of our new technology enhanced learning journey.



Augmented reality, intelligent filtering, 3D spatial interaction, enhanced vision and other seemingly exotic or out-of-reach technologies will one day merge to become our new reality. The technology is already there, but as William Gibson once said, 'the future is already here - it's just not evenly distributed'. Eventually, all widely adopted technologies tend to fade into the background and become mundane, as learning breaks through to take centre stage. When will this happen? We can't say for sure, because other random factors continually intervene. Pictures and quotes from the past, shown in the slide set above, demonstrate that we are not always very good at predicting accurately what our future will hold, but with current trends, we can see farther down the corridor of time than we have ever been able to see before. And we can see that technology is not slowing down, and neither is our thirst for small gadgets, smart objects and embedded technologies. Learning 3.0 will be the nexus, the meeting point of all these smart tools, where people will connect seamlessly with other people, objects and information, and learn when they need to, what they need to, where they need to. We will then be tapping into the combined intelligence of the entire globe, and that will be powerful beyond measure. At least, that is my vision for the future.

I hope you enjoy looking at the slides. I certainly enjoyed presenting them today, and discussing my ideas with delegates at ICL2012 in Villach, Austria.

Photo by Joaquim F. Silva

Creative Commons License
Smart learning by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License. Based on a work at steve-wheeler.blogspot.com.


September 27, 2012

In an Internet minute

The futurologist Ray Kurzweil once said that 'change is not linear, it is exponential.' Sociologist Alvin Toffler described three waves of evolution in our interaction with tools. The first two, the agricultural and industrial revolutions, laid the foundations for the larger Third Wave - the technology revolution. Prior to the technological wave, life changed relatively slowly and change was linear, but with the advent of new technologies, we are lurching from change to change without pause. The technological wave has changed everything in life, including the way we work and trade, learn, provide healthcare, entertain ourselves, conduct our relationships and interact. Arguably, old rubrics which described, but did not govern, the pace of technological change (see for example Metcalfe's Law or Moore's Law) may already be outmoded.
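Metcalfe's Law gives a feel for why network-driven change is non-linear: the number of possible connections between users grows roughly with the square of the number of users. A minimal sketch, purely illustrative:

```python
# Metcalfe's Law: n users can form n*(n-1)/2 possible pairwise
# connections, so the 'value' of a network grows far faster than
# the number of users does.

def metcalfe(n):
    """Possible pairwise connections among n users."""
    return n * (n - 1) // 2

for users in [10, 100, 1000]:
    print(f"{users:>5} users -> {metcalfe(users):,} possible connections")
# → 10 users give 45 connections; 1,000 users give 499,500
```

A hundredfold increase in users yields roughly a ten-thousandfold increase in possible connections, which is why commentators describe the growth of networked technology as exponential rather than linear.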

The infographic on this page illustrates the sheer volume of user generated content and user activity that occurs every 60 seconds somewhere on the Internet. In one minute there are over 2 million search queries on Google, 6 million Facebook views, over 200 million e-mails sent and 100,000 tweets. These staggering metrics are only the tip of the iceberg. We can expect to see exponential rises in all of these, and the emergence of new and more dynamic social media and communication systems. One of the most marked changes is the upsurge in the use of mobile technology, with 1,300 new mobile users every minute. Mobile phones, tablets and laptops are portable gateways into the Internet, and it is predicted that by 2015 there will be twice as many mobile devices on the planet as there are humans. This means that access to the Web will increase, and there will be a steep rise in Internet activity, probably beyond what we can imagine. This assumes that the growth rate will continue at its current pace, which of course it won't. It also assumes that people will use (and already are using) more than one device to access the Internet. I currently have three devices with me, and I'm using two of them to write this blog post.
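Scaling those per-minute figures up to a single day makes the point starkly. A quick back-of-envelope calculation, using the approximate numbers quoted above:

```python
# Back-of-envelope scaling of the per-minute Internet activity figures
# (approximate numbers as quoted in the post).
per_minute = {
    "Google searches": 2_000_000,
    "e-mails sent": 200_000_000,
    "tweets": 100_000,
    "new mobile users": 1_300,
}

minutes_per_day = 60 * 24  # 1,440 minutes in a day

for activity, count in per_minute.items():
    print(f"{activity}: {count * minutes_per_day:,} per day")
# e-mails alone come to 288,000,000,000 (288 billion) per day
```

Even holding the rate flat, with no growth at all, the daily totals run into the billions, which is the heart of the futureproofing challenge described below.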

Whichever way we look at this, we know beyond reasonable doubt that demands on the Internet will continue to rise. Are we prepared for this exponential rise in use? The biggest challenge facing us now is not how to use the Internet, but how to futureproof it so that we have enough bandwidth, capacity and storage space to hold all of the user generated content that is coming. Another challenge is to ensure that the Internet improves in terms of speed, security and usability.


Creative Commons License
In an Internet minute by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License. Based on a work at steve-wheeler.blogspot.com.


September 25, 2012

Live to learn

As I was walking into the university yesterday, a thought struck me. I asked myself the question: why do I work at the university? It certainly isn't because of the money. Firstly, I could earn a heck of a lot more in other industries. Secondly, I don't really need to work any more. I don't have a mortgage on the house now, and my children are grown up and independent (two of them have their own homes now). So why do I stay employed? The answer came clearly, as if from the skies above. I 'work' at the university because I love to learn, and going to the university, spending time with my colleagues and students, doing research and exploring all the many possibilities of my chosen subject of study, is incredibly rewarding. I realised that I don't go to university to 'work', but I'm very fortunate that I get paid for doing something I really love. I live to learn. And my university is the closest place to home I know where I am free to explore all the possibilities of that learning - to push the boundaries, try out new things, take risks and see how far I can go with learning new things.

This is also why (I realised, as I was walking to university) whenever I launch a new student course, module or programme, I always try to agree a contract with my students. I tell them 'I know you want to learn from me, but I would also like to learn from you.' At about that point I usually get some strange looks from some of them, but my students all 'get it' in the end. We make an agreement to learn from each other, because even the greatest minds on the planet don't know everything. The wisest minds on the planet are those who realise they actually know very little, and who set out to discover and explore, to fill some of the void. Have you ever stood at the edge of the ocean, or gazed up at the stars on a clear night, and felt so very, very small? That's the kind of awe we should feel when we consider learning.

Learning is lifelong and life-wide, but I didn't always know that. I believed the lie I was fed in school that learning stops when you leave formal education. It took me a while to discover that learning is actually only beginning when we leave school. Most people don't actually discover a passion for learning until they have entered a world of work. Tragically, many never discover a passion for learning at all. In a recent post I quoted Ashley Tan who said 'teachers teach, but educators reach'. For those of us who aspire to be educators, to reach beyond mere teaching, this has to be the line in the sand. Are we simply going to teach, or are we going to reach out to a lost generation of learners?

I know why I continue to work in education. Because I have fallen in love with learning. What about you?

Photo by Terry Robinson

Creative Commons License
Live to learn by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License. Based on a work at steve-wheeler.blogspot.com.

