
March 23, 2012

Create, connect, collaborate

With Pelecon (the 7th Plymouth Enhanced Learning Conference) only a few weeks away, preparations are almost complete. We are very excited about this year's event, with its lineup of world-class speakers. As many readers will already know, we rebranded the event this year. With a new logo and website, we are sure Pelecon will retain its reputation as a friendly and stimulating event that showcases advances in technology enhanced learning. We believe Pelecon provides an excellent meeting place for those involved in promoting the use of learning technology across all sectors of education and training. Set in an idyllic part of the South West of England, Pelecon is, we hope, now established on the annual international conference schedule. There is still plenty of time to register for the event, and a chance to hear firsthand the thoughts of leading thinkers in the field including Keri Facer, Alec Couros, Jane Hart, Simon Finch, Leigh Graves Wolf, Helen Keegan and David Mitchell.

Pelecon has some secret weapons. Our social media team of Oliver Quinlan, Edd Bolton and Jason Truscott have been hard at work behind the scenes, creating a host of social media channels to promote the pre-conference discussions and raise awareness of the event.

We set up a Pelecon Twitter account, which serves as a regular broadcast channel for all the latest news and views on the conference as we draw closer to the day. The account already has more than 200 followers, and is growing its reach daily.

Our Pelecon blog already hosts links to abstracts of all of the accepted papers for the conference this year, and an open invitation for anyone to post comments and questions to any of the speakers at the event. We want to encourage dialogue before the conference starts. The Pelecon blog will also feature regular posts from the team, including interviews with speakers, news and other updates as the conference progresses.

The Pelecon YouTube channel features videos of previous talks given by our invited conference speakers, and other associated content related to the event.

We have a Pelecon Lanyrd site where you can see all those attending, and where those who cannot attend can track the conference as it progresses.

The Pelecon Flickr site hosts images of previous conferences, and has a facility for delegates to add further content from their own personal photo collections related to the conference.

There is even a Pelecon official Facebook event page (but we won't talk too much about that...) and a LinkedIn page too!

The official hashtag for the conference is #pelc12 which is already being used in the run-up to the event. There are also session specific hashtags such as #TMpelecon which will be used during the Pelecon Teachmeet.

We think we have the social media angle well covered for the conference, but if you think we are missing a trick, we would like to hear from you.

Creative Commons License
Create, connect, collaborate by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at steve-wheeler.blogspot.com.


March 19, 2012

Bring your own

Mobile learning is on the rise. It was inevitable that the mobile phone would be brought into the classroom, with or without 'permission'. Many children use their mobile phones in class even though school rules forbid them to do so. What would encourage schools to sanction the use of personal devices?

There has been a lot of discussion recently about Bring Your Own Device (BYOD) in schools, and two camps are forming. On one side are those who believe that children should not be permitted to use their own devices in school, because mobile phones are distracting, can cause behaviour management issues, and can lead to serious problems such as cyberbullying and sexting. There are also teachers who fear that allowing children to bring their own devices will amplify the socio-economic digital divide - a kind of Bring Your Own Divide. Some children will have the latest, expensive devices while others from less affluent families will have cheaper, less capable devices, or none at all. Concerns have also been voiced about liability and the potential loss, theft or damage of devices while children are inside the boundaries of the school.

On the other side, there are teachers who believe that allowing children to bring their own devices into school will liberate learning. Supporters of BYOD argue that allowing students to use their own devices, with which they are familiar, will give them a head-start where they don't need to learn to use a tool before learning through it. Children already use their mobile devices for a large variety of social purposes, including networking with their friends, accessing peer-related information and sharing content (images, links, status updates). The argument is that it would be natural for children to use their devices for learning in formalised settings. Teachers who support BYOD argue that children will feel more comfortable using their own devices, that BYOD will teach children to take more responsibility for their actions, and that policing their use should not be problematic.

This is a simplified version of what is shaping up to be a complex debate, but there is a strong case for both sides of the argument. There are of course many grey areas too. Some teachers have no strong views about BYOD, but for those who are actually implementing BYOD in the classroom, there are claims of positive outcomes.

In a post at the end of 2011 I reported on my visit to Albany Senior High School in Auckland, New Zealand, who have been supporting a school wide BYOD scheme for some time. To get around the problem of the perceived 'digital divide' the school also provides laptops and other tools for children who don't have their own personal device. They have also discovered that giving children the responsibility to manage their own learning through their own devices has largely eliminated behavioural problems. Children cherish the freedom to use their own devices, don't wish to run the risk of losing their privilege, and therefore take the responsibility to keep within the school rules seriously.

What are your views on the debate? Do you know of schools that have successfully implemented school-wide BYOD?  Do you have stories of BYOD failure?

Image by Freefoto

Creative Commons License
Bring your own by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at steve-wheeler.blogspot.com.


March 16, 2012

Web 2.0 culture

In previous posts I argued that as teachers, we should be prepared to give our content away for free. There are two reasons for this. The first is to benefit those learners worldwide who wish to learn from you and need to see your content. The second is so you can reap the exponential rewards the social web offers. In Giving it all away I showed how offering free online access to your ideas and works actually increases your audience size. Licensing your content under a Creative Commons agreement that allows for repurposing or remixing provides an opportunity and an invitation for others to translate your slides or blogposts into another language. Several of my posts and slideshows have been translated into Spanish, which opens up vast new audiences in South America with whom I can share my ideas, at no extra effort.

Look at the photograph. There were several images I could have used to illustrate this post, but all were protected by a copyright licence. By locking their work away, those photographers lose the opportunity for it to be amplified to a larger audience. The image I chose was licensed for free use and remix with attribution, so Noel Hidalgo gets the prize and receives a larger audience for his fabulous picture. But the ethos of sharing on the social web goes deeper than the act of sharing content. It is also the adoption of a new mindset and a new culture for many professionals - the culture of Web 2.0. By way of explanation, here's an adapted extract from a book I published a couple of years ago:

The introduction of wikis into conservative environments such as classrooms requires all participants to adopt a new culture - one of co-operation and sharing. When they understand they can actually create and share content on a global stage, students can be both excited and daunted. Many of those who welcome the experience are probably in some way already connected into the culture of Web 2.0 and will probably already have accounts on social networking sites such as Facebook. They may be familiar with other media sharing sites such as YouTube or Flickr, and aware of the protocols that are active within these micro-cultures.


Those who are reluctant to share or co-operate, or anxious in some way about posting their content on the web for all to see, may need to work a little harder to assimilate the culture of Web 2.0. It is only later, when they are more immersed in the Web 2.0 culture and have begun to develop the specialist digital literacies which gain them full access to it, that these students begin to understand the power and potential of sharing, co-operation and collaboration. Some never make the transition, and steadfastly refuse to allow their work to be edited by others, preferring instead to protect their ideas and maintain sole ownership over their content.

Canadian academic Brian Lamb once declared that during times of economic challenge, when so many people need access to learning, it seems perverse to hoard knowledge in any form. And yet, in schools, colleges and universities around the globe, there are many teachers and academics who jealously guard their content, as if by doing so they will benefit in some way from their protectionism. They may receive some financial reward, but will they have the satisfaction of knowing that in some way they have also helped other people, without cost? I have a message for such professionals. Change your mind. Choose to share your content openly and freely - it is only through giving it away that you will begin to reap the full rewards of the Social Web. Knowledge is like love. You can give as much away as you like, but you still get to keep it.

Adapted from Wheeler S. (Ed: 2009) Connected Minds, Emerging Cultures: Cybercultures in Online Learning. Charlotte, NC: Information Age. (p. 9).

Image by Noel Hidalgo

Creative Commons License
Web 2.0 culture by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at steve-wheeler.blogspot.com.


March 14, 2012

Shock of the new

Teachers in many schools will tell you they are running hard just to stand still as they attempt to adopt new technologies for learning.  It is a real struggle to keep up with the rapid pace of change that defines the digital age. Often, this is a bewildering process, and one which many teachers try to avoid. And yet, with a clear framework or roadmap for adoption, many of the challenges of adopting new technology can be met, and many of the fears teachers have can be assuaged.

Mandinach and Cline (1994) identified four distinct phases of adoption of new technology in schools. In the first phase, known as survival, teachers struggle to define what they wish to achieve with the new technology, and attempt to learn how to use it effectively to support pedagogy. Often, schools make the mistake of purchasing new technology before they have fully considered the reasons they need it. This suggests that the survival phase could be shortened if forethought went into the design of learning, before technology was procured.

The second phase of adoption is known as mastery, and involves teachers moving beyond the survival phase and into a phase where they start to apply the technology to meaningful and authentic learning contexts. During this phase, the technology should become transparent to the users - that is, it should begin to be used without significant cognitive energy.

The third phase, impact, is evaluative, and requires users to appraise the extent to which the technology is being effective. It also involves an assessment of how well teachers and learners are coping with any new issues or challenges that may have arisen during the implementation of the new tools.

The final phase, referred to as innovation, is where teachers have developed enough expertise to begin experimenting with new and innovative ways to use the technology. This can be a particularly creative phase, and often gives rise to the incorporation of even newer technologies, or the development of new pedagogical techniques. Venezky (2004) suggested that this final phase is recognisable by the number of restructured learning activities that occur within the classroom and the extent to which these enhance or extend best practice. Schools that are in the fourth phase of adoption are generally staffed by teachers who feel free to adapt technology to their own particular styles of teaching.

[Adapted from John, P. D. and Wheeler, S. (2009) The Digital Classroom: Harnessing technology for the future. Abingdon: Routledge/David Fulton. (p 99).]

References
Mandinach, E. B. and Cline, H. F. (1994) Classroom dynamics: Implementing a technology based learning environment. Hillsdale, NJ: Lawrence Erlbaum Associates.
Venezky, R. L. (2004) Technology in the classroom: Steps toward a new vision. Education, Communication and Information, 4 (1), 3-21.

Image by David Wright

Creative Commons License
Shock of the new by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at steve-wheeler.blogspot.com.


March 13, 2012

New ideas in a digital age

How are new ideas spread through society? Before the digital age, we had mass media to do the job - TV, radio and newspapers - and prior to that, more primitive technologies were employed to spread news. But these media were used to spread ideas, news and views that were often sanctioned by the broadcasting channel or publisher. We can go right back to the oral cultures where stories were told to preserve cultural values and tribal history from generation to generation. This kind of transmission of ideas was less filtered and more closely aligned to the culture it was aimed at. Today we are more and more reliant on social media channels to access ideas and news. Because this content is often crowd-sourced and relatively unfiltered, it is arguably closer to the oral culture of ideas transmission than the mass media that dominated during the last century. One of the best explanatory models I have ever seen of the diffusion of ideas was devised by the American sociologist Everett Rogers. In his famous 1962 model (in the figure below), Rogers synthesised the work of over 500 published innovation studies, and identified five phases of innovation diffusion, which are represented in the model as adoption types.

An interesting feature of the model is a gap or chasm between the early adopters and the early majority, which has been referred to as the 'bowling alley'. This concept was elaborated further by Geoffrey A. Moore in his 1991 book 'Crossing the Chasm'. For some ideas, this chasm can be difficult to bridge, but it must be bridged if the idea is to achieve critical mass and penetrate sufficiently into the collective consciousness of the target society or community. This means that enough people have to subscribe to the idea before it becomes acceptable and desirable for the majority of that society. In most cases, 16 per cent is simply not enough. This model is a useful explanatory framework not only for ideas, but also for new technologies.
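As an aside, the familiar adopter percentages in Rogers' model (roughly 2.5, 13.5, 34, 34 and 16 per cent) are not arbitrary: they come from slicing a normal adoption curve at one and two standard deviations from the mean, and the 16 per cent mentioned above is the 'early market' of innovators plus early adopters. A minimal Python sketch of the arithmetic (standard library only; the category names follow Rogers):

```python
from statistics import NormalDist

nd = NormalDist()  # standard normal adoption curve

# Rogers defines his adopter categories by standard-deviation cutoffs:
# innovators beyond -2 SD, early adopters between -2 and -1 SD,
# early and late majority one SD either side of the mean,
# laggards beyond +1 SD.
shares = {
    "innovators":     nd.cdf(-2),
    "early adopters": nd.cdf(-1) - nd.cdf(-2),
    "early majority": nd.cdf(0) - nd.cdf(-1),
    "late majority":  nd.cdf(1) - nd.cdf(0),
    "laggards":       1 - nd.cdf(1),
}
for group, share in shares.items():
    print(f"{group:>14}: {share:5.1%}")

# The critical-mass threshold discussed above is the early market:
early_market = shares["innovators"] + shares["early adopters"]
print(f"  early market: {early_market:5.1%}")  # roughly 16 per cent
```

The exact cutoff values come out slightly under Rogers' rounded figures (for example 2.3 rather than 2.5 per cent for the innovators), which is simply his rounding for presentation.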

If we apply Rogers' model to technology in schools it follows that newer technologies such as tablets, games consoles or 3D televisions need to be purchased by enough schools for manufacturers to earn enough income to establish scalability of production, hopefully lowering their prices in the process. Another social effect is the self-help user groups that spring up to support the product and its application.

Image source 

In the digital age however, many have questioned whether Rogers' model still has any traction. One critic of the model is Rudi Dornbusch, who argues that change does not always occur along the trajectory that Rogers describes. Why might this be? In a time where the power and reach of mass media is beginning to ebb, and instantaneous global communication is now possible; and where individuals have the power to engage immense audiences through handheld tools, does the model still hold any significance?

Several years ago I published an edited volume entitled Transforming Primary ICT (Wheeler, Ed: 2005). In the opening chapter I attempted to provide a 21st Century contextualisation of Rogers' model. Essentially, I reasoned that the categories of Rogers' innovation adoption model could be reframed to enable a better understanding of how people adopt new technologies in the fast moving and hype-ridden age of disposable devices.

We know that the innovators identified in Rogers' model are those who generally adopt new ideas with little difficulty. Some may stand waiting outside a store for hours before the doors open, so they can be the first to own a new device the moment it is released for sale. For the digital age, I thought of this group as 'techno-romantics' because many who fall into the innovator zone tend to see technology as 'the answer'.

The next group - the early adopters - are often opinion leaders within a community, and in this position of respect, they can influence behaviour. They are a little more pragmatic in their outlook, and tend to buy into a new idea or technology when they see its momentum growing. They may also be 'technophiles', in that they have an affinity with new technology and perceive no particular threat to their way of working, but rather embrace it as a means to enhance or extend their practice.

The early majority are the 'techno-realists' - people who deliberate over their decisions about purchasing technology and who carefully watch what the technophiles do, before eventually buying into the trend. By the time this section of society adopts the new idea, prices have already begun to fall due to manufacturing economies of scale, and at this point in the lifecycle, version two has probably been released. At this time, the new technology is no longer seen as a fad, or a gimmick, and has probably earned a certain amount of kudos as a desirable device or tool to use.

By the time the late majority have adopted the technology it is no longer new. The late majority are the 'techno-sceptics' who prefer to remain at the periphery of innovation, and only buy into a new device when it has been long established, and there is evidence of good use, and a large enough support network.

The final group, which Rogers calls the 'laggards', comprises those who never, or only rarely, adopt a new idea or technology. They are in digital terms 'techno-luddites', and in Rogers' terms, this group tends to have no opinion leaders within its ranks, but if provoked or threatened by the new tools, may actually take some form of negative action. According to Venezky (2004) this model can explain the adoption of ICT in schools, and holds true in many countries. Although Rogers' model of the diffusion of ideas is now five decades old, I believe it still has a place in our understanding of how technology is adopted.

References

Moore, G. A. (1991) Crossing the Chasm: Marketing and selling high-tech products to mainstream customers. New York: Harper Business Essentials.
Rogers, E. (1962) Diffusion of Innovations. Glencoe: Free Press.
Venezky, R. L. (2004) Technology in the classroom: Steps toward a new vision. Education, Communication and Information, 4 (1), 3-21.
Wheeler, S. (Ed: 2005) Transforming Primary ICT. Exeter: Learning Matters.

Main image source

Creative Commons License
New ideas in a digital age by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at steve-wheeler.blogspot.com.


March 11, 2012

Radio Waves

I have just returned from a very enjoyable few days up in Leicester where I spoke at the NAACE Annual Conference. The event was well organised, and we were all very well looked after. I had the pleasure of meeting up again with several old friends, and was pleased to make so many more new friends during the three days of the event, which was held at the Marriott Hotel. Having dinner with former Education Secretary Charles Clarke and hearing his inspirational after dinner speech was a significant highlight of the conference, as were the many encounters with truly knowledgeable and passionate educators from across all sectors of education. Other significant presentations came from the likes of David Mitchell, Stephen Breslin and Mick Waters, all three of whom presented in the same plenary session on the final day of the conference.

Leon Cych approached me prior to my keynote to request an interview about the content of my presentation and I was happy to oblige. The link to the short 3 minute audio interview is below:

Interview with Steve Wheeler at #Naace12 (mp3)

After my keynote, I was interviewed again, this time by an extraordinary young man. Sixteen-year-old Lewis Phillips is a student at Inverkeithing High School in Fife, Scotland, where he helps to run the RadioWaves internet radio station. Many of the interviews conducted by the RadioWaves team of school children during the conference can be found on this website. My video interview can be accessed from this link.

Image source

Creative Commons License
Radio Waves by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at steve-wheeler.blogspot.com.


March 09, 2012

Who let the blogs out?

Who, who, who who? Yes. Exactly. I have unleashed my blog.

It was about time. For all these years I have been focusing mainly on content. It was substance over style. Focusing solely on content at the expense of context is a mistake. In my previous blog posts I discussed the age old debate about the tension between the two, but I have come to the conclusion that content and context are not a binary. They are dependent upon each other, and need to be balanced. So I have balanced the two here on this new look blog, I hope.

But context is still vitally important. In education, if all learners receive is content, content, content, then they will be... well, discontent. They will feel overwhelmed, hemmed in by the continuous onslaught. Students need to be given some time to reflect, digest, ask the 'what if?' type questions. They need context for the content they have been given. All too often in formalised education settings, there is no time built into the programme to do this, because curriculum comes first. But we need to challenge this. We need to start asking the questions that will cause our leaders to stop and rethink the constraints they are imposing upon the teaching profession. Teachers are doing their best, but with the best will in the world, how are they going to inspire young people to get excited about learning, if they have no time themselves to teach creatively? We need to ask what exactly are schools for? Why are there so many subjects covered in the curriculum? Why is so much time spent on testing, and so little spent on the development of critical skills, creativity, experimentation?

So I gave my blog a makeover a few days ago. I invoked one of the new templates that Blogger has just started to offer its users. You can see the difference it has made. I have unleashed my blog, and now it is free to make as much of an impression on my readers as they ask of it. I think it's a cleaner context, a more open and accessible format for the content to sit within. Many others have already agreed, and interestingly, my blog traffic has almost doubled. I'm not claiming that this is solely because I have changed the context, the format of my blog. But it seems strange that in the last two days, all I have done is alter the look and feel of the wrapper, and have added no new content. Yet, in the last two days I have received over 10,000 views, up from the normal 2,500-3,000 views per day I would usually get during the week. For me, this is at least an indication of the power of context. It holds the content, and presents it in a manner that is more accessible, easier to explore and more dynamic. Can we do the same with school content within the given constraints? Success will rely on the tenacity, determination and inventiveness of creative teachers, but as I have always said, teachers are the best society has to offer, and somehow we will find ways to do it. Doctors save lives, but teachers make lives. Let's unleash the content.

Image source

Creative Commons License
Who let the blogs out? by Steve Wheeler is licensed under a Creative Commons Attribution 3.0 Unported License.
Based on a work at steve-wheeler.blogspot.com.


March 06, 2012

...context is king

In yesterday's post I made the statement that the internet is better as a creative space than it is as a repository. Let me clarify this statement. Much of the content on the pre-social web was difficult to edit or change. Web 1.0 - sometimes referred to as the 'sticky web' - was largely controlled by webmasters and corporations, and was used mainly as a broadcast channel to promote ideas and products. The advent of Web 2.0 participatory tools and services, such as social media and social networks, voting and filtering tools and personalised spaces, provided users with the ability to be directly involved in the creation of web content. Media sharing sites such as Flickr, YouTube and a variety of podcasting services offered users the capability to go beyond the repository mentality of earlier web iterations and host their own TV and radio channels, while blogs enabled them to publish their own newspapers. The web had become a place where people could create and share their ideas on a global stage.

The concept of the digital repository - a collection of useful artefacts aggregated online in an accessible form - is a good one, but the idea loses traction if users cannot interact with the content and hold relevant conversations about it. Increasingly, users also want to repurpose and remix content they find, something which the old style repositories did not allow. The introduction of copyright workarounds such as Creative Commons has given web users the capability to use content in new and creative ways, thereby extending the capabilities, reach and scalability of the content beyond the original intentions of its creator.

As I argued yesterday however, content is no longer the driving force of the web, and should not be viewed in isolation. The context within which the content is situated should also be focused upon as an important component of any analysis of web based learning activity. Content can have two completely different meanings (or functions) if seen in two different contexts. Writing about assessment methods on a teacher discussion site would probably be well received, and users would no doubt engage with any ensuing conversations. Posting the same article up on a site frequented by accountants would be stupid. Unless of course the assessment you were talking about was tax assessment.

I was joking yesterday when I tweeted that my post 'Content is a tyrant' had received over 500 hits in just a few hours, and perhaps the reason was because I used a pretty picture. I was of course being ironic, because it's debatable whether the picture is content or context. For me, the images featured in my blogposts are better seen as contextual, in that they frame the content and provide additional meaning. Photographs are indeed content, but in this setting, they serve as signposts and illustrations to situate the content. We also need to be aware that the value of such content, any content, is subjective and can be interpreted any way the reader wishes.

More to follow on these thoughts in a future post...

Image from Fotopedia (can you see what it is yet?)


Creative Commons Licence
...context is king by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


March 05, 2012

Content is a tyrant...

Never before has gaining access to information been so easy. The imminent arrival of widespread 4G broadband and LTE (Long Term Evolution) will usher in wider availability of information and push even more data to our mobile devices over the same amount of radio spectrum. At least, that is the plan. Better coverage and faster download/upload speeds would ensure that just about everyone who is connected would have even greater access to online content and services anytime, anywhere. But in adopting these communication advances, are we also opening the door to a deluge of content? Are we not already swamped by a tsunami of content?

In 1996 Microsoft's Bill Gates claimed that 'content is king'. Those who are hot on history will recall that it was around this time that the internet first started to enter the collective consciousness. The mid-1990s was an interesting time. Microsoft dominated the computer software market, and Google wasn't yet conceived (Brin and Page didn't launch Google until September 1998). In 1996, most pre-Google online searching was done using Yahoo! (a company founded by Jerry Yang and David Filo in January 1994) and I remember using Pegasus e-mail, and browsing the web using Netscape Navigator. Mobile telephones were a lot larger than they are today, and quite expensive to buy and use. When people talked about ‘smart phones’ they were referring to the design and appearance of the device, not its capability. Looking back on that embryonic period of telecommunication, and considering the sophisticated tools and services we now have at our disposal, and can use without a second thought, does the statement made about content by Bill Gates still stand?

The reasoning behind the Gates statement is that content is what drives the web. So for example, if a blog constantly publishes good content, the theory is that people will keep coming back to read more. The medium itself is not as important as the content it holds.

The Canadian media theorist Marshall McLuhan had a somewhat different take on media. His famous statement, made in pre-internet times, was that 'the medium is the message' (or indeed the massage). Put simply, McLuhan was more interested in the characteristics of the medium that conveyed the content to the user. In 1983, Richard E. Clark argued that media and technologies were 'mere vehicles' that delivered content to users in much the same way that delivery vans bring goods. His argument was that all media and technologies are neutral, and that the user imposes their own interpretations upon them. His view was that media do not influence learning any more than the delivery van influences diet. While Clark held the view that media do not influence learning, Robert Kozma countered by arguing that specific media do possess certain characteristics that suit particular types of learning activity. Kozma made the statement: 'If we move from "Do media influence learning?" to "In what ways can we use the capabilities of media to influence learning for particular students, tasks, and situations?" we will both advance the development of our field and contribute to the improvement of teaching and learning.'

In essence, Kozma and McLuhan both believed that context (i.e. the tools, the media), were at least as important as the content they delivered, whilst Clark agreed with Gates that the content was king. Increasingly, in today's digital age, many of us are following Clark’s perspective, focusing on content, without paying much attention to the tools we use to make sense of it. In some ways, this is a natural progression, because tools and technologies are becoming more transparent and easy to use without too much thought. Yet in focusing on the content, as McLuhan warned, we may miss the entire message. Highly digitally literate individuals are able to communicate effectively across several platforms without loss of power or nuance. This is known as 'transliteracy', a sophisticated grasp of the affordances of the media and technologies that is becoming the passport to success for today's digital learner and scholar. Transliteracy goes beyond content, and exploits the power and potential of many different tools and services, giving the user an edge over content, enabling them to connect, communicate, consume, create and collaborate more effectively.

Access to information is one thing. But information should not be confused with knowledge. Knowledge comes about through learning and through the diligent application of information. Anyone who is interested in learning will also be interested in cognition and its relationship to knowledge. A popular recent theory is that cognition does not exclusively occur inside the head, but is also increasingly reliant on tools and other people. This theory represents a distributed form of cognition that is highly resonant in the age of ubiquitous and personal connections. David Jonassen talked about using computers and the internet as 'mind tools' - extensions of our cognitive ability and mental space which have the potential to advance personal learning beyond the constraints of normal boundaries and spaces. This mind tool effect can be observed today in large social networks and across distributed communities of practice, and might be explained through connectivist theory which holds that we now store our knowledge more with our friends than we do in any physical repository.

Yet connecting into a community of practice can be a double-edged sword. Although membership of an online network of interest (or community of practice) brings many benefits and rewards, it also has the potential to swamp individuals with content, because every active community member is generating, sharing and recommending content. The larger the community network, the more content is likely to be made available. The experience has been likened to taking a drink from a fire hydrant. Enter any term into a search engine and you are likely to receive millions of hits in return. The veritable tsunami of content that assails us can make us feel as though we are drowning in a sea of information. Content has become a tyrant, and although there are many tools to help us moderate and filter this content, not everyone knows how to use them effectively.

One final thought: the internet is better as a creative space than it is as a repository. This is due in no small part to the gradual evolution of so-called Web 2.0 tools and services, the majority of which are richly social and participatory in nature. The capability of social networks to connect people with similar interests from across the globe also heightens the need to create, organise, share and consume content within appropriate contexts. As a society, and within our communities of practice, we need to be able to discern the good content from the bad.

Next time: Context is king

Image from Fotopedia


Creative Commons Licence
Content is a tyrant... by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


March 02, 2012

Library 2.0

What does the future hold for our university libraries? Are they obsolete or are they essential? The library has long been seen by many as a very traditional, conservative institution, and is often portrayed as a place where rows upon rows of antiquated bookshelves slowly gather dust. Yet a visit to a university library today will reveal a substantial investment in technology to streamline research and provide users with a more seamless and rewarding experience.

Just how are libraries adapting to the digital age and all it brings? In the past they have been a pivotal part of university life. They are not simply a repository of books and learning resources, although many may see them as just that. If all libraries did was store and loan out books, their doors would have closed years ago. The digital age would have put paid to them. In an era where digital media holds sway, and where online stores such as Amazon announce they are now selling more Kindle and e-versions of books than paper versions, what will be the future for the university library? What changes are they making that bring them into the digital age, and enable them to compete with current advances in technology?

Firstly, libraries offer specialised search services which go beyond the simple searches you can perform on Google or other search engines. Publications such as Kelly et al.'s Library 2.0 indicate a trend away from traditional repository approaches towards a more distributed range of digital services for staff and students, with particular emphasis on the tools students are already familiar with - Web 2.0 social media.

Secondly, as Ian Clarke (2010) suggests in his Guardian article, we still need libraries because they inform users about best practice in the use of search tools and the promotion of better digital literacies. Clarke also shows how libraries can bridge the digital divide, arguing: "Libraries are a bridge between the information-rich and the information-poor. They need reinforcing, not dismantling. We need to continue to provide a highly skilled service that is able to meet the needs of the general public." He warns though, that libraries must continue to innovate and keep pace with the changes fomented by digital media, because without the services they offer, we would run the risk of living in an ill-informed society. It's not difficult to see that this perspective is influenced by the rise in informal learning, but those who are engaged in formal education also rely on centralised library services.

The College Online website provides an excellent list of reasons why librarians are not obsolete. It includes arguments about the changing roles of librarians, but in essence focuses on practicalities. One reason offered is that not everything is on the Internet. Whilst this is still a reasonable argument to make at present, we can speculate that it may not always hold. How long will it take to digitise everything so that it becomes available online? The advent of Google Books, Amazon's Look Inside feature and other similar services offers potential readers a preview of the insides of books and other artefacts. Although the entire book may only be readable on purchase, it may not be long before the open access movement gains enough ground to facilitate the digitisation of everything - for free. Some authors and publishers will resist the open movement, but if they do, they are likely to find themselves marginalised from the literary world and on the periphery of the global reading experience. The digitisation of difficult-to-find materials is sensible and sustainable. Readers can now access a great many historical maps, genealogical records or rare volumes without leaving their armchairs. But there is still a great deal to achieve in the grand plan to digitise everything, and there are those who are opposed to the very idea.

More convincing is the argument that library attendance isn't falling, it's just migrating to virtual attendance. By this, the writer is arguing that more users are deciding to access library services online, and with more university libraries digitising their content and services, this seems to be a rising trend. If so, what becomes of the physical library space in the future? This is a question each library must answer in its own way, because each library is different. Will some for example, begin to repurpose their spaces to provide different services? Will some create culturally and socially rich environments which will attract users back into the physical space? Or will they instead downscale their physical footprint to enable the funding of other digital services that require less groundspace?

In a future blogpost, I will report back on the librarians' perspective on these and other related questions. I will also present a keynote speech at the Chartered Institute of Library and Information Professionals of Scotland (CILIPS) conference in Dundee on June 11 - where I will elaborate on this discussion.

References

Clarke, I. (2010) Why we still need libraries in the digital age. The Guardian, 13 July. Available online at: http://www.guardian.co.uk/commentisfree/2010/jul/13/internet-age-still-need-libraries (Accessed 2 March, 2012)

Kelly, B., Bevan, P., Akerman, R., Alcock, J. and Fraser, J. (2009) Library 2.0: balancing the risks and benefits to maximise the dividends. Program: Electronic Library and Information Systems, 43 (3), 311-327. Available online at: http://opus.bath.ac.uk/15260/ (Accessed 2 March, 2012)

Picture by Steve Wheeler (Victoria State Library, Melbourne, Australia)


Library 2.0 by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 29, 2012

Everyone's a critic

Everyone's a critic, they say.

Until it comes to academic writing, that is. Many students fail to realise their full potential when it comes to essay writing, usually because they can't seem to find their way out of the descriptive cul-de-sac they make for themselves. If they could only find it within themselves to write critically, they would earn higher grades. So why do some find critical writing such a problem?

Firstly, knowing your field of study is an important factor in academic writing, and some students simply don't trawl deeply enough. If you know one theory, but are unaware that it has been challenged by another theory, you only have half the story, and you then find yourself on the periphery of the discourse. Knowing the weaknesses of a particular theory will only come from gaining an insight into how that theory came about, and understanding how it can be applied in particular contexts. So to be able to write critically, you need to have read around your subject - you need to have seen the 'big picture'.

Knowledge of your field is not enough, though. Critical thinking is crucial to the process. You can think without writing, but you cannot write without thinking. It follows that critical writing comes from critical thinking: if you can't think critically, you won't be able to write critically. Students need to learn to think in a particular mode to be able to do this. One of the top tips I can give to anyone who wishes to write critically is first to think critically about what they are reading, and learn to ask questions of the text. It is a kind of conversation the reader has with the author. The best questions to ask are ones such as 'How does this writer justify what s/he is saying?' or 'What support does this writer have for their ideas?' You may like to dig deeper and find out how their evidence was obtained. Were the data obtained from a particular sample, and were they biased or contrived in some way? Is the writer being totally objective, or is there some hidden agenda in there?

Academic writing has the capability to generate a great deal of angst. For example, students often get hung up on whether they should be using the personal pronoun in their essays and projects. My view is that there is nothing wrong with it, provided the writer is not expressing their own unsupported opinion. Writing 'I reflected upon this experience and subsequently adjusted my professional practice...' is justifiable, but simply writing 'I believe that ....' is not enough.

In his blog post 5 ways to develop critical thinking in ICT, Terry Freedman offers some great advice on how teachers can probe understanding by repeatedly asking 'why?' or 'how do you know that?' If students can do this during the process of writing up their assignments, many of their descriptive, lacklustre passages could be transformed into dynamic, critical, reflective and analytical pieces of writing. One aspect of marking assignments I find particularly unpalatable is when students churn out the same old bland writing which merely represents what has been covered in the module, and not what they have learnt and critically applied to their practice.

Another pet hate of mine is disjointed essay writing. Some students seem to think that they will impress the marker if they pepper their writing with copious direct quotations from the set reading lists. All they end up achieving is a series of unconnected quotations with no particular thread of reasoning running through them. Better by far is the art of paraphrasing key points from published authors and then applying these to support an argument that you are developing. Better still is the ability to counterpoise these paraphrased elements to form a finely balanced discussion that shows you have thought deeply about all the perspectives associated with your argument, and can organise them logically. Whichever way you examine essay writing, though, it all tends to come back to the ability to think critically.

Probably the best form of critical thinking emerges from dialogue within the community of practice. Carr and Kemmis (1997) highlighted the importance of dialectical thinking. Drawing on Hegel's philosophy, Carr and Kemmis propose that the tension between opposing perspectives, where opponents take the stances of 'thesis' and 'antithesis', usually results in some kind of 'synthesis' of ideas. Although this can often be a compromise between the two opposing perspectives, more often than not it is also a merging of the strengths of both arguments to form an even stronger, newer thesis. The development of such world views is the basis of all critical learning, and requires the student to be open to new ideas, and open to being challenged in their own beliefs, values and thought processes.

Reference
Carr, W. and Kemmis, S. (1997) Becoming Critical: Education, Knowledge and Action Research. London: Falmer Press.

Image by Enrique Sanabria


Everyone's a critic by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 28, 2012

On Aer

I spent two great days in Ireland this weekend, and would like to thank my hosts at the Computer Education Society of Ireland for inviting me. It's a beautiful country, the culture is rich and the people are all so friendly. Ahead of my keynote speech at the CESI 2012 conference in Portlaoise, I managed to do an interview for Dublin City FM 103.2. The programme Inside Education, presented by Seán Delaney, is a regular radio show and podcast offering an Irish perspective on news and stories from the world of education. We sat outside in the early spring sunshine of Portlaoise and discussed blogging, social media, the state of education, ideal schools of the future, innovation and technology, and a whole host of related topics.

I emphasised the importance of blogging as a means of teacher professional development and best uses of technology in education (social media, interactive whiteboards, VLEs, videoconferencing, iPads), and we discussed choice and adoption of new technologies in education. It was a wide ranging interview in which we also explored the use of Twitter as a rich communication backchannel and social networking media, discussed personal learning networks and communities of practice, and talked at length about the idea of classrooms without walls, BYOD and open educational resources. We also touched on youth culture, txt speak and digital literacies. The most important thing, I emphasised, is for teachers to consider the pedagogy, the potential learning gain and the student experience before they decide on the purchase of any technology.

Seán was a very good interviewer because he listened to what I had to say and then followed up my statements with useful questions that delved deeper into my ideas. The programme was broadcast on Sunday evening, and the podcast featuring the first part of the interview can be found here. The interview lasts approximately 20 minutes - listen carefully and you can actually hear the crows in the background! - do have a listen.

Image source

On Aer by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 22, 2012

The Commons touch

Many people assume that because the web is open, any and all content is open for copying and reuse. It is not. Use some content and you could well be breaking copyright law. Many sites host copyrighted material, and many people are confused about what they can reuse or copy. My advice is this: assume that all content is copyrighted unless otherwise indicated.

In the last few years, the introduction of Creative Commons licensing has ensured that a lot of web-based content is now open for reuse, repurposing and even commercial use. The Stanford University law professor Lawrence Lessig is one of the prime movers behind this initiative. Essentially, Creative Commons has established a set of licences that enable content creators to waive their right to receive royalties or other payment for their work. Many are sharing their content for free, in the hope that if others find it useful, they will feel free to take it and use it. Creative Commons is a significant part of the Copyleft movement, which seeks to use aspects of international copyright law to offer the right to distribute copies and modified versions of a work for free, as long as the work is attributed to its creator. Any derivative versions of the work must also be made available under identical conditions. In keeping with similar open access agreements, Copyleft promotes four freedoms:

Freedom 0 – the freedom to use the work,
Freedom 1 – the freedom to study the work,
Freedom 2 – the freedom to copy and share the work with others,
Freedom 3 – the freedom to modify the work, and the freedom to distribute modified and therefore derivative works.

Finding free-for-use images on the web is now fairly easy. A normal search will unearth lots of images, but these are not necessarily free ones; many will carry copyright restrictions. To find the free content, go to Google and click on the cog icon at the top right of the screen. Select the Advanced Search option. Next, scroll down the screen until you find the drop-down box labelled 'usage rights'. You will be presented with four options:

Free to use or share
Free to use or share, even commercially
Free to use, share or modify
Free to use, share or modify, even commercially

Whichever option you choose, you will be presented with a reduced collection of images that still meet the requirements of the search, but under the conditions of that specific licence. Now you have a collection of images you can use under the terms of Creative Commons. Use them under these agreements and you are complying with international copyright law. Don't forget to attribute the source!
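As a rough illustration of how those four search options map onto the standard Creative Commons licence families, here is a small Python sketch. The licence conditions encoded below follow the published CC definitions (NC forbids commercial use, ND forbids modification), but the table and helper function are hypothetical and purely illustrative:

```python
# Conditions attached to the main CC licence families. All permit
# sharing with attribution; NC restricts commercial use, ND restricts
# modification. (Standard CC terms; the dict itself is illustrative.)
CC_LICENCES = {
    "CC BY":       {"share": True, "modify": True,  "commercial": True},
    "CC BY-SA":    {"share": True, "modify": True,  "commercial": True},
    "CC BY-ND":    {"share": True, "modify": False, "commercial": True},
    "CC BY-NC":    {"share": True, "modify": True,  "commercial": False},
    "CC BY-NC-SA": {"share": True, "modify": True,  "commercial": False},
    "CC BY-NC-ND": {"share": True, "modify": False, "commercial": False},
}

def licences_permitting(modify=False, commercial=False):
    """Return the licence families compatible with an intended use."""
    return sorted(
        name for name, terms in CC_LICENCES.items()
        if terms["share"]
        and (not modify or terms["modify"])
        and (not commercial or terms["commercial"])
    )

# The strictest search option, 'Free to use, share or modify, even
# commercially', corresponds to:
print(licences_permitting(modify=True, commercial=True))
# → ['CC BY', 'CC BY-SA']
```

The point of the sketch is simply that each step down the list of search options excludes the licence families whose NC or ND conditions would forbid the intended use.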

So why would people wish to give away their content for nothing? I have previously written about my own personal and professional reasons for doing so in 'Giving it all away', but just for the record, I will summarise:

Giving away your content for free under a CC licence ensures that anyone who is interested in your work does not have to pay for it, or worry about whether copyright law permits them to use it. In today's uncertain economic climate, it makes sense to be equitable and to give away content that others need and can make good use of. It also means that users will do some of your dissemination for you. Your ideas will spread farther if you give them away for free than they will if you ask people to pay a copyright fee or royalty. If you allow repurposing of your content, the rewards can be even greater. Some of my slideshows have been translated into other languages. Having your content translated into Spanish, for example, opens up a huge new audience not only in Spain but also across much of South America. Many are now licensing their work under CC because they know it makes sense. Much of the content on Wikipedia, for example, is licensed under a Creative Commons Attribution-ShareAlike licence. So look out for Creative Commons licensing - it's going to be very big news indeed for all web users in the near future.

Image source


The Commons touch by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 19, 2012

Learning pathways

I recently heard a story about the building of a new university campus. Unusually, the architect hadn't designed any pedestrian paths into his plan. When asked why there were no pathways between the buildings, he replied cryptically that he was waiting to see what happened. Over a period of time, as students and staff walked between the buildings, they made their own tracks or 'desire lines' through the grass. Once these tracks had become established as the most natural and preferred routes, the architect ordered the builders in to pave over them. 'Better they create their own pathways', he said, 'than for me to build them and then for them not to be used'. Instead of imposing his own ideas onto the community, the architect had crowdsourced his design.

How often do we impose pathways upon students which do not meet their needs or fit their expectations? How many times have we invested in technology, environments and curricula that are simply a waste of time and resources? The institutional learning platform - the VLE - is a classic case of decisions made about learning without consulting the learner. How can we reach a place in education where students find their own level and make their own pathways through learning?

Deleuze and Guattari's 1980 publication A Thousand Plateaus might offer us some clues. It was hailed by some as a masterpiece of post-modernist 'nomadic' writing; others criticised it for its dense, pseudo-scientific prose. Whichever way you view the book, however, it was notable for introducing rhizome theory as a metaphor for knowledge representation. According to Deleuze and Guattari, rhizomes are unlike any other kind of root system, having no beginning and no end. Rhizomes don't follow the rules of normal root systems, because they resist organisational structure and chronology, 'favouring a nomadic system of growth and propagation'. In plain English, the authors are attempting to describe the way ideas spread out naturally to occupy spaces, like water finding its level. The rhizome is not linear but planar, they argue, and can therefore spread out in any and all directions, connecting with other systems as it goes. The same might be said about the way communities form, create their preferred ways of communication and decide their priorities.

Rhizome theory is also a useful framework for understanding self-determined learning - the heutagogy described by Hase and Kenyon. Hase and Kenyon contextualise heutagogy with reference to complexity theory, and suggest a number of characteristics including 'recognition of the emergent nature of learning' and 'the need for a living curriculum'. The self-determined pathway to learning is fast becoming familiar to learners in the digital age, and is also the antithesis to the formal, structured learning found in traditional education.

Dave Cormier - one of the foremost contributors to rhizomatic learning theory - takes this concept deeper into digital territory by equating rhizomatic learning to 'community as curriculum'. The advent of social media, mobile communications and digital media facilitate large, unbounded personal learning networks that mimic the characteristics of rhizomes. If we accept that there is a need for a living curriculum, it would be logical to also accept that a self-determined community generates and negotiates its own knowledge, thereby forming the basis of what its members learn. Rhizomatic learning is also premised on an extension of community as curriculum, where: 'knowledge can only be negotiated, and the contextual, collaborative learning experience shared by constructivist and connectivist pedagogies is a social as well as a personal knowledge-creation process with mutable goals and constantly negotiated premises'.

Students can, and do, create their own personalised learning pathways. There is also evidence that learning communities informally decide their own priorities, often observed in the emerging folksonomies that result when digital content is organised, shared and curated. These processes often occur in spite of the strictures and rules imposed upon students by the institution. Most are the result of informal learning, achieved outside and beyond the walls of the traditional education environment. Self-determined learning pathways are crucial for individual learners as well as learning communities and they are by their very nature beyond the control of universities and schools. Schools and universities cannot (and should not attempt to) harness these processes, but they can facilitate them. Just like the architect, institutions can refrain from imposing structures and pre-determined tools, wait to see what their students prefer and then provide them with the best possible conditions to support self-determined learning.

Image by justpeace


Learning pathways by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 16, 2012

Never mind the quality

While waiting for my flight home from Cyprus last week, I did an impromptu interview for some colleagues from Pakistan in the departure lounge. They quizzed me about my views on quality in education, and recorded my responses on video. They intend to share the video online once all the airport public address announcements have been edited out. In the meantime, here's the essence of the interview:

My view on quality in primary education is that it cannot be measured solely through standardised testing or other performance-related metrics. These are used by governments as measures of whole-school compliance with policies rather than as measures of how individual children are learning. Standardised testing is a device to control schools and systems. It has never been about learning. The quality of personal learning gain can only be measured through authentic forms of assessment, and the more individualised these are, the better. I suggest ipsative assessment, which involves measuring a student's learning against their own previous achievements. This is a much fairer method, and has the potential to inspire learners rather than show them how big a failure they are. The Assessing Pupil Progress (APP) schemes already practised in some UK schools are exploiting this potential, and it is a more equitable method of assessment than the old norm- or criterion-referenced forms that are still being used by many schools throughout the world.

How do we ensure quality learning in education? The best way I know to do this is to provide space for children to express themselves creatively. Children need to be given licence to ask questions, no matter how ridiculous or bizarre, to explore outrageous possibilities, to exercise their imagination and to create something they can be proud of. The absence of expressive subjects such as art and music from the English Baccalaureate (EBacc) is a travesty, and should be redressed as quickly as possible.

Children also need to be given space to make mistakes without condemnation. Alvin Toffler once declared: "The illiterate of the 21st century will not be those who cannot read and write, but those who cannot learn, unlearn, and relearn." Too often a 'success' culture has been so deeply ingrained in the fabric of school life that there is no room for the failure from which we can learn.

If children are able to control what they learn and create things, their interest will grow, and if they are interested in the subject they will learn. They don't always have to be happy or comfortable for quality learning to occur. Sometimes discomfort, dissatisfaction or a lack of closure will spur them on to achieve even more in learning. Children need to be given tools to help them to learn, and then they need to be left alone to use the tools in the best ways they can find toward deeper learning. Better still, allow them to use the tools they are already familiar with.

Standardised curricula are bad news for schools. More trust needs to be invested in young people to be responsible for their own choices. Too often, when teachers are pressured, they tend to revert to the methods they are most familiar with. Often these methods bear no resemblance to the needs of contemporary society, because it has moved on from the time they were themselves in school. We forget that teaching today is about the children, not the teachers. It's not our learning, it's theirs, because as the Indian poet Rabindranath Tagore once warned: 'Do not limit a child to your own learning, for he was born in a different time'.

Image source


Never mind the quality by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 15, 2012

Bear pit pedagogy

In our digital literacy teacher training programme at Plymouth University we create environments that encourage critical thinking. My colleague Peter Yeomans (AKA @ethinking on Twitter) says we create the 'bear pits' for our students. In other words, we enable digital and physical learning spaces in which they can freely explore ideas, argue with each other (and us) over concepts and theories and in so doing, develop their reasoning and thinking skills.

In order to develop key critical thinking skills, learners need to be able to argue effectively. They need to be aware that there are alternative perspectives and they need to be able to defend a position from attack. They must also investigate theories critically, because if they simply accept a theory as 'truth', they may be leading their entire classroom down a blind alley. Too much bad theory has crept into the classroom in recent years, as I have previously commented, and we want to ensure that our trainee teachers are aware of flaws, counter-arguments and alternatives to all theories. That's why we encourage our students to critically engage with course material, and then to extend their knowledge by creating their own additional content around it.

We encourage them to develop their own Personal/Professional Learning Networks (PLNs) so they can lock into and exploit the vast communities of practice that already exist out there in the rapidly expanding blogosphere and Twitterverse. They are quite adept at using the tools at their disposal to create these connections, but first they need to be convinced. Once they realise the benefits of blogging or tweeting, and can see how much they learn as a result of engaging with remote peers, they take to it enthusiastically. When students are given projects to complete - blogs, videos, podcasts - they are expected to organise their ideas, form their arguments and present them in seminar or digital format, and then they must defend them. You see, when students are required to present something they have learnt to an audience, they need to know it well before they can present it convincingly. It's not the easiest route to learning, but it invariably turns out to be deep learning. The bear pit approach is akin to dropping them in the deep end, and it can be a little uncomfortable at times.

One final point: we also give students licence to challenge us, and sometimes, if we feel it necessary, tutors may even debate each other in front of the students. Academics don't (and can't) always agree on everything, so why not model critical discussion for the benefit of the students? I would be interested to hear from other teacher educators about the approaches you use, and whether you see any value in what we are doing with our bear pits.

Image source


Bear pit pedagogy by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 10, 2012

Damascus Road

My keynote presentation yesterday to the Cyprus International Conference on Educational Research had a mixed reception. Some delegates agreed with the points I made, others were more sceptical. It's interesting when you present what are considered radical ideas to a rather conservative audience and see the reactions around the room. It's like watching the surf hitting the rocks outside the window of my beach hotel room. An unstoppable force against an immovable object, and all that.

The majority of the delegates present were from Asia, the Middle East and North Africa. When asked, around 75% admitted that they had no involvement whatsoever with any of the social media or networking tools I was talking about. I had to pause at this point to rub my eyes. How can you expect to understand what your students are doing if you don't engage with these tools yourself? was my challenge. I think some were quite appalled when I suggested that even if they ban mobile devices and social networks in their classrooms (which many are doing), the students will still continue to use them, probably under the tables. There were some worried glances when I suggested that the reason students are using mobile devices and social media in the classroom might be to check how accurate and truthful the lecturer's statements are. This kind of challenge to authority may not be palatable for many conservative academics, but it's a plain fact: it happens all the time, and it will grow in its intensity and reach. My message was: get over it, because it isn't going away.

I also caused a few ripples on the normally placid pond of academic publishing by showing some recent figures on how successfully the major publishers are exploiting our goodwill in offering our work to them for free. I called for an end to the enormous profiteering currently perpetrated by some publishers, and pointed out that public money has often funded the research that ends up behind a paywall. That, I declared, was the main reason I resigned from my job as co-editor of a major Taylor and Francis journal late last year. I could not, in good conscience, continue to help publishers line their pockets off the back of free labour and publicly funded research that ended up behind a paywall, read only by the few who had the means to pay for it.

I cited figures from two of my own papers, both published around the same time (in the slideset above), which showed the unacceptably long editorial and review lead times of many closed journals in comparison to open online journals. Paper-based journals suffer from editorial backlogs, and there is little they can do to alleviate this problem. Some have established online 'early' publishing systems that host accepted papers prior to full publication, but these remain behind the paywalls. The most stunning comparison I offered was between the citation metrics of my two papers. The closed journal paper had received 19 citations against 511 for the open journal publication in the same time period. This alone, I argued, shows that open journals have the edge over closed journals, with many, many more people reading the free-to-view articles. If we want widespread dissemination of our findings, we need to look to the open journals, with their vast readerships.

During the question time, objections were voiced. I expected it. One delegate claimed that the review processes for open journals were not as rigorous. Well, that's just your perception, I countered, and it's a very challengeable statement. I pointed out that in some open journals, review processes are even more rigorous: my open access journal article, for example, was reviewed by three separate reviewers. The fact that they were unblinded (they knew our author names and we knew theirs), and that the reviews and our responses were posted online alongside the paper, created a higher quality and more transparent review than the traditional closed, double-blinded reviews could ever hope to achieve. Well, I did my best, and hopefully some delegates will have a Damascus Road experience before they submit their next journal article. Perhaps some will think twice about banning mobile devices and social media in their classrooms in the future, and, hope against hope, some may even take the plunge and subscribe to a social network or two. We live in hope.


Creative Commons Licence
Damascus Road by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 09, 2012

A dangerous game

There's a dangerous game they play in Cyprus. It's called meze, and it's far more brutal than its Spanish tapas equivalent. The game goes like this: there are two teams, the eating team and the waiter team. The waiter team tries to beat the eating team into submission by delivering a constant supply of small dishes, containing far more food than they are ever likely to need in a full calendar month. It begins innocuously, with a few plates of pitta bread, hummus and tzatziki. The eating team is lulled into a false sense of security. This is nice, they think, we can do this. Then more dishes begin to arrive at an alarming rate.

As the eating team finishes one dish, it is removed and three more replace it. The goal of the waiter team is to fill the table so completely with food that there is no room left, and the eating team has no choice but to eat their way out to safety. But the game is a fix. No matter how much the eating team consumes, there are always more dishes arriving. Kebabs, eggplants, grilled cheese, prawns, skewered meat, fried octopus: you name it, it all arrives far too quickly. There is a sadistic streak in the waiter team. Even when the eating team has had enough, the waiter team continues to deliver knockout blows, placing even more food directly onto their plates. Eventually, and inevitably, the eating team are writhing in extreme agony on the floor, clutching their stomachs and yelling 'Enough! We surrender!' The end of the game is signalled by the waving of a white napkin, and then you can observe the smug grins on the faces of the waiter team, who look at each other and nod knowingly. Yes, we have defeated yet another group of tourists with our clever food manoeuvres. Our job is done.

This got me thinking that many of the world's education systems are a little like the eating game of meze. We pile the students' plates high with content. Content of every kind is presented to be consumed, and the poor students don't stand a chance. Many are overwhelmed by the amount of content they need to learn, and the pace at which they have to learn it. Even while they are struggling their way through an overburdened 'just in case' curriculum, still more content continues to arrive at an alarming pace. Some learners cry out for mercy, but they are still compelled to consume the content, because later they are required to regurgitate it in an examination to obtain their grades. The examinations bear no resemblance to what will be required of them in the real world. No wonder so many wish to leave the table early. What can teachers do to obviate this problem? Some are making a difference, reinterpreting the curriculum they are given by enabling activities and creating resources that facilitate student-centred learning. Learning at one's own pace, and in a manner that suits the individual, will overcome some of the problems of overload, but more needs to be done. Things are changing, but slowly, too slowly for many people's tastes. It's a dangerous game we are playing in education. Isn't it about time we stopped?

Image source


Creative Commons Licence
A dangerous game by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 06, 2012

My mouse is dead

No, we haven't had a pet bereavement ... and there is no tiny rodent lying legs up in its little cage (although ... there's a thought. My daughter's pet mouse sometimes keeps us awake at night with its irritating noises....). No, I'm referring to that wonderful old computer peripheral that was first introduced in the last century. The mouse has served its purpose, and has been a faithful servant for all those using computers. But the fact is, the mouse is going the way of 5.25 inch disk drives (remember those?), CRT screens and dot matrix printers. Many of us wondered what we would do when the floppy disk drive was phased out. But how many of us miss it now? The same goes for the bulky visual display units and the clunky, noisy printers.

The computer mouse is old technology, and for a growing number of users it has recently been superseded by touch screen devices (and the soon-to-be widely used non-touch devices). The only time I ever use a mouse now is when I am at my desk at the university and am compelled to use a desktop computer. Most of the time I'm out and about using my iPod Touch, iPhone and a touch-pad laptop. Personally, I haven't needed computer rodentia for several years. My mouse is dead. It is an ex-mouse. It has gone to join the squeaky choir of rodents in the sky.

I'm wondering how many other people are of the same opinion. I have often watched young children trying hard to control a mouse, particularly when their hands are small and they can't quite grip it correctly. I have also watched children with fine motor control problems struggle to use them. Perhaps it's about time the more intuitive touch screen interfaces were introduced widely in schools. Hand-eye co-ordination is also required to control a computer with a mouse. Most of us can do it easily, but it's not everyone's experience. I have watched some older people struggle to get the mouse pointer in the correct position to execute a click. For older people with motor control difficulties or reduced visual acuity, the more intuitive touch screens, voice activation and gesture control may be a clear advantage over mouse-driven computers.

I think the mouse has had its day. The cat can have it. Now it's time for the next generation of intuitive interfaces. What do you think?

Image by Nik Hewitt


Creative Commons Licence
My mouse is dead by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 05, 2012

Time flies

Tomorrow I head off to Nicosia to keynote the Cyprus International Conference on Educational Research. The event, hosted by the Middle East University (North Cyprus campus), will feature four keynote speakers and presentations of papers, workshops, posters, seminars and virtual presentations on a wide range of pedagogical research themes. In total, it looks as though there are over 400 presentations accepted into the three day programme.

The conference aims to "bring together educational scientists, administrators, counsellors, education experts, teachers, graduate students and civil society organizations and representatives together, to share and to discuss theoretical and practical knowledge in a scientific environment".

The three other keynote speakers are Janet Parker (Open University, UK) who will speak on the topic of 'Encouraging Early Career Researchers to become Expert Published Writers', Lejf Moos (NTNU Trondheim, Norway) whose theme is 'European Educational Research Today', and local academic Mehmet Çağlar (Near East University, Cyprus).

My own keynote will cover the proposition that social media, mobile technologies and the Web are together changing the way we perceive knowledge, learning and education. I'm going to propose that we are witnessing a radical shift in the way knowledge is represented, consumed, created and shared, and that as a result, we need to reappraise the way we conduct research and disseminate our findings. I'm going to talk about adopting open access journal publishing as the best way forward for widespread and effective publication of research, and I'm going to champion open scholarship. Let's see how that will be received.

I'm going to blog again about the conference once I'm there and it's in full swing. Cyprus is a wonderful country to visit at any time of the year, but doubly so at the moment, with the inclement weather here in the UK assailing the senses. The island's temperate Mediterranean climate will be very welcome, and the Cypriot culture and history are rich. It's a dirty job, but somebody has to do it.


Creative Commons Licence
Time flies by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 03, 2012

Five tools for global educators

Recently I have been considering the changing role of teachers who are adopting technology to extend the walls of the classroom. These are a new breed of teachers who do not necessarily accept that the classroom is contained within four walls. In effect, through the use of social media and telecommunication technologies, these teachers are becoming global educators. I consider myself a global educator and have tried to articulate my ideas on why this is a different role to traditional teaching. We are connected educators, linked into a number of powerful global communities of practice, and we have access to resources, dialogue and audiences we would not enjoy in a traditional learning and teaching role. But what tools do we use to connect with these communities, resources and audiences around the globe? Here are my top five tools:

Webinar: There are a number of ways to teach and present live from beyond the classroom. I regularly present live (synchronous) webinars, or web seminars, and other teaching sessions from my home office, from a hotel room, and conceivably just about anywhere else there is connectivity to the internet. I have presented from Australia to the USA (strange timezone differences there) and from Europe to the USA, and even, in events such as the Reform Symposium, presented to a worldwide audience of educators. Webinar tools include Elluminate (now known as Blackboard Collaborate), WebEx and Adobe Connect, all of which have similar screen topographies and perform similar functions, but all carry an associated cost. All of the above tools support live audio (you should use a headset to maintain quality) and video communication (a webcam or the internal camera on a laptop is needed for this), slideshow presentation tools and text communication. Webinars can also be conducted on Skype, which is currently free, but quality may be more variable with this tool.

Blog: Blogging is arguably one of the most powerful tools for global education. I have already written a great deal about the power of blogging, so I won't elaborate too much here. What I will say is that by following a few simple guidelines, teachers can write and present content in accessible formats, and can incorporate images (pictures, diagrams), videos, audio and hyperlinks, all of which can help students to investigate a topic in greater detail if they wish. The comments boxes below each post support dialogue, and the tagging feature on most blogs enables easier search for content.

Twitter: This social networking tool is deceptively simple, but deeply sophisticated and versatile due to its inherent filtering facilities. It is also an excellent connecting tool: retweets are not repetition, they are amplification of content. The power of Twitter lies not only in its simplicity, but also in its accessibility. Whether used as a backchannel to amplify an event, or as a closed channel to converse within small groups, Twitter has an appeal that enables a great deal more expression than one would expect from a 140 character limit. Hyperlinks and other media links can be shared, and with the addition of a URL shortener, can also leave more space for a few annotations. Used in conjunction with the other tools showcased on this page, it is indeed a very powerful tool for the global educator.

Video: Social media tools such as YouTube are maturing into sophisticated tools that enable all kinds of visual media sharing. Over 24 hours of video footage is uploaded to the YouTube servers every minute. Most of it can be disregarded, but some content found on YouTube is gold dust for teachers. It is now possible to create your own personal channel on the service, simply by clicking a few buttons. There is an editing facility available that allows teachers to select specific sequences of video and create new versions for showing to students. The comments box at the foot of each video clip enables dialogue between presenter and students. It's asynchronous, but can still be a highly effective way of sending quality content to distributed student groups.

Slideshare: If you have a PowerPoint presentation or a document and you want to share it with a wider audience, then Slideshare is probably your first port of call. Several of my recent presentations have gone viral simply because the tool is easy to access and is used by large numbers of people every day. You can see at a glance how many views your slideshow has received, how many favourites, downloads and embeds, and most importantly, you can respond to comments to create dialogue with your remote students.

These are just a few of the vast array of tools that are currently available to the global educator, and they are my preferences. I am sure others will have different preferences or recommendations to make. Please feel free to share your expertise and ideas below in the comments box.

Image source


Creative Commons Licence
Five tools for global educators by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


February 02, 2012

Human 2.0

Post-human. Such a strange concept, and one that many people struggle to understand. At its simplest, being post-human is a state closely aligned to the cyborg, or cybernetic organism: part human, part machine. In other words, the post-human condition emerges when humankind and technology merge to the point where they become a part of each other. We can understand cyborgs, and for many, the idea of a half-man, half-machine evokes deep-seated fears about how far technology can go. Donna Haraway (2004) makes a point of singling out Rachel, a replicant character in the sci-fi movie Blade Runner, as 'the image of a cyborg culture's fear, love, and confusion.' We have seen many other popular culture examples of the cyborg, from the Six Million Dollar Man to Robocop, and each is endowed with superhuman strength or enhanced senses. We recognise them because they are on their own, isolated, lost in an otherwise normal world: unnatural, freaks of non-nature.

No one really knows exactly if or when a post-human phase emerged; it is all theory and supposition. But we can trace the history of prosthetics and reflect on the incorporation of various kinds of technology into the human body. Replacement limbs may not strictly be accepted as a merging of technology and humanity, unless they are robotic limbs; heart pacemakers, valves and other forms of implanted technology might be. Computer scientist and philosopher Andy Clark, in his 2003 book Natural-Born Cyborgs, argues that humankind has an innate need to interface with technology: 'What the human brain is best at is learning to be a team player in a problem-solving field populated by an incredible variety of nonbiological props, scaffoldings, instruments and resources' (p 26). Essentially, when wetware (the biological entity) meets hardware, the software can be interoperable. Clark sees the merging of mind and machine as unstoppable and inevitable. He believes it's not a matter of if, but when. Some would argue that the transitional phase leading to post-humanism is the non-invasive but just as powerful welding together of human and computer, as seen in the addictive video game playing of geeks, or the smartphone ultra-dependency of our current youth generation.

So are we now on the verge of a new phase in human development? Are we on the cusp of incorporating technology into the human body because we have such a desire to enhance our senses, increase our physical and mental performance, or otherwise extend the capabilities of what is considered 'natural'? Are we about to embark on a post-human phase in human development? Some would affirm this, citing several notable 'real examples' of cyborgs in recent years. Meet Kevin Warwick, a professor at the University of Reading and probably the world's first true cyborg. Professor Warwick is interested in how technology can enhance human senses and improve performance. In the foreword to his book I, Cyborg, he writes: 'Humans have limited capabilities. Humans sense the world in a restricted way, vision being the best of the senses. Humans understand the world in only 3 dimensions and communicate in a very slow, serial fashion called speech. But can this be improved on? Can we apply technology to the upgrading of humans?' In essence, Warwick is asking: can we become Human 2.0?

In a famous experiment in 1998 Warwick had a chip transponder surgically implanted into his arm. A computer was then able to track Warwick as he moved around the university campus, and allowed him to open doors, turn on lights, and operate computers without touching them. Other phases of the experiment involved more advanced transponder implants that monitored Warwick's internal condition, such as his emotional responses, stress levels and even thoughts. The speculation was that if others also had similar transponders implanted, people might then be able to communicate their thoughts and emotions to each other via computer mediation.

More recently, Tanya Vlach made headlines with her plans for a new prosthetic eye. She has a dream to transform herself into an 'enhanced human being' after being involved in a serious car accident in which she lost her left eye. She is now planning to have an 'eye-cam' installed inside her prosthetic eye, complete with zoom control, infra-red and ultra-violet capabilities and the facility for face recognition. The eye-cam would interface with a custom-made app, housed in a standard smartphone. She is currently waiting for technology to catch up with her vision, and one day soon hopes to be able to hard-wire the eye-cam directly to the vision centre in her brain, and in so doing become a truly enhanced human being: a cyborg, a post-human. Scary, fascinating, challenging stuff: the cyborg becomes the iBorg.

Computer scientist Jaron Lanier's recent keynote speech at Learning Technologies in London served to illustrate several of the dangers and caveats of the post-human condition. Lanier vehemently rejects Ray Kurzweil's vision of a future where computers can exceed the capabilities of humans. 'You have to be somebody before you can share yourself', he warns. He suggests that we already have expanded memories (the search engines of the web) and remote ears and eyes (mobile phones and webcams). Lanier sees no techno-utopia in the future, but warns instead that we are in danger of dystopia. Indeed, he advised the makers of the movie Minority Report on what might be expected from a technology-dominated future in which people are manipulated like chess pieces. The data mining capabilities of the social networks alone can enslave us by owning our purchasing habits, internet search preferences and all other personal data, he suggests. He sees Facebook and other social networks undermining and devaluing friendships. The technology should work for us, not us for the technology. Lanier is a contentious, thoughtful character. In just a few minutes of conversation with him in the speakers' lounge, my impression was that he opposes anything that involves a 'hive mind'. 'Why are you wearing a Creative Commons badge?' he asked me as we gazed out over West London. I explained that I believe in giving all my content away for free and that to me, that is the essence of the future of learning. 'I'm going to speak against that today', he warned. It's clear that, generally, Jaron Lanier holds a somewhat more pessimistic view of our possible cyborg future.

In the final analysis, though, it is mind amplification that is the ultimate goal for humankind's future enhancement. The ability to distribute knowledge beyond the confines of the human brain, and the capability to extend the mind through and across networks, does not demand or require any conjoining of human and computer. We have already achieved much of this through mind tools such as social media, which, according to Karen Stephenson, enable us to store our knowledge with our friends. Do we really need a post-human future? iThink not.


Top image by Elif Ayiter

References

Clark, A. (2003) Natural-Born Cyborgs: Minds, Technologies and the Future of Human Intelligence. New York: Oxford University Press.

Haraway, D. J. (2004) A Manifesto for Cyborgs: Science, Technology, and Socialist Feminism in the 1980s. New York: Routledge.

Lanier, J. (2010) You are not a Gadget. London: Penguin.


Creative Commons Licence
Human 2.0 by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


January 30, 2012

Digital learning futures

As I write this blog post, the above slideshow has received almost 18,000 views in just 48 hours since it was posted to Slideshare. These slides accompanied my presentation at the Learning Technologies conference and exhibition held at London's Olympia on 25-26 January. I was pleasantly surprised by the huge turnout to hear me speak, and grateful to Don Taylor and his team for inviting me to speak at this excellent event.

During my talk, I discussed a number of possible scenarios that might result when wholesale adoption of digital technologies occurs in education and training. I touched on personal learning networks, mobile technologies, games and gamification, the use of social media in learning, the role of user generated content, the phenomenon of ubiquitous connection, and technological convergence. The latter in particular is a trend that is already giving us web-enabled television and dual-view screens, and in the near future will enable a merging of e-mail and social media. I also discussed pedagogical issues such as deep and surface learning, creative thinking and the transformation of knowledge consumption. As a nod to the possible futures we might see, I discussed the development of semantic web technologies (Web 3.0 and Web x.0), touch screen tablets, non-touch technologies and smart objects, as well as the potential of Open Educational Resources, open learning and open scholarship to support a global democratisation of learning.

I'm immensely gratified to think that so many more people outside the auditorium at Learning Technologies are now downloading and viewing these ideas. My audience has been extended beyond the walls of the event to a global classroom through the amazing power of social media. Here's to all the possible futures of learning!


Creative Commons Licence
Digital learning futures by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


January 29, 2012

Back to the future

I was lucky to witness firsthand some of the earliest attempts at educational computing in the UK. In 1976 we set up a project called Investigations into Teaching with Microprocessors as an Aid (ITMA). I was in the technical team that built some of the first personal computers from kit form, which we then deployed among our student teachers to explore how these new tools might be used in teaching and learning. Educational computing was still very much in its infancy, and there was a lot of interest in whether computers could or would actually change learning.

Later, in 1981 I changed jobs to work in a nurse training school in the National Health Service where the only computers were very large ones that were used for management and administration. They were kept behind locked doors, and only a few select individuals ever got to enter the room.

Around 1981, Acorn and the BBC joined forces to produce one of the first affordable educational computers. It was called, rather obviously, the BBC Microcomputer. Various versions were released over the decade, including the 'B', the 'Master' and the 'Archimedes'. Each had to be supplemented by an external 5.25 inch floppy disk drive and a cube-shaped metal Microvitec monitor. The entire set was cream coloured, and could be further supplemented by a plinth which housed the whole ensemble. My nursing school, with my encouragement, purchased a dozen or so, and then it was my job to deploy them in meaningful contexts to promote learning. I placed one in the corridor outside my office, and wrote a small programme which printed out, on a dot matrix printer, information about every single transaction that took place each day. When a student nurse accessed a programme, the printed record showed me the name of the programme, when it was activated, how long the student remained on it, and even what score they achieved in the tests within the software. I discovered that the programmes, simple as they were, had the effect of drawing students to engage with learning on a mostly informal basis, any time they were passing my office on the way to the training rooms, library and coffee area. They were, in effect, one of the first technology-supported self-directed study methods ever used in nurse education.
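For readers curious about what that corridor logger actually recorded, here is a minimal modern sketch in Python. The original was written in BBC BASIC and printed to a dot matrix printer; everything below (the class, the names, the sample programme title) is my own illustrative invention, not the original code.

```python
import time

# Illustrative sketch of the corridor-computer usage log described above:
# for each session it records which programme was run, who ran it,
# how long they stayed on it, and any test score achieved.
class UsageLog:
    def __init__(self):
        self.records = []  # the 'printout': one dict per completed session

    def start_session(self, programme, student):
        # Equivalent to the moment a student nurse activated a programme.
        return {"programme": programme, "student": student,
                "started": time.time()}

    def end_session(self, session, score=None):
        # Close the session, compute time spent, and file the record.
        session["duration"] = time.time() - session["started"]
        session["score"] = score
        self.records.append(session)
        return session

log = UsageLog()
s = log.start_session("Drug Calculations", "Student A")  # hypothetical titles
record = log.end_session(s, score=17)
print(record["programme"], record["score"])
```

In the 1980s the same idea was achieved with a print statement per event rather than an in-memory list, but the information captured was essentially this.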

I deployed a second BBC computer alongside the first, and the use increased. Very soon I procured a small room in which we positioned an entire suite of BBC computers. I began writing programmes in conjunction with the nurse tutors, and in no time at all we were selling the Computer Assisted Learning (CAL) packages to other nursing schools all across the country. The software was written in BBC BASIC (Beginner's All-purpose Symbolic Instruction Code) and was indeed basic, mainly consisting of text, questions and tests, and remedial loops with a score presented at the end. In the mid 80s this was fairly leading edge, and seemed to align comfortably with the teaching ethos of the time, which in nurse education was essentially a behaviouristic 'drill and practice' approach. Today the programmes would seem primitive, inappropriate and probably very, very boring. In the mid 80s, they attracted students like bees to a flower garden. They queued to use the computers. One programme I wrote was a remix of the Basically Eliza programme, which mimicked a therapist by matching input questions against a small database of responses. My programme had a twist. Instead of merely trying to converse with the student nurses, the programme threw insults back at them too, taking the conversation to an entirely new and hilarious level. It became the most popular programme in the suite, especially for our mental health nurses.
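The matching trick behind that Eliza remix is simple enough to sketch in a few lines of Python. This is not the original BBC BASIC code; the keywords, responses and 'insult' fallbacks below are invented purely to illustrate the technique of matching input against a small database of canned replies.

```python
import random

# Illustrative Eliza-style responder with the 'insult' twist described
# above: look for a known keyword in the input and pick one of its
# responses; otherwise fall back to a cheeky default.
RESPONSES = {
    "worried": ["Why do you feel worried?", "Worried? You should be."],
    "tired":   ["Tell me more about feeling tired.",
                "Tired? You've barely started."],
}
FALLBACKS = ["Go on.", "Is that really the best question you have?"]

def reply(text, rng=random.Random(0)):
    # Normalise the input and scan it for each keyword in the database.
    words = text.lower().split()
    for keyword, options in RESPONSES.items():
        if keyword in words:
            return rng.choice(options)
    return rng.choice(FALLBACKS)

print(reply("I am worried about my exam"))
```

A real Eliza also reflected pronouns ("my" becomes "your") and used pattern templates rather than single keywords, but the core loop, match then respond, is exactly this.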

It was with a wonderful feeling of nostalgia that I walked into the National Museum of Computing dome at Learning without Frontiers and saw the array of BBC computers on display. They were even accompanied by the BBC Acorn User Guide with its glossy coloured cover and spiral binding. The sight took me back over three decades to the time I wrestled with how to deploy new and untried technology in authentic learning contexts. I remember the excitement I experienced when I unpacked the BBCs for the first time, connected and switched them on, to see what they were capable of. We have come a long way since those early pioneering days, but the same questions remain. How can we embed new technology effectively? What can we do with this new technology that we couldn't do before? How will this new technology effect and affect pedagogy? Even then, 30 years ago, I believed fervently that computers would radically transform education and training, and I still hold that hope. Education has indeed changed, and continues to evolve as technology drives change. Radical change, though, will only come when teachers everywhere see the potential and power of technology to extend, enhance and enrich learning for all.


Creative Commons Licence
Back to the Future by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.


January 28, 2012

Positive deviance and the IPD




We all know that organisations and institutions impose barriers to innovation. The larger they are, the more rules they tend to generate, because by nature large organisations are conservative, with a perceived need to protect the status quo and maintain order. But this isn't always good news for creativity and innovation. James Clay once called such enforcing agencies 'Innovation Prevention Departments', and claimed that every institution has one. I think he's right. Trying to innovate in such circumstances, especially when there is an IPD saying 'that's against the rules', 'it can't be done' or 'it's too expensive', can be hard going, but innovation is never impossible. I was interviewed at the Learning Technologies conference about my views on innovation, organisational constraints and positive deviance. The interview was actually recorded downstairs in the Learning without Frontiers dome zone, which explains the theatrical lighting. Above is the video of the interview in full (duration 90 seconds).

Interview by Martin Couzins


Creative Commons Licence
Positive deviance and the IPD by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.

