In yesterday's post I made the statement that the internet is better as a creative space than it is as a repository. Let me clarify that statement. Much of the pre-social web's content was difficult to edit or change. Web 1.0 - sometimes referred to as the 'sticky web' - was largely controlled by webmasters and corporations, and was used mainly as a broadcast channel to promote ideas and products. The advent of Web 2.0 participatory tools and services, such as social media and social networks, voting and filtering tools, and personalised spaces, gave users the ability to be directly involved in the creation of web content. Media sharing sites such as Flickr, YouTube and a variety of podcasting services allowed users to go beyond the repository mentality of earlier web iterations and host their own TV and radio channels, while blogs enabled them to publish their own newspapers. The web had become a place where people could create and share their ideas on a global stage.

Post-human. Such a strange concept, and one that many people struggle to understand. At its simplest, being post-human is a state closely aligned to the cyborg, or cybernetic organism - part human, part machine. In other words, the post-human condition emerges when humankind and technology merge to the point where they become a part of each other. We can understand cyborgs - and for many, the idea of a half-man, half-machine evokes deep-seated fears about how far technology can go. Donna Haraway (2004) makes a point of singling out Rachel - a replicant character in the sci-fi movie Blade Runner - as 'the image of a cyborg culture's fear, love, and confusion.' We have seen many other popular culture examples of the cyborg, from the Six Million Dollar Man to RoboCop - and each is endowed with superhuman strength or enhanced senses. We recognise them because they are on their own: isolated, lost in a world of otherwise normality, unnatural, freaks of non-nature.
No-one really knows exactly if or when a post-human phase emerged; it is all theory and supposition. But we can trace the history of prosthetics and reflect on the incorporation of various kinds of technology into the human body. Replacement limbs may not strictly count as a merging of technology and humanity, unless they are robotic limbs. Heart pacemakers, valves and other forms of technological implant or merger might. Computer scientist and philosopher Andy Clark, in his 2003 book Natural-Born Cyborgs, argues that humankind has an innate need to interface with technology: 'What the human brain is best at is learning to be a team player in a problem-solving field populated by an incredible variety of nonbiological props, scaffoldings, instruments and resources' (p 26). Essentially, when wetware (the biological entity) meets hardware, the software can be interoperable. Clark sees the merging of mind and machine as unstoppable and inevitable - not a matter of if, but when. Some would argue that the transitional phase leading to post-humanism is the non-invasive but just as powerful welding together of human and computer, as seen in the addictive video game playing of geeks, or the smartphone ultra-dependency of the current youth generation.
So are we now on the verge of a new phase in human development? Are we at the cusp of the incorporation of technology into the human body because we have such a desire to enhance our senses, increase our physical and mental performance, or otherwise extend the capabilities of what is considered to be 'natural'? Are we about to embark on a post-human phase in human development? Some would affirm this, citing several notable 'real examples' of cyborgs in recent years. Meet Kevin Warwick, a professor at the University of Reading and probably the world's first true cyborg. Professor Warwick is interested in how technology can enhance human senses and improve performance. In the foreword to his book I, Cyborg, he writes: 'Humans have limited capabilities. Humans sense the world in a restricted way, vision being the best of the senses. Humans understand the world in only 3 dimensions and communicate in a very slow, serial fashion called speech. But can this be improved on? Can we apply technology to the upgrading of humans?' In essence, Warwick is asking: can we become Human 2.0?
In a famous experiment in 1998, Warwick had a chip transponder surgically implanted into his arm. A computer was then able to track Warwick as he moved around the university campus, allowing him to open doors, turn on lights and operate computers without touching them. Later phases of the experiment involved more advanced implants that monitored Warwick's internal state, such as his emotional responses, stress levels and even thoughts. The speculation was that if others also had similar transponders implanted, people might then be able to communicate their thoughts and emotions to each other via computer mediation.
More recently, Tanya Vlach made headlines with her plans for a new prosthetic eye. She has a dream to transform herself into an 'enhanced human being' after being involved in a serious car accident in which she lost her left eye. She now plans to have an 'eye-cam' installed inside her prosthetic eye, complete with zoom control, infra-red and ultra-violet capabilities and the facility for face recognition. The eye-cam would interface with a custom-made app housed in a standard smartphone. She is currently waiting for technology to catch up with her vision, and hopes one day soon to be able to hard-wire the eye-cam directly to the vision centre of her brain, and in so doing become a truly enhanced human being - a cyborg - a post-human. Scary, fascinating, challenging stuff - the cyborg becomes the iBorg.



We all know that organisations and institutions impose barriers to innovation. The larger they are, the more rules they tend to generate, because large organisations are by nature conservative, with a perceived need to protect the status quo and maintain order. But this isn't always good news for creativity and innovation. James Clay once called such enforcing agencies 'Innovation Prevention Departments', and claimed that every institution has one. I think he's right. Trying to innovate in such circumstances, especially when there is an IPD saying 'that's against the rules', 'it can't be done' or 'it's too expensive', can be hard going, but innovation is never impossible. I was interviewed at the Learning Technologies conference about my views on innovation, organisational constraints and positive deviance. The interview was actually recorded downstairs in the Learning without Frontiers dome zone, which explains the theatrical lighting. Above is the video of the interview in full (duration 90 seconds).
Interview by Martin Couzins

Positive deviance and the IPD by Steve Wheeler is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License.