Feb 24, 2013

My sister just sent me a link to a TED talk.  TED talks are fascinating.  This one describes itself thus:

Elizabeth Gilbert muses on the impossible things we expect from artists and geniuses — and shares the radical idea that, instead of the rare person “being” a genius, all of us “have” a genius. It’s a funny, personal and surprisingly moving talk.

I think it’s a beautiful idea, one I am inclined to value highly.  I have been a teacher most of my working life – and soon learned to value highly the contribution of students.  Not only in terms of what I asked them to do but – also, and more importantly – in terms of what they learned to ask me to do.

Genius is not the preserve of a man or woman our society determines as being so.  And even if it is, it is only because our society is incapable of perceiving the genius that all of us contain.  Even as we like to focus from a distance on the visibly astonishing, we miss out on the beauty that we exhibit every single day of our lives.  We are clever souls, we human beings.  The virtual democratisation of content we are witnessing this last decade is not primarily a cause of information ills but, rather, a massive release of pent-up generations of humanity unable for so long to visibly express their genius.

And now I have a confession to make.  I haven’t yet watched the TED talk my sister sent me.  And I probably won’t.  I really do hope, however, that she doesn’t stop sending them to me.  Today’s post would not have been written if she hadn’t thoughtfully included me in a footnote to a Facebook post.  Although I very rarely watch videos at all, their synopses – rapidly read – do often spark unfinished and engaging business.

To be honest, I think there’s a reason.  I think I’m a natural reader, not a watcher.  What’s more, I think those who watch are – more often than not (my book-loving sister being a clear exception) – natural watchers, not readers.  Which leads me to draw the following conclusion: the age-old battle (or, at least, the sixty-year-old battle) waged between literature and television has subtly restarted since the arrival of the web.  Following on from the middle of the 20th century, our early 21st-century online humanity has reasserted a division which should please us enormously.  For between the geniuses of industrialised art and the geniuses of individualised art, we stumble across everything we should admire.  That some of us should continue to find pleasure and intellectual involvement in this century’s equivalent of the tablets and scrolls of yore, and that others of us should do the same in this century’s equivalent of the older oral and theatrical tradition, simply underlines the power and strength of them both.

All those centuries ago, we got it right first time.

The instincts to register through writing and speech the thoughts, occurrences and imaginations of a wonderful species were just as accurate and apposite then as they still are these days – continuing as they do to strive and fight their way above the flood waters of passing and irrelevant technologies and discourses.

A reader then, are you?  Or a watcher?  Or a marvellous – highly literate – combination of the two?

Lucky you!

Aug 07, 2012

A while ago I posted this piece on the virtues of pursuing excellence compared to the downsides of competition:

Everywhere that commerce gets involved in what used to be public spaces, there is the same tendency to make the products and services of different providers mutually exclusive.  From software such as Microsoft Office, which locks you into proprietary data formats, to supermarkets with private malls and parking places which can only be used for a certain time and only for a certain purpose, the desire of powerful companies to own our physical and intellectual spaces only seems, as time goes by, to march unstoppably onwards and upwards.

And yet commerce wouldn’t have to be like that if excellence rather than competition were the name of the game.  It would, indeed, be a massive evolutionary step forwards.  And perhaps, in a way, we are in the anteroom of such a step: whilst the web is still in its relative infancy, we are even so able to perceive on the social horizon many tendencies and tools which might allow for a perfect perception of true excellence – above and beyond the tricks of marketing and persuasion which currently tend to cloud realities.

I then went on to conclude that:

In the name of competition, specialisation arose.  Through this process of specialisation, disconnection began to spread.  Now we only know how to keep a community together by creating as big a sense of distance and difference as possible from those beings we are forced unerringly to compete against.  By creating a worldwide web of interconnectedness on the back of such specialisation, we have created an impossibly gigantic circle the squaring of which can surely only break us.

My conclusion?  We either stop using, at least as we have done to date, that specialisation I mention to advance our society – or we work out some pretty convincing alternative way of overcoming the Chinese walls that are breaking up our ability to share our evermore uncommon experiences.

Either way, it’s going to be an uphill battle for the cooperative instincts at the heart of humanity.

And an example, perhaps, of where a progress measured only empirically distorts a wider understanding of what excellence – and, as a result, our society itself – should really look like.

In the above piece, I set out the arguments in favour of moving on from the age-old competitive instincts of Darwinian capitalism to a more objective, more reality-grounded, more cooperative-focussed, goal of achieving excellence in all fields.

Yesterday, however, I was minded to reconsider: I finally saw the film “El Caballero Oscuro: La Leyenda Renace” (the Spanish-dubbed version of “The Dark Knight Rises”, the final film in Christopher Nolan’s take on the Batman mythology).  Amongst many other wonderful things (Blake is almost a Luke Skywalker reprise; Catwoman a Han Solo delightfully playing off the moral centre thus constructed), the film talks of the dangers of a just revolution – even when you are right, by acting on such righteousness you may further contribute to the destruction of civilisation.

And part of this righteousness lies in our competitive pursuit of excellence above all.  If we teach, through our consumerism, our children and youth to believe in absolute notions of value for money, of best is first, of maximising outcomes in everything we do and everyone we get to know, we can only conclude that excellence must be applied to every field of human endeavour.

The nominal baddie in the film goes by the name of Bane.  (The bane of Batman, in fact – even as Robin John Blake alludes to stealing the latter’s right to an autumnal morality, as the Batcave replaces his beloved attachment to good policing.)  At one point in the narrative, Bane and his gang invade Wall Street’s Stock Exchange.  The following exchange sets up their moral justification for the violent occupation of a place run by those who have used other tools to commit injustices:

“Esto es la bolsa, aquí no hay dinero para robar.”

“¿De verdad? Y vosotros, ¿qué hacéis aquí?”

Which loosely translates as:

“This is the Stock Exchange, there’s no money here you can steal.”

“Really? And you lot, what are you doing here then?”

In this film, we see how the absolutism of corporate competitiveness has led all kinds of human beings – both good and manifestly evil – to acquire the same mindsets of excellence in what they do.  Bane’s plan is as coherent and thought-through as any marketing of a global brand has ever managed to be: even, perhaps, as ingenious and effective as that plan which has sold us the narrative that contains his story.

“The Dark Knight Rises” explains history quite magnificently.  From the dangers of a new French Revolution to the unhappy reality that, sometimes, evil individuals operating on the backs of masses do change the direction of humanity, Nolan’s images underline how fragile the order which contains our worst instincts really is.

In the light of the above, then, do I still believe in cooperative excellence over competitive Darwinism?

I think I do.

But after watching Nolan’s film, a single caveat: sometimes, civilisation needs uncivilised means to put evil genies back in their bottles.  The problem we have, when we decide this is the case, is that the process we use to choose who and when is still fraught with the unempowering hierarchies of old.

We cannot solve our crises of morality if the genie-containing procedures are not in themselves shared moral acts.

That a Tony Blair or a George W Bush should take it upon himself to lie to us (as, in the film, Commissioner Gordon did to his people for eight long years about the true nature of their alleged saviour Harvey Dent) in order, that is, to save us from our enemies … well, this is not only immoral but also – as we have seen in both the cinema and our own realities – rankly inefficient.  If for no other reason, then, than that of saving pecuniary pain, we should change not only when we go to war (whether figurative or literal) but also how we make that decision.

Perhaps, in truth, we need a little less excellence than we have always assumed.  Perhaps it is time to stop stretching the envelope so competitively.  Perhaps the mirror image of the Apples of this world truly is the Banes of cinematic existence.

Perhaps it is time to be less human – and more humane.

Apr 25, 2012

My thoughts on this matter continue to emerge.  Bring yourself up to date, if you wish, by reading this and its associated posts.

Just a couple more ideas to throw into the mix.

The social web’s major achievement seems to have been to convince people to work for global corporations for free.  Not only for free but also in exchange for handing over personal data such as names, locations, dates of birth and so forth.  We spend hours every day inputting what starts out as our data in a process whereby it essentially becomes their data – much of which in a discrete sense is of very little value.  But bundled together, as sparse data often has been over history, it takes on a whole new life and existence.

So where has that selfsame history brought us?  Whilst the 20th century was characterised by the multiple players of the industry of film taking over from the single authorship of the previous century’s novels, the 21st century will be characterised by a virtual sweat-shop of voluntary and addicted labour inputting its individually irrelevant datasets in order that algorithms and clever software manage to tease creative content out of the mix.

The creativity crisis both Chris and Rick speak so eloquently of is, in fact, no crisis at all – for there is plenty of employment to go around; the only slight problem from a living-your-life point of view being that it’s manifestly unpaid.

If we feel that the creative arts are inadequately funded, it’s because we’re looking in all the old places to create them.  The new and brightest locations for creativity exist in the online constitutions which convert the product of evermore humble data-inputters across the globe into interesting and engaging Web 2.0 content.  And funding isn’t necessary because the dumbing-down of process which characterises such corporate bodies everywhere has now also been applied to the end-users of such tools.  Which does beg the question: who, in fact, could justify paying anything to anyone for simply liking or commenting on an article?  In essence, we’ve been sold the line that what we do is ephemeral and worthless by itself – when in reality, via such dumbed-down processes which gather together and combine disparate data in new and unusual ways, it is really rather valuable, permanent and complex.

Are the machines on the point of taking over then?  I would argue, with billion-dollar stock market flotations and user populations in the hundreds of millions, the modern social web has already turned us into industrialised cogs – freeloading as it does quite brutally on the back of our own falling standards of living as we work for zilch.

This software I talk of serves to take the basest of another’s data and turn it into a financial gold which is then stripped of all authorship and right to proper remuneration.

A virtual alchemy finally exists, then, in the 21st century.  And its objects and goals – and victims too – just happen to be ourselves.

Oh, and one final thought to be going away with: if you believe in remunerating content providers properly but at the same time are thinking of using collated datasets of social content to run your businesses, think for a moment where all the latter information comes from – who produces it, under what conditions and how.

You may discover that the phrase “two-faced” comes to mind as you fight to impose your copyright laws on end-users of film, video, music and journalism – end-users who in a separate context you’re effectively employing unwaged in order that you might market better such legally protected products.

Yes.  Web 2.0 is a classic example of getting something for nothing.  Which doesn’t stop the most fervent supporters of copyright, even as we speak, resorting hypocritically to its charms.

Apr 20, 2012

Chris rightly asks the question:

The answer is that all pose what might be the most important question in economics – of how to encourage creativity.

I think, however, the question is misplaced – misplaced because economics, as well as observers of the creative industries themselves, still sees human endeavour on a playing-field where individuals are more important than mobs.  In fact, some would eagerly blame open source movements and other crowdsourcing efforts for having removed the individual – as well as their due compensation – from modern creation.

But if we’re honest about this, it started at least as early as the nascent 20th-century production line that was the Hollywood film industry.  (There are, if I remember rightly, historical references to the Flemish geniuses of Renaissance art running workshops which industrialised their output – though obviously nothing on the scale of Hollywood.  On the other hand, what did the printing-press bring to authorship if not the industry of the many cooks who might very well spoil the broth constructively?)

And this selfsame Hollywood, for quite a while, was able to impose a model that other industries such as newspapers readily copied: take advantage of the multifarious skills the properly channelled mob might contribute; pay them minimally for their efforts; and cream off the profitable results in terms of massive gains for hierarchies and shareholders decade after decade.

The problem, of course, for all the above now, is that the mob which once scraped a living by working for the corporates – which quite correctly invoked the added value that centralised communications, places of work and managed teams of able staff brought to very many creative people – has “disintegrated” into free-culture producer-consumers on the web.  The problem with the web isn’t just that the corporates are getting their content “ripped off”; the problem with the web is, really, that the ant-hill mob of selfless striving has replaced the permanent expectation to be individually famous – and paid for it.

If you stop blogging, another blog will replace you.  If you stop posting to Flickr, another photographer will step into your shoes.  We have taken on board so completely the fifteen-minutes-of-fame dynamic of Warhol’s that we actually now expect to be eventually trodden on – and our only desire is to carry on scurrying creatively for as long as our own personal resources last.

The problem, then, with creativity in modern economies isn’t finding ways of generating more of it.  We only have to read up on YouTube’s download and upload stats, on Wikipedia’s daily pageviews and on Pinterest’s current levels of interest to realise that quantity – and even quality – isn’t an issue.  The ant-hill mob is doing its biz – there’s no doubt about that.

No.  The real problem with creativity only exists within an individualist – and perhaps libertarian – focus on what human reward should really look like.  Even as traditional socialism vanishes from most of modern political practice, the old sharing and community instincts which form a part of being a human being find their expression in modern online creativity.

Essentially, creativity has finally gone all post-modern on us: it no longer needs the traditional economic process of investment, worker oppression and shareholder reward to produce its goods.

The question is whether this is satisfactory for any of us who still believe we human beings should be more than grains of sand on anonymous beaches.

And to that question, I really have no answer.

Maybe because part of its answer, sadly, lies in the meaning of life itself.

Oct 04, 2011

Here’s an interesting idea:

We use the internet for everything – from dating and gossip to hobbies and work. We also, crucially and increasingly, use it to form our most basic and closely held views about the world. And there is a huge amount of bad, wrong, misleading and malicious information, often masquerading as the genuine article.

Teachers are worried that young people are not being equipped to tell the difference. The 500 teachers polled rated their pupils’ ability to recognise bias, apply fact checks and verify sources to be below average.

Other surveys indicate that around one in four 12-15 year olds make no checks at all when visiting a new website.

Some form of ‘digital fluency’, encompassing both traditional critical thinking skills and specific knowledge about how the online world works (say, how search engines operate), needs to be put at the heart of education.

This issue reminds me a little of when I first started studying film at uni.  We were grandly – and, in the event, quite rightly – informed we would learn not how to “watch” a film but how to “read” a film.  And yet what was most interesting was how much film language we already knew.  Try now to imagine what it would be like to see a Hollywood film for the very first time – if you could properly imagine it, you’d know how lost you would feel.  Film is so ever-present that – as a language and medium – it has entered our very souls; to such an extent that we may even dream in its cross-cutting, close-up and cutaway techniques.

I know I certainly do.

It does occur to me, however, that perhaps this call to digital literacy – similar in appearance and apparent sensibility to the industrially spoonfed and popcorned moviegoers of the 20th century, with their film schools, film studies and film festivals – is not a little misplaced.  Most young people, I would imagine, studies and surveys notwithstanding, are much better positioned to detect and sense Internet falsehoods than their elders who still read the Daily Mail and the Sun.  And if they aren’t, then it’s precisely what they’ve experienced in the home, through their parents’ use of mainstream media, which has led them to take uncritically as read the stories they come across when surfing the Internet on their smartphones.

The real problem isn’t a virtual digital literacy, then – but, rather, a far longer and deeper tradition than that: knowing how to count on ten offline and real-world fingers.

And those who really need the kind of digital literacy I am talking about are those voters in that section of society who believe every supporting prejudice their cosy columnists choose to write about.