Saturday 15 October 2011

The Contagious Moment


As the credits roll at the end of Rise of the Planet of the Apes, a jet carrying a pilot infected with a deadly brain virus traverses the screen. The pilot has already coughed up blood in the departure terminal at San Francisco International Airport, so we know that the prognosis for mankind is not good, and as the plane traces a path between New York, Frankfurt and other destinations it isn't long before the globe is ensnared in a web of interconnecting lines.

Like a Lufthansa map iterated to the power of ten, the lines symbolize both the path the virus will take and the technological network that governs the transmission of material and immaterial objects in Rise's post-modern, and soon to be post-human, world.

It is a neat way of signalling the imminent Armageddon and the film's sequel. Ever since the early 1980s, when an Air Canada flight attendant, Gaetan Dugas – aka 'patient zero' – supposedly introduced AIDS to North America, novel contagions have been difficult to contain. Thanks to international air travel and the ceaseless demands of global trade and commerce, deadly rainforest pathogens are never more than a truck, train or plane-ride away from the nearest metropolis, which in our highly networked world is the same thing as saying that in an instant they can be everywhere.

However, it is not the incursion of nature red in tooth and claw into sanitised urban spaces that is the true subject of films like Rise and Steven Soderbergh's Contagion, which gets its long-overdue release in the UK this week, so much as the mayhem wrought by 21st-century technological networks. Sure, the posters for Contagion are adorned with biohazard signs, but it is the contaminating effects of panic – what might be termed moral or emotional contagion – that really get Soderbergh's creative juices flowing, hence the film's tag line, 'Nothing spreads like fear.'

Fear – and the money that can be made from it – is also at the centre of Robert Harris's new thriller, The Fear Index, in which the author employs Charles Darwin's The Expression of the Emotions in Man and Animals (1872) as a none-too-subtle plot device to signal to readers the thrills ahead.

'The heart beats quickly and violently, so that it palpitates or knocks against the ribs,' writes Harris, quoting Darwin, shortly before an intruder bursts into the home of his central character, Dr Alexander Hoffmann. 'As fear rises to an extreme pitch, the dreadful scream of terror is heard. Great beads of sweat stand on the skin…'

However, it is not the physiology of fear that interests Harris so much as its psychology and epidemiology. Hoffmann, a brilliant physicist who has quit CERN to set up a hedge fund in Geneva, has devised an algorithm, codenamed VIXAL-4, to monitor fear and its impact on financial markets.

As one would expect from a writer whose ear is tuned to the Zeitgeist, Harris studs the narrative with references to those now-familiar bogeys, quants and credit-default swaps. Stripped of financial jargon, however, the principle behind VIXAL-4 is simple: it is a machine that shorts stocks when fear and anxiety are on the up. In a pitch to investors, Hoffmann explains that with fear driving the world as never before, VIXAL-4 is a licence to print money.
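To make the principle concrete, here is a minimal sketch of a fear-driven shorting rule – an illustration of the general idea only, not a reconstruction of Harris's fictional VIXAL-4, whose inner workings the novel never spells out. The 'fear gauge' (think of something like the VIX), the look-back window and the threshold are all assumptions made up for the example.

```python
# Toy fear-driven shorting rule -- a sketch only, not the novel's VIXAL-4.
import statistics

def fear_driven_positions(fear_index, window=20, threshold=1.1):
    """Return one position per day: -1 = short the market, 0 = stay flat.

    Go short whenever the day's fear reading exceeds its trailing average
    by the given threshold; otherwise hold no position.
    """
    positions = []
    for day in range(len(fear_index)):
        if day < window:
            positions.append(0)  # not enough history to judge 'fear' yet
            continue
        trailing = statistics.mean(fear_index[day - window:day])
        positions.append(-1 if fear_index[day] > threshold * trailing else 0)
    return positions
```

The profit, if any, rests on the premise Harris gives his hero: that prices tend to fall when fear spikes, so a machine that sells into rising anxiety is, in Hoffmann's words, a licence to print money.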

But why should there be more fear about today than during the Cold War, when the world lived with the threat of mutually assured nuclear destruction? Hoffmann's – or rather Harris's – explanation is intriguing: 'The rise in market volatility, in our opinion, is a function of digitalisation, which is exaggerating human mood swings by the unprecedented dissemination of information via the internet.' In other words, writes Harris, digitalisation itself is creating an epidemic of fear.

This notion that digitalisation and the growth of informatics networks tend to propagate fear and other forms of emotional contagion also informs Contagion. As Jude Law, playing a conspiracy-minded blogger, races to uncover the truth about a deadly flu-like virus while officials at the Centers for Disease Control (CDC) in Atlanta agonise over how much to tell the public, Soderbergh suggests that in our technologised, network-dependent world, fear and hysteria are impossible to contain. Instead, amplified by social media and bloggers packaging and re-packaging rumours and half-truths, disinformation takes on a life of its own. The result is that just as viruses co-opt our DNA to make numerous copies of themselves, so fear 'goes viral', endlessly replicating itself and sowing doubt and distrust wherever it lands.

But should we employ biomedical metaphors in the context of informatics and social epidemiology? After all, contagion is not the same as transmission. Nor is contamination with a living virus – which presupposes the materiality and mess of actual bodies – the same as a virtual infection with an aberrant piece of computer code, which is what Harris is surely talking about when he reduces fear to an algorithm that can infect financial markets.

Accustomed as we are to thinking of contagion as an epidemiological term that can be metamorphosed into non-biological contexts, perhaps this is an abstraction too far, and all this talk of computer viruses and digital epidemics is obscuring what is really going on in informatics networks.

There is also another side to the equation. Just as in the world of information security biological tropes are used to understand computer viruses and design computer immune systems, so in mathematical epidemiology statistical and probabilistic methods are used to study the dynamics of populations and disease distributions.

Thanks to the World Health Organization's Global Outbreak Alert and Response Network (GOARN) and electronic disease-reporting systems such as ProMED, search engines now routinely trawl the internet for unusual disease outbreaks at the same time as other computers simulate epidemics and forecast patterns of morbidity and mortality.
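Those epidemic simulations are, at bottom, variations on the classic compartmental models of mathematical epidemiology. A minimal sketch of the simplest of them, the SIR (susceptible-infected-recovered) model, is below; the population size and the transmission and recovery rates are illustrative numbers chosen for the example, not estimates for any real outbreak.

```python
# Minimal discrete-time SIR epidemic model with made-up parameters.
def simulate_sir(population=1_000_000, initial_infected=10,
                 beta=0.3, gamma=0.1, days=200):
    """beta = transmission rate per day, gamma = recovery rate per day."""
    s, i, r = population - initial_infected, initial_infected, 0
    history = []
    for _ in range(days):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

history = simulate_sir()
peak_day = max(range(len(history)), key=lambda d: history[d][1])
print(f"Infections peak around day {peak_day}")
```

Nudge beta or gamma even slightly and the forecast curve shifts dramatically – one reason, as argued below, that such models can mislead when the data feeding them are poor.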

The result is what the philosopher and media theorist Eugene Thacker calls a 'real-time battle between networks', one biological, the other informational. Thacker's argument is that just as particular types of computer behaviour can be understood through the lens of biology, so infectious disease can be understood through the paradigm of mathematics and informatics. But while internet-based disease surveillance systems and statistical modelling can be highly effective – witness, for instance, the WHO's just-in-time response to SARS in 2003 – they can also backfire.

Interestingly, this is not only a matter of having accurate data. Although material and biological processes can be abstracted by epidemiology into statistical processes, at the end of the day they are about real, material things. And real viruses, unlike their immaterial, metaphorical counterparts, are messy; no matter how good the data, there is no guarantee they will behave in the way the mathematical models predict – hence the wildly inaccurate forecasts of projected deaths from swine flu in 2009.

It is such premature and often erroneous prognostications that explain the public's growing distrust of science and the popularity of the sort of conspiracy theories expressed by Alan Krumwiede, the character portrayed by Jude Law in Contagion, who spends much of the film obsessively blogging that the CDC is in cahoots with Big Pharma while promoting a dubious homeopathic remedy. The irony, of course, is that in social epidemiological terms this distrust is itself contagious. With the help of Twitter, Facebook and other social media, conspiracy theories now spread as rapidly as any virus, sparking scepticism, fear and, sometimes, hatred of the authorities, whether they be health officials sitting in Whitehall or police chiefs with a hotline to the News of the World.

In this respect we should not have been shocked by the response to the shooting of Mark Duggan in Tottenham in August. After all, as many commentators pointed out, the Arab world had just given us an object lesson in the power of social media to spread contagious ideas. That the London rioters did not appear to share the lofty political ideals of the protestors in Tahrir Square was beside the point. It was the network that enabled the looters to organize and infect others with their criminal intent.

For many, this was proof that BBM, Facebook, and Twitter were forces for evil and that the authorities should block social media in times of trouble. But to blame the internet for propagating contagious ideas is a little like blaming trucks, planes and birds for spreading infectious viruses. Without trucks and planes there would be no oranges on British breakfast tables in winter and no summer holidays in Tuscany. Those criss-crossing lines at the end of Rise are what make global trade and travel possible. It is not the fault of the network that it also facilitates the spread of bird and swine flu.

But if contagion is not equivalent to transmission in the virtual world of the internet, nor should we conflate viruses with transmission networks in the material world of biology. That is the trap epidemiologists fall into when they try to use mathematical models to make sense of real-life contagions.

Just because the CDC was able to trace a pattern of sexual contacts between Gaetan Dugas and men infected with HIV in San Francisco, Los Angeles and other North American cities in the early 1980s, that does not make Dugas 'patient zero' any more than it makes the pilot jetting off at the end of Rise the progenitor of the downfall of the human race (for all we know there may well be other patient zeros and other pilots). It simply makes them so many nodes in a network.

Nor does the frequent recourse to metaphors of financial contagion explain the present volatility of world stock markets. While it is tempting to see such volatility as a function of digitalisation, Harris is surely wrong to blame it on the internet's tendency to exacerbate human mood swings. Rather, as the Bank of England pointed out in a recent report, such volatility flows directly from the increasing complexity of financial networks and the interconnectedness of modern financial institutions.

By spreading defaults across the system, the bank argues, such linkages reduce the likelihood that the losses of a large institution like Lehman Brothers will trigger similar defaults by other banks and brokerage houses. At the same time, however, such linkages increase the potential for contagion to spread more widely. The result is what the bank calls a 'robust-yet-fragile' tendency that makes the probability of contagion low, but the effects extremely widespread when problems arise.
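A toy cascade model – a sketch of the kind of network exercise such reports gesture at, not the Bank of England's own model – shows the robust-yet-fragile intuition at work: spreading exposures across several counterparties lets any single failure be absorbed, but a shock large enough to exhaust the buffers travels along the very same links to everyone. The banks, exposures and buffer sizes below are invented for illustration.

```python
# Toy default cascade on an interbank network (illustrative only).
# Each bank has a capital buffer; when a counterparty defaults, the bank
# absorbs the exposure as a loss and defaults in turn if its buffer is gone.
def cascade(exposures, capital, initial_failures):
    """exposures[bank] maps each counterparty to the amount it owes that bank."""
    defaulted = set(initial_failures)
    frontier = list(initial_failures)
    losses = {bank: 0.0 for bank in capital}
    while frontier:
        failed = frontier.pop()
        for bank, owed_by in exposures.items():
            loss = owed_by.get(failed, 0.0)
            if loss and bank not in defaulted:
                losses[bank] += loss
                if losses[bank] >= capital[bank]:
                    defaulted.add(bank)
                    frontier.append(bank)
    return defaulted

# Three banks, each owed 5 by the other two, each holding a buffer of 8:
exposures = {b: {c: 5.0 for c in "ABC" if c != b} for b in "ABC"}
capital = {b: 8.0 for b in "ABC"}
print(cascade(exposures, capital, {"A"}))       # {'A'}: one failure is absorbed
print(cascade(exposures, capital, {"A", "B"}))  # all three: rare shock, total spread
```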

This is the contagious moment we find ourselves in today, and it explains why, as European leaders dither over the scale of the Greek bail-out, every day brings further plunges in the FTSE, Dow and Nikkei. Moreover, while it is true that these plunges may be exacerbated by hedge funds trading on fear much like Hoffmann's VIXAL-4 algorithm, it is the linkages between institutions – not the individual investors themselves – that spread that fear more widely.

That is why restoring confidence is so much more of a challenge today than it was in 1933, when Franklin D. Roosevelt famously assured the American people that the only thing they had to fear was 'fear itself'. Today, it is our very connectedness that makes us vulnerable to emotional and other forms of contagion, and the networks themselves that are the real source of instability and dread.

Tuesday 21 June 2011

Schadenfreude and human nature

If emotions were Twitter feeds then schadenfreude would surely be 'trending' high right now. From Cheryl Cole's surprise ejection from the American X Factor to Manchester United's drubbing at the hands of Barcelona, schadenfreude is the emotion of the moment or, at least, the term hacks are most likely to reach for to describe that warm, fuzzy feeling we get on learning of the deserved comeuppance of celebrities of dubious talent and overpaid Premiership footballers.
Given the inherent unfairness of the capitalist system, the popularity of schadenfreude should not surprise us. Look on it as nature’s way of compensating us for the wealth and fame unfairly showered on a lucky few.
What is less easily explained, however, is the reluctance of social psychologists to reach for the term following the death of Osama Bin Laden. This was the starting point of a fascinating talk given by the anthropologist Allan Young at a conference, 'Mastering the Emotions', held at Queen Mary, University of London, in June.
Usually defined as 'the pleasure derived from the misfortune of others', schadenfreude would appear to be an apt description of the collective joy that erupted on the streets of America the night the White House announced it had finally got its man. But writing in the New York Times the day after Bin Laden's death, Jonathan Haidt, a professor of psychology at the University of Virginia, chose to describe the frenzied celebrations as a form of 'collective effervescence' – Durkheim's term for the joy experienced by individuals when they participate in group rituals for the communal good.
According to Young, Haidt's eschewal of schadenfreude spoke volumes, pointing to the way that the term has been reduced by purveyors of 'social brain' theory to the neural equivalent of 'empathic cruelty'.
Now nobody likes to be thought of as both empathic and cruel, but one of the key findings from the burgeoning field of neuroeconomics is that such traits may be hard-wired into each and every one of us as part of a mechanism of 'altruistic punishment' that may once have conferred an evolutionary advantage. In a famous experiment conducted by Dominique de Quervain and his colleagues at the University of Zurich in 2004 involving a scenario known as the 'trust' game, player A is given a sum of money and offered the choice of advancing all or a portion of it to player B. If player A decides to trust B by advancing all of his money, the investigator quadruples his gift to B, and if B then sends half the money back both players are better off. However, if B violates A's trust, then one minute after B makes his decision, A is given the option of punishing B by revoking the investigator's top-up gift. In one variant, punishing also costs A money of his own.
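With some illustrative sums (the amounts and rules in the published experiment differ in detail), the payoff arithmetic of the game described above runs roughly like this:

```python
# Toy payoff arithmetic for the 'trust' game sketched above.
# The stake and punishment cost are illustrative assumptions, not the
# figures used in the de Quervain study.
def trust_game(stake=10, b_returns_half=True, a_punishes=False,
               punishment_is_costly=False, punishment_cost=2):
    """Return (payoff_A, payoff_B) for one round."""
    pot = stake * 4                       # the investigator quadruples A's advance
    if b_returns_half:
        return pot // 2, pot // 2         # trust honoured: both better off
    payoff_a, payoff_b = 0, pot           # trust violated: B keeps the lot
    if a_punishes:
        payoff_b -= pot - stake           # revoke the investigator's top-up
        if punishment_is_costly:
            payoff_a -= punishment_cost   # the costly-punishment variant
    return payoff_a, payoff_b

print(trust_game())                                        # (20, 20)
print(trust_game(b_returns_half=False))                    # (0, 40)
print(trust_game(b_returns_half=False, a_punishes=True))   # (0, 10)
```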
To their surprise, when the researchers scanned player A's brain they found that a region known as the dorsal striatum lit up in anticipation of inflicting the punishment on player B. In other words, punishers were empathically mirroring the imagined (anticipated) distress of the cheaters and, at the same time, experiencing pleasure. Not only that, but the more intense the punishment doled out to cheaters, the more the punisher's dorsal striatum lit up.
You don’t need to be a neurobiologist to realise that these findings are somewhat disturbing. After all, if we’re all hard-wired for altruistic punishment the implication is that ‘normal’ people may not be that far removed from psychopaths.
Young's suggestion, if I understand him correctly, is that such findings are also uncomfortable for social psychologists as they undermine Enlightenment conceptions of human nature as both rational and perfectible. Instead, schadenfreude becomes an epiphenomenon of the brain, an ignoble reminder of our base beginnings when cruelty was its own reward – hence, presumably, Haidt's avoidance of the term.
I do not have space here to do justice to the rest of Young's talk; suffice it to say that he argues that the evolutionary narrative of the social brain has clear affinities with Adam Smith's account of the moral underpinnings of capitalist social relations. However, Young contends that the thinker who comes closest to anticipating the notion of 'empathic cruelty' and confronting its implications for human nature is Nietzsche in his On the Genealogy of Morality (1887).
For Nietzsche the model of altruistic punishment makes no sense because, unlike Smith, he does not see suffering as a form of economic exchange. It is not convertible like coins, transferable from one hand to another. Instead Nietzsche solves the riddle by arguing that the punisher's gratification comes not from seeing the cheater suffer but from the visceral proof it gives him of his own power. In other words, it is the punisher's thirst for knowledge about himself and his world that explains the bond between empathy and cruelty.
Nietzsche's solution is appealing because it seems to go to the heart of the emotional displays that followed Bin Laden's death. After all, what were they about if not visceral proof of American power and a celebration of that fact? At the same time, Nietzsche's solution accounts for the popular everyday usage in which schadenfreude is emotional shorthand for the affirmation we get from seeing those we consider undeserving of their elevated status pulled down a peg or two.
The difference is that while the latter type of schadenfreude is generally considered a bit of harmless fun, the former type often strikes us as excessive and malicious. The hope, on this side of the Atlantic at least, is that schadenfreude isn’t reducible to mere affect and that while we may be hard-wired for empathic cruelty our capacity for reflection means we can also sometimes rise above it.




Friday 27 May 2011

Apocalypse Redux

So now we know. The World Health Organisation (WHO) did not ‘fake’ the 2009 swine flu pandemic in order to ‘line the pockets’ of vaccine manufacturers. While its decision may have struck many observers as precipitous and disproportionate, far from being a stooge for the pharmaceutical industry the WHO acted out of the best intentions.
That, at least, was the gloss put on a long-awaited report into the WHO's handling of the 2009 pandemic by its director-general, Margaret Chan, this week.
At first glance, Chan would appear to have good reason to say ‘I told you so’. The Chinese-born public health supremo has endured stinging criticism since her hubristic announcement in June 2009 that the WHO’s global influenza surveillance systems had given the world a ‘head start’ on responding to the swine-origin H1N1 virus that had emerged unexpectedly in Mexico just months before.
One of her harshest critics was the British Medical Journal which, following an investigation into suspected financial links between members of the WHO’s emergency committee that advised on the timing of the pandemic and Big Pharma, warned that Chan’s refusal to disclose the names of the panel members would ‘seriously damage the WHO’s credibility’. At the time, Chan insisted that ‘commercial interests’ had played no part in the WHO’s decision-making process and that she owed the panel members a duty of confidentiality.
This week's report by the International Health Regulations (IHR) Review Committee is broadly supportive of Chan's stance. Not only did the review, led by Dr Harvey Fineberg, president of the Institute of Medicine, find no evidence of financial 'malfeasance', it also found that, far from rushing the declaration of a pandemic, the WHO delayed the announcement until it was sure the virus was spreading between countries in a sustained manner.
For all that Chan will claim that the IHR report exonerates the WHO of wrongdoing, however, the small print contains some sharp criticisms. The overall impression conveyed by Fineberg and his fellow rapporteurs is of an organisation that is unwieldy and overly sensitive to criticism. Indeed, the committee found that the accusations of fakery and financial malfeasance stemmed to a large extent from the WHO's failure to take seriously public suspicions and confusion about the way it had shifted its definition of a pandemic. According to the report, much of that confusion stemmed from an innocent but unfortunately timed technical revision – one that was compounded by some very untransparent amendments to the WHO's online documentation.
Up until April 2009, the WHO had defined pandemics as causing ‘enormous numbers of deaths and illness’. But on 29 April, with mounting evidence that the H1N1 virus was spreading rapidly to other countries from Mexico and the United States, the WHO suddenly deleted the phrase from its website, replacing it instead with a definition based solely on influenza morbidity (numbers reporting sick) and the degree of spread. To compound the offence and provide further fuel to internet conspiracy theorists, without notice or explanation on 4 May the WHO also altered other online documents so as to make its definitions more precise and internally consistent.
In fact, as the IHR review makes clear, these changes had been under discussion for nearly a year and a half. According to the report, the WHO's original definition had been adopted in 2005 to reflect the pandemic concerns of the time about the bird flu virus, H5N1, which, although it transmitted poorly between people, had proved extremely lethal in the handful of cases where it had infected humans. By early 2009, however, with mounting evidence that bird flu was spreading worldwide and that early action could contain and halt its further spread to human populations, WHO experts recommended removing the severity requirement. The result was that from May 2009 all that was required to trigger a pandemic – and those hugely lucrative sleeping vaccine contracts – was sustained 'human-to-human spread' in at least two countries in one WHO region plus 'community-level outbreaks' in at least one country in another WHO region.
When bloggers noticed the changes and drew attention to the coincidence in timing on social media sites, the WHO's press office was inundated with inquiries. Rather than publish all the documents and dispel the suspicions, however, the WHO compounded them by burying its head in the sand. Thus was yet another conspiracy theory born.
In retrospect, given that swine flu did not turn out to be the 'Armageddon strain' many scientists feared, the IHR committee concludes that the WHO would have done better to retain the severity requirement. The report says the WHO also could and should have done more to dispel the suspicion about the timing of its definition change. At the same time, by insisting on keeping the identities of the emergency committee members confidential, the report concludes, the WHO 'paradoxically fed suspicions that the Organization had something to hide'. It also recommends that in future the WHO adopt more open procedures for 'disclosing, recognizing and managing conflicts of interest among expert advisers'.
Judging by her hyper-sensitivity on the issue, Chan is unlikely to welcome even these mild criticisms. In the past, she has argued that it was only a stroke of luck that the swine-origin H1N1 virus proved so mild. The irony is that had swine flu proved as virulent as bird flu then we would be thanking her for triggering those sleeping vaccine contracts, not scolding her. As she rather pompously informed the BMJ in an open letter last year, the decision to raise the pandemic alert level was based on ‘clearly defined virological and epidemiological data’ and ‘it is hard to bend these criteria, no matter what the motive’.
In the future, Chan would be advised to put rather less faith in science and use some common sense, for history shows that the only thing that is truly predictable about influenza pandemics is their unpredictability. Or as the IHR puts it: 'Lack of certainty is an inescapable reality when it comes to influenza.'

A Sentimental Journey

On my way to the US to escape the Royal Wedding, I killed time by selecting The King's Speech on the in-flight movie channel. Having somehow conspired to miss Colin Firth's Oscar-winning portrayal of George VI the first time round, I was keen to see it for myself and judge whether he deserved his garlands.
I don't consider myself a blubberer, but by the time Firth emerged from the sound studio at Buckingham Palace I don't mind admitting I was in tears. The odd thing was that the following morning my lachrymosity worsened. Awaking bleary-eyed at 6am to catch a glimpse of 'that dress', I wept buckets as Michael Middleton escorted Kate up the aisle before offering her up to Wills and a grateful nation. In my semi-comatose state I was even prepared to overlook Piers Morgan's disingenuous commentary for CNN and his ludicrous declaration that 'the British monarchy is back!'
As a republican and someone who studies emotions for a living, I ought to be immune to such sentimental claptrap. That I am not cannot simply be put down to jet-lag and being far from home.
Royal weddings, like Firth’s artful portrayal of George VI, are a performance. To succeed they must persuade us not only to suspend disbelief but to invest in the delusion that but for an accident of birth we too could be walking up the aisle or, in my case, giving away my daughter’s hand in marriage.
Just as the hypnotist uses suggestion to achieve mastery over his mesmeric subject, so the Windsors beguile us, their subjects, with an artful combination of pageantry and a narrative rich in symbolic portents. No matter that we have been here before. Unlike in 1981 when Charles married Di, this time we are told the fairytale really has come true. 
I have no idea whether it is true romance this time but on April 29th, like billions of others watching around the globe, I suspended disbelief and surrendered to the emotion of the occasion. Though isolated and alone in my hotel room, thanks to the magic of TV I felt intimately connected to the crowds camped outside the Abbey and along the Mall – hence when they cried, I cried too.
Such emotional expressions, I would argue, are mere reflexes, involuntary responses to the power of ceremony and ritual to move us in mysterious ways. As The King's Speech showed, communications technology is crucial to this process. The advent of radio massed people in new ways, allowing both the House of Windsor and Adolf Hitler to reach out to and manipulate what Gustave Le Bon called the 'crowd mind'. By contrast, in the Victorian era, when news of a royal marriage or death was conveyed by the electric telegraph, mass outpourings of joy or grief were less predictable – and usually delayed until the following morning.
In the modern era, of course, thanks to television and the internet, the process is virtually instantaneous. When Charles married Di it is said that during the dull bits – such as the signing of the register, when people got up to boil their kettles – the National Grid became 'a barometer of national feeling'. In April 2011 the best barometer was YouTube's 'Royal Wedding channel', where at the time of writing Kate and Wills's kiss on the Buckingham Palace balcony has registered in excess of two million views.
Like the Westminster Abbey verger Ben Sheward's spontaneous cartwheels of joy along the red carpet (544,000 views to date), these 'moments' of emotion now appear on an endless loop, captured for ever. Watching them in 100 years' time, I wonder what historians will make of them and whether they will weep in sympathy or shake their heads in incredulity.

Tuesday 26 April 2011

Talking 'bout our generations

It could happen to any of us. Your son or daughter asks if they can have some friends round for a 'gathering'. Thrilled that they have taken you into their confidence and consider you 'cool', you acquiesce. Wanting to seem even cooler, you buy them some beer and arrange to go out for a couple of hours, leaving them the run of the house.
Nine times out of ten that would be the end of the story. But for Brian Dodgeon it was the beginning of a nightmare, one that, to judge by the report in yesterday's Times, could end in a ten-year prison sentence for the University of London lecturer. I have no idea what went on in Mr Dodgeon's home in North Kensington last weekend, but the death of 15-year-old Isobel Reilly from an apparent drug cocktail at a party given by his daughter in the early hours of Saturday morning should be a wake-up call for every parent.
I did not know Isobel well. In all, ‘Issy’, as she was known to my son and his friends, visited our home on four occasions. My son was dating a girl from Chiswick and Issy was her best friend and we accepted her as part of the package. Now and again Issy would pop downstairs to use the loo or get a drink, and on occasion she would wander into my study and quiz me about my work – she seemed intrigued that I was a journalist and from later exchanges I realised that she had Googled me. I liked her. She had personality and pizzazz.
My son seemed to like having Issy around too so when, one evening, he asked if he could host a gathering we were happy to oblige. We even bought him a few beers and went out for a couple of hours, leaving him and his friends the run of the house. After all, wasn’t that the ‘cool’ thing to do?
In our case, I am happy to report that when we returned there were no irate neighbours standing on our doorstep. Nor were there any breakages or pools of vomit on the kitchen floor. The party, in other words, would not make headline news. But should we have provided alcohol at all? After all, these were 15-year-olds, below the legal drinking age. And, as appears to have been the case on the night of Issy's death, there were no adults around to supervise proceedings should anything have gone wrong – although, in our case, we had only gone out for a couple of hours and had been back well before midnight.
Another difference was that in our house, unlike, if the Independent is to be believed, in Mr Dodgeon’s, there was no drug stash waiting to be discovered. But what if, unbeknownst to us, someone had introduced drugs to the mix and someone had overdosed? Would we then too have risked charges of child abandonment and/or reckless endangerment?
At the impromptu memorial service held for Issy on Chiswick Green yesterday, these were the questions at the forefront of many parents' minds. As one might expect, there was a wide disavowal of drugs, but the alcohol issue was less clear-cut. From what I could gather the prohibitionists were in the minority. As one dad put it: 'Better they drink at home than on the streets'. Curfews proved an equally thorny issue, with some parents favouring a lights-out-at-11pm policy and others opting for midnight or later. Whatever the policy in one's own home, however, there was widespread recognition that there was little we could do to control what went on in other parents' homes. And there was the rub. None of us wants to play bad cop. We all want to be the parents your son or daughter is happy to introduce to his/her friends. But in fostering a climate of accommodation are we guilty of blurring the boundaries between the generations? Would it not be better to act our age and demand that our children do the same?
These are not new issues. My parents faced the same dilemmas in the 1970s when I was a teenager. The difference then was that the drugs we might bring or 'discover' at a party were far less likely to cause catastrophic toxic reactions, though I can recall plenty of friends turning a lighter shade of pale, usually as a result of drinking to excess and inhaling on unfiltered joints laced with hash and nicotine-rich tobacco.
No doubt Issy's death will provoke the usual laments from right-wing moralists about our 'overly' promiscuous society and the steady decline in 'family values'. Then there is the very modern delusion, suggested yesterday by one teacher from Issy's school, that it is all the fault of the internet and that, were it not for Google and SMS text messaging, we could somehow keep children ignorant of the temptations waiting to snare them out there in the 'adult' world. But children have always been adept at discovering society's hidden vices, and you cannot reverse half a century of social and technological change.
The challenge for all of us, children and adults alike, is to live in the present, in the world as it is, not as we wish or imagine it should be. Doing that will require a franker dialogue between the generations than, as far as I can discern, has been the case hitherto. It is a dialogue I would have liked to have had with Issy. Next time I will not be so cool.

Saturday 29 January 2011

Some very brief reflections on celebrity


Is celebrity an appropriate topic for academic discourse? Can an analysis of the ways in which celebrity is ‘produced and consumed’ – to use the jargon of cultural studies departments - deepen our understanding of social and political processes, or does it risk trivializing our engagement with history? In short, can historians talk about celebrity and still be taken seriously?
I only ask because issue two of a new journal, Celebrity Studies, has just arrived in my in-tray. The brainchild of two media studies academics, the journal's entries this month include 'hypertrophic celebrity in Victoria Beckham' and 'Gordon Brown and the work of emotion'.
No, I don't know what 'hypertrophic celebrity' is either, nor how it came to be embodied in Victoria Beckham, and, on a cursory reading, it's easy to see how this stuff can earn academics a bad name. Certainly, that was the line the Independent took when Routledge launched the title, accusing the editors of 'pseudo-academic mumbo jumbo' and of taking the subject of celebrity too seriously.
Until last week, I would probably have been inclined to agree. Celebrities already take themselves seriously enough, thank you. The last thing they need is canonisation by academics too.
Following the latest revelations in the News Of The World phone-hacking saga, however, I am no longer so sure. For future political historians may well see January 28th, 2011 as the day when Rupert Murdoch’s grip on power suffered a fatal blow – all thanks to an obscure interior designer who also happens to be Sienna Miller’s step-mother. Yes, dear reader, like you (I hope) until last week I had never heard of Kelly Hoppen, but she’s certainly got my attention now.
If, as Hoppen's lawyers allege, it turns out that the NOTW was hacking her phone as recently as last March and that executives higher up the food chain knew about it, then News International's claim that the phone-hacking was the work of one or two rogue reporters begins to wear increasingly thin, jeopardizing not just former executives on the paper but Murdoch's strategy for taking majority control of Sky too. It's a big if, of course. Murdoch may yet see off Hoppen and the other celebtoids nipping at his heels, but it will be a delicious irony should Hoppen – a product of today's media obsession with celebrities – eventually prove to be Murdoch's undoing. Murdoch's papers have been able to make – and break – celebrities on both sides of the Atlantic for more than two decades now, but if the phone-hacking scandal demonstrates anything it's that in the same period there's been a slow but steady shift of power away from the media towards its Frankenstein-like creations.
Without coming over all 'cultural studies' about it, Hoppen et al are no longer just tame products to be consumed and discarded by an 'omnipotent' media; they are now themselves laying claim to that power by challenging the media's right to intrude upon their lives and dictate their public identities.
Of course, the media isn’t the only field in which celebrities increasingly call the shots. From Hollywood, where ‘star’ actors get to green-light movies, to the catwalks of Milan and Paris where Victoria Beckham gets to decide which designers are ‘in’ this season, to charitable causes in Africa fronted by pop stars like Bono toting Louis Vuitton bags, the power of celebs is now everywhere on show.
As we have seen in California with the election of Arnold Schwarzenegger and, arguably, in London with Boris Johnson’s elevation to the mayoralty off the back of his comic turn on Have I Got News For You,  it is also increasingly a short-cut to high political office.
There is nothing particularly new about this, of course. As Fred Inglis makes clear in his fascinating A Short History of Celebrity (if you’re going to write about celebrity and still want to be taken seriously as a historian ‘short’ is definitely the way to go) the modern production of celebrity can be traced back to the Enlightenment. Today, we have Katie Price and Damien Hirst. The 18th century equivalents were Georgiana, Duchess of Devonshire, and Lord Byron. The democratisation of society in the Victorian period sealed the deal, turning celebrity into a commodity with mass appeal and making Florence Nightingale and Annie Oakley, the star of Buffalo Bill’s Wild West Show, household names.
Similarly, in my own period – the 1890s – I have been struck by how the Graphic and the Illustrated London News prefigured the productions of Heat and Hello!, pioneering the use of photo-spreads celebrating the lives of Victorian actresses and minor members of the royal family. More than any other publication, it was, of course, the Daily Mail, launched in 1896, that ushered in the era in which newspapers sought to confer celebrity on everyone from politicians to pantry maids, with Alfred Harmsworth playing the role of self-appointed puppet-master-in-chief. The difference was that in the 1920s Harmsworth (by then Lord Northcliffe and also the proprietor of The Times) truly had a lock on the means of celebrity production, controlling 40 percent of the British newspaper market. That is a power that, today, Murdoch can only dream of.

Thursday 9 December 2010

History of the Present

As a PhD student of un certain age it’s sometimes easy to feel that history is passing you by. Then there are days when the present comes vividly alive. I am too young to have been a soixante-huitard and was in Paris for the poll tax riots in 1990, but there was definitely a whiff of revolution in the air today as I rounded the corner of Malet Street on my way to a seminar at the London School of Hygiene and Tropical Medicine and found my path blocked by a phalanx of police in Day-Glo jackets. It was midday and a little further up the road, a boisterous crowd had already been kettled outside Senate House. By the late afternoon, as darkness fell on Parliament Square, the TV news would be full of images of bonfires and swinging batons, but in Malet Street five hours earlier there had been something of a carnival spirit, a desire to burst the ConDem balloon with barbs not bombs.
‘Vince Cable can’t decide whether to get into bed with Cameron or cower under the duvet,’ mocked one speaker. ‘Pinocchio Clegg didn’t pay to go to university so why should we?’ demanded another.
Amidst the predictable mass-produced Cameron Pig signs, reminiscent of the Class War placards of the early 80s, there were also stabs at homemade humour. My favourite read: 'If I have learned anything from these protests it's that spellcheck doesn't work on cardboard.' I'm pleased to report the sign was correctly spelt, so obviously a university degree still has some uses.
So now that the government has got its majority, will the student demonstrations fizzle out or morph into a genuine mass protest movement? To listen to some of the speakers gathered in Malet Street, it already has: 'You've already shown your solidarity with the Greek protestors and forged links with the unions. Whatever happens today, this isn't over until you kick the bankers and their stooges out of office.'
Certainly, this generation of students has already shown impressive organisational skills, but the hotheads are rapidly giving the movement a bad name and it is hard to see the centre holding during what promises to be a long ConDem winter. I have no doubt the anger on the streets is genuine, but I fear it has less to do with the scale of the fee hikes than the sense of betrayal at the Liberals' broken promises. After all, it is not the 20-year-olds on the streets of London today who are going to be saddled with upwards of £30,000 of debt, but the 16-year-olds coming up behind them. They face a stark choice: to accept the tuition fees as a fait accompli and knuckle down to the task of scaling the corporate ladder so that one day they may have the means to repay them, or to take up where previous generations left off. The '68ers weren't just fighting for themselves but for a different kind of society and a different model of historical progress, one in which capital and free markets were not the only measure of value. During the neo-conservative boom years that followed the collapse of communism we heard a lot about the 'end of history'. Perhaps December 9, 2010 was the day that 'history' finally spluttered back to life.