New Nerd Order

What's on the minds of our masters?

Text by Peter Lyle

Tank, vol. 7, issue 2

Collage by Timothy Holloway



How do geeks think? We live in a world in which Facebook, Google, Twitter and Amazon are sometimes credited with as much political and cultural influence as old-fashioned nation-states. And the faces behind the software that made these new systems of communication possible are now being scrutinised and celebrated with the same fascination once reserved for heads of state or all-conquering industrialists.

If Julian Assange’s past year of media stardom is the most obvious example of this new era of geekography, then last year’s Facebook film The Social Network comes a close second. From the start, Aaron Sorkin’s script deftly paints Facebook’s creator Mark Zuckerberg as a brilliant mind who nevertheless has trouble relating to people. His status as an outsider who yearns to belong causes him to see success in life as the accumulation of a series of official, branded achievements (being a member of certain college clubs, having a hot girlfriend, having a large number of verified “friends”, only doing what is “cool”). Of course, as well as building his site as a means to becoming successful, a somebody, he also constructs it as a kind of temple to that idea.

Jesse Eisenberg as Mark Zuckerberg fluctuates compellingly between childlike, outsider awkwardness (nervous rocking, avoidance of eye contact) and acute, incisive arrogance. He’s nervous, but he’s Napoleonic. (A New Yorker profile timed to coincide with the film’s release ended with Zuckerberg, an avid classics student, recollecting a passage from Virgil’s Aeneid about building “a nation/empire without bound”.)

It’s a convincing representation of the archetypal geek: obsessive, fiercely independent but yearning for community, critically analytical, socially awkward but charismatic, shy but egomaniacal, suspicious of emotion, incredibly smart but unusually suggestible, intellectually bold but physically vulnerable, preoccupied with freedom of information but also innately private. You see this archetype in portrayals of Assange as the avenging superman who, if pushed, could bring the whole of Western civilisation tumbling down with one USB stick, and in sympathetic media coverage of Gary McKinnon, the British hacker fighting extradition for raiding US military secrets. (It was also unmistakable in some journalistic accounts of the recent history of Bradley Manning, the soldier charged with leaking US diplomatic cables to WikiLeaks, and in early US responses to a new autobiography by Paul Allen, who co-founded Microsoft with Bill Gates and who has been criticised for being unduly ungenerous about his old friend in the memoir.)

In other words, the supergeek of today’s digitally driven world is still portrayed as the same kind of character as the loser-geeks of yesterday: capable of great feats of abstract intelligence on the one hand, but emotionally immature and incapable on the other.

I have to confess to a lifelong interest in geekdom, which is founded on two key factors. The first is an accident of birth. If you’re a male humanoid who grew up in the 1980s, when Star Wars was still the dominant cultural obsession and the first, primitive but programmable home computers were finding their way into your well-heeled school friends’ homes, an interest in geeks is kind of inevitable. I grew up with geeks. I had a cousin who, aged eight, would spend days making complex Lego Technic constructions and then dismantling them without ever playing with them, and be utterly humourless about any deviation in the process. He’s now a sought-after software developer. I knew another boy who started doggedly making little tunes on a late-1980s computer called an Atari ST and ended up making special effects for George Lucas. There’s another whose childhood obsession with audiovisual technology led him to the much-envied, well-paid joy of filming Premiership football games for a living. (I asked a few of them to talk to me, but one was too busy, in the middle of an all-hours coding crunch; the others were understandably guarded and reluctant to be the subject of semi-anthropological assessments.)

I was also loosely aware that I didn’t want to be a geek. That was partly because I knew I’d never have the singular patience or mathematical brilliance to turn play into methodical product in the way my friends did. But the second key factor in my interest was that I was scared of the idea of becoming one. This was not purely because of the word’s loser connotations, but also because I was ambivalent about the singular focus and the sacrifices that geeks’ achievements seemed to demand of them. I was prone to extreme enthusiasms and obsessions, to that boyish compulsion to collect and catalogue things, but I also always had a fear that this urge, left unchecked, could be unhealthy. Prime example: I really liked Star Wars. I really wanted all the toys. At the same time, I distinctly remember a primary school teacher who, in 1982, had every single toy that you could buy, in multiples where thematically appropriate. My awe turned into a kind of creeped-out contempt for his joyless, compulsive attention to and organisation of them.

That is to say, it seemed that letting your obsession with complete or perfect systems overwhelm your primal response to the thing itself – officially becoming a completist trainspotter rather than a mere lover of locomotives; pinning butterflies to boards instead of admiring them in flight – was a way of embalming the thing you’d originally loved.

Another aspect of my interest in geekery came in adult life. Over the past decade, I’ve watched someone with a form of autism grow into adolescence, and I’ve also become reasonably well acquainted with the work of Simon Baron-Cohen, the Cambridge psychologist who is head of the university’s Autism Research Centre. He has proposed the idea of the “extreme male brain”, namely that autistic disorders are an overdeveloped version of the normal male tendency to be better at abstract problem-solving than at empathy with other humans. Which is something I saw in my geek friends growing up, and heard them joke about later in life. When the last round of media debates about “designer babies” came along, Baron-Cohen warned against the idea of screening out “undesirable” DNA for qualities such as autism – if we bred autism out of the human race, we’d also risk doing away with exceptional mathematicians. Indeed, mathematicians, actuaries, programmers, hackers and their like can all do what the rest of us can’t precisely because of this singular, obsessive focus.

While we’re talking about the archetype of the geek and autistic conditions, these passages from two of Baron-Cohen’s numerous papers on aspects of autism, though long, make the argument very clear:

Individuals with autism spectrum conditions display marked difficulties in reciprocal social interaction. However, while clinically important, focusing exclusively on the interpersonal difficulties in autism may overshadow the importance of the self in underlying such difficulties. Historically, the self has always been integral in defining autism. The word “autism” derives from the Greek word autos and literally translates to “self”. Early clinical accounts anecdotally suggested that individuals with autism spectrum conditions are completely self-focused or “egocentric in the extreme”. Later work demonstrated that this egocentrism may be manifest in the lack of viewing oneself as embedded within social contexts and via the lack of distinguishing self from other. In addition to this lack of distinguishing self from other, individuals with autism also have marked difficulties in self-referential cognitive processing. These difficulties extend to reflecting on one’s own false beliefs […] or intentions.

Which is to say, a geekish removal from the world, an apparent shyness, may actually be a manifestation of a kind of self-regard, one so pronounced that it can’t even adjust for the mindsets of others. Moreover, this kind of self-regard may make you even worse than the rest of us at acknowledging the subjectivity of your own thought.

The Autism Research Group was the first group to suggest a link between autistic traits, familial autism risk and talent at systemising. Systemising is the drive to analyse or construct a system. A system is anything that follows rules and is thus lawful. It might be a mechanical system (e.g., a machine or a spinning wheel), an abstract system (e.g., number patterns), a natural system (e.g., water flow or the weather), or a collectible system (e.g., classifying objects such as DVDs by author or toy cars by shape, colour, size). We have found that people with autism or Asperger Syndrome may have unusual talents at systemising (e.g., in physics), and that people who are gifted mathematicians may be more likely to have a diagnosis of autism or Asperger Syndrome.

Asperger Syndrome is a high-functioning autistic spectrum disorder, and Baron-Cohen is the expert who recently diagnosed it in Gary McKinnon. Hans Asperger, the Austrian paediatrician who first identified the disorder, called his patients “little professors”. A recent Radio 4 programme about Asperger Syndrome asked those with the condition to convey the way they saw the world to “normal” listeners. One interviewee, Bob Delo, was diagnosed at 11, after being caught bugging his parents’ phone calls.

Delo found that he had to learn “normal” behaviour like someone studying a strange tribe: social conventions such as shaking hands and making eye contact struck him as nonsensical and infuriating. Talking to someone and looking into their eyes at the same time, as is traditionally deemed normal and empathetic, meant that he had to multitask, and couldn’t. It was “too much information” for him to cope with. Only when his ego was challenged would he do something that didn’t appeal to him. “I wasn’t interested in being potty-trained,” he remembered, “and it wasn’t really until my sister started to become potty-trained that I actually started to feel some competition, and kept up with her.”

The metaphorical ways in which we use language still infuriated him, though. Sometimes, he said, when told to “watch the road”, he literally would, and in so doing feel that he was “making a point about ‘their’ language”. The sloppy, hyperbolic manner in which almost all of us almost always speak, he said, “slows down progress” by causing confusion. “If I had my way,” he said, “everybody would feed information to me like a computer.”

Which perhaps brings us back to the Jesse Eisenberg version of Mark Zuckerberg: decoding what makes people tick and having the talent to build a virtual world around it, but not knowing what makes a particular person feel bad. In The Social Network, he is clearly unable to do what Baron-Cohen describes as “viewing oneself as embedded within social contexts”, and so builds a definitive social context with which to redress that.

But the purported strange habits or nervous tics of this or that cyber-age celebrity aside, what about how they engage with messy human concepts such as politics? “Freedom” is the one nebulous idea that seems to unite cyber-protesters and entrepreneurs alike (it is hacker collective Anonymous’ rallying cry for its patrols of CCTV cameras and attacks on corporate servers), but take a look at Assange’s outrage when the media took an interest in his private life and you’ll see it’s not quite clear what the word actually means to them. “Democracy” means saying Like or Dislike, or expressing your political disapproval with a retweet. To ask more is to hinder “progress”. You can’t have a more nuanced discussion if you’re within a piece of software that has fossilised the debate into a binary choice.

Sam Harris is a neuroscientist and philosopher whose recent book, The Moral Landscape: How Science Can Determine Human Values, looks at how science can supplant religion’s ethical role. But to me, his arguments boil down to a crude kind of maths about well-being that hasn’t evolved much from John Stuart Mill’s 1861 formulation of utilitarianism. Here’s Harris on his basic argument: “The moment you’ve linked morality to the well-being of conscious creatures, you see that the practices of the church don’t maximise human well-being. The church is as confused in talking about morality as it would be in the physics of the transubstantiation.” But how does this help us to deal with eminent environmental scientist James Lovelock’s prediction of an overcrowded future where large sections of today’s human habitat will be out of bounds? The maths might say to kill a third of the world’s population, because the total well-being of those who remain would exceed that of an unhappy, resource-starved, overpopulated series of islands. The morality suddenly seems a little messier.
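To make the crudeness of that calculus concrete, here is a minimal sketch of the kind of aggregation being objected to. The populations and well-being scores are invented purely for illustration; neither Harris nor Lovelock supplies figures like these.

```python
# A toy version of the naive utilitarian sum described above.
# All numbers are invented for illustration only.

def total_wellbeing(population: int, wellbeing_per_person: float) -> float:
    """Aggregate well-being the crude way: population times average score."""
    return population * wellbeing_per_person

# Scenario A: an overcrowded, resource-starved world.
crowded = total_wellbeing(population=9_000_000_000, wellbeing_per_person=2.0)

# Scenario B: a third of the population is gone and the survivors are
# assumed to be better off per head.
culled = total_wellbeing(population=6_000_000_000, wellbeing_per_person=4.0)

# The arithmetic prefers Scenario B – which is exactly where the morality
# starts to feel messier than the sum suggests.
print(culled > crowded)  # True
```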

It’s a very clear, simple system for making decisions, but it’s a very, very basic (I would even say daft) one that claims to render all unscientific objections redundant. But science is meant to be about exploration and endless openness to observable phenomena. As another philosopher, Jacques Derrida, pointed out, the cult of progress is actually about the opposite: “Scientific practice is always devoid of scientism. Scientism is the positivist allegation of scientific power; it is not knowledge and science.” When internet essayist Clay Shirky uses the term “cognitive surplus” – his description for the wasted brainpower we’ve spent decades watching telly with, and can now pool for a measurably better, freer future – I always wonder if he has mistaken people for abacuses.

That’s another thing about the digital age and the new super-coders who have built it. Constructing a website or a piece of software means making a flawless, internally coherent system. And artificial intelligence works by systems of commands that enable a programme to work out what to do at every turn by comparing a piece of information against a hierarchy of instructions. Perhaps people who have spent their childhoods becoming brilliant and fluent at these artificial languages, who sometimes wish humans would be more like computers, are not so different from the non-digital nerds: the bright schoolboys who study politics, philosophy and economics at Oxford and go on to become technocratic politicians. After all, they too feel obsessively compelled to talk about “progress” and launch new initiatives to cure Muslim extremism and improve happiness indices. There have always been people who are more comfortable in virtual worlds than in real ones, in clear systems than in sticky human situations. They have often found ways to become giants in those worlds, and so, sometimes, in ours too. §